
The Open Group London 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

Despite gusts of 70mph hitting the capital on Day Two of this year’s London event, attendees were not disheartened as October 21 kicked off with an introduction from The Open Group President and CEO Allen Brown. He provided a recap of The Open Group’s achievements over the last quarter including successful events in Bratislava, Slovakia and Kuala Lumpur, Malaysia. Allen also cited some impressive membership figures, with The Open Group now boasting 468 member organizations across 39 countries with the latest member coming from Nigeria.

Dave Lounsbury, VP and CTO at The Open Group then introduced the panel debate of the day on The Open Group Open Platform 3.0™ and Enterprise Architecture, with participants Ron Tolido, SVP and CTO, Applications Continental Europe, Capgemini; Andras Szakal, VP and CTO, IBM U.S. Federal IMT; and TJ Virdi, Senior Enterprise IT Architect, The Boeing Company.

After a discussion around the definition of Open Platform 3.0, the participants debated the potential impact of the Platform on Enterprise Architecture. Tolido noted that there has been an explosion of solutions, typically with a much shorter life cycle. While we’re not going to be able to solve every single problem with Open Platform 3.0, we can work towards that end goal by documenting its requirements and collecting suitable case studies.

Discussions then moved to the theme of machine-to-machine (M2M) learning, a key part of the Open Platform 3.0 revolution. TJ Virdi cited Gartner figures predicting that by 2017 machines will be learning more than processing, a notion Szakal found especially interesting for the manufacturing industry. M2M affects manufacturing in three areas: new business opportunities, business optimization and operational optimization. With the products themselves now effectively becoming platforms and tools for communication, they become intelligent things and attract others in turn.

Panel: Ron Tolido, Andras Szakal, TJ Virdi and Dave Lounsbury

Henry Franken, CEO at BizzDesign, went on to lead the morning session on the Pitfalls of Strategic Alignment, announcing the results of an expansive survey into the development and implementation of a strategy. Key findings from the survey include:

  • SWOT Analysis and Business Cases are the most often used techniques to support the strategy process – many others, such as the Confrontation Matrix, are now rarely used
  • Organizations continue to struggle with the strategy process, and most do not treat strategy development and strategy implementation as intertwined parts of a single strategy process
  • 64% indicated that stakeholders had conflicting priorities regarding reaching strategic goals which can make it very difficult for a strategy to gain momentum
  • The majority of respondents believed the main constraint to strategic alignment to be the unknown impact of the strategy on the employees, followed by the majority of the organization not understanding the strategy

The wide-ranging afternoon tracks kicked off with sessions on Risk, Enterprise in the Cloud and ArchiMate®, an Open Group standard. Key speakers included Ryan Jones of Blackthorn Technologies; Marc Walker of British Telecom; James Osborn of KPMG; Anitha Parameswaran of Unilever; and Ryan Betts of VoltDB.

To take another look at the day’s plenary or track sessions, please visit The Open Group on livestream.com.

The day ended in style with an evening reception amid the Victorian architecture of the Victoria & Albert Museum, along with a private viewing of the newly opened John Constable exhibition.

Victoria & Albert Museum

A special mention must go to Terry Blevins who, after years of hard work and commitment to The Open Group, was made a Fellow at this year’s event. Many congratulations to Terry – and here’s to another successful day tomorrow.

Join the conversation! #ogchat #ogLON

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog and media relations. Loren has over 20 years' experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.



The Open Group London 2014 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

On a crisp October Monday in London yesterday, The Open Group hosted the first day of its event at Central Methodist Hall, Westminster. Almost 200 attendees from 32 countries explored how to “Empower Your Business; Enabling Boundaryless Information Flow™”.

Just across the way from Westminster Abbey, the day began with a welcome from Allen Brown, President and CEO of The Open Group, before Magnus Lindkvist, the Swedish trendspotter and futurologist, delivered his keynote on “Competition and Creation in Globulent Times”.

In a very thought-provoking talk, Magnus pondered how quickly the world now moves, declaring that we now live in a “47-hour world” where trends can spread faster than ever before. Magnus argued that this is the result of a different kind of R&D process – rip off and duplicate – rather than organic innovation occurring in multiple places.

Magnus went on to consider the history of civilization which he described as “nothing, nothing, a little bit, then everything” as well as providing a comparison of vertical and horizontal growth. Magnus posited that while we are currently seeing a lot of horizontal growth globally (the replication of the same activity), there is very little vertical growth, or what he described as “magic”. Magnus argued that in business we are seeing companies less able to create as they are focusing so heavily on simply competing.

To counter this trend, Magnus told attendees that they should do the following in their day-to-day work:

  • Look for secrets – Whether it be for a certain skill or a piece of expertise that is as yet undiscovered but which could reap significant benefit
  • Experiment – Ensure that there is a place for experimentation within your organization, while practicing it yourself as well
  • Recycle failures – It’s not always the idea that is wrong, but the implementation, which you can try over and over again
  • Be patient and persistent – Give new ideas time and the good ones will eventually succeed

Following this session was the long anticipated launch of The Open Group IT4IT™ Forum, with Christopher Davis from the University of South Florida detailing the genesis of the group before handing over to Georg Bock from HP Software who talked about the Reference Architecture at the heart of the IT4IT Forum.

Hans Van Kesteren, VP & CIO of Global Functions at Shell, then went into detail about how his company has helped to drive the growth of the IT4IT Forum. Starting with an in-depth background to the company’s IT function, Hans described how as a provider of IT on a mass scale, the changing technology landscape has had a significant impact on Shell and the way it manages IT. He described how the introduction of the IT4IT Forum will help his organization and others like it to adapt to the convergence of technologies, allowing for a more dynamic yet structured IT department.

Subsequently Daniel Benton, Global Managing Director of IT Strategy at Accenture, and Georg Bock, Senior Director IT Management Software Portfolio Strategy at HP, provided their vision for the IT4IT Forum before a session where the speakers took questions from the floor. Those individuals heavily involved in the establishment of the IT4IT Forum received particular thanks from attendees for their efforts, as you can see in the accompanying picture.

Taken together, the presentations from the IT4IT Forum members provided a compelling vision for the future of the group. Watch this space for further developments now that the Forum has launched.


The Open Group IT4IT™ Forum Founding Members

In the afternoon, the sessions were split into tracks illustrating the breadth of the material that The Open Group covers. On Monday this provided an opportunity for a range of speakers to present to attendees on topics from the architecture of banking to shaping business transformation. Key presenters included Thomas Obitz, Senior Manager, FSO Advisory Performance Improvement, EY, UK and Dr. Daniel Simon, Managing Partner, Scape Consulting, Germany.

The plenary and many of the track presentations are available at livestream.com.

The day concluded with an evening drinks reception within Central Hall Westminster, where attendees had the opportunity to catch up with acquaintances old and new. More to come on day two!

Join the conversation – @theopengroup #ogLON




Open FAIR Blog Series – Five Reasons You Should Use the Open FAIR Body of Knowledge

By Jim Hietala, VP, Security and Andrew Josey, Director of Standards, The Open Group

This is the second in our blog series introducing the Open FAIR Body of Knowledge.

In this blog, we provide five reasons why you should use the Open FAIR Body of Knowledge for risk analysis:

1. Emphasis on Risk

Often the emphasis in risk analyses is placed on security threats and controls, without due consideration of impact. For example, we have a firewall protecting all our customer information – but what if the firewall is breached and the customer information is stolen or changed? Risk analysis using Open FAIR evaluates both the probability that bad things will happen and the impact if they do. By using the Open FAIR Body of Knowledge, the analyst measures and communicates risk, which is what management cares about.

2. Logical and Rational Framework

The Open FAIR Body of Knowledge provides a framework that explains the how and why of risk analysis, and it improves consistency from one analysis to the next.

3. Quantitative

It’s easy to measure things without considering the risk context – for example, “the systems should be maintained in full patch compliance” – but what does that mean in terms of loss frequency or the magnitude of loss? The Open FAIR taxonomy and method provide the basis for meaningful metrics.

4. Flexible

Open FAIR can be used at different levels of abstraction to match the need, the available resources, and available data.

5. Rigorous

There is often a lack of rigor in risk analysis: statements such as “that new application is high risk – we could lose millions” are made with no formal rationale to support them. The Open FAIR risk analysis method provides a more rigorous approach that helps to reduce gaps and analyst bias, and it improves the ability to defend conclusions and recommendations.
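The quantitative, rigorous style of analysis described above can be illustrated with a small Monte Carlo sketch. This is purely illustrative and not the official Open FAIR calculation: it draws a number of loss events per year from a Poisson distribution, draws a magnitude for each event from a triangular three-point estimate, and averages the annual totals. All of the input figures are invented.

```python
import math
import random

def _poisson(rng, lam):
    """Draw one Poisson sample via Knuth's method: count uniform draws
    until their running product falls below e^-lam."""
    if lam <= 0:
        return 0
    threshold = math.exp(-lam)
    k, p = 0, 1.0
    while p > threshold:
        k += 1
        p *= rng.random()
    return k - 1

def annualized_loss_exposure(lef, loss_low, loss_mode, loss_high,
                             trials=20000, seed=1):
    """Estimate mean annual loss from a loss event frequency (events/year)
    and a triangular three-point estimate of per-event loss magnitude."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        for _ in range(_poisson(rng, lef)):  # losses in one simulated year
            total += rng.triangular(loss_low, loss_high, loss_mode)
    return total / trials

# Invented inputs: ~2 loss events/year, each costing $10k-$100k (mode $30k)
ale = annualized_loss_exposure(2.0, 10_000, 30_000, 100_000)
```

With those inputs the estimate lands near $93k per year (twice the mean triangular magnitude), which is the kind of loss-exposure figure, rather than a raw control metric, that management can act on.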

In our next blog, we will look at how the Open FAIR Body of Knowledge can be used with other Open Group standards.

The Open FAIR Body of Knowledge consists of the following Open Group standards:

  • Risk Taxonomy (O-RT), Version 2.0 (C13K, October 2013) defines a taxonomy for the factors that drive information security risk – Factor Analysis of Information Risk (FAIR).
  • Risk Analysis (O-RA) (C13G, October 2013) describes process aspects associated with performing effective risk analysis.

These can be downloaded from The Open Group publications catalog at http://www.opengroup.org/bookstore/catalog.

Our other publications include a Pocket Guide and a Certification Study Guide.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT Security, Risk Management and Healthcare programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on Information Security, Risk Management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

 

Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF® 9.1, ArchiMate® 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX® Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.



The Open Group London 2014 Preview: A Conversation with RTI’s Stan Schneider about the Internet of Things and Healthcare

By The Open Group

RTI is a Silicon Valley-based messaging and communications company focused on helping to bring the Industrial Internet of Things (IoT) to fruition. Recently named “The Most Influential Industrial Internet of Things Company” by Appinions and published in Forbes, RTI’s EMEA Manager Bettina Swynnerton will be discussing the impact that the IoT and connected medical devices will have on hospital environments and the Healthcare industry at The Open Group London October 20-23. We spoke to RTI CEO Stan Schneider in advance of the event about the Industrial IoT and the areas where he sees Healthcare being impacted the most by connected devices.

Earlier this year, industry research firm Gartner declared the Internet of Things (IoT) to be the most hyped technology around, having reached the pinnacle of the firm’s famed “Hype Cycle.”

Despite the hype around consumer IoT applications—from FitBits to Nest thermostats to fashionably placed “wearables” that may begin to appear in everything from jewelry to handbags to kids’ backpacks—Stan Schneider, CEO of IoT communications platform company RTI, says that 90 percent of what we’re hearing about the IoT is not where the real value will lie. Most of the media coverage and hype concerns the “Consumer” IoT, such as Google Glass or sensors in refrigerators that tell you when the milk’s gone bad. However, most of the real value of the IoT will take place in what GE has coined the “Industrial Internet”—applications working behind the scenes to keep industrial systems operating more efficiently, says Schneider.

“In reality, 90 percent of the real value of the IoT will be in industrial applications such as energy systems, manufacturing advances, transportation or medical systems,” Schneider says.

However, the reality today is that the IoT is quite new. As Schneider points out, most companies are still trying to figure out what their IoT strategy should be. There isn’t that much active building of real systems at this point.

“Most companies, at the moment, are just trying to figure out what the Internet of Things is. I can do a webinar on ‘What is the Internet of Things?’ or ‘What is the Industrial Internet of Things?’ and get hundreds and hundreds of people showing up, most of whom don’t have any idea. That’s where most companies are. But there are several leading companies that very much have strategies, and there are a few that are even executing their strategies,” he said. According to Schneider, these companies include GE, which he says has a 700+ person team currently dedicated to building its Industrial IoT platform, as well as companies such as Siemens and Audi, which already have some applications working.

For its part, RTI is actively involved in trying to help define how the Industrial Internet will work and how companies can take disparate devices and make them work with one another. “We’re a nuts-and-bolts, make-it-work type of company,” Schneider notes. As such, openness and standards are critical not only to RTI’s work but to the success of the Industrial IoT in general, says Schneider. RTI is currently involved in as many as 15 different industry standards initiatives.

IoT Drivers in Healthcare

Although RTI is involved in IoT initiatives in many industries, from manufacturing to the military, Healthcare is one of the company’s main areas of focus. For instance, RTI is working with GE Healthcare on the software for its CAT scanner machines. GE chose RTI’s DDS (data distribution service) product because it will let GE standardize on a single communications platform across product lines.

Schneider says there are three big drivers that are changing the medical landscape when it comes to connectivity: the evolution of standalone systems to distributed systems, the connection of devices to improve patient outcome and the replacement of dedicated wiring with networks.

The first driver is that medical devices that have been standalone devices for years are now being built on new distributed architectures. This gives practitioners and patients easier access to the technology they need.

For example, RTI customer BK Medical, a medical device manufacturer based in Denmark, is in the process of changing their ultrasound product architecture. They are moving from a single-user physical system to a wirelessly connected distributed design. Images will now be generated in and distributed by the Cloud, thus saving significant hardware costs while making the systems more accessible.

According to Schneider, ultrasound machine architecture hasn’t really changed in the last 30 or 40 years. Today’s ultrasound machines are still wheeled in on a cart. That cart contains a wired transducer, image processing hardware or software and a monitor. If someone wants to keep an image—for example, images of a fetus in utero—they take away physical media. Years ago it was a Polaroid picture; today the images are saved to CDs and handed to the patient.

In contrast, BK’s new systems will be completely distributed, Schneider says. Doctors will be able to carry a transducer that looks more like a cellphone with them throughout the hospital. A wireless connection will upload the imaging data into the cloud for image calculation. With a distributed scenario, only one image processing system may be needed for a hospital or clinic. It can even be kept in the cloud off-site. Both patients and caregivers can access images on any display, wherever they are. This kind of architecture makes the systems much cheaper and far more efficient, Schneider says. The days of the wheeled-in cart are numbered.

The second IoT driver in Healthcare is connecting medical devices together to improve patient outcomes. Most hospital devices today are completely independent and standalone. So, if a patient is hooked up to multiple monitors, the only thing that really “connects” those devices today is a piece of paper at the end of a hospital bed that shows how each should be functioning. Nurses are supposed to check these devices on an hourly basis to make sure they’re working correctly and the patient is ok.

Schneider says this approach is error-ridden. First, the nurse may be too busy to do a good job checking the devices. Worse, any number of things can set off alarms whether there’s something wrong with the patient or not. As anyone who has ever visited a friend or relative in the hospital can attest, alarms go off constantly, making it difficult to determine when someone is really in distress. In fact, one of the biggest problems in hospital settings today, Schneider says, is a phenomenon known as “alarm fatigue.” Single devices simply can’t reliably tell if there’s some minor glitch in the data or if the patient is in real trouble. As a result, 80% of all device alarms in hospitals are turned off. Meaningless alarms fatigue personnel, so they either ignore or silence the alarms…and people can die.

To deal with this problem, new technologies are being created that will connect devices together on a network. Multiple devices can then work in tandem to figure out when something is really wrong. If the machines are networked, alarms can be set to go off only when several indicators show distress rather than just one. For example, if oxygen levels drop both on an oxygen monitor on someone’s finger and on a respiration monitor, the alarm is much more likely to signal a real patient problem than if only one source shows a problem. Schneider says the algorithms to fix these problems are reasonably well understood; the barrier is the lack of networking to tie all of these machines together.
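The networked-alarm logic Schneider describes can be sketched in a few lines of Python. This is a toy illustration, not a real medical algorithm; the monitor names and thresholds (SpO2 below 90%, respiration below 8 breaths per minute) are hypothetical:

```python
def should_alarm(readings, low_thresholds, min_confirmations=2):
    """Fire the alarm only when at least `min_confirmations` networked
    monitors report a reading below its distress threshold."""
    distressed = sum(1 for name, value in readings.items()
                     if value < low_thresholds[name])
    return distressed >= min_confirmations

# Hypothetical distress thresholds for two independent monitors
LIMITS = {"spo2": 90, "resp_rate": 8}

should_alarm({"spo2": 85, "resp_rate": 6}, LIMITS)   # both agree: alarm
should_alarm({"spo2": 85, "resp_rate": 14}, LIMITS)  # lone reading: no alarm
```

A lone out-of-range reading on one device is treated as a possible glitch, while agreement across independent monitors triggers the alarm, which is the false-alarm reduction Schneider describes.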

The third area of change in the industrial medical Internet is the transition to networked systems from dedicated wired designs. Surgical operating rooms offer a good example. Today’s operating room is a maze of wires connecting screens, computers, and video. Videos, for instance, come from dynamic x-ray imaging systems, from ultrasound navigation probes and from tiny cameras embedded in surgical instruments. Today, these systems are connected via HDMI or other specialized cables. These cables are hard to reconfigure. Worse, they’re difficult to sterilize, Schneider says. Thus, the surgical theater is hard to configure, clean and maintain.

In the future, the mesh of special wires can be replaced by a single, high-speed networking bus. Networks make the systems easier to configure and integrate, easier to use and accessible remotely. A single, easy-to-sterilize optical network cable can replace hundreds of wires. As wireless gets faster, even that cable can be removed.

“By changing these systems from a mesh of TV-cables to a networked data bus, you really change the way the whole system is integrated,” he said. “It’s much more flexible, maintainable and sharable outside the room. Surgical systems will be fundamentally changed by the Industrial IoT.”

IoT Challenges for Healthcare

Schneider says there are numerous challenges facing the integration of the IoT into existing Healthcare systems—from technical challenges to standards and, of course, security and privacy. But one of the biggest challenges facing the industry, he believes, is plain old fear. In particular, Schneider says, there is a lot of fear within the industry of choosing the wrong path and, in effect, “walking off a cliff”. Getting beyond that fear and taking risks, he says, will be necessary to move the industry forward.

In a practical sense, the other thing currently holding back integration is the sheer number of disparate connected devices currently used in medicine, he says. Manufacturers each have their own systems and obviously have a vested interest in keeping their equipment in hospitals, so many have been reluctant to adopt standards and push interoperability forward, Schneider says.

This is, of course, not just a Healthcare issue. “We see it in every single industry we’re in. It’s a real problem,” he said.

Legacy systems are also a problematic area. “You can’t just go into a Kaiser Permanente and rip out $2 billion worth of equipment,” he says. Integrating new systems with existing technology is a process of incremental change that takes time and vested leadership, says Schneider.

Cloud Integration a Driver

Although many of these technologies are not yet very mature, Schneider believes that the fundamental industry driver is Cloud integration. In Schneider’s view, the Industrial Internet is ultimately a systems problem. As with the ultrasound machine example from BK Medical, it’s not that an existing ultrasound machine doesn’t work just fine today, Schneider says, it’s that it could work better.

“Look what you can do if you connect it to the Cloud—you can distribute it, you can make it cheaper, you can make it better, you can make it faster, you can make it more available, you can connect it to the patient at home. It’s a huge system problem. The real overwhelming striking value of the Industrial Internet really happens when you’re not just talking about the hospital but you’re talking about the Cloud and hooking up with practitioners, patients, hospitals, home care and health records. You have to be able to integrate the whole thing together to get that ultimate value. While there are many point cases that are compelling all by themselves, realizing the vision requires getting the whole system running. A truly connected system is a ways out, but it’s exciting.”

Open Standards

Schneider also says that openness is absolutely critical for these systems to ultimately work. Just as agreeing on a standard for HTTP running on the Internet Protocol (IP) drove the Web, a new device-appropriate protocol will be necessary for the Internet of Things to work. Consensus will be necessary, he says, so that systems can talk to each other and connectivity will work. The Industrial Internet will push that out to the Cloud and beyond, he says.

“One of my favorite quotes is from IBM,” he says. “IBM said, ‘It’s not a new Internet, it’s a new Web.’” By that, IBM means that the industry needs new, machine-centric protocols to run over the same Internet hardware and base IP protocol, Schneider said.

Schneider believes that this new web will eventually evolve to become the new architecture for most companies. However, for now, particularly in hospitals, it’s the “things” that need to be integrated into systems and overall architectures.

One example where this level of connectivity will make a huge difference, he says, is in predictive maintenance. Once a system can “sense” or predict that a machine may fail or that a part needs to be replaced, there will be a huge economic impact and cost savings. For instance, he said Siemens uses acoustic sensors to monitor the state of its wind generators. By placing sensors next to the bearings in the machine, they can literally “listen” for squeaky wheels and figure out whether a turbine may soon need repair. These analytics let them know that a bearing must be replaced before the turbine shuts down. Of course, the infrastructure will first need to connect all of these “things” to each other and to the cloud. So there will need to be a lot of system-level changes in architectures.
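A toy version of that acoustic check might look like the following. The sample values, baseline and threshold factor are invented for illustration; a real system would apply proper signal processing to streaming sensor data:

```python
import math

def bearing_needs_service(samples, healthy_rms, factor=3.0):
    """Flag a bearing when the root-mean-square energy of its acoustic
    signal exceeds `factor` times the RMS measured when it was healthy."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return rms > factor * healthy_rms

quiet = [0.10, -0.09, 0.12, -0.11]   # healthy bearing (invented values)
noisy = [0.55, -0.60, 0.50, -0.58]   # the "squeaky wheel"
```

The RMS of the noisy signal is several times the healthy baseline, so the bearing gets flagged for service before the turbine fails, while the quiet signal passes.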

Standards, of course, will be key to getting these architectures to work together. Schneider believes standards development for the IoT will need to be tackled from both horizontal and vertical standpoints. Both generic communication standards and industry-specific standards, such as how to integrate an operating room, must evolve.

“We are a firm believer in open standards as a way to build consensus and make things actually work. It’s absolutely critical,” he said.

Stan Schneider is CEO at Real-Time Innovations (RTI), the Industrial Internet of Things communications platform company. RTI is the largest embedded middleware vendor and has an extensive footprint in all areas of the Industrial Internet, including Energy, Medical, Automotive, Transportation, Defense, and Industrial Control. Stan has published over 50 papers in both academic and industry press. He speaks widely at events and conferences on topics ranging from networked medical devices for patient safety and the future of connected cars to the role of the DDS standard in the IoT, the evolution of power systems, and the various IoT protocols. Before RTI, Stan managed a large Stanford robotics laboratory, led an embedded communications software team and built data acquisition systems for automotive impact testing. Stan completed his PhD in Electrical Engineering and Computer Science at Stanford University, and holds a BS and MS from the University of Michigan. He is a graduate of Stanford’s Advanced Management College.

 



IT Trends Empowering Your Business is Focus of The Open Group London 2014

By The Open Group

The Open Group, the vendor-neutral IT consortium, is hosting an event in London October 20th-23rd at Central Hall, Westminster. The theme of this year’s event is how new IT trends are empowering improvements in business and facilitating enterprise transformation.

Objectives of this year’s event:

  • Show the need for Boundaryless Information Flow™, which would result in more interoperable, real-time business processes throughout all business ecosystems
  • Examine the use of developing technology such as Big Data and advanced data analytics in the financial services sector: to minimize risk, provide more customer-centric products and identify new market opportunities
  • Provide a high-level view of the Healthcare ecosystem that identifies entities and stakeholders which must collaborate to enable the vision of Boundaryless Information Flow
  • Detail how the growth of “The Internet of Things” with online currencies and mobile-enabled transactions has changed the face of financial services, and poses new threats and opportunities
  • Outline some of the technological imperatives for Healthcare providers, with the use of The Open Group Open Platform 3.0™ tools to enable products and services to work together and deploy emerging technologies freely and in combination
  • Describe how to develop better interoperability and communication across organizational boundaries and pursue global standards for Enterprise Architecture for all industries

Key speakers at the event include:

  • Allen Brown, President & CEO, The Open Group
  • Magnus Lindkvist, Futurologist
  • Hans van Kesteren, VP & CIO Global Functions, Shell International, The Netherlands
  • Daniel Benton, Global Managing Director, IT Strategy, Accenture

Registration for The Open Group London 2014 is open and available to members and non-members. Please register here.

Join the conversation via Twitter – @theopengroup #ogLON



Business Benefit from Public Data

By Dr. Chris Harding, Director for Interoperability, The Open Group

Public bodies worldwide are making a wealth of information available, and encouraging its commercial exploitation. This sounds like a bonanza for the private sector at the public expense, but entrepreneurs are holding back. A healthy market for products and services that use public-sector information would provide real benefits for everyone. What can we do to bring it about?

Why Governments Give Away Data

The EU directive of 2003 on the reuse of public sector information encourages the Member States to make as much information available for reuse as possible. This directive was revised and strengthened in 2013. The U.S. Open Government Directive of 2009 provides similar encouragement, requiring US government agencies to post at least three high-value data sets online and register them on its data.gov portal. Other countries have taken similar measures to make public data publicly available.

Why are governments doing this? There are two main reasons.

One is that it improves the societies that they serve and the governments themselves. Free availability of information about society and government makes people more effective citizens and makes government more efficient. It illuminates discussion of civic issues, and points a searchlight at corruption.

The second reason is that it has a positive effect on the wealth of nations and their citizens. The EU directive highlights the ability of European companies to exploit the potential of public-sector information, and contribute to economic growth and job creation. Information is not just the currency of democracy. It is also the lubricant of a successful economy.

Success Stories

There are some big success stories.

If you drive a car, you probably use satellite navigation to find your way about, and this may use public-sector information. In the UK, for example, map data that can be used by sat-nav systems is supplied for commercial use by a government agency, the Ordnance Survey.

When you order something over the web for delivery to your house, you often enter a postal code and see most of the address auto-completed by the website. Postcode databases are maintained by national postal authorities, which are generally either government departments or regulated private corporations, and made available by them for commercial use. Here, the information is not directly supporting a market, but is contributing to the sale of a range of unrelated products and services.
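The mechanism behind that auto-complete feature is straightforward to sketch. The snippet below is a minimal illustration in Python; the postcode table, codes and addresses are invented for the example and stand in for a licensed national database:

```python
# Hypothetical extract standing in for a licensed postcode database.
# A real database, supplied by a national postal authority, holds
# millions of entries and is licensed for commercial use.
POSTCODE_DB = {
    "AB1 2CD": {"street": "High Street", "town": "Exampleton"},
    "EF3 4GH": {"street": "Station Road", "town": "Sampleville"},
}

def autocomplete_address(postcode):
    """Return address fields for a postcode, or None if it is unknown."""
    return POSTCODE_DB.get(postcode.strip().upper())

print(autocomplete_address("ab1 2cd"))
# {'street': 'High Street', 'town': 'Exampleton'}
```

A web form would call a lookup like this as the customer types, pre-filling the street and town fields from the single postcode entry.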

The data may not be free. There are commercial arrangements for supply of map and postcode data. But it is available, and is the basis for profitable products and for features that make products more competitive.

The Bonanza that Isn’t

These successes are, so far, few in number. The economic benefits of open government data could be huge. The McKinsey Global Institute estimates a potential of between 3 and 5 trillion dollars annually. Yet the direct impact of Open Data on the EU economy in 2010, seven years after the directive was issued, is estimated by Capgemini at only about 1% of that, although the EU accounts for nearly a quarter of world GDP.

The business benefits to be gained from using map and postcode data are obvious. There are other kinds of public sector data, where the business benefits may be substantial, but they are not easy to see. For example, data is or could be available about public transport schedules and availability, about population densities, characteristics and trends, and about real estate and land use. These are all areas that support substantial business activity, but businesses in these areas seldom make use of public sector information today.

Where are the Products?

Why are entrepreneurs not creating these potentially profitable products and services? There is one obvious reason. The data they are interested in is not always available and, where it is available, it is provided in different ways, and comes in different formats. Instead of a single large market, the entrepreneur sees a number of small markets, none of which is worth tackling. For example, the market for an application that plans public transport journeys across a single town is not big enough to justify substantial investment in product development. An application that could plan journeys across any town in Europe would certainly be worthwhile, but is not possible unless all the towns make this data available in a common format.
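The economics can be made concrete with a small sketch. If every town published its timetable in one agreed schema, a single planner could consume all of them; without that, each town needs its own bespoke adapter. The schema and data below are invented purely for illustration:

```python
# Invented common timetable schema: each record is
# (route, stop, departure_time), grouped by town. One planner then
# works for every town that publishes in the shared schema.
TIMETABLES = {
    "Exampleton": [("Route 1", "Market Square", "09:15"),
                   ("Route 1", "Market Square", "09:45")],
    "Sampleville": [("Route A", "Harbour", "09:30")],
}

def next_departure(town, stop, after):
    """First departure from `stop` in `town` at or after time `after`."""
    times = sorted(t for _route, s, t in TIMETABLES.get(town, []) if s == stop)
    return next((t for t in times if t >= after), None)

print(next_departure("Exampleton", "Market Square", "09:20"))  # 09:45
```

The planner code is identical for every town; only the published data differs, which is exactly the property a common format buys the entrepreneur.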

Public sector information providers often do not know what value their data has, or understand its applications. Working within tight budgets, they cannot afford to spend large amounts of effort on assembling and publishing data that will not be used. They follow the directives but, without common guidelines, they simply publish whatever is readily to hand, in whatever form it happens to be.

The data that could support viable products is not available everywhere and, where it is available, it comes in different formats. (One that is often used is PDF, which is particularly difficult to process as an information source.) The result is that the cost of product development is high, and the expected return is low.

Where is the Market?

There is a second reason why entrepreneurs hesitate. The shape of the market is unclear. In a mature market, everyone knows who the key players are, understands their motivations, and can predict to some extent how they will behave. The market for products and services based on public sector information is still taking shape. No one is even sure what kinds of organization will take part, or what they will do. How far, for example, will public-sector bodies go in providing free applications? Can large corporations buy future dominance with loss-leader products? Will some unknown company become an overnight success, like Facebook? With these unknowns, the risks are very high.

Finding the Answers

Public sector information providers and standards bodies are tackling these problems. The Open Group participates in SHARE-PSI, the European network for the exchange of experience and ideas around implementing open data policies in the public sector. The experience gained by SHARE-PSI will be used by the World Wide Web Consortium (W3C) as a basis for standards and guidelines for publication of public sector information. These standards and guidelines may be used, not just by the public sector, but by not-for-profit bodies and even commercial corporations, many of which have information that they want to make freely available.

The Open Group is making a key contribution by helping to map the shape of the market. It is using the Business Scenario technique from its well-known Enterprise Architecture methodology TOGAF® to identify the kinds of organization that will take part, and their objectives and concerns.

There will be a preview of this on October 22 at The Open Group event in London which will feature a workshop session on Open Public Sector Data. This workshop will look at how Open Data can help business, present a draft of the Business Scenario, and take input from participants to help develop its conclusions.

The developed Business Scenario will be presented at the SHARE-PSI workshop in Lisbon on December 3-4. The theme of this workshop is encouraging open data usage by commercial developers. It will bring a wide variety of stakeholders together to discuss and build the relationship between the public and private sectors. It will also address, through collaboration with the EU LAPSI project, the legal framework for use of open public sector data.

Benefit from Participation!

If you are thinking about publishing or using public-sector data, you can benefit from these workshops by gaining an insight into the way that the market is developing. In the long term, you can influence the common standards and guidelines that are being developed. In the short term, you can find out what is happening and network with others who are interested.

The social and commercial benefits of open public-sector data are not being realized today. They can be realized through a healthy market in products and services that process the data and make it useful to citizens. That market will emerge when public bodies and businesses clearly understand the roles that they can play. Now is the time to develop that understanding and begin to profit from it.

Register for The Open Group London 2014 event at http://www.opengroup.org/london2014/registration.

Find out how to participate in the Lisbon SHARE-PSI workshop at http://www.w3.org/2013/share-psi/workshop/lisbon/#Participation


Dr. Chris Harding is Director for Interoperability at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Open Platform 3.0™ Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.


Filed under big data, Cloud, digital technologies, Enterprise Architecture, Open Platform 3.0, TOGAF®, Uncategorized

The Open Group London 2014: Open Platform 3.0™ Panel Preview with Capgemini’s Ron Tolido

By The Open Group

The third wave of platform technologies is poised to revolutionize how companies do business not only for the next few years but for years to come. At The Open Group London event in October, Open Group CTO Dave Lounsbury will be hosting a panel discussion on how The Open Group Open Platform 3.0™ will affect Enterprise Architectures. Panel speakers include IBM Vice President and CTO of U.S. Federal IMT Andras Szakal and Capgemini Senior Vice President and CTO for Application Services Ron Tolido.

We spoke with Tolido in advance of the event about the progress companies are making in implementing third platform technologies, the challenges facing the industry as Open Platform 3.0 evolves and the call to action he envisions for The Open Group as these technologies take hold in the marketplace.

Below is a transcript of that conversation.

From my perspective, we have to realize: What is the call to action that we should have for ourselves? If we look at the mission of Boundaryless Information Flow™ and the need for open standards to accommodate that, what exactly can The Open Group and any general open standards do to facilitate this next wave in IT? I think it’s nothing less than a revolution. The first platform was the mainframe, the second platform was the PC and now the third platform is anything beyond the PC, so all sorts of different devices, sensors and ways to access information, to deploy solutions and to connect. What does it mean in terms of Boundaryless Information Flow and what is the role of open standards to make that platform succeed and help companies to thrive in such a new world?

That’s the type of call to action I’m envisioning. And I believe there are very few Forums or Work Groups within The Open Group that are not affected by this notion of the third platform. Firstly, I believe an important part of the Open Platform 3.0 Forum’s mission will be to analyze and understand the impact of the third platform on all those different areas that we’re currently evolving in The Open Group, and, if you like, to orchestrate them a bit or be a catalyst across all the Work Groups and Forums.

In a blog post you wrote this summer for Capgemini’s CTO Blog, you cited third platform technologies as being responsible for a renewed interest in IT as an enabler of business growth. What is it about the third platform that is driving that interest?

It’s the same type of revolution as we saw with the PC, which was the second platform. A lot of people in business units—through the PC and client/server technologies and Windows and all of these different things—realized that they could create solutions of a whole new order. The second platform meant many more applications, many more users, much more business value to be achieved and less direct dependence on the central IT department. I think we’re seeing a very similar evolution right now, but the essence of the move is not that it moves us even further away from central IT; it puts the power of technology right in the business. It’s much easier to create solutions. Nowadays, there are many more channels that are so close to the business that it takes business people to understand them. This also explains why business people like the third platform so much—it’s the Cloud, it’s mobile, social, it’s big data; all of these are waves that bring technology closer to the business, and they are easy to use, with very apparent business value that we haven’t seen before, certainly not in the PC era. So we’re seeing a next wave, almost a revolution, in terms of how easy it is to create solutions and how widely spread these solutions can be. Because again, as with the PC, it means many more applications and many more potential users connected through those applications. So what people say to me these days on the business side is ‘We love IT, it’s just these bloody IT people that are the problem.’

Due to the complexities of building the next wave of platform computing, do you think that we may hit a point of fatigue as companies begin to tackle everything that is involved in creating that platform and making it work together?

The way I see it, that’s still the work of the IT community, the Enterprise Architect and the platform designer. The very nature of the platform is that it’s attractive to use, not to build. The point of the platform is to connect to it and launch from it, but building the platform is an entirely different story. I think it requires platform designers and Enterprise Architects, if you like, and people to do the plumbing and the architecting and the design underneath. But the real nature of the platform is to use it and build upon it rather than to create it. So the happy view is that the “business people” don’t have to construct it.

I do believe, by the way, that many of the people in The Open Group will be on the side of the builders. They’re supposed to like complexity, and to like reducing it, so if we do it right, the users of the platform will not notice this effort. It’s the same with the Cloud: the problem with the Cloud nowadays is that many people are tempted to run their own clouds and their own technologies, and before they know it they have more complexity on their agenda rather than less, because of the Cloud. It’s the same with the third platform: done right, it’s a foundation that is almost a no-brainer to do business upon, for the next generation of business models. But if we do it wrong, we only have additional complexity on our hands, and we give IT a bad name yet again. We don’t want to do that.

What are Capgemini customers struggling with the most in terms of adopting these new technologies and putting together an Open Platform 3.0?

It’s not always good to look at history, but if you look at the emergence of the second platform, the PC, there were of course years in which central IT said ‘nobody needs a PC, we can do it all on the mainframe.’ They just didn’t believe it, and business people simply started to do it themselves. For years we created a mess as a result, and we’re still picking up some of the pieces of that situation. The question for IT people, in particular, is how to find this new rhythm: how to adopt the dynamics of this third platform while dealing with all the complexity of the legacy platforms that are already there. I think The Open Group will be very critical in accelerating the creation of such a platform: what exactly should be in the third platform, what types of services should you be developing, how would these services interact, and could we create a set of open standards that the industry could align to, so that we don’t have to do too much work in integrating all that stuff? If we, as The Open Group, can create that industry momentum, that would at least narrow the gap between business and IT that we currently see. Right now IT is very clearly not able to deliver on the promise because they have their hands full with surviving the existing IT landscape, so unless they do something about simplifying it on the one hand and bridging the old world with the new one on the other, they might still be very unpopular in the forthcoming years. That’s not what you want as an IT person—you want to enable business, and new business. But I don’t think we’ve been very effective at that for the past ten years as an industry in general, so that’s a big thing we have to deal with: bridging the old world with the new world. Anything The Open Group can do to accelerate and simplify that job would be great, and I think that’s the very essence of where our actions should be.

What are some of the things that The Open Group, in particular, can do to help affect these changes?

To me it’s still in the evangelization phase. Sooner or later people have to buy it and say ‘We get it, we want it, give me access to the third platform.’ Then the question will be how to accelerate building such an actual platform. So the big question is: What does such a platform look like? What types of services would you find on it? For example, mobility services, data services, integration services, management services, development services, all of that. What would that look like in a typical Platform 3.0? Maybe we could even define a catalog of services that you would find in the platform. Then, of course, if you could use such a catalog, or shopping list if you like, to reach out to the technology suppliers of this world and convince them to pick it up and gear around these definitions, that would facilitate such a platform. There is also the architectural roadmap: what would an architecture look like, and what would be the typical five ways of getting there? You have to start from your local situation, so several design cases would probably also be helpful; there’s an architectural dimension here.

Also, in terms of competencies, what types of competencies will we need in the near future to be able to supply these types of services to the business? That is, again, very new, in this case for IT Specialist Certification and Architect Certification. These groups also need to think about what the new competencies inherent in the third platform are and how they affect things like certification criteria and competency profiles.

In other areas, if you look at TOGAF®, an Open Group standard, is it really still suitable in the fast-paced world of the third platform, or do we need a third-platform version of TOGAF? With security, for example, there are so many users and so many connections that the activities of the former Jericho Forum seem like child’s play compared to what you will see around the third platform. So there is no Forum or Work Group that is not affected by this emerging Open Platform 3.0.

With Open Platform 3.0 touching pretty much every aspect of technology and The Open Group, how do you tackle that? Do you have just an umbrella group for everything or look at it through the lens of TOGAF or security or the IT Specialist? How do you attack something so large?

It’s exactly what you just said. It’s fundamentally my belief that we need to do both of those things. First, we need a catalyst forum, which I would argue is the Open Platform 3.0 Forum; it would be the catalyst platform, the orchestration platform if you like, that would do the overall definitions and the call to action. They’ve already been doing the business scenarios—they set the scene. Then it would be up to this Forum to reach out to all the other Forums and Work Groups to discuss impact and make sure it all stays aligned, so here we have an orchestration function for the Open Platform 3.0 Forum. Then, very obviously, all the other Work Groups and Forums need to pick it up and do their own work, because you cannot aspire to do all of this within one and the same forum; it’s so wide, so diverse. You need to do both.

The Open Platform 3.0 Forum has been working for a year and a half now. What are some of the things the Forum has accomplished thus far?

They’ve been working particularly on some of the key definitions and some of the business scenarios. In terms of creating awareness of Open Platform 3.0, its business value and its definitions, they’ve done a very good job. Next, there needs to be a call to action to get everybody mobilized and taking tangible steps toward Platform 3.0. I think that’s where we are currently, so the timing is good, I believe, in terms of what the Forum has achieved so far.

Returning to the mission of The Open Group, given all of the awareness we have created, what does it all mean in terms of Boundaryless Information Flow and how does it affect the Forums and Work Groups in The Open Group? That’s what we need to do now.

What are some of the biggest challenges that you see facing adoption of Open Platform 3.0 and standards for that platform?

They are relatively immature technologies. For example, with the Cloud you see a lot of players, a lot of technology providers, being quite reluctant to standardize. Some of them are very open about it: ‘Right now we are in a niche, and we’re having a lot of fun ourselves, so why open it up right now?’ What would move things along is more pressure from the business side saying ‘We want to use your technology, but only if you align with some of these emerging standards.’ That would do it, or certainly help. This, of course, is what makes The Open Group so powerful: it brings together not only technology providers, but also businesses, the enterprises involved and the end users of technology. If they work together and create something to mobilize technology providers, that would certainly be a breakthrough. But these are immature technologies and, as I said, for some of these technology providers it seems more important to be a niche player for now and create their own market rather than standardize on something their competitors could be on as well.

So this is a sign of a relatively immature industry, because every industry that starts to mature around certain topics begins to converge on open standards. The more mature we grow in our understanding of Open Platform 3.0, the more you will see the need for standards arise. It’s all a matter of timing, so it’s not so strange that in the past year and a half it’s been very difficult to even discuss standards in this area. But I think we’re entering that era really soon, so it seems to be good timing to discuss it. That’s one important limiting area; I think the providers are not necessarily waiting for it or committed to it.

Secondly, of course, this is a whole next generation of technologies. With all new generations of technologies there are always generation gaps and people in denial or who just don’t feel up to picking it up again or maybe they lack the energy to pick up a new wave of technology and they’re like ‘Why can’t I stay in what I’ve mastered?’ All very understandable. I would call that a very typical IT generation gap that occurs when we see the next generation of IT emerge—sooner or later you get a generation gap, as well. Which has nothing to do with physical age, by the way.

With all these technologies converging so quickly, that gap is going to have to close quickly this time around isn’t it?

Well, there are still mainframes around, so you could argue that there will be two or even three speeds of IT sooner or later. A very stable, robust and predictable legacy environment could even be the first platform that’s more mainframe-oriented, like you see today. A second wave would be that PC workstation, client/server, Internet-based IT landscape, and it has a certain base and certain dynamics. Then you have this third phase, which is the new platform, that is more dynamic and volatile and much more diverse. You could argue that there might be within an organization multiple speeds of IT, multiple speeds of architectures, multi-speed solutioning, and why not choose your own speed?

It probably takes a decade or more to really move forward for many enterprises.

It’s not going as quickly as the Gartners of this world typically think it is; in practice we all know it takes longer. So I don’t see any reason why certain people wouldn’t deliberately choose to stay in second gear rather than go to third gear, simply because they think it’s challenging to be there. That is perfectly sound to me, and it would bring companies a lot of work for many years.

That’s an interesting concept because start-ups can easily begin on a new platform but if you’re a company that has been around for a long time and you have existing legacy systems from the mainframe or PC era, those are things that you have to maintain. How do you tackle that as well?

That’s a given in big enterprises. Not everybody can be a disruptive start-up. Maybe we all think that we should be like that, but it’s not the case in real life. In real life, we have to deal with enterprise systems and enterprise processes, and all of them might be very vulnerable to this new wave of challenges. Certainly enterprises can be disruptive themselves if they do it right, but there are always different dynamics, and, as I said, we still have mainframes, even though we declared their demise quite some time ago. The same will happen, of course, to PC-based IT landscapes. It will take a very long time, and it will take very skilled hands and minds to keep it going and to simplify it.

Having said that, you could argue that some new players in the market obviously have the advantage of not having to deal with that and could possibly benefit from a first-mover advantage where existing enterprises have to juggle several balls at the same time. Maybe that’s more difficult, but of course enterprises are enterprises for a good reason—they are big and holistic and mighty, and they might be able to do things that start-ups simply can’t do. But it’s a very unpredictable world, as we all realize, and the third platform brings a lot of disruptiveness.

What’s your perspective on how the Internet of Things will affect all of this?

It’s part of the third platform, of course, and it’s something Andras Szakal will be addressing as well. There’s much more coming, both on the input side, where everything is essentially becoming a sensor, to the point where even your wallpaper or paint is a sensor, and on the other hand in terms of the devices we use to communicate or get information: smart things that whisper in your ear, or whatever we’ll have in the coming years. This is clearly part of the Platform 3.0 wave as we move away from the PC and the workstation, with a whole bunch of new technologies around to replace them. The Internet of Things is clearly part of it, and we’ll need open standards there as well, because there are so many different things and devices that if you don’t create the right standards and platform services to deal with them, it will be a mess. It’s an integral part of the Platform 3.0 wave that we’re seeing.

What is the Open Platform 3.0 Forum going to be working on over the next few months?

Understanding what this Open Platform 3.0 actually means. I think the work we’ve seen so far in the Forum really leads the way in terms of what it is, and the definitions are maturing. Andras will be adding his notion of the Internet of Things and looking at exactly what that is. Many people already have an intuitive image of it.

The second will be how we deliver value to the business, so the business scenarios are crucial to consider, to see how applicable and relevant they are to enterprises. The next thing pertains to work that still needs to be done in The Open Group as well. What would a new Open Platform 3.0 architecture look like? What are the platform services? Which ones can we start working on right now? What are the most important business scenarios, and what platform services will they require? So architectural impacts, skills impacts, security impacts—as I said, there are very few areas in IT that are not touched by it. Even the new IT4IT Forum that will be launched in October, which is all about methodologies and lifecycle, will need to consider Agile and DevOps-related methodologies, because that’s the rhythm and the pace we’ve got to expect on this third platform. So that’s the rhythm of the Work Group: definitions, business scenarios, and then you start thinking about what the platform consists of and what types of services you need to create to support it. Hopefully by then we’ll have some open standards to help accelerate that thinking and help enterprises set a course for themselves. That’s our mission as The Open Group: to help facilitate that.

Ron Tolido is Senior Vice President and Chief Technology Officer of Application Services Continental Europe, Capgemini. He is also a Director on the board of The Open Group, a blogger for Capgemini’s multiple award-winning CTO blog, and the lead author of Capgemini’s TechnoVision and the global Application Landscape Reports. As a noted Digital Transformation ambassador, Tolido speaks and writes about IT strategy, innovation, applications and architecture. Based in the Netherlands, Mr. Tolido currently takes an interest in apps rationalization, Cloud, enterprise mobility, the power of open, Slow Tech, process technologies, the Internet of Things, Design Thinking and – above all – radical simplification.



Filed under architecture, Boundaryless Information Flow™, Certifications, Cloud, digital technologies, Enterprise Architecture, Future Technologies, Information security, Internet of Things, Open Platform 3.0, Security, Service Oriented Architecture, Standards, TOGAF®, Uncategorized