Tag Archives: Internet of Things

The Emergence of the Third Platform

By Andras Szakal, Vice President and Chief Technology Officer, IBM U.S. Federal

By 2015 there will be more than 5.6 billion personal devices in use around the world. Personal mobile computing, business systems, e-commerce, smart devices and social media are generating an astounding 2.5 billion gigabytes of data per day. Non-mobile, network-enabled intelligent devices, often referred to as the Internet of Things (IoT), are poised to explode to over 1 trillion devices by 2015.

Rapid innovation and astounding growth in smart devices is driving new business opportunities and enterprise solutions. Many of these new opportunities and solutions are based on deep insight gained through analysis of the vast amount of data being generated.

The expansive growth of personal and pervasive computing power continues to drive innovation that is giving rise to a new class of systems and a pivot to a new generation of computing platform. Over the last fifty years, two generations of computing platform have dominated the business and consumer landscape. The first generation was dominated by the monolithic mainframe, while distributed computing and the Internet characterized the second generation. Cloud computing, Big Data/Analytics, the Internet of Things (IoT), mobile computing and even social media are the core disruptive technologies now converging at the crossroads from which a third generation of computing platform is emerging.

This will require new approaches to enterprise and business integration and interoperability. Industry bodies like The Open Group must help guide customers through the transition by facilitating customer requirements, documenting best practices, establishing integration standards and transforming the current approach to Enterprise Architecture to adapt to the change in how organizations will build, use and deploy the emerging third generation of computing platform.

Enterprise Computing Platforms

An enterprise computing platform provides the underlying infrastructure and operating environment necessary to support business interactions. Enterprise systems are often comprised of complex application interactions necessary to support business processes, customer interactions, and partner integration. These interactions coupled with the underlying operating environment define an enterprise systems architecture.

The hallmark of successful enterprise systems architecture is a standardized and stable systems platform. This is an underlying operating environment that is stable, supports interoperability, and is based on repeatable patterns.

Enterprise platforms have evolved from the monolithic mainframes of the 1960s and 1970s through the advent of distributed systems in the 1980s. The mainframe-based architecture represented the first true enterprise operating platform, referred to henceforth as the First Platform. The middleware-based distributed systems that followed and ushered in the dawn of the Internet represented the second iteration of platform architecture, referred to as the Second Platform.

While the creation of the Internet and the advent of web-based e-commerce are of historical significance, the underlying platform was still predominantly based on distributed architectures and therefore is not recognized as a distinct change in platform architecture. However, Internet-based e-commerce and service-based computing considerably contributed to the evolution toward the next distinct version of the enterprise platform. This Third Platform will support the next iteration of enterprise systems, which will be born out of multiple simultaneous and less obvious disruptive technology shifts.

The Convergence of Disruptive Technologies

The emergence of the third generation of enterprise platforms is manifested at the crossroads of four distinct, almost simultaneous, disruptive technology shifts: cloud computing, mobile computing, big data-based analytics and the IoT. The use of applications based on these technologies, such as social media and business-driven insight systems, has contributed to both the convergence and the rate of adoption.

These technologies are dramatically changing how enterprise systems are architected, how customers interact with business, and the rate and pace of development and deployment across the enterprise. This is forcing vendors, businesses, and governments to shift their systems architectures to accommodate integrated services that leverage cloud infrastructure, while integrating mobile solutions and supporting the analysis of the vast amount of data being generated by mobile solutions and social media. All this is happening while maintaining the integrity of the evolving business capabilities, processes, and transactions that require integration with business systems such as Enterprise Resource Planning (ERP) and Customer Relationship Management (CRM).

Cloud computing and the continued commoditization of computer storage are key facilitating elements of this convergence. Cloud computing lowers the complexity of enterprise computing through virtualization and automated infrastructure provisioning, while solid-state and software-based Internet storage has made big data practical and affordable. Cloud computing solutions continue to evolve and offer innovative services like Platform as a Service (PaaS)-based development environments that integrate directly with big data solutions. Higher density, cloud-based and solid-state storage continue to lower the cost and complexity of storage and big data solutions.

The emergence of the smartphone and enterprise mobile computing is a key impetus for the emergence of big data solutions and an explosion of innovative storage technologies. The modern mobile platform, with all its rich applications, device sensors, and access to social networks, is almost single-handedly responsible for the explosion of data and the resulting rush to provide solutions to analyze and act on the insight contained in the vast ocean of personalized information. In turn, this phenomenon has created a big data market ecosystem based on the premise that open data is the new natural resource.

The emergence of sensor-enabled smartphones has foreshadowed the potential value of making everyday devices interconnected and intelligent by adding network-based sensors that allow devices to enhance their performance by interacting with their environment, and through collaboration with other devices and enterprise systems in the IoT. For example, equipment manufacturers are using sensors to gain insight into the condition of fielded equipment. This approach reduces both the mean time to failure and pinpoints manufacturing quality issues and potential design flaws. This system of sensors also integrates with the manufacturer’s internal supply chain systems to identify needed parts, and optimizes the distribution process. In turn, the customer benefits by avoiding equipment downtime through scheduling maintenance before a part fails.
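The predictive-maintenance loop described above can be sketched in a few lines: project a wear trend from recent sensor samples and, when failure is predicted, reserve a spare part in the supply-chain system. This is only a hypothetical illustration; the function names, the wear threshold and the inventory shape are invented, not drawn from any particular manufacturer's system.

```python
from statistics import mean

# Hypothetical wear scale: 0.0 = new part, 1.0 = failed.
WEAR_LIMIT = 0.8

def maintenance_needed(wear_readings, horizon=5):
    """Flag a part for service before it fails, by projecting the wear trend
    (average change per sample) `horizon` samples into the future."""
    if len(wear_readings) < 2:
        return False
    deltas = [b - a for a, b in zip(wear_readings, wear_readings[1:])]
    projected = wear_readings[-1] + mean(deltas) * horizon
    return projected >= WEAR_LIMIT

def reserve_spare(part_id, inventory):
    """Tie the sensor insight back to the supply chain: reserve a spare
    so maintenance can be scheduled before the part fails."""
    if inventory.get(part_id, 0) > 0:
        inventory[part_id] -= 1
        return f"maintenance scheduled; spare reserved for {part_id}"
    return f"spare back-ordered for {part_id}"
```

A steadily rising wear signal triggers the order well before the limit is actually reached, which is the downtime-avoidance benefit the equipment-manufacturer scenario describes.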

Over time, the IoT will require an operating environment for devices that integrates with existing enterprise business systems. This will mean that smart devices must integrate effectively with cloud-based enterprise business systems, enterprise customer engagement systems, and the underlying big data infrastructure responsible for gleaning insight from the data this vast network of sensors will generate. While each of these disruptive technology shifts has evolved separately, they share a natural affinity for interaction, collaboration, and enterprise integration that can be used to optimize an enterprise’s business processes.

Evolving Enterprise Business Systems

Existing enterprise systems (ERP, CRM, Supply Chain, Logistics, etc.) are still essential to the foundation of a business or government and form Systems of Record (SoR) that embody core business capabilities and the authoritative processes based on master data records. The characteristics of SoR are:

  • Encompass core business functions
  • Transactional in nature
  • Based on structured databases
  • Authoritative source of information (master data records)
  • Access is regulated
  • Changes follow a rigorous governance process.

Mobile systems, social media platforms, and Enterprise Marketing Management (EMM) solutions form another class of systems called Systems of Engagement (SoE). Their characteristics are:

  • Interact with end-users through open collaborative interfaces (mobile, social media, etc.)
  • High percentage of unstructured information
  • Personalized to end-user preferences
  • Context-based analytical business rules and processing
  • Access is open and collaborative
  • Evolves quickly and according to the needs of the users.

The emergence of the IoT is embodied in a new class of system, Systems of Sensors (SoS), which includes pervasive computing and control. Their characteristics are:

  • Based on autonomous network-enabled devices
  • Devices that use sensors to collect information about the environment
  • Interconnected with other devices or enterprise engagement systems
  • Changing behavior based on intelligent algorithms and environmental feedback
  • Developed through formal product engineering process
  • Updates to device firmware follow a continuous lifecycle.
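The three characteristic lists above differ mainly in data shape, access model and rate of change. As a rough sketch (the class names follow the SoR/SoE/SoS taxonomy above, but the attributes and their values are an invented simplification, not part of any standard):

```python
from dataclasses import dataclass

# Illustrative only: an invented simplification of the SoR/SoE/SoS taxonomy.
@dataclass(frozen=True)
class SystemClass:
    name: str
    data_shape: str       # dominant form of the data the class handles
    change_cadence: str   # how releases typically flow
    access: str           # how access is controlled

SOR = SystemClass("System of Record", "structured, authoritative master data",
                  "fixed, governed release schedule", "regulated")
SOE = SystemClass("System of Engagement", "largely unstructured, personalized",
                  "continuous delivery driven by user needs", "open, collaborative")
SOS = SystemClass("System of Sensors", "sensor streams",
                  "continuous firmware lifecycle", "device-to-device and device-to-enterprise")

def delivery_model(system: SystemClass) -> str:
    """A Third Platform solution tunes its delivery model to the class of system."""
    return f"{system.name}: {system.change_cadence}"
```

Treating the class of system as an explicit attribute like this makes it easier to reason about why a single delivery or governance model cannot serve all three.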

The Third Platform

The Third Platform is a convergence of cloud computing, big data solutions, mobile systems and the IoT integrated into the existing enterprise business systems.


Figure 1: The Three Classes of Systems within the Third Platform

The successful implementation and deployment of enterprise SoR has been embodied in best practices, methods, frameworks, and techniques that have been distilled into enterprise architecture. The same level of rigor and pattern-based best practices will be required to ensure the success of solutions based on Third Platform technologies. Enterprise architecture methods and models need to evolve to include guidance, governance, and design patterns for implementing business solutions that span the different classes of system.

The Third Platform builds upon many of the concepts that originated with Service-Oriented Architecture (SOA) and dominated the closing chapter of the Second Platform era. The rise of the Third Platform provides the technology and environment to enable greater maturity of service integration within an enterprise.

The Open Group Service Integration Maturity Model (OSIMM) standard[1] provides a way in which an organization can assess its level of service integration maturity. Adoption of the Third Platform inherently addresses many of the attributes necessary to achieve the highest levels of service integration maturity defined by OSIMM. It will enable new types of application architecture that can support dynamically reconfigurable business and infrastructure services across a wide variety of devices (SoS), internal systems (SoR), and user engagement platforms (SoE).

Solution Development

These new architectures and the underlying technologies will require adjustments to how organizations approach enterprise IT governance, to lower the barrier to entry for implementing and integrating the technologies. Current adoption requires extensive expertise to implement, integrate, deploy, and maintain the systems. First market movers have shown the rest of the industry the realm of the possible, and have reaped the rewards of the early adopter.

The influence of cloud and mobile-based technologies has changed the way in which solutions will be developed, delivered, and maintained. SoE-based solutions interact directly with customers and business partners, which necessitates a continuous delivery of content and function to align with the enterprise business strategy.

Most cloud-based services employ a roll-forward test and delivery model. A roll-forward model allows an organization to address functional inadequacies and defects in almost real-time, with minimal service interruptions. The integration and automation of development and deployment tools and processes reduces the risk of human error and increases visibility into quality. In many cases, end-users are not even aware of updates and patch deployments.
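A roll-forward pipeline can be sketched as a loop that never reverts to the previous release: when a health check fails, the next fix build is shipped forward instead. The names below are hypothetical, not drawn from any particular toolchain.

```python
# Hypothetical sketch of a roll-forward delivery model: rather than rolling
# back a defective release, each defect is addressed by deploying the next
# build forward, keeping service interruptions minimal.
def deploy(version, health_check, fix_builds):
    """Deploy `version`; on a failed health check, roll forward through fixes.

    health_check: callable returning True when the deployed version is healthy.
    fix_builds: iterator of successor builds to ship, oldest fix first.
    """
    history = [version]
    current = version
    while not health_check(current):
        try:
            current = next(fix_builds)   # ship the fix forward, never revert
        except StopIteration:
            raise RuntimeError(f"no healthy build; deployed chain: {history}")
        history.append(current)
    return current, history
```

The contrast with a rollback model is the direction of travel: the deployed-version history only ever moves forward, which is why end-users often never notice the intermediate patch deployments.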

This new approach to development, deployment, and operations is referred to as DevOps – which combines development and operations tools, governance, and techniques into a single tool set and management practice. This allows the business to dictate not only the requirements, but also the rate and pace of change, aligned to the needs of the enterprise.

[1] The Open Group Service Integration Maturity Model (OSIMM), Open Group Standard (C117), published by The Open Group, November 2011; refer to: www.opengroup.org/bookstore/catalog/c117.htm


Figure 2: DevOps: The Third Platform Solution Lifecycle

The characteristics of an agile DevOps approach are:

  • Harmonization of resources and practices between development and IT operations
  • Automation and integration of the development and deployment processes
  • Alignment of governance practices to holistically address development and operations with business needs
  • Optimization of the DevOps process through continuous feedback and metrics.

In contrast to SoE, SoR have a slower velocity of delivery. Such systems are typically released on fixed, pre-planned release schedules. Their inherent stability of features and capabilities necessitates a more structured and formal development approach, which traditionally equates to fewer releases over time. Furthermore, the impact changes to SoR have on core business functionality limits the magnitude and rate of change an organization is able to tolerate. But the emergence of the Third Platform will continue to put pressure on these core business systems to become more agile and flexible in order to adapt to the magnitude of events and information generated by mobile computing and the IoT.

As the technologies of the Third Platform coalesce, organizations will need to adopt hybrid development and delivery models based on agile DevOps techniques that are tuned appropriately to the class of system (SoR, SoE or SoS) and aligned with an acceptable rate of change.

DevOps is a key attribute of the Third Platform that will shift the fundamental management structure of the IT department. The Third Platform will usher in an era where one monolithic IT department is no longer necessary or even feasible. The line between business function and IT delivery will be imperceptible as this new platform evolves. The lines of business will become intertwined with the enterprise IT functions, ultimately leading to the IT department and business capability becoming synonymous. The recent emergence of Enterprise Marketing Management organizations is an example in which the marketing capabilities and the IT delivery systems are managed by a single executive – the Enterprise Marketing Officer.

The Challenge

The emergence of a new enterprise computing platform will usher in opportunity and challenge for businesses and governments that have invested in the previous generation of computing platforms. Organizations will be required to invest in both expertise and technologies to adopt the Third Platform. Vendors are already offering cloud-based Platform as a Service (PaaS) solutions that will provide integrated support for developing applications across the three evolving classes of systems – SoS, SoR, and SoE. These new development platforms will continue to evolve and give rise to new application architectures that were unfathomable just a few years ago. The emergence of the Third Platform is sure to spawn an entirely new class of dynamically reconfigurable intelligent applications and devices where applications reprogram their behavior based on the dynamics of their environment.

Almost certainly this shift will result in infrastructure and analytical capacity that will facilitate the emergence of cognitive computing which, in turn, will automate the very process of deep analysis and, ultimately, evolve the enterprise platform into the next generation of computing. This shift will require new approaches, standards and techniques for ensuring the integrity of an organization’s business architecture, enterprise architecture and IT systems architectures.

To effectively embrace the Third Platform, organizations will need to ensure that they have the capability to deliver boundaryless systems through integrated services composed of components that span the three classes of systems. This is where communities like The Open Group can help to document architectural patterns that support agile DevOps principles and tooling as the Third Platform evolves.

Technical standardization of the Third Platform has only just begun; for example, standardization of the cloud infrastructure has only recently crystallized around OpenStack. Mobile computing platform standardization remains fragmented across many vendor offerings, even with the support of rigid developer ecosystems and open-sourced runtime environments. The standardization and enterprise support for SoS is still nascent but underway within groups like the AllSeen Alliance and The Open Group’s QLM workgroup.

Call to Action

The rate and pace of innovation, standardization, and adoption of Third Platform technologies is astonishing, but it needs guidance and input from the practitioner community. It is incumbent upon industry communities like The Open Group to address the gaps between traditional Enterprise Architecture and an approach that scales to the Internet timescales being imposed by the adoption of the Third Platform.

The question is not whether Third Platform technologies will dominate the IT landscape, but rather how quickly this pivot will occur. Along the way, the industry must apply open standards processes to guard against fragmentation into multiple incompatible technology platforms.

The Open Group has launched a new forum to address these issues. The Open Group Open Platform 3.0™ Forum is intended to provide a vendor-neutral environment where members share knowledge and collaborate to develop the standards and best practices needed to guide the evolution of Third Platform technologies and solutions. It will give organizations a place to articulate their challenges in adopting Third Platform technologies, and will coordinate standards activities that span existing Open Group Forums to ensure a coherent approach to Third Platform standardization and the development of best practices.

Innovation itself is not enough to ensure the value and viability of the emerging platform. The Open Group can play a unique role through its focus on Boundaryless Information Flow™ to facilitate the creation of best practices and integration techniques across the layers of the platform architecture.

Andras Szakal, VP and CTO, IBM U.S. Federal, is responsible for IBM’s industry solution technology strategy in support of the U.S. Federal customer. Andras was appointed IBM Distinguished Engineer and Director of IBM’s Federal Software Architecture team in 2005. He is an Open Group Distinguished Certified IT Architect, IBM Certified SOA Solution Designer and a Certified Secure Software Lifecycle Professional (CSSLP). Andras holds undergraduate degrees in Biology and Computer Science and a Master’s degree in Computer Science from James Madison University. He has been a driving force behind IBM’s adoption of government IT standards as a member of the IBM Software Group Government Standards Strategy Team and the IBM Corporate Security Executive Board focused on secure development and cybersecurity. Andras represents the IBM Software Group on the Board of Directors of The Open Group and currently holds the Chair of The Open Group Certified Architect (Open CA) Work Group. More recently, he was appointed chair of The Open Group Trusted Technology Forum and leads the development of The Open Trusted Technology Provider Framework.



IT Trends Empowering Your Business is Focus of The Open Group London 2014

By The Open Group

The Open Group, the vendor-neutral IT consortium, is hosting an event in London October 20th-23rd at the Central Hall, Westminster. The theme of this year’s event is how new IT trends are empowering improvements in business and facilitating enterprise transformation.

Objectives of this year’s event:

  • Show the need for Boundaryless Information Flow™, which would result in more interoperable, real-time business processes throughout all business ecosystems
  • Examine the use of developing technology such as Big Data and advanced data analytics in the financial services sector: to minimize risk, provide more customer-centric products and identify new market opportunities
  • Provide a high-level view of the Healthcare ecosystem that identifies entities and stakeholders which must collaborate to enable the vision of Boundaryless Information Flow
  • Detail how the growth of “The Internet of Things” with online currencies and mobile-enabled transactions has changed the face of financial services, and poses new threats and opportunities
  • Outline some of the technological imperatives for Healthcare providers, with the use of The Open Group Open Platform 3.0™ tools to enable products and services to work together and deploy emerging technologies freely and in combination
  • Describe how to develop better interoperability and communication across organizational boundaries and pursue global standards for Enterprise Architecture for all industries

Key speakers at the event include:

  • Allen Brown, President & CEO, The Open Group
  • Magnus Lindkvist, Futurologist
  • Hans van Kesteren, VP & CIO Global Functions, Shell International, The Netherlands
  • Daniel Benton, Global Managing Director, IT Strategy, Accenture

Registration for The Open Group London 2014 is open and available to members and non-members. Please register here.

Join the conversation via Twitter – @theopengroup #ogLON



The Open Group Boston 2014 Preview: Talking People Architecture with David Foote

By The Open Group

Among all the issues that CIOs, CTOs and IT departments are facing today, staffing is likely near the top of the list of what’s keeping them up at night. Sure, there’s dealing with constant (and disruptive) technological changes and keeping up with the latest tech and business trends, such as having a Big Data, Internet of Things (IoT) or a mobile strategy, but without the right people with the right skills at the right time it’s impossible to execute on these initiatives.

Technology jobs are notoriously difficult to fill–far more difficult than positions in other industries where roles and skillsets may be much more static. And because technology is rapidly evolving, the roles for tech workers are also always in flux. Last year you may have needed an Agile developer, but today you may need a mobile developer with secure coding ability and in six months you might need an IoT developer with strong operations or logistics domain experience—with each position requiring different combinations of tech, functional area, solution and “soft” skillsets.

According to David Foote, IT Industry Analyst and co-founder of IT workforce research and advisory firm Foote Partners, the mash-up of HR systems and ad hoc people management practices most companies have been using for years to manage IT workers has become frighteningly ineffective. He says that to cope in today’s environment, companies need to architect their people infrastructure in much the same way they have been architecting their technical infrastructure.

“People Architecture” is the term Foote has coined to describe the application of traditional architectural principles and practices that may already be in place elsewhere within an organization and applying them to managing the IT workforce. This includes applying such things as strategy and capability roadmaps, phase gate blueprints, benchmarks, performance metrics, governance practices and stakeholder management to human capital management (HCM).

HCM components for People Architecture typically include job definition and design, compensation, incentives and recognition, skills demand and acquisition, job and career paths, professional development and work/life balance.

Part of the dilemma for employers right now, Foote says, is that there is very little job title standardization in the marketplace and too many job titles floating around IT departments today. “There are too many dimensions and variability in jobs now that companies have gotten lost from an HR perspective. They’re unable to cope with the complexity of defining, determining pay and laying out career paths for all these jobs, for example. For many, serious retention and hiring problems are showing up for the first time. Work-around solutions used for years to cope with systemic weaknesses in their people management systems have stopped working,” says Foote. “Recruiters start picking off their best people and candidates are suddenly rejecting offers and a panic sets in. Tensions are palpable in their IT workforce. These IT realities are pervasive.”

Twenty-five years ago, Foote says, defining roles in IT departments was easier. But then the Internet exploded and technology became far more customer-facing, shifting basic IT responsibilities from highly technical people deep within companies to roles requiring more visibility and transparency within and outside the enterprise. Large chunks of IT budgets moved into the business lines while traditional IT became more of a business itself.

According to Foote, IT roles became siloed not just by technology but by functional areas such as finance and accounting, operations and logistics, sales, marketing and HR systems, and by industry knowledge and customer familiarity. Then the IT professional services industry rapidly expanded to compete with their customers for talent in the marketplace. Even the architect role changed: an Enterprise Architect today can specialize in applications, security or data architecture among others, or focus on a specific industry such as energy, retail or healthcare.

Foote likens the fragmentation of IT jobs and skillsets that’s happening now to the emergence of IT architecture 25 years ago. Just as technical architecture practices emerged to help make sense of the disparate systems rapidly growing within companies and how best to determine the right future tech investments, a people architecture approach today helps organizations better manage an IT workforce spread through the enterprise with roles ranging from architects and analysts to a wide variety of engineers, developers and project and program managers.

“Technical architecture practices were successful because—when you did them well—companies achieved an understanding of what they have systems-wise and then connected it to where they were going and how they were going to get there, all within a process inclusive of all the various stakeholders who shared the risk in the outcome. It helped clearly define enterprise technology capabilities and gave companies more options and flexibility going forward,” according to Foote.

“Right now employers desperately need to incorporate in human capital management systems and practice the same straightforward, inclusive architecture approaches companies are already using in other areas of their businesses. This can go a long way toward not just lessening staffing shortages but also executing more predictably and being more agile in face of constant uncertainties and the accelerating pace of change. Ultimately this translates into a more effective workforce whether they are full-timers or the contingent workforce of part-timers, consultants and contractors.

“It always comes down to your people. That’s not a platitude but a fact,” insists Foote. “If you’re not competitive in today’s labor marketplace and you’re not an employer where people want to work, you’re dead.”

One industry that he says has gotten it right is the consulting industry. “After all, their assets walk out the door every night. Consulting groups within firms such as IBM and Accenture have been good at architecting their staffing because it’s their job to get out in front of what’s coming technologically. Because these firms must anticipate customer needs before they get the call to implement services, they have to be ahead of the curve in already identifying and hiring the bench strength needed to fulfill demand. They do many things right to hire, develop and keep the staff they need in place.”

Unfortunately, many companies take too much of a just-in-time approach to their workforce, so they are always managing staffing from a position of scarcity rather than looking ahead, Foote says. But this is changing, in part because companies are tired of never having the people they need and of not being able to execute predictably.

The key is to put a structure in place that addresses a strategy around what a company needs and when. This applies not just to the hiring process, but also to compensation, training and advancement.

“Architecting anything allows you to be able to, in a more organized way, be more agile in dealing with anything that comes at you. That’s the beauty of architecture. You plan for the fact that you’re going to continue to scale and continue to change systems, the world’s going to continue to change, but you have an orderly way to manage the governance, planning and execution of that, the strategy of that and the implementation of decisions knowing that the architecture provides a more agile and flexible modular approach,” he said.

Foote says organizations such as The Open Group can lend themselves to facilitating People Architecture in a couple of different ways: first, through extending the principles of architecture to human capital management, and second, through vendor-independent, expertise- and experience-driven certifications, such as TOGAF® or Open CA and Open CITS, that help companies define core competencies for people and that provide opportunities for training and career advancement.

“I’m pretty bullish on many vendor-independent certifications in general, particularly where a defined book of knowledge exists that’s achieved wide acceptance in the industry. And that’s what you’ve got with The Open Group. Nobody’s challenging the architectural framework supremacy of TOGAF that I’m aware of. In fact, large vendors with their own certifications participated actively in developing the framework and applying it very successfully to their business models,” he said.

Although the process of implementing People Architecture can be difficult and may take several years to master (much like Enterprise Architecture), Foote says it is making a huge difference for companies that implement it.

To learn more about People Architecture and models for implementing it, plan to attend Foote’s session at The Open Group Boston 2014 on Tuesday July 22. Foote’s session will address how architectural principles are being applied to human capital so that organizations can better manage their workforces from hiring and training through compensation, incentives and advancement. He will also discuss how career paths for EAs can be architected. Following the conference, the session proceedings will be available to Open Group members and conference attendees at www.opengroup.org.

Join the conversation – #ogchat #ogBOS

David Foote is an IT industry research pioneer, innovator, and one of the most quoted industry analysts on global IT workforce trends and the many facets of the human side of technology value creation. His two decades of groundbreaking research and analysis of IT-business cross-skilling and technology/business management integration, along with his industry-leading work in innovative IT skills demand and compensation benchmarking, have earned him a place on a short list of thought leaders in IT human capital management.

A former Gartner and META Group analyst, David leads the research and analytical practice groups at Foote Partners that reach 2,300 customers on six continents.

1 Comment

Filed under architecture, Conference, Open CA, Open CITS, Professional Development, Standards, TOGAF®, Uncategorized

The Onion & The Open Group Open Platform 3.0™

By Stuart Boardman, Senior Business Consultant, KPN Consulting, and Co-Chair of The Open Group Open Platform 3.0™


The onion is widely used as an analogy for complex systems – from IT systems to mystical world views.

It’s a good analogy. From the outside it’s a solid whole but each layer you peel off reveals a new onion (new information) underneath.

And a slice through the onion looks quite different from the whole…

What (and how much) you see depends on where and how you slice it.

The Open Group Open Platform 3.0™ is like that. Use-cases for Open Platform 3.0 reveal multiple participants and technologies (Cloud Computing, Big Data Analytics, Social networks, Mobility and The Internet of Things) working together to achieve goals that vary by participant. Each participant’s goals represent a different slice through the onion.

The Ecosystem View
We commonly use the idea of peeling off layers to understand large ecosystems, which could be Open Platform 3.0 systems like the energy smart grid but could equally be the workings of a large cooperative or the transport infrastructure of a city. We want to know what is needed to keep the ecosystem healthy and what the effects could be of the actions of individuals on the whole and therefore on each other. So we start from the whole thing and work our way in.


The Service at the Centre of the Onion

If you’re the provider or consumer (or both) of an Open Platform 3.0 service, you’re primarily concerned with your slice of the onion. You want to be able to obtain and/or deliver the expected value from your service(s). You need to know as much as possible about the things that can positively or negatively affect that. So your concern is not the onion (ecosystem) as a whole but your part of it.

Right in the middle is your part of the service. The first level out from that consists of other participants with whom you have a direct relationship (contractual or otherwise). These are the organizations that deliver the services you consume directly to enable your own service.

One level out from that (level 2) are participants with whom you have no direct relationship but on whose services you are still dependent. It’s common in Platform 3.0 that your partners too will consume other services in order to deliver their services (see the use cases we have documented). You need to know as much as possible about this level, because whatever happens here can have a positive or negative effect on you.

One level further from the centre we find indirect participants who don’t necessarily deliver any part of the service but whose actions may well affect the rest. They could just be indirect materials suppliers. They could also be part of a completely different value network in which your level 1 or 2 “partners” participate. You can’t expect to understand this level in detail, but you know that how that value network performs can affect your partners’ strategy or even their very existence. The knock-on impact on your own strategy can be significant.

We can conceive of more levels but pretty soon a law of diminishing returns sets in. At each level further from your own organization you will see less detail and more variety. That in turn means that there will be fewer things you can actually know (with any certainty) and not much more that you can even guess at. That doesn’t mean that the ecosystem ends at this point. Ecosystems are potentially infinite. You just need to decide how deep you can usefully go.
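One way to picture these levels is as distance in a dependency graph: level 1 is whoever you consume from directly, level 2 is whoever they consume from, and so on. A minimal sketch in Python, with an entirely invented value network (all names are illustrative, not from any Open Platform 3.0 use case):

```python
from collections import deque

# Hypothetical value network: each participant maps to the services it
# consumes directly. All names are invented for illustration.
depends_on = {
    "my_service": ["payments", "hosting"],
    "payments": ["card_network"],
    "hosting": ["power_grid"],
    "card_network": ["power_grid"],
    "power_grid": [],
}

def onion_levels(graph, centre):
    """Group participants by distance (level) from the centre via breadth-first search."""
    levels = {centre: 0}
    queue = deque([centre])
    while queue:
        node = queue.popleft()
        for dep in graph.get(node, []):
            if dep not in levels:            # first visit = shortest distance
                levels[dep] = levels[node] + 1
                queue.append(dep)
    return levels

print(onion_levels(depends_on, "my_service"))
```

Here the power grid sits at level 2: no direct relationship with "my_service", yet still a dependency — exactly the kind of participant the text says you need to know about.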

Limits of the Onion
At a certain point one hits the limits of an analogy. If everybody sees their own organization as the centre of the onion, what we actually have is a bunch of different, overlapping onions.


And you can’t actually make onions overlap, so let’s not take the analogy too literally. Just keep it in mind as we move on. Remember that our objective is to ensure the value of the service we’re delivering or consuming. What we need to know therefore is what can change that’s outside of our own control and what kind of change we might expect. At each visible level of the theoretical onion we will find these sources of variety. How certain of their behaviour we can be will vary – with a tendency to the less certain as we move further from the centre of the onion. We’ll need to decide how, if at all, we want to respond to each kind of variety.

But that will have to wait for my next blog.

Stuart Boardman is a Senior Business Consultant with KPN Consulting, where he leads the Enterprise Architecture practice and consults with clients on Cloud Computing, Enterprise Mobility and The Internet of Everything. He is Co-Chair of The Open Group Open Platform 3.0™ Forum and was Co-Chair of the Cloud Computing Work Group’s Security for the Cloud and SOA project, and a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by KPN, the Information Security Platform (PvIB) in The Netherlands and his previous employer CGI, as well as several Open Group white papers, guides and standards. He is a frequent speaker at conferences on the topics of Open Platform 3.0 and Identity.

2 Comments

Filed under Cloud, Cloud/SOA, Conference, Enterprise Architecture, Open Platform 3.0, Service Oriented Architecture, Standards, Uncategorized

Future Technologies

By Dave Lounsbury, The Open Group

The Open Group is looking toward the future – what will happen in the next five to ten years?

Those who know us think of The Open Group as being all about consensus, creating standards that are useful to the buy and supply side by creating a stable representation of industry experience – and they would be right. But in order to form this consensus, we must keep an eye on the horizon to see if there are areas we should be talking about now. The Open Group needs to keep its eyes on the future in order to keep pace with businesses looking to gain advantage by incorporating emerging technologies. According to the McKinsey Global Institute[1], “leaders need to plan for a range of scenarios, abandoning assumptions about where competition and risk could come from and not to be afraid to look beyond long-established models.”

To make sure we have this perspective, The Open Group has started a series of Future Technologies workshops. We initiated this at The Open Group Conference in Philadelphia with the goal of identifying emerging business and technical trends that change the shape of enterprise IT.  What are the potential disruptors? How should we be preparing?

As always at The Open Group, we look to our membership to guide us. We assembled a fantastic panel of experts on the topic who offered up insights into the future:

  • Dr. William Lafontaine, VP High Performance Computing, Analytics & Cognitive Markets at IBM Research: Global Technology Outlook 2013.
  • Mike Walker, Strategy and Enterprise Architecture Advisor at HP: An Enterprise Architect’s Journey to 2020.

If you were not able to join us in Philadelphia, you can view the Livestream session on-demand.

Dr. William Lafontaine shared aspects of the company’s Global Technology Outlook 2013, naming the top trends that the company is keeping top of mind, starting with a confluence of social, mobile analytics and cloud.

According to Lafontaine and his colleagues, businesses must prepare for not “mobile also” but “mobile first.” In fact, there will be companies that will exist in a mobile-only environment.

  • Growing scale/lower barrier of entry – More data is being created, but more people are also able to create ways of taking advantage of this data, such as companies that excel at personal interfaces. Multimedia analytics will become a growing concern for businesses that will be receiving swells of video and image information.
  • Increasing complexity – The confluence of Social, Mobile, Cloud and Big Data/Analytics will result in masses of data coming from newer, more “complex” places, such as scanners, mobile devices and other “Internet of Things” sources. Yet these complex and varied streams of data are more consumable and have an end product that is more easily delivered to clients or users. Smaller businesses are also moving closer toward enterprise complexity. For example, when you swipe your credit card, you may also be shown additional purchasing opportunities based on your past spending habits – from alerts to nearby coffee shops that serve your favorite tea to local bookstores that sell mysteries or your favorite genre.
  • Fast pace – According to Lafontaine, ideas will come to market faster than ever. He introduced the concept of the Minimum Buyable Product: taking an idea (sometimes barely formed) to inventors to test its capabilities and evaluate it as quickly as possible. Processes that once took months or years can now take weeks. Lafontaine used the MOOC innovator Coursera as an example: eighteen months ago it had no clients and existed in zero countries; now it serves over 4 million students in over 29 countries around the world. Deployment of open APIs will become a strategic tool for the creation of value.
  • Contextual overload – Businesses have more data than they know what to do with: our likes and dislikes, how we like to engage with our mobile devices, our ages and our locations, along with traditional data of record. Over the next five years, businesses will be attempting to make sense of it.
  • Machine learning – Cognitive systems will form the “third era” of computing. We will see businesses using machines capable of complex reasoning and interaction to extend human cognition. Examples include a “medical sieve” for medical-imaging diagnosis, systems used by legal firms to suggest defense/prosecution arguments, and next-generation call centers.
  • IT shops need to be run as a business – Mike Walker spoke about how the business of IT is fundamentally changing and how end-consumers are driving corporate behaviors. Expectations have changed and the bar has been raised. The tolerance for failure is low and getting lower. It is no longer acceptable to tell end-consumers that they will receive the latest product in a year. Because customers want their products faster, EAs and businesses will have to react in creative ways.
  • Build a BRIC house – According to Forrester, $2.1 trillion will be spent on IT in 2013, with “apps and the US leading the charge.” Walker emphasized the importance of building information systems, products and services that support the BRIC areas of the world (Brazil, Russia, India and China), since they comprise nearly a third of global GDP. Hewlett-Packard is banking big on “The New Style of IT”: Cloud, risk management and security, and information management. This is the future of business and IT, says Meg Whitman, CEO and President of HP; all of the company’s products and services presently pivot around these three concepts.
  • IT is the business – Gartner found that 67% of all EA organizations are either starting (39%), restarting (7%) or renewing (21%). There’s a shift away from legacy EA, with 80% of organizations focused on how they can leverage EA to align business and IT standards (25%), deliver strategic business and IT value (39%) or enable major business transformation (16%).
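The card-swipe scenario in the list above is, at its core, a match between a customer's purchase history and nearby merchants. A toy Python sketch of that idea – all names, data and the `top_n` cutoff are invented for illustration, not any vendor's recommendation engine:

```python
from collections import Counter

# Invented customer history and merchant data for illustration only.
purchase_history = ["tea", "books", "tea", "groceries", "tea", "books"]
nearby_merchants = [
    {"name": "Corner Coffee & Tea", "category": "tea"},
    {"name": "Mystery Books", "category": "books"},
    {"name": "City Hardware", "category": "hardware"},
]

def offers_at_swipe(history, merchants, top_n=2):
    """At card swipe, suggest nearby merchants matching the customer's favorite categories."""
    favorites = [cat for cat, _ in Counter(history).most_common(top_n)]
    return [m["name"] for m in merchants if m["category"] in favorites]

print(offers_at_swipe(purchase_history, nearby_merchants))
# ['Corner Coffee & Tea', 'Mystery Books']
```

The point of the sketch is how little "enterprise complexity" the core join actually requires – the hard part in practice is the streaming data and context (location, timing) feeding it.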

Good as these views are, they represent only two data points on a line that The Open Group wants to draw out toward the end of the decade. So we will continue these Future Technologies sessions to gather additional views, with the next session being held at The Open Group London Conference in October. Please join us there! We’d also like your input on this blog. Please post your thoughts on:

  • Perspectives on what business and technology trends will impact IT and EA in the next 5-10 years
  • Points of potential disruption – what will change the way we do business?
  • What actions should we be taking now to prepare for this future?

[1] McKinsey Global Institute, Disruptive technologies: Advances that will transform life, business, and the global economy. May 2013

Dave Lounsbury is The Open Group’s Chief Technology Officer, previously VP of Collaboration Services. Dave holds three U.S. patents and is based in the U.S.

1 Comment

Filed under Cloud, Enterprise Architecture, Future Technologies, Open Platform 3.0

Three laws of the next Internet of Things – the new platforming evolution in computing

By Mark Skilton, Global Director at Capgemini

There is a wave of new devices and services growing in strength, extending the boundary of what is possible in today’s internet-driven economy and lifestyle. A striking feature is the link between apps on smartphones and tablets and the ability to connect not just to websites but also to data-collection sensors and to intelligent analysis of that information. A key driver has been the improvement in the cost-performance curve of information technology, and not just in CPU and storage: the easy availability and affordability of highly powerful computing and mass storage in mobile devices, coupled with access to complex sensors, advanced optics and screen displays, results in a potentially truly immersive experience. This is a long way from the early days of the radio-frequency identity tags that are the forerunner of this evolution. Digitization of information and the interpretation of its meaning is everywhere, moving into a range of industries and augmented services that create new possibilities and value. A key challenge is how to understand this growth of devices, sensors, content and services across the myriad of platforms and permutations it can bring.

  • Energy conservation – through home and building energy management.

  • Lifestyle activity – motion-sensor accelerometers, ambient light sensors, moisture sensors, gyroscopes, proximity sensors.

  • Lifestyle health – heart rate, blood oxygen levels, respiratory rate and heart rate variability for cardiorespiratory monitoring are some of the potential uses for connected devices.

  • Medical health – biomedical sensing for patient care and elderly care management: heart, lung and kidney dialysis, medical valve and organ implants, orthopaedic implants and brain-image scanning. Devices can monitor elderly physical activity, blood pressure and other factors unobtrusively and proactively. These aim to drive improvements in prevention, testing, early detection, surgery and treatment, helping improve quality of life and address rising medical costs and the societal impact of an aging population.

  • Transport – precision global positioning, local real-time image perception and interpretation sensing, dynamic electromechanical control systems.

  • Materials science, engineering and manufacturing – strain gauges, stress sensors, precision lasers, micro- and nanoparticle engineering, cellular manipulation, gene splicing. 3D printing has the potential to revolutionize automated manufacturing, and through distributed services over the internet, manufacturing can potentially be accessed by anyone.

  • Physical safety and security – controlling children’s access to their mobile phones via your PC is an example of parental protection, using web-based applications to monitor and control mobile and computing access. Another is keyless entry using your phone: Wi-Fi, Bluetooth and internet network apps and devices that automate the locking of physical doors and entry, remotely or in proximity.

  • Remote activity and swarming robotics – the development of autonomous robotics to respond and support exploration and services in harsh or inaccessible environments; disabled support through robotic prosthetics and communication synthesis; swarming robots that fly or mimic group behavior and natural decision-making.

These are just the tip of what is possible: the early commercial ventures that are starting to drive new ways to think about information technology and application services.

A key feature I noticed in all these devices is that they augment previous layers of technology by sitting on top of them and adding extra value. While the long shadow of the first-generation giants of the public internet (Apple, Google, Amazon) often gives the impression that succeeding means a controlled platform and an investment of millions, these new technologies use existing infrastructure and operate across a federated, distributed architecture that represents a new kind of platforming paradigm of multiple systems.

Perhaps a paradigm of new technology cycles is that as new technology arrives it cannibalizes older technologies. Clearly nothing is immune to this trend, not even the cloud. I’ll call it the evolution of a kind of technology law (a feature I saw in Charles Fine’s book Clockspeed, http://www.businessforum.com/clockspeed.html, but adapted here as a function of compound cannibalization and augmentation). I think Big Data is an example of such a shift in this direction, as augmented informatics enables major next-generation plays for added-value services.

These devices and sensors can work with existing infrastructure services and resources, but they also create a new kind of computing architecture that involves many technologies, standards and systems: what was in early times called “system of systems” integration (examples are seen in the defence sector, http://www.bctmod.army.mil/SoSI/sosi.html, and in digital ecosystems in the government sector, http://www.eurativ.com/specialreport-skills/kroes-europe-needs-digital-ecosy-interview-517996).

While a sensor device can replace the existing thermostat in your house, or the lighting, or the access locks on your doors, it offers a new kind of augmented experience, providing information and insight that enable better control of the wider environment or of the actions and decisions within a context.

This leads to a second feature of these devices: the ability to learn and adapt from their inputs and environment. This probably has an even larger impact than the first (augmenting infrastructure), because the ability to change outcomes is a revolution in information. The previous idea of static information, and human sense-making of that data, is being replaced by the active pursuit of automated intelligence from the machines we build. Earlier design paradigms that defined declarative services – what IT calls CRUD (Create, Read, Update, Delete) – as predefined, managed transactions are being replaced by machine learning algorithms that seek to build a second generation of intelligent services, services that alter their results with the passage of time and usage characteristics.
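The contrast between predefined CRUD transactions and services that adapt with usage can be sketched in a few lines. This is a minimal illustration only, not any vendor's implementation; the exponential-moving-average learning rule and all names (`StaticSetpoint`, `AdaptiveSetpoint`, `alpha`) are assumptions for the example:

```python
class StaticSetpoint:
    """CRUD-style: the stored value only changes via an explicit Update."""
    def __init__(self, celsius):
        self.celsius = celsius

    def read(self):
        return self.celsius


class AdaptiveSetpoint:
    """Learns a setpoint from observed user adjustments (exponential moving average)."""
    def __init__(self, celsius, alpha=0.3):
        self.celsius = celsius
        self.alpha = alpha  # weight given to each new observation

    def observe(self, user_choice):
        # Blend each new observation into the learned setpoint
        self.celsius = (1 - self.alpha) * self.celsius + self.alpha * user_choice

    def read(self):
        return self.celsius


static = StaticSetpoint(20.0)
adaptive = AdaptiveSetpoint(20.0)
for choice in [22.0, 22.5, 22.0]:   # the user keeps nudging the dial upward
    adaptive.observe(choice)

print(static.read())              # stays at 20.0 until explicitly updated
print(round(adaptive.read(), 2))  # has drifted toward the user's preference
```

The CRUD service returns exactly what was written; the adaptive one changes its answer "with the passage of time and usage characteristics", which is the shift the paragraph describes.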

This leads me to a third effect, which became apparent in the discussion of lifestyle services versus medical and active device management. In the case of lifestyle devices, a key feature is the ability to blend in with personal activity to enable new insight into behavior and lifestyle choices – to passively and actively monitor or take action, and not always to affect the behavior itself. That is, to provide an unobtrusive, ubiquitous presence. Taking this idea further, it is also about the way devices could merge into, and become integrated within, the context of the user or environmental setting. Biomedical devices that augment patient care and wellbeing are one example that can have a real and substantive impact on quality of life, as well as on the cost efficiency of care programs with an aging population to support.

An interesting side effect of these trends is the cultural dilemma these devices and sensors bring through the intrusion into personal data and privacy. Yet once the meaning and value of this telemetry on safety, health or material factors is perceived to be for the good of the individual and community, the adoption of such services may become more pronounced and reinforced: a virtuous circle of accelerated adoption, a key characteristic of successful growth, and a kind of conditioning feedback that creates positive reinforcement. A key feature underpinning all of this is the ability of the device and sensor to have an unobtrusive, ubiquitous presence – an overall effect central to the idea of effective system-of-systems integration and Boundaryless Information Flow™ (The Open Group).

I see these trends as three laws of the next Internet of Things, describing a next-generation platforming strategy and evolution.

It’s clear that sensors and devices are merging in a way that will cut across from one industry to another. Motion and temperature sensors in one industry will see application in another. Services from one industry may connect with other industries as combinations of services, lifestyles and effects.


Formal and informal communities, both physical and virtual, will be connected through sensors and devices that pervade the social, technological and commercial environments. This will drive further growth in the mass of data and digitized information, with the gradual semantic representation of this information into meaningful context. App services will develop increasing intelligence and awareness of the multiplicity of data, its content and metadata, adding new insight and services to the infrastructure fabric. This is a new platforming paradigm that may be constructed from one or many systems and architectures, from macro- to micro- and nano-level systems technologies.

The three laws as I describe them may be recast in a lighter, tongue-in-cheek way by comparing them to Isaac Asimov’s famous three laws of robotics. This is just an illustration, but it implies that the sequence of laws in some fashion protects users, resources and the environment out of an altruistic motive. This may be the case in some system feedback loops that seek this goal, but commercial microeconomic considerations are often more the driver. However, I can’t help thinking that this hints at what may be the first stepping stone toward the eventuality of such laws.

Three laws of the next generation of The Internet of Things – a new platforming architecture

Law 1. A device, sensor or service may operate in an environment if it can augment infrastructure.

Law 2. A device, sensor or service must be able to learn and adapt its response to the environment, as long as this is not in conflict with the First Law.

Law 3. A device, sensor or service must have an unobtrusive, ubiquitous presence, such that it does not conflict with the First or Second Laws.
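Read literally, the three laws suggest an ordered admission check: each law only comes into play if the earlier ones hold. A tongue-in-cheek sketch in the same spirit – the `Device` fields and the check itself are invented for illustration, not part of any standard:

```python
from dataclasses import dataclass

@dataclass
class Device:
    augments_infrastructure: bool  # Law 1
    can_learn_and_adapt: bool      # Law 2
    unobtrusive: bool              # Law 3

def admit(device):
    """Return the first law the device fails, or None if it satisfies all three.

    The laws are checked in order, mirroring their precedence: a later law
    only matters once the earlier ones hold."""
    if not device.augments_infrastructure:
        return "fails Law 1: must augment infrastructure"
    if not device.can_learn_and_adapt:
        return "fails Law 2: must learn and adapt"
    if not device.unobtrusive:
        return "fails Law 3: must be unobtrusively, ubiquitously present"
    return None

thermostat = Device(True, True, True)
dumb_beacon = Device(True, False, True)
print(admit(thermostat))   # None: admitted
print(admit(dumb_beacon))  # fails Law 2
```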

References

  • Energy conservation – the Nest learning thermostat (http://www.nest.com), from the company founded by Tony Fadell, ex-iPod hardware designer and head of the iPod and iPhone division at Apple. The device monitors and learns about energy usage in a building and adapts and controls the use of energy for improved carbon and cost efficiency.

  • Lifestyle activity – motion-sensor accelerometers, ambient light sensors, moisture sensors, gyroscopes, proximity sensors. Examples include UP by Jawbone (https://jawbone/up) and Fitbit (http://www.fitbit.com).

  • Lifestyle health – heart rate, blood oxygen levels, respiratory rate and heart rate variability for cardiorespiratory monitoring are some of the potential uses for connected devices such as Zensorium (http://www.zensorium.com).

  • Medical health – biomedical sensing for patient care and elderly care management: heart, lung and kidney dialysis, medical valve and organ implants, orthopaedic implants and brain-image scanning. Devices can monitor elderly physical activity, blood pressure and other factors unobtrusively and proactively (http://www.nytimes.com/2010/07/29/garden/29parents.html?pagewanted-all). These aim to drive improvements in prevention, testing, early detection, surgery and treatment, helping improve quality of life and address rising medical costs and the societal impact of an aging population.

  • Transport – precision global positioning, local real-time image perception and interpretation sensing, dynamic electromechanical control systems. Examples include Toyota advanced IT systems that help drivers avoid road accidents (http://www.toyota.com/safety/) and the Google driverless car (http://www.forbes.com/sites/chenkamul/2013/01/22/fasten-your-seatbelts-googles-driverless-car-is-worth-trillions/).

  • Materials science, engineering and manufacturing – strain gauges, stress sensors, precision lasers, micro- and nanoparticle engineering, cellular manipulation, gene splicing. 3D printing has the potential to revolutionize automated manufacturing, and through distributed services over the internet, manufacturing can potentially be accessed by anyone.

  • Physical safety and security – Alpha Blue (http://www.alphablue.co.uk): controlling children’s access to their mobile phones via your PC is an example of parental protection, using web-based applications to monitor and control mobile and computing access. Keyless entry using your phone: Wi-Fi, Bluetooth and internet network apps and devices to automate the locking of physical doors and entry, remotely or in proximity; examples include Lockitron (https://www.lockitron.com).

  • Remote activity and swarming robotics – the development of autonomous robotics to respond and support exploration and services in harsh or inaccessible environments. Examples include the NASA Mars Curiosity rover, whose active control programs determine remote actions on the red planet; with a one-way signal delay of 13 minutes 48 seconds (at EDL), it takes approximately 30 minutes to detect and perhaps react to an event remotely from Earth (http://blogs.eas.int/mex/2012/08/05/time-delay-betrween-mars-and-earth/, http://www.nasa.gov/mission_pages/mars/main/imdex.html). Disabled support through robotic prosthetics and communication synthesis (http://disabilitynews.com/technology/prosthetic-robotic-arm-can-feel/). Swarming robots that fly or mimic group behavior: University of Pennsylvania (http://www.reuters.com/video/2012/03/20/flying-robot-swarms-the-future-of-search?videoId-232001151) and the Natural Robotics Lab, The University of Sheffield, UK (http://www.sheffield.ac.uk/news/nr/sheffield-centre-robotic-gross-natural-robotics-lab-1.265434).

Mark Skilton is Global Director for Capgemini, Strategy CTO Group, Global Infrastructure Services. His role includes strategy development, competitive technology planning including Cloud Computing and on-demand services, global delivery readiness, and the creation of Centers of Excellence. He is currently author of the Capgemini University Cloud Computing Course and is responsible for Group Interoperability strategy.

1 Comment

Filed under Cloud, Cloud/SOA, Conference, Data management, Platform 3.0

The era of “Internet aware systems and services” – the multiple-data, multi-platform and multi-device and sensors world

By Mark Skilton, Global Director at Capgemini

Communications + Data protocols and the Next Internet of Things Multi-Platform solutions

Much of the discussion of the “Internet of Things” has been around industry-sector examples of the use of device and sensor services; examples of these are listed at the end of this paper. What is central to this emerging trend is not just sector point solutions but three key technical issues driving a new industry-sector digital services strategy to bring these together into a coherent whole.

  1. How combinations of system technology platforms are converging, enabling composite business processes that are mobile, content- and transaction-rich, with near-real-time persistence and interactivity
  2. The development of emerging “non-web-browser” protocols for new sensor-driven machine data, extending new types of data into internet-connected business and social integration
  3. The development of “connected systems” that move solutions toward a new digital-services model of multiple services across platforms, creating new business and technology services

I want to illustrate this by focusing on three topics:  multi-platforming strategies, communication protocols and examples of connected systems.

I want to show that this is not the simple “three- or four-step model” I often see, in which mobile plus applications plus Cloud equals a solution but results in silos of data and platform-integration challenges. New processing methods for big data platforms, distributed stream computing and in-memory database services, for example, are changing the nature of business analytics, in particular marketing and sales strategic planning and insight. New feedback systems collecting social and machine-learning data are creating new types of business growth opportunity in context-aware services that augment skills and services.
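The stream computing mentioned above differs from batch analytics in that each event updates an aggregate as it arrives, rather than being stored first and queried later. A minimal single-node sketch of that idea – window size, event shape and all names are invented for illustration:

```python
from collections import deque

class SlidingWindowAverage:
    """Maintain the average of the most recent `size` events as they arrive."""
    def __init__(self, size):
        self.window = deque(maxlen=size)  # oldest events fall off automatically

    def push(self, value):
        self.window.append(value)
        return self.average()

    def average(self):
        return sum(self.window) / len(self.window)

# e.g. basket values streaming in from point-of-sale terminals (invented data)
stream = [12.0, 18.0, 30.0, 24.0, 6.0]
agg = SlidingWindowAverage(size=3)
for value in stream:
    current = agg.push(value)  # the aggregate is updated on every event

print(current)  # average of the last 3 events: (30 + 24 + 6) / 3 = 20.0
```

Real distributed stream platforms add partitioning, fault tolerance and time-based windows, but the core inversion is the same: the computation comes to the data as it flows.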

The major solutions in the digital ecosystem today incorporate an ever-growing mix of devices and platforms that offer new user experiences and forms of organization. This can be seen across almost all industry sectors and horizontally between them. The following diagram is a simplistic view I want to use to illustrate the fundamental structures that are forming.


Multiple devices that offer simple-to-complex visualization and on-board application services.

Multiple sensors that can economically detect, measure and monitor most physical phenomena – light, heat, energy, chemical, radiological – in both non-biological and biological systems.

Physical and virtual communities of formal and informal relationships: human- and/or machine-based associations that search for and discover data and resources, and that can now work autonomously across an internet of many different types of data.

Physical and virtual infrastructure representing the servers, storage, databases, networks and other resources that can constitute one or more platforms and environments. This infrastructure is now more complex in that it is both distributed and federated across multiple domains: mobile platforms, cloud computing platforms, social network platforms, big data platforms and embedded sensor platforms. The sense of a single infrastructure is both correct and incorrect, in that it is a combined state and set of resources that may or may not be within the span of control of an individual or organization.

Single and multi-tenanted application services that operate in transactional, semi- or non-deterministic ways, driving logical processing, formatting, interpretation, computation and other processing of data and results across one-to-many, many-to-one or many-to-many platforms and endpoints.

The key to thinking in multiple platforms is to establish the context of how these fundamental forces of platform services are driving interactions for many industries, businesses, and social networks and services. This is changing because they are interconnected, altering the very basis of what defines a platform – from a single-platform to a multiple-platform concept.

This diagram illustrates some of these relationships and arrangements. It is just one example of a digital ecosystem pattern; these system use cases can be arranged in other ways to meet different needs and outcomes.

I use this model to illustrate some of the key digital strategies to consider when empowering communities, driving value-for-money strategies, or establishing a joined-up device and sensor strategy for new mobile knowledge workers. This is particularly relevant to the decision-making processes of key business stakeholders, from Sales, Marketing, Procurement, Design, Sourcing, Supply and Operations up to board level, as well as to IT strategy, service integration and engineering.

Taking one key stakeholder as an example: the Chief Marketing Officer (CMO) is central to strategic channel and product development and to brand management. The CMO typically seeks to develop customer zones, supplier zones, marketplace trading communities, social networking communities and behavioral insight leadership. These are critical drivers of a successful company presence, of product and service brand and market growth, and of managing and aligning IT cost and spend to what the business needs to perform. This creates a new kind of digital marketing infrastructure that drives new customer and marketing value. The following diagram illustrates types of marketing services that raise questions over the kinds of platforms needed for single and multiple data sources, data quality and data fidelity.

These interconnected issues affect the efficacy and relevance of marketing services, which must work at the speed, timeliness and point of contact necessary to add and create customer and stakeholder value.

What all these newly converged technologies have in common is communication: not just HTTP-based protocols, but a wider band of frequencies and standards that are blurring the boundaries of what can be connected.

These protocols include Wi-Fi and other wireless systems and standards that operate not only in the voice band but also in the collection and use of other types of telemetry, relating to other kinds of senses and detectors.

All of these share common issues: device and sensor compatibility, discovery and pairing, and security compatibility and controls.

Examples of communication standards for multiple services:

  • Wireless: WLAN, Bluetooth, ZigBee, Z-Wave, Wireless USB
  • Proximity: smartcard (passive, active), vicinity card
  • Infrared: IrDA
  • Satellite: GPS
  • Mobile: 3G, 4G LTE, cellular, femtocell, GSM, CDMA, WiMAX
  • RFID: RF, LF and HF bands
  • Encryption: WEP, WPA, WPA2, WPS, others
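To make the compatibility, discovery and pairing issue concrete, here is a minimal sketch in Python; the device names, gateway capabilities and protocol sets are invented for illustration, not drawn from any real product.

```python
# Hypothetical sketch: matching the radio protocols a device supports against
# the protocols a platform gateway can pair over. Names are illustrative only.

DEVICE_PROTOCOLS = {
    "smart_thermostat": {"ZigBee", "Wi-Fi"},
    "access_badge": {"RFID-HF", "NFC"},
    "fitness_tracker": {"Bluetooth"},
}

GATEWAY_PROTOCOLS = {"Wi-Fi", "Bluetooth", "ZigBee", "Z-Wave"}

def pairable(device: str) -> set:
    """Return the protocols over which a device can pair with the gateway."""
    return DEVICE_PROTOCOLS.get(device, set()) & GATEWAY_PROTOCOLS

for name in DEVICE_PROTOCOLS:
    common = pairable(name)
    status = ", ".join(sorted(common)) if common else "no compatible protocol"
    print(f"{name}: {status}")
```

Even this toy version surfaces the real design question: a device whose protocols do not intersect the gateway's (the access badge above) needs either a protocol bridge or a different integration path.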

These communication protocols shape the design and connectivity of system-to-system services. The standards determine the operability of the services that can be used in the context of a platform, and how those services are delivered and consumed. How does the data and service connect with the platform? How does the service content get collected, formatted, processed and transmitted between the source and target platform? How do these devices and sensors support extended and remote mobile and platform services? Which distributed workloads work best on a mobile platform or a sensor platform, and which are better distributed to a dedicated or shared platform, whether cloud-based or appliance-based?
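One way to reason about the workload-placement question is a simple placement rule. The following is an illustrative sketch only; the tiers, thresholds and workload names are invented assumptions, not a production scheduler.

```python
# Toy placement rule: decide whether a sensor workload runs on the device,
# a local gateway, or a cloud platform, based on latency tolerance and
# payload size. Thresholds are invented purely for illustration.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    max_latency_ms: int   # how quickly a result is needed
    payload_kb: int       # data volume per event

def place(w: Workload) -> str:
    if w.max_latency_ms < 50:
        return "device"    # hard real-time: keep processing on-board
    if w.payload_kb > 1024:
        return "cloud"     # bulk analytics: ship to a shared platform
    return "gateway"       # everything else: local aggregation tier

jobs = [
    Workload("collision-alert", 10, 2),
    Workload("video-archive", 5000, 50_000),
    Workload("temperature-rollup", 500, 16),
]
for j in jobs:
    print(j.name, "->", place(j))
```

The point of the sketch is that placement is a policy decision driven by the service's latency and data characteristics, not a fixed property of the device.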

Answering these questions is key to a consistent and powerful digital service strategy, one that is flexible and capable of exploiting, scaling and operating with these new system and inter-system capabilities.

This becomes central to a new generation of Internet-aware data and services: the digital ecosystem that delivers new business and consumer experiences on and across platforms.

This results in a new kind of user experience and presence strategy, one that takes the "single voice of the customer" to a new level, working across mobiles, tablets and other devices and sensors that translate and create new forms of information and experience for consumers and providers. Combine this with new sensors, including for example positional, physical and biomedical data sources, and a new generation of digital services becomes a reality. Smartphones today are at a price point that includes many built-in sensors: precision technologies measuring physical and biological data sources. Feeding these into new feedback and decision-analytics loops creates a whole new set of possibilities for real-time and near-real-time augmented services, as well as new levels of insight into resource use and behavior.
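As a small illustration of the feedback-analytics idea, the following sketch flags readings in a sensor stream that deviate sharply from a rolling mean. The window size, threshold and heart-rate example are assumptions for demonstration, not a recommended analytics design.

```python
# Minimal sketch: flag sensor readings (e.g. heart rate) that deviate from
# the rolling mean of recent readings by more than a relative threshold.

from collections import deque

def flag_anomalies(readings, window=5, threshold=0.25):
    """Yield (value, is_anomaly) pairs. A value is anomalous if it differs
    from the rolling mean of the previous `window` readings by > threshold."""
    history = deque(maxlen=window)
    for value in readings:
        if len(history) == window:
            mean = sum(history) / window
            yield value, abs(value - mean) / mean > threshold
        else:
            yield value, False   # not enough history yet to judge
        history.append(value)

stream = [72, 74, 73, 75, 74, 120, 74, 73]
print([v for v, bad in flag_anomalies(stream) if bad])  # prints [120]
```

Running logic like this on the device (or gateway) rather than in the cloud is exactly the kind of placement trade-off discussed above.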

The scale and range of data types (text, voice, video, image, semi-structured, unstructured, knowledge, metadata, contracts, IP) about social, business and physical environments have moved beyond the early days of RFID tags to encompass new Internet-aware sensors, systems, devices and services. This is not just the "tabs and pads" of mobiles and tablets but a growing presence of "boards, places and spaces" that make up physical environments, turning them into part of the interactive experience and sensory input of service interaction. This now extends up to the massive scale of terrestrial communications that connect across the planet and beyond, as in the case of NASA, and right down to the micro, nano, pico and quantum levels of molecular and nanotech engineering. All of these are now part of the modern technological landscape that is pushing the barriers of what is possible in today's digital ecosystem.

The conclusion is that strategic planning needs insight into the nature of the new infrastructures and applications that will support these multisystem workloads and digital infrastructures.
I illustrate this in the following diagram, in what I call the "multi-platforming" framework, which represents this emerging ecosystem of services.

Digital Service = k ∑ Platforms + ∑ Connections

k = a coefficient measuring how open or closed the service is, and its potential value

Digital Ecosystem = e ∑ Digital Services

e = a coefficient measuring how diverse and dynamic the ecosystem and its service participants are
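Treating the two formulas as simple weighted sums, a toy calculation might look like this. The coefficient values and platform/connection scores are invented purely to show how the aggregation composes; the original coefficients are qualitative.

```python
# Hedged numeric sketch of the two formulas above, with invented values.

def digital_service(k, platforms, connections):
    # Digital Service = k * sum(Platforms) + sum(Connections)
    return k * sum(platforms) + sum(connections)

def digital_ecosystem(e, services):
    # Digital Ecosystem = e * sum(Digital Services)
    return e * sum(services)

s1 = digital_service(0.8, [3, 5], [1, 2])  # 0.8 * 8 + 3 = 9.4
s2 = digital_service(0.5, [4], [2])        # 0.5 * 4 + 2 = 4.0
print(round(digital_ecosystem(1.2, [s1, s2]), 2))  # prints 16.08
```

The structure makes the intent of the formulas visible: a service's value scales with its openness coefficient applied to its platforms, and the ecosystem's value scales with its diversity coefficient applied to its services.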

In future blogs I will explore the impact on enterprise architecture and digital strategy, and the emergence of a new kind of architecture that I call Ecosystem Architecture.

Examples of new Internet of Things services across general industry sectors.

Mark Skilton is Global Director for Capgemini, Strategy CTO Group, Global Infrastructure Services. His role includes strategy development and competitive technology planning, including Cloud Computing and on-demand services, global delivery readiness and the creation of Centers of Excellence. He is currently author of the Capgemini University Cloud Computing Course and is responsible for Group Interoperability strategy.

