Service Brokering and an Enterprise Standard – Build your Competitive Advantage in the Digital World

By Christian Verstraete, Recently Retired from DXC

Over the last ten years I have focused on cloud computing and seen increased adoption of cloud in enterprises. Companies large and small have adopted Software as a Service (SaaS) and traditional private/public PaaS/IaaS cloud services to expand their digital footprint. In doing so they depend increasingly on an ever-larger supplier community to obtain the digital support required to run their business.

IT departments are evolving from a closely-knit community of professionals within the enterprise to a large community of service providers jointly delivering services to the enterprise, and that is often causing problems. The quality of the delivered services depends on a large and often complex supply chain driven by contractual arrangements that are not always optimally designed.

Users consume services from multiple origins expecting ready access to an integrated IT environment and a top-quality experience. The enterprise IT department is responsible for addressing the users’ expectations working closely with the community of suppliers.

This realization has led me to look for ways to standardize the interactions between enterprises in these supply chains. I got the opportunity to participate in the IT4IT™ vertical service model workgroup of The Open Group a couple of years ago, and this gave me a solution: combining Service Brokering and Service Integration and Management, using IT4IT as the reference architecture for the interaction between the players. With the great help of the workgroup, I documented these findings in The Open Group Guide to “Service Brokering with the IT4IT™ Standard” (available at www.opengroup.org/library/g18f), which I would like to introduce to you with this blog entry.

A combination of processes and technologies, supported by a reference architecture

Let’s start from what is required. For the foreseeable future, most IT departments will have to integrate legacy environments and cloud services, requiring the service delivery to bridge both worlds. Any IT provider, internal or external, should see their primary role as delivering services that bring value to users. As organizations evolve from a single IT ecosystem where IT controls all aspects to consuming services from multiple providers – whether private or public cloud services, SaaS services, traditional services, or any combination thereof – IT is becoming more complex to access and manage.


User expectations for availability and ease of use are increasing rapidly, requiring the IT department to put two elements in place:

  • A service brokering platform enabling the users to access the proposed services in a transparent manner, regardless of where they are sourced from
  • A service integration and management approach enabling the multiple players involved in the delivery of the services to cooperate transparently and to address issues efficiently as and when they occur.

The term broker is typically associated with cloud, but I’m using it in a wider sense, realizing that IT delivers all sorts of services, which users would ideally source from the same place. NIST (the National Institute of Standards and Technology) has defined the term “Cloud Broker”. I’d like to propose a more general definition:

Service Broker – An entity that manages the use, performance and delivery of IT services and negotiates relationships between providers and consumers.

Implementing service brokering within an organization requires a fundamental change in culture, as the focus needs to evolve from function/technology to service and service delivery. Rather than silos organized around technologies, the organization should rally around teamwork to deliver each service in an optimal way, as the broker is central to the integration between provider and consumer. This is the most difficult aspect of implementing brokering: changing the way people work and evolving their behaviors to be more user-focused takes time. Unfortunately, IT departments have no choice: either they deliver the services their users require through the supply chain they have developed, or they will be left managing the legacy environments, which may not be seen as a very exciting job.

Multiple service use cases are documented in the guide. For each of them the roles and responsibilities of each of the players differ, but efficient service delivery can only be assured if the providers work smoothly and transparently together. The IT4IT standard can play an important role in delivering this working relationship between the IT department, the providers of the service and the end-users.

The guide describes in detail how the standard can be used to implement both the platform and the service integration and management approach required to create the user experience expected. It builds on the IT4IT Reference Architecture Version 2.1 to demonstrate how the architecture implemented by the IT department interacts with those of the service providers. In particular, it focuses on the IT4IT Request to Fulfill value stream and its cross-enterprise interactions. Through examples, it describes the interactions between the components for the use cases identified and demonstrates the recursive nature of the processes when third parties are involved.

Further in the document, I address the management of suppliers in a brokering world. This leads to a discussion on Service Integration and Management, often referred to as SIAM®. To achieve the objective of transparency described earlier, I argue the need to go beyond the traditional SIAM approach and the six ITIL® processes it addresses. I demonstrate how, by integrating nine additional functions, as described in the guide, services can be delivered in a transparent and efficient manner to end-users. This implies that the IT department properly manages the supplier base. As suppliers are often much larger than the company itself, each may have its own implementation of the standards, which are often inconsistent with one another. Using the IT4IT standard is a non-threatening way to build consistency and enable proper integration between the companies. Right from the initial contacts, the IT department should point to the need to exchange specific data items with the supplier, leaving it up to the supplier to define how those items are collected and used. Those data items can be lifted directly from the IT4IT standard.
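To make the idea of exchanging standardized data items concrete, here is a minimal sketch in Python of a record a broker might hand to a supplier. The class and field names (`FulfillmentRequest`, `service_offer`, and so on) are my own illustrative assumptions, loosely inspired by the Request to Fulfill value stream; they are not the actual IT4IT data model, for which you should consult the standard and the Guide.

```python
# A minimal sketch of a broker-to-provider exchange record.
# All names here are hypothetical illustrations, not the IT4IT data model.
from dataclasses import dataclass, asdict, field
from datetime import datetime, timezone
import json

@dataclass
class FulfillmentRequest:          # hypothetical name, for illustration only
    request_id: str                # broker-side identifier for traceability
    service_offer: str             # the offer the consumer subscribed to
    consumer: str                  # requesting organization or user
    provider: str                  # supplier expected to fulfill the request
    status: str = "submitted"      # lifecycle state shared with the supplier
    created_at: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def to_message(self) -> str:
        """Serialize to the JSON payload exchanged with the supplier."""
        return json.dumps(asdict(self))

# The broker records the request once and hands the same payload to the
# supplier, so both sides reason over identical data items while remaining
# free to implement their own internal processes around them.
req = FulfillmentRequest("FR-0001", "Managed Database", "finance-dept", "CloudCo")
payload = req.to_message()
print(payload)
```

The point of the sketch is the design choice, not the code: the broker and the supplier agree on *what* is exchanged, while each keeps the freedom to decide *how* the data is produced and consumed internally.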

The IT4IT Reference Architecture focuses on the information exchanges and relations between (IT/Business) functions and prescribes the composition of functions in a digital/IT organization. Interoperability between those functions across different organizations can only focus on the unique information exchanged as a service moves through the IT Value Chain. This can only be achieved through the standardization of the information exchange, while still leaving companies free to implement their own processes to generate the required information. The IT4IT standard was originally designed to align the IT function with the entire value chain and to solve interoperability between different IT functions, processes, and tools within the enterprise. The Guide demonstrates how to expand this capability to the integrated ecosystem, helping to facilitate the management of relationships for both users and providers of services.

I hope that, through this blog entry, I managed to make you curious and interested in reading the Guide to Service Brokering with the IT4IT Standard (available at www.opengroup.org/library/g18f). I do hope it can help you and your company improve relationships with your suppliers and streamline IT operations while giving your users an excellent experience.

Christian has been working for nearly 40 years in the IT industry, working for HP, HPE, and finally DXC. Over the years he has focused on the manufacturing industry in general and supply chain in particular. The need for communities of enterprises to closely collaborate through integrated platforms drove him to follow cloud computing from the early days. He led the development of the subsequent versions of the HP/HPE Cloud Functional Reference Architecture and the development of advisory services for hybrid IT. Christian is also the co-author of a book on collaborative sourcing and is a seasoned blogger (www.cloudsourceblog.com). He studied mechanical engineering and industrial management at the UCL/KUL in Belgium. He recently retired from DXC and is pursuing other passions.