
A Reference Architecture for Interoperability in the European Union: A Conversation with Raul Abril

By The Open Group

Moving to a digital infrastructure requires far more interoperability and Boundaryless Information Flow™ than in the past. This is particularly true for digital transformation efforts within governments, many of which are known for being extremely siloed, where information exchange between government branches or agencies can be problematic.

The European Union is currently deploying an Interoperability Reference Architecture as part of its e-Government initiatives. We spoke with Raul Abril, Programme Manager, EU Policies for the European Commission, about how his team is going about building that architecture, which is known as the European Interoperability Reference Architecture. Raul will be a keynote speaker at The Open Group Paris 2016 on October 24.

Tell us a bit about the European Interoperability Reference Architecture. How was it designed and how is it currently being used?

First of all, the European Interoperability Reference Architecture (EIRA) came from a vision. This vision was that it had to fulfill a need, and the need was expressed in the existence of digital barriers across European borders, which runs counter to one of the major political priorities in the European Union: the creation of a real and effective single market. It had to be a reference for building solutions that are interoperable. This is important at all levels (local, regional, national, European) of public administration because many of the e-Government solutions were created in a silo mode. There was a need to provide a common framework for solutions architects to design their solutions in a way that would allow those solutions to be interoperable. There was a business need obviously, and the way of implementing the EIRA came from my personal professional experiences. The European Commission in general and the Interoperability Solutions for public Administrations unit, ISA in short, in particular, through the ISA2 Programme, have brought state-of-the-art approaches and talent on board in order to address such needs.

How does the EIRA help provide a structure for interoperable e-Government solutions?

EIRA helps in different ways. One obvious way is that the EIRA is a controlled vocabulary. There are definitions, there are building blocks and there are relationships, and all of those are captured in a controlled vocabulary. Why is that important? Because we need to understand each other, so one way of achieving interoperability is by describing and expressing our designs in the same way.

I’ll give you a practical example. When you are a public vendor, when you are a government or a member state and you ask for offers and want to express your terms of reference, if we are all using the same controlled vocabulary, there is no doubt that the conformance will be better. But there are other ways. How is the EIRA supporting interoperability? The answer also comes from another important concept—the interoperability specifications. Those interoperability specs should be based on open standards. What makes a building block interoperable should be described using interoperability specifications. This becomes a critical success factor for achieving interoperability between solution A and solution B. Why? Because by doing that, solution A and solution B will be using the same interoperability specs. Does it mean that both will be interoperable? Not necessarily, but if they don’t have that, it will be almost impossible for them to be interoperable. That’s where the EIRA supports interoperability.

We have started identifying which interoperability specs, based on standards, should be referenced in each of the building blocks in the EIRA.
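To illustrate the idea, here is a minimal sketch in Python of a controlled vocabulary that ties architectural building blocks to candidate interoperability specifications. The building-block and specification names are invented for the example and are not the actual EIRA vocabulary; the sketch only encodes the reasoning above—two solutions that do not share a specification for a common building block have little chance of interoperating.

    # Illustrative sketch only: the building-block and specification names below
    # are hypothetical and do not reproduce the actual EIRA controlled vocabulary.

    # Controlled vocabulary: each architectural building block lists the
    # interoperability specifications (based on open standards) it may reference.
    CONTROLLED_VOCABULARY = {
        "Data Exchange Service": {"spec:messaging-v1", "spec:messaging-v2"},
        "Identity Management Service": {"spec:federated-id-v1"},
    }

    # Two solution designs, each expressed as building block -> chosen specification.
    solution_a = {"Data Exchange Service": "spec:messaging-v2",
                  "Identity Management Service": "spec:federated-id-v1"}
    solution_b = {"Data Exchange Service": "spec:messaging-v1",
                  "Identity Management Service": "spec:federated-id-v1"}

    def shared_spec_gaps(a, b):
        """Return the building blocks for which the two designs do not share a spec.

        Sharing a spec does not guarantee interoperability, but not sharing one
        makes interoperability between the two solutions almost impossible.
        """
        gaps = []
        for block in sorted(set(a) & set(b)):
            allowed = CONTROLLED_VOCABULARY.get(block, set())
            if a[block] != b[block] or a[block] not in allowed:
                gaps.append(block)
        return gaps

    print(shared_spec_gaps(solution_a, solution_b))
    # ['Data Exchange Service'] -- the two designs reference different messaging specs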

How is TOGAF® helping to inform the EIRA?

TOGAF, an Open Group standard, is one well-known approach to enterprise architecture frameworks (EAFs). Of course, there are other EAFs. The reason for using TOGAF is that we consider it appropriate in terms of openness, which is what we’re looking for. This does not mean that European public administrations will have to use TOGAF.

EIRA is a reference architecture. A reference architecture is basically a reference model married to an architectural style. The architectural style we selected for the EIRA was SOA, service-oriented architecture. That was a critical decision, which means that we wanted to conceptualize any type of solution as being service based, which means that we also care about the code components. But we are also paying attention to the behavioral part. That’s why we selected SOA.

The reference model explains the ontological properties of the components that you’re going to have. How do you designate names? What are the properties of the relationships between them? And so on. We selected ArchiMate®, an Open Group standard, as the ontology for our reference model. So, the EIRA is based on ArchiMate as the reference model and on a service-oriented architecture as the architectural style.

After explaining the “RA” in the EIRA acronym, we should explain now the “I” for interoperability. In general, reference architectures like the EIRA do not have the same ambition as enterprise architecture frameworks. EAFs like TOGAF have the ambition to provide support for the end-to-end design, implementation and lifecycle of a solution. Reference architectures focus on a specific aspect—in the EIRA case, interoperability—and they need to provide the most salient—not all—architectural building blocks that should be considered to address such specific aspects. The EIRA provides guidance on the most salient architectural building blocks to be considered when designing an interoperable solution.

For example, if you are going to design whatever solution for a business or government, one of the things you’ll consider is back-up services. You’ll consider security measures, and one of those will be backing up your data, files, etc. The EIRA doesn’t care at all about back-up services—it doesn’t mean we don’t care about security, but we don’t care about back-up services because it’s not an essential service for interoperability.

At the beginning we identified the key architectural building blocks that were the most salient for supporting interoperability. Today, the EIRA is the result of a collaborative effort. So far, representatives of the central public administrations of six European Member States have participated in the design and releases of the EIRA with their feedback, and they have been validating the building blocks that the EIRA identifies as the most salient for interoperability.

Are there challenges that are specific to building Reference Architectures for e-Government? What are they?

The biggest challenges are related to what I said before—getting a consensus in the community on what the interoperability specs should be and the standards for each of them.

Another challenge, I think, is adoption. There is a well-known issue with technology adoption in general and with solutions/frameworks/models in particular. It’s one thing to have the solution and another to get it adopted. We are not talking about, let’s say, solutions for consumers like smartphones. We’re talking about communities of users that are very special—generally speaking, solutions architects, portfolio managers, policy makers and also CIOs. Those are the potential users of reference architectures.

There is a lot of effort in communicating and disseminating information, and I don’t underestimate the effort it involves. Our challenge is in demonstrating to users and member states how they can use the EIRA to solve national interoperability problems, which in many places are huge—for example, between their central, regional and local administrations. When users realize that the EIRA provides value in addressing domestic problems, then they are better equipped to address their interoperability issues in cross-border public services.

What benefits do you expect to receive from using the EIRA?

I expect first of all reusability. With interoperable solutions, you are able to reuse the information that has been produced in another place. So, we support the once-only principle. The second one is the elimination of digital barriers. By interoperability of solutions we mean something as complex as having a solution A in one place that is able to send information to another solution B, and that solution B is able to understand the information that has been communicated without noise and to process it respecting ex ante agreed organizational and legal terms. So, this is a more complex level of interoperability than just a message exchange because, for example, it should support administrative processes across borders. In fact, there has been significant progress in understanding what is exchanged (i.e., messages, data, documents), not to mention that the technological aspects are well standardized. The issue remains what happens at both ends from a legal and organizational perspective.

Coming back to the ultimate benefit, the EIRA will support the digital single market. By having interoperability you eliminate digital barriers. This is a huge expected benefit.

This translates into direct benefits for citizens and businesses. In Europe there is less mobility than in the U.S. If you move from the U.S. East Coast to the West Coast, there may be a three-hour time difference, but you will have fewer problems with many things. You may need to get a driver’s license in a new state, but the states recognize each other’s driver’s licenses. Here there is a lot to be achieved with the mobility of citizens and their needs in terms of public services. In some cases, if you want to move from one country to another, it is possible to access the public services of your home country via a portal. If we were able to replicate this approach in all Member States and, very importantly, in a coherent way, then we would provide a huge benefit to citizens and businesses. The EIRA allows interoperability to be implemented holistically, not just from machine to machine.

@theopengroup #ogPARIS

Raul Mario Abril Jimenez works in the ISA unit as Programme Manager, EU Policies, European Commission. He recently relocated to Brussels from Barcelona. He has had permanent residences in San Diego (USA), where he worked for 6 years, and before that was based in Copenhagen (DK) for 7 years. He has more than 35 years of IT professional services experience on international professional engagements in the Financial & Telco industries. His knowledge domains are Research Methods (Quantitative & Qualitative Analysis), Marketing (Research, IS), IT R&D (Portfolio Mgmt, Product Mgmt), Project Mgmt, and IS & Technology (Knowledge Management, DSS, BI, Data Warehousing, DBMS, IS Design). Raul has been a professor at several universities and actively publishes his research.

Raul holds a doctoral degree (Henley Management College, UK), a European PhD Certification (European Doctoral School on Knowledge Management, DK), an Ing. Sup. Informatics (UAB, E), and a Master in Project Mgmt (The George Washington University, USA). He is a PMP certified professional.



Having the Right Conversations: A Q&A with Craig Alexander

By The Open Group

For many years now, IT departments have been accused of being out of alignment with the needs of the business. According to Craig Alexander, a strategic consultant for Hewlett Packard Enterprise in EMEA, IT4IT™ has a chance to finally change all that.

Alexander, who has a background in large business transformations, says that Enterprise Architects (EAs) and IT departments alike should be looking at the successes and failures of past projects to help them better plan for what they need to do today.

We spoke to Alexander in advance of The Open Group Paris 2016 (October 24-27), where he will be speaking, about how the past can inform IT projects today, why ITIL is still relevant despite the new approach the IT4IT standard offers and how to have the right conversations that will move projects forward and better guarantee successful outcomes for everyone.

The title of your session is “To Plan for the Future, Look to the Past.” Why should EAs be looking backwards to look forward?

I’m not an architect. My background is in traditional service management practice moving into transition management and large-scale transformations, all of which have a business outcome.

If we look back through the eyes of IT4IT—whether it be large scale programs or transformations—we can pick up points and things we’ve done in the distant past and see where we’ve learned our lessons that helped to arrive at IT4IT. But moreover, we can project that forward in terms of ‘let’s not forget what we learned in the past and use that knowledge and that information as we move forward with IT4IT programs, so we’ll be better informed and better able to succeed.’

The thing that got me thinking about that was, I reached a certain age recently and started getting interested in history, whereas I’d never really been interested in it when I was younger. One of the things that comes out all the time when you study history, whether you’re talking about conflicts, financial crashes or similar significant events, is that if you look into the past you can find out what might happen again in the future. History tells us what could happen in the future. That was the somewhat tenuous link I made in my mind in terms of my role and ‘Wouldn’t it have been great if we’d had IT4IT when we did this?’

One of the things I asked my very first customers on a project nearly 20 years ago was ‘Why are we here?’ At the time this drew strange looks and some incredulity with responses of ‘We’re here to do this, we’re here to do that.’ I remarked ‘That’s what we’re here to do—but why are we doing it?’ At this point the team looked puzzled and said ‘Well actually we hadn’t thought of that.’ The customer CIO then said, ‘That’s a good point—we should all understand why we’re doing what we’re doing,’ and proceeded to provide the context of the project. Then we all knew why we were there!

I’ve always used that approach, but it’s only been since IT4IT has come to the fore that common sense has started to prevail in the industry. It’s still very much the minority view, especially within IT teams. It’s not so much within architecture groups, especially those that are adopting IT4IT, but it’s very easy to get entrenched in technology and the benefits that can be most immediately realized with technology as opposed to how it reaches into why and how business plans succeed or fail.

Certainly in my time in the industry both at organizations within IT and at end-user organizations, one of the common things I’ve seen is that it’s very easy for clever, focused or driven people to be a little blinkered when it comes to the point of doing technology. I’ve never been one to advocate that approach. IT is not there for the sake of IT—IT is there for a business purpose. At some point prior to a project starting or a migration or change in supplier, someone made a business decision that led to that occurring. They didn’t make an IT decision. And that’s the realm in which I operate. I try to make sure anyone with an IT focus I work with has that perspective.

In what ways do you see the past of IT now informing the future?

We can look back at the origins of business decisions and what has arisen as a result of them—the standards that could have been used at the time, how they have supported progress and how they helped or restricted any transformation in an organization.

For example, a transformation may be primarily driven from an ITIL or architectural perspective over and above supplier governance or integration—by aligning these factors differently, the transformation results (i.e., business outcomes) could be manifestly better at no additional cost.

That’s the sort of example of how we can use IT4IT moving forward—think back to how it might have worked elsewhere and what you might have learned, project that forward, and don’t be afraid to shout about it. For large transformation projects, the more experience and the greater the wealth of knowledge you have, the better the chances of that transformation succeeding.

Has ITIL then proved to be inadequate for what customers need today?

ITIL is great and has proven to be so for as long as I can remember. It was the first thing I did in my post-graduate role. It’s been very powerful for customers and continues to be. I see a similar route for IT4IT 15 years hence in terms of its adoption and development, regardless of industry. Compared with IT4IT, ITIL is much more focused on the delivery end of things as opposed to the strategic end of things and the reference architecture. That’s not to say it can’t touch on it, but it was never really designed to be that.

The observation we make retrospectively when we work within the realms of IT4IT is that ITIL was descriptive in its nature, not prescriptive, which is one of the key differences between the two. That descriptive approach was very positive up to a point because it allowed organizations to adopt principles and apply them in the way that worked best for them. I’ve worked with organizations that have been very knowledgeable, astute and mature in that regard, where things are very specific to the company. But one of the challenges that has arisen in the past has stemmed from the ability to apply interpretation to the standard—for example, in a multi-supplier environment where various organizations can all be applying ITIL but in ways which require complex integrations and create unnecessary difficulty when technology, legislative or supplier changes are required.

I will never criticize ITIL for what it was, if for no other reason than it was the heart of what I did for a number of years and it helped to mature the IT industry. Now that the IT4IT standard has been launched and is being consumed, there is probably more than a fair share of—pun intended—revisionist history being applied to ITIL, which played a role for its time and will continue to play an important role moving forward. IT4IT, however, goes a bit further to make the connection toward business outcomes.

How does IT4IT better address the needs of organizations today?

The approach that I have been taking for the last 18 months within the HPE group I work in is rather than having an initial conversation with customers about a technology solution, something going out of support or more functionality, we’re having a conversation that starts with: ‘What are you trying to achieve? What are the business outcomes you’re trying to realize? We think technology might play a part in that.’ This is usually conducted in conjunction with an IT sponsor (a senior decision-maker or stakeholder) along with someone from the IT department. We’re being told by our customers that we’re having the ‘right’ conversations now. It’s a different conversation, but it’s the right conversation to have because it’s allowing IT to have discussions with leaders in terms that the business understands much more effectively.

An extreme example: One of our customers found themselves having to justify funding for IT projects, something they had not really done in the past. Why? The business simply could not understand the value it would get from the projects. Despite all the use of acronyms and IT technology ‘speak,’ the customer’s needs were simple. Deliver value. Tell us what this will be and when we will get it. IT could not articulate this, so funding was being withheld.

Because IT4IT is structured around IT as a value chain supported by value streams, when using it logically it drives the conversation to value. Customers love this and realize immediately that the technology conversations they have been having with IT are the wrong ones. They want the value conversations and IT4IT has a major role to play here. Other customers have also told us ‘we’ve been having the wrong conversation’ even before we tell them how IT4IT can specifically address their own particular challenges; it’s like a light has suddenly been switched on. These are game-changing situations.

That’s been the most positive outcome—there are so many things that historically IT departments never did. They’re starting to think in much more business terms. If we think back to the rhetoric in our industry three years ago, there was a lot of ‘What is the position of the CIO? Should they be on the Board?’ There was all this conjecture about what that role should be. Increasingly, the IT department is being looked upon as just another business unit, so if the CIO is able to have the same conversation at board level as finance or sales or marketing, that puts them at an advantage. IT4IT only serves to support that agenda.

In looking toward the past, how large a scope should IT organizations consider? Should they just look at what’s worked for them in the past or do they need to consider the industry as a whole?

For me, it starts at home. What has worked for us in the past? What are the things we know best? What are the parts of the company that are more challenging than others? Are there geographies where projects work? At the same time, in most organizations there will be individuals who have come from different industries, so exploiting all of their experience should always be taken into account. But the primary focus is projecting that learning and the best knowledge forward and putting it to use.

The people aspect is the hardest. You can take statistics from a number of years and derive any number of conclusions from them, but the behavior and the culture of the organization are probably the strongest indicators of what a transformation’s impact will be. It’s relatively easy to swap out IT; it’s not easy to change organizational behavior. It’s a lot harder to change the way people think or to motivate them toward certain outcomes. That’s where I would be trying to derive the most information from. It’s easy to prescribe a technology transformation, but if the organization as an entity doesn’t go along with that, no amount of technology change is going to make a difference.

As a standard, how can IT4IT continue to evolve so that it remains relevant into the future?

There is no doubt that the timing for IT4IT is perfect. The industry is crying out for a prescriptive approach to running the business of IT. Value delivery and value realization will be the lifeblood of IT in the future. So will IT4IT evolve? Almost certainly. As more organizations adopt IT4IT there will definitely be amendments and improvements. After all, the current reference architecture is only at version 2. Where I think the biggest impact could be is if organizations start to mandate IT4IT and vendors have to become IT4IT compliant. That’s when we will see even larger scale adoption and greater evolution of IT4IT.

At the end of the day, everything is geared toward digitalization, the digital transformation of organizations. That is the one common thing we see—irrespective of industry, geography, scale, or political environment—the digital agenda is governing everything. It is certainly our view at HPE that IT4IT is a very important means of achieving that. And when we start talking about IT4IT in the context of digital transformation, the relevance of the IT4IT architecture, and of how an organization aligns with it, resonates much more. At the same time, it also helps with the legacy side of things. It’s not just about IT4IT being relevant from a future technology perspective; it also allows organizations to manage the legacy with a forward-looking aspect. So we see a lot of enthusiasm around that as well.

Organizations want a common way of running their IT, a common set of standards irrespective of the supplier, irrespective of the maturity of the technology, and IT4IT is giving them that option. We urge our customers to think big and start small. Start with the specifics, start with the most important areas of the business. Address the most pressing needs, pains and challenges first, and then progress from there and bring other parts of the organization into that way of thinking.

I use the analogy with my customers that if they’re using an airline’s app on their smartphone to change their flight, change their seat or purchase baggage, that’s not a new system that they’re using on their phone. That’s just the portal through which they view the old system that’s been around for 25-30 years and they want to be able to use that trusted system. So there’s a need to marry the user experience and the technology.

Is there anything that you can point to that accounts for the rapid adoption of IT4IT since its release?

I think for many organizations, IT4IT is bringing things into focus. Customers are usually reluctant to say ‘We’re really struggling to find something that’s working for us.’ Admitting to struggling with something is not something that many organizations like to share. I think many organizations in the position where the digital agenda and the need to think like their customers’ customers are very prominent are making the connection between this standard and the prescriptive approach. IT4IT is industry, supplier and technology agnostic, and customers can take it on and adopt it in whatever way they see as appropriate for their own organization; they can make it work regardless of how little or how much knowledge they have in their organization because there is also a community of organizations out there, like ourselves, who will help them with their transformation. I think there is a light bulb moment going on where they say ‘Yes, this could work,’ where instead of marrying two or three standards together to make it work for them, it’s a common way to move forward—that recognition is how the uptake has manifested itself.

We have never had a prescriptive reference architecture for running the business of IT, so it’s hardly a surprise that, now we have one, organizations are interested to find out more and work out how to use IT4IT. As mentioned earlier, other approaches such as ITIL took a slightly different path, and IT4IT addresses a gap that has yet to be addressed by any other approach. So it really is the right thing at the right time!

For the press release of the launch of the IT4IT standard, click here.

For more information on The Open Group IT4IT™ Forum, please visit here.

The Open Group IT4IT™ Reference Architecture, Version 2.0 is available here.

@theopengroup #ogPARIS

Craig Alexander joined HP in December 2011 as a Strategic Transformation Consultant to deliver transformation initiatives linked to the adoption of software solutions, with much of this focus around SIAM-based initiatives for major clients. Since the end of 2014, he has focused on creating and initiating IT4IT-based initiatives for EMEA-based customers. His role consists of consulting with customers to promote the benefits of adopting an IT4IT approach to delivery and transformation whilst leveraging the expertise and capabilities of the wider Hewlett Packard Enterprise organization to deliver true business value.






Tackling Transformation in Government: A Conversation with Roland Genson

By The Open Group

It’s not just industry and corporations that are undergoing massive change due to digital transformation—governments worldwide are being equally affected by the need to create more efficient processes and to provide online services to citizens.

With 28 member states and three branches of government, the European Union (EU) is a prime example of just how complex transformation can be. We spoke with Roland Genson, Director in the General Secretariat of the Council of the European Union—one of the EU’s three branches—in advance of The Open Group Paris 2016 event (October 24 – 27) about the challenges the Council is facing and how they are working with the two other branches of government to achieve interoperability and Boundaryless Information Flow™.

What is the role of the General Secretariat of the Council (GSC) of the European Union? What sort of services does the Council provide?

In a nutshell, at the legislative level in the European Union you have three institutions. The best known to the public is the European Commission, which has the role of making proposals, drafting new legislation and submitting it to the two co-legislators. On one side there’s the European Parliament, where you have directly elected parliamentarians, and on the other side the Council of the European Union—that’s us. In the Council of the European Union you have the 28 member states represented, much like, for example, if you look at the U.S. Congress, you have the House and the Senate. As the General Secretariat, we are supporting those 28 member states in the negotiation process, meaning providing conferences, logistics and policy advice but also managing and circulating all the information they need to work in 24 European Union languages.

Why has the GSC undertaken a digital transformation? What led to that and made it necessary?

If we look back at the past, until 2014 all the institutions had their own IT strategy, their own development and so on. But today, with digital transformation, it’s more and more obvious that we need a fit-for-purpose interoperability framework. In most of the 28 member states you have e-Government initiatives and digital transformation processes ongoing, and we are in the middle of those. We cannot just look around us and find solutions that are compatible with everyone. We believe that we have to work together on common standards and interoperability frameworks to make sure that we are able to connect to all 28 members and to connect to the other institutions, the Commission and the Parliament; otherwise it will be impossible for us—and for them—to work efficiently.

What are some of the challenges that the GSC is facing as part of the transformation?

I see at least three challenges. The first challenge is an internal one. Within our organization we need seamless information flow between all services. That’s the first place where being boundaryless needs to kick in: to get rid of existing silos and to eliminate disruptions between services.

The second challenge is “Brussels-based,” which means the need to have Boundaryless Information Flow between EU institutions. When a proposal comes from the Commission, it should enter into the Council and the European Parliament without any new disruption or without any data or format conversion. Our target should be an end-to-end legislative drafting and negotiation process between the Commission, Council and the Parliament.

The third challenge is to become boundaryless with regard to the GSC’s main stakeholders, which are our member states, so that we are able to serve all 28 member states (MS) with standardized content that can immediately be used and linked within each MS subject to national needs, specifications or legal requirements.

Furthermore, as an additional challenge, we also have a responsibility with regard to European citizens, so that the public information our organization deals with can easily be made available and understood for further analysis and exploitation by the interested citizen. It’s our challenge to get EU knowledge out to civil society.

How are those challenges being addressed as part of the project? How long has your transformation project been going on?

I took over responsibility for this newly created directorate in 2014 with a clear shift from IT to business outcome or value. A lot of organizations have gone down the same path where, until a certain point, the digital environment was mainly designed by IT departments. We really have now a situation where the business needs and expectations come first. Internal clients and our stakeholders outside are our first priority, and on the basis of their perspectives we should see what standards and subsequent IT solutions allow us to get there. We started this business-driven process in 2014. Moreover, my concern was to start it immediately together with the other institutions, because it doesn’t make sense for the Council alone to try to find a way for “its” future when the European Commission and the European Parliament have the same challenges. I believe the progress made on interoperability solutions for European public administrations (ISA) is equally a valid framework for all institutions to set the necessary standards. And with the European Interoperability Framework, the EIF, we would also have another basis for the GSC’s digital developments. Though we started late in 2014, there are quite a number of approaches, standards and tools that we can take on board and consider as viable options for the future.

Are standards being used to address the challenges of the project?

Absolutely. For example, today we write all of our structured documents with an MS Word-based tool specifically designed for all our services. Today, this doesn’t make sense anymore. We understand that all the content drafting shall be XML-based, and when discussing with the Commission and the European Parliament, we understand that for legislative process markup, AKOMA NTOSO is the right standard. This leads us to explore the market, where common standards have already been shared and explored by other communities or organizations.

What role is Enterprise Architecture playing in your transformation?

Our task today is to get all key business processes designed and documented—getting a clear view of them and assessing them against the corporate strategy, priorities and challenges. As mentioned before, it’s rather complicated in the sense that we have to get it aligned in-house, but also with the Commission and the European Parliament and eventually with the member states. Somehow we have to find the best and easiest standard and operating model to get there. What I would like to avoid is setting up a new set of processes which would be too rigid and would not provide the flexibility some services might need.

What advice do you have for those undertaking digital transformation within government? What do people need to think about when they’re working with government entities as opposed to corporations or businesses?

The General Secretariat of the Council is probably one of the smallest organizations in Brussels, but when we look at the “Council” and the “European Council,” the two institutions we serve, the challenge ahead is quite impressive. We have to serve hundreds of ministers plus a community of national officials, front-line delegates and back-office support, which easily covers more than 200,000 people. Obviously we cannot enter into negotiations with 28 member states to see what would be the best standard or framework, but we cannot ignore that things are going on. What we try to do is to identify digital champions and undertake a number of exchanges of views to see how to move on.

As an example, we had a visit to Austria, as the Austrian government is already far advanced in digital transformation. Next year we will have the Estonian presidency of the European Union. Estonia is also a digital champion, so we will try to learn from their experience and take advantage of their presidency in order to launch new services and test whether things meet the needs of most member states. If not, we will swiftly adapt and explore something else. It will be a different, experimental approach. We need to engage with Member States and vice versa, to trigger a greater awareness of what delegations would like to achieve in terms of content and knowledge delivery.

What role can standards play in helping government with transformation efforts?

For me it’s rather obvious that if we agree on the same standards in our organizations, all stakeholders would know what the criteria would be, for example when launching a public procurement. It would make multilateral interactions a lot easier. We would not just look at one specific tool or software and see what is compatible; we’d just refer to the standards as a basis. Everyone would know about that standard and subsequently be assured that products based on that standard are interoperable and compatible with the institutions or with neighboring states. Standards also offer semantics. We work in 24 languages. If we want to be sure that one terminology is always used in the same way in different languages, we also need to invest a lot in semantic interoperability.

What standards are you looking at or currently using?

We plan to use TOGAF®, an Open Group standard, in close collaboration with our colleagues on the IT-side for business process management. We want to have a well-documented process map of the organization to allow this smart integration, interoperability and processing of the information. It’s the business architecture part of TOGAF.

Are there other things that governments need to consider when doing transformation projects?

In my view, what is crucial is to have a genuine engagement of all stakeholders at the highest level. The three Secretaries-General of the European Parliament, the Commission and the Council expressed their commitment in this respect. To make the expected progress, we equally need a full commitment by all Council members, i.e. the national delegations. So we will learn from them and they will learn from us, and we will be able to achieve results together to transform our organization. For me, this is crucial. It’s a change in the mindset, but we need to adapt to be able to quickly exchange best practices, lessons and failures, as a way to make progress.

@theopengroup #ogPARIS

Roland Genson is a director at the General Secretariat of the Council of the European Union, in charge of the Council’s document processing, recording, archiving and transparency, and of the GSC’s libraries. He drives the redesign of the GSC’s knowledge and information management in order to align the organisation with digital innovation and with Member States’ expectations in this respect.

Until 2014, he was a GSC director covering Schengen, judicial cooperation and internal security cooperation under the Justice and Home Affairs policy framework.

From 1987 to 2007, he served in the Luxembourg law enforcement sector and then at the Ministry of Justice.

He is also a lecturer at the Universities of Luxembourg and Liège.

Mr. Genson will be a keynote speaker at The Open Group Paris 2016 event on October 24.



Transitioning The Open Group Examinations from Prometric to Pearson VUE

By Andrew Josey, VP, Standards & Certification and Deborah Schoonover, Director, Certification, The Open Group

The Open Group is moving to Pearson VUE as its new examination provider for IT certification exams.

At the time of writing this article (October 2016), we are in a period of dual operation, with most exams available at both Prometric and Pearson VUE. Through January 31, 2017, you will have the option to take exams at Prometric testing centers, as you have in the past, and must do so if you are holding a Prometric voucher. There’s no change to the exam registration process with Prometric. Effective February 1, 2017, Pearson VUE will be the sole provider of our certification exams.

As part of the transition to Pearson VUE, we are changing the registration process. To take an exam at a Pearson VUE testing center, you will need an Open Group web account, even if you plan on registering for an exam by phone or in person, so that we can ensure your certification history is kept in sync. You can register for an Open Group web account at www.opengroup.org (select login).

In the rest of this article we cover a number of key questions about the transition.

Q: When are The Open Group exams moving to Pearson VUE?

A: Most Open Group exams are available at Pearson VUE today, and the remaining exams will be soon. We are running a dual operation, with many exams being offered at both Prometric and Pearson VUE during this transition period. See our exam registration page for a current listing of where different exams are being offered: https://certification.opengroup.org/take-exam.

The Open Group exams will be offered at Prometric through January 31, 2017. After that date, exams will only be available at Pearson VUE.

Q: Can I use my exam voucher at either exam provider today?

A: No, you must use your voucher at the designated exam provider. If your exam voucher code starts with “OG”, then it is for Pearson VUE, otherwise it must be used at Prometric, for exams scheduled through January 31, 2017.

Q: Am I required to have an exam voucher to take an exam at Pearson VUE?

A: No, you can also pay by credit card when registering for the exams.

Q: How do I know if my exam voucher is for Prometric or Pearson VUE?

A: If your exam voucher code starts with “OG”, then it is for Pearson VUE. All other codes are Prometric vouchers.

Q: Can I exchange my Prometric voucher for a Pearson VUE voucher?

A: No. Through January 31, 2017, you should use your Prometric voucher to book an exam at Prometric. Starting February 1, 2017, your Prometric voucher will be automatically accepted at Pearson VUE (if the code starts with any of the following two-character codes: 23, 50, 93, 95, 96, 98, 2X, 9X, SX, ZC, ZX). You won’t need to exchange your Prometric voucher; you will be able to use it directly within the Pearson VUE exam registration system.

Q: Will my Prometric voucher be accepted at Pearson VUE?

A: If you received your voucher from an Open Group Accredited Training Course Provider, then yes, your Prometric voucher will be accepted at Pearson VUE after January 31, 2017. If your voucher is valid and unused, then starting February 1, 2017 you will be able to use your voucher to book an exam with Pearson VUE.

Prometric vouchers beginning with any of the following two-character codes: 23, 50, 93, 95, 96, 98, 2X, 9X, SX, ZC, ZX will be automatically accepted for exam registration at Pearson VUE after January 31, 2017.

If your voucher begins with any of the following two-character codes: ER, G2, G3, GP, or P2, then no, your voucher will not be accepted at Pearson VUE. These vouchers must be used at Prometric by January 31, 2017.

Q: How do I use my Prometric voucher at Pearson VUE?

A: Starting February 1, 2017, if your Prometric voucher is unexpired and unredeemed, you will be able to use it directly when registering at Pearson VUE. Go to https://certification.opengroup.org/take-exam for instructions on how to register. When you get to the payment screen, enter your Prometric voucher number.

Q: What do I do if my Prometric voucher expires after January 31, 2017?

A: If your Prometric voucher has an expiration date after January 31, 2017 and the voucher code starts with any of the following two-character codes, your voucher will be accepted at Pearson VUE starting February 1, 2017:

23, 50, 93, 95, 96, 98, 2X, 9X, SX, ZC, ZX

If you have one of the above voucher codes and wish to take your exam before February, you must schedule your exam at a Prometric test center.

If your Prometric voucher starts with any of the codes listed below, the voucher was purchased directly from Prometric and must be used at a Prometric test center by January 31, 2017:

ER, G2, G3, GP, P2

Any vouchers starting with code ER, G2, G3, GP, or P2 that are not used by January 31, 2017 will cease to be valid.
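Purely as an illustration, the voucher routing rules described above can be summarized in a short Python sketch. The prefixes and dates are taken from this article; the helper function itself is hypothetical and not part of any Open Group or Pearson VUE tooling.

    from datetime import date

    # Voucher prefixes as listed in this article.
    PEARSON_VUE_PREFIX = "OG"                        # vouchers issued for Pearson VUE
    TRANSFERABLE = {"23", "50", "93", "95", "96", "98",
                    "2X", "9X", "SX", "ZC", "ZX"}    # accepted at Pearson VUE from February 1, 2017
    PROMETRIC_ONLY = {"ER", "G2", "G3", "GP", "P2"}  # must be used at Prometric by January 31, 2017

    CUTOVER = date(2017, 2, 1)  # date on which Pearson VUE becomes the sole exam provider

    def where_to_book(voucher_code, on_date):
        """Hypothetical helper: where can this voucher be redeemed on a given date?"""
        if voucher_code.startswith(PEARSON_VUE_PREFIX):
            return "Pearson VUE"
        prefix = voucher_code[:2]
        if prefix in TRANSFERABLE:
            return "Pearson VUE" if on_date >= CUTOVER else "Prometric"
        if prefix in PROMETRIC_ONLY:
            return "Prometric (by January 31, 2017)" if on_date < CUTOVER else "no longer valid"
        return "unknown voucher type - contact The Open Group"

    print(where_to_book("OG1234567", date(2016, 11, 1)))  # Pearson VUE
    print(where_to_book("2X9876543", date(2017, 3, 1)))   # Pearson VUE (voucher carried over)
    print(where_to_book("ER5555555", date(2017, 3, 1)))   # no longer valid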

Q: Can I use my Prometric voucher to register in January for an exam in February?

A: No. Your Prometric voucher will not be accepted at Pearson VUE until February 1, 2017.

Q: If I failed the Combined exam at Prometric, can I retake the failed part at Pearson VUE?

A: Yes, you can retake the failed part at Pearson VUE. If the account you use to log in to Pearson VUE contains the email address you used when you took your exam at Prometric, then we will be able to match your new exam results to your prior results.

See our Pearson VUE Frequently Asked Questions for more information about taking an exam at Pearson VUE or our exam registration page to Register for an Exam at Pearson VUE.


Andrew Josey is VP, Standards and Certification, overseeing all certification and testing programs of The Open Group. He also manages the standards process for The Open Group.

Since joining the company in 1996, Andrew has been closely involved with the standards development, certification and testing activities of The Open Group. He has led many standards development projects including specification and certification development for the ArchiMate®, TOGAF®, POSIX® and UNIX® programs.

He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects (AEA). He holds an MSc in Computer Science from University College London.



Deborah Schoonover is the Director of Certification at The Open Group, responsible for the development and operation of The Open Group’s certification and accreditation programs. In this role, she engages with various working groups to define each program and the policies and legal documents that underpin the program, defines the business requirements for and oversees development of the underlying software systems, and oversees operational delivery of the certification services.

Prior to joining The Open Group, Deborah held development, quality management, and marketing roles at Cadence Design Systems. Deborah holds a Bachelor of Science degree in Computer Science from Lehigh University and a Master of Business Administration (MBA) degree from Boston University.




The Role of Enterprise Architecture in Platform 3.0 Transformation

By Stuart Macgregor, CEO, Real IRM and The Open Group South Africa

Our transition to the highly-connected realm of Platform 3.0 will radically disrupt the way that we approach Enterprise Architecture (EA).

The current architectures and methodologies will simply not hold up in the era of Platform 3.0 – characterised by the forces of big data, mobility, the Internet of Things, and social media colliding.

In the Platform 3.0 era, power shifts to the customer – as we choose from a range of services offered conveniently via digital channels. By embracing Platform 3.0, organisations can respond to newly-empowered customers. New entrants can scale at unprecedented rates, and incumbents can pivot business models rapidly, while entering and exiting new markets as opportunities emerge.

EA plays an essential role in making these possibilities a reality. EA infuses IT into the DNA of the business. No longer is it about ‘IT’ and ‘business’. Technology is absolutely integral to the entire business, and business leaders are quickly realising the fundamental truth that ‘if you can’t change the system, you can’t change the business’.

A new and exciting Platform 3.0 architectural reality is emerging. It’s composed of microservices and platforms that are combined in radical new ways to serve point-in-time needs – powering new-found business opportunities and revenue streams, dramatically transforming your organisation.


Managing volatile change

But, while driven by an urgent need to transform, to become faster and more agile, large organisations are often constrained by legacy infrastructure.

With an EA-focused approach, organisations can take a step back, and design a set of architectures to manage the volatile change that’s inherent in today’s quickly-digitising industries. EA allows business systems in different departments to be united, creating what The Open Group (the vendor-neutral global IT standards and certifications consortium) aptly describes as a “boundaryless” flow of information throughout the organisation.

Platform 3.0 refers to radically different ways for the organisation to securely engage with partners, suppliers, and others in your value chain or ecosystem. For a retailer, stock suppliers could access real-time views of your inventory levels and automatically prepare new orders. Or a factory, for example, could allow downstream distributors a view of the production facility, to know when the latest batch run will be ready for collection.
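As a rough sketch of the retailer example (the data shapes, thresholds and function names below are invented for illustration; in practice the inventory view would be exposed to the supplier through a secured API), the supplier-side automation could look something like this in Python:

    # Illustrative sketch of the retailer scenario above; names and thresholds
    # are invented for the example and are not taken from any standard.

    def get_inventory_view():
        """Real-time inventory view that the retailer exposes to its stock supplier."""
        return {
            "sku-0001": {"on_hand": 12, "reorder_point": 40, "reorder_qty": 100},
            "sku-0002": {"on_hand": 75, "reorder_point": 40, "reorder_qty": 100},
        }

    def prepare_replenishment_order(inventory_view):
        """Supplier-side logic: automatically prepare an order for low-stock items."""
        return [
            {"sku": sku, "quantity": item["reorder_qty"]}
            for sku, item in inventory_view.items()
            if item["on_hand"] < item["reorder_point"]
        ]

    order = prepare_replenishment_order(get_inventory_view())
    print(order)  # [{'sku': 'sku-0001', 'quantity': 100}]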

In almost every industry, there are a number of new disruptors offering complementary service offerings to incumbent players (such as Fintech players in the Banking industry). To embrace partnerships, venture-capital opportunities, and acquisitions, organisations need extensible architectural platforms.

More and more transactions are moving between organisations via connected, instantaneous, automated platforms. We’re seeing the fulfilment of The Open Group vision of Boundaryless Information Flow™ between organisations, which fuels greater efficiencies.

Architecting for an uncertain future

We need to architect for an uncertain future, resigning ourselves to not always knowing what will come next, but being prepared with an architectural approach that enables the discovery of next-generation digital business opportunities.

By exploring open standards, this transformation can be accelerated. The concept of ‘openness’ is at the very heart of Platform 3.0-based business transformation. As different business systems fall into and out of favour, you’ll want to benefit from new innovations by quickly unplugging one piece of the infrastructure, and plugging in a new piece.

Open standards allow us to evolve from our tired and traditional applications to dynamic catalogues of microservices and APIs that spark continuous business evolution and renewal. Open standards help us to reach a state of radical simplicity with our architecture.

The old-world view of an application is transformed into new applications – volatile and continually morphing – combining sets of APIs that run microservices and serve a particular business need at a particular point in time. These APIs and microservices will form the basis for whatever applications we’d like to build on top of them.

Architects need to prepare themselves and their organisations for an uncertain future, where technology’s evolution and businesses’ changing demands are not clearly known. By starting with a clear understanding of the essential building blocks, and the frameworks to re-assemble these in new ways in the future, one can architect for the uncertain future lying in wait.

Platform 3.0 requires a shift towards “human-centered architectures”, where we start acknowledging that there’s no single version of the truth. Depending on one’s role and skill-set, and the level of detail they require, everyone will perceive the organisation’s structure and processes differently.

But ultimately, it’s not about the user, or the technology, or the architecture itself. The true value resides in the content, and not the applications that house, transmit or present that content. Human-centered architectural principles place the emphasis on the content, and the way in which different individuals (from inside or outside the organisation) need to use that content in their respective roles.

As the EA practice formalises intellectual capital in the form of business models and rules, we create an environment for machine learning and artificial intelligence to play an essential role in the future of the organisation. Many describe this as the future of Platform 3.0, perhaps even the beginning of Platform 4.0?

Where this will eventually lead us is both exciting and terrifying.



Stuart Macgregor is the CEO of Real IRM Solutions and The Open Group South Africa. Through his personal achievements, he has gained the reputation of an Enterprise Architecture and IT Governance specialist, both in South Africa and internationally.

Macgregor participated in the development of the Microsoft Enterprise Computing Roadmap in Seattle. He was then invited by John Zachman to Scottsdale, Arizona to present a paper on using the Zachman framework to implement ERP systems. In addition, Macgregor was selected as a member both of the SAP AG Global Customer Council for Knowledge Management and of the panel that developed the COBIT 3rd Edition Management Guidelines. He has also assisted a global Life Sciences manufacturer to define their IT Governance framework, and a major financial institution to define their global, regional and local IT organizational designs and strategy. He was also selected as a core member of the team that developed the South African Breweries (SABMiller) plc global IT strategy.

Stuart, as the lead researcher, assisted the IT Governance Institute in mapping COBIT 4.0 to TOGAF®, an Open Group standard. This mapping document was published by ISACA and The Open Group. He participated in the COBIT 5 development workshop held in London in 2010.


The Open Group Paris Event to Take Place in October 2016

The Open Group, the vendor-neutral IT consortium, is hosting its next global event in Paris, France, between October 24-27, 2016. The event, taking place at the Hyatt Regency Paris Étoile, will focus on e-Government, as well as how to address the dimensions of e-Society, e-Technology and e-Management.

Industry experts will look at issues surrounding business transformation, business analysis, information sharing, e-Health, privacy and cybersecurity. Sessions will examine the strategic execution and the application of emerging technologies and management techniques to e-Government. Presentations will also include the latest on the European Interoperability Reference Architecture (EIRA) and the Regulatory Impact of the General Data Protection Regulation (GDPR) on Personal Data Architecture.

The event features key industry speakers including:

  • Rob Akershoek, Solution Architect (IT4IT), Shell
  • Robert Weisman, University of Ottawa
  • Roland Genson, Director, General Secretariat of the Council of the European Union
  • Olivier Flous, Vice President of Engineering, Thales Group

Full details on the agenda and speakers can be found here.

The focus of Monday’s keynote sessions will be Standardized Boundaryless Information Flow™ and how Enterprise Architecture can be used in e-Government. There will also be a significant emphasis on business transformation, with the Tuesday plenary and tracks looking at successful case studies, standards as enablers, and architecting the digital business.

Further topics to be covered at the event include:

  • IT4IT™ – managing the businesses of IT, vendor adoption of IT4IT™ and a CIO-level view of the standard
  • Open Platform 3.0™ – the customer experience and digital business, architecting Smart Cities and how to use IoT technologies
  • ArchiMate® – new features of ArchiMate® 3.0 and a look at open standards in practice
  • Open Business Architecture – examining the new Open Business Architecture standard and how to address enterprise transformation

Member meetings will take place throughout the course of the three-day event for ArchiMate®, Architecture, Healthcare, IT4IT™, Open Platform 3.0™, Open Trusted Technology and Security Forum members.

Registration for The Open Group Paris event is open now, is available to members and non-members, and can be found here.

@theopengroup #ogPARIS




The Enviable Pedigree of UNIX® and POSIX®

By Andrew Josey, VP, Standards and Certification, The Open Group

Technology can be a fickle thing. Spurred by perpetual innovation, the tech industry has only one constant: change. As such, we can expect that whatever is the hottest thing in the industry today—Cloud, Big Data, Mobile, Social, what have you—will be yesterday’s news within a few years’ time. That is how the industry moves and sustains itself, with constant development and creativity—all of which is only getting faster and faster.

But today’s breakthroughs would not have been possible without what came before them—a fact we sometimes forget. Mainframes led to personal computers, which gave way to laptops, then tablets and smartphones, and now the Internet of Things. Today much of the interoperability we enjoy between our devices and systems—whether at home, the office or across the globe—owes much to efforts in the 1980s and 1990s to create an interoperable operating system (OS) that could be used across diverse computing environments: the UNIX operating system.

Created at AT&T Bell Laboratories in the early 1970s, the UNIX operating system was developed as a self-contained system that could be easily adapted and run on commodity hardware. By the 1980s, UNIX workstations were widely used in academia and commercially, with a large number of system suppliers, such as HP, IBM, and Sun Microsystems (now Oracle), developing their own flavors of the OS.

At the same time, a number of organizations began standardization efforts around the system. By the late 1980s, three separate organizations were publishing different standards for the UNIX operating system, including IEEE, ISO/IEC JTC1 and X/Open (which eventually became The Open Group).

As part of its standardization efforts, IEEE developed a small set of application programming interfaces (APIs). This effort was known as POSIX, or Portable Operating System Interface. Published in 1988, the POSIX.1 standard was the first attempt outside the work at AT&T and BSD (the UNIX derivative developed at the University of California at Berkeley) to create common APIs for UNIX systems. In parallel, X/Open (an industry consortium consisting at that time of over twenty UNIX suppliers) began developing a set of standards aligned with POSIX that consisted of a superset of the POSIX APIs. The X/Open standard was known as the X/Open Portability Guide and had an emphasis on usability. ISO also got involved in the effort, taking the POSIX standard and internationalizing it.

In 1995, the Single UNIX Specification was created to represent the core of the UNIX brand. Born of a superset of POSIX APIs, the specification provided a richer set of requirements than POSIX for functionality, scalability, reliability and portability for multiuser computing systems. At the same time, the UNIX trademark was transferred to X/Open (now The Open Group). Today, The Open Group holds the trademark in trust for the industry, and suppliers that develop UNIX systems undergo certification, which includes over 40,000 tests, to assure their compatibility and conformance to the standard.

These trifurcated efforts by separate standards organizations continued through most of the 1990s, with the people involved in developing the standards constantly bouncing between organizations and separate meetings. In late 1997, a number of vendors, tired of keeping track of three separate parallel efforts, suggested that all three organizations come together to work on one standard.

In 1998, The Open Group, which had formed through the merger of X/Open and the Open Software Foundation, met with ISO/IEC JTC 1 and IEEE technical experts for an inaugural meeting at IBM’s offices in Austin, Texas. At this meeting, it was agreed that they would work together on a single set of standards that each organization could approve and publish. Since then the approach to specification development has been “write once, adopt everywhere,” with the deliverables being a set of specifications that carry the IEEE POSIX designation, The Open Group Technical Standard designation, and the ISO/IEC designation. Known as the Austin Group, the three bodies still work together today to progress the joint standard. The new standard not only streamlined the documentation needed to work with the APIs but also simplified what was available to the market under one common standard.

A constant evolution

As an operating system that forms the foundational underpinnings of many prominent computing systems, the UNIX OS has always had a number of advantages over other operating systems. One of those advantages is that the standardized APIs make it possible to write code that conforms to the standard and can run on multiple systems made by different vendors. If you write your code to the UNIX standard, it will run on systems made by IBM, HP, Oracle and Apple, since they all follow the UNIX standard and have submitted their operating systems for formal certification. Free OSs such as Linux and BSD also support the majority of the UNIX and POSIX APIs, so those systems are largely compatible with all the others. That level of portability is key for the industry and users, enabling application portability across a wide range of systems.
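To make that portability concrete, here is a minimal sketch (not from the original article) of a small program written only against POSIX.1 interfaces. Assuming a conforming C compiler and the standard <unistd.h> header, the same source should build and run unchanged on certified UNIX systems as well as on Linux and BSD.

    /* portable.c - a minimal sketch using only POSIX.1 interfaces.
     * Illustrative only; assumes a POSIX-conforming system and C compiler. */
    #include <stdio.h>
    #include <unistd.h>   /* POSIX: getpid(), gethostname(), sysconf() */

    int main(void)
    {
        char host[256];   /* a generous buffer for the host name */

        if (gethostname(host, sizeof(host)) != 0) {
            perror("gethostname");
            return 1;
        }

        /* Every call below is defined by POSIX.1, so the program is not
         * tied to any single vendor's operating system. */
        printf("host: %s\n", host);
        printf("pid:  %ld\n", (long)getpid());
        printf("max open files: %ld\n", sysconf(_SC_OPEN_MAX));

        return 0;
    }

Building it with something like cc portable.c -o portable on any conforming system should be enough; no vendor-specific headers or libraries are needed.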

In addition, UNIX is known for its stability and reliability—even at great scale. Apple claims over 80 million Mac OS X systems in use today – all of them UNIX certified. The UNIX OS also forms the basis for many “big iron” systems. The operating system’s high throughput and processing power have made it an ideal OS for everything from supercomputing to systems used by the government and financial sectors—all of which require high reliability, scale and fast data processing.

The standard has also been developed so that users can “slice and dice” portions of it even when they don’t require the full functionality of the system, since one size does not fit all. Known as “profiles,” these subsets of the standard API sets can be used for any number of applications or devices. So although they are not full UNIX systems, a lot of devices out there carry the standard APIs inside them, notably set-top boxes, home routers, in-flight entertainment systems and many smartphones.
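The formal profiles themselves are defined in related standards, but POSIX also gives an application a direct way to ask which optional pieces of the standard a given platform provides: <unistd.h> defines option names that sysconf() can query at run time, for features such as threads and asynchronous I/O. The sketch below is illustrative only (it is not from the article, and the options shown are just examples), but it should compile on any conforming system.

    /* options.c - a sketch of querying which POSIX options a system advertises.
     * Illustrative only; the options checked here are just examples. */
    #include <stdio.h>
    #include <unistd.h>   /* POSIX option names and sysconf() */

    int main(void)
    {
        /* The supported POSIX revision, e.g. 200809 for POSIX.1-2008. */
        printf("POSIX version:    %ld\n", sysconf(_SC_VERSION));

        /* sysconf() reports a non-positive value for unsupported options. */
        printf("Threads:          %s\n",
               sysconf(_SC_THREADS) > 0 ? "yes" : "no");
        printf("Asynchronous I/O: %s\n",
               sysconf(_SC_ASYNCHRONOUS_IO) > 0 ? "yes" : "no");

        return 0;
    }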

Although the UNIX and POSIX standards tend to be hidden, deeply embedded in the technologies and devices they enable today, they have been responsible for a great many advances across industries from science to entertainment. Consider the following:

  • Apple’s Mac OS X, the second most widely used desktop operating system today, is a certified UNIX system
  • The first server for the World Wide Web, developed by Tim Berners-Lee, ran on a UNIX system
  • The establishment of the World Wide Web was driven by the availability of connected UNIX systems
  • IBM’s Deep Blue supercomputer, a UNIX system, was the first computer to beat World Chess Champion Garry Kasparov in 1997
  • Both DNA and RNA were sequenced using a UNIX system
  • For eight consecutive years (1995-2002), each film nominated for an Academy Award for Distinguished Achievement in Visual Effects was created on Silicon Graphics computers running the UNIX OS.

Despite what one might think, both the UNIX and POSIX standards are still under active development today. The community for each is very active, meeting more than 40 times a year to continue developing the specifications.

Things are always changing, so there are new areas of functionality to standardize. The standard is also large, so there is plenty of maintenance work and ongoing opportunity to improve clarity and portability across systems.

Although it might seem that once a technology becomes standardized it becomes static, standardization usually has the opposite effect—once there is a standard, the market tends to grow even more because organizations know that the technology is trusted and stable enough to build upon. Once the platform is there, you can add things to it and run things above it. We have about 2,000 application interfaces in UNIX today.

And as Internet-worked devices continue to proliferate in today’s connected world, chances are many of these systems that need big processing power, high reliability and huge scale are going to have a piece of the UNIX standard behind them—even if it’s deep beneath the covers.

Andrew Josey is VP, Standards and Certification at The Open Group, overseeing all certification and testing programs. He also manages the standards process for The Open Group.

Since joining the company in 1996, Andrew has been closely involved with the standards development, certification and testing activities of The Open Group. He has led many standards development projects including specification and certification development for the ArchiMate®, TOGAF®, POSIX® and UNIX® programs.

He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects (AEA).  He holds an MSc in Computer Science from University College London.



Filed under Association of Enterprise Architects, Certifications, digital business, EA, enterprise architecture, Internet of Things, IoT, IT, operating system, Oracle, Single UNIX Specification, standards, Uncategorized, UNIX