
ArchiMate® Q&A with Phil Beauvoir

By The Open Group

The Open Group’s upcoming Amsterdam Summit in May will feature a full day on May 14 dedicated to ArchiMate®, an open and independent modeling language for Enterprise Architecture, supported by tools that allow Enterprise Architects to describe, analyze and visualize relationships among business domains in an unambiguous way.

One of the tools developed to support ArchiMate is Archi, a free, open-source tool created by Phil Beauvoir at the University of Bolton in the UK as part of a Jisc-funded Enterprise Architecture project that ran from 2009-2012. Since its development, Archi has grown from a relatively small, home-grown tool to become a widely used open-source resource that averages 3,000 downloads per month and whose community ranges from independent practitioners to Fortune 500 companies. Here we talk with Beauvoir about how Archi was developed, the problems inherent in sustaining an open source product, its latest features and whether it was named after the Archie comic strip.

Beauvoir will be a featured speaker during the ArchiMate Day in Amsterdam.

Tell us about the impetus for creating the Archi tool and how it was created…
My involvement with the ArchiMate language has mainly been through the development of the software tool, Archi. Archi has, I believe, acted as a driver and as a hub for activity around the ArchiMate language and Enterprise Architecture since it was first created.

I’ll tell you the story of how Archi came about. Let’s go back to the end of 2009. At that point, I think ArchiMate and Enterprise Architecture were probably being used quite extensively in the commercial sector, especially in The Netherlands. The ArchiMate language had been around for a while at that point but was a relatively new thing to many people, at least here in the UK. If you weren’t part of the EA scene, it would have been a new thing to you. In the UK, it was certainly new for many in higher education and universities, which is where I come in.

Jisc, the UK funding body, funded a number of programs in higher education exploring digital technologies and other initiatives. One of the programs being funded was to look at how to improve systems using Enterprise Architecture within the university sector. Some of the universities had already been introduced to ArchiMate and Enterprise Architecture and were trying it out for themselves – they were new to it and, of course, one of the first things they needed were tools. At that time, and I think it’s still true today, a lot of the tools were quite expensive. If you’re a big commercial organization, you might be able to afford the licensing costs for tools and support, but for a small university project it can be prohibitive, especially if you’re just dipping your toe into something like this. So some colleagues within Jisc and the university I worked at said, ‘well, what about creating a small, open source tool which isn’t over-complicated but does enough to get people started with ArchiMate? And we can fund six months of work to produce a proof-of-concept tool’.

That takes us into 2010, when I was working for the university that was approached to do this work. After six months, by June 2010, I had created the first 1.0 version of Archi and it was (and still is) free, open source and cross-platform. Some of the UK universities said ‘well, that’s great, because now the barrier to entry has been lowered, we can use this tool to start exploring the ArchiMate language and getting on board with Enterprise Architecture’. That’s really where it all started.

So some of the UK universities that were exploring ArchiMate and Enterprise Architecture had a look at this first version of Archi, version 1.0, and said ‘it’s good because it means that we can engage with it without committing at this stage to the bigger tooling solutions.’ You have to remember, of course, that universities were (and still are) a bit strapped for cash, so that’s a big issue for them. At the time, and even now, there really aren’t any other open-source or free tools doing this. That takes us to June 2010. At this point we got some more funding from the Jisc, and kept on developing the tool and adding more features to it. That takes us through 2011 and then up to the end of 2012, when my contract came to an end.

Since the official funding ended and my contract finished, I’ve continued to develop Archi and support the community that’s built up around it. I had to think about the sustainability of the software beyond the project, and sometimes this can be difficult, but I took it upon myself to continue to support and develop it and to engage with the Archi/ArchiMate community.

How did you get involved with The Open Group and bringing the tool to them?
I think it was inevitable really due to where Archi originated, and because the funding came from the Jisc, and they are involved with The Open Group. So, I guess The Open Group became aware of Archi through the Jisc program and then I became involved with the whole ArchiMate initiative and The Open Group. I think The Open Group is in favor of Archi, because it’s an open source tool that provides a neutral reference implementation of the ArchiMate language. When you have an open standard like ArchiMate, it’s good to have a neutral reference model implementation.

How is this tool different from other tools out there and what does it enable people to do?
Well, firstly Archi is a tool for modeling Enterprise Architecture using the ArchiMate language and notation, but what really makes it stand out from the other tools is its accessibility and the fact that it is free, open source and cross-platform. It can do a lot of, if not all of, the things that the bigger tools provide without any financial or other commitment. However, free is not much use if there’s no quality. One thing I’ve always strived for in developing Archi is to ensure that even if it only does a few things compared with the bigger tools, it does those things well. I think with a tool that is free and open-source, you have a lot of support and goodwill from users who provide positive encouragement and feedback, and you end up with an interesting open development process.

I suppose you might regard Archi’s relationship to the bigger ArchiMate tools in the same way as you’d compare Notepad to Microsoft Word. Notepad provides the essential writing features, but if you want the real McCoy then you go and buy Microsoft Word. The funny thing is, this is where Archi was originally targeted – at beginners, getting people to start to use the ArchiMate language. But then I started to get emails — even just a few months after its first release — from big companies, insurance companies and the like saying things like ‘hey, we’re using this tool and it’s great’, ‘thanks for this, when are we going to add this or that feature?’ or ‘how many more features are you going to add?’ This surprised me somewhat since I wondered why they hadn’t invested in one of the available commercial tools. Perhaps ArchiMate, and even Enterprise Architecture itself, was new to these organizations and they were using Archi as their first software tool before moving on to something else. Having said that, there are some large organizations out there that do use Archi exclusively.

Which leads to an interesting dilemma — if something is free, how do you continue developing and sustaining it? This is an issue that I’m contending with right now. There is a PayPal donation button on the front page of the website, but the software is open source and, in its present form, will remain open source; but how do you sustain something like this? I don’t have the complete answer right now.

Given that it’s a community product, it helps that the community contributes ideas and develops code, but at the same time you still need someone to give their time to coordinate all of the activity and support. I suppose the classic model is one of sponsorship, but we don’t have that right now, so at the moment I’m dealing with issues around sustainability.

How much has the community contributed to the tool thus far?
The community has contributed a lot in many different ways. Sometimes a user might find a bug and report it, or they might offer a suggestion on how a feature can be improved. In fact, some of the better features have been suggested by users. Overall, community contributions seem to have really taken off in the last few months – more than in the whole previous lifespan of Archi. I think this may be due to the new Archi website and a lot of renewed activity. Lately there have been more code contributions, corrections to the documentation and user engagement in the future of Archi. And then there are users who are happy to ask ‘when is Archi going to implement this big feature, and when is it going to have full support for repositories?’ and of course they want this for free. Sometimes that’s quite hard to accommodate, because you think ‘sure, but who’s going to do all this work and contribute the effort?’ That’s certainly an interesting issue for me.

How many downloads of the tool are you getting per month? Where is it being used?
At the moment we’re seeing around 3,000 downloads a month of the tool — I think that’s a lot actually. Also, I understand that some EA training organizations use Archi for their ArchiMate training, so there are quite a few users there, as well.

The number one country for downloading the app and visiting the website is the Netherlands, followed by the UK and the United States. In the past three months, the UK and The Netherlands have been about equal in numbers in their visits to the website and downloads, followed by the United States, France, Germany, Canada, then Australia, Belgium, and Norway. We have some interest from Russia too. Sometimes it depends on whether ArchiMate or Archi is in the news at any given time. I’ve noticed that when there’s a blog post about ArchiMate, for example, you’ll see a spike in the download figures and the number of people visiting the website.

How does the tool fit into the overall schema of the modeling language?
It supports all of the ArchiMate language concepts, and I think it offers the core functionality you’d want from an ArchiMate modeling tool — the ability to create diagrams, viewpoints, analysis of model objects, reporting, color schemes and so on. Of course, the bigger ArchiMate tools will let you manipulate the model in more sophisticated ways and create more detailed reports and outputs. This is an area that we are trying to improve, and the people who are now actively contributing to Archi are full-time Enterprise Architects who are able to contribute to these areas. For example, we have a user and contributor from France, and he and his team use Archi, and so they are able to see first-hand where Archi falls short and they are able to say ‘well, OK, we would like it to do this, or that could be improved,’ so now they’re working towards strengthening any weak areas.

How did you come up with the name?
What happens is you have pet names for projects and I think it just came about that we started calling it “Archie,” like the guy’s name. When it was ready to be released I said, ‘OK, what should we really call the app?’ and by that point everyone had started to refer to it as “Archie.” Then somebody said ‘well, everybody’s calling it by that name so why don’t we just drop the “e” from the name and go with that?’ – so it became “Archi.” I suppose we could have spent more time coming up with a different name, but by then the name had stuck and everybody was calling it that. Funnily enough, there’s a comic strip called ‘Archie’ and an insurance company that was using the software at the time told me that they’d written a counterpart tool called ‘Veronica,’ named after a character in the comic strip.

What are you currently working on with the tool?
For the last few months, I’ve been adding new features – tweaks, improvements, tightening things up, engaging with the user community, listening to what’s needed and trying to implement these requests. I’ve also been adding new resources to the Archi website and participating on social media like Twitter, spreading the word. I think the use of social media is really important. Twitter, the User Forums and the Wikis are all points where people can provide feedback and engage with me and other Archi developers and users. On the development side of things, we host the code at GitHub, and again that’s an open resource that users and potential developers can go to. I think the key words are ‘open’ and ‘community driven.’ These social media tools, GitHub and the forums all contribute to that. In this way everyone, from developer to user, becomes a stakeholder – everyone can play their part in the development of Archi and its future. It’s a community product and my role is to try and manage it all.

What will you be speaking about in Amsterdam?
I think the angle I’m interested in is what can be achieved by a small number of people taking the open source approach to developing software and building and engaging with the community around it. For me, the interesting part of the Archi story is not so much about the software itself and what it does, but rather the strong community that’s grown around it, the extent of the uptake of the tool and the way in which it has enabled people to get on board with Enterprise Architecture and ArchiMate. It’s the accessibility and agility of this whole approach that I like and also the activity and buzz around the software and from the community – that for me is the interesting thing about this process.

For more information on ArchiMate, please visit:
http://www.opengroup.org/subjectareas/enterprise/archimate

For information on the Archi tool, please visit: http://www.archimatetool.com/

For information on joining the ArchiMate Forum, please visit: http://www.opengroup.org/getinvolved/forums/archimate

Phil Beauvoir has been developing, writing, and speaking about software tools and development for over 25 years. He was Senior Researcher and Developer at Bangor University, and, later, the Institute for Educational Cybernetics at Bolton University, both in the UK. During this time he co-developed a peer-to-peer learning management and groupware system, a suite of software tools for authoring and delivery of standards-compliant learning objects and meta-data, and tooling to create IMS Learning Design compliant units of learning. In 2010, working with the Institute for Educational Cybernetics, Phil created the open source ArchiMate Modelling Tool, Archi. Since 2013 he has been curating the development of Archi independently. Phil holds a degree in Medieval English and Anglo-Saxon Literature.


Q&A with Allen Brown, President and CEO of The Open Group

By The Open Group

Last month, The Open Group hosted its San Francisco 2014 conference themed “Toward Boundaryless Information Flow™.” Boundaryless Information Flow has been the pillar of The Open Group’s mission since 2002 when it was adopted as the organization’s vision for Enterprise Architecture. We sat down at the conference with The Open Group President and CEO Allen Brown to discuss the industry’s progress toward that goal and the industries that could most benefit from it now as well as The Open Group’s new Dependability through Assuredness™ Standard and what the organization’s Forums are working on in 2014.

The Open Group adopted Boundaryless Information Flow as its vision in 2002, and the theme of the San Francisco Conference has been “Towards Boundaryless Information Flow.” Where do you think the industry is at this point in progressing toward that goal?

Well, it’s progressing reasonably well, but the challenge is, of course, that when we established that vision back in 2002, life was a little less complex and a little less fast-paced. Although organizations are improving the way that they act in a boundaryless manner – and of course that changes by industry – some industries still have big silos and stovepipes, they still have big boundaries. But generally speaking we are moving forward, and everyone understands the need for information to flow in a boundaryless manner, for people to be able to access and integrate information and to provide it to the teams that need it.

One of the keynotes on Day One focused on the opportunities within the healthcare industry, and The Open Group recently started a Healthcare Forum. Do you see the healthcare industry as a test case for Boundaryless Information Flow, and why?

Healthcare is one of the verticals that we’ve focused on. And it is not so much a test case, but it is an area that absolutely seems to need information to flow in a boundaryless manner so that everyone involved – from the patient through the administrator through the medical teams – has access to the right information at the right time. We know that in many situations there are shifts of medical teams, and from one medical team to another they don’t have access to the same information. Information isn’t easily shared between medical doctors, hospitals and payers. What we’re trying to do is to focus on the needs of the patient and improve the information flow so that you get better outcomes for the patient.

Are there other industries where this vision might be enabled sooner rather than later?

I think that we’re already making significant progress in what we call the Exploration, Mining and Minerals industry. Our EMMM™ Forum has produced an industry-wide model that is being adopted throughout that industry. We’re also looking at whether we can have an influence in the airline industry, automotive industry, manufacturing industry. There are many, many others, government and retail included.

The plenary on Day Two of the conference focused on The Open Group’s Dependability through Assuredness standard, which was released last August. Why is The Open Group looking at dependability and why is it important?

Dependability is ultimately what you need from any system. You need to be able to rely on that system to perform when needed. Systems are becoming more complex, they’re becoming bigger. We’re not just thinking about the things that arrive on the desktop, we’re thinking about systems like the barriers at subway stations or Tube stations, we’re looking at systems that operate any number of complex activities. And they bring an awful lot of things together that you have to rely upon.

Now in all of these systems, what we’re trying to do is to minimize the amount of downtime, because downtime can result in financial loss or, at worst, loss of human life, and we’re trying to focus on that. What is interesting about the Dependability through Assuredness Standard is that it brings together so many other aspects of what The Open Group is working on. Obviously the architecture is at the core, so it’s critical that there’s an architecture and that we understand the requirements of that system. It’s also critical that we understand the risks, so that fits in with the work of the Security Forum and the work they’ve done on Risk Analysis and Dependency Modeling. Out of the dependency modeling we can get the use cases, so that we can understand where the vulnerabilities are, what action has to be taken if we identify a vulnerability, and what action needs to be taken in the event of a failure of the system. If we do that and assign accountability to people – who will do what by when in the event of an anomaly being detected or a failure happening – we can actually minimize that downtime or remove it completely.

Now the other great thing about this is that it’s not only a focus on the architecture for the actual system development – as the system changes over time, requirements change, legislation that might affect it changes, and external changes all go into that system – but there’s also another cycle within that system that deals with failure, analyzes it and makes sure it doesn’t happen again. There have been so many instances of failure recently. In the UK, for example, a bank was recently unable to process debit cards or credit cards for customers for about three or four hours, and that was probably caused by work done on a routine basis over a weekend. But if Dependability through Assuredness had been in place, that could have been averted, and it could have saved an awful lot of difficulty for an awful lot of people.

How does the Dependability through Assuredness Standard also move the industry toward Boundaryless Information Flow?

It’s part of it. It’s critical that with big systems the information has to flow. But this is not so much about the information as about how a system is going to work in a dependable manner.

Business Architecture was another featured topic in the San Francisco plenary. What role can business architecture play in enterprise transformation vis a vis the Enterprise Architecture as a whole?

A lot of people in the industry are talking about Business Architecture right now and trying to focus on that as a separate discipline. We see it as a fundamental part of Enterprise Architecture. And, in fact, there are three legs to Enterprise Architecture: there’s Business Architecture; there’s the need for business analysts, who are critical to supplying the information; and then there are the solutions architects and other architects – data, applications architects and so on – that are needed. So those three legs are needed.

We find that there are broadly three types of Business Architect. There are those that use the analysis to understand what the business is doing in order to inform the solutions architects and other architects for the development of solutions. There are those that are more integrated with the business, who can understand what is going on and provide input into how that might be improved through technology. And there are those that can actually go another step and say: here are the advances in technology, and here are the opportunities for advancing the competitiveness of our organization.

What are some of the other key initiatives that The Open Group’s forum and work groups will be working on in 2014?

That kind of question is like when you’ve won an award – you’ve got to thank your friends, so apologies to anyone that I leave out. Let me start alphabetically with the Architecture Forum. The Architecture Forum obviously is working on the evolution of TOGAF®; they’re also working on the harmonization of TOGAF with ArchiMate®, and they have a number of projects within that – Business Architecture, of course, is one of the projects going on in the Architecture space. The ArchiMate Forum is pushing ahead with ArchiMate – they’ve got two interesting activities going on at the moment. One is called ArchiMetals, which is going to be a sister publication to the ArchiSurance case study: where ArchiSurance provides an example of how ArchiMate is used in the insurance industry, ArchiMetals is going to be used in a manufacturing context, so there will be a whitepaper on that and there will be examples and artifacts that we can use. They’re also working on an ArchiMate standard for interoperability between modeling tools. There are four tools that are accredited and certified by The Open Group right now, and we’re looking for that interoperability to help organizations that have multiple tools, as many of them do.

Going down the alphabet, there’s DirecNet. Not many people know about DirecNet, but DirecNet™ is work that we do with the U.S. Navy on standards for long-range, high-bandwidth mobile networking. Then we can go to the FACE™ Consortium – the Future Airborne Capability Environment. The FACE Consortium is working on the next version of its standard and working toward accreditation and a certification program, and the uptake of that through procurement is absolutely amazing; we’re thrilled about that.

Healthcare we’ve talked about. The Open Group Trusted Technology Forum is working on how we can trust the supply chain for developed systems. They’ve released the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program, which was launched this week, and we already have one accredited vendor and two certified assessment labs. That is really exciting, because now we’ve got a way of helping any organization that has large, complex systems developed through a global supply chain to make sure they can trust that supply chain. And that is going to be invaluable to many industries, but also to the safety of citizens and the infrastructure of many countries. The other part of this is that we are planning to move the O-TTPS standard toward ISO standardization shortly.

The next one moving down the list would be Open Platform 3.0™. This is a really exciting part of Boundaryless Information Flow, it really is. This is about the convergence of SOA, Cloud, Social, Mobile, the Internet of Things and Big Data – bringing all of those activities together is really something that is critical right now, and we need to focus on it. In some of these areas, our Cloud Computing standards have already gone to ISO and been adopted. We’re working right now on the next products that are going to move through: we have a governance standard in process, and an ecosystem standard has recently been published. In the area of Big Data there’s a whitepaper that’s 25 percent complete, and there’s also a lot of work on the definition of what Open Platform 3.0 is – this week the members have been working on trying to define it. One of the really interesting activities that’s gone on is that the members of the Open Platform 3.0 Forum have produced something like 22 different use cases, and they’re really good. They’re concise and precise, and they cover a number of different industries, including healthcare and others. The next stage is to look at those and work on their ROI – the monetization, the value from those use cases – and that’s really exciting; I’m looking forward to peeking at that from time to time.

The Real Time and Embedded Systems Forum (RTES) is next. Real-Time is where we incubated the Dependability through Assuredness Framework, and that work is continuing to develop, which is really good. The core focus of the RTES Forum is high-assurance systems; they’re doing some work with ISO on that, along with a lot of other areas such as multicore, and, of course, they have a number of EC projects in which we’re partnering with others in the EC around RTES.

The Security Forum, as I mentioned earlier, has done a lot of work on risk and dependability. They have not only their standards for the Risk Taxonomy and Risk Analysis, but they’ve now also developed the Open FAIR Certification for People, which is based on those two standards. And we’re already starting to see people being trained and certified under that Open FAIR Certification Program that the Security Forum developed.

A lot of other activities are going on. Like I said, I probably left a lot of things out, but I hope that gives you a flavor of what’s going on in The Open Group right now.

The Open Group will be hosting a summit in Amsterdam May 12-14, 2014. What can we look forward to at that conference?

In Amsterdam we have a summit that’s going to bring together a lot of things, and it’s going to be a bigger conference than we had here. We’ve got a lot going on across all of our activities, and we’re going to bring together top-level speakers, so we’re looking forward to some interesting work during that week.


The Open Group Amsterdam Summit to Discuss Enabling Boundaryless Information Flow™

By The Open Group

The next Open Group Summit will cover the major issues and trends surrounding Boundaryless Information Flow™ on May 12-14 in Amsterdam. The event will feature presentations from leading companies, including IBM and Philips, on the key challenges facing effective information integration and enabling boundaryless information, as well as a day dedicated to ArchiMate®, a modeling language for Enterprise Architecture.

Boundaryless Information Flow™:

Boundaryless Information Flow, a shorthand representation of “access to integrated information to support business process improvements,” represents a desired state of an enterprise’s infrastructure that provides services to customers in an extended enterprise with the right information, at the right time and in the right context.

The Amsterdam Summit will bring together many individuals from throughout the globe to discuss key areas to enable Boundaryless Information Flow, including:

  • How EA and business processes can be used to facilitate integrated access to integrated information by staff, customers, suppliers and partners, to support the business
  • How organizations can achieve their business objectives by adopting new technologies and processes as part of the Enterprise Transformation management principles – making the whole process more a matter of design than of chance
  • How organizations move towards the interoperable enterprise, switching focus from IT-centric to enterprise-centric

ArchiMate Day:

On May 14, there will be an entire day dedicated to ArchiMate®, an Open Group standard. ArchiMate is an open and independent modeling language for Enterprise Architecture that is supported by different tool vendors and consulting firms. ArchiMate provides instruments to enable enterprise architects to describe, analyze and visualize the relationships among business domains in an unambiguous way. ArchiMate Day is appropriately located, as The Netherlands ranks as the number one country in the world for the number of ArchiMate® 2 certified individuals and as the number three country in the world for the number of TOGAF® 9 certified individuals.

The ArchiMate Day will provide the opportunity for attendees to:

  • Interact directly with other ArchiMate users and tool providers
  • Listen and understand how ArchiMate can be used to develop solutions to common industry problems
  • Learn about the future directions and meet with key users and developers of the language and tools
  • Interact with peers to broaden your expertise and knowledge in the ArchiMate language

Don’t wait to register! Early Bird registration ends March 30, 2014 – register now!


The Open Group and APMG Work Together to Promote TOGAF® and ArchiMate®

The APM Group (APMG) and The Open Group have announced a new partnership whereby APMG will support the accreditation services of The Open Group’s products. The arrangement will initially focus on TOGAF® and ArchiMate®, both standards of The Open Group.

APMG’s team of global assessors will be supporting The Open Group’s internal accreditation team in conducting their assessment activities. The scope of the assessments will focus on organizations, materials and training delivery.

“A significant value to The Open Group in this new venture is the ability to utilize APMG’s team of experienced multi-lingual assessors who are based throughout the world. This will help The Open Group establish new markets and ensure quality support of existing markets,” said James de Raeve, Vice President of Certification at The Open Group.

Richard Pharro, CEO of APMG, said, “This agreement presents an excellent opportunity for APMG Accredited Training Organizations which are interested in training in The Open Group’s products, as their existing APMG accredited status will be recognized by The Open Group. We believe our global network will significantly enhance the awareness and uptake of TOGAF and ArchiMate.”

About The Open Group

The Open Group is an international vendor- and technology-neutral consortium upon which organizations rely to lead the development of IT standards and certifications, and to provide them with access to key industry peers, suppliers and best practices. The Open Group provides guidance and an open environment in order to ensure interoperability and vendor neutrality. Further information on The Open Group can be found at http://opengroup.org.

About APM Group

The APM Group is one of the world’s largest certification bodies for knowledge-based workers. As well as the certifications mentioned above, we offer competency-based assessments for specialist roles in the security and aerospace industries. We work with government agencies to help develop people who can achieve great things for the organizations they work for.


The ArchiMate® Certification for People Program 2014 Updates

By Andrew Josey, The Open Group

Following on from the news in December of the 1,000th certification in the ArchiMate certification program, The Open Group has made some changes that will make the certification program more accessible. As of January 2014, it is now possible to self-study for both certification levels. Previously, attendance at a course was mandatory to achieve the Level 2 certification, known as ArchiMate 2 Certified.

To accommodate this, a revised examination structure has been introduced, as shown in the diagram below.

[Diagram: ArchiMate 2 examination structure]

There are two levels of certification:

  • ArchiMate 2 Foundation: Knowledge of the notation, terminology, structure, and concepts of the ArchiMate modeling language.
  • ArchiMate 2 Certified: In addition to knowledge and comprehension, the ability to analyze and apply the ArchiMate modeling language.

Candidates are able to choose whether they wish to become certified in a stepwise manner by starting with ArchiMate 2 Foundation and then at a later date ArchiMate 2 Certified, or bypass ArchiMate 2 Foundation and go directly to ArchiMate 2 Certified.

For those going directly to ArchiMate 2 Certified there is a choice of taking the two examinations separately or a Combined examination. The advantage of taking the two examinations over the single Combined examination is that if you pass Part 1 but fail Part 2 you can still qualify for ArchiMate 2 Foundation.

The ArchiMate 2 Part 1 examination comprises 40 questions in simple multiple-choice format. The ArchiMate 2 Part 2 examination comprises 8 questions using a gradient-scored, scenario-based format. Practice examinations are included as part of an Accredited ArchiMate Training course and are available with the Study Guide.

The examinations are delivered either at Prometric test centers or by Accredited Training Course Providers through The Open Group Internet Based Testing portal.

You can find an available accredited training course either by viewing the public Calendar of Accredited Training Courses or by contacting a provider using the Register of Accredited Training Courses.

The ArchiMate 2 Certification Self-Study Pack is available at http://www.opengroup.org/bookstore/catalog/b132.htm.

The hardcopy of the ArchiMate 2 Certification Study Guide is available to order from Van Haren Publishing at http://www.vanharen.net/9789401800020

ArchiMate is a registered trademark of The Open Group.

 Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.1, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.


ArchiMate® 2 Certification reaches the 1000th certification milestone

By Andrew Josey, The Open Group

We’re pleased to announce that the ArchiMate Certification for People program has reached the significant milestone of 1,000 individual certifications, with individuals certified in 30 different countries, as shown in the world map below.

[World map: ArchiMate 2 certified individuals by country]

The top 10 countries are:

Country       Certified   Share
Netherlands   458         45.8%
UK            104         10.4%
Belgium       76          7.6%
Australia     35          3.5%
Germany       32          3.2%
Norway        30          3.0%
Sweden        30          3.0%
USA           27          2.7%
Poland        16          1.6%
Slovakia      13          1.3%

The vision for the ArchiMate 2 Certification Program is to define and promote a market-driven education and certification program to support the ArchiMate modeling language Standard.

More information on the program is available at the ArchiMate 2 Certification site at http://www.opengroup.org/certifications/archimate/

Details of the ArchiMate 2 Examinations are available at: http://www.opengroup.org/certifications/archimate/docs/exam

The calendar of Accredited ArchiMate 2 Training courses is available at: http://www.opengroup.org/archimate/training-calendar/

The ArchiMate 2 Certification register can be found at https://archimate-cert.opengroup.org/certified-individuals

ArchiMate is a registered trademark of The Open Group.

 Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.

Comments Off

Filed under ArchiMate®, Certifications, Enterprise Architecture

ArchiMate® 2.1 Specification Maintenance Release

By Andrew Josey, The Open Group

We’re pleased to announce the latest release of the ArchiMate modeling language specification.

ArchiMate® 2.1, an Open Group standard, is a fully updated release of the ArchiMate Specification, addressing comments raised since the introduction of Issue 2.0 in 2012. It retains the major features and structure of ArchiMate 2.0, adding further detail and clarification, thereby preserving existing investment in the ArchiMate modeling language. In this blog, we take a brief look at what has changed[1].

The changes in this release are as follows:

  1. Additional explanatory text has been added in section 2.6 describing the ArchiMate Framework, its layers and aspects.
  2. Corrections have been made to figures throughout the specification for consistency with the text, including metamodel diagrams, concept diagrams and example models.
  3. An explanation has been added describing the use of colors within the specification. This makes it clear that the metamodel diagrams use colors to distinguish the different aspects of the ArchiMate Framework, and that within the models there are no formal semantics assigned to colors.
  4. Within the three layers, the concepts are now classified according to the aspects of the ArchiMate Framework: Active Structure Concepts (instead of Structural Concepts), Behavioral Concepts, and Passive Structure Concepts (instead of Informational Concepts).
  5. Duplicate text has been removed from the layers (for example, meaning was defined in both Section 3.4 and Section 3.4.2).
  6. In the Layers, a number of concept diagrams have been corrected to show all the permitted symbols for the concept; for example, Business Interface, Application Service, and Infrastructure Service.
  7. In the Architecture Viewpoints, the aspects for each viewpoint are now classified as per the ArchiMate Framework into Active Structure, Behavior, or Passive Structure.
  8. In the Architecture Viewpoints, a number of Concepts and Relationships diagrams have been updated to correct the relationships shown; similarly, a number of example diagrams have been corrected (for example, the use of a Communication Path to connect two nodes).
  9. In the Language Extension Mechanisms chapter, it has been made clear that specialization can also be applied to Relationships.
  10. In the Motivation Extension, it has been made clear that the association relationship can be used to connect motivation elements.
  11. The status of the appendices has been made clear; Appendix A is informative, whereas Appendix B is normative.
  12. Appendix B, the Relationship Tables, has had a number of corrections applied.

More information on the ArchiMate 2.1 Specification, including additional resources, can be obtained from The Open Group website here: http://www.opengroup.org/subjectareas/enterprise/archimate

[1] A detailed listing of the changes is available separately as Document U132, ArchiMate® 2.0 Technical Corrigendum 1 http://www.opengroup.org/bookstore/catalog/U132

Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.


Three Things We Learned at The Open Group, London

By Manuel Ponchaux, Senior Consultant, Corso

The Corso team recently visited London for The Open Group’s “Business Transformation in Finance, Government & Healthcare” conference (#ogLON). The event was predominantly for learning how experts address organisational change when aligning business needs with information technology – something very relevant in today’s climate. Nonetheless, there were a few other things we learnt as well…

1. Lean Enterprise Architecture

We were told that standard frameworks are too complex and multidimensional – people were interested in how we use them to provide simple working guidelines to the architecture team.

There were a few themes that frequently popped up, one of them being the measurement of Enterprise Architecture (EA) complexity. There seemed to be a lot of talk about Lean Enterprise Architecture as a solution to complexity issues.

2. Risk Management was popular

Clearly the events of the past few years (e.g., the financial crisis, banking regulations and other business transformations) mean that managing risk is increasingly important. So it was no surprise that the Risk Management and EA sessions were very popular and probably attracted the biggest crowds. The Corso session showcasing our IBM/CIO case study was a success, with 40+ attending!

3. Business challenges

People visited our stand and told us they were having trouble generating up-to-date heat maps. There was also a large number of attendees interested in Software as a Service as an alternative to traditional on-premise licensing.

So what did we learn from #ogLON?

Attendees are attracted to the ease of use of Corso’s ArchiMate plugin. http://www.corso3.com/products/archimate/

Together with the configurable nature of System Architect, ArchiMate® is a simple framework to use and makes a good starting point for supporting Lean Architecture.

Roadmapping and performing impact analysis reduces the influence of risk when executing any business transformation initiative.

We also learnt that customers in the industry are starting to embrace the concept of SaaS offerings as it provides them with a solution that can get them up and running quickly and easily – something we’re keen to pursue – which is why we’re now offering IBM Rational tools on the Corso cloud. Visit our website at http://www.corsocloud.com

http://info.corso3.com/blog/bid/323481/3-interesting-things-we-learned-at-The-Open-Group-London

Manuel Ponchaux, Senior Consultant, Corso


The Open Group London – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

We eagerly jumped into the second day of our Business Transformation conference in London on Tuesday, October 22nd! The setting was the magnificent Central Hall Westminster.

Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects (AEA), started off the morning by introducing our plenary, based on Healthcare Transformation. Steve noted that the numbers in healthcare spend are huge and that bringing Enterprise Architecture (EA) to healthcare will help with efficiencies.

The renowned Dr. Peter Sudbury, Healthcare Specialist with HP Enterprise Services, discussed the healthcare crisis (dollars, demand, demographics), the new healthcare paradigm, and barriers to change and innovation. Dr. Sudbury also commented on the real drivers of healthcare costs: healthcare inflation is intrinsically higher; innovation increases cost; and productivity improvements lag other industries.

[Photo: Dr. Peter Sudbury]

Dr. Sudbury, Larry Schmidt (Chief Technologist, HP) and Roar Engen (Head of Enterprise Architecture, Helse Sør-Øst RHF, Norway) participated in the Healthcare Transformation Panel, moderated by Steve Nunn.  The group discussed opportunities for improvement by applying EA in healthcare.  They mentioned that physicians, hospitals, drug manufacturers, nutritionists, etc. should all be working together and using Boundaryless Information Flow™ to ensure data is smoothly shared across all entities.  It was also stated that TOGAF® is beneficial for efficiencies.

Following the panel, Dr. Mario Tokoro (Founder & Executive Advisor of Sony Computer Science Laboratories, Inc., and DEOS Project Leader at the Japanese Science & Technology Agency) reviewed the Dependability through Assuredness™ standard, a standard of The Open Group.

The conference also offered many sessions in Finance/Commerce, Government and Tutorials/Workshops.

Margaret Ford of Consult Hyperion, UK, and Henk Jonkers of BIZZdesign, The Netherlands, discussed “From Enterprise Architecture to Cyber Security Risk Assessment”. The key takeaways were that complex cyber security risks require systematic, model-based risk assessment, and that attack navigators can provide this by linking ArchiMate® to the Risk Taxonomy.

“Applying Service-Oriented Architecture within a Business Technology Environment in the Finance Sector” was presented by Gerard Peters, Managing Consultant, Capgemini, The Netherlands. This case study is part of a white paper on Service-Oriented Architecture for Business Technology (SOA4BT).

You can view all of the plenary sessions and many of the track presentations at livestream.com. And for those who attended, full conference proceedings will be available.

The night culminated with a spectacular experience on the London Eye, the largest Ferris wheel in Europe, located on the River Thames.


Redefining traceability in Enterprise Architecture and implementing the concept with TOGAF 9.1 and/or ArchiMate 2.0

By Serge Thorn, Architecting the Enterprise

One of the responsibilities of an Enterprise Architect is to provide complete traceability from requirements analysis and design artefacts, through to implementation and deployment.

Over the years, I have found that the term traceability is not always understood in the same way by different Enterprise Architects.

Let’s start with a definition of traceability. Traceable is an adjective: capable of being traced. Finding a workable definition, even in a dictionary, is a challenge; the most relevant one I found, on Wikipedia, may be used as a reference: “The formal definition of traceability is the ability to chronologically interrelate uniquely identifiable entities in a way that is verifiable.”

In Enterprise Architecture, traceability may mean different things to different people.

Some people refer to

  • Enterprise traceability which proves alignment to business goals
  • End-to-end traceability to business requirements and processes
  • A traceability matrix, the mapping of systems back to capabilities or of system functions back to operational activities
  • Requirements traceability, which assists in delivering quality solutions that meet the business needs
  • Traceability between requirements and TOGAF artifacts
  • Traceability across artifacts
  • Traceability of services to business processes and architecture
  • Traceability from application to business function to data entity
  • Traceability between a technical component and a business goal
  • Traceability of security-related architecture decisions
  • Traceability of IT costs
  • Traceability to test scripts
  • Traceability between artifacts from business and IT strategy to solution development and delivery
  • Traceability from the initial design phase through to deployment
  • And probably more

The TOGAF 9.1 specification rarely refers to traceability; the only sections where the concept is used are in the various architecture domains, where we should document a requirements traceability report or traceability from application to business function to data entity.

The most relevant section is probably in the classes of architecture engagement, where it says:

“Using the traceability between IT and business inherent in enterprise architecture, it is possible to evaluate the IT portfolio against operational performance data and business needs (e.g., cost, functionality, availability, responsiveness) to determine areas where misalignment is occurring and change needs to take place.”

And how do we define and document traceability from an end-user or stakeholder perspective? The best approach would probably be to use a tool which would render a view like the one in this diagram:

[Diagram: traceability relationships between components across the four architecture domains]

In this diagram, we show the relationships between the components from the four architecture domains. Changing one of the components would allow us to perform an impact analysis.
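To make that idea concrete, here is a minimal sketch in plain Python (no EA tool assumed) of impact analysis over such a traceability graph. The components and links are invented for illustration; a real repository would hold typed relationships rather than plain strings.

```python
from collections import defaultdict

# Hypothetical traceability links: (source, target) means "source depends on /
# is realized by target", spanning the four architecture domains.
links = [
    ("Business Goal: Faster claims",   "Business Process: Handle claim"),
    ("Business Process: Handle claim", "Application: Claims system"),
    ("Application: Claims system",     "Data Entity: Claim"),
    ("Application: Claims system",     "Node: Application server"),
]

# Index the graph the other way round: who depends on a given component?
dependents = defaultdict(set)
for source, target in links:
    dependents[target].add(source)

def impact_of_change(component):
    """Return every component directly or transitively affected by a change."""
    affected, stack = set(), [component]
    while stack:
        for upstream in dependents[stack.pop()]:
            if upstream not in affected:
                affected.add(upstream)
                stack.append(upstream)
    return affected

# Changing the application server impacts the application, the business
# process and, ultimately, the business goal.
print(impact_of_change("Node: Application server"))
```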

Components may have different meanings as illustrated in the next diagram:

[Diagram: component meanings mapped to the TOGAF 9.1 metamodel]

Using the TOGAF 9.1 framework, we would use concepts from the Metamodel. The core metamodel entities show the purpose of each entity and the key relationships that support architectural traceability, as stipulated in Section 34.2.1, Core Content Metamodel Concepts.

So now, how do we build that traceability? This is going to happen over the various ADM cycles that an enterprise runs, and it is going to be quite a long process, depending on the complexity and size of the business and the various locations where it operates.

There may be five different ways to build that traceability:

  • Manually using an office product
  • With an enterprise architecture tool not linked to the TOGAF 9.1 framework
  • With an enterprise architecture tool using the TOGAF 9.1 artifacts
  • With an enterprise architecture tool using ArchiMate 2.0
  • Replicating the content of an Enterprise Repository such as a CMDB in an Architecture repository

1. Manually using an office product

You will probably document your architecture with the use of word processing, spreadsheet and diagramming tools, and store these documents in a file structure on a file server, ideally using some form of content management system.

Individually these tools are great, but collectively they fall short of forming a cohesive picture of the requirements and constraints of a system or an enterprise. The links between these deliverables soon become unmanageable, and in the long term, impact analysis of any change becomes practically impossible. Information will be hard to find and to trace from requirements all the way back to the business goals that drive them. This is particularly difficult when requirements are stored in spreadsheets while use cases and business goals are contained in separate documents. Other issues, such as maintenance and consistency, would also have to be considered.

[Diagram: documents and links maintained manually with office tools]

2. With an enterprise architecture tool not linked to the TOGAF 9.1 framework

Many enterprise architecture tools or suites provide different techniques to support traceability, but they do not really describe how things work and focus mainly on requirements traceability. In the following example, we use a traceability matrix between user requirements and functional specifications, use cases, components, software artifacts, test cases, business processes, design specifications and more.

Mapping the requirements to use cases and other information can be very labor-intensive.

[Diagram: traceability matrix between requirements and other elements]

Some tools also allow for the creation of relationships between the various layers using grids or allowing the user to create the relationships by dragging lines between elements.

Below is an example of what traceability might look like in an enterprise architecture tool after some time. Such an enterprise architecture ensures appropriate traceability from the business architecture to the other allied architectures.

[Diagram: traceability relationships in an enterprise architecture tool]

3. With an enterprise architecture tool using the TOGAF 9.1 artifacts

The TOGAF 9.1 core metamodel provides a minimum set of architectural content to support traceability across artifacts. Usually we use catalogs, matrices and diagrams to build traceability, independently of dragging lines between elements (except possibly for the diagrams). Using catalogs and matrices is an activity which may be assigned to various stakeholders in the organisation, and it can theoretically hide some of the complexity associated with an enterprise architecture tool.

[Diagram: TOGAF 9.1 artifacts supporting traceability]

Using artifacts creates traceability. As an example from the specification: “A Business Footprint diagram provides a clear traceability between a technical component and the business goal that it satisfies, while also demonstrating ownership of the services identified”. Other artifacts also describe traceability, such as the Data Migration Diagram and the Networked Computing/Hardware Diagram.

4. With an enterprise architecture tool using ArchiMate 2.0

Another possibility could be the use of the ArchiMate standard from The Open Group. Some of that traceability could also be achieved using BPMN and UML for specific domains, such as process details in Business Architecture, or to build the bridge between Enterprise Architecture and software architecture.

With ArchiMate 2.0 we can define end-to-end traceability and produce several viewpoints, such as the Layered Viewpoint, which shows several layers and aspects of an enterprise architecture in a single diagram. Elements are modelled in five different layers when displaying the enterprise architecture; these are then linked with each other using relationships. We differentiate between the following layers and extensions (a small data-structure sketch follows the list):

  • Business layer
  • Application layer
  • Technology layer
  • Motivation extension
  • Implementation and migration extension
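To make the layering concrete, here is a minimal sketch, again in plain Python with no modeling tool assumed, of how each element carries its layer and how relationships connect elements across layers. The element names and relationship kinds are illustrative choices, not taken verbatim from the ArchiMate specification.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Element:
    name: str
    layer: str  # business, application, technology, motivation or implementation

@dataclass(frozen=True)
class Relationship:
    kind: str  # e.g. "realization", "assignment"
    source: Element
    target: Element

order_service = Element("Order service", "business")
order_app = Element("Order application", "application")
app_server = Element("Application server", "technology")

model = [
    Relationship("realization", order_app, order_service),  # the app realizes the service
    Relationship("assignment", app_server, order_app),      # the server runs the app
]

# A layered viewpoint is then just a projection over the model:
for r in model:
    print(f"{r.source.layer} -> {r.target.layer}: {r.kind} "
          f"({r.source.name} -> {r.target.name})")
```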

The example from the specification below documents the various architecture layers.

[Diagram: ArchiMate 2.0 Layered Viewpoint example from the specification]
As you will notice, this ArchiMate 2.0 viewpoint looks quite similar to the TOGAF 9.1 Business Footprint Diagram which provides a clear traceability between a technical component and the business goal that it satisfies, while also demonstrating ownership of the services identified.

Another example could be the description of the traceability among business goals, technical capabilities, business benefits and metrics.  The key point about the motivation extension is to work with the requirement object.

Using the motivation viewpoint from the specification as a reference (motivation extension), you could define business benefits/expectations within the business goal object, then define sub-goals as KPIs to measure the benefits of the plan, and list all of the identified requirements of the project/program. Finally, you could link these requirements with either an application or infrastructure service object representing software or technical capabilities (partial example below).

[Figure: partial example linking goals, KPIs, requirements and services]
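
To show the chain described above in executable form, here is a minimal sketch. The goal, KPI, requirement and service names are invented examples, not content from the ArchiMate 2.0 specification.

```python
# Illustrative sketch of the goal -> KPI sub-goal -> requirement -> service
# chain. All identifiers and descriptions are invented.

goals = {"G1": "Reduce claim handling cost by 15%"}          # business benefit
kpis = {"K1": ("G1", "Average handling time below 2 days")}  # sub-goal as KPI
requirements = {"R1": ("K1", "Automate claim intake")}       # requirement
services = {"S1": ("R1", "Claim intake application service")}  # capability

def trace(service_id: str) -> list[str]:
    """Trace a service back to the business goal it ultimately supports."""
    req_id, service_name = services[service_id]
    kpi_id, req_name = requirements[req_id]
    goal_id, kpi_name = kpis[kpi_id]
    return [service_name, req_name, kpi_name, goals[goal_id]]

print(" -> ".join(trace("S1")))
```
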
One of the common questions I have recently received from various enterprise architects is “Now that I know TOGAF and ArchiMate… how should I model my enterprise? Should I use the TOGAF 9.1 artifacts to create that traceability? Should I use ArchiMate 2.0? Should I use both? Should I forget the artifacts…”. These are good questions and I’m afraid that there is not a single answer.

What I know is that if I select an enterprise architecture tool supporting both TOGAF 9.1 and ArchiMate 2.0, I would like to have full synchronization. If I model a few ArchiMate models, I would like my TOGAF 9.1 artifacts (catalogs and matrices) to be created at the same time; and if I create artifacts from the taxonomy, I would like my ArchiMate models to be created as well.

Unfortunately, I do not know the current level of tool maturity or whether tool vendors provide that synchronization. This would obviously require some investigation and should be one of the key criteria if you are currently looking for a product supporting both standards.
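
As a thought experiment, such synchronization could look something like the sketch below, which derives a simple TOGAF-style catalog and matrix from ArchiMate-style relationships; it is purely illustrative and does not describe any vendor's product.

```python
# Hypothetical sketch: deriving TOGAF-style artifacts (a catalog and a
# matrix) from ArchiMate-style model content, so that modeling once
# populates both views. No real tool is claimed to work this way.

archimate_relations = [
    ("Business", "Handle claim", "Application", "Claims service"),
    ("Application", "Claims service", "Technology", "App server"),
]

# Catalog: a flat list of elements per layer
catalog: dict[str, set[str]] = {}
for src_layer, src, dst_layer, dst in archimate_relations:
    catalog.setdefault(src_layer, set()).add(src)
    catalog.setdefault(dst_layer, set()).add(dst)

# Matrix: which source element relates to which target element
matrix = sorted((src, dst) for _, src, _, dst in archimate_relations)

print("Catalog:", {layer: sorted(items) for layer, items in catalog.items()})
print("Matrix:", matrix)
```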

5. Replicating the content of an Enterprise Repository such as a CMDB in an Architecture repository

This possibility requires that you have an up-to-date Configuration Management Database (CMDB) and that you have developed an interface between it and your Architecture Repository, i.e. your enterprise architecture tool. If you are able to replicate the relationships between the infrastructure components and applications (CIs) into your enterprise architecture tool, that would partially create your traceability.
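
A sketch of what such an interface might look like follows; the CMDB export format and the repository class are both invented for illustration, not drawn from any specific CMDB or EA product.

```python
# Hypothetical sketch of replicating CI relationships from a CMDB export
# into an architecture repository.

cmdb_export = [
    {"ci": "app-billing", "type": "Application",
     "depends_on": ["host-db01", "host-web01"]},
    {"ci": "host-db01", "type": "Server", "depends_on": []},
    {"ci": "host-web01", "type": "Server", "depends_on": []},
]

class ArchitectureRepository:
    """Stand-in for an EA tool's import interface."""
    def __init__(self) -> None:
        self.elements: dict[str, str] = {}
        self.relations: list[tuple[str, str]] = []

    def add_element(self, name: str, kind: str) -> None:
        self.elements[name] = kind

    def add_relation(self, source: str, target: str) -> None:
        self.relations.append((source, target))

repo = ArchitectureRepository()
for record in cmdb_export:
    repo.add_element(record["ci"], record["type"])
    for target in record["depends_on"]:
        repo.add_relation(record["ci"], target)

print(repo.elements)
print(repo.relations)  # partial traceability: applications -> infrastructure
```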

If I summarise the various choices for building that enterprise architecture traceability, I potentially have three main possibilities:

[Figure: summary of the main possibilities for building traceability]
Achieving traceability within an Enterprise Architecture is key because the architecture needs to be understood by all participants, not just by technical people.  It helps incorporate the enterprise architecture effort into the rest of the organization and takes it to the board room (or at least the CIO’s office), where it belongs.

  • Describe your traceability from your Enterprise Architecture to the system development and project documentation.
  • Review that traceability periodically, making sure that it is up to date, and produce analytics out of it.

If a development team is looking for a way to document and provide end-to-end traceability throughout the life cycle, enterprise architecture is the way to go; just make sure you use the right standard and platform. Finally, communicate and present the results of your effort to your stakeholders.

Serge Thorn is CIO of Architecting the Enterprise.  He has worked in the IT industry for over 25 years, in a variety of roles, which include: Development and Systems Design, Project Management, Business Analysis, IT Operations, IT Management, IT Strategy, Research and Innovation, IT Governance, Architecture and Service Management (ITIL). He is the Chairman of the itSMF (IT Service Management Forum) Swiss chapter and is based in Geneva, Switzerland.

Filed under ArchiMate®, Enterprise Architecture, Standards, TOGAF, TOGAF®, Uncategorized

Gaining Dependability Across All Business Activities Requires Standard of Standards to Tame Dynamic Complexity, Says The Open Group CEO

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

Hello, and welcome to a special BriefingsDirect Thought Leadership Interview series, coming to you in conjunction with The Open Group Conference on July 15, in Philadelphia.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these discussions on enterprise transformation in the finance, government, and healthcare sectors.

We’re here now with the President and CEO of The Open Group, Allen Brown, to explore the increasingly essential role of standards, in an undependable, unpredictable world. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

Welcome back, Allen.

Allen Brown: It’s good to be here, Dana.

Gardner: What are the environmental variables that many companies are facing now as they try to improve their businesses and assess the level of risk and difficulty? It seems like so many moving targets.

Brown: Absolutely. There are a lot of moving targets. We’re looking at a situation where organizations are having to put in increasingly complex systems. They’re expected to make them highly available, highly safe, highly secure, and to do so faster and cheaper. That’s kind of tough.

Gardner: One of the ways that organizations have been working towards a solution is to have a standardized approach, perhaps some methodologies, because if all the different elements of their business approach this in a different way, we don’t get too far too quickly, and it can actually be more expensive.

Perhaps you could paint for us the vision of an organization like The Open Group in terms of helping organizations standardize and be a little bit more thoughtful and proactive towards these changed elements?

Brown: With the vision of The Open Group, the headline is “Boundaryless Information Flow.” That was established back in 2002, at a time when organizations were breaking down the stovepipes or the silos within and between organizations and getting people to work together across functions. They found, having done that, or having made some progress towards that, that the applications and systems were built for those silos. So how can we provide integrated information for all those people?

As we have moved forward, those boundaryless systems have become bigger and much more complex. Now, boundarylessness and complexity are giving everyone different types of challenges. Many of the forums or consortia that make up The Open Group are all tackling it from their own perspective, and it’s all coming together very well.

We have got something like the Future Airborne Capability Environment (FACE) Consortium, which is a managed consortium of The Open Group focused on federal aviation. In the federal aviation world they’re dealing with issues like weapons systems.

New weapons

Over time, building similar weapons is going to be more expensive; inflation happens. But the changing nature of warfare is such that you’ve then got a situation where you’ve got to produce new weapons. You have to produce them quickly and you have to produce them inexpensively.

So how can we have standards that make for more plug and play? How can the avionics within the cockpit of any airborne vehicle be more interchangeable, so that they can be adapted more quickly and do things faster and at lower cost?

After all, cost is a major pressure on government departments right now.

We’ve also got the challenges of the supply chain. Because of the pressure on costs, it’s critical that large, complex systems are developed using a global supply chain; it’s impossible to do it all domestically at an acceptable cost. Given that, countries around the world, including the US and China, are all concerned about what they’re putting into their complex systems that may have tainted or malicious code or counterfeit products.

The Open Group Trusted Technology Forum (OTTF) provides a standard that ensures that, at each stage along the supply chain, we know that what’s going into the products is clean, the process is clean, and what goes to the next link in the chain is clean. And we’re working on an accreditation program all along the way.

We’re also in a world in which, when we mention security, everyone is concerned about being attacked, whether it’s cybersecurity or other areas of security, and we’ve got to concern ourselves with all of those as we go along the way.

Our Security Forum is looking at how we build those things out. The big thing about large, complex systems is that they’re large and complex. If something goes wrong, how can you fix it in a prescribed time scale? How can you establish what went wrong quickly and how can you address it quickly?

If you’ve got large, complex systems that fail, it can mean loss of human life, as it did with the BP oil disaster at Deepwater Horizon or with the Space Shuttle Challenger. Or it could be financial. In many organizations, when something goes wrong, you end up giving away service.

An example that we might use is at a railway station where, if the barriers don’t work, the only solution may be to open them up and give free access. That could be expensive. And you can use that analogy for many other industries, but how can we avoid that human or financial cost in any of those things?

A couple of years after the Space Shuttle Challenger disaster, a number of criteria were laid down for making sure you had dependable systems, could assess risk, and knew you could mitigate it.

What The Open Group members are doing is looking at how you can get dependability and assuredness through different systems. Our Security Forum has done a couple of standards that have got a real bearing on this. One is called Dependency Modeling, and you can model out all of the dependencies that you have in any system.

Simple analogy

A very simple analogy is that if you are going on a road trip in a car, you’ve got to have a competent driver, have enough gas in the tank, know where you’re going, have a map, all of those things.

What can go wrong? You can assess the risks. You may run out of gas or you may not know where you’re going, but you can mitigate those risks, and you can also assign accountability. If the gas gauge is going down, it’s the driver’s accountability to check the gauge and make sure that more gas is put in.
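
As a rough illustration of the road-trip analogy, the sketch below models dependencies with risks, mitigations and accountability. It is illustrative only and does not reproduce the notation or API of The Open Group’s Dependency Modeling standard.

```python
# A minimal dependency-modeling sketch in the spirit of the road-trip
# analogy above. All names are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Dependency:
    name: str            # what the outcome depends on
    risk: str            # what can go wrong
    mitigation: str      # how the risk is reduced
    accountable: str     # who is accountable
    children: list = field(default_factory=list)

def walk(dep: Dependency, depth: int = 0) -> None:
    """Print the dependency tree with risks and accountability."""
    print("  " * depth + f"{dep.name}: risk={dep.risk}; "
          f"mitigation={dep.mitigation}; accountable={dep.accountable}")
    for child in dep.children:
        walk(child, depth + 1)

road_trip = Dependency(
    "successful road trip", "trip fails", "plan and monitor", "driver",
    children=[
        Dependency("enough gas", "running out of gas",
                   "check the gauge, refuel early", "driver"),
        Dependency("known route", "getting lost",
                   "carry a map", "navigator"),
        Dependency("competent driver", "driver error",
                   "licensed, rested driver", "trip organizer"),
    ],
)
walk(road_trip)
```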

We’re trying to get that same sort of thinking through to these large, complex systems. What you’re looking at doing, as you develop or evolve large, complex systems, is to build in this accountability, build in understanding of the dependencies and of the assurance cases that you need, and have ways of identifying anomalies early to prevent anything from failing. If something does fail, you want to minimize the stoppage and, at the same time, minimize the cost and the impact and, more importantly, make sure that that failure never happens again in that system.

The Security Forum has done the Dependency Modeling standard. They have also provided us with the Risk Taxonomy. That’s a separate standard that helps us analyze risk and go through all of the different areas of risk.

Now, the Real-time & Embedded Systems Forum has produced Dependability through Assuredness, a standard of The Open Group that brings all of these things together. We’ve had a wonderful international endeavor on this, bringing a lot of work from Japan and working with folks in the US and other parts of the world. It’s been a unique activity.

Dependability through Assuredness depends upon having two interlocked cycles. The first is a Change Management Cycle that says that, as you look at requirements, you build out the dependencies, you build out the assurance cases for those dependencies, and you update the architecture. Everything has to start with architecture now.

You build in accountability, and accountability, importantly, has to be accepted. You can’t just dictate that someone is accountable. You have to have a negotiation. Then, through ordinary operation, you assess whether there are anomalies that can be detected and fix those anomalies by new requirements that lead to new dependabilities, new assurance cases, new architecture and so on.

The other cycle that’s critical in this, though, is the Failure Response Cycle. If there is a perceived failure or an actual failure, there is understanding of the cause, prevention of it ever happening again, and repair. That goes through the Change Accommodation Cycle as well, to make sure that we update the requirements, the assurance cases, the dependability, the architecture, and the accountability.

So the plan is that with a dependable system through that assuredness, we can manage these large, complex systems much more easily.
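
Reduced to ordered phase lists, the two interlocked cycles described in this interview might be sketched as follows. The phase names paraphrase the conversation above; consult the standard itself for the normative definitions.

```python
# Rough sketch of the two interlocked cycles described above, reduced to
# ordered phase lists. Phase names paraphrase this interview, not the
# normative text of the Dependability through Assuredness standard.

change_accommodation = [
    "capture requirements",
    "build out dependencies",
    "build assurance cases",
    "update the architecture",
    "negotiate and accept accountability",
    "operate and watch for anomalies",
]

failure_response = [
    "detect a perceived or actual failure",
    "understand the cause",
    "prevent recurrence",
    "repair",
]

def run_iteration(failure_detected: bool) -> list[str]:
    """One pass through the cycles: a failure feeds back into change."""
    phases = list(change_accommodation)
    if failure_detected:
        phases += failure_response + ["feed changes back into the change cycle"]
    return phases

for phase in run_iteration(failure_detected=True):
    print(phase)
```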

Gardner: Allen, many of The Open Group activities have been focused at the enterprise architect or business architect levels. Also, with these risk and security issues, you’re focusing on chief information security officers or governance, risk, and compliance (GRC) officials or administrators. It sounds as if the Dependability through Assuredness standard shoots a little higher. Is this something a board-level mentality or leadership should be thinking about, and is this something that reports to them?

Board-level issue

Brown: In an organization, risk is a board-level issue, security has become a board-level issue, and so has organization design and architecture. They’re all up at that level. It’s a matter of the fiscal responsibility of the board to make sure that the organization is sustainable, and to make sure that they’ve taken the right actions to protect their organization in the future, in the event of an attack or a failure in their activities.

The risks to an organization are financial and reputational, and those risks can be very real. So, yes, they should be up there. Interestingly, when we’re looking at areas like business architecture, sometimes that might be part of the IT function, but very often now we’re seeing it reporting through the business lines. Even in governments around the world, the business architects are very often reporting up to business heads.

Gardner: Here in Philadelphia, you’re focused on some industry verticals, finance, government, health. We had a very interesting presentation this morning by Dr. David Nash, who is the Dean of the Jefferson School of Population Health, and he had some very interesting insights about what’s going on in the United States vis-à-vis public policy and healthcare.

One of the things that jumped out at me was, at the end of his presentation, he was saying how important it was to have behavior modification as an element of not only individuals taking better care of themselves, but also how hospitals, providers, and even payers relate across those boundaries of their organization.

That brings me back to this notion that these standards are very powerful and useful, but without getting people to change, they don’t have the impact that they should. So is there an element that you’ve learned and that perhaps we can borrow from Dr. Nash in terms of applying methods that actually provoke change, rather than react to change?

Brown: Yes, change is a challenge for many people. Getting people to change is like taking a horse to water, but will it drink? We’ve got to find methods of doing that.

One of the things about The Open Group standards is that they’re pragmatic and practical standards. We’ve seen, in many of our standards, that where they apply to a product or service, there is a procurement pull-through. In the FACE Consortium, for example, a $30 billion procurement means that this is real and true.

In the case of healthcare, Dr. Nash was talking about the need for boundaryless information sharing across the organizations. This is a major change and it’s a change to the culture of the organizations that are involved. It’s also a change to the consumer, the patient, and the patient advocates.

All of those will change over time. Some of that will be social change, where the change is expected and becomes a social norm. Some of it will come as people and generations develop. The younger generations are more comfortable with the authority they perceive in healthcare professionals, and also with modifying the behavior of those professionals.

The great thing about the healthcare service very often is that we have professionals who want to do a number of things. They want to improve the lives of their patients, and they also want to be able to do more with less.

Already a need

There’s already a need. If you want to make any change, you have to create a need, but in healthcare there is already a pent-up need for change that people can see. We can provide them with the tools and the standards that enable them to do that, and standards are critically important, because you are using the same language across everyone.

It’s much easier for people to apply the same standards if they are using the same language, and you get a multiplier effect on the rate of change that you can achieve by using those standards. But I believe that there is this pent-up demand. The need for change is there. If we can provide them with the appropriate usable standards, they will benefit more rapidly.

Gardner: Of course, measuring the progress with the standards approach helps as well. We can determine where we are along the path as either improvements are happening or not happening. It gives you a common way of measuring.

The other thing that was fascinating to me with Dr. Nash’s discussion was that he was almost imploring the IT people in the crowd to come to the rescue. He’s looking for a cavalry, and he really seemed to feel that IT, the data, the applications, the sharing, the collaboration, and what can happen across various networks all need to be brought into this.

How do we bring these worlds together? There is the policy world, where healthcare and population statisticians are doing great academic work, and then there is the whole IT world. Is this something that The Open Group can do — bridge these large, seemingly unrelated worlds?

Brown: At the moment, we have the capability of providing the tools for them to do that and the processes for them to do that. Healthcare is a very complex world with the administrators and the healthcare professionals. You have different grades of those in different places. Each department and each organization has its different culture, and bringing them together is a significant challenge.

In some of those processes, certainly, you start with understanding what it is you’re trying to address. You start with what the pain points are, what the challenges are, what the blockages are, and how we can overcome those blockages. It’s a way of bringing people together in workshops. TOGAF, a standard of The Open Group, has the business scenario method, bringing people together, building business scenarios, and understanding what people’s pain points are.

As long as we can then follow through with the solutions and not disappoint people, there is the opportunity for doing that. The reality is that you have to do it in small areas at a time. We’re not going to take the entire population of the United States, get everyone into a workshop, and all work together.

But you can start in pockets and then generate evangelists, proof points, and successful case studies. The work will then start emanating out to all other areas.

Gardner: It seems too that, with a heightened focus on vertical industries, there are lessons that could be learned in one vertical industry and perhaps applied to another. That also came out in some of the discussions around big data here at the conference.

The financial industry recognized the crucial role that data plays, made investments, and brought the constituencies of domain expertise in finance with the IT domain expertise in data and analysis, and came up with some very impressive results.

Do you see that what has been the case in something like finance is now making its way to healthcare? Is this an enterprise or business architect role that opens up more opportunity for those individuals as business and/or enterprise architects in healthcare? Why don’t we see more enterprise architects in healthcare?

Good folks

Brown: I don’t know. We haven’t run the numbers to see how many there are. There are some very competent enterprise architects within the healthcare industry around the world. We’ve got some good folks there.

The focus of The Open Group for the last couple of decades or so has always been on horizontal standards, standards that are applicable to any industry. Our focus is always about pragmatic standards that can be implemented and touched and felt by end-user consumer organizations.

Now, we’re seeing how we can make those even more pragmatic and relevant by addressing the verticals, but we’re not going to lose the horizontal focus. We’ll be looking at what lessons can be learned and what we can build on. Big data is a great example of the fact that the same kind of approach, gathering data from different sources, combining it, and being able to analyze it, can be applied anywhere.

The challenge with that, of course, is being able to capture it, store it, analyze it, and make some sense of it. You need the resources, the storage, and the capability of actually doing that. It’s not just a case of, “I’ll go and get some big data today.”

I do believe that there are lessons learned that we can move from one industry to another. I also believe that, since some geographic areas and some countries are ahead of others, there’s also a cascading of knowledge and capability around the world in a given time scale as well.

Gardner: Well great. I’m afraid we’ll have to leave it there. We’ve been talking about the increasingly essential role of standards in a complex world, where risk and dependability become even more essential. We have seen how The Open Group is evolving to meet these challenges through many of its activities and through many of the discussions here at the conference.

Please join me now in thanking our guest, Allen Brown, President and CEO of The Open Group. Thank you.

Brown: Thanks for taking the time to talk to us, Dana.

Filed under ArchiMate®, Business Architecture, Cloud, Conference, Enterprise Architecture, Healthcare, Open Platform 3.0, Professional Development, Service Oriented Architecture, TOGAF, TOGAF®

The Open Group Philadelphia – Day Three Highlights

By Loren K. Baynes, Director, Global Marketing Communications at The Open Group.

We are winding down Day 3 and gearing up for the next two days of training and workshops.  Today’s subject areas included TOGAF®, ArchiMate®, Risk Management, Innovation Management, Open Platform 3.0™ and Future Trends.

The objective of the Future Trends session was to discuss “emerging business and technical trends that will shape enterprise IT”, according to Dave Lounsbury, Chief Technical Officer of The Open Group.

This track also featured a presentation by Dr. William Lafontaine, VP High Performance Computing, Analytics & Cognitive Markets, IBM Research, who gave an overview of the “Global Technology Outlook 2013”.  He stated the Mega Trends are:  Growing Scale/Lower Barrier of Entry; Increasing Complexity/Yet More Consumable; Fast Pace; Contextual Overload.  Mike Walker, Strategies & Enterprise Architecture Advisor for HP, noted the key disrupters that will affect our future are the business of IT, technology itself, expectation of consumers and globalization.

The session concluded with an in-depth Q&A with Bill, Dave, Mike and Allen Brown, CEO of The Open Group.

Other sessions included presentations by TJ Virdi (Senior Enterprise Architect, Boeing) on Innovation Management, Jack Jones (President, CXOWARE, Inc.) on Risk Management and Stephen Bennett (Executive Principal, Oracle) on Big Data.

A special thanks goes to our many sponsors during this dynamic conference: Windstream, Architecting the Enterprise, Metaplexity, BIZZdesign, Corso, Avolution, CXOWARE, Penn State – Online Program in Enterprise Architecture, and Association of Enterprise Architects.

Stay tuned for post-conference proceedings to be posted soon!  See you at our conference in London, October 21-24.

Filed under ArchiMate®, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Open Platform 3.0, RISK Management, Security Architecture, Standards, TOGAF®

The Open Group Philadelphia – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications at The Open Group.

Day 2 at The Open Group conference in the City of Brotherly Love, as Philadelphia is also known, was another busy and remarkable day.

The plenary started with a fascinating presentation, “Managing the Health of the Nation” by David Nash, MD, MBA, Dean of Jefferson School of Population Health.  Healthcare is the number one industry in the city of Philadelphia, with the highest number of patients in beds in the top 10 US cities. The key theme of his thought-provoking speech was “boundaryless information sharing” (sound familiar?), which will enable a healthcare system that is “safe, effective, patient-centered, timely, equitable, efficient”.

Following Dr. Nash’s presentation was the Healthcare Transformation Panel, moderated by Allen Brown, CEO of The Open Group.  Participants were Gina Uppal (Fulbright-Killam Fellow, American University Program), Mike Lambert (Open Group Fellow, Architecting the Enterprise), Rosemary Kennedy (Associate Professor, Thomas Jefferson University), Blaine Warkentine, MD, MPH, and Fran Charney (Pennsylvania Patient Safety Authority). The group brought different sets of experiences within the healthcare system and provided reactions to Dr. Nash’s speech.  All agreed on the need for fundamental change and that technology will be key.

The conference featured a spotlight on The Open Group’s newest forum, Open Platform 3.0™, presented by Dr. Chris Harding, Director of Interoperability.  Open Platform 3.0 was formed to advance The Open Group vision of Boundaryless Information Flow™ and to help enterprises in the use of Cloud, Social, Mobile Computing and Big Data.  For more info: http://www.opengroup.org/getinvolved/forums/platform3.0

The Open Group flourishes because of people interaction and collaboration.  The accolades continued with several members being recognized for their outstanding contributions to The Open Group Trusted Technology Forum (OTTF) and the Service-Oriented Architecture (SOA) and Cloud Computing Work Groups.  To learn more about our Forums and Work Groups and how to get involved, please visit http://www.opengroup.org/getinvolved

Presentations and workshops were also held in the Healthcare, Finance and Government vertical industries. Presenters included Larry Schmidt (Chief Technologist, HP), Rajamanicka Ponmudi (IT Architect, IBM) and Robert Weisman (CEO, Build the Vision, Inc.).

Filed under ArchiMate®, Business Architecture, Cloud/SOA, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, O-TTF, Open Platform 3.0, Security Architecture, Standards, TOGAF®

The Open Group Philadelphia – Day One Highlights

By Loren K.  Baynes, Director, Global Marketing Communications at The Open Group.

On Monday, July 15th, we kicked off our conference in Philadelphia. As Allen Brown, CEO of The Open Group, commented in his opening remarks, Philadelphia is the birthplace of American democracy.  This is the first time The Open Group has hosted a conference in this historical city.

Today’s plenary sessions featured keynote speakers covering topics including the announcement of a new Open Group standard, the appointment of a new Fellow, Enterprise Architecture and Transformation, Big Data, spotlights on The Open Group Real-time & Embedded Systems and Open Trusted Technology Forums, and a new initiative on Healthcare.

Allen Brown noted that The Open Group has 432 member organizations with headquarters in 32 countries and over 40,000 individual members in 126 countries.

The Open Group Vision is Boundaryless Information Flow™ achieved through global interoperability in a secure, reliable and timely manner.  But as Allen stated, “Boundaryless does not mean there are no boundaries.  It means that boundaries are permeable to enable business.”

Allen also presented an overview of the new “Dependability Through Assuredness™” standard.  The Open Group Real-time Embedded Systems Forum is the home of this standard. More news to come!

Allen introduced Dr. Mario Tokoro (CEO of Sony Computer Science Laboratories), who began this project in 2006. Dr. Tokoro stated, “Thank you from the bottom of my heart for understanding the need for this standard.”

Eric Sweden, MSIH MBA, Program Director, Enterprise Architecture & Governance, National Association of State CIOs (NASCIO), offered a presentation entitled “State of the States – NASCIO on Enterprise Architecture: An Emphasis on Cross-Jurisdictional Collaboration across States”.  Eric noted, “Enterprise Architecture is a blueprint for better government.” Furthermore, “Cybersecurity is a top priority for government”.

Dr. Michael Cavaretta, Technical Lead and Data Scientist with Ford Motor Company, discussed “The Impact of Big Data on the Enterprise”.  The five keys, according to Dr. Cavaretta, are “perform, analyze, assess, track and monitor”.  Please see the following transcript from a Big Data analytics podcast, hosted by The Open Group, in which Dr. Cavaretta participated earlier this year: http://blog.opengroup.org/2013/01/28/the-open-group-conference-plenary-speaker-sees-big-data-analytics-as-a-way-to-bolster-quality-manufacturing-and-business-processes/

The final presentation during Monday morning’s plenary was “Enabling Transformation Through Architecture” by Lori Summers (Director of Technology) and Amit Mayabhate (Business Architect Manager) with Fannie Mae Multifamily.

Lori stated that their organization had adopted Business Architecture and today they have an integrated team who will complete the transformation, realize value delivery and achieve their goals.

Amit noted “Traceability from the business to architecture principles was key to our design.”

In addition to the many interesting and engaging presentations, several awards were presented.  Joe Bergmann, Director, Real-time and Embedded Systems Forum, The Open Group, was appointed Fellow by Allen Brown in recognition of Joe’s major achievements over the past 20+ years with The Open Group.

Other special recognition recipients include members from Oracle, IBM, HP and Red Hat.

In addition to the plenary session, we hosted meetings on Finance, Government and Healthcare industry verticals. Today is only Day One of The Open Group conference in Philadelphia. Please stay tuned for more exciting conference highlights over the next couple days.

Filed under ArchiMate®, Business Architecture, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, O-TTF, Security Architecture, Standards, TOGAF®

The Open Group Conference to Emphasize Healthcare as Key Sector for Ecosystem-Wide Interactions

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview series, coming to you in conjunction with The Open Group Conference on July 15, in Philadelphia. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these discussions on enterprise transformation in the finance, government, and healthcare sectors.

We’re here now with a panel of experts to explore how new IT trends are empowering improvements, specifically in the area of healthcare. We’ll learn how healthcare industry organizations are seeking large-scale transformation and what are some of the paths they’re taking to realize that.

We’ll see how improved cross-organizational collaboration and such trends as big data and cloud computing are helping to make healthcare more responsive and efficient.

With that, please join me in welcoming our panel, Jason Uppal, Chief Architect and Acting CEO at clinicalMessage. Welcome, Jason.

Jason Uppal: Thank you, Dana.

Gardner: And we’re also joined by Larry Schmidt, Chief Technologist at HP for the Health and Life Sciences Industries. Welcome, Larry.

Larry Schmidt: Thank you.

Gardner: And also, Jim Hietala, Vice President of Security at The Open Group. Welcome back, Jim. [Disclosure: The Open Group and HP are sponsors of BriefingsDirect podcasts.]

Jim Hietala: Thanks, Dana. Good to be with you.

Gardner: Let’s take a look at this very interesting and dynamic healthcare sector, Jim. What, in particular, is so special about healthcare and why do things like enterprise architecture and allowing for better interoperability and communication across organizational boundaries seem to be so relevant here?

Hietala: There’s general acknowledgement in the industry that, inside of healthcare and inside the healthcare ecosystem, information either doesn’t flow well or it only flows at a great cost in terms of custom integration projects and things like that.

Fertile ground

From The Open Group’s perspective, it seems that the healthcare industry and the ecosystem really is fertile ground for bringing to bear some of the enterprise architecture concepts that we work with at The Open Group in order to improve, not only how information flows, but ultimately, how patient care occurs.

Gardner: Larry Schmidt, similar question to you. What are some of the unique challenges that are facing the healthcare community as they try to improve on responsiveness, efficiency, and greater capabilities?

Schmidt: There are several things that have not really kept up with what technology is able to do today.

For example, the whole concept of personal observation comes into play in what we would call “value chains” that exist right now between a patient and a doctor. We look at things like mobile technologies and want to be able to leverage that to provide additional observation of an individual, so that the doctor can make a more complete diagnosis of some sickness or possibly some medication that a person is on.

We want to be able to see that observation in real life, as opposed to having to take that in at the office, which typically winds up happening. I don’t know about everybody else, but every time I go see my doctor, oftentimes I get what’s called white coat syndrome. My blood pressure will go up. But that’s not giving the doctor an accurate reading from the standpoint of providing great observations.

Technology has advanced to the point where we can do that in real time using mobile and other technologies, yet the communication flow, that information flow, doesn’t exist today, or is at best, not easily communicated between doctor and patient.

If you look at the ecosystem, as Jim offered, there are plenty of places that additional collaboration and communication can improve the whole healthcare delivery model.

That’s what we’re about. We want to be able to find the places where the technology has advanced, where standards don’t exist today, and just fuel the idea of building common communication methods between those stakeholders and entities, allowing us to then further the flow of good information across the healthcare delivery model.

Gardner: Jason Uppal, let’s think about what, in addition to technology, architecture, and methodologies can bring to bear here? Is there also a lag in terms of process thinking in healthcare, as well as perhaps technology adoption?

Uppal: I’m going to refer to a presentation that I watched from a very well-known surgeon from Harvard, Dr. Atul Gawande. His point was that, in the last 50 years, the medical industry has made great strides in identifying diseases, drugs, procedures, and therapies, but one thing he was alluding to was that medicine forgot the cost: everything has a cost.

At what price?

Today, in his view, we can cure a lot of diseases and a lot of issues, but at what price? Can anybody actually afford it?

His view is that if healthcare is going to change and improve, it has to be outside of the medical industry. The tools that we have are better today, like collaborative tools that are available for us to use, and those are the ones that he was recommending that we need to explore further.

That is where enterprise architecture is a powerful methodology to use and say, “Let’s take a look at it from a holistic point of view of all the stakeholders. See what their information needs are. Get that information to them in real time and let them make the right decisions.”

Therefore, there is no reason for health information to be stuck in organizations. It could go wherever the patient and providers are, and let them make the best decisions, based on the best practices that are available to them, as opposed to having siloed information.

So enterprise-architecture methods are most suited for developing a very collaborative environment. Dr. Gawande was pointing out that, if healthcare is going to improve, it has to think about it not as medicine, but as healthcare delivery.

Gardner: And it seems that not only are there challenges in terms of technology adoption and even operating more like an efficient business in some ways. We also have very different climates from country to country, jurisdiction to jurisdiction. There are regulations, compliance, and so forth.

Going back to you, Larry, how important of an issue is that? How complex does it get because we have such different approaches to healthcare and insurance from country to country?

Schmidt: There are definitely complexities that occur based on the different insurance models and how healthcare is delivered across and between countries, but some of the basic and fundamental activities in the past that happened as a result of delivering healthcare are consistent across countries.

As Jason has offered, enterprise architecture can provide us the means to explore what the art of the possible might be today. It could allow us the opportunity to see how innovation can occur if we enable better communication flow among the stakeholders that exist within any healthcare delivery model, in order to improve the overall health of the population.

After all, that’s what this is all about. We want to be able to enable a collaborative model throughout the stakeholders to improve the overall health of the population. I think that’s pretty consistent across any country that we might work in.

Ongoing work

Gardner: Jim Hietala, maybe you could help us better understand what’s going on within The Open Group and, even more specifically, at the conference in Philadelphia. There is the Population Health Working Group, and there is work towards a vision of enabling the boundaryless information flow between the stakeholders. Any other information and detail you could offer would be great. [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

Hietala: On Tuesday of the conference, we have a healthcare focus day. The keynote that morning will be given by Dr. David Nash, Dean of the Jefferson School of Population Health. He’ll give what’s sure to be a pretty interesting presentation, followed by a reactors’ panel, where we’ve invited folks from different stakeholder constituencies.

We are going to have clinicians there. We’re going to have some IT folks and some actual patients to give their reaction to Dr. Nash’s presentation. We think that will be an interesting and entertaining panel discussion.

For the balance of the day, in terms of the healthcare content, we have a workshop. Larry Schmidt is giving one of the presentations there, and Jason, myself, and some other folks from our working group are involved in helping to facilitate and carry out the workshop.

The goal of it is to look into healthcare challenges, desired outcomes, the extended healthcare enterprise, and the extended healthcare IT enterprise, and really gather the pain points that are out there around things like interoperability, surface those, and develop a work program coming out of this.

So we expect it to be an interesting day. If you are in the healthcare IT field, or just the healthcare field generally, it would definitely be a day well spent to check it out.

Gardner: Larry, you’re going to be talking on Tuesday. Without giving too much away, maybe you can help us understand the emphasis that you’re taking, the area that you’re going to be exploring.

Schmidt: I’ve titled the presentation “Remixing Healthcare through Enterprise Architecture.” Jason offered some thoughts as to why we want to bring the discipline of enterprise architecture to healthcare. My thoughts are that we want to make sure we understand how the collaborative model would work in healthcare, taking into consideration all the constituents and stakeholders that exist within the complete ecosystem of healthcare.

This is not just collaboration across the doctors, patients, and maybe the payers in a healthcare delivery model. This could be out as far as the drug companies and being able to get drug companies to a point where they can reorder their raw materials to produce new drugs in the case of an epidemic that might be occurring.

Real-time model

It would be a real-time model that allows us the opportunity to understand what’s truly happening, both to an individual from a healthcare standpoint, as well as to a country or a region within a country and so on from healthcare. This remixing of enterprise architecture is the introduction to that concept of leveraging enterprise architecture into this collaborative model.

Then, I would like to talk about some of the technologies that I’ve had the opportunity to explore around what is available today in technology. I believe we need to have some type of standardized messaging or collaboration models to allow us to further facilitate the ability of that technology to provide the value of healthcare delivery or betterment of healthcare to individuals. I’ll talk about that a little bit within my presentation and give some good examples.

It’s really interesting. I just traveled from my company’s home base back to my home base and I thought about something like a body scanner that you get into in the airport. I know we’re in the process of eliminating some of those scanners now within the security model from the airports, but could that possibly be something that becomes an element within healthcare delivery? Every time your body is scanned, there’s a possibility you can gather information about that, and allow that to become a part of your electronic medical record.

Hopefully, that was forward thinking, but that kind of thinking is going to play into the art of the possible, with what we are going to be doing, both in this presentation and talking about that as part of the workshop.

Gardner: Larry, we’ve been having some other discussions with The Open Group around what they call Open Platform 3.0™, which is the confluence of big data, mobile, cloud computing, and social.

One of the big issues today is this avalanche of data, the Internet of things, but also the Internet of people. It seems that the more work that’s done to bring Open Platform 3.0 benefits to bear on business decisions, it could very well be impactful for sensors and other data that come from patients, regardless of where they are, to reach a medical establishment, regardless of where it is.

So do you think we’re really on the cusp of a significant shift in how medicine is actually conducted?

Schmidt: I absolutely believe that. There is a lot of information available today that could be used in helping our population to be healthier. And it really isn’t only the challenge of the communication model that we’ve been speaking about so far. It’s also understanding the information that’s available to us to take that and make that into knowledge to be applied in order to help improve the health of the population.

As we explore this from an as-is model in enterprise architecture to something that we believe we can first enable through a great collaboration model, through standardized messaging and things like that, I believe we’re going to get into even deeper detail around how information can truly provide empowered decisions to physicians and individuals around their healthcare.

So it will carry forward into the big data and analytics challenges that we have talked about and currently are talking about with The Open Group.

Healthcare framework

Gardner: Jason Uppal, we’ve also seen how in other business sectors, industries have faced transformation and have needed to rely on something like enterprise architecture and a framework like TOGAF® in order to manage that process and make it something that’s standardized, understood, and repeatable.

It seems to me that healthcare can certainly use that, given the pace of change, but that the impact on healthcare could be quite a bit larger in terms of actual dollars. This is such a large part of the economy that even small incremental improvements can have dramatic effects when it comes to dollars and cents.

So is there a benefit to bringing enterprise architect to healthcare that is larger and greater than other sectors because of these economics and issues of scale?

Uppal: That’s a great way to think about this thing. In other industries, the benefits of applying enterprise architecture to banking and insurance may be easily measured in terms of dollars and cents, but healthcare is a fundamentally different economy and industry.

It’s not about dollars and cents. It’s about people’s lives, and loved ones who are sick, who could very easily be treated, if they’re caught in time and the right people are around the table at the right time. So this is more about human cost than dollars and cents. Dollars and cents are critical, but human cost is the larger play here.

Secondly, when we think about applying enterprise architecture to healthcare, we’re not talking about just the U.S. population. We’re talking about global population here. So whatever systems and methods are developed, they have to work for everybody in the world. If the U.S. economy can afford an expensive healthcare delivery, what about the countries that don’t have the same kind of resources? Whatever methods and delivery mechanisms you develop have to work for everybody globally.

That’s one of the things that a methodology like TOGAF brings out: look at it from every stakeholder’s point of view, because unless you have dealt with every stakeholder’s concerns, you don’t have an architecture; you have a system that’s designed for a specific audience.

The cost is not just the 18 percent of the U.S. gross domestic product that healthcare represents. It’s the human cost, which is many times that. That’s one of the areas where we could really start to think about how we affect that part of the economy, not the 18 percent of it, but the larger part of the economy, to improve the health of the population, not only in North America, but globally.

If that’s the case, then the real impact on the greater world economy will come from improving population health, and population health is probably becoming the biggest problem in our economy.

We’ll be testing these methods at a greater international level, as opposed to just at an organization and industry level. This is a much larger challenge. A methodology like TOGAF is proven, and it can be stressed and tested to that level. This is a great opportunity for us to apply our tools and science to a problem that is larger than just dollars. It’s about humans.

All “experts”

Gardner: Jim Hietala, in some ways, we’re all experts on healthcare. When we’re sick, we go for help and interact with a variety of different services to maintain our health and to improve our lifestyle. But in being experts, I guess that also means we are witnesses to some of the downside of an unconnected ecosystem of healthcare providers and payers.

One of the things I’ve noticed in that vein is that I have to deal with different organizations that don’t seem to communicate well. If there’s no central process organizer, it’s really up to me as the patient to pull the lines together between the different services — tests, clinical observations, diagnosis, back for results from tests, sharing the information, and so forth.

Have you done any studies, or do you have anecdotal information, about how that boundaryless information flow would still be relevant, even with more of a centralized repository that all the players could draw on, a collaborative team resource of some sort? I know that’s worked in other industries. Is this not a perfect opportunity for that boundarylessness to be managed?

Hietala: I would say it is. We all have experiences with going to see a primary physician, maybe getting sent to a specialist, getting some tests done, and the boundaryless information that’s flowing tends to be on paper delivered by us as patients in all the cases.

So the opportunity to improve that situation is pretty obvious to anybody who’s been in the healthcare system as a patient. I think it’s a great place to be doing work. There’s a lot of money flowing to try and address this problem, at least here in the U.S. with the HITECH Act and some of the government spending around trying to improve healthcare.

You’ve got healthcare information exchanges that are starting to develop, and you’ve got lots of pain points for organizations in terms of trying to share information without standards that enable them to do it. It seems like an area with a great opportunity to bring lots of improvement.

Gardner: Let’s look for some examples of where this has been attempted and what the success brings about. I’ll throw this out to anyone on the panel. Do you have any examples that you can point to, either named organizations or anecdotal use-case scenarios, of a better organization, an architectural approach, leveraging IT efficiently and effectively, allowing data to flow, putting in processes that are repeatable, centralized, organized, and understood? How does that work out?

Uppal: I’ll give you an example. One of the things that happens when a patient is admitted to hospital is that they get what’s called high-voltage care. There is staff around them 24×7. There are lots of people around, and every specialty that you can think of is available to them. So the patient, in about two or three days, starts to feel much better.

When that patient gets discharged, they get discharged to home most of the time. They go from very high-voltage care to next to no care. This is one of the areas where one of the organizations we work with is able to discharge the patient and, instead of discharging them to the primary care doc, who may not receive any records from the hospital for several days, discharge them into a virtual team. So if the patient is at home, the virtual team is available to them through their mobile phone 24×7.

Connect with provider

If, at 3 o’clock in the morning, the patient doesn’t feel right, instead of having to call an ambulance to go to hospital once again and get readmitted, they have a chance to connect with their care provider at that time and say, “This is what the issue is. What do you want me to do next? Is this normal for the medication that I am on, or this is something abnormal that is happening?”

When that information is available to that care provider who may not necessarily have been part of the care team when the patient was in the hospital, that quick readily available information is key for keeping that person at home, as opposed to being readmitted to the hospital.

We all know that the cost of being in a hospital is 10 times more than it is being at home. But there’s also inconvenience and human suffering associated with being in a hospital, as opposed to being at home.

Those are some of the examples that we have, but they are very limited, because our current health ecosystem is very organization-specific, not patient- and provider-specific. This is an area where there is huge room for opportunity in healthcare delivery: thinking about health information not in the context of the organization where the patient happens to be, but in a cloud, where there is an association among the patient, the provider, and the health information.

In the past, we used to have emails that were within our four walls. All of a sudden, with Gmail and Yahoo Mail, we have email available to us anywhere. A similar thing could happen for the healthcare record. This could be somewhere in a cloud ecosystem, securely protected and used only by people who have been granted access to it.

Those are some of the examples where extending that model will bring infinite value to not only reducing the cost, but improving the cost and quality of care.

Schmidt: Jason touched upon the home healthcare scenario and being able to provide touch points at home. Another place that we see evolving right now in the industry is the whole concept of the mobile office. Developing countries, as well as rural places within developed countries, are actually getting rural hospitals and rural healthcare offices dropped in by helicopter, allowing the people who live in those communities the opportunity to talk to a doctor via satellite technologies and so on.

The whole concept of an architecture around, and being able to deal with, what truly ends up being telemedicine is something that we’re seeing today. It would be wonderful if we could point to standards that allow us to facilitate both the communication protocols and the information flows in that type of setting.

Many corporations can jump on the bandwagon to help the rural communities get the healthcare information and capabilities that they need via the whole concept of telemedicine.

That’s another area where enterprise architecture comes into play. Now that we see examples of that working in the industry today, I am hoping that, as part of this working group, we’ll get to the point where we’re able to facilitate that much better, enabling innovation to occur for multiple companies via the architecture work we are planning to produce.

Single view

Gardner: It seems that in many industries we’ve come a long way on the business side toward getting a single view of the customer, as it’s called, through customer relationship management, big data, and spreading analysis across different data sources and types. This sounds like a perfect fit for a single view of the patient across their life and across their care spectrum, involving, of course, many different types of organizations. But the government also needs to have a role here.

Jim Hietala, at The Open Group Conference in Philadelphia, you’re focusing on not only healthcare, but finance and government. Regarding the government and some of the agencies that you all have as members on some of your panels, how well do they perceive this need for enterprise architecture level abilities to be brought to this healthcare issue?

Hietala: We’ve seen signs from folks in government that are encouraging to us in bringing this work to the forefront. There is a recognition that there needs to be better data flowing throughout the extended healthcare IT ecosystem, and I think generally they are supportive of initiatives like this to make that happen.

Gardner: Of course having conferences like this, where you have a cross pollination between vertical industries, will perhaps allow some of the technical people to talk with some of the government people too and also have a conversation with some of the healthcare people. That’s where some of these ideas and some of the collaboration could also be very powerful.

I’m afraid we’re almost out of time. We’ve been talking about an interesting healthcare transition, moving into a new phase or even era of healthcare.

Our panel of experts has been looking at some of the trends in IT and how they are empowering healthcare to become more responsive and efficient. And we've seen how healthcare industry organizations can undertake large-scale transformation using cross-organizational collaboration, for example, and such other tools as big data, analytics, and cloud computing to help solve some of these issues.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference this July in Philadelphia. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL, and you will hear more about healthcare and Open Platform 3.0, as well as enterprise transformation in the finance, government, and healthcare sectors.

With that, I’d like to thank our panel. We’ve been joined today by Jason Uppal, Chief Architect and Acting CEO at clinicalMessage. Thank you so much, Jason.

Uppal: Thank you, Dana.

Gardner: And also Larry Schmidt, Chief Technologist at HP for the Health and Life Sciences Industries. Thanks, Larry.

Schmidt: You bet, appreciate the time to share my thoughts. Thank you.

Gardner: And then also Jim Hietala, Vice President of Security at The Open Group. Thanks so much.

Hietala: Thank you, Dana.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout these thought leader interviews. Thanks again for listening and come back next time.

Filed under ArchiMate®, Business Architecture, Cloud, Conference, Enterprise Architecture, Healthcare, Open Platform 3.0, Professional Development, Service Oriented Architecture, TOGAF, TOGAF®

As Platform 3.0 ripens, expect agile access and distribution of actionable intelligence across enterprises, says The Open Group panel

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

This latest BriefingsDirect discussion, leading into The Open Group Conference on July 15 in Philadelphia, brings together a panel of experts to explore the business implications of the current shift to so-called Platform 3.0.

Known as the new model through which big data, cloud, and mobile and social — in combination — allow for advanced intelligence and automation in business, Platform 3.0 has so far lacked standards or even clear definitions.

The Open Group and its community are poised to change that, and we're here now to learn more about how to leverage Platform 3.0 as more than an IT shift, and as a business game-changer. It will be a big topic at next week's conference.

The panel: Dave Lounsbury, Chief Technical Officer at The Open Group; Chris Harding, Director of Interoperability at The Open Group; and Mark Skilton, Global Director in the Strategy Office at Capgemini. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

This special BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference, which is focused on enterprise transformation in the finance, government, and healthcare sectors. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL. [Disclosure: The Open Group is a sponsor of this and other BriefingsDirect podcasts.]

Here are some excerpts:

Gardner: A lot of people are still wrapping their minds around this notion of Platform 3.0, something that is a whole greater than the sum of the parts. Why is this more than an IT conversation or a shift in how things are delivered? Why are the business implications momentous?

Lounsbury: Well, Dana, there are a lot of IT changes or technical changes going on that are bringing together a lot of factors. They're turning into this sort of super-saturated solution of ideas and possibilities, and into this emerging idea that all of it represents a new platform. I think it's a pretty fundamental change.

If you look at history, not just the history of IT, but all of human history, you see that step changes in societies and organizations are frequently driven by communication or connectedness. Think about the evolution of speech or the invention of the alphabet or movable-type printing. These technical innovations that we’re seeing are bringing together these vast sources of data about the world around us and doing it in real time.

Further, we’re starting to see a lot of rapid evolution in how you turn data into information and presenting the information in a way such that people can make decisions on it. Given all that we’re starting to realize, we’re on the cusp of another step of connectedness and awareness.

Fundamental changes

This really is going to drive some fundamental changes in the way we organize ourselves. Part of what The Open Group is doing in trying to bring Platform 3.0 together is to get ahead of this and make sure we understand not just what technical standards are needed, but how businesses will need to adapt and evolve, and what business processes they need to put in place, in order to take maximum advantage of this change in the way we look at information.

Harding: Enterprises have to keep up with the way things are moving in order to keep their positions in their industries. Enterprises can't afford to be working with yesterday's technology. It's a case of being able to understand the information they're presented with, and to make the best decisions.

We’ve always talked about computers being about input, process, and output. Years ago, the input might have been through a teletype, the processing on a computer in the back office, and the output on print-out paper.

Now, we’re talking about the input being through a range of sensors and social media, the processing is done on the cloud, and the output goes to your mobile device, so you have it wherever you are when you need it. Enterprises that stick in the past are probably going to suffer.

Gardner: Mark Skilton, the ability to manage data at greater speed and scale — the three Vs of volume, velocity, and variety — could on its own be a game-changing shift in the market. The drive of mobile devices into the lives of both consumers and workers is also a very big deal.

Of course, cloud has been an ongoing evolution of emphasis towards agility and efficiency in how workloads are supported. But is there something about the combination of how these are coming together at this particular time that, in your opinion, substantiates The Open Group’s emphasis on this as a literal platform shift?

Skilton: It is exactly that in terms of the workloads. The world we’re now into is the multi-workload environment, where you have mobile workloads, storage and compute workloads, and social networking workloads. There are many different types of data and traffic today in different cloud platforms and devices.

It has to do with not just one solution, and not one subscription model, because we're now into this subscription-model era … the subscription economy, as one group tends to describe it. Now, we're looking at not just providing the security and the infrastructure to deliver this kind of capability to a mobile device, as Chris was saying. The question is, how can you do this horizontally across other platforms? How can you integrate these things? This is something that is critical to the new order.

So Platform 3.0 is addressing this point by bringing all of this together. Just look at the numbers. Look at the scale we're dealing with: 1.7 billion mobile devices sold in 2012, and an estimated 6.8 billion mobile subscriptions according to the International Telecommunication Union (ITU), equivalent to 96 percent of the world population.

Massive growth

We're seeing massive growth in the scale of mobile data traffic and internet data expansion. Mobile data traffic is increasing 18-fold from 2011 to 2016, reaching 130 exabytes annually. We passed 1 zettabyte of global online data storage back in 2010, and IP data traffic is predicted to pass 1.3 zettabytes by 2016, with internet video accounting for 61 percent of total internet data, according to Cisco studies.

These studies also predict that data-center traffic, combining network and internet-based storage, will reach 6.6 zettabytes annually, and that nearly two-thirds of this will be cloud-based by 2016. This is only going to grow as social networking reaches nearly one in four people around the world, with 1.7 billion using at least one form of social networking in 2013, rising to one in three people, a 2.55 billion global audience, by 2017, another extraordinary figure from an eMarketing.com study.

It is not surprising that many industry analysts see growth of 30 to 40 percent in the converging technologies of mobility, social computing, big data, and cloud, and that the shift to B2C commerce, which passed $1 trillion in 2012, is just the start of a wider digital transformation.

These numbers speak volumes in terms of the integration, interoperability, and connection of the new types of business and social realities that we have today.

Gardner: Why should IT be thinking about this as a fundamental shift, rather than a modest change?

Lounsbury: A lot depends on how you define your IT organization. It’s useful to separate the plumbing from the water. If we think of the water as the information that’s flowing, it’s how we make sure that the water is pure and getting to the places where you need to have the taps, where you need to have the water, etc.

But the plumbing also has to be up to the job. It needs to have the capacity. It needs to have new tools to filter out the impurities from the water. There’s no point giving someone data if it’s not been properly managed or if there’s incorrect information.

What’s going to happen in IT is not only do we have to focus on the mechanics of the plumbing, where we see things like the big database that we’ve seen in the open-source  role and things like that nature, but there’s the analytics and the data stewardship aspects of it.

We need to bring in mechanisms so the data is valid and kept up to date, and we need to indicate its freshness to the decision makers. Furthermore, IT is going to be called upon, whether as part of enterprise IT or where end users drive the selection of analytic and recommendation tools, to take the data and turn it into information. One of the things you can't do with business decision makers is overwhelm them with big rafts of data and expect them to figure it out.

You really need to present the information in a way that they can use to quickly make business decisions. That is an addition to the role of IT that may not have been there traditionally: how you think about the data, and the role of what, in the beginning, was called the data scientist, and things of that nature.
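To make the freshness point concrete, here is a minimal sketch in Python of what tagging decision-support data with its provenance might look like. Every name and threshold in it is a hypothetical illustration, not something the panel prescribes:

    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone
    from typing import Optional

    @dataclass
    class Metric:
        # A piece of decision-support data tagged with its provenance.
        name: str
        value: float
        source: str
        as_of: datetime  # when the underlying data was captured

        def age(self, now: Optional[datetime] = None) -> timedelta:
            # How old this metric is at the moment of asking.
            return (now or datetime.now(timezone.utc)) - self.as_of

        def is_stale(self, max_age: timedelta) -> bool:
            # True if the metric is older than the decision maker's tolerance.
            return self.age() > max_age

    # Hypothetical usage: flag anything older than a day on a dashboard.
    m = Metric("patients_admitted", 412.0, "regional-feed",
               datetime(2013, 7, 1, 6, 0, tzinfo=timezone.utc))
    print(m.is_stale(timedelta(days=1)))

A dashboard built on records like this can show decision makers not just the number but how current it is, which is the stewardship role Lounsbury describes.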

Shift in constituency

Skilton: I’d just like to add to Dave’s excellent points about, the shape of data has changed, but also about why should IT get involved. We’re seeing that there’s a shift in the constituency of who is using this data.

We have the Chief Marketing Officer, the Chief Procurement Officer, and other key line-of-business managers taking more direct control over the uses of information technology that enable their channels and interactions through mobile, social, and data analytics. We've got processes that were previously managed just by IT now being consumed by significant stakeholders and investors in the organization.

We have to recognize in IT that we are the masters of our own destiny. The information needs to be delivered to new types of mobile devices, with new types of data intelligence and new ways of delivering this kind of service.

I recently read an article in the MIT Sloan Management Review that asked what the role of the CIO now is. There is still the critical role of managing the security, compliance, and performance of these systems. But there is also a socialization of IT, and this is where positioning architectures that work across platforms is key to delivering real value to the business users in the IT community.

Gardner: How do we prevent this from going off the rails?

Harding: This is a very important point. And to add to the difficulties, it's not only that a whole set of different people are getting involved with different kinds of information; there's also a step change in the speed with which all this is delivered. It's no longer the case that you can say, "Oh well, we need some kind of information system to manage this information. We'll procure it and get a program written," and a year later it would be in place, delivering reports.

Now, people are looking to make sense of this information on the fly, if possible. It's really a case of having the platform, both the standard technology platform and the systems and business processes for using it, understood and in place.

Then you can do all these things quickly and build on what people have learned in the past, rather than going off into all sorts of new experimental things that might not lead anywhere. It's a case of building up the standard platform and the industry best practice. This is where The Open Group can really help things along, by being a recipient and a reflector of best practice and standards.

Skilton: Capgemini has been doing work in this area. I break it down into four levels of scalability. The first is platform scalability: understanding what you can do with your current legacy systems when introducing cloud computing or big data, and the infrastructure that gives you what we call multiplexing of resources. We're very much seeing this idea of introducing scalable platform resource management, and you see that a lot with the heritage of virtualization.

The second is network scalability. A lot of customers who have inherited old telecommunications networks are looking to introduce new MPLS-type scalable networks. The reason is that it's all about connectivity in the field. I meet a number of clients who are saying, "We've got this cloud service," or "This service is in a certain area of my country. If I move to another part of the country, or if I'm traveling, I can't get connectivity." That's the big issue of scaling.

Another is application programming interfaces (APIs). What we're seeing now is an explosion of integration and application services using API connectivity, and these are creating huge opportunities of the kind Chris Anderson of Wired used to call the "long tail effect." It is now a reality in terms of building the kind of social connectivity and data exchange that Dave was talking about.

Finally, there are the marketplaces. Companies need to think about what online marketplaces they need for digital branding, social branding, social networks, and awareness of their customers, suppliers, and employees. Customers can see that these four levels are where they need to start thinking in their IT strategy, and Platform 3.0 is aimed squarely at working out the strategies for each of these new levels of scalability.

Gardner: We’re coming up on The Open Group Conference in Philadelphia very shortly. What should we expect from that? What is The Open Group doing vis-à-vis Platform 3.0, and how can organizations benefit from seeing a more methodological or standardized approach to some way of rationalizing all of this complexity? [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

Lounsbury: We’re still in the formational stages of  “third platform” or Platform 3.0 for The Open Group as an industry. To some extent, we’re starting pretty much at the ground floor with that in the Platform 3.0 forum. We’re leveraging a lot of the components that have been done previously by the work of the members of The Open Group in cloud, services-oriented architecture (SOA), and some of the work on the Internet of things.

First step

Our first step is to bring those things together to make sure we've got a foundation to depart from. The next thing is that, through our Platform 3.0 Forum and its Steering Committee, we can ask people to talk about what their scenarios are for adoption of Platform 3.0.

That can range from the technological aspects and what standards are needed to, taking a cue from our previous cloud working group, the best business practices for understanding and then adopting some of these Platform 3.0 concepts to get your business using them.

What we’re really working toward in Philadelphia is to set up an exchange of ideas among the people who can, from the buy side, bring in their use cases from the supply side, bring in their ideas about what the technology possibilities are, and bring those together and start to shape a set of tracks where we can create business and technical artifacts that will help businesses adopt the Platform 3.0 concept.

Harding: We certainly also need to understand the business environment within which Platform 3.0 will be used. We've heard already about new players and new roles of various kinds that are appearing, and the fact that the technology is there and business is adapting to use that technology in new ways.

For example, we’ve heard about the data scientist. The data scientist is a new kind of role, a new kind of person, that is playing a particular part in all this within enterprises. We’re also hearing about marketplaces for services, new ways in which services are being made available and combined.

We really need to understand the actors in this new kind of business scenario. What are the pain points that people are having? What are the problems that need to be resolved in order to understand what kind of shape the new platform will have? That is one of the key things that the Platform 3.0 Forum members will be getting their teeth into.

Gardner: Looking to the future, we can think about how powerful the data becomes when it's processed properly, when recommendations can be delivered to the right place at the right time. But we also recognize that there are limits to a manual or even human-level approach to that: scientist by scientist, analysis by analysis.

When we think about the implications of automation, there are already some early examples of bringing cloud, data, social, mobile, and granularity of interactions together, where we've begun to see how a recommendation engine can be brought to bear. I'm thinking about the Siri capability at Apple, and even some of the examples of the Watson technology at IBM.

So to our panel, are there unknown unknowns about where this will lead in terms of having extraordinary intelligence, a supercomputer or a data center of supercomputers, brought to bear on almost any problem instantly, with the result delivered directly to a sensor, a smartphone, or any number of endpoints?

It seems that the potential here is mind boggling. Mark Skilton, any thoughts?

Skilton: What we’re talking about is the next generation of the Internet.  The advent of IPv6 and the explosion in multimedia services, will start to drive the next generation of the Internet.

I think that in the future, we’ll be talking about a multiplicity of information that is not just about services at your location or your personal lifestyle or your working preferences. We’ll see a convergence of information and services across multiple devices and new types of “co-presence services” that interact with your needs and social networks to provide predictive augmented information value.

When you start to get much more information about the context of where you are, insight into what's happening, and the predictive nature of these services, they become much more embedded in everyday life, in real time, in the context of what you are doing.

I expect to see much more intelligent applications coming forward on mobile devices in the next five to ten years, driven by this interconnected explosion of real-time processing of data, traffic, devices, and social networking that we describe in the scope of Platform 3.0. This will add augmented intelligence, and it's something that's really exciting and a complete game changer. I would call it the next killer app.

First-mover benefits

Gardner: There’s this notion of intelligence brought to bear rapidly in context, at a manageable cost. This seems to me a big change for businesses. We could, of course, go into the social implications as well, but just for businesses, that alone to me would be an incentive to get thinking and acting on this. So any thoughts about where businesses that do this well would be able to have significant advantage and first mover benefits?

Harding: Businesses are always taking stock. They understand their environments. They understand how the world they live in is changing, and they understand what part they play in it. It will be down to individual businesses to look at this new technical possibility and say, "So now this is where we could make a change to our business." It's the vision moment, where you see a combination of technical possibility and business advantage that will work for your organization.

It’s going to be different for every business, and I’m very happy to say this, it’s something that computers aren’t going to be able to do for a very long time yet. It’s going to really be down to business people to do this as they have been doing for centuries and millennia, to understand how they can take advantage of these things.

So it’s a very exciting time, and we’ll see businesses understanding and developing their individual business visions as the starting point for a cycle of business transformation, which is what we’ll be very much talking about in Philadelphia. So yes, there will be businesses that gain advantage, but I wouldn’t point to any particular business, or any particular sector and say, “It’s going to be them” or “It’s going to be them.”

Gardner: Dave Lounsbury, a last word to you. In terms of some of the future implications and vision, where could this lead in the not-too-distant future?

Lounsbury: I’d disagree a bit with my colleagues on this, and this could probably be a podcast on its own, Dana. You mentioned Siri, and I believe IBM just announced the commercial version of its Watson recommendation and analysis engine for use in some customer-facing applications.

I definitely see these as the thin end of the wedge in filling the gap between the growth of data and the analysis of data. I can imagine, not in the next couple of years but in the next couple of technology cycles, that we'll see the concept of recommendations and analysis as a service, to bring it full circle to cloud. And keep in mind that all of case law is data, and all of the medical textbooks ever written are data. Pick your industry, and there is a huge knowledge base that humans must currently keep on top of.

This approach, and these advances in recommendation engines driven by the availability of big data, are going to produce profound changes in the way knowledge workers do their jobs. That's something that businesses, including their IT functions, absolutely need to stay in front of to remain competitive in the next decade or so.

Filed under ArchiMate®, Business Architecture, Cloud, Cloud/SOA, Conference, Data management, Enterprise Architecture, Platform 3.0, Professional Development, TOGAF®

The Open Group July Conference Emphasizes Value of Placing Structure and Agility Around Enterprise Risk Reduction Efforts

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview series, coming to you in conjunction with The Open Group Conference on July 15 in Philadelphia.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these discussions on Enterprise Transformation in the finance, government, and healthcare sector.

We’re here now with a panel of experts to explore new trends and solutions in the area of anticipating risk and how to better manage organizations with that knowledge. We’ll learn how enterprises are better delivering risk assessment and, one hopes, defenses, in the current climate of challenging cyber security. And we’ll see how predicting risks and potential losses accurately, is an essential ingredient in enterprise transformation.

With that, please join me in welcoming our panel. We're here with Jack Freund, the Information Security Risk Assessment Manager at TIAA-CREF. Jack has spent over 14 years in enterprise IT, is a visiting professor at DeVry University, and chairs a risk-management subcommittee for ISACA. Welcome back, Jack.

Jack Freund: Glad to be here, Dana. Thanks for having me.

Gardner: We’re also here with Jack Jones. He is the Principal at CXOWARE, and he has more than nine years of experience as a Chief Information Security Officer (CISO). He is also an inventor of the FAIR, risk analysis framework. Welcome, Jack.

Jack Jones: Thank you very much.

Gardner: We’re also here with Jim Hietala. He is the Vice President, Security, at The Open Group. Welcome, Jim.

Jim Hietala: Thanks, Dana, good to be here.

Gardner: Let’s start with you, Jim. It’s been about six months since we spoke about these issues around risk assessment and understanding risk accurately, and it’s hard to imagine things getting any better in the last six months. There’s been a lot of news and interesting developments in the cyber-security landscape.

So has this heightened interest? What are The Open Group and others doing in this field of risk assessment and accuracy, of determining what your losses might be, and how can that be a useful tool?

Hietala: I would say it has. Certainly, in the cyber security world in the past six or nine months, we’ve seen more and more discussion of the threats that are out there. We’ve got nation-state types of threats that are very concerning, very serious, and that organizations have to consider.

With what’s happening, you’ve seen that the US Administration and President Obama direct the National Institute of Standards and Technology (NIST) to develop a new cybersecurity framework. Certainly on the government side of things, there is an increased focus on what can we do to increase the level of cybersecurity throughout the country in critical infrastructure. So my short answer would be yes, there is more interest in coming up with ways to accurately measure and assess risk so that we can then deal with it.

Gardner: Jack Jones, do you also see a maturity going on, or are we just hearing more in the news and therefore there is a perception shift? How do you see things? How have things changed, in your perception, over the last six to nine months?

Jones: I continue to see growth and maturity, especially in areas of understanding the fundamental nature of risk and exploration of quantitative methods for it. A few years ago, that would have seemed unrealistic at best, and outlandish at worst in many people’s eyes. Now, they’re beginning to recognize that it is not only pragmatic, but necessary in order to get a handle on much of what we have to do from a prioritization perspective.

Gardner: Jack Freund, are you seeing an elevation in the attention being paid to risk issues inside companies and larger organizations? Is this something that's getting the attention of all the people it should?

Freund: We’re entering a phase where there is going to be increased regulatory oversights over very nearly everything. When that happens, all eyes are going to turn to IT and IT risk management functions to answer the question of whether we’re handling the right things. Without quantifying risk, you’re going to have a very hard time saying to your board of directors that you’re handling the right things the way a reasonable company should.

As those regulators start to see and compare among other companies, they’ll find that these companies over here are doing risk quantification, and you’re not. You’re putting yourself at a competitive disadvantage by not being able to provide those same sorts of services.

Gardner: So you’re saying that the market itself hasn’t been enough to drive this, and that regulation is required?

Freund: It’s probably a stronger driver than market forces at this point. The market is always going to be able to help push that to a more prominent role, but especially in information security. If you’re not experiencing primary losses as a result of these sorts of things, then you have to look to economic externalities, which are largely put in play by regulatory forces here in the United States.

Jones: To support Jack’s statement that regulators are becoming more interested in this too, just in the last 60 days, I’ve spent time training people at two regulatory agencies on FAIR. So they’re becoming more aware of these quantitative methods, and their level of interest is rising.

Gardner: Jack Jones, this is probably a good time for us to explain a little bit more about FAIR. For those listeners who might not be that familiar with it, please take a moment to give us the high-level overview of what FAIR is.

Jones: Sure, just a thumbnail sketch of it. It is, first and foremost, a model for what risk is and how it works. It's a decomposition of the factors that make up risk. If you can measure or estimate the value of those factors, you can derive risk quantitatively, in dollars and cents.

You see a lot of “risk quantification” based on ordinal scales — 1, 2, 3, 4, 5 scales, that sort of thing. But that’s actually not quantitative. If you dig into it, there’s no way you could defend a mathematical analysis based on those ordinal approaches. So FAIR is this model for risk that enables true quantitative analysis in a very pragmatic way.

Gardner: FAIR stands for Factor Analysis of Information Risk. Is that correct?

Jones: That is correct.
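As a rough illustration of what deriving risk in dollars and cents can look like, here is a minimal Monte Carlo sketch in Python. It is not the official FAIR computation: FAIR decomposes loss event frequency and loss magnitude further, and practitioners use calibrated estimates and richer distributions. The triangular distributions and the scenario numbers below are illustrative assumptions only:

    import random

    def simulate_ale(freq_min, freq_likely, freq_max,
                     mag_min, mag_likely, mag_max, trials=100_000):
        # Estimate annualized loss exposure (ALE) by Monte Carlo simulation.
        # Loss event frequency and loss magnitude are each modeled with a
        # simple triangular distribution (min / most likely / max), standing
        # in for the calibrated estimates an analyst would elicit.
        total = 0.0
        for _ in range(trials):
            events_per_year = random.triangular(freq_min, freq_max, freq_likely)
            loss_per_event = random.triangular(mag_min, mag_max, mag_likely)
            total += events_per_year * loss_per_event
        return total / trials

    # Hypothetical scenario: a phishing-driven account compromise.
    ale = simulate_ale(0.5, 2, 6, 10_000, 75_000, 400_000)
    print(f"Estimated annualized loss exposure: ${ale:,.0f}")

Even this toy version shows the property Jones describes: the output is a defensible dollar figure, not a 1-to-5 ordinal score.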

Gardner: Jim Hietala, in addition to a very interesting and dynamic cybersecurity landscape, we also have major trends gaining traction in big data, cloud computing, and mobile. There's lots going on in the IT world. Perhaps IT's very nature, its roles and responsibilities, are shifting. Is doing risk assessment and management becoming part and parcel of the core competency of IT, and is that a fairly big departure from the past?

Hietala: As to the first question, it's having to become a standard practice within IT. When you look at outsourcing your IT operations to a cloud-service provider, you have to consider the security risks in that environment. What do they look like, and how do we measure them?

It’s the same thing for things like mobile computing. You really have to look at the risks of folks carrying tablets and smart phones, and understand the risks associated with those same things for big data. For any of these large-scale changes to our IT infrastructure you’ve got to understand what it means from a security and risk standpoint.

Gardner: Jack Freund or Jack Jones, any thoughts about the changing role of IT as a service, and the service-level-agreement brokering aspects of IT, aligned with risk assessment?

Freund: I read an interesting article this morning about a school district that is doing something they call bring your own technology (BYOT). For anybody who has been involved in these sorts of efforts in the corporate world, that should sound very familiar. But I want to think culturally about this. When you have students working out how to do these sorts of things and becoming accustomed to being able to bring current technology, oh my gosh. When they get to the corporate world and start to work, they're going to expect the same levels of service.

To answer your earlier question, absolutely. We have to find a way to embed risk assessment, which is really just a way to inform decision making, into how we adopt all of these technological changes to increase market position and make ourselves more competitive. That's important.

Whether that’s an embedded function within IT or it’s an overarching function that exists across multiple business units, there are different models that work for different size companies and companies of different cultural types. But it has to be there. It’s absolutely critical.

Gardner: Jack Jones, how do you come down on this shifting role of IT in risk assessment, as something that's their responsibility? Are they embracing that, or maybe wishing it away?

Jones: It depends on whom you talk to. Some of them would certainly like to wish it away. I don't think IT's role in risk assessment has really changed. What is changing is the level of visibility and interest within the organization, on the business side of the organization, in the IT risk position.

Previously, they were more or less tucked away in a dark corner. People just threw money at it and hoped bad things didn't happen. Now, you're getting a lot more board-level interest in IT risk, and with that visibility comes responsibility, but also a certain amount of danger if they're doing it really badly, if they're incredibly immature in how they approach risk.

They’re going to look pretty foolish in front of the board. Unfortunately, I’ve seen that play out. It’s never pretty and it’s never good news for the IT folks. They’re realizing that they need to come up to speed a little bit from a risk perspective, so that they won’t look the fools when they’re in front of these executives.

They’re used to seeing quantitative measures of opportunities and operational issues of risk of various natures. If IT comes to the table with a red, yellow, green chart, the board is left to wonder, first how to interpret that, and second, whether these guys really get it. I’m not sure the role has changed, but I think the responsibilities and level of expectations are changing.

Gardner: Part of what FAIR does, and risk analysis in general, is to identify potential losses and put some dollar figures on the potential downside. That provides IT with the tool, the ability, to rationalize the investments that are needed. Are you seeing knowledge of potential losses act as an incentive for spending on modernization?

Jones: Absolutely. One organization I worked with recently had certain deficiencies from the security perspective that they were aware of, but that were going to be very problematic to fix. They had identified technology and process solutions that they thought would take them a long way towards a better risk position. But it was a very expensive proposition, and they didn’t have money in the IT or information security budget for it.

So we did a current-state analysis using FAIR of how much loss exposure they had on an annualized basis. Then we said, "If you put this solution into place, given how it affects the frequency and magnitude of loss that you'd expect to experience, here's what your new annualized loss exposure would be." It turned out to be a multimillion-dollar reduction in annualized loss exposure for a cost of a few hundred thousand dollars.

When they took that business case to management, it was a no-brainer, and management signed the check in a hurry. So they ended up being in a much better position.

If they had gone to executive management saying, “Well, we’ve got a high risk and if we buy this set of stuff we’ll have low or medium risk,” it would’ve been a much less convincing and understandable business case for the executives. There’s reason to expect that it would have been challenging to get that sort of funding given how tight their corporate budgets were and that sort of thing. So, yeah, it can be incredibly effective in those business cases.
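The shape of that business case can be sketched with the same toy simulator from the earlier example. The figures here are invented for illustration and are not the client's actual numbers:

    # Purely illustrative estimates of the current state.
    current_ale = simulate_ale(1, 4, 10, 50_000, 300_000, 2_000_000)

    # The proposed controls reduce both how often loss events occur
    # and how severe they are when they do.
    future_ale = simulate_ale(0.2, 1, 3, 20_000, 100_000, 500_000)

    control_cost = 300_000  # assumed annualized cost of the solution
    print(f"Current ALE:        ${current_ale:,.0f}")
    print(f"Future ALE:         ${future_ale:,.0f}")
    print(f"Net annual benefit: ${current_ale - future_ale - control_cost:,.0f}")

Put in this form, a multimillion-dollar reduction in loss exposure against a few hundred thousand dollars of cost is exactly the kind of comparison an executive can sign off on quickly.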

Gardner: Correct me if I'm wrong, but you have a book out since we last spoke. Jack, maybe you could tell us a bit about it and how it bears on these issues?

Freund: Well, the book is currently being written. Jack Jones and I have entered into a contract with Elsevier, and we'll be preparing the manuscript over the summer and winter. Probably by the second quarter of next year, we'll have something we can share with everybody. It's something that has been a long time coming. For Jack, I know he has wanted to write this for a long time.

We wanted to build a conversational book about how to assess risk using FAIR, and that's an important distinction from other books on the market today, which really dig into a lot of the mathematical stuff. Speaking personally, I wanted to build a book that gave practitioners the risk tools to handle the common challenges and common opposition they face every day, and to show how to apply the concepts in FAIR in a very tangible way.

Gardner: Very good. What about the conference itself? We're coming up very rapidly on The Open Group Conference. What should we expect in terms of some of your presentations and training activities?

Jones: I think it will be a good time. People will be pleased with the quality of the presentations and some of the new information they'll get to see and experience. As you said, we're offering FAIR training as part of the conference. It's a two-day session, with an opportunity afterwards to take the certification exam.

If history is any indication, people who go through the training give us a lot of very positive remarks about a number of different things. One, they never imagined that risk could be interesting. They're also surprised that it's not, as one friend of mine calls it, "rocket surgery." It's relatively straightforward and intuitive stuff. It's just that, as a profession, we haven't had this framework for reference before, or some of the methods we apply to make it practical and defensible.

So we’ve gotten great feedback in the past, and I think people will be pleasantly surprised at what they experienced.

Freund: One of the things I always say about FAIR training is it’s a real red pill-blue pill moment — in reference to the old Matrix movies. I took FAIR training several years ago with Jack. I always tease Jack that it’s ruined me for other risk assessment methods. Once you learn how to do it right, it’s very obvious which are the wrong methods and why you can’t use them to assess risk and why it’s problematic.

I’m joking. It’s really great and valuable training, and now I use it every day. It really does open your eyes to the problems and the risk assessment portion of IT today, and gives a very practical and actionable things to do in order to be able to fix that, and to provide value to your organization.

Gardner: Jim Hietala, the emphasis in terms of vertical industries at the conference is on finance, government, and healthcare. They seem to be the right groups to be factoring in more standardization and understanding of risk. Tell me how it comes together. Why is The Open Group looking at vertical industries at this time?

Hietala: Specific to risk, if I can talk about that for a second, the healthcare world, at least here in the US, has new security rules, and one of the first requirements is to perform an annual risk assessment. So it's directly relevant to that industry.

It’s the same thing with finance. One of the regulations around financial organizations tells them that, in terms of information security, they need to do a risk assessment. In government, clearly there has been a lot of emphasis on understanding risk and mitigating it throughout various government sectors.

In terms of The Open Group and verticals, we've done lots of great work in Enterprise Architecture, security, and other areas. In terms of our conferences, we've evolved things over the last year or so to start to look at what is unique in each vertical.

It started in the mining industry. We set up a mining, metals, and exploration forum that looked at IT and architecture issues related specifically to that sector. We started that work several years ago, and now we're looking at other industries and starting to assess what is unique in healthcare, for example. We've got a one-day workshop in Philadelphia on the Tuesday of the conference, looking at IT and transformation opportunities in the healthcare sector.

That’s how we got to this point, and we’ll see more of that from The Open Group in the future.

Gardner: Are there any updates that we should be aware of in terms of activities within The Open Group and other organizations working on standards, taxonomy, and definitions when it comes to risk?

Hietala: I’ll take that and dive into that. We at The Open Group originally published a risk taxonomy standard based on FAIR four years ago. Over time, we’ve seen greater adoption by large companies and we’ve also seen the need to extend what we’re doing there. So we’re updating the risk taxonomy standard, and the new version of that should be published by the end of this summer.

We also saw within the industry the need for a certification program for risk analysts who would be trained in quantitative risk assessment using FAIR. We're working on that program, and we'll be talking more about it in Philadelphia.

Along the way, as we were building the certification program, we realized there was a missing piece in terms of the body of knowledge. So we created a second standard that is a companion to the taxonomy. It will be called the Risk Analysis Standard, and it looks more at process issues and how to do risk analysis using FAIR. That standard will also be available by the end of the summer, and, combined, the two standards will form the body of knowledge that we'll be testing against in the certification program when it goes live later this year.

Gardner: Jack Freund, it seems that between regulatory developments, the need for maturity in these enterprises, and the standardization being brought to bear by groups such as The Open Group, this is becoming quite a bit more of a science and less of an art.

What does that bring to organizations in terms of a bottom-line effect? I wonder if there is a use case or an example you could mention and explain that would help people better understand what they get back when they go through these processes and gain this better maturity around risk?

Freund: I’m not an attorney, but I have had a lot of lawyers tell me — I think Jim had mentioned before in his vertical conversation — that a lot of the regulations start with performing annual risk assessment and then choose controls based upon that. They’re not very prescriptive that way.

One of the things this drives in organizations, more than anything else, is a sense of satisfaction that we've got things covered. When the leadership of these organizations understands that you're doing what a reasonable company would do to manage risk this way, you have fewer fire drills. Nobody likes to walk into work and have to deal with a hundred different things.

We’re moving hard drives out of printers and fax machines, what are we doing around scanning and vulnerabilities, and all of those various things that every single day can inundate you with worry, as opposed to focusing on the things that matter.

I like a folksy saying that sort of sums things up pretty well — a dime holding up a dollar. You have all these little bitty squabbly issues that get in the way of really focusing on reducing risk in your organization in meaningful ways and focusing on the things that matter.

Using approaches like FAIR drives a lot of value into your organization, because you're freeing up mind share in your executives to focus on the things that really matter.

Gardner: Jack Jones, a similar question, any examples that exemplify the virtues of doing the due diligence and having some of these systems and understanding in place?

Jones: I have an example that speaks to Jack Freund's point about being able to focus and prioritize. One organization I was working with had identified a significant risk issue, and they were considering three different options for risk mitigation that had been proposed. One was "best practice," and the other two were less commonly considered for that particular issue.

An analysis showed with real clarity that option B, one of the not-best-practice options, should reduce risk every bit as effectively as best practice, but at a much lower cost. The organization then got to make an informed decision about whether they were going to be herd followers or whether they were going to be more cost-effective in risk management.

Unfortunately, there’s always danger in not following the herd. If something happens downstream, and you didn’t follow best practice, you’re often asked to explain why you didn’t follow the herd.

That was part of the analysis too, but at the end of the day, management got to make a decision on how they wanted to behave. They chose to not follow best practice and be more cost-effective in using their money. When I asked them why they felt comfortable with that, they said, “Because we’re comfortable with the rigor in your analysis.”

To your question earlier about art versus science: first of all, in most organizations there would have been no question. They would have said, "We must follow best practice." They wouldn't even have examined the options, and management wouldn't have had the opportunity to make that decision.

Furthermore, even if they had "examined" those options using a more subjective, artistic approach, somebody's wet finger in the air, management almost certainly would not have felt comfortable with a non-best-practice approach. So the more scientific, more rigorous approach that something like FAIR provides gives you all kinds of opportunity to make informed decisions and to feel more comfortable about those decisions.

Gardner: It really sounds as if there’s a synergistic relationship between a lot of the big-data and analytics investments that are being made for a variety of reasons, and also this ability to bring more science and discipline to risk analysis.

How do those come together, Jack Jones? Are we seeing the dots being connected in these large organizations, so that they can take more of what they garner from big data and business intelligence (BI) and apply it to these risk-assessment activities? Is that happening yet?

Jones: It’s just beginning to. It’s very embryonic, and there are only probably a couple of organizations out there that I would argue are doing that with any sort of effectiveness. Imagine that — they’re both using FAIR.

But when you think about BI or any sort of analytics, there are really two halves to the equation. One is data and the other is models. You can have all the data in the world, but if your models stink, you can't be effective. And, of course, vice versa: if you've got a great model and zero data, you've got challenges there as well.

Being able to combine the two, good data and effective models, puts you in a much better place. As an industry, we aren't there yet. We've got some really interesting things going on, so there's a lot of potential there, but people have to leverage that data effectively and make sure they're using a model that makes sense.

There are some models out there that frankly are just so badly broken that all the data in the world isn't going to help you. The models will grossly misinform you. So people have to be careful, because data is great, but if you're applying it to a bad model, you're in trouble.

Gardner: We are coming up near the end of our half hour. Jack Freund, for those organizations that are looking to get started, to get more mature, perhaps start leveraging some of their investments in areas like big data, in addition to attending The Open Group Conference or watching some of the plenary sessions online, what tips do you have for getting started? Are there some basic building blocks that should be in place or ways in which to get the ball rolling when it comes to a better risk analysis?

Freund: A strong personality matters in this. You have to have some sort of evangelist in the organization who cares enough about it to drive it through to completion. That puts a stake in the ground that says, "Here is where we're going to start, and here is the path we're going to follow."

When you start doing that sort of thing, even if leadership changes and other things happen, you have a strong commitment from the organization to keep moving forward on these sorts of things.

I spend a lot of my time integrating FAIR with other methodologies. One of the messaging points I keep repeating is that what we are doing is implementing a discipline around how we choose our risk rankings. That's one of the great things about FAIR. It's universally compatible with other assessment methodologies, programs, standards, and legislation, which allows you to be consistent and precise about how you connect to everything else your organization cares about.

Concerns around operational-risk integration are important as well. Driving this through to completion in the organization has a lot to do with finding sponsorship and then building the program out. But absent that high-level sponsorship, because FAIR allows you to build a discipline around how you choose rankings, you can also build it from the bottom up. You can have groups of FAIR-trained people who build risk analyses or pick ranges — 1, 2, 3, 4 or high, medium, low. Then, when questioned, you have the ability to say, "We think this is a medium, because it met the frequency and magnitude criteria that we've been establishing using FAIR."

Different organizations, culturally, are going to have different ways to implement and structure quantitative risk analysis. In the end, it's an interesting and reasonable path to get to risk utopia.
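A minimal sketch of the mapping Freund describes, from a quantified loss exposure back to the ordinal rankings the rest of the organization uses, might look like the function below. The band thresholds are invented for illustration; in practice each organization defines and documents its own criteria:

    def ale_to_ranking(ale_dollars: float) -> str:
        # Map a quantified annualized loss exposure onto an ordinal scale.
        # These thresholds are hypothetical; real bands come from the
        # organization's documented risk criteria.
        if ale_dollars < 100_000:
            return "low"
        elif ale_dollars < 1_000_000:
            return "medium"
        return "high"

    # A defensible "medium": the label now traces back to explicit criteria.
    print(ale_to_ranking(350_000))

The point is that the ordinal label stops being a wet finger in the air; it is derived from criteria that can be shown to a questioner.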

Gardner: Jack Jones, any thoughts from your perspective on a good way to get started, maybe through the lens of the verticals The Open Group has targeted for this conference: finance, government, and healthcare? Are there any important things to consider at the outset of a risk-analysis journey in any of the three verticals?

Jones: A good place to start is with the materials The Open Group has made available on the risk taxonomy and the soon-to-be-published risk analysis standard.

Another source I recommend to everybody I talk to about these sorts of things is a book called "How to Measure Anything" by Douglas Hubbard. If someone is even the least bit interested in actually measuring risk in quantitative terms, they owe it to themselves to read that book. It puts into layman's terms some very important concepts and approaches that are tremendously helpful. That's an important resource for people to consider too.

As far as within organizations, some will have a relatively mature enterprise risk-management program at the corporate level, outside of IT. It can be hit-and-miss, but there can be some very good resources there in terms of people and processes that the organization has already adopted. You have to be careful there too, though, because some of those enterprise risk-management programs, even though they may have been in place for years and thus, one would think, have matured over time, have only dug a really deep ditch in terms of bad practices and misconceptions.

So it’s worth having the conversation with those folks to gauge how clueful are they, but don’t assume that just because they have been in place for a while and they have some specific title or something like that that they really understand risk at that level.

Gardner: Well, very good. I'm afraid we'll have to leave it there. We've been talking with a panel of experts about new trends and solutions in the area of anticipating risk and how to better manage organizations with that knowledge. We've seen how enterprises are better delivering risk assessments, or beginning to, as they face challenges in cybersecurity and undertake the larger effort of enterprise transformation.

This special BriefingsDirect discussion comes to you in conjunction with The Open Group Conference in July 2013 in Philadelphia. There's more information on The Open Group website about attending the conference, following the live stream, or accessing resources through the conference app.

So with that, thanks to our panel. We've been joined by Jack Freund, the Information Security Risk Assessment Manager at TIAA-CREF. Thank you so much, Jack.

Freund: Thank you, Dana.

Gardner: And also Jack Jones, the Principal at CXOWARE. Thank you, sir.

Jones: It’s been my pleasure. Thanks.

Gardner: And then also, lastly, Jim Hietala, Vice President, Security at The Open Group. Thank you, Jim.

Hietala: Thank you, Dana.

Gardner: And this is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator throughout this series of thought leadership interviews. Thanks again for listening, and come back next time.

Filed under ArchiMate®, Business Architecture, Conference, Enterprise Architecture, Professional Development, TOGAF®

The Open Group Sydney – My Conference Highlights

By Mac Lemon, MD Australia at Enterprise Architects

Well, the dust has now settled following the conclusion of The Open Group "Enterprise Transformation" Conference, held in Sydney, Australia for the first time on April 15-20. Enterprise Architects is proud to have been recognised by The Open Group as being pivotal to the success of this event. A number of our clients, including NBN, Australia Post, QGC, RIO and Westpac, presented excellent papers on leading-edge approaches in strategy and architecture, and a number of EA's own thought leaders, Craig Martin, Christine Stephenson and Ana Kukec, also delivered widely acclaimed papers.

Attendance at the conference was impressive and demonstrated that there is substantial appetite for a dedicated event focussed on the challenges of business and technology strategy and architecture. We saw many international visitors, both as delegates and presenting papers, and there is no question that a 2014 Open Group Forum will be the stand-out event in the calendar for business and technology strategy and architecture professionals.

My top 10 take-outs from the conference include the following:

  1. The universal maturing in understanding the criticality of Business Architecture and the total convergence upon Business Capability Modelling as a cornerstone of business architecture;
  2. The improving appreciation of techniques for understanding and expressing business strategy and motivation, such as strategy maps, the business model canvas and business motivation modelling;
  3. That customer experience is emerging as a common driver for many transformation initiatives;
  4. While the process for establishing the case and roadmap for transformation appears well enough understood, the process for management of the blueprint through transformation is not and generally remains a major program risk;
  5. The next version of TOGAF® should offer a material uplift in support for security architecture, which otherwise remains at low levels of maturity from a framework standardisation perspective;
  6. ArchiMate® is generating real interest as a preferred enterprise architecture modelling notation – and stronger alignment of the ArchiMate® and TOGAF® meta models in the next version of TOGAF® is highly anticipated;
  7. There is industry demand for recognised certification of architects to demonstrate learning alongside experience as the mark of a good architect. There remains an unsatisfied requirement for certification that falls in the gap between TOGAF® and the Open CA certification;
  8. Australia can be proud of its position in having the second highest per capita TOGAF® certification globally behind the Netherlands;
  9. While the topic of interoperability in government revealed many battle-scarred veterans convinced of the hopelessness of the cause – there remain an equal number of campaigners willing to tackle the challenge, and their free and frank exchange of views was entertaining enough to justify the price of a conference ticket;
  10. Unashamedly – Enterprise Architects remains in a league of its own in the concentration of strategy and architecture thought leadership in Australia – if not globally.

Mac Lemon is the Managing Director of Enterprise Architects Pty Ltd and is based in Melbourne, Australia.

This is an extract from Mac’s recent blog post on the Enterprise Architects website, which you can view here.


Filed under ArchiMate®, Business Architecture, Certifications, Conference, Enterprise Architecture, Enterprise Transformation, Professional Development, Security Architecture, TOGAF, TOGAF®

Corso Introduces Roadmapping Support for TOGAF® 9 in its Strategic Planning Platform

By Martin Owen, CEO, Corso

Last week, we announced new roadmapping support for TOGAF® in IBM Rational System Architect®, a leading Enterprise Architecture and modeling tool.

The new TOGAF extension supports the modeling, migration and implementation of an Enterprise Architecture within Corso’s Strategic Planning Platform, which integrates Enterprise Architecture, IT planning and strategic planning into a single, comprehensive solution. The extension provides capabilities for managing current- and future-state architectures, work packages, and timelines, lifecycles and heatmaps, which are key areas for successful roadmapping and transition planning.

Corso now offers roadmapping solutions for both ArchiMate® 2.0 and TOGAF as part of its Strategic Planning Platform. Both solutions are available as a SaaS option, on-premise, or under a standard perpetual license. A roadmapping datasheet and white paper are available.

Roadmapping is critical for building change-tolerant Enterprise Architectures that accurately describe and manage strategic business transformations. Our new solution gives Enterprise Architects the tools within TOGAF to more quickly map out a transition plan with deliverables for the organization. By tying plans to the business strategy, the architects can drive a faster development and implementation lifecycle.

Our new TOGAF solution offers these key capabilities:

  • Automatic generation of timeline diagrams with milestones and dimensions.
  • Work package definitions and resources so users can group and track specific actions.
  • Heat maps that display a visual map of the state of the business and IT infrastructure and highlight cost overruns.
  • Improved gap analysis through enhanced support for plateaus and gaps.
  • Roadmap reports that enable users to see the current and future states of the architecture and work packages.
  • Integration with IBM Rational Focal Point® so that work packages and milestones can be used in portfolio management and prioritization initiatives.
  • Lifecycle support for standard states such as application portfolio management.

Corso’s Strategic Planning Platform is a comprehensive solution that integrates Enterprise Architecture, IT and strategic planning into a fully charged change process that uses cloud technology to elevate decision-making to a strategic level. This approach unites business and architecture views into one central platform and leverages existing tools and the Web to share information and decision-making across various teams within the organization. For more information about Corso and its roadmapping solutions, visit http://www.corso.co.uk.


Martin Owen, CEO of Corso, has spent over 20 years in Enterprise Architecture and is a co-author of the original Business Process Modeling Notation (BPMN) standard. Martin has run teams driving the product direction, strategy and roadmaps for the Enterprise Architecture tools at IBM.


Filed under ArchiMate®, Enterprise Architecture, TOGAF®

The Open Group Speakers Discuss Enterprise Architecture, Business Architecture and Enterprise Transformation

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here: Expert Panel Explores Enterprise Architecture and Business Architecture as Enterprise Transformation Agents, or read the transcript here.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership interview series, coming to you in conjunction with The Open Group Conference on April 15, in Sydney, Australia.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator throughout these business transformation discussions. The conference, The Open Group’s first in Australia, will focus on “How Does Enterprise Architecture Transform an Enterprise?” And there will be special attention devoted to how enterprise transformation impacts such vertical industries as finance and defense, as well as exploration, mining, and minerals. [Disclosure: The Open Group is a sponsor of BriefingsDirect podcasts.]

We’re here now with two of the main speakers at the conference — Hugh Evans, the Chief Executive Officer of Enterprise Architects, a specialist enterprise architecture (EA) firm based in Melbourne, Australia; and Craig Martin, Chief Operations Officer and Chief Architect at Enterprise Architects.

As some background, Hugh is both the founder and CEO at Enterprise Architects. His professional experience blends design and business, having started out in traditional architecture, computer games design, and digital media, before moving into enterprise IT and business transformation.
In 1999, Hugh founded the IT Strategy Architecture Forum, which included chief architects from most of the top 20 companies in Australia. He has also helped found the Australian Architecture Body of Knowledge and the London Architecture Leadership Forum in the UK.

Since starting Enterprise Architects in 2002, Hugh has grown the team to more than 100 people, with offices in Australia, the UK, and the U.S.

With a career spanning more than 20 years, Craig has held executive positions in the communications, high-tech, media, entertainment, and government markets and has operated as an Enterprise Architect and Chief Consulting Architect for much of that time.

In 2012, Craig became COO of Enterprise Architects to improve the global scalability of the organization, but he is also a key thought leader on strategy and architecture practices for the firm’s clients and across the EA field.

Craig has been a strong advocate of finding differentiation in businesses through identifying new mixes of business capabilities in those organizations. He advises that companies that do not optimize how they reassemble their capabilities will struggle, and he also believes that business decision making should be driven by economic lifecycles.

So welcome to you both. How are you doing?

Hugh Evans: Great, Dana. Good morning, and welcome everyone.

Craig Martin: Thanks very much for having us.

Big-picture perspective

Gardner: I look forward to our talk. Let’s look at this first from a big-picture perspective and then drill down into what you are going to get into at the conference in a couple of weeks. What are some of the big problems that businesses are facing and need to solve, and where can architecture-level solutions really benefit them? I’ll open this up to both Hugh and Craig.

Evans: Thanks very much, Dana. I’ll start with the trend in the industry around fast-paced change and disruptive innovation. You’ll find that many organizations, many industries, at the moment in the U.S., Australia, and around the world are struggling with the challenges of how to reinvent themselves with an increasing number of interesting and innovative business models coming through. For many organizations, this means that they need to wrap their arms around an understanding of their current business activities and what options they’ve got to leverage their strategic advantages.

We’re seeing business architecture as a tool for business model innovation, and on the other side, we’re also seeing business architecture as a tool that’s being used to better manage risk, compliance, security, and new technology trends around things like cloud, big data, and so on.

Martin: Yes, there is a strong drive within the industry to try and reduce complexity. As organizations grow, business stakeholders are confronted with a large amount of information, especially within the architecture space. We’re seeing that they’re struggling with this complexity while still having to make accurate and efficient business decisions based on all this information.

What we are seeing, building on what Hugh has already discussed, is that some of those industry drivers are around disruptive business models. For example, we’re seeing it in higher education, the utility space, and the financial services space, which are the dominant three.

There is a lot of change occurring in those spaces, and businesses are looking for ways to make themselves more agile in adapting to that change, and they are looking towards the architecture discipline, and the business-architecture discipline in particular, to help them in that process.

Gardner: I think I know a bit about how we got here — computing, globalization, outsourcing, companies expanding across borders, the ability to enter new markets freely, and dealing with security, but also great opportunity. Did I miss anything? Is there anything about the past 10 or 15 years in business practices that have led now to this need for a greater emphasis on that strategic architectural level of thinking?

Martin: A lot of it has to do with building blocks. We’ve seen a journey travelled within the architecture disciplines specifically. We call it the commodification of the business, and we’ve seen that maturity in the IT space. A lot of processes that used to be innovative in a business are now becoming utility functions, core to the business. In any Tier 1 organization, a lot of the processes that used to differentiate them are now freely available in a number of vendor platforms, and any of their competitors can acquire those.

Looking for differentiation

So they are looking for that differentiation, the ability to set themselves apart from their competitors and move away from that sort of utility space. That’s a shift that’s beginning to occur. Because a lot of those IT aspects have become industrialized, that thinking is also moving up into the business space.

In other words, how can we now take complex mysteries in the business space and codify them? How can we create building blocks for them, so that organizations can effectively work with those building blocks and string them together in different ways to solve more complex business problems?

Evans: I’ll add to that Dana. EA is now around 30 years old, but the rise in EA has really come from the need for IT systems to interoperate and to create common standards and common understanding within an organization for how an IT estate is going to come together and deliver the right type of business value.

Through the ’90s we saw the proliferation of technologies as a result of the extension of distributed computing models and the emergence of the Internet. We’ve seen now the ubiquity of the Internet and technology across business. The same sort of concepts that ring true in technology architecture extend out into the business, around how the business interoperates with its components.

The need for business to change very fast, which is occurring now in the current economy with the entrepreneurship and the innovation going on, is bringing this type of thinking to the fore. This type of thinking enables organizations to change more rapidly. The architecture itself won’t make the organization change rapidly, but it will provide the appropriate references and enable people to have the right conversations to make that happen.

Gardner: So architecture comes as a benefit when the complexity kicks in: when you try to change an organization, you don’t get lost along the way. Give me a sense of what sort of paybacks your clients get when they do this correctly, and what happens when they don’t do this very well?

Evans: Business architecture, as well as strategic architecture, is still quite a nascent capability for organizations, and many organizations are really still trying to get a grip on this. The general rule is that organizations don’t manage this so well at the moment, but they are looking to improve in this area, because of the obvious, even heuristic, payoffs that you get from being better organized.

You end up spending less money, because you’re a more efficient organization, and you end up delivering better value to customers, because you’re a more effective organization. This need for efficiency and effectiveness within organizations is worth the price of investment in this area.

The actual tangible benefits that we’re seeing across our customers include reduced cost of their IT estate.

Meeting profiles

You have improved security and improved compliance, because organizations can see where their capabilities are meeting the various risk and compliance profiles, and you are also seeing organizations bring products to market quicker. The ability to move through the product management process, bring products to market more rapidly, and respond to customer need more rapidly puts organizations in front and makes them more competitive.

The sorts of industries we’re seeing acting in this area include the postal industry, where they are moving from traditional mail to parcels as a result of the move towards online retailing. You’re also seeing it in the telco sector, and in the banking and finance sector.

In the banking and finance sector, we’ve also seen a lot of this investment driven by the merger and acquisition (M&A) activity that has come out of the financial crisis in the various countries where we operate. These organizations are getting real value from understanding where the enterprise boundaries are, how they bring the business together, how they better integrate the organizations and acquisitions, and how they better divest.

Martin: We’re seeing, especially at the strategic level, that the architecture discipline is able to give business decision makers a view into different strategic scenarios. For example, where a number of environmental factors and market pressures would have been inputs into a discussion around how to change a business, we’re also seeing business decision makers getting a lot of value from running those scenarios through an actual hypothesis of the business model.

For example, they could be considering four or five different strategic scenarios, and using the architecture discipline shows them effectively what those scenarios look like as they cascade through the business. It shows the impact on capabilities, on people, approaches and technologies, and the impact on capital expenditures (CAPEX) and operational expenditures (OPEX). Those views of each strategic scenario allow them to pull the trigger on the better scenario to pursue, before they’ve invested all that effort and analysis only to find it wasn’t the right decision in the first place. So that might be referred to as the strategic-enablement piece.
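
To make that cascade concrete, here is a minimal, hypothetical sketch in Python. It is not Enterprise Architects’ actual method or tooling: the capabilities, scenarios and dollar figures are invented, and the cascade is reduced to simple per-capability cost multipliers, purely to illustrate how alternative scenarios might be compared on CAPEX and OPEX impact before committing to one.

# Hypothetical sketch: compare strategic scenarios by cascading their
# impact through a simple capability model. All names and figures are
# invented for illustration only.

from dataclasses import dataclass, field

@dataclass
class Capability:
    name: str
    capex: float   # capital expenditure, $M
    opex: float    # operating expenditure, $M per year

@dataclass
class Scenario:
    name: str
    # Per-capability multipliers applied to the baseline (1.2 = +20 percent).
    capex_delta: dict = field(default_factory=dict)
    opex_delta: dict = field(default_factory=dict)

baseline = [
    Capability("Customer Management", capex=4.0, opex=2.5),
    Capability("Product Development", capex=6.0, opex=3.0),
    Capability("Channel Operations", capex=3.0, opex=4.5),
]

scenarios = [
    Scenario("Digital channel shift",
             capex_delta={"Channel Operations": 1.5},
             opex_delta={"Channel Operations": 0.7}),
    Scenario("Outsource product build",
             capex_delta={"Product Development": 0.4},
             opex_delta={"Product Development": 1.3}),
]

def cascade(scenario, capabilities):
    # Apply the scenario's deltas to every capability and total the result.
    capex = sum(c.capex * scenario.capex_delta.get(c.name, 1.0) for c in capabilities)
    opex = sum(c.opex * scenario.opex_delta.get(c.name, 1.0) for c in capabilities)
    return capex, opex

for s in scenarios:
    capex, opex = cascade(s, baseline)
    print(f"{s.name}: CAPEX ${capex:.1f}M, OPEX ${opex:.1f}M/yr")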

We’re also seeing a lot of value for organizations within the portfolio space. We traditionally get questions like, “I have 180 projects out there. Am I doing the right things? Are those the right 180 projects, and are they going to help me achieve the types of CAPEX and OPEX reductions that I am looking for?”

With the architecture discipline, you don’t take a portfolio lens on what’s occurring within the business; you take an architectural lens, and you’re able to give executives an overview of exactly where the spend is occurring. You give them an overview of where the duplication is occurring, and where the loss of cohesion is occurring.

Common problems

A common problem we find when we go in to do these types of gigs is the amount of duplication occurring across a number of projects. In a worst-case scenario, 75 percent of the projects are all trying to do the same thing, on the same capability, with the same processes. So there’s a reduction of complexity, and a reduction of duplicated effort, to be had across these organizations by bringing that work together into more synergistic streams.

We’re also seeing a lot of value occurring in the customer-experience space. That means taking a strong look at the customer-experience view, which is less about all of the underlying building blocks and capabilities of an organization and more about what sort of experiences we want to give our customers. What type of product offerings must we assemble, and what underlying building blocks of the organization must be assembled to enable those offerings and those value propositions?

That sort of traceability through the cycle gives you a view of what levers you must pull to optimize your customer experience. Organizations are seeing a lot of value there, and that’s increasing their effectiveness in the market and having a direct impact on their market share.

And that’s something that we see time and time again, regardless of what the driver was behind the investment in the architecture project: seeing the team interact and build a coalition for action and for change. That’s the most impressive thing that we get to see.

Gardner: Let’s drill down a little bit into some of what you’ll be discussing at the conference in Sydney in April. One of the things that’s puzzling to me when I go to these Open Group Conferences is the relationship between business architecture and IT architecture: where they converge and where they differ. Perhaps you could offer some insights and tease out some of the discussion points for that at the conference.

Martin: That’s actually quite a hot topic. In general, the architecture discipline has grown from the IT space, and that’s a good progression for it to take, because we’re seeing the fruits of that discipline in how IT components have been industrialized. We’re seeing the fruits of that in complex enterprise resource planning (ERP) systems, the modularization of those ERP systems, and their ability to be customized and adapted to businesses. It’s a fairly mature space, and the natural progression is to apply those same thinking patterns back up into the business space.

In order for this to work effectively, when somebody asks a question like that, we normally respond with a “depends” statement. We have in this organization a thing called the mandate curve, and it relates to what the mandate is within the business. What is the organization looking to solve?

Are they looking to build an HR management system? Are they looking to gain efficiencies from an enterprise-wide ERP solution? Are they looking to reduce the value-chain losses that they’re having on a monthly basis? Are they looking to improve customer experience across a group of companies? Or are they looking to improve shareholder value across the organization for an M&A, or maybe reduce the cost-to-income ratio?

Problem spaces

Those are some of the problem spaces, and we often get into that mind space to ask, “Those are the problems you are solving, but what mandate has been given to architecture to solve them?” We often find that the mandate for the IT-architecture space sits beneath the CIO, and the CIO tends to use business architecture as a communication tool with the business: a way to understand the business better and to begin to apply architectural rigor to the business process.

Evans: It’s interesting, Dana. I spent a lot of time last year in the UK, working with the team across a number of business-architecture requirements. We were building business-architecture teams. We were also delivering some projects, where the initial investigation was a business-architecture piece, and we also ran some executive roundtables in the UK.

One thing that struck me in that investigation was the separation that existed between the business-architecture community and the traditional enterprise and technology architecture or IT architecture communities in those organizations that we were dealing with.

One insurance company, in particular, that was building a business-architecture team was looking for people who didn’t necessarily have an architecture background but could apply that insight. They were looking for deep business-domain knowledge inside the various aspects of the insurance organization that they were looking to cover.

So to your question about the relationship between business architecture and IT architecture, where they converge and how they differ: it’s our view that business architecture is a subset of the broader EA picture and that these are actually integrated and unified disciplines.

However, in practice you’ll often find quite a separation between these two groups. I think the major reason is that the drivers creating the investment for business architecture are now coming from outside of IT, and to some extent IT is replicating that investment to build the engagement capability to engage with the business, so that it can have a more strategic discussion rather than just take orders from the business.

I think that over this year, we’re going to see more convergence between these two groups, and that’s certainly something that we are looking to foster in EA.

Gardner: I just came back from The Open Group Conference in California a few weeks ago, where the topic was focused largely on big data, but analysis was certainly a big part of that. Now, business analysis and business analysts, I suppose, are also part of this ecosystem. Are they subsets of the business architect? How do you see the role of business analysts now fitting into this, given the importance of data and the ability for organizations to manage data with new efficiency and scale?

Martin: Once again, that’s also a hot topic. There is a convergence occurring, and we see that across the landscape when it comes to the number of frameworks and standards that people certify on. Ultimately, it comes to this knife-edge point at which we need to interact with the business stakeholder, elicit requirements from that stakeholder, and be able to model them successfully.

The business-analysis community is slightly more mature in this particular space. They have, for example, the Business Analysis Body of Knowledge (BABOK). Within that space, they leverage a competency model, which in effect goes through a cycle from an entry-level BA right up to what they refer to as the generalist BA, which is where they see the start of the business-architecture role.

Career path

There’s a career path from the traditional business analyst role, which is around requirements elicitation and requirements management and tends to be quite project-focused: dropping into project environments, understanding stakeholder needs and requirements, modeling and documenting those, and helping the IT teams model the data flows and data structures, but with a specific link into the business space.

As you move up that curve, you get into the business-architecture space, which is a broader structural view of how all the building blocks fit together. In other words, it’s a far broader view than the traditional business analyst role would take, and it looks at a number of different domains. The business architect tends to focus a lot on, as you mentioned, the information space, and we see a difference between the information and the data space.

So the business architect is looking at performance, market-related aspects, and the customer and information domains, as well as the business processes and functional aspects of an organization. You could almost see the business analysts as the soldiers of these functions. In other words, they’re the ones in the trenches, seeing what’s working on a day-to-day basis. They’ve got a number of tools they’re equipped with, which, for example, the BABOK has given them, and there are many different techniques they use to elicit those requirements from various business stakeholders, until they move up that curve into the business-architecture and strategic-architecture space.

Evans: There’s an interesting pattern that I’ve noticed with the business-analyst-to-business-architect career journey and the traditional IT track, where you see a number of people move into solution architect roles. There might be a solution architect on a project; they might move to multiple projects and ultimately do a program, and a number of those people then pop out to a much broader enterprise view as they go through their career.

The business analyst is, in many respects, tracking that journey: business analysts might focus on a project and requirements for that project, might then look across at a higher-level view, and possibly get to a point where they have a strong domain understanding that can drive high-level strategic discussions within the organization.

There is certainly a pattern emerging, and there are great opportunities for business analysts to come across into the architecture sphere. However, I believe that the broader EA discipline does need to make the effort to bridge that gap. Architecture needs to come across and find those connection points with the analyst community and help to elevate and converge the two sides.

Gardner: Craig, in your presentation at The Open Group Conference in Sydney, what do you hope to accomplish, and will this issue of how the business analyst fits in be prominent in that?

Martin: It’s a general theme that we’re using in the lead-up to the conference. We have a couple of webinars that deal specifically with this topic. That’s leading up to the plenary talk at The Open Group Conference, which is really looking at how we can use the tools of the architecture discipline to achieve the types of outcomes that we’ve spoken about here.

Building cohesion

In other words, how do I build cohesion in an organization? How do I look at different types of scenarios that I can execute against? What are the better ways to assemble all the efforts in my organization to achieve those outcomes? We’ll work through a variety of examples that will be quite visual.

We’ll also be addressing the specific role of where we see the career path and the complementary nature of the business analyst and business architect, as they travel through the cycle of trying to operate at a strategic level and as a strategic enabler within the organization.

Gardner: Maybe you could also help me better understand something. When organizations decide that this is the right thing for them — as you mentioned earlier, this is still somewhat nascent — what are some good foundational considerations to get started? What needs to be put in place? Maybe it’s a mindset. How do you often find that enterprises get beyond the inertia and into this discussion about architecture and about the strategic benefits of it?

Martin: Once again, it’s a “depends” answer. For example, we often have two market segments, where a Tier 1 type of company would want to build the capability themselves. So there’s a journey that we need to take them on around how to build a business-architecture capability while delivering the actual outcomes.

Tier 2 and Tier 3 clients often don’t necessarily want to build that type of capability, so we would focus directly on the outcomes. And those outcomes start with two views. Traditionally, we’re seeing the work driven almost from a bottom-up view, as the sponsors of these types of exercises try to get credibility within the organization.

That relates to helping the clients build what we refer to as the utility of the business-architecture space. Our teams go in and, in effect, build a bunch of what we refer to as anchor models to try and get a consistent representation of the business and a consistent language occurring across the entire enterprise, not just within a specific project.

And that gives them a common language in which they can talk about, for example, common capabilities and the common outcomes they’re looking to achieve. In other words, it’s not just a bunch of building blocks, but the actual outcome of each of those building blocks and how each maps to something like a business-motivation model.

They also look within each of those building blocks to see what resources create each of them: things like people, process and tools. How do we mix those resources in the right way to achieve the types of outcomes the business is looking for? Normally, the first path we go through is to get that consistent language established within an organization.

As an organization matures, that artifact starts to lose its value, and we then find that, because it has created a consistent language in the organization, you can now overlay a variety of different types of views to give business people insights. Ultimately, they don’t necessarily want all these models; they actually want insight into their organizations to enable them to make decisions.

We can overlay objectives, current project spend, CAPEX, and OPEX. We can overlay where duplication is occurring, where overspend is occurring, and where conflict is occurring at a global scale around duplication of effort, together with the impact on costs, reductions and efficiencies. All of those types of questions can be answered merely by overlaying a variety of views on this common language.
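
As a toy illustration of what such an overlay might look like when reduced to data, here is a minimal, hypothetical Python sketch. It is not Enterprise Architects’ tooling, and the projects, capabilities and spend figures are invented; it simply groups project spend by a shared capability language to surface the kind of duplication described above.

# Hypothetical sketch: overlay a project-spend view on a common capability
# model to surface duplication. All names and figures are invented.

from collections import defaultdict

# Each project is mapped to the capability it changes, with its spend in $M.
projects = [
    ("CRM Refresh",              "Customer Management", 1.2),
    ("360-Degree Customer View", "Customer Management", 2.0),
    ("Single Customer File",     "Customer Management", 0.8),
    ("Warehouse Automation",     "Logistics",           3.5),
]

# The overlay: group project spend by capability so overlap becomes visible.
overlay = defaultdict(list)
for project, capability, spend in projects:
    overlay[capability].append((project, spend))

for capability, items in overlay.items():
    total = sum(spend for _, spend in items)
    flag = "  <-- possible duplication" if len(items) > 1 else ""
    print(f"{capability}: {len(items)} project(s), ${total:.1f}M{flag}")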

Elevating the value

That starts to elevate the value of these types of artifacts, and we start to see our business sponsors walking into meetings with all of these overlays and having conversations with their colleagues specifically around the insights drawn from these artifacts. We want the architecture to tell the story, not lengthy PowerPoint presentations; as people look at these types of artifacts, they actually see the insights that come directly from them.

The third and final part is often around the business getting to a level of maturity where they’re starting to use these types of artifacts and are looking for different ways they can now mix and assemble them. That’s normally a sign of a mature organization and business-architecture practice.

They have the building blocks. They’ve seen the value and the types of insights these can provide. Are there different ways that I can string together my capabilities to achieve different outcomes? Maybe I have different critical success factors that I am looking to achieve. Maybe there are new shifts or new pressures coming in from the environment. How can I assemble the underlying structures of my organization to better cope with them? That’s the third phase that we take customers through, once they get to that level of maturity.

Evans: Just to add to that, Dana, I agree with Craig on that point. If you show the business what can actually be delivered, such as views on a page that elicit the right types of discussions and demonstrate the issues, then when they see what they’re going to get, typically their eyes light up and they say, “I want one of those things.”

The thing I have noticed with architecture over the years is that it’s done by a lot of very intelligent people with great insights and great understanding, but it’s not enough just to know the answer. You have to know how to engage somebody with the material. So when the architecture content coming through is engaging, clear, understandable, and can be consumed by a variety of stakeholders, they go, “That’s what I want. I want one of those.”

So my advice to somebody who is going down this path is that if they want to get support and sponsorship for this sort of thing, make sure they get some good examples of what gets delivered when it’s done well, as that’s a great way to actually get people behind it.

Gardner: I’m afraid we will have to leave it there. We’ve been talking with Hugh Evans, the CEO of Enterprise Architects, a specialist EA firm in Melbourne; and Craig Martin, the COO and Chief Architect at Enterprise Architects. Thanks to you both.

Evans: Thanks very much, Dana. It has been a pleasure.

Martin: Thank you, Dana.

Gardner: This BriefingsDirect discussion comes to you in conjunction with The Open Group Conference, the first in Australia, on April 15 in Sydney. The focus will be on “How Does Enterprise Architecture Transform an Enterprise?”

So thanks again to both Hugh and Craig, and I know they will be joined by many more thought leaders and speakers on the EA subject and other architecture issues at the conference, and I certainly encourage our readers and listeners to attend that conference, if they’re in the Asia-Pacific region.

This is Dana Gardner, Principal Analyst at Interarbor Solutions, your host and moderator through these thought leadership interviews. Thanks again for listening, and come back next time.


Filed under ArchiMate®, Business Architecture, Conference, Enterprise Architecture, Professional Development, TOGAF®