Category Archives: Standards

The Open Group Edinburgh 2015 Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

On Monday October 19, Allen Brown, President and CEO of The Open Group, welcomed over 230 attendees from 26 countries to the Edinburgh International Conference Center located in the heart of historic Edinburgh, Scotland.

Allen kicked off the morning with an overview of company achievements and third quarter activities. The Open Group has over 500 member organizations in 42 countries, with the newest members coming from Peru and Zambia. Allen highlighted the many activities of our Forums and Work Groups; too many to list here, but white papers, guides, snapshots and standards have been published, with more in development. The newest Work Group is Digital Business Strategy and Customer Experience. The UDEF Work Group has been renamed the O-DEF (Open – Data Element Framework) Work Group. The Real Time and Embedded Systems Forum is becoming more focused on critical systems and high assurance. Our members and staff have been as productive as always!

The morning plenary featured the theme “Architecting Business Transformation” with BAES Submarines. Speakers were Stephen Cole, CIO, BAE Systems Maritime Submarines; John Wilcock, Head of Operations Transformation, BAE Systems Submarine Solutions; Matthew Heard, Senior Operations Engineer, BAE Systems Maritime Submarines; and Paul Homan, Enterprise Architect, IBM. The presentation included a history of BAES Submarines and a ‘case study’ on using TOGAF® to help define BAE’s strategy for transforming their operations and production functions. The gentlemen all advocated the need to continue to drive change and transformation through the TOGAF principles. TOGAF has provided a structured, standardized approach to solving functional problems. TOGAF also ultimately allows organizations to document and measure their success along the way for meeting business objectives.

Following the keynotes, all presenters joined Allen for a panel consisting of an engaging Q&A with the audience.

Paul Homan, John Wilcock, Matthew Heard, Stephen Cole, Allen Brown

In the afternoon, the agenda offered several tracks: Risk, Dependability and Trusted Technology; EA and Business Transformation; and Open Platform 3.0™.

One of the many sessions was “Building the Digital Enterprise – from Digital Disruption to Digital Experience” with Mark Skilton, Digital Expert, and Rob Mettler, Director of Digital Business, both with PA Consulting. The speakers discussed The Open Group’s new Work Group – Digital Business and Customer Experience – which is in the early stages of researching and developing a framework for the digital boom and the new kind of ecosystem it creates. The group examined how the channels of 15 years ago compare to today’s multi-device/channel world, which requires new thinking and processes, while “always keeping in mind, customer behavior is key”.

The evening concluded with a networking Partner Pavilion (IT4IT™, Open Platform 3.0™ and Enterprise Architecture) and a whisky tasting by the Scotch Whisky Heritage Centre.

Tuesday, October 20th began with another warm Open Group welcome by Allen Brown.

Allen and Ron Ashkenas, Senior Partner, Schaffer Consulting presented “A 20-year Perspective on the Boundaryless Organization and Boundaryless Information Flow™. The More Things Change, the More They Stay the Same”.

Ron shared the story of how the book “The Boundaryless Organization” came to be; it was published in 1995. He discussed his experiences working with Jack Welch to transform GE (General Electric). Their pondering included: “Can staff/teams be more nimble without boundaries and layers?” After much discussion, the concept of ‘boundaryless’ was born. The book showed companies how to sweep away the artificial obstacles – such as hierarchy, turf, and geography – that get in the way of outstanding business performance. The presentation was a great retrospective on boundaryless and The Open Group, but the speakers also explored how boundaryless fits today in light of a changing world. The vision of The Open Group is Boundaryless Information Flow.

Allen emphasized that “then standards were following the industry; now they’re leading the industry”. Boundaryless Information Flow does not mean no boundaries exist; rather, boundaries are permeable, enabling business rather than prohibiting it.

During the next session, Allen announced the launch of the IT4IT™ Reference Architecture v2.0 Standard. Chris Davis, University of South Florida and Chair of The Open Group IT4IT™ Forum, provided a brief overview of IT4IT and the standard. The Open Group IT4IT Reference Architecture is a standard reference architecture and value chain-based operating model for managing the business of IT.

After the announcement, Mary Jarrett, IT4IT Manager, Shell, presented “Rationale for Adopting an Open Standard for Managing IT”. In her opening, she stated her presentation was an accountant’s view of IT4IT and the Shell journey. Mary’s soundbites included: “IT adds value to businesses and increases revenue and profits; ideas of IT are changing and we need to adapt; protect cyber back door as well as physical front door.”

The afternoon tracks consisted of IT4IT™, EA Practice & Professional Development, Open Platform 3.0™, and Architecture Methods and Techniques.

The evening concluded with a fantastic private function at the historic Edinburgh Castle. Bagpipes, local culinary offerings including haggis, and dancing were enjoyed by all!


Edinburgh Castle

On Wednesday and Thursday, work sessions and member meetings were held.

A special ‘thank you’ goes to our sponsors and exhibitors: BiZZdesign, Good e-Learning, HP, Scape, Van Haren Publishing and AEA.

Other content, photos and highlights can be found via #ogEDI on Twitter.  Select videos are on The Open Group YouTube channel. For full agenda and speakers, please visit The Open Group Edinburgh 2015.

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog, media relations and social media. Loren has over 20 years’ experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.

Comments Off on The Open Group Edinburgh 2015 Highlights

Filed under boundaryless information flow, Enterprise Architecture, IT, IT4IT, Open Platform 3.0, The Open Group, The Open Group Edinburgh 2015, TOGAF

Balancing Complexity and Continuous Improvements – A Case Study from the Automotive Industry

By The Open Group

Background

The automotive industry is currently facing massive challenges. For the past 30-40 years, automakers have faced stiff competition in the marketplace, as well as constant pressure to make more innovative and efficient vehicles while reducing the costs to manufacture them.

At the same time, current technological advances are making the industry—and the technology inside automobiles—increasingly complex. Digitalization is also affecting not only how automobiles work but is forcing changes in the manufacturing process and in how automakers run their businesses. With technology now touching nearly every part of the business and how it functions, the IT landscape for automakers is becoming a web of interconnected systems running both inside and outside of the business.

In addition, with computing systems becoming a more integral part of the systems that run vehicles, the lines between traditional IT functions and IT within cars themselves are beginning to blur. With trends such as Big Data and analytics, the Internet of Things and The Open Group Open Platform 3.0™ making cars, manufacturers, dealers and owners increasingly interconnected, automotive company IT departments are being forced to get involved in areas of the business, such as product development and maintenance, in ways they’ve never been before.

Between economic forces and technological change, automakers, like many businesses today, are facing massive upheaval and the need for major transformation in order to deal with levels of business complexity they’ve never seen before.

Company

These challenges are very real for the automotive company in this case study. In addition to general economic and technological change, the company has gone through a number of transitions that have created additional infrastructure issues for the company. Over the past two decades, the company was bought then sold and bought again, bringing in two new owners and technological systems. Between the company’s original legacy IT systems and the systems brought in by its subsequent owners, the company’s IT landscape had become extremely complicated. In addition, the company is in the process of extending its footprint in the burgeoning Chinese market, a step that requires the company to invest in additional infrastructure in order to take advantage of China’s growing economic wealth to speed sales.

Between the company’s existing systems, the need to grow into emerging markets, and increased digitalization across the company and its products, the company was in need of a new approach to its overall architecture.

Problem

Although the company started early on to utilize IT to make the information flows across the company value chain as effective as possible, the existing IT environment had grown organically as the company had changed owners. In order to prepare themselves for an increasingly digital business environment, the company needed to address the increasing complexity of its systems without adding more complexity and while designing systems that could scale and run for the long haul.

Previously, the company had begun to consider using an Enterprise Architecture approach to address its growing complexity. Although the company had a number of solutions architects on staff, they soon realized that they needed a more holistic approach that could address the entire enterprise, not just the individual solutions that made up that IT landscape.

In an industry where time to market is of utmost importance, there will always be challenges in balancing short-term solutions with strategic investments. As such, the company initially decided to invest in an Enterprise Architecture capability with the objective of addressing internal complexities in order to better understand and eventually deal with them. Because TOGAF®, an Open Group standard, was seen as the de facto industry standard for Enterprise Architecture, it was the natural choice for the company’s architecture framework. The majority of the Enterprise and Solution Architects at the company were then trained and certified in TOGAF 9. Subsequently, TOGAF was adopted by the architecture community in the IT organization.

Within the IT department, TOGAF provided an ontology for discussing IT issues, and it also provided a foundation for the Enterprise Architecture repository. However, it was seen within the organization primarily as an IT architecture concern, not a framework for transformational change. The EA team decided that in order to really benefit from TOGAF and address the complexity challenges throughout the enterprise, they would need to prove that TOGAF could be used to add value throughout the entire organization and influence how changes were delivered to the IT landscape, as well as prove the value of a structured approach to addressing internal issues.

In order to prove that TOGAF could help with its overall transformation, the team decided to put together a couple of pilot projects within different business areas to showcase the benefits of using a structured approach to change. Due to a need to fix how the company sourced product components, the team decided to first pilot a TOGAF-based approach for its procurement process, since it was widely viewed as one of the most complex areas of the business.

A New Procurement Platform

The initial pilot project was aimed at modernizing the company’s procurement landscape. Although procurement is normally a fairly straightforward process, in the automotive business the intricacies and variations within the product structure, combined with a desire to control logistic costs and material flows, represented a major challenge for the company. In short, to save costs, the company only wanted to buy things they would actually use in the vehicle manufacturing process—no more, no less.

Over the years, the IT supporting the company’s procurement process had become very fragmented due to investments in various point solutions and the different partnerships that had been established over time. In addition, some parts of the system had been closed down, all of which made the information flow (including all the system integrations that had occurred along the way) very difficult to map. There were also several significant gaps in the IT support of the procurement process that severely limited its transparency and integrity.

Solution

Using TOGAF as an architecture framework and method, in conjunction with ArchiMate®, an Open Group standard, for modelling notation and Sparx Enterprise Architect (EA) as a modelling tool, the team set out to establish a roadmap for implementing a new procurement platform. The TOGAF Architecture Development Method (ADM) was used to establish the architecture vision, and the architecture development phases were completed, outlining a target architecture and a subsequent roadmap. No major adaptations were made to the ADM, but the sourcing process for the platform was run in parallel with the architecture work, requiring an iterative approach.
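For readers less familiar with the ADM, the team’s iterative pass through the architecture development phases can be sketched very roughly as follows. This is a minimal Python illustration: the phase names come from the TOGAF 9 standard, but the deliverable strings and the two-iteration loop are hypothetical, standing in for the parallel sourcing cycles described above.

```python
# Minimal sketch of repeated passes through TOGAF ADM phases A-E.
# Phase names are from TOGAF 9; everything else is illustrative.

ADM_PHASES = [
    ("A", "Architecture Vision"),
    ("B", "Business Architecture"),
    ("C", "Information Systems Architectures"),
    ("D", "Technology Architecture"),
    ("E", "Opportunities and Solutions"),
]

def run_adm_iteration(iteration: int) -> list:
    """One pass through phases A-E, yielding a labeled deliverable per phase."""
    return [
        f"Iteration {iteration}: Phase {code} - {name}"
        for code, name in ADM_PHASES
    ]

# Running sourcing in parallel forced the team back through the phases:
for i in (1, 2):
    for deliverable in run_adm_iteration(i):
        print(deliverable)
```

The point of the sketch is simply that the phases form an ordered cycle that can be re-entered, which is what made the parallel sourcing process workable.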

As part of the roadmap, the following ArchiMate views were developed:

  • Motivation views
  • Information structure views
  • Baseline and target business process views
  • Baseline and target business function views
  • Baseline and target application function views
  • Baseline and target application landscape views
  • Baseline and target application usage views
  • Baseline and target infrastructure landscape views
  • Baseline and target infrastructure usage views

Each view was created using Sparx EA configured to facilitate the ADM process and acting as the architecture repository.
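The baseline/target pairing in the views above is what makes gap analysis possible: comparing the two landscapes yields the additions and retirements the roadmap must cover. A rough illustration in Python (the application names are entirely hypothetical, not the company’s actual landscape):

```python
# Sketch: deriving roadmap gaps from baseline and target application
# landscape views. Application names are made up for illustration.

baseline_apps = {"LegacyPO", "SupplierPortal", "SpreadsheetTracker"}
target_apps = {"SupplierPortal", "ProcurementPlatform"}

to_introduce = target_apps - baseline_apps   # must be built or sourced
to_retire = baseline_apps - target_apps      # to be decommissioned
to_keep = baseline_apps & target_apps        # carried forward unchanged

print("Introduce:", sorted(to_introduce))
print("Retire:", sorted(to_retire))
print("Keep:", sorted(to_keep))
```

In practice the repository tool performs this comparison across all the view pairs listed above, not just the application landscape.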

The TOGAF ADM provided a structured approach for developing a roadmap whose results could be traced back to the original vision. Having a well-defined methodology with clear deliverables and an artifact meta-model kept the work focused, and both TOGAF and ArchiMate were relatively easy to get buy-in for.

The challenges for the project were mainly in one area: aligning the architecture development with the IT solution sourcing process. Because the company wanted to identify sourcing solutions early to assess costs and initiate negotiations, that emphasis pushed the project into identifying solution building blocks very early on. In most cases, the output from the ADM process could be used directly as input for sourcing commercial solutions; in this case, however, sourcing soon took precedence over the architecture development process. Moving through ADM phases A to E can usually be done within a couple of months, but evaluating solutions and securing funding within this company proved to be much more difficult and time-consuming.

Results

With a new procurement process roadmap in hand, the company has now begun to use the ADM to engage with and get Requests for Information (RFIs) from new suppliers. In addition, using TOGAF and ArchiMate to map the company’s procurement process and design an infrastructure roadmap helped to demystify what had been seen as an extremely complex procurement process. The project allowed the IT team to identify where the real complexities were in the process, many of which are at the component level rather than within the system itself. In addition, the company has been able to identify the areas that they need to prioritize as they begin their implementation process.

Observations

Initially, TOGAF was seen as a silver bullet within the organization. However, companies must realize that the TOGAF methodology represents best practices; any organization still needs skilled, knowledgeable Enterprise Architects with the mandate to do the work.

As part of the project, the following benefits were provided by TOGAF:

  • Provided structure to the analysis
  • Ensured a holistic perspective for all domains
  • Kept the team focused on the outcome, definition, roadmap, etc.
  • Provided a good view into current and future data for the roadmap
  • Provided proven credibility for the analysis

ArchiMate added additional support by providing well-defined viewpoints, and Sparx EA is a cost-effective modelling tool and repository that can easily be deployed to all stakeholders in an initiative.

However, within this particular organization, there were a number of challenges that needed to be overcome, many of which can hinder the adoption of TOGAF. These challenges included:

  • Competing processes, methodologies and capabilities
  • Strong focus on solution design rather than architecture
  • Strong focus on project delivery tradition rather than managing programs and outcomes
  • Governance for solutions rather than architecture

Adopting ArchiMate proved to be more straightforward internally at this organization because it could be used to address immediate modelling needs without requiring a coordinated approach to methodology and governance.

In cases such as this, it is probably best to sell the TOGAF and ArchiMate methodologies into the business organization as common sense solutions rather than as specific technology architecture methodologies. Although they may be presented as such to the EA community within the organization, it makes the decision process simpler not to oversell the technical solution, as it were, to the business, instead selling them the business benefits of the process.

Future

Currently the company is beginning to move through the implementation phase of its roadmap. Individuals throughout the organization have begun to regularly use ArchiMate as a tool for modeling different business areas, and the tools and concepts of TOGAF have been put to use successfully in several initiatives. The timeframe for formally implementing a more comprehensive Enterprise Architecture framework throughout other parts of the organization, however, has slowed due to the company’s current focus on the release of new models. This is cyclical within the company; once the immediate focus on product delivery weakens, the need for consolidation and simplification will become a priority once again.

As with most companies, the key to implementing a successful Enterprise Architecture capability within this company will come down to establishing a more effective partnership between the IT organization and the business organizations it supports. For projects such as this, early engagement is key, and the IT organization must position itself not only as a delivery organization but also as a business partner that provides investment advice and helps minimize business risk through improved processes and technology-based business transformation (as prescribed by methodologies such as TOGAF and ArchiMate). This requires a unified view of the company mission, its business objectives, and IT’s associated approaches. Project managers, business analysts and Enterprise Architects must share a common view of how to approach engagements for them to succeed. Without buy-in throughout the organization, the tools will remain techniques used by individuals, and their real potential may not be realized.

Comments Off on Balancing Complexity and Continuous Improvements – A Case Study from the Automotive Industry

Filed under ArchiMate®, big data, digital technologies, EA, IoT, Open Platform 3.0, The Open Group, TOGAF

IT4IT™ Reference Architecture Version 2.0, an Open Group Standard

By The Open Group

1 Title/Current Version

IT4IT™ Reference Architecture Version 2.0, an Open Group Standard

2 The Basics

The Open Group IT4IT Reference Architecture standard comprises a reference architecture and a value chain-based operating model for managing the business of IT.

The IT Value Chain

The IT Value Chain has four value streams supported by a reference architecture to drive efficiency and agility. The four value streams are:

  • Strategy to Portfolio
  • Request to Fulfill
  • Requirement to Deploy
  • Detect to Correct

Each IT Value Stream is centered on a key aspect of the service model, the essential data objects (information model), and functional components (functional model) that support it. Together, the four value streams play a vital role in helping IT control the service model as it advances through its lifecycle.
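One way to picture how the four value streams share control of the service model is as a service advancing through lifecycle stages, each owned by one value stream. The Python sketch below is a simplification under that assumption: the value stream names are from the standard, but the stage names and the mapping are illustrative, not the standard’s actual (much richer) service model.

```python
# Sketch: a service advancing through lifecycle stages, each owned by
# one of the four IT4IT value streams. Stage names are illustrative.

LIFECYCLE = [
    ("conceptual service", "Strategy to Portfolio"),
    ("logical service", "Requirement to Deploy"),
    ("service catalog entry", "Request to Fulfill"),
    ("running service instance", "Detect to Correct"),
]

def owner_of(stage: str) -> str:
    """Return the value stream responsible for a given lifecycle stage."""
    return dict(LIFECYCLE)[stage]

# Walk a service through its lifecycle, noting the responsible stream:
for stage, stream in LIFECYCLE:
    print(f"{stage} -> managed by {stream}")
```

The key idea the sketch captures is continuity: the same service record is handed from stream to stream, which is what gives the value chain end-to-end traceability.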

The IT4IT Reference Architecture

  • Provides prescriptive guidance on the specification of and interaction with a consistent service model backbone (common data model/context)
  • Supports real-world use-cases driven by the Digital Economy (e.g., Cloud-sourcing, Agile, DevOps, and service brokering)
  • Embraces and complements existing process frameworks and methodologies (e.g., ITIL®, COBIT®, SAFe, and TOGAF®) by taking a data-focused implementation model perspective, essentially specifying an information model across the entire value chain

3 Summary

The IT4IT Reference Architecture standard consists of the value chain and a three-layer reference architecture. Level 1 is shown below.

The IT4IT Reference Architecture provides prescriptive, holistic guidance for the implementation of IT management capabilities for today’s digital enterprise. It is positioned as a peer to comparable reference architectures such as NRF/ARTS, TMF Framework (aka eTOM), ACORD, BIAN, and other such guidance.

Together, the four value streams play a vital role in helping IT control the service model as it advances through its lifecycle:

The Strategy to Portfolio (S2P) Value Stream:

  • Provides the strategy to balance and broker your portfolio
  • Provides a unified viewpoint across PMO, enterprise architecture, and service portfolio
  • Improves data quality for decision-making
  • Provides KPIs and roadmaps to improve business communication

The Requirement to Deploy (R2D) Value Stream:

  • Provides a framework for creating, modifying, or sourcing a service
  • Supports agile and traditional development methodologies
  • Enables visibility of the quality, utility, schedule, and cost of the services you deliver
  • Defines continuous integration and deployment control points

The Request to Fulfill (R2F) Value Stream:

  • Helps your IT organization transition to a service broker model
  • Presents a single catalog with items from multiple supplier catalogs
  • Efficiently manages subscriptions and total cost of service
  • Manages and measures fulfillments across multiple suppliers

The Detect to Correct (D2C) Value Stream:

  • Brings together IT service operations to enhance results and efficiency
  • Enables end-to-end visibility using a shared configuration model
  • Identifies issues before they affect users
  • Reduces the mean time to repair
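Mean time to repair, the metric the D2C value stream aims to drive down, is simply the average elapsed time from detection to correction across incidents. A small sketch of the computation (the incident timestamps are made up for illustration):

```python
# Sketch: mean time to repair (MTTR) computed from incident records,
# the end-to-end metric the D2C value stream aims to reduce.
# Incident data below is invented for illustration.

from datetime import datetime

incidents = [
    ("2015-10-01 09:00", "2015-10-01 11:00"),  # detected, corrected
    ("2015-10-05 14:00", "2015-10-05 15:30"),
    ("2015-10-12 08:00", "2015-10-12 12:30"),
]

def mttr_hours(records) -> float:
    """Average detect-to-correct duration, in hours."""
    fmt = "%Y-%m-%d %H:%M"
    durations = [
        (datetime.strptime(end, fmt) - datetime.strptime(start, fmt)).total_seconds() / 3600
        for start, end in records
    ]
    return sum(durations) / len(durations)

print(f"MTTR: {mttr_hours(incidents):.2f} hours")
```

The shared configuration model the standard describes is what makes detection and correction timestamps consistently attributable to the same service, so a metric like this can be computed end to end.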

4 Target Audience

The target audience for the standard consists of:

  • IT executives
  • IT process analysts
  • Architects tasked with “business of IT” questions
  • Development and operations managers
  • Consultants and trainers active in the IT industry

5 Scope

The Open Group IT4IT standard is focused on defining, sourcing, consuming, and managing IT services by looking holistically at the entire IT Value Chain. While existing frameworks and standards have placed their main emphasis on process, this standard is process-agnostic, focused instead on the data needed to manage a service through its lifecycle. It then describes the functional components (software) required to produce and consume that data. Once integrated, these components form a system-of-record fabric for IT management that ensures full visibility and traceability of the service from cradle to grave.

IT4IT is neutral with respect to development and delivery models. It is intended to support Agile as well as waterfall approaches, and lean Kanban process approaches as well as fully elaborated IT service management process models.

The IT4IT Reference Architecture relates to TOGAF®, ArchiMate®, and ITIL® as shown below:

6 Relevant Website

For further details on the IT4IT Reference Architecture standard, visit www.opengroup.org/IT4IT.

Comments Off on IT4IT™ Reference Architecture Version 2.0, an Open Group Standard

Filed under IT4IT, Standards, Value Chain

The Open Group Edinburgh 2015: BAE Systems – Using TOGAF® for Operations Transformation

By The Open Group

When Matthew Heard first heard the term TOGAF®, not only did he have no idea what it was but he misspelled the name of the standard at first. It wasn’t until after searching Google for “TOGATH” that the real name for the architectural framework popped up and he got a sense for what it was, he says. And thus began a more than 15-month journey that has started Heard and his colleagues at BAE Systems, a British defense, aerospace and security systems provider, down a path to help transform the Operations function of the company’s Maritime Submarine division.

As is the case when any company looks to TOGAF, an Open Group standard, BAE’s Submarine division was in search of a structured way to help make organizational changes when they sought out the framework. According to Heard, a Senior Operations Engineer at BAE, the company’s needs were multifold. As a product manufacturer, BAE was in need of a way to prepare their systems to transition from their current product to the next generation. With a new product planned to go into production in the near future—one that would require higher technical demands and performance—the company first needed to set itself up to smoothly move into production for the higher demand product while still building the current product line.

In addition, the company wanted to make operational changes. After having surveyed 3,000 of their employees regarding what could be done to make people’s jobs easier and make the company a better place to work, the company had received 8,000 comments about how to create a better working environment. After winnowing those down to 800 separate problem statements that included ideas on how to improve things like safety, deliverables and the overall workplace, the team had many potential ideas and solutions, but no way to determine where to start.

“How do you structure things so that you don’t try to do everything at once and therefore don’t do anything because it’s too overwhelming?” Heard says. “We had a lot of change to make but we couldn’t quantify what it was and what order to do it in.”

As it happened, IBM’s Paul Homan had been doing some work on-site with BAE. When he heard that the company was looking to make some organizational changes, he suggested they look at an Enterprise Architecture framework, such as TOGAF. Although the company’s new head of transformation was familiar with the framework, there were no Enterprise Architects on staff, no TOGAF certified employees and no one else on staff had heard of the standard or of Enterprise Architecture, Heard says. Thus the mix-up the first time he tried to look it up online.

After downloading a copy of TOGAF® 9.1, Heard and his colleague John Wilcock began the task of going through the entire standard to determine if it would help them.

And then they did something very unusual.

“The first thing we did was, anything with more than three syllables, we crossed out with a black pen,” Heard says.

Why did they go through the text and black out entire sections as if it were a classified document riddled with redacted text?

According to Heard, since many of the terms used throughout the TOGAF standard are technology and IT-driven, they knew that they would need to “translate” the document to try to adapt it to their industry and make it understandable for their own internal audiences.

“It talked about ‘Enterprise Architecture,’” Heard said. “If we said that to a welder or pipe fitter, no one’s going to know what that means. I didn’t even know what it meant.”

As a recent university graduate with a background in Engineering Management, Heard says the IT terminology of TOGAF was completely foreign to him. But once they began taking out the IT-related language and began replacing it with terminology related to what submarine mechanics and people in operations would understand, they thought they might be able to better articulate the framework to others.

“We didn’t know whether we had gone so far away from the original intent or whether we were still on the right line,” Heard says.

Luckily, with Paul Homan on-site, they had someone who was familiar with TOGAF that they could go to for guidance. Homan encouraged them to continue on their path.

“For example, it might say something like ‘define the enterprise architecture vision,’” Heard says. “Well I just crossed out the word ‘architecture’ and turned the word ‘enterprise’ into ‘function’ so it said ‘define the functional vision.’ Well, I can do that. I can define what we want the function to look like and operate like and have the performance that we need. That becomes tangible. That’s when we went back to Paul and asked if we were on the right track or if we were just making it up. He said, ‘Carry on with what you’re doing.’”

As it turned out, after Heard and Wilcock had gone through the entire 900-page document, they had maintained the essence and principles of TOGAF while adapting it so that they could use the framework in the way that made the most sense to them and for BAE’s business needs. They adapted the methodology to what they needed it to do for them—which is exactly what the TOGAF ADM is meant to do anyway.

TOGAF was ultimately used to help define BAE’s strategy for transforming their operations and production functions. The project is currently at the stage where they are moving from defining a scope of projects to planning which projects to begin with. The team has scoped approximately 27 transformation projects that will take place over approximately three to five years.

Heard says that it was a fortuitous coincidence that Homan was there to suggest the framework, since it ultimately provided exactly the guidance they needed. But Heard believes it was also fortuitous that no one was familiar with the standard beforehand and that they took the risk of translating and adapting it for their own needs. He feels that had they already been trained in TOGAF before starting the project, they would have spent more time trying to shoehorn the standard into what they needed instead of adapting it from the start.

“That was the real learning there,” he says.

Now Heard says he finds himself using the framework on a daily basis for any project he has to tackle.

“It’s now become a routine go-to thing even if it’s a very small project or a piece of work. It’s very easy to understand how to get to an answer,” he says.

Heard says that by providing a structured, standardized approach to solving problems, TOGAF ultimately allows organizations to not only take a structured approach to transformational projects, but also to document and measure their success along the way, which is key for meeting business objectives.

“Standardization gives process to projects. If you follow the same approach you become more efficient. If there’s no standard, you can’t do that.”

Learn more about how BAE is using TOGAF® for Business Transformation at The Open Group Edinburgh, October 19-22, 2015.

Join the conversation #ogEDI

Matthew Heard graduated from the University of Birmingham with an MSc in Engineering Management in 2013. During his time at university, he worked as a Project Engineer for General Motors, focusing on improving the efficiency of the production line. Upon graduating, Matthew joined BAE Systems Maritime Submarines looking for a new challenge and further experience of change and improvement programmes. Since joining BAE, his predominant focus has been the delivery of operational change initiatives. Matthew undertook a review and translation of the TOGAF principles and objectives to develop a unique strategy for delivering a programme of change for the Operations function, the outputs of which delivered the Operations Transformation Strategic Intent and Work Scopes. Going forward, Matthew aims to continue developing and utilising the principles and objectives of TOGAF to aid other functions within BAE with their own future strategic developments, starting with the HR Transformation Work Scope.

John Wilcock has worked within the maritime sector for the last 27 years. Starting as a shipwright apprentice, he has worked his way up through the organisation to his current position as Head of Manufacturing & Construction Strategy. Throughout his career John has gained a wide range of experience, working on a diverse selection of defence and commercial projects, including warship and submarine platforms. During this time he has been instrumental in many change programmes, and in his current role he is responsible for the development and delivery of the functional Transformation and Build Strategies. In order to develop the Operations Transformation Strategy, John worked alongside Matthew Heard to undertake a review and translation of the TOGAF principles and objectives to create a bespoke strategic intent and work scope. John continues to drive change and transformation through the TOGAF principles.

Paul Homan is an Enterprise Architect at IBM and CTO for Aerospace, Defence & Shipbuilding, IBM UKI.

Comments Off on The Open Group Edinburgh 2015: BAE Systems – Using TOGAF® for Operations Transformation

Filed under architecture, Enterprise Transformation, Standards, TOGAF, TOGAF®

The Open Group ArchiMate® Model Exchange File Format and Archi 3.3

By Phil Beauvoir

Some of you might have noticed that Archi 3.3 has been released. This latest version of Archi includes a new plug-in which supports The Open Group ArchiMate Model Exchange File Format standard. This represents the fruits of some years and months’ labour! I’ve been collaborating with The Open Group, and representatives from associated parties and tool vendors, for some time now to produce a file format that can be used to exchange single ArchiMate models between conforming toolsets. Finally, version 1.0 of the standard has been released!

The file format uses XML, which is backed by a validating XSD Schema. Why is this? Wouldn’t XMI be better? Well, yes it would if we had a MOF representation of the ArchiMate standard. Currently, one doesn’t exist. Also, it’s very hard to agree exactly what should be formally represented in a persistence format, as against what can be usefully represented and exchanged using a persistence format. For example, ArchiMate symbols use colour to denote the different layers, and custom colour schemes can be employed to convey meaning. Clearly, this is not something that can be enforced in a specification. Probably the only things that can be enforced are the ArchiMate concepts and relations themselves. Views, viewpoints, and visual arrangements of those concepts and relations are, arguably, optional. A valid ArchiMate model could simply consist of a set of concepts and relations. However, this is probably not very useful in the real world, and so the exchange format seeks to provide a file format for describing and exchanging the most used aspects of ArchiMate models, optional aspects as well as mandatory aspects.

So, simply put, the aim of The Open Group ArchiMate Model Exchange File Format is to provide a pragmatic and useful mechanism for exchanging ArchiMate models and visual representations between compliant toolsets. It does not seek to create a definitive representation of an ArchiMate model. For that to happen, I believe many things would have to be formally declared in the ArchiMate specification. For this reason, many of the components in the exchange format are optional. For example, the ArchiMate 2.1 specification describes the use of attributes as a means to extend the language and provide additional properties to the concepts and relations. The specification does not rigidly mandate their use. However, many toolsets do support and encourage the use of attributes to create model profiles, for example. To support this, the exchange format provides a properties mechanism, consisting of typed key/value pairs. This allows implementers to (optionally) represent additional information for all of the concepts, relations and views.
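To make the shape of such a file concrete, here is a sketch of reading one in Python. The XML below is deliberately simplified and illustrative: the tag and attribute names (`element`, `identifier`, `property`) only approximate the concepts, relations and typed key/value properties described above and are not the normative schema, which is defined by the published XSD.

```python
import xml.etree.ElementTree as ET

# Simplified, made-up model fragment -- consult the published XSD for the
# real element names, namespaces, and structure.
doc = """
<model>
  <elements>
    <element identifier="e1" type="BusinessActor"><name>Customer</name></element>
    <element identifier="e2" type="BusinessService"><name>Claims Handling</name></element>
  </elements>
  <relationships>
    <relationship identifier="r1" source="e1" target="e2" type="UsedBy"/>
  </relationships>
  <properties>
    <property key="owner" type="string" value="Operations"/>
  </properties>
</model>
"""

root = ET.fromstring(doc)
# Concepts and relations are the mandatory core; typed properties are the
# optional extension mechanism.
names = [e.findtext("name") for e in root.iter("element")]
props = {p.get("key"): p.get("value") for p in root.iter("property")}
```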

Even though I have emphasised that the main use for the exchange format is exchange (the name is a bit of a giveaway here ;-)), another advantage of using XML/XSD for the file format is that it is possible to use XSLT to transform the XML ArchiMate model instances into HTML documents, reports, as input for a database, and so on. I would say that the potential for exploiting ArchiMate data in this way is huge.
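As a tiny illustration of that potential: in practice one would write an XSLT stylesheet against the validated XML, but the same idea sketched in plain Python (again with made-up element names, not the normative schema) looks like this.

```python
import xml.etree.ElementTree as ET

# A minimal model fragment; tag names are illustrative, not the published XSD.
doc = """<model>
  <element identifier="e1" type="BusinessActor"><name>Customer</name></element>
  <element identifier="e2" type="ApplicationComponent"><name>CRM</name></element>
</model>"""

root = ET.fromstring(doc)
# Walk the concepts and emit an HTML table -- the kind of report an XSLT
# transform over an exchange file could produce.
rows = "".join(
    f"<tr><td>{e.get('identifier')}</td><td>{e.get('type')}</td>"
    f"<td>{e.findtext('name')}</td></tr>"
    for e in root.iter("element")
)
html = f"<table><tr><th>ID</th><th>Type</th><th>Name</th></tr>{rows}</table>"
```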

The exchange format could also help with learning the ArchiMate language and Enterprise Architecture – imagine a repository of ArchiMate models (tagged with Dublin Core metadata to facilitate search and description) that could be used as a resource pool of model patterns and examples for those new to the language. One thing that I personally would like to see is an extensive pool of example models and model snippets as examples of good modelling practice. And using the exchange format, these models and snippets can be loaded into any supporting toolset.

Here are my five “winning features” for the ArchiMate exchange file format:

  • Transparent
  • Simple
  • Well understood format
  • Pragmatic
  • Open

I’m sure that The Open Group ArchiMate Model Exchange File Format will contribute to, and encourage, the use of the ArchiMate modelling language, and perhaps reassure users that their valuable data is not locked into any one vendor’s proprietary tool format. I personally think that this is a great initiative and that we have achieved a great result. Of course, nothing is perfect and the exchange format is still at version 1.0, so user feedback is welcome. With greater uptake the format can be improved, and we may see it being exploited in ways that we have not yet thought of!

(For more information about the exchange format, see here.)

About The Open Group ArchiMate® Model Exchange File Format:

The Open Group ArchiMate® Model Exchange File Format Standard defines a file format that can be used to exchange data between systems that wish to import and export ArchiMate models. ArchiMate Exchange Files enable exporting content from one ArchiMate modelling tool or repository and importing it into another, while retaining information describing the model and how it is structured, such as a list of model elements and relationships. The standard focuses on the packaging and transport of ArchiMate models.

The standard is available for free download from:

http://www.opengroup.org/bookstore/catalog/C154.htm.

An online resource site is available at http://www.opengroup.org/xsd/archimate.

Phil Beauvoir has been developing, writing, and speaking about software tools and development for over 25 years. He was Senior Researcher and Developer at Bangor University and, later, at the Institute for Educational Cybernetics at Bolton University, both in the UK. During this time he co-developed a peer-to-peer learning management and groupware system, a suite of software tools for authoring and delivery of standards-compliant learning objects and metadata, and tooling to create IMS Learning Design compliant units of learning. In 2010, working with the Institute for Educational Cybernetics, Phil created the open source ArchiMate Modelling Tool, Archi. Since 2013 he has been curating the development of Archi independently. Phil holds a degree in Medieval English and Anglo-Saxon Literature.

Comments Off on The Open Group ArchiMate® Model Exchange File Format and Archi 3.3

Filed under ArchiMate®, Standards, The Open Group

Using Risk Management Standards: A Q&A with Ben Tomhave, Security Architect and Former Gartner Analyst

By The Open Group

IT Risk Management is currently in a state of flux with many organizations today unsure not only how to best assess risk but also how to place it within the context of their business. Ben Tomhave, a Security Architect and former Gartner analyst, will be speaking at The Open Group Baltimore on July 20 on “The Strengths and Limitations of Risk Management Standards.”

We recently caught up with Tomhave pre-conference to discuss the pros and cons of today’s Risk Management standards, the issues that organizations are facing when it comes to Risk Management and how they can better use existing standards to their advantage.

How would you describe the state of Risk Management and Risk Management standards today?

The topic of my talk is really the state of standards for Security and Risk Management. There’s a handful of significant standards out there today, ranging from some of the work at The Open Group to NIST and the ISO 27000 series, etc. The problem with most of those is that they don’t necessarily provide a prescriptive level of guidance for how to go about performing or structuring Risk Management within an organization. If you look at ISO 31000, for example, it provides a general guideline for how to structure an overall Risk Management approach or program, but it’s not designed to be directly implementable. You can then look at something like ISO 27005, which provides a bit more detail, but for the most part these are fairly high-level guides to some of the key components; they don’t get to the point of how you should be doing Risk Management.

In contrast, one can look at something like the Open FAIR standard from The Open Group, and that gets a bit more prescriptive and directly implementable, but even then there’s a fair amount of scoping and education that needs to go on. So the short answer to the question is, there’s no shortage of documented guidance out there, but there are, however, still a lot of open-ended questions and a lot of misunderstanding about how to use these.

What are some of the limitations that are hindering risk standards then and what needs to be added?

I don’t think it’s necessarily a matter of needing to fix or change the standards themselves. I think where we’re at is a fairly prototypical stage: we have guidance on how to get started and how to structure things, but we don’t necessarily have a really good understanding across the industry about how best to make use of it. Complicating things further is an open question about just how much we need to be doing, how much value we can get from these, and whether we need to adopt some of these practices. If you look at all of the organizations that have had major breaches over the past few years, all of them, presumably, were doing some form of Risk Management—probably qualitative Risk Management—and yet they still had all these breaches anyway. Inevitably, they were compliant with any number of security standards along the way, too, and yet bad things happen. The issues lie less with the standards themselves than with how organizations are using them.

Last fall The Open Group fielded an IT Risk Management survey that found that many organizations are struggling to understand and create business value for Risk Management. What you’re saying really echoes those results. How much of this has to do with problems within organizations themselves and not having a better understanding of Risk Management?

I think that’s definitely the case. A lot of organizations are making bad decisions in many areas right now, and they don’t know why, or aren’t even aware of it, until it’s too late. As an industry we’ve got this compliance problem where you can do a lot of work and demonstrate completion or compliance with checklists and still be compromised, still have massive data breaches. I think there’s a significant cognitive dissonance that exists, and I think it’s because we’re still in a significant transitional period overall.

Security should really have never been a standalone industry or a standalone environment. Security should have just been one of those attributes of the operating system or operating environments from the outset. Unfortunately, because of the dynamic nature of IT (and we’re still going through what I refer to as this Digital Industrial Revolution that’s been going on for 40-50 years), everything’s changing everyday. That will be the case until we hit a stasis point that we can stabilize around and grow a generation that’s truly native with practices and approaches and with the tools and technologies underlying this stuff.

An analogy would be to look at Telecom. Look at Telecom in the 1800s when they were running telegraph poles and running lines along railroad tracks. You could just climb a pole, put a couple alligator clips on there and suddenly you could send and receive messages, too, using the same wires. Now we have buried lines, we have much greater integrity of those systems. We generally know when we’ve lost integrity on those systems for the most part. It took 100 years to get there. So we’re less than half that way with the Internet and things are a lot more complicated, and the ability of an attacker, one single person spending all their time to go after a resource or a target, that type of asymmetric threat is just something that we haven’t really thought about and engineered our environments for over time.

I think it’s definitely challenging. But ultimately Risk Management practices are about making better decisions. How do we put the right amount of time and energy into making these decisions and providing better information and better data around those decisions? That’s always going to be a hard question to answer. Thinking about where the standards really could stand to improve, it’s helping organizations, helping people, understand the answer to that core question—which is, how much time and energy do I have to put into this decision?

When I did my graduate work at George Washington University, a number of years ago, one of the courses we had to take went through decision management as a discipline. We would run through things like decision trees. I went back to the executives at the company that I was working at and asked them, ‘How often do you use decision trees to make your investment decisions?’ And they just looked at me funny and said, ‘Gosh, we haven’t heard of or thought about decision trees since grad school.’ In many ways, a lot of the formal Risk Management stuff that we talk about and drill into—especially when you get into the quantitative risk discussions—a lot of that goes down the same route. It’s great academically, it’s great in theory, but it’s not the kind of thing where on a daily basis you need to pull it out and use it for every single decision or every single discussion. Which, by the way, is where the FAIR taxonomy within Open FAIR provides an interesting and very valuable breakdown point. There are many cases where just using the taxonomy to break down a problem and think about it a little bit is more than sufficient, and you don’t have to go the next step of populating it with the actual quantitative estimates and do the quantitative estimations for a FAIR risk analysis. You can use it qualitatively and improve the overall quality and defensibility of your decisions.
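For readers who haven't seen one since grad school either: a decision tree reduces to a probability-weighted sum over outcomes, and a rational chooser picks the option with the best expected value. The options, probabilities, and payoffs below are invented purely for illustration.

```python
# Illustrative decision tree: each option has branches of (probability, payoff),
# and the expected value is the probability-weighted sum of payoffs.
def expected_value(branches):
    return sum(p * payoff for p, payoff in branches)

# Made-up figures: investing in controls has a certain cost plus a small chance
# of residual loss; accepting the risk is free but carries a larger exposure.
options = {
    "invest_in_controls": [(0.9, -50_000), (0.1, -150_000)],
    "accept_the_risk": [(0.7, 0), (0.3, -400_000)],
}

evs = {name: expected_value(branches) for name, branches in options.items()}
best = max(evs, key=evs.get)  # the least-negative expected loss
```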

How mature are most organizations in their understanding of risk today, and what are some of the core reasons they’re having such a difficult time with Risk Management?

The answer to that question varies to a degree by industry. Industries like financial services just seem to deal with this stuff better for the most part, but then if you look at the multibillion-dollar write-offs at JP Morgan Chase, you think maybe they don’t understand risk after all. I think for the most part most large enterprises have at least some people in the organization who have a nominal understanding of Risk Management and risk assessment and how that factors into making good decisions.

That doesn’t mean that everything’s perfect. Look at the large enterprises that had major breaches in 2014 and 2013 and clearly you can look at those and say ‘Gosh, you guys didn’t make very good decisions.’ Home Depot is a good example or even the NSA with the Snowden stuff. In both cases, they knew they had an exposure, they had done a reasonable job of risk management, they just didn’t move fast enough with their remediation. They just didn’t get stuff in place soon enough to make a meaningful difference.

For the most part, larger enterprises or organizations will have better facilities and capabilities around Risk Management, but they may have challenges with velocity in terms of being able to put issues to rest in a timely fashion. Now slip down to different sectors: look at retail, which continues to have issues with cardholder data, and that’s where the card brands are asserting themselves more aggressively. Look at healthcare. Healthcare organizations, for one thing, simply don’t have the budget or the control to make a lot of changes, and they’re well behind the curve in terms of protecting patient records and data. Then look at other spaces like SMBs, which make up more than 90 percent of U.S. employer firms, or the education space, where they simply will never have the kinds of resources to do everything that’s expected of them.

I think we have a significant challenge here – a lot of these organizations will never have the resources to have adequate Risk Management in-house, and they will always be tremendously resource-constrained, preventing them from doing all that they really need to do. The challenge for them is, how do we provide answers or tools or methods to them that they can then use that don’t require a lot of expertise but can guide them toward making better decisions overall even if the decision is ‘Why are we doing any of this IT stuff at all when we can simply be outsourcing this to a service that specializes in my industry or specializes in my SMB business size that can take on some of the risk for me that I wasn’t even aware of?’

It ends up being a very basic educational awareness problem in many regards, and many of these organizations don’t seem to be fully aware of the type of exposure and legal liability that they’re carrying at any given point in time.

One of the other IT Risk Management Survey findings was that where the Risk Management function sits in organizations is pretty inconsistent—sometimes IT, sometimes risk, sometimes security—is that part of the problem too?

Yes and no—it’s a hard question to answer directly because we have to drill in on what kind of Risk Management we’re talking about. Because there’s enterprise Risk Management reporting up to a CFO or CEO, and one could argue that the CEO is doing Risk Management.

One of the problems that we historically run into, especially from a bottom-up perspective, is a lot of IT Risk Management people or IT Risk Management professionals or folks from the audit world have mistakenly thought that everything should boil down to a single, myopic view of ‘What is risk?’ And yet it’s really not how executives run organizations. Your chief exec, your board, your CFO, they’re not looking at performance on a single number every day. They’re looking at a portfolio of risk and how different factors are balancing out against everything. So it’s really important for folks in Op Risk Management and IT Risk Management to really truly understand and make sure that they’re providing a portfolio view up the chain that adequately represents the state of the business, which typically will represent multiple lines of business, multiple systems, multiple environments, things like that.

I think one of the biggest challenges we run into is just in an ill-conceived desire to provide value that’s oversimplified. We end up hyper-aggregating results and data, and suddenly everything boils down to a stop light that IT today is either red, yellow or green. That’s not really particularly informative, and it doesn’t help you make better decisions. How can I make better investment decisions around IT systems if all I know is that today things are yellow? I think it comes back to the educational awareness topic. Maybe people aren’t always best placed within organizations but really it’s more about how they’re representing the data and whether they’re getting it into the right format that’s most accessible to that audience.

What should organizations look for in choosing risk standards?

I usually get a variety of questions and they’re all about risk assessment—‘Oh, we need to do risk assessment’ and ‘We hear about this quant risk assessment thing that sounds really cool, where do we get started?’ Inevitably, it comes down to, what’s your actual Risk Management process look like? Do you actually have a context for making decisions, understanding the business context, etc.? And the answer more often than not is no, there is no actual Risk Management process. I think really where people can leverage the standards is understanding what the overall risk management process looks like or can look like and in constructing that, making sure they identify the right stakeholders overall and then start to drill down to specifics around impact analysis, actual risk analysis around remediation and recovery. All of these are important components but they have to exist within the broader context and that broader context has to functionally plug into the organization in a meaningful, measurable manner. I think that’s really where a lot of the confusion ends up occurring. ‘Hey I went to this conference, I heard about this great thing, how do I make use of it?’ People may go through certification training but if they don’t know how to go back to their organization and put that into practice not just on a small-scale decision basis, but actually going in and plugging it into a larger Risk Management process, it will never really demonstrate a lot of value.

The other piece of the puzzle that goes along with this, too, is you can’t just take these standards and implement them verbatim; they’re not designed to do that. You have to spend some time understanding the organization, the culture of the organization and what will work best for that organization. You have to really get to know people and use these things to really drive conversations rather than hoping that one of these risk assessments results will have some meaningful impact at some point.

How can organizations get more value from Risk Management and risk standards?

Starting with the latter first, the value of the Risk Management standards is that you don’t have to start from scratch; you don’t have to reinvent the wheel. There are, in fact, very consistent and well-conceived approaches to structuring Risk Management programs and conducting risk assessment and analysis. That’s where the power of the standards comes from: they establish a template or guideline for setting things up.

The challenge of course is you have to have it well-grounded within the organization. In order to get value from a Risk Management program, it has to be part of daily operations. You have to plug it into things like procurement cycles and other similar types of decision cycles so that people aren’t just making gut decisions based off whatever their existing biases are.

One of my favorite examples is password complexity requirements. If you look back at the ‘best practice’ standards requirements over the years, going all the way back to the Orange Book in the 80s or the Rainbow Series which came out of the federal government, they tell you ‘oh, you have to have 8-character passwords and they have to have upper case, lower, numbers, special characters, etc.’ The funny thing is that while that was probably true in 1985, that is probably less true today. When we actually do risk analysis to look at the problem, and understand what the actual scenario is that we’re trying to guard against, password complexity ends up causing more problems than it solves because what we’re really protecting against is a brute force attack against a log-in interface or guessability on a log-in interface. Or maybe we’re trying to protect against a password database being compromised and getting decrypted. Well, password complexity has nothing to do with solving how that data is protected in storage. So why would we look at something like password complexity requirements as some sort of control against compromise of a database that may or may not be encrypted?
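The arithmetic behind that point is easy to check. The comparison below is illustrative only: it counts raw guess-space for an online brute-force scenario and ignores dictionary attacks and human password habits, which is exactly why the scenario analysis Tomhave describes matters.

```python
import math

# Raw guess-space: an 8-character password over ~94 printable ASCII symbols
# versus a 16-character all-lowercase passphrase. Longer-but-simpler wins.
complex_8 = 94 ** 8    # "upper, lower, numbers, special characters"
simple_16 = 26 ** 16   # lowercase letters only, but twice the length

bits_complex = math.log2(complex_8)  # roughly 52 bits of guess-space
bits_simple = math.log2(simple_16)   # roughly 75 bits of guess-space
```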

This is where Risk Management practices come into play because you can use Risk Management and risk assessment techniques to look at a given scenario—whether it be technology decisions or security control decisions, administrative or technical controls—we can look at this and say what exactly are we trying to protect against, what problem are we trying to solve? And then based on our understanding of that scenario, let’s look at the options that we can apply to achieve an appropriate degree of protection for the organization.

That ultimately is what we should be trying to achieve with Risk Management. Unfortunately, that’s usually not what we see implemented. A lot of the time, what’s described as risk management is really just an extension of audit practices and issuing a bunch of surveys, questionnaires, asking a lot of questions but never really putting it into a proper business context. Then we see a lot of bad practices applied, and we start seeing a lot of math-magical practices come in where we take categorical data—high, medium, low, more or less, what’s the impact to the business? A lot, a little—we take these categorical labels and suddenly start assigning numerical values to them and doing arithmetic calculations on them, and this is a complete violation of statistical principles. You shouldn’t be doing that at all. By definition, you don’t do arithmetic on categorical data, and yet that’s what a lot of these alleged Risk Management and risk assessment programs are doing.
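The statistical point is easy to demonstrate. In this invented example, High/Medium/Low labels are coerced to 3/2/1, and two very different risk pictures become arithmetically indistinguishable: the profile with two severe findings "averages" the same as the profile with none.

```python
# Why arithmetic on categorical labels misleads: coercing ordinal labels to
# numbers and averaging them erases exactly the information that matters.
score = {"High": 3, "Medium": 2, "Low": 1}

profile_a = ["High", "Low", "High", "Low"]            # two severe findings
profile_b = ["Medium", "Medium", "Medium", "Medium"]  # no severe findings

mean_a = sum(score[x] for x in profile_a) / len(profile_a)
mean_b = sum(score[x] for x in profile_b) / len(profile_b)
# Both "risk scores" come out to 2.0, yet the risk realities differ sharply.
```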

I think Risk Management gets a bad rap as a result of these poor practices. Conducting a survey, asking questions is not a risk assessment. A risk assessment is taking a scenario, looking at the business impact analysis for that scenario, looking at the risk tolerance, what the risk capacity is for that scenario, and then looking at what the potential threats and weaknesses are within that scenario that could negatively impact the business. That’s a risk assessment. Asking people a bunch of questions about ‘Do you have passwords? Do you use complex passwords? Have you hardened the server? Are there third party people involved?’ That’s interesting information but it’s not usually reflective of the risk state and ultimately we want to find out what the risk state is.

How do you best determine that risk state?

If you look at any of the standards—and again this is where the standards do provide some value—if you look at what a Risk Management process is and the steps that are involved in it, take for example ISO 31000—step one is establishing context, which includes establishing potential business impact or business importance, business priority for applications and data, also what the risk tolerance, risk capacity is for a given scenario. That’s your first step. Then the risk assessment step is taking that data and doing additional analysis around that scenario.

In the technical context, that’s looking at how secure is this environment, what’s the exposure of the system, who has access to it, how is the data stored or protected? From that analysis, you can complete the assessment by saying ‘Given that this is a high value asset, there’s sensitive data in here, but maybe that data is strongly encrypted and access controls have multiple layers of defense, etc., the relative risk here of a compromise or attack being successful is fairly low.’ Or ‘We did this assessment, and we found in the application that we could retrieve data even though it was supposedly stored in an encrypted state, so we could end up with a high risk statement around the business impact, we’re looking at material loss,’ or something like that.

Pulling all of these pieces together is really key, and most importantly, you cannot skip over context setting. If you don’t ever do context setting, and establish the business importance, nothing else ends up mattering. Just because a system has a vulnerability doesn’t mean that it’s a material risk to the business. And you can’t even know that unless you establish the context.
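The ordering Tomhave describes can be sketched as code. Everything here, the field names, the severity formula, and the numbers, is a made-up illustration of the principle that assessment without context is meaningless; it is not any standard's actual method.

```python
# Context first: refuse to rate a finding until business impact and risk
# tolerance have been established (the ISO 31000 step-one idea, paraphrased).
def assess(finding, context):
    if context.get("business_impact") is None or context.get("risk_tolerance") is None:
        raise ValueError("establish context before assessing risk")
    severity = finding["exposure"] * context["business_impact"]
    return "material" if severity > context["risk_tolerance"] else "acceptable"

ctx = {"business_impact": 5, "risk_tolerance": 10}
rating = assess({"exposure": 4}, ctx)  # the same vulnerability, rated in context
```

The same vulnerability rated against a low-impact context would come out "acceptable"; without any context, the function refuses to answer at all.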

In terms of getting started, leveraging the standards makes a lot of sense, but not from a perspective of this is a compliance check list that I’m going to use verbatim. You have to use it as a structured process, you have to get some training and get educated on how these things work and then what requirements you have to meet and then do what makes sense for the organizational role. At the end of the day, there’s no Easy Button for these things, you have to invest some time and energy and build something that makes sense and is functional for your organization.

To download the IT Risk Management survey summary, please click here.

Former Gartner analyst Ben Tomhave (MS, CISSP) is Security Architect for a leading online education organization, where he is putting theories into practice. He holds a Master of Science in Engineering Management (Information Security Management concentration) from The George Washington University, and is a member and former co-chair of the American Bar Association Information Security Committee, a senior member of ISSA, a former board member of the Northern Virginia OWASP chapter, and a member and former board member of the Society of Information Risk Analysts. He is a published author and an experienced public speaker, with recent speaking engagements including the RSA Conference, the ISSA International Conference, Secure360, RVAsec, RMISC, and several Gartner events.

Join the conversation! @theopengroup #ogchat #ogBWI

1 Comment

Filed under Cybersecurity, RISK Management, Security, Security Architecture, Standards, The Open Group Baltimore 2015, Uncategorized

The Open Group Madrid 2015 – Day One Highlights

By The Open Group

On Monday, April 20, Allen Brown, President & CEO of The Open Group, welcomed 150 attendees to the Enabling Boundaryless Information Flow™ summit held at the Madrid Eurobuilding Hotel.  Following are highlights from the plenary:

The Digital Transformation of the Public Administration of Spain – Domingo Javier Molina Moscoso

Domingo Molina, the first Spanish national CIO, said that governments must transform digitally to meet public expectations, stay nationally competitive, and control costs – the common transformation theme of doing more with less. Their CORA commission studied what commercial businesses did and saw the need for an ICT platform as part of the reform, along with coordination and centralization of ICT decision-making across agencies.

Three Projects:

  • Telecom consolidation – €125M savings, reduction in infrastructure and vendors
  • Reduction in number of data centers
  • Standardizing and strengthening the security platform for the central administration – only made possible by the telecom consolidation

The Future: Increasing use of mobile, social networks, online commercial services such as banking – these are the expectations of young people. The administration must therefore be in the forefront of providing digital services to citizens. They have set a transformation target of having citizens being able to interact digitally with all government services by 2020.

Q&A:

  • Any use of formal methods for transformation, such as EA? They looked at other countries and saw models such as outsourcing; they are taking a combined approach of reusing their own experts and externalizing.
  • How difficult has it been to achieve savings in Europe given labor laws? The model is to re-assign people to higher-value tasks.
  • How do you measure progress? Each unit has its own ERP for IT governance, with no unified reporting; the CIO requests and consolidates the data. They are working on a common IT tool to do this.

An Enterprise Transformation Approach for Today’s Digital Business – Fernando García Velasco

Computing has moved from tabulating systems to the internet and is now entering a “third platform” era of Cloud, Analytics, Mobile and Social (CAMS) and cognitive computing. This creates a “perfect storm” for disruption of enterprise IT delivery.

  • 58% say SMAC will reduce barriers to entry
  • 69% say it will increase competition
  • 41% expect this competition to come from outside traditional market players

These trends are being collected and consolidated in The Open Group Open Platform 3.0™ standard.

He sees the transformation happening in three ways:

  1. Top-down – a transformation view
  2. Meet in the middle: Achieving innovation through EA
  3. Bottom-up: the normal drive for incremental improvement

Gartner: EA is the discipline for leading enterprise response to disruptive forces. IDC: EA is mandatory for managing transformation to third platform.

EA Challenges & Evolution – a Younger Perspective

Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects (AEA), noted the AEA is leading the development of EA as a profession, and is holding the session to recognize the younger voices joining the EA profession. He introduced the panelists: Juan Abal, Itziar Leguinazabal, Mario Gómez Velasco, Daniel Aguado Pérez, Ignacio Macias Jareño.

The panelists talked about their journey as EAs, noting that their training focused on development with little exposure to EA or Computer Science concepts. Schools aren’t currently very interested in teaching EA, so it is hard to get a start. Steve Nunn noted the question of how to enter EA as a profession is a worldwide concern. The panelists said they started looking at EA as a way of gaining a wider perspective of the development or administrative projects they were working on. Mentoring is important, and there is a challenge in learning about the business side when coming from a technical world. Juan Abal said such guidance and mentoring by senior architects is one of the benefits the AEA chapter offers.

Q: What advice would you give to someone entering the EA career? A: If you are starting from a CS or engineering perspective, you need to start learning about the business. Gain a deep knowledge of your industry. Expect a lot of hard work, but it will have the reward of having more impact on decisions.

Q: EA is really about business and strategy. Does the AEA have a strategy for making the market aware of this? A: The Spanish AEA chapter focuses on communicating that EA is a mix, and that EAs need to develop business skills. It is a concern that young architects focus on the IT aspects of EA, and the question is how they can be shown the path to understanding the business side.

Q: Should EA be part of the IT program or the CS program in schools? A: Around the world, architects have historically come from IT, and only a few universities have specific EA programs; some offer it at the postgraduate level. The AEA is working globally to raise awareness of the need for EA education. Continuing education as part of a career development path is a good way to manage the breadth of skills a good EA needs; organizations should also be aware of the levels of The Open Group Open CA certification.

Q: If EA is connected to business, should EAs be specialized to the vertical sector, or should EA be business agnostic? A: Core EA skills are industry-agnostic, and these need to be supplemented by industry-specific reference models. Methodology, Industry knowledge and interpersonal skills are all critical, and these are developed over time.

Q: Do you use EA tools in your job? A: Not really – the experience to use complex tools comes over time.

Q: Are telecom companies adopting EA? A: Telecom companies are adopting standard reference architectures. This sector has not made much progress in EA, though it is critical for transformation in the current market. Time pressure in a changing market is also a barrier.

Q: Is EA being grown in-house or outsourced? A: We are seeing increased uptake among end-user companies in using EA to achieve transformation – this is happening across sectors and is a big opportunity in Spain right now.

Join the conversation! @theopengroup #ogMAD


Filed under Boundaryless Information Flow™, Enterprise Architecture, Internet of Things, Open Platform 3.0, Professional Development, Standards, Uncategorized