Tag Archives: standards

Mac OS X El Capitan Achieves UNIX® Certification

By The Open Group

The Open Group, an international vendor- and technology-neutral consortium, has announced that Apple, Inc. has achieved UNIX® certification for its latest operating system, Mac OS X version 10.11, known as “El Capitan.”

El Capitan was announced on September 29, 2015, having been registered as conforming to The Open Group UNIX® 03 standard on September 7, 2015.

The UNIX® trademark is owned and managed by The Open Group, and is licensed exclusively to identify operating systems that have passed the tests showing that they conform to the Single UNIX Specification, a standard of The Open Group. UNIX certified operating systems are trusted for mission-critical applications because they are powerful and robust, have a small footprint, and are inherently more secure and more stable than the alternatives.

Mac OS X is the most widely used UNIX desktop operating system, and Apple’s installed base is now over 80 million users. Its commitment to the UNIX standard as a platform enables wide portability of applications between compliant and compatible operating systems.

Leave a comment

Filed under Uncategorized

The Open Group ArchiMate® Model Exchange File Format and Archi 3.3

By Phil Beauvoir

Some of you might have noticed that Archi 3.3 has been released. This latest version of Archi includes a new plug-in which supports The Open Group ArchiMate Model Exchange File Format standard. This represents the fruit of several years’ labour! I’ve been collaborating with The Open Group, and with representatives from associated parties and tool vendors, for some time now to produce a file format that can be used to exchange single ArchiMate models between conforming toolsets. Finally, version 1.0 of the standard has been released!

The file format uses XML, backed by a validating XSD schema. Why is this? Wouldn’t XMI be better? Well, yes, it would if we had a MOF representation of the ArchiMate standard; currently, one doesn’t exist. Also, it’s very hard to agree exactly what should be formally represented in a persistence format, as opposed to what can be usefully represented and exchanged using one. For example, ArchiMate symbols use colour to denote the different layers, and custom colour schemes can be employed to convey meaning. Clearly, this is not something that can be enforced in a specification. Probably the only things that can be enforced are the ArchiMate concepts and relations themselves. Views, viewpoints, and visual arrangements of those concepts and relations are, arguably, optional. A valid ArchiMate model could simply consist of a set of concepts and relations; however, this would not be very useful in the real world, and so the exchange format seeks to provide a file format for describing and exchanging the most used aspects of ArchiMate models, optional aspects as well as mandatory ones.

So, simply put, the aim of The Open Group ArchiMate Model Exchange File Format is to provide a pragmatic and useful mechanism for exchanging ArchiMate models and visual representations between compliant toolsets. It does not seek to create a definitive representation of an ArchiMate model; for that to happen, I believe many things would have to be formally declared in the ArchiMate specification. For this reason, many of the components in the exchange format are optional. For example, the ArchiMate 2.1 specification describes the use of attributes as a means to extend the language and provide additional properties for the concepts and relations, but it does not rigidly mandate their use. However, many toolsets do support and encourage the use of attributes, for example to create model profiles. To support this, the exchange format provides a properties mechanism consisting of typed key/value pairs, which allows implementers to (optionally) represent additional information for all of the concepts, relations and views.
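To make the properties mechanism concrete, here is a sketch of how a toolset might read typed key/value properties from a model file. Note that the XML structure below is simplified and hypothetical; the element and attribute names are illustrative only and do not reproduce the standard’s actual XSD schema:

```python
import xml.etree.ElementTree as ET

# A simplified, hypothetical model fragment: the element and attribute
# names here are illustrative only, not the standard's actual schema.
doc = """
<model>
  <elements>
    <element identifier="id-1" type="BusinessActor">
      <name>Customer</name>
      <properties>
        <property key="owner" type="string">Sales</property>
        <property key="reviewed" type="boolean">true</property>
      </properties>
    </element>
  </elements>
</model>
"""

root = ET.fromstring(doc)
for element in root.iter("element"):
    name = element.findtext("name")
    # Properties are optional typed key/value pairs; a toolset that does
    # not understand a property can carry it through or ignore it.
    props = {p.get("key"): (p.get("type"), p.text) for p in element.iter("property")}
    print(name, props)
```

Because the properties are optional, a model with no properties at all is still valid input for the same reader.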

Even though I have emphasised that the main use for the exchange format is exchange (the name is a bit of a giveaway here ;-)), another advantage of using XML/XSD for the file format is that it is possible to use XSLT to transform the XML ArchiMate model instances into HTML documents, reports, as input for a database, and so on. I would say that the potential for exploiting ArchiMate data in this way is huge.
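As a sketch of that idea: Python’s standard library has no XSLT engine, so the example below performs the equivalent transformation directly in code; a real pipeline would apply an XSLT stylesheet with a processor such as xsltproc or lxml. The XML structure is again hypothetical and simplified, not the standard’s actual schema:

```python
import xml.etree.ElementTree as ET

# A hypothetical, simplified model instance (illustrative structure only).
doc = """
<model>
  <elements>
    <element identifier="id-1" type="BusinessActor"><name>Customer</name></element>
    <element identifier="id-2" type="BusinessService"><name>Claims Handling</name></element>
  </elements>
</model>
"""

root = ET.fromstring(doc)
# Emit a minimal HTML listing of the model's elements, i.e. the kind of
# report an XSLT stylesheet would produce from the same input.
rows = "".join(
    f"<li>{e.findtext('name')} ({e.get('type')})</li>" for e in root.iter("element")
)
html = f"<html><body><h1>Model elements</h1><ul>{rows}</ul></body></html>"
print(html)
```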

The exchange format could also help with learning the ArchiMate language and Enterprise Architecture – imagine a repository of ArchiMate models (tagged with Dublin Core metadata to facilitate search and description) that could be used as a resource pool of model patterns and examples for those new to the language. One thing that I personally would like to see is an extensive pool of example models and model snippets as examples of good modelling practice. And using the exchange format, these models and snippets can be loaded into any supporting toolset.

Here are my five “winning features” for the ArchiMate exchange file format:

  • Transparent
  • Simple
  • Well understood format
  • Pragmatic
  • Open

I’m sure that The Open Group ArchiMate Model Exchange File Format will contribute to, and encourage the use of, the ArchiMate modelling language, and perhaps reassure users that their valuable data is not locked into any one vendor’s proprietary tool format. I personally think that this is a great initiative and that we have achieved a great result. Of course, nothing is perfect and the exchange format is still at version 1.0, so user feedback is welcome. With greater uptake the format can be improved, and we may see it being exploited in ways that we have not yet thought of!

(For more information about the exchange format, see here.)

About The Open Group ArchiMate® Model Exchange File Format:

The Open Group ArchiMate® Model Exchange File Format Standard defines a file format that can be used to exchange data between systems that wish to import and export ArchiMate models. ArchiMate exchange files enable exporting content from one ArchiMate modelling tool or repository and importing it into another, while retaining information describing the model and how it is structured, such as a list of model elements and relationships. The standard focuses on the packaging and transport of ArchiMate models.

The standard is available for free download from:


An online resource site is available at http://www.opengroup.org/xsd/archimate.

Phil Beauvoir has been developing, writing, and speaking about software tools and development for over 25 years. He was Senior Researcher and Developer at Bangor University and, later, at the Institute for Educational Cybernetics at Bolton University, both in the UK. During this time he co-developed a peer-to-peer learning management and groupware system, a suite of software tools for authoring and delivering standards-compliant learning objects and metadata, and tooling to create IMS Learning Design compliant units of learning. In 2010, working with the Institute for Educational Cybernetics, Phil created the open source ArchiMate Modelling Tool, Archi. Since 2013 he has been curating the development of Archi independently. Phil holds a degree in Medieval English and Anglo-Saxon Literature.

Leave a comment

Filed under ArchiMate®, Standards, the open group

The Open Trusted Technology Provider™ Standard (O-TTPS) Approved as ISO/IEC International Standard

The Open Trusted Technology Provider™ Standard (O-TTPS), a Standard from The Open Group for Product Integrity and Supply Chain Security, Approved as ISO/IEC International Standard

Doing More to Secure IT Products and their Global Supply Chains

By Sally Long, The Open Group Trusted Technology Forum Director

As the Director of The Open Group Trusted Technology Forum, I am thrilled to share the news that The Open Trusted Technology Provider™ Standard – Mitigating Maliciously Tainted and Counterfeit Products (O-TTPS) v 1.1 is approved as an ISO/IEC International Standard (ISO/IEC 20243:2015).

It is one of the first standards aimed at assuring both the integrity of commercial off-the-shelf (COTS) information and communication technology (ICT) products and the security of their supply chains.

The standard defines a set of best practices for COTS ICT providers to use to mitigate the risk of maliciously tainted and counterfeit components from being incorporated into each phase of a product’s lifecycle. This encompasses design, sourcing, build, fulfilment, distribution, sustainment, and disposal. The best practices apply to in-house development, outsourced development and manufacturing, and to global supply chains.

The ISO/IEC standard will be published in the coming weeks. In advance of the ISO/IEC 20243 publication, The Open Group edition of the standard, technically identical to the ISO/IEC approved edition, is freely available here.

The standardization effort is the result of a collaboration in The Open Group Trusted Technology Forum (OTTF) between government, third-party evaluators, and some of industry’s most mature and respected providers, who came together as members and, over a period of five years, shared and built on their practices for integrity and security, including those used in-house and those used with their own supply chains. From these, they created a set of best practices that was standardized through The Open Group consensus review process as the O-TTPS. That standard was then submitted to the ISO/IEC JTC 1 process for Publicly Available Specifications (PAS), where it was recently approved.

The Open Group has also developed an O-TTPS Accreditation Program to recognize Open Trusted Technology Providers who conform to the standard and adhere to best practices across their entire enterprise, within a specific product line or business unit, or within an individual product. Accreditation is applicable to all ICT providers in the chain: OEMs, integrators, hardware and software component suppliers, value-add distributors, and resellers.

While The Open Group assumes the role of the Accreditation Authority over the entire program, it also uses third-party assessors to assess conformance to the O-TTPS requirements. The Accreditation Program and the Assessment Procedures are publicly available here. The Open Group is also considering submitting the O-TTPS Assessment Procedures to the ISO/IEC JTC1 PAS process.

This international approval comes none too soon, given that the global threat landscape continues to change dramatically and cyber attacks, which have long targeted governments and big business, are growing in sophistication and prominence. We saw this most clearly with the Sony hack late last year. Despite successes using more longstanding hacking methods, maliciously intentioned cyber criminals are looking for new ways to cause damage and increasingly see the technology supply chain as a potentially profitable avenue. In such a transitional environment, it is worth reviewing again why IT products and their supply chains are so vulnerable and what can be done to secure them in the face of numerous challenges.

Risk lies in complexity

Information technology supply chains depend upon complex and interrelated networks of component suppliers across a wide range of global partners. Suppliers deliver parts to OEMs or component integrators, who build products from them and in turn offer those products to customers directly or to system integrators, who integrate them with products from multiple providers at a customer site. This complexity leaves ample opportunity for malicious components to enter the supply chain and leave vulnerabilities that can potentially be exploited.

As a result, organizations now need assurances that they are buying from trusted technology providers who follow best practices every step of the way. This means that they not only follow secure development and engineering practices in-house while developing their own software and hardware pieces, but also that they are following best practices to secure their supply chains. Modern cyber criminals go through strenuous efforts to identify any sort of vulnerability that can be exploited for malicious gain and the supply chain is no different.

Untracked malicious behavior and counterfeit components

Tainted products introduced into the supply chain pose significant risk to organizations because altered products introduce the possibility of untracked malicious behavior. A compromised electrical component or piece of software that lies dormant and undetected within an organization could cause tremendous damage if activated externally. Customers, including governments, are moving away from building their own high-assurance, customized systems and toward commercial off-the-shelf (COTS) information and communication technology (ICT), typically because COTS products are better, cheaper and more reliable. But a maliciously tainted COTS ICT product, once connected or incorporated, poses a significant security threat. For example, it could allow unauthorized access to sensitive corporate data, including intellectual property, or allow hackers to take control of the organization’s network. Perhaps the most concerning element of the whole scenario is the amount of damage that such destructive hardware or software could inflict on safety- or mission-critical systems.

Like maliciously tainted components, counterfeit products can also cause significant damage to customers and providers resulting in failed or inferior products, revenue and brand equity loss, and disclosure of intellectual property. Although fakes have plagued manufacturers and suppliers for many years, globalization has greatly increased the number of out-sourced components and the number of links in every supply chain, and with that comes increased risk of tainted or counterfeit parts making it into operational environments. Consider the consequences if a faulty component was to fail in a government, financial or safety critical system or if it was also maliciously tainted for the sole purpose of causing widespread catastrophic damage.

Global solution for a global problem – the relevance of international standards

One of the emerging challenges is the rise of local demands on IT providers related to cybersecurity and IT supply chains. Despite technology supply chains being global in nature, more and more local solutions are cropping up to address some of the issues mentioned earlier, resulting in multiple countries with different policies that include disparate and variable requirements related to cybersecurity and supply chains. Some are competing local standards, but many are local solutions generated by governmental policies that dictate which countries to buy from and which to avoid. The supply chain has become a nationally charged issue that requires the creation of a level playing field regardless of where your company is based. Competition should be based on the quality, integrity and security of your products and processes, not on where the products were developed, manufactured, or assembled.

Having transparent criteria through global international standards like the recently approved O-TTPS (ISO/IEC 20243), and objective assessments like the O-TTPS Accreditation Program that help assure conformance to those standards, is critical both to raise the bar on global suppliers and to provide equal opportunity (vendor-neutral and country-neutral) for all constituents in the chain to reach that bar, regardless of locale.

The approval by ISO/IEC of this universal product integrity and supply chain security standard is an important next step in the continued battle to secure ICT products and protect the environments in which they operate. Suppliers should explore what they need to do to conform to the standard, and buyers should consider encouraging conformance by requesting it in their RFPs. By adhering to relevant international standards and demonstrating conformance, technology providers and component suppliers around the world will have a powerful tool for combating current and future cyber attacks on our critical infrastructure, our governments, our business enterprises and even the COTS ICT in our homes. This is truly a universal problem that we can begin to solve through adoption of, and adherence to, international standards.

Sally Long is the Director of The Open Group Trusted Technology Forum (OTTF). She has managed customer-supplier forums and collaborative development projects for over twenty years. She was the release engineering section manager for all multi-vendor collaborative technology development projects at The Open Software Foundation (OSF) in Cambridge, Massachusetts. Following the merger of the OSF and X/Open under The Open Group, she served as director for multiple forums in The Open Group. Sally has a Bachelor of Science degree in Electrical Engineering from Northeastern University in Boston, Massachusetts.

Contact:  s.long@opengroup.org; @sallyannlong

Leave a comment

Filed under Uncategorized

A World Without IT4IT: Why It’s Time to Run IT Like a Business

By Dave Lounsbury, CTO, The Open Group

IT departments today are under enormous pressure. In the digital world, businesses have become dependent on IT to help them remain competitive. However, traditional IT departments have their roots in skills such as development or operations and have not been set up to handle a business and technology environment that is trying to rapidly adapt to a constantly changing marketplace. As a result, many IT departments today may be headed for a crisis.

At one time, IT departments led technology adoption in support of business. Once a new technology was created—departmental servers, for instance—it took a relatively long time before businesses took advantage of it and even longer before they became dependent on the technology. But once a business did adopt the technology, it became subject to business rules—expectations and parameters for reliability, maintenance and upgrades that kept the technology up to date and allowed the business it supported to keep up with the market.

As IT became more entrenched in organizations throughout the 1980s and 1990s, IT systems increased in size and scope as technology companies fought to keep pace with market forces. In large enterprises, in particular, IT’s function became to maintain large infrastructures, requiring small armies of IT workers to sustain them.

A number of forces have combined to change all that. Today, most businesses do their business operations digitally—what Constellation Research analyst Andy Mulholland calls “Front Office Digital Business.” Technology-as-a-service models have changed how the technologies and applications are delivered and supported, with support and upgrades coming from outsourced vendors, not in-house staff. With Cloud models, an IT department may not even be necessary. Entrepreneurs can spin up a company with a swipe of a credit card and have all the technology they need at their fingertips, hosted remotely in the Cloud.

The Gulf between IT and Business

Although the gap between IT and business is closing, the gulf in how IT is run still remains. In structure, most IT departments today remain close to their technology roots. This is, in part, because IT departments are still run by technologists and engineers whose primary skills lie in the challenge (and excitement) of creating new technologies. Not every skilled engineer makes a good businessperson, but in most organizations, people who are good at their jobs often get promoted into management whether or not they are ready to manage. The Peter Principle is a problem that hinders many organizations, not just IT departments.

What has happened is that IT departments have not traditionally been run as if they were a business. Good business models for how IT should be run have been piecemeal or slow to develop, despite IT’s role in how the rest of the business is run. Although some standards have been developed as guides for how different parts of IT should be run (COBIT for governance; ITIL for service management; TOGAF®, an Open Group standard, for architecture), no overarching standard has been developed that encompasses how to holistically manage all of IT, from systems administration to development to management through governance and, of course, staffing. For all its advances, IT has yet to become a well-oiled business machine.

The business and technological climate today is not the same as it was when companies took three years to do a software upgrade. Everything in today’s climate happens nearly instantaneously. “Convergence” technologies like Cloud Computing, Big Data, social media, mobile and the Internet of Things are changing the nature of IT, and new technical skills and methodologies are emerging every day. Although languages such as Java or C may remain the top programming languages, new languages like Pig or Hive are emerging every day, as are new approaches to development, such as Scrum, Agile or DevOps.

The Consequences of IT Business as Usual

With these various forces facing IT, departments will either need to change, adopting a model where IT is managed more effectively, or face an impending chaos that ends up hindering their organizations.

Without an effective management model for IT, companies won’t be able to mobilize quickly for a digital age. Even something as simple as an inability to utilize data could result in problems such as investing in a product prototype that customers aren’t interested in. Those are mistakes most companies can’t afford to make these days.

Having an umbrella view of what all of IT does also allows the department to make better decisions. With technology and development trends changing so quickly, how do you know what will fit your organization’s business goals? You want to take advantage of the trends or technologies that make sense for the company and leave behind those that don’t.

For example, in DevOps, one of the core concepts is to bring the development phase into closer alignment with releasing and operating the software. You need to know your business’s operating model to determine whether this approach will actually work or not. Having a sense of that also allows IT to make decisions about whether it’s wise to invest in training or hiring staff skilled in those methods or buying new technologies that will allow you to adopt the model.

Not having that management view can leave companies subject to the whims of technological evolution and also to current IT fads. If you don’t know what’s valuable to your business, you run the risk of chasing every new fad that comes along. There’s nothing worse—as the IT guy—than being the person who comes to the management meeting each month saying you’re trying yet another new approach to solve a problem that never seems to get solved. Business people won’t respond to that and will wonder if you know what you’re doing. IT needs to be decisive and choose wisely.

These issues not only affect the IT department but also trickle up to business operations. Ineffective IT shops will not know when to invest in the correct technologies, and they may miss out on working with new technologies that could benefit the business. Without a framework to plan how technology fits into the business, you could end up in the position of having great IT bows and arrows, but when you walk out into the competitive world, you get machine-gunned.

The other side is cost and efficiency: if the entire IT shop isn’t running smoothly throughout, you end up spending too much money on problems, which in turn takes money away from other parts of the business that could keep the organization competitive. Failing to manage IT can lead to competitive loss across numerous areas within a business.

A New Business Model

To help prevent the consequences that may result if IT isn’t run more like a business, industry leaders such as Accenture, Achmea, AT&T, HP IT, ING Bank, Munich RE, PwC, Royal Dutch Shell, and the University of South Florida recently formed a consortium to address how to better run the business of IT. With billions of dollars invested in IT each year, these companies realized their investments must be made wisely and show governable results in order to succeed.

The result of their efforts is The Open Group IT4IT™ Forum, which released a Snapshot of its proposed Reference Architecture for running IT more like a business this past November. The Reference Architecture is meant to serve as an operating model for IT, providing the “missing link” that previous, IT-function-specific models have failed to address. The model allows IT to achieve the same level of business discipline, predictability and efficiency as other business functions.

The Snapshot includes a four-phase Value Chain for IT that provides an operating model for an IT business and outlines how value can be added at every stage of the IT process. In addition to providing suggested best practices for delivery, the Snapshot includes technical models for the IT tools that organizations can use, whether for systems monitoring, release monitoring or IT point solutions. Providing guidance around IT tools will allow these tools to become more interoperable so that they can exchange information at the right place and the right time. In addition, it will allow for better control of information flow between various parts of the business through the IT shop, saving IT departments the time and hassle of aggregating tools or cobbling together their own tools and solutions. Staffing guidance models are also included in the Reference Architecture.

Why IT4IT now? Digitalization cannot be held back, particularly in an era of Cloud, Big Data and an impending Internet of Things. An IT4IT Reference Architecture provides more than just best practices for IT—it puts IT in the context of a business model that allows IT to be a contributing part of an enterprise, providing a roadmap for digital businesses to compete and thrive for years to come.

Join the conversation! @theopengroup #ogchat

David Lounsbury is Chief Technical Officer (CTO) and Vice President, Services for The Open Group. As CTO, he ensures that The Open Group’s people and IT resources are effectively used to implement the organization’s strategy and mission. As VP of Services, David leads the delivery of The Open Group’s proven processes for collaboration and certification, both within the organization and in support of third-party consortia.

David holds a degree in Electrical Engineering from Worcester Polytechnic Institute, and is holder of three U.S. patents.


Filed under Cloud, digital technologies, Enterprise Transformation, Internet of Things, IT, IT4IT, TOGAF, TOGAF®

Using the ArchiMate® Language to Model TOGAF® Architectures

By William Estrem, President, Metaplexity Associates LLC, Serge Thorn, Associate, Metaplexity Associates LLC, and Sonia Gonzalez, Architecture and ArchiMate® Forums Director, The Open Group

If you are using the TOGAF® standard in your organization to guide the process of developing Enterprise Architectures, you could consider using the ArchiMate® language. ArchiMate, an Open Group standard, is a modeling language designed from the ground up to support modeling Enterprise Architectures, and it can be applied very successfully to develop architecture descriptions that are well aligned with your organization’s strategy.

The TOGAF standard is a framework for creating an Enterprise Architecture capability in your organization. The TOGAF Architecture Development Method (ADM) is a central feature of the TOGAF standard. The ADM cycle describes an incremental and iterative method for designing Business, Data, Applications, and Technology architectures. It progresses from high-level concept diagrams, to detailed domain architectures, all the way to the development of solution architectures, architecture roadmaps and implementation plans.

The ArchiMate® language, an Open Group standard, is an Enterprise Architecture modeling language. It views a model as a set of layers (Business, Application, and Technology) together with some specialized extensions (Motivation, and Implementation and Migration).

The Open Group Architecture and ArchiMate Forums have established a joint project known as Project Harmony that is focused on improving how the TOGAF and ArchiMate standards can be used together to create effective architecture descriptions.

Project Harmony has now published its first deliverables, a series of white papers that deliver guidance on the combined use of the TOGAF® Enterprise Architecture (EA) framework and the ArchiMate® EA modeling language. A Practitioner’s Guide summarizes the key findings of three in-depth white papers, which analyze the standards in terms of terminology, viewpoints, and metamodels, and provide recommendations on how they can be best used together.

The full series is entitled TOGAF® Framework and ArchiMate® Modeling Language Harmonization. The four white papers are:

  • A Practitioner’s Guide to Using the TOGAF® Framework and the ArchiMate® Language (W14C)
  • Content Metamodel Harmonization: Entities and Relationships (W14D)
  • Glossaries Comparison (W14A)
  • Viewpoints Mapping (W14B)

All four can be downloaded from here (select the ZIP file link).

The Open Group recently hosted a webinar highlighting how you can use TOGAF and ArchiMate together more effectively. To view the webinar, visit: https://www2.opengroup.org/ogsys/catalog/D120

William Estrem, President of Metaplexity Associates LLC, is currently serving as the chairman of Project Harmony. He has been involved with the development of the TOGAF standard since 1994. He is a former chairman of the Architecture Forum and served a two-year term on The Open Group Board of Governors. Metaplexity Associates, a Gold Level member of The Open Group, is a U.S.-based education and consulting firm that offers services related to Enterprise Architecture, including accredited TOGAF courses.


Serge Thorn was CIO of Architecting the Enterprise and is now an Associate of Metaplexity Associates LLC. He has worked in the IT industry and consultancy (banking and finance, biotechnology, pharma and chemical, telcos) for over 30 years, in a variety of roles including development and systems design, project management, business analysis, IT operations, IT management, IT strategy, research and innovation, IT governance, project and portfolio management, Enterprise Architecture and service management (ITIL), product development, and coaching and mentoring. For 10 years he was Chairman of the itSMF (IT Service Management Forum) Swiss chapter, and he is involved with The Open Group Architecture and ArchiMate Forums.


Sonia Gonzalez is The Open Group Forum Director for the Architecture and ArchiMate® Forums, and in this position she is involved in the development and evolution of current and future EA standards. Sonia has also been a trainer and consultant in the areas of business innovation, business process modeling, and Enterprise Architecture applying TOGAF® and ArchiMate®. She is TOGAF® 9 Certified and ArchiMate® 2 Certified, has been a trainer for an accredited training course provider, and has developed workshops and EA consultancy projects using the TOGAF standard as a reference framework and the ArchiMate standard as a modeling language.



1 Comment

Filed under ArchiMate®, Enterprise Architecture, Professional Development, Standards, TOGAF®, Uncategorized

“Lean” Enterprise Architecture powered by TOGAF® 9.1

By Krish Ayyar, Managing Principal, Martin-McDougall Technologies

Enterprise Architecture exists to solve enterprise-level problems. A typical problem scenario could read: “A large mining and resources company uses many sensors to collect engineering data and feed it back to a central control room for monitoring its assets. These sensors come from multiple vendors and use proprietary networking technologies and data formats, which causes interoperability issues. The company would like to improve the manageability and availability of these systems by exploring solutions around the emerging Internet of Things (IoT) technology.”

There are many ways to solve enterprise-level problems. A typical approach might be to purchase packaged software or develop a bespoke solution, and to sponsor an IT project to implement it.

So, what is special about Enterprise Architecture? EA is the only approach that puts you in the driver’s seat when it comes to the orderly evolution of your enterprise’s business and information systems landscape.

How do we go about doing this?

The best way is to develop the Enterprise Architecture in a short engagement cycle of, say, 4 to 6 weeks using the TOGAF® 9.1 method. If you think about it, the TOGAF® ADM basically covers four “meta” phases: Preparing and Setting the Architecture Vision; Blueprinting the Target State; Solutioning and Road Mapping; and Governance and Change Management. The key to a short engagement cycle is to link with, rather than redo, activities that are already done elsewhere in the organisation, such as Business Strategy, IT Strategy, Detailed Implementation Planning and Governance. This might mean “piggybacking” on PMO processes and extending them to include Enterprise Architecture.

As part of “Preparing and Setting the Architecture Vision”, we identify the Business Goals, Objectives and Drivers related to the problem scenario. In this case, let us say we ran business scenario workshops and documented the CFO’s statement that the overall cost of remotely monitoring and supporting engineering systems must come down. We then elicit the stakeholders’ concerns and requirements related to business and information systems. Here, the CEO feels that the company needs new capabilities for monitoring devices anytime, anywhere.

As part of “Governance and Change Management”, we look at emerging business and technology trends. The Internet of Things, or “IoT”, is trending as the technology that can connect sensors to the internet for effective control. At this juncture, we should do some research and collect information about the product and technology solutions that could deliver the new or enhanced capabilities. Major vendors such as SAP, Cisco and Microsoft have IoT solutions in their offerings. These solutions enable remote support using mobile devices streaming data to the cloud, network infrastructure for transporting the data using open standards, cloud computing, sensor connectivity to Wi-Fi and the internet, and so on.

Next, as part of “Blueprinting the Target State”, we model the current- and target-state Business Capabilities and Information System Services and Functionalities. We can do this very quickly by selecting the relevant TOGAF® 9.1 artifacts to address the concerns and requirements; these are grouped by Architecture Domain within the TOGAF® 9.1 document. We then identify the gaps. In our example, these could be new support capabilities using IoT.
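The gap-identification step can be sketched in a few lines. This is an illustrative assumption, not part of the TOGAF standard: it treats capabilities as named sets, and the capability names for the mining scenario are hypothetical.

```python
# Illustrative gap analysis: compare baseline (current) and target capability sets.
# Capability names are made-up examples for the mining/IoT scenario.
baseline = {"manual sensor polling", "vendor-specific monitoring", "on-site support"}
target = {"vendor-specific monitoring", "IoT device management",
          "remote monitoring via mobile", "on-site support"}

gaps = target - baseline     # capabilities to build or acquire
retired = baseline - target  # capabilities to phase out

print(sorted(gaps))     # ['IoT device management', 'remote monitoring via mobile']
print(sorted(retired))  # ['manual sensor polling']
```

In practice a gap analysis also classifies partial matches and carries rationale for each gap, but the set-difference view captures the core of the exercise.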

Now, as part of “Solutioning and Road Mapping”, we roadmap the gaps in a practical way to deliver business value. The TOGAF® 9.1 “Business Value Assessment” technique helps here: it lets us realise the business goals and objectives, in order of business priority, through the solution components that deliver them. For example, reducing the cost of remotely monitoring and supporting engineering systems could be realised by solutions that enable remote monitoring and support using mobile devices streaming data to the cloud.

Of course, architecture work is not complete until the solution is architected from a design perspective to manage the product and technology complexities during implementation. There is also the need for Architecture Governance to ensure that it does not go pear-shaped during implementation and operation.

This does not seem like a lot of effort, does it? In fact, some sort of conceptualisation happens in all major projects before the business case that leads to funding approval. When it is done by people who lack the right mix of strategy, project management, solutioning and consulting skills, it becomes a mere “tick in the box” exercise. Why not adopt this structured approach of Enterprise Architecture powered by TOGAF® 9.1 and reap the rewards?

Krish Ayyar is an accomplished Enterprise Architecture practitioner with well over 10 years of experience consulting on and teaching Enterprise Architecture internationally. He is a sought-after trainer of TOGAF® 9.1 Level 2 and ArchiMate® 2.1 Level 2 certification courses, with over 5 years of teaching experience in Australia, New Zealand, China, Japan, India, the USA and Canada. His background includes over 15 years of management consulting in strategy and business transformation, Enterprise Architecture consulting, and Enterprise Architect functional roles in Australia, Singapore, Malaysia and the USA. Krish is an active contributor to The Open Group Architecture Forum activities through the membership of his own consulting company, based in Sydney, Australia. He has presented at Open Group conferences in Boston, Washington, D.C. and Sydney, and is currently Vice Chair of the Certification Standing Committee of the Architecture Forum.



Comments Off on “Lean” Enterprise Architecture powered by TOGAF® 9.1

Filed under ArchiMate®, Business Architecture, Certifications, Enterprise Architecture, Professional Development, Standards, TOGAF®, Uncategorized

Professional Training Trends (Part Two): A Q&A with Chris Armstrong, Armstrong Process Group

By The Open Group

This is part two of a two-part series.

Professional development and training is a perpetually hot topic within the technology industry. After all, who doesn’t want to succeed at their job and perform better?

Ongoing education and training is particularly important for technology professionals who are already in the field. With new tech trends, programming languages and methodologies continuously popping up, most professionals can’t afford not to keep their skill sets up to date these days.

The Open Group member Chris Armstrong is well-versed in the obstacles technology professionals face in doing their jobs. As President of Armstrong Process Group, Inc. (APG), Armstrong and his firm provide continuing education and certification programs for technology professionals and Enterprise Architects covering all aspects of the enterprise development lifecycle. We recently spoke with Armstrong about the needs of architecture professionals and the skills and tools he thinks are necessary to do the job effectively today.

What are some of the tools that EAs can be using to do architecture right now?

There’s quite a bit out there. I’m kind of developing a perspective on how to lay them out across the landscape a bit better. I think there are two general classes of EA tools based on requirements, which is not necessarily the same as what is offered by commercial or open source solutions.

When you take a look at the EA Capability model and the value chain, the first two parts of it have to do with understanding and analyzing what’s going on in an enterprise. Those can be effectively implemented by what I call Enterprise Architecture content management tools, or EACM. Most of the modeling tools would fall within that categorization. Tools that we use? There’s Sparx Enterprise Architect. It’s a very effective modeling tool that covers all aspects of the architecture landscape top-to-bottom, left-to-right and it’s very affordable. Consequently, it’s one of the most popular tools in the world—I think there are upwards of 300,000 licenses active right now. There are lots of other modeling tools as well.

A lot of people find the price point for Sparx Enterprise Architect so appealing that, when an investment is only $5K, $10K, or $15K instead of $100K or $250K, it becomes a great way to come to grips with what it means to really build models. It helps you build those fundamental modeling skills, which are best learned through on-the-job experience in your real business domain, without having to mortgage the farm.

Then there’s the other part of it, and this is where I think there needs to be a shift in emphasis to some extent. A lot of times the architect community gets caught up in modeling. We’ve been modeling for decades—modeling is not a new thing. Despite that—and this is just an anecdotal observation—the level of formal, rigorous modeling, at least in our client base in the U.S. market, is still very low. There are lots of Fortune 1000 organizations that have not made investments in some of these solutions yet, or are fractured or not well-unified. As a profession, we have a big history of modeling and I’m a big fan of that, but it sometimes seems a bit self-serving to some extent, in that a lot of times the people we model for are ourselves. It’s all good from an engineering perspective—helps us frame stuff up, produce views of our content that are meaningful to other stakeholders. But there’s a real missed opportunity in making those assets available and useful to the rest of the organization. Because if you build a model, irrespective of how good and relevant and pertinent it is, if nobody knows about it and nobody can use it to make good decisions or can’t use it to accelerate their project, there’s some legitimacy to the question of “So how much value is this really adding?” I see a chasm between the production of Enterprise Architecture content and the ease of accessing and using that content throughout the enterprise. The consumer market for Enterprise Architecture is much larger than the provider community.

But that’s a big part of the problem, which is why I mentioned cross-training earlier–most enterprises don’t have the self-awareness that they’ve made some investment in Enterprise Architecture and then often ironically end up making redundant, duplicative investments in repositories to keep track of inventories of things that EA is already doing or could already be doing. Making EA content as easily accessible to the enterprise as going to Google and searching for it would be a monumental improvement. One of the big barriers to re-use is finding if something useful has already been created, and there’s a lot of opportunity to deliver better capability through tooling to all of the consumers throughout an enterprise.

If we move a bit further along the EA value chain to what we call “Decide and Respond,” that’s a really good place for a different class of tools. Even though there are modeling tool vendors that try to do it, we need a second class of tools for EA Lifecycle Management (EALM), which is really getting into the understanding of “architecture-in-motion”. Once architecture content has been described as the current and future state, the real $64,000 question is how do we get there? How do we build a roadmap? How do we distribute the requirements of that roadmap across multiple projects and tie that to the strategic business decisions and critical assets over time? Then there’s how do I operate all of this stuff once I build it? That’s another part of lifecycle management—not just how do I get to this future state target architecture, but how do I continue to operate and evolve it incrementally and iteratively over time?

There are some tools that are emerging in the lifecycle management space and one of them is another product we partner with—that’s a solution from HP called Enterprise Maps. From our perspective it meets all the key requirements of what we consider enterprise architecture lifecycle management.

What tools do you recommend EAs use to enhance their skillsets?

Getting back to modeling—that’s a really good place to start as it relates to elevating the rigor of architecture. People are used to drawing pictures with something like Visio to graphically represent “here’s how the business is arranged” or “here’s how the applications landscape looks,” but there’s a big difference in transitioning to how to think about building a model. Because drawing a picture and building a model are not the same thing. The irony, though, is that to many consumers it looks the same, because you often look into a model through a picture. But the skill and the experience that the practitioner needs is very different. It’s a completely different way of looking at the world when you start building models as opposed to solely drawing pictures.

I still see, coming into 2015, a huge opportunity to uplift that skill set, because I find a lot of people say they know how to model but they haven’t really had that experience. You can’t simply explain it to somebody; you have to do it. It’s not the be-all and end-all—it’s part of the architect’s toolkit, but being able to think architecturally and from a model-driven approach is a key skill set that people are going to need to keep pace with all the rapid changes going on in the marketplace right now.

I also see that there’s still an opportunity to get people better educated on some formal modeling notations. We’re big fans of the Unified Modeling Language, UML. I still think uptake of some of those specifications is not as prevalent as it could be. I do see that there are architects that have some familiarity with some of these modeling standards. For example, in our TOGAF® training we talk about standards in one particular slide, many architects have only heard of one or two of them. That just points to there being a lack of awareness about the rich family of languages that are out there and how they can be used. If a community of architects can only identify one or two modeling languages on a list of 10 or 15 that is an indirect representation of their background in doing modeling, in my opinion. That’s anecdotal, but there’s a huge opportunity to uplift architect’s modeling skills.

How do you define the difference between models and pictures?

Modeling requires a theory—namely you have to postulate a theory first and then you build a model to test that theory. Picture drawing doesn’t require a theory—it just requires you to dump on a piece of paper a bunch of stuff that’s in your head. Modeling encourages more methodical approaches to framing the problem.

One of the anti-patterns that I’ve seen in many organizations is they often get overly enthusiastic, particularly when they get a modeling tool. They feel like they can suddenly do all these things they’ve never done before, all that modeling stuff, and they end up “over modeling” and not modeling effectively because one of the key things for modeling is modeling just enough because there’s never enough time to build the perfect thing. In my opinion, it’s about building the minimally sufficient model that’s useful. And in order to do that, you need to take a step back. TOGAF does acknowledge this in the ADM—you need to understand who your stakeholders are, what their concerns are and then use those concerns to frame how you look at this content. This is where you start coming up with the theory for “Why are we building a model?” Just because we have tools to build models doesn’t mean we should build models with those tools. We need to understand why we’re building models, because we can build infinite numbers of models forever, where none of them might be useful, and what’s the point of that?

The example I give is, there’s a CFO of an organization that needs to report earnings to Wall Street for quarterly projections and needs details from the last quarter. And the accounting people say, “We’ve got you covered, we know exactly what you need.” Then the next day the CFO comes in and on his/her desk is eight feet of green bar paper. She/he goes out to the accountants and says, “What the heck is this?” And they say “This is a dump of the general ledger for the first quarter. Every financial transaction you need.” And he/she says, “Well it’s been a while since I’ve been a CPA, and I believe it’s all in there, but there’s just no way I’ve got time to weed through all that stuff.”

There are generally accepted accounting principles where if I want to understand the relationship between revenue and expense that’s called a P&L and if I’m interested in understanding the difference between assets and liabilities that’s a balance sheet. We can think of the general ledger as the model of the finances of an organization. We need to be able to use intelligence to give people views of that model that are pertinent and help them understand things. So, the CFO says “Can you take those debits and credits in that double entry accounting system and summarize them into a one-pager called a P&L?”

The P&L would be an example of a view into a model, like a picture or diagram. The diagram comes from a model, the general ledger. So if you want to change the P&L in an accounting system you don’t change the financial statement, you change the general ledger. When you make an adjustment in your general ledger, you re-run your P&L with different content because you changed the model underneath it.

You can kind of think of it as the difference between doing accounting on register paper, like we did up until the early 21st Century, and then saying “Why don’t we keep track of all the debits and credits based on a chart of accounts and then use reporting capabilities to synthesize any way of looking at the finances that we care to?” It allows a different way of thinking about the interconnectedness of things.
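The ledger-versus-P&L distinction above can be sketched in code. The account names and figures here are made-up illustrations; the point is that the P&L is a view computed from the model (the ledger), so changing the report means changing the model:

```python
# The general ledger is the model; the P&L is a view derived from it.
# Entries: (account, type, amount). Figures are hypothetical.
ledger = [
    ("product sales", "revenue", 120_000),
    ("support contracts", "revenue", 30_000),
    ("salaries", "expense", 90_000),
    ("hosting", "expense", 12_000),
]

def profit_and_loss(entries):
    """Summarize the ledger into a one-page P&L view."""
    revenue = sum(amt for _, kind, amt in entries if kind == "revenue")
    expense = sum(amt for _, kind, amt in entries if kind == "expense")
    return {"revenue": revenue, "expense": expense, "net": revenue - expense}

print(profit_and_loss(ledger))  # {'revenue': 150000, 'expense': 102000, 'net': 48000}

# To change the P&L, you don't edit the report -- you change the model:
ledger.append(("travel", "expense", 8_000))
print(profit_and_loss(ledger))  # net drops to 40000
```

The same relationship holds between an architecture model and its diagrams: a diagram is regenerated from the model, not edited independently of it.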

What are some of the most sought after classes at APG?

Of course TOGAF certification is one of the big ones. I’d say in addition to that we do quite a bit in systems engineering, application architecture, and requirements management. Sometimes those are done in the context of solution delivery but sometimes they’re done in the context of Enterprise Architecture. There’s still a lot of opportunity in supporting Enterprise Architecture in some of the fundamentals like requirements management and effective architectural modeling.

What kinds of things should EAs look for in training courses?

I guess the big thing is to look for offerings that get you as close to practical application as possible. A lot of people start with TOGAF and that’s a great way to frame the problem space. I would set the expectation—and we always do when we deliver our TOGAF training—that this will not tell you “how” to do Enterprise Architecture; there’s just not enough time for that in four days. We talk about “what” Enterprise Architecture is and related emerging best practices. That needs to be followed up with “how do I actually do Enterprise Architecture modeling,” “how do I actually collect EA requirements,” “how do I actually do architecture trade-off analysis?” Then “How do I synthesize an architecture roadmap,” “how do I put together a migration plan,” and “how do I manage the lifecycle of applications in my portfolio over the long haul?” The training courses that get you closer to those experiences will be the most valuable ones.

But a lot of this depends on the level of maturity within the organization, because in some places, just getting everybody on the same page about what Enterprise Architecture means is a big victory. I also think Enterprise Architects need to be very thoughtful about cross-training. It’s something I’m trying to invest in myself: becoming more attuned to what’s going on in other parts of the enterprise where Enterprise Architecture has some context but perhaps is not a known player. Getting training experiences in other areas, and engaging those parts of your organization to find out what problems they’re trying to solve and how Enterprise Architecture might help them, is essential.

One of the best ways to demonstrate that is part of the organizational learning related to EA adoption. That may even be the bigger question. As individual architects, there are always opportunities for greater skill development, but really, organizational learning is where the real investment needs to be made so you can answer the question, “Why do I care?” One of the best ways to respond to that is to have an internal success. After a pilot project say, “We did EA on a limited scale for a specific purpose and look what we got out of it and how could you not want to do more of it?”

But ultimately the question should be “How can we make Enterprise Architecture indispensable? How can we create an environment where people can perform their duties more rapidly, more efficiently, more effectively and more sustainably based on Enterprise Architecture?” This is part of the problem, especially in larger organizations. In 2015, it’s not really the first time people have been making investments in Enterprise Architecture—it’s the second or third or fourth time, so it’s a reboot. You want to make sure that EA can become indispensable by supporting those critical activities, and then, when the stakeholders become dependent on it, you can say “If you like that stuff, we need you to show up with some support for EA and get some funding and resources, so we can continue to operate and sustain this capability.”

What we’ve found is that it’s a double-edged sword, ironically. If an organization has success in propping up their Architecture capability and sustaining and demonstrating some value there, it can be a snowball effect where you can become victims of your own success and suddenly people are starting to get wind of “Oh, I don’t have to do that if the EA’s already done it,” or “I can align myself with a part of the business where the EA has already been done.” The architecture community can get very busy—more busy than they’re prepared for—because of the momentum that might exist to really exploit those EA investments. But at the end of the day, it’s all good stuff because the more you can show the enterprise that it’s worth the investment, that it delivers value, the more likely you’ll get increased funding to sustain the capability.

Chris Armstrong is president of Armstrong Process Group, Inc. and an internationally recognized thought leader and expert in iterative software development, enterprise architecture, object-oriented analysis and design, the Unified Modeling Language (UML), use-case-driven requirements and process improvement.

Over the past twenty years, Chris has worked to bring modern software engineering best practices to practical application at many private companies and government organizations worldwide. Chris has spoken at over 30 conferences, including The Open Group Enterprise Architecture Practitioners Conference, Software Development Expo, Rational User Conference, OMG workshops and UML World. He has been published in such outlets as Cutter IT Journal, Enterprise Development and Rational Developer Network.

Join the conversation! @theopengroup #ogchat

Comments Off on Professional Training Trends (Part Two): A Q&A with Chris Armstrong, Armstrong Process Group

Filed under Business Architecture, Enterprise Architecture, Enterprise Transformation, Professional Development, Standards, TOGAF, TOGAF®, Uncategorized