
Professional Training Trends (Part One): A Q&A with Chris Armstrong, Armstrong Process Group

By The Open Group

This is part one in a two-part series.

Professional development and training is a perpetually hot topic within the technology industry. After all, who doesn’t want to succeed at their job and perform better?

Ongoing education and training is particularly important for technology professionals who are already in the field. With new tech trends, programming languages and methodologies continuously popping up, most professionals can’t afford not to keep their skill sets up to date these days.

The Open Group member Chris Armstrong is well-versed in the obstacles technology professionals face in doing their jobs. As president of Armstrong Process Group, Inc. (APG), Armstrong and his firm provide continuing education and certification programs for technology professionals and Enterprise Architects covering all aspects of the enterprise development lifecycle. We recently spoke with Armstrong about the needs of Architecture professionals and the skills and tools he thinks are necessary to do the job effectively today.

What are some of the latest trends you’re seeing in training today?

If I look at the kinds of things we’ve been helping people with, we definitely continue to do professional certifications like TOGAF®. It appears that the U.S. is still lagging behind Europe with penetration of TOGAF certifications. For example, the trend has been that the U.K. is number one in certifications and the U.S. is number two. Based on sheer numbers of workers, there should actually be far more people certified in the U.S., but that could be related to cultural differences in regional markets as related to certification.

Another trend we’re seeing a lot of is “How do I do this in the real world?” TOGAF intentionally does not go to the level of detail that prescribes how you really do things. Many practitioners are looking for more focused, detailed training specific to different Enterprise Architecture (EA) domains. APG does quite a bit of that with our enterprise clients to help institutionalize EA practices. There are also many tool vendors that provide tools to help accomplish EA tasks and we help with training on those.

We also find that there’s a need for balance between how much to train someone in terms of formal training vs. mentoring and coaching them. As a profession, we do a lot of classroom training, but we need to follow up more with how we’re going to apply it in the real world and in our environment with on-the-job training. Grasping the concepts in an instructor-led class isn’t the same as doing it for real, when trying to solve a problem you actually care about.

When people are interested in becoming Enterprise Architects, what kind of training should they pursue?

That’s a pretty compelling question as it has to do with the state of the architecture profession, which is still in its infancy. From a milestone perspective, it’s still hard to call Enterprise Architecture a “true” profession if you can’t get educated on it. With other professions—attorneys or medical doctors—you can go to an accredited university and get a degree or a master’s and participate in continuing education. There are some indicators that things are progressing though. Now there are master’s programs in Enterprise Architecture at institutions like Penn State. We’ve donated some of our architecture curriculum as a gift-in-kind to the program and have a seat on their corporate advisory board. It was pretty awesome to make that kind of contribution to support and influence their program.

We talk about this in our Enterprise Architecture training to help make people aware of that milestone. However, do you think that getting a four-year degree in Computer Science or Math or Engineering and then going on to get a master’s is sufficient to be a successful Enterprise Architect? Absolutely not. So if that’s insufficient, we have to agree on what additional experiences individuals should have in order to become Enterprise Architects.

It seems like we need the kind of post-graduate experience of a medical doctor, where there’s an internship and a residency based on on-the-ground experience in the real world with guidance from seasoned professionals. That’s been the approach in most professional trades—apprentice, journeyman, master—they require on-the-job training, and you become a master artisan after a certain period of time and experience. Now there are board-level certifications and some elements of a true profession, but we’re just not there yet in Enterprise Architecture. Len Fehskens at the Association of Enterprise Architects (AEA) has been working on this a lot recently. I think it’s still unclear what it will take to legitimize this as a profession, and while I’m not sure I know the answer, there may be some indicators to consider.

I think as Enterprise Architecture becomes more commonplace, there will be more of an expectation for it. Part of the uptake issue is that most people running today’s organizations likely have an MBA and when they got it 20, 30 or 40 years ago, EA was not recognized as a key business capability. Now that there are EA master’s programs, future MBA candidates will have been exposed to it in their education, which will remove some of the organizational barriers to adoption.

I think it will still be another 20 or 30 years for mass awareness. As more organizations become successful in showing how they have exploited Enterprise Architecture to deliver real business benefits (increased profitability and reduced risk), the call for qualified people will increase. And because of the consequences of the decisions Enterprise Architects are involved in, business leaders will want assurance that their people are qualified and have the requisite accreditation and experience that they’d expect from an attorney or doctor.

Maybe one other thing to call out—in order for us to overcome some of these barriers, we need to be thinking about what kind of education we need to provide our business leaders about Enterprise Architecture so that they are making the right kinds of investments. It’s not just Architect education that we need, but also business leader education.

What kind of architecture skills are most in demand right now?

Business Architecture has a lot of legs right now because it’s an essential part of alignment with the business. I do see some risk of bifurcation between the “traditional” EA community and the emerging Business Architecture community. The business is the enterprise, so it’s critical that the EA and BA communities are unified. There is more in common amongst us than differences as professionals, and I think there’s strength in numbers. And while Business Architecture seems to have good velocity right now, at the end of the day you still need to be able to support your business with IT Architecture.

There is an emerging trend I do wonder about, related to Technology Architecture, as it’s known in TOGAF. Some people also call it Infrastructure Architecture. With the evolution of cloud as a platform—and this might be just because I’m looking at it from the perspective of a start-up IT company with APG—it’s becoming less and less necessary to care as much about the technology and the infrastructure, because in many cases people are investing in platforms where that’s taken care of by other people. I don’t want to say we don’t care at all about the technology. But many of the challenges organizations face in standardizing on technology, to make sure things are sustainable from a cost and risk perspective, may change as more and more organizations start putting things in the cloud. That could mean a lot of the investments organizations have made in Technology Architecture become less important.

That will have to be compensated for from a different perspective, though, particularly by an emerging domain some people call Integration Architecture. The same applies to Application Architecture: as organizations move away from custom development toward packaged and SaaS solutions, and as these technology and application offerings are delivered via the cloud, they may need to focus their investments more on how those offerings are integrated with one another.

But there’s still obviously a big case for the entirety of the discipline—Enterprise Architecture—and really being able to have that clear line of sight to the business.

What are some of the options currently available for ongoing continuing education for Enterprise Architects?

The Association of Enterprise Architects (AEA) provides a lot of programs to help out with that by supplementing the ecosystem with additional content. It’s a blend between formal classroom training and conference proceedings. We’re doing a monthly webinar series with the AEA entitled “Building an Architecture Platform,” which focuses on how to establish capabilities within the organization to deliver architecture services. The topics are about real-world concerns that have to do with the problems practitioners are trying to address. Complementing professional skills development with these types of offerings is another part of the APG approach.

One of the things APG is doing—and this is a project we’re working on with others at The Open Group—is defining an Enterprise Architecture capability model. One of the things that capability model will be used for is to decide where organizations need to make investments in education. The current capability model and value chain that we have is pretty broad and has a lot of different dimensions to it. When I take a look at it and think “How do people do those things?” I see an opportunity for education and development. As we continue to elaborate the map of things that comprise Enterprise Architecture, I think we’ll see a lot of opportunity for getting into the many different dimensions of how Enterprise Architecture affects an organization.

And one of the things we need to think about is how we can deliver just-in-time training to a diverse, global community very rapidly and effectively. Exploiting online learning management systems and remote coaching are some of the avenues that APG is pursuing.

Are there particular types of continuing education programs that EAs should pursue from a career development standpoint?

One of the things I’ve found interesting is that I’ve seen a number of my associates in the profession going down the MBA path. My sense is that that’s a representation of an interest in better understanding how business executives see the enterprise from their world, and in framing the question “How can I best anticipate and understand where they’re coming from so that I can more effectively position Enterprise Architecture at a different level?” So that’s cross-disciplinary training. Of course that makes a lot of sense, because at the end of the day, that’s what Enterprise Architecture is all about—how to exploit the synergy that exists within an enterprise. A lot of times that’s about going horizontal within the organization into places where people didn’t necessarily think you had any business. So raising that awareness and understanding of the relevance of EA is a big part of it.

Another thing that certainly is driving many organizations is regulatory compliance, particularly general risk management. A lot of organizations are becoming aware that Enterprise Architecture plays an essential role in supporting that, so getting cross-training in those related disciplines would make a lot of sense. At the end of the day, those parts of an organization typically have a lot more authority, and consequently a lot more funding, than Enterprise Architecture does, because the consequences of non-conformance are very punitive—the pulling of licenses to operate, heavy fines, bad publicity. We’re not yet at the point where an organization doing poorly at Enterprise Architecture becomes front-page news in The New York Times. But when someone steals 30 million cardholders’ personal information, that does become headline news and the subject of regulatory punitive damages. That’s not to say Enterprise Architecture is the savior of all things, but it is well accepted within the EA community that Enterprise Architecture is an essential part of building an effective governance and regulatory compliance environment.

Chris Armstrong is president of Armstrong Process Group, Inc. and an internationally recognized thought leader and expert in iterative software development, enterprise architecture, object-oriented analysis and design, the Unified Modeling Language (UML), use case driven requirements and process improvement.

Over the past twenty years, Chris has worked to bring modern software engineering best practices to practical application at many private companies and government organizations worldwide. Chris has spoken at over 30 conferences, including The Open Group Enterprise Architecture Practitioners Conference, Software Development Expo, Rational User Conference, OMG workshops and UML World. He has been published in such outlets as Cutter IT Journal, Enterprise Development and Rational Developer Network.

Join the conversation!  @theopengroup #ogchat



A Historical Look at Enterprise Architecture with John Zachman

By The Open Group

John Zachman’s Zachman Framework is widely recognized as the foundation and historical basis for Enterprise Architecture. On Tuesday, Feb. 3, during The Open Group’s San Diego 2015 event, Zachman will be giving the morning’s keynote address entitled “Zachman on the Zachman Framework and How it Complements TOGAF® and Other Frameworks.”

We recently spoke to Zachman in advance of the event about the origins of his framework, the state of Enterprise Architecture and the skills he believes EAs need today.

As a discipline, Enterprise Architecture is still fairly young. It began getting traction in the mid to late 1980s after John Zachman published an article describing a framework for information systems architectures in the IBM Systems Journal. Zachman said he lived to regret initially calling his framework “A Framework for Information Systems Architecture,” instead of “Enterprise Architecture” because the framework actually has nothing to do with information systems.

Rather, he said, it was “A Framework for Enterprise Architecture.” But at the time of publication, the idea of Enterprise Architecture was such a foreign concept, Zachman said, that people didn’t understand what it was. Even so, the origins of his ontological framework were already almost 20 years old by the time he first published them.

In the late 1960s, Zachman was working as an account executive in the Marketing Division of IBM. His account responsibility was the Atlantic Richfield Company (better known as ARCO), which had been newly formed through the merger of three separate companies: Atlantic Refining of Philadelphia and Richfield of California merged, then bought Sinclair Oil of New York in 1969.

“It was the biggest corporate merger in history at the time where they tried to integrate three separate companies into one company. They were trying to deal with an enterprise integration issue, although they wouldn’t have called it that at the time,” Zachman said.

With three large companies to merge, ARCO needed help in figuring out how to do the integration. When the client asked Zachman how they should handle such a daunting task, he said he’d try to get some help. So he turned to a group within IBM called the Information Systems Control and Planning Group and the group’s Director of Architecture, Dewey Walker, for guidance.

Historically, when computers were first used in commercial applications, there already were significant “Methods and Procedures” systems communities in most large organizations whose job was to formalize many manual systems in order to manage the organization, Zachman said. When computers came on the scene, they were used to improve organizational productivity by replacing the people performing the organizations’ processes. However, because manual systems defined and codified organizational responsibilities, when management made changes within an organization, as they often did, it would render the computer systems obsolete, which required major redevelopment.

Zachman recalled Walker’s observation that “organizational responsibilities” and “processes” were two different things. As such, he believed systems should be designed to automate the process, not to encode the organizational responsibilities, because the process and the organization changed independently from one another. By separating these two independent variables, management could change organizational responsibilities without affecting or changing existing systems or the organization. Many years later, Jim Champy and Mike Hammer popularized this notion in their widely read 1991 book, “Reengineering the Corporation,” Zachman said.

According to Zachman, Walker created a methodology for defining processes as separate entities from the organizational structure. Walker came out to Los Angeles, where Zachman and ARCO were based to help provide guidance on the merger. Zachman recalls Walker telling him that the key to defining the systems for Enterprise purposes was in the data, not necessarily the process itself. In other words, the data across the company needed to be normalized so that they could maintain visibility into the assets and structure of the enterprise.

“The secret to this whole thing lies in the coding and the classification of the data,” Zachman recalled Walker saying. Walker’s methodology, he said, began by classifying data by its existence not by its use.

Since all of this was happening well before anyone came up with the concept of data modeling, there were no data models from which to design their system. “Data-oriented words were not yet in anyone’s vocabulary,” Zachman said. Walker had difficulty articulating his concepts because the words he had at his disposal were inadequate, Zachman said.

Walker understood that to have structural control over the enterprise, they needed to look at both processes and data as independent variables, Zachman said. That would provide the flexibility and knowledge base to accommodate escalating change. This was critical, he said, because the system is the enterprise. Therefore, creating an integrated structure of independent variables and maintaining visibility into that structure are crucial if you want to be able to manage and change it. Otherwise, he says, the enterprise “disintegrates.”

Although Zachman says Walker was “onto this stuff early on,” Walker eventually left IBM, leaving Zachman with the methodology Walker had named “Business Systems Planning.” (Zachman said Walker knew that it wasn’t just about the information systems, but about the business systems.) According to Zachman, he inherited Walker’s methodology because he’d been working closely with Walker. “I was the only person that had any idea what Dewey was doing,” he said.

What he was left with, Zachman says, was what today he would call a “Row 1 methodology”—or the “Executive Perspective” and the “Scope Contexts” in what would eventually become his ontology.

According to Zachman, Walker had figured out how to transcribe enterprise strategy in such a fashion that engineering work could be derived from it. “What we didn’t know how to do,” Zachman said, “was to transform the strategy (Zachman Framework Row 1), which tends to be described at a somewhat abstract level of definition into the operating Enterprise (Row 6), which was comprised of very precise instructions (explicit or implicit) for behavior of people and/or machines.”

Zachman said that they knew “Architecture” had something to do with the strategy-to-instantiation transformation logic, but they didn’t know what architecture for enterprises was in those days. His radical idea was to ask people who did architecture for things like buildings, airplanes, locomotives, computers or battleships what architecture was for those Industrial Age products. Zachman believed that if he could find out what they thought architecture was for those products, he might be able to figure out what architecture was for enterprises, and thereby figure out how to transform the strategy into the operating enterprise to align the enterprise implementation with the strategy.

With this in mind, Zachman began reaching out to people in other disciplines to see how they put together things like buildings or airplanes. He spoke to an architect friend and also to some of the aircraft manufacturers that were based in Southern California at the time. He began gathering different engineering specs and studying them.

One day while he was sitting at his desk, Zachman said, he began sorting the design artifacts he’d collected for buildings and airplanes into piles. Suddenly he noticed there was something similar in how the design patterns were described.

“Guess what?” he said. “The way you describe buildings is identical to the way you describe airplanes, which turns out to be identical to the way you describe locomotives, which is identical to the way you describe computers. Which is identical to the way you describe anything else that humanity has ever described.”

Zachman says he really just “stumbled across” the way to describe the enterprise and attributes his discovery to providence, a miracle. Despite having kick-started the discipline of Enterprise Architecture with this recognition, Zachman claims he’s “actually not very innovative.”

“I just saw the pattern and put enterprise names on it,” he said.

Once he understood that Architectural design descriptions all used the same categories and patterns, he knew that he could also define Architecture for Enterprises. All it would take would be to apply the enterprise vocabulary to the same pattern and structure of the descriptive representations of everything else.

“All I did was, I saw the pattern of the structure of the descriptive representations for airplanes, buildings, locomotives and computers, and I put enterprise names on the same patterns,” he says. “Now you have the Zachman Framework, which basically is Architecture for Enterprises. It is Architecture for every other object known to humankind.”

Thus the Zachman Framework was born.

Ontology vs. Methodology

According to Zachman, what his Framework is ultimately intended for is describing a complex object: an Enterprise. In that sense, the Zachman Framework is the ontology for Enterprise Architecture, he says. What it doesn’t do is tell you how to do Enterprise Architecture.

“Architecture is architecture is architecture. My framework is just the definition and structure of the descriptive representation for enterprises,” he said.

That’s where methodologies such as TOGAF®, an Open Group standard, DoDAF and other methodological frameworks come in. To create and execute an Architecture, practitioners need both the ontology—to help them define, translate and place structure around the enterprise descriptive representations—and a methodology to populate and implement it. Both are needed—it’s an AND situation, not an OR, he said. The methodology simply needs to use (or reuse) the ontological constructs in creating the implementation instantiations in order for the enterprise to be “architected.”

The Need for Architecture

Unfortunately, Zachman says, there are still a lot of companies today that don’t understand the need to architect their enterprise. Enterprise Architecture is simply not on the radar of general management in most places.

“It’s not readily acknowledged on the general management agenda,” Zachman said.

Instead, he says, most companies focus their efforts on building and running systems, not engineering the enterprise as a holistic unit.

“We haven’t awakened to the concept of Enterprise Architecture,” he says. “The fundamental reason why is people think it takes too long and it costs too much. That is a shibboleth – it doesn’t take too long or cost too much if you know what you’re doing and have an ontological construct.”

Zachman believes many companies are particularly guilty of this type of thinking, which he attributes to a tendency to think that there isn’t any work being done unless the code is up and running. Never mind all the work it took to get that code up and running in the first place.

“Getting the code to run, I’m not arguing against that, but it ought to be in the context of the enterprise design. If you’re just providing code, you’re going to get exactly what you have right now—code. What does that have to do with management’s intentions or the Enterprise in its entirety?”

As such, Zachman compares today’s enterprises to log cabins rather than skyscrapers. Many organizations have not gotten beyond that “primitive” stage, he says, because they haven’t been engineered to be integrated or changed.

According to Zachman, the perception that Enterprise Architecture is too costly and time consuming must change. And, people also need to stop thinking that Enterprise Architecture belongs solely under the domain of IT.

“Enterprise Architecture is not about building IT models. It’s about solving general management problems,” he said. “If we change that perception, and we start with the problem and we figure out how to solve that problem, and then, oh by the way we’re doing Architecture, then we’re going to get a lot of Architecture work done.”

Zachman believes one way to do this is to build out the Enterprise Architecture iteratively and incrementally. By tackling one problem at a time, he says, general management may not even need to know whether you’re doing Enterprise Architecture or not, as long as their problem is being solved. The governance system controls the architectural coherence and integration of the increments. He expects that EA will trend in that direction over the next few years.

“We’re learning much better how to derive immediate value without having the whole enterprise engineered. If we can derive immediate value, that dispels the shibboleth—the misperception that architecture takes too long and costs too much. That’s the way to eliminate the obstacles for Enterprise Architecture.”

As for the skills needed to do EA in the future, Zachman believes enterprises will eventually need multiple types of architects with different skill sets to make sure everything is aligned. He speculates that someday there may need to be specialists for every cell in the framework, saying there is room for a great deal of specialization, creativity and people with different skill sets, just as aircraft manufacturers need a variety of engineers, from aeronautic to hydraulic and everywhere in between, to get a plane built. One engineer does not engineer an entire airplane, a hundred-story building, an ocean liner or, for that matter, a personal computer. Similarly, increasingly complex enterprises will likely need multiple types of engineering specialties. No one person knows everything.

“Enterprises are far more complex than 747s. In fact, an enterprise doesn’t have to be very big before it gets really complex,” he said. “As enterprise systems increase in size, there is increased potential for failure if they aren’t architected to respond to that growth. And if they fail, the lives and livelihoods of hundreds of thousands of people can be affected, particularly if it’s a public sector Enterprise.”

Zachman believes it may ultimately take a generation or two for companies to understand the need to better architect the way they run. As things are today, he says, the paradigm of the “system process first” Industrial Age is still too ingrained in how systems are created. He believes it will be a while before that paradigm shifts to a more Information Age-centric way of thinking where the enterprise is the object rather than the system.

“Although this afternoon is not too early to start working on it, it is likely that it will be the next generation that will make Enterprise Architecture an essential way of life like it is for buildings and airplanes and automobiles and every other complex object,” he said.

John A. Zachman, Founder & Chairman, Zachman International, Executive Director of FEAC Institute, and Chairman of the Zachman Institute

Join the conversation – @theopengroup, #ogchat, #ogSAN



Catching Up with The Open Group Internet of Things Work Group

By The Open Group

The Open Group’s Internet of Things (IoT) Work Group is involved in developing open standards that will allow product and equipment management to evolve beyond the traditional limits of product lifecycle management. Meant to incorporate the larger systems management that will be required by the IoT, these standards will help to handle the communications needs of a network that may encompass products, devices, people and multiple organizations. Formerly known as the Quantum Lifecycle Management (QLM) Work Group, its name was recently changed to the Internet of Things Work Group to more accurately reflect its current direction and focus.

We recently caught up with Work Group Chairman Kary Främling to discuss its two new standards, both of which are geared toward the Internet of Things, and what the group has been focused on lately.

Over the past few years, The Open Group’s Internet of Things Work Group (formerly the Quantum Lifecycle Management Work Group) has been working behind the scenes to develop new standards related to the nascent Internet of Things and how to manage the lifecycle of these connected products, or as General Electric has referred to it, the “Industrial Internet.”

What their work ultimately aims to do is help manage all the digital information within a particular system—for example, vehicles, buildings or machines. By creating standard frameworks for handling this information, these systems and their related applications can be better run and supported during the course of their “lifetime,” with the information collected serving a variety of purposes, from maintenance to improved design and manufacturing to recycling and even refurbishing them.

According to Work Group Chairman Kary Främling, CEO of ControlThings and Professor of Practice in Building Information Modeling at Aalto University in Finland, the group has been working with companies such as Caterpillar and Fiat, as well as refrigerator and machine tool manufacturers, to enable machines and equipment to send sensor and status data on how machines are being used and maintained to their manufacturers. Data can also be provided to machine operators so they are also aware of how the machines are functioning in order to make changes if need be.

For example, Främling says that one application of this system management loop is in HVAC systems within buildings. By building Internet capabilities into the system, a ventilation system—or air-handling unit—can be controlled via a smartphone from the moment it’s turned on inside a building. The system can provide data and alerts about how well it’s operating, and about any problems within the system, to whoever needs them, such as facilities management. Främling also says the system can provide information to both the maintenance company and the system manufacturer, so they can collect data from the machines on performance, operations and other indicators. This allows users to determine things as simple as when an air filter may need changing, or whether there are systematic problems with particular machine models.
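A maintenance alert like the filter-change example can be as simple as a threshold check over recent sensor readings. The sketch below is purely illustrative: the threshold value, field names and function names are hypothetical, not drawn from any real HVAC product or from the Work Group’s standards.

```python
# Illustrative only: flag a clogged air filter when the pressure drop
# across the filter stays above a threshold for several readings in a row.
CLOGGED_FILTER_THRESHOLD_PA = 250  # hypothetical threshold, in pascals

def filter_needs_changing(pressure_drops, window=5):
    """Return True if the last `window` readings all exceed the threshold.

    Requiring a run of high readings (rather than a single spike) avoids
    raising an alert on transient sensor noise.
    """
    recent = pressure_drops[-window:]
    return len(recent) == window and all(r > CLOGGED_FILTER_THRESHOLD_PA for r in recent)

readings = [180, 190, 260, 270, 280, 290, 300]
print(filter_needs_changing(readings))  # prints True
```

In a deployed system the same rule would run against a stream of readings reported by the air-handling unit, with the alert routed to whoever needs it.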

According to Främling, the ability to monitor systems in this way has already helped ventilation companies make adjustments to their products.

“What we noticed was there was a certain problem with certain models of fans in these machines. Based on all the sensor readings on the machine, I could deduce that the air extraction fan had broken down,” he said.

The ability to detect such problems via sensor data as they are happening can be extremely beneficial to manufacturers because they can more easily and more quickly make improvements to their systems. Another advantage afforded by machines with Web connectivity, Främling says, is that errors can also be corrected remotely.

“There’s so much software in these machines nowadays, so just by changing parameters you can make them work better in many ways,” he says.

In fact, Främling says the Work Group has been working on systems such as these for a number of years, well before the term “Internet of Things” became part of industry parlance. The group first worked on a system for a connected refrigerator in 2007, and before that on systems for monitoring how vehicles were used.

One of the other things the Work Group is focused on is working with the Open Platform 3.0 Forum since there are many synergies between the two groups. For instance, the Work Group provided a number of the uses cases for the Forum’s recent business scenarios.

“I really see what we are doing is enabling the use cases and these information systems,” Främling says.

Two New Standards

In October, the Work Group also published two new standards, which are among the first standards developed for the Internet of Things (IoT). A number of companies and universities across the world were instrumental in developing them, including Aalto University in Finland, BIBA, Cambridge University, Infineon, InMedias, Politecnico di Milano, Promise Innovation, SAP and Trackway Ltd.

Främling likens these early IoT standards to what the HTML and HTTP protocols did for the Internet. The Open Data Format (O-DF) Standard provides a common language for describing any kind of IoT object, much as HTML provided a language for the Web. The Open Messaging Interface (O-MI) Standard, on the other hand, describes a set of operations that enables users to request information from particular systems, much like HTTP. Write operations then allow users to send information or new values back to a system, for example to update it.
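To make that division of labor concrete, here is a minimal sketch of building an O-MI-style read request around an O-DF-style object description. The element and attribute names below are loose approximations chosen for illustration; consult the published O-MI and O-DF standards for the actual schemas.

```python
# Illustrative only: element/attribute names approximate the O-MI/O-DF
# style but are NOT taken from the published schemas.
import xml.etree.ElementTree as ET

def build_read_request(object_id: str, info_item: str) -> str:
    """Wrap an O-DF-style object description in an O-MI-style read envelope."""
    envelope = ET.Element("omiEnvelope", version="1.0", ttl="10")
    read = ET.SubElement(envelope, "read")
    msg = ET.SubElement(read, "msg")
    objects = ET.SubElement(msg, "Objects")        # O-DF-style payload starts here
    obj = ET.SubElement(objects, "Object")
    ET.SubElement(obj, "id").text = object_id      # which thing we are asking about
    ET.SubElement(obj, "InfoItem", name=info_item) # which value we want back
    return ET.tostring(envelope, encoding="unicode")

request = build_read_request("AirHandlingUnit-42", "ExtractFanSpeed")
print(request)
```

The point of the split is that the O-MI envelope (the verb: read, write, subscribe) is independent of the O-DF payload (the noun: what object and which of its values), just as HTTP is independent of HTML.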

Users can also subscribe to information contained in other systems. For instance, Främling described a scenario in which he was able to create a program that allowed him to ask his car what was wrong with it via a smartphone when the “check engine” light came on. He was then able to use a smartphone application to send an O-MI message to the maintenance company with the error code and his location. Using an O-MI subscription the maintenance company would be able to send a message back asking for additional information. “Send these five sensor values back to us for the next hour and you should send them every 10 seconds, every 5 seconds for the temperature, and so on,” Främling said. Once that data is collected, the service center can analyze what’s wrong with the vehicle.

Främling says O-MI messages can easily be set up on the fly for a variety of connected systems with little programming. The standard also helps users cope with mobility and firewalls. To limit security issues, O-MI communications can be run over channels that are already secure; those channels can be anything from HTTP to SMTP, or even USB sticks, Främling says.

Främling expects that these standards can also be applied to multiple types of functionalities across different industries, for example for connected systems in the healthcare industry or to help manage energy production and consumption across smart grids. With both standards now available, the Work Group is beginning to work on defining extensions for the Data Format so that vocabularies specific to certain industries, such as healthcare or manufacturing, can also be developed.

In addition, Främling expects that as protocols such as O-MI make it easier for machines to communicate among themselves, they will also be able to begin optimizing themselves over time. Cars, in fact, already use this kind of capability, he says, but for other systems, such as buildings, that kind of communication is not happening yet. He says that in Finland his company has projects underway with manufacturers of diesel engines, cranes and elevators, and even in Volkswagen factories, to establish information flows between systems. Smart grids are another potential use. In fact, his own home is wired to provide consumption rates in real time to the electric company, although he says he does not believe they are currently doing anything with the data.

“In the past we used to speak about these applications for pizza or whatever that can tell a microwave oven how long it should be heated and the microwave oven also checks that the food hasn’t expired,” Främling said.

And while your microwave may not yet be able to determine whether your food has reached its expiration date, these recent developments by the Work Group are helping to bring the IoT vision to fruition by making it easier for systems to begin the process of “talking” to each other through a standardized messaging system.

Kary Främling is currently CEO of the Finnish company ControlThings, as well as Professor of Practice in Building Information Modeling (BIM) at Aalto University, Finland. His main research topics are on information management practices and applications for BIM and product lifecycle management in general. His main areas of competence are distributed systems, middleware, multi-agent systems, autonomously learning agents, neural networks and decision support systems. He is one of the worldwide pioneers in the Internet of Things domain, where he has been active since 2000.

@theopengroup; #ogchat

Comments Off on Catching Up with The Open Group Internet of Things Work Group

Filed under digital technologies, Enterprise Transformation, Future Technologies, Internet of Things, Open Platform 3.0, Uncategorized

Putting Information Technology at the Heart of the Business: The Open Group San Diego 2015

By The Open Group

The Open Group is hosting the “Enabling Boundaryless Information Flow™” event February 2 – 5, 2015 in San Diego, CA at the Westin San Diego Gaslamp Quarter. The event is set to focus on the changing role of IT within the enterprise and how new IT trends are empowering improvements in businesses and facilitating Enterprise Transformation. Key themes include Dependability through Assuredness™ (The Cybersecurity Connection) and The Synergy of Enterprise Architecture Frameworks. Particular attention throughout the event will be paid to the need for continued development of an open TOGAF® Architecture Development Method and its importance and value to the wider business architecture community. The goal of Boundaryless Information Flow will be featured prominently in a number of tracks throughout the event.

Key objectives for this year’s event include:

  • Explore how cybersecurity and dependability issues are threatening business enterprises and critical infrastructure from both an integrity and a security perspective
  • Show the need for Boundaryless Information Flow™, which would result in more interoperable, real-time business processes throughout all business ecosystems
  • Outline current challenges in securing the Internet of Things, and describe ongoing work in the Security Forum and elsewhere that will help address these issues
  • Reinforce the importance of architecture methodologies to assure your enterprise is transforming its approach along with the ever-changing threat landscape
  • Discuss the key drivers and enablers of social business technologies in large organizations, which play an important role in the co-creation of business value, and discuss the key building blocks of a social business transformation program

Plenary speakers at the event include:

  • Chris Forde, General Manager, Asia Pacific Region & VP, Enterprise Architecture, The Open Group
  • John A. Zachman, Founder & Chairman, Zachman International, and Executive Director of FEAC Institute

Full details on the range of track speakers at the event can be found here, with the following (among many others) contributing:

  • Dawn C. Meyerriecks, Deputy Director for Science and Technology, CIA
  • Charles Betz, Founder, Digital Management Academy
  • Leonard Fehskens, Chief Editor, Journal of Enterprise Architecture, AEA

Registration for The Open Group San Diego 2015 is open and available to members and non-members. Please register here.

Join the conversation via Twitter – @theopengroup #ogSAN



Filed under Boundaryless Information Flow™, Dependability through Assuredness™, Internet of Things, Professional Development, Security, Standards, TOGAF®, Uncategorized

Open FAIR Certification for People Program

By Jim Hietala, VP Security, and Andrew Josey, Director of Standards, The Open Group

In this final installment of our Open FAIR blog series, we look at the Open FAIR Certification for People program.

In early 2012, The Open Group Security Forum began exploring the idea of creating a certification program for Risk Analysts. Discussions with large enterprises regarding their risk analysis programs led us to the conclusion that there was a need for a professional certification program for Risk Analysts. In addition, Risk Analyst professionals and Open FAIR practitioners expressed interest in a certification program. Security and risk training organizations also expressed interest in providing training courses based upon the Open FAIR standards and Body of Knowledge.

The Open FAIR People Certification Program was designed to meet the requirements of employers and risk professionals. It is a knowledge-based certification, testing candidates’ knowledge of the two standards, O-RA and O-RT. Candidates are free to acquire that knowledge through self-study or to take a course from an accredited training organization. The program currently has a single level (Foundation), with a more advanced level (Certified) planned for 2015.

Several resources are available from The Open Group to assist Risk Analysts preparing to sit for the exam, including the following:

  • Open FAIR Pocket Guide
  • Open FAIR Study Guide
  • Risk Taxonomy (O-RT), Version 2.0 (C13K, October 2013) defines a taxonomy for the factors that drive information security risk – Factor Analysis of Information Risk (FAIR).
  • Risk Analysis (O-RA) (C13G, October 2013) describes process aspects associated with performing effective risk analysis.

All of these can be downloaded from The Open Group publications catalog.

For training organizations, The Open Group accredits organizations wishing to offer training courses on Open FAIR. Testing of candidates is offered through Prometric test centers worldwide.

For more information on Open FAIR certification or accreditation, please contact The Open Group.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT Security, Risk Management and Healthcare programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on Information Security, Risk Management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF® 9.1, ArchiMate® 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX® Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.





Filed under Accreditations, Certifications, Cybersecurity, Enterprise Architecture, Information security, Open FAIR Certification, Professional Development, RISK Management, Security, Uncategorized

Using the Open FAIR Body of Knowledge with Other Open Group Standards

By Jim Hietala, VP Security, and Andrew Josey, Director of Standards, The Open Group

This is the third in our four-part blog series introducing the Open FAIR Body of Knowledge. In this post, we look at how the Open FAIR Body of Knowledge can be used with other Open Group standards.

The Open FAIR Body of Knowledge provides a model with which to decompose, analyze, and measure risk. Risk analysis and management is a horizontal enterprise capability that is common to many aspects of running a business. Risk management in most organizations exists at a high level as Enterprise Risk Management, and it exists in specialized parts of the business such as project risk management and IT security risk management. Because the proper analysis of risk is a fundamental requirement for different areas of Enterprise Architecture (EA), and for IT system operation, the Open FAIR Body of Knowledge can be used to support several other Open Group standards and frameworks.
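As a rough illustration of what "decomposing and measuring risk" looks like in FAIR terms, the sketch below models annualized loss as loss event frequency combined with per-event loss magnitude. The distributions and numbers are invented for illustration and are not prescribed by the O-RT or O-RA standards.

```python
# FAIR-style decomposition sketch: risk modeled as loss event frequency
# (LEF, events/year) combined with loss magnitude (LM, cost per event).
# All distributions and values below are invented for illustration.
import random

def simulate_annual_loss(n_trials: int = 10_000, seed: int = 1) -> dict:
    random.seed(seed)
    losses = []
    for _ in range(n_trials):
        lef = random.randint(0, 4)  # loss events occurring this year
        # random.triangular(low, high, mode): cost of a single loss event
        annual = sum(
            random.triangular(5_000, 250_000, 40_000) for _ in range(lef)
        )
        losses.append(annual)
    losses.sort()
    return {
        "mean": sum(losses) / n_trials,          # expected annual loss
        "p90": losses[int(0.9 * n_trials)],      # 90th-percentile annual loss
    }

summary = simulate_annual_loss()
```

Expressing risk as a distribution of annualized loss, rather than a single high/medium/low label, is what makes the results comparable across the EA, security, and operational contexts discussed below.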

The TOGAF® Framework

In the TOGAF 9.1 standard, risk management is described in Part III: ADM Guidelines and Techniques. Open FAIR can be used to improve the measurement of many types of risk, including IT security risk, project risk, and operational risk, and can thereby improve architecture governance through consistent risk analysis and better risk management. The TOGAF framework describes risk management as a necessary capability in building an EA practice. Using the Open FAIR Body of Knowledge as part of an EA risk management capability helps produce risk analysis results that are accurate, defensible, and more easily communicated to senior management and stakeholders.


The O-ISM3 Standard

The Open Information Security Management Maturity Model (O-ISM3) is a process-oriented approach to building an Information Security Management System (ISMS). Risk management as a business function exists to identify risk to the organization; in the context of O-ISM3, that means information security risk. Open FAIR complements the implementation of an O-ISM3-based ISMS by providing more accurate analysis of risk, which the ISMS can then be designed to address.


The O-ESA Standard

The Open Enterprise Security Architecture (O-ESA) from The Open Group describes a framework and template for policy-driven security architecture. O-ESA (in Sections 2.2 and 3.5.2) describes risk management as a governance principle in developing an enterprise security architecture. Open FAIR supports the objectives described in O-ESA by providing a consistent taxonomy for decomposing and measuring risk. Open FAIR can also be used to evaluate the cost and benefit, in terms of risk reduction, of potential mitigating security controls.


The O-TTPS Standard

The O-TTPS standard, developed by The Open Group Trusted Technology Forum, provides a set of guidelines, recommendations, and requirements that help assure against maliciously tainted and counterfeit products throughout commercial off-the-shelf (COTS) information and communication technology (ICT) product lifecycles. The O-TTPS standard includes requirements to manage risk in the supply chain (SC_RSM). Specific requirements in the Risk Management section of O-TTPS include identifying, assessing, and prioritizing supply chain risk. Use of the Open FAIR taxonomy and risk analysis method can improve these areas of risk management.

The ArchiMate® Modeling Language

The ArchiMate modeling language, as described in the ArchiMate Specification, can be used to model Enterprise Architectures. The ArchiMate Forum is also considering extensions to the ArchiMate language to include modeling security and risk. Basing this risk modeling on the Risk Taxonomy (O-RT) standard will help to ensure that the relationships between the elements that create risk are consistently understood and applied to enterprise security and risk models.


The O-DA Standard

The O-DA standard (Dependability Through Assuredness™), developed by The Open Group Real-time and Embedded Systems Forum, provides the framework needed to create dependable system architectures. The requirements process used in O-DA requires that risk be analyzed before dependability requirements are developed. Open FAIR can help create a solid risk analysis upon which to build those requirements.

In the final installment of this blog series, we will look at the Open FAIR certification for people program.

The Open FAIR Body of Knowledge consists of the following Open Group standards:

  • Risk Taxonomy (O-RT), Version 2.0 (C13K, October 2013) defines a taxonomy for the factors that drive information security risk – Factor Analysis of Information Risk (FAIR).
  • Risk Analysis (O-RA) (C13G, October 2013) describes process aspects associated with performing effective risk analysis.

These can be downloaded from The Open Group publications catalog.

Our other publications include a Pocket Guide and a Certification Study Guide.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT Security, Risk Management and Healthcare programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on Information Security, Risk Management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF® 9.1, ArchiMate® 2.1, IEEE Std 1003.1, 2013 edition (POSIX), and the core specifications of the Single UNIX® Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.



Filed under ArchiMate®, Cybersecurity, Enterprise Architecture, O-TTF, O-TTPS, OTTF, real-time and embedded systems, RISK Management, Security, Standards, TOGAF®, Uncategorized

The Open Group ArchiMate® Model File Exchange Format

By The Open Group

The Open Group ArchiMate Forum has released a snapshot of its ArchiMate® Model Exchange File Format. This aims to address the challenge of portability of models between tools.

Following is a Q&A with Andrew Josey, Phil Beauvoir and Frans Faase, members of the project team, to find out more.

Q.  What is The Open Group ArchiMate Model Exchange File Format?

A.  It is a specification of a standard file format for the exchange of ArchiMate models between different tools.

Q.  Why is it provided as a Snapshot release?

A.  The Snapshot makes public the direction and thinking the project is taking in the development of a standard file format supporting exchange of ArchiMate models between tools. We’re looking for feedback and guidance from the community at this stage.

Q.  When do you need feedback by and how should it be provided?

A.  Comments can be sent by email no later than January 12, 2015.

Q.  What is addressed in the Snapshot release?

A.  The project is being implemented as two phases:

  • Phase 1 includes the core exchange format.
  • Phase 2 adds the visual layout.

This Snapshot describes Phase 1 only, and excludes the detailed visual layout, which will be included in Phase 2.

Q.  Do you intend the format as a persistent file format for an ArchiMate model?

A.  No. The exchange file format is not intended as a persistent file format for the model itself; it is a mechanism for conveying instance data from one tool to another (a simple analogy is the CSV file format for exchanging spreadsheet data). The data in the exchange file is meant to be processed by an “ArchiMate-aware” tool, so standalone semantic inference is out of scope. Once the instance data has been imported into an ArchiMate tool, that tool will typically save it in its own proprietary file format.

Q.  Where can I obtain the Snapshot release?

A.  The Snapshot can be obtained from The Open Group publications catalog.

Q.  What is provided with the Snapshot release?

A.  The deliverables included with this Snapshot are as follows:

  • Open Group Snapshot, ArchiMate® Model Exchange File Format
  • Schema Documentation for the ArchiMate® 2.1 XML/XML Schema Definition (XSD) Binding
  • A ZIP file containing: the XSD Schema file, an example Extended XSD Schema file, and example models in the exchange file format

Q.  What example models are provided with the Snapshot?

A.  The ArchiSurance and ArchiMetal case studies are provided, as is a Testall.xml model that can be used for interoperability testing.

Q.  Are all the elements defined in Exchange File Format mandatory?

A.  There are only two mandatory elements:

  • The main “model” tag itself with associated namespace declarations
  • Elements in the “elements” tag (with type and ID)

Everything else is optional. Of course, a minimal file containing only these two things would be unusual in practice, though it is possible for a model to contain no relationships.
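For illustration, such a minimal file, and a check of its two mandatory parts, might look like the sketch below. The namespace URI and the exact tag and attribute names here are placeholders for illustration, not the Snapshot's actual XSD.

```python
# Hypothetical minimal exchange file: the namespace URI and the tag and
# attribute names are placeholders, not the actual Snapshot schema.
import xml.etree.ElementTree as ET

MINIMAL = """<?xml version="1.0" encoding="UTF-8"?>
<model xmlns="http://example.org/archimate/exchange"
       xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
       identifier="id-model-1">
  <elements>
    <element identifier="id-1001" xsi:type="BusinessActor"/>
  </elements>
</model>"""

NS = {"ex": "http://example.org/archimate/exchange"}
root = ET.fromstring(MINIMAL)
# The two mandatory parts: the "model" root and typed, identified elements.
elements = root.findall("ex:elements/ex:element", NS)
ids = [e.get("identifier") for e in elements]
print(ids)
```

Everything beyond the root and its typed elements (metadata, folder organization, language attributes) could be omitted from such a file and a receiving tool would still have a valid, if bare, model to import.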

The following items are optional:

  • Metadata
  • Organization
  • The xml:lang=”xx” attribute

They are provided because they may be of use to the sender or receiver, but they do not have to be there. For example, the Organization element may be useful if the sending or receiving tool needs to know how elements and relations are organized into folders, but a tool that does not support folders can simply ignore it.

Similarly, not every tool supports multiple languages, so there is no need to use the xml:lang=”xx” attribute in every file. The example XML files provided with the Snapshot are more of a showcase of all the elements.

Q.  I am a tool provider, how can I get involved?

A.  You can get involved by joining The Open Group ArchiMate Forum.

Q.  Are there interoperability tests with other tools suppliers?

A.  Yes, these are ongoing within the project within The Open Group ArchiMate Forum.

Q.  I have suggestions for improvement to the exchange file format, where do I send them?

A.  Please send comments by email no later than January 12, 2015.

Q.  I have suggestions for the Phase 2 visual layout, where do I send them?

A.  Please send comments by email no later than January 12, 2015.

Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF® 9.1, ArchiMate® 2.1, IEEE Std 1003.1, 2013 edition (POSIX), and the core specifications of the Single UNIX® Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.


Phil Beauvoir has been developing, writing, and speaking about software tools and development for over 25 years. He was Senior Researcher and Developer at Bangor University, and, later, the Institute for Educational Cybernetics at Bolton University, both in the UK. During this time he co-developed a peer-to-peer learning management and groupware system, a suite of software tools for authoring and delivery of standards-compliant learning objects and meta-data, and tooling to create IMS Learning Design compliant units of learning. In 2010, working with the Institute for Educational Cybernetics, Phil created the open source ArchiMate Modelling Tool, Archi. Since 2013 he has been curating the development of Archi independently. Phil holds a degree in Medieval English and Anglo-Saxon Literature.

Frans Faase is a senior software engineer who has been working with BiZZdesign since 2002. He holds an M.Sc. in Computer Science from the University of Twente. At BiZZdesign he has been involved in designing the repository used by BiZZdesign Architect, which implements the ArchiMate standard. He designed a locking mechanism that allows multiple users to cooperate smoothly on a single model. He has also worked on many import functions from other tools, which required reverse engineering, scanning, and parsing of the file formats used; many of these formats are based on XML.


Filed under ArchiMate®, Certifications, Standards, Uncategorized