The Open Group San Diego 2015 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

The Open Group San Diego 2015, with over 200 guests in attendance, had a powerful start on Monday, February 2 with a presentation by Dawn Meyerriecks, CIA Deputy Director for Science and Technology and Emeritus of The Open Group.

Dawn’s presentation, entitled “Emerging Tech Trends: The Role of Government”, focused on the US government, but she emphasized that global reach and international relationships are essential. US investment in R&D and basic research is accelerating. The government, she continued, is great at R&D but not necessarily at bringing it to market. Furthermore, the convergence of the physical and the virtual (i.e., the Internet of Things) fuels the state of IT. The US is well ahead of other countries in R&D, but China and Japan are increasing their spend, along with other markets. Discussions about supply chain, dependability, platforms, mobility and architectural frameworks are also required.

Big data analytics are key, as are access to analytics and the ability to exploit them for market growth. The government works with partners on many levels: leading experts from academia, government and industry collaborate to solve hard problems in big data analytics, including impacts on everyday life such as attempting to predict societal unrest. The focus on complex technologies must combine incisive analysis, safe and secure operations and smart collection.

Dawn Meyerriecks

When Allen Brown, President and CEO of The Open Group, introduced Dawn, he mentioned her integral role in launching the Association of Enterprise Architects (AEA), formerly AOGEA, in 2007 which initially had 700 members. AEA now has over 40,000 members and is recognized worldwide as a professional association.

Dawn’s presentation was followed by a Q&A session with Allen who further addressed supply chain, ethical use of data, cloud systems and investment in cybersecurity. Allen emphasized that The Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program is in place but there needs to be a greater demand.

Next on the agenda was The Open Group overview and Forum highlights, presented by Allen Brown. The Open Group has 488 member organizations in 40 countries, including Australia, the Czech Republic, Japan, Nigeria, the Philippines and the United Arab Emirates. In 2014, The Open Group signed 93 new membership agreements in 22 countries.

Allen presented updates on all The Open Group Forums: ArchiMate®, Architecture, DirectNet®, EMMM, Enterprise Management, FACE®, Healthcare, IT4IT™, Open Platform 3.0™, Open Trusted Technology, Security. Highlights included:

  • ArchiMate® Forum – The Open Group now sponsors Archi, a free open source modeling tool; recently published the TOGAF® Framework/ArchiMate® Modeling Language Harmonization guide
  • Architecture Forum – TOGAF® 9 reached 40,000 certifications; 75,000 TOGAF publication downloads in 166 countries; TOGAF book sales reached 11,000
  • Healthcare Forum – first round analysis submission for Federal Health Information Model (FHIM) was very well-accepted; white paper in development
  • The Open Group IT4IT™ Forum – launched in October 2014
  • Open Platform 3.0™ Forum – snapshot 2 in development; produced two Internet of Things (IoT) standards in 2014; relaunching UDEF in 2015
  • Security Forum – increasing activities with organizations in India

The plenary continued with a joint presentation, entitled “Cybersecurity Standards: Finding the Right Balance for Securing Your Enterprise”, by Dr. Ron Ross, Fellow of the National Institute of Standards and Technology (NIST), and Mary Ann Davidson, Chief Security Officer, Oracle.

Dr. Ross stated the world is developing the most complicated IT infrastructure ever. Critical missions and business functions are at risk, affecting the national economic security interests of the US. Companies around the globe are trying to make systems secure, but there are challenges when it comes to the engineering aspect. Assurance, trustworthiness and resiliency define the world we want to build. “Building stronger, more resilient systems requires effective security standards for software assurance, systems security engineering, supply chain risk management.” It is critical to focus on architecture and engineering. The essential partnership is among government, academia and industry.

Mary Ann posed the question “why cybersecurity standards?” with the answer being that standards help ensure technical correctness, interoperability, trustworthiness and best practices. In her view, the standards ecosystem consists of standards makers, reviewers, mandaters, implementers, certifiers, weaponizers (i.e. via regulatory capture). She stated the “Four Ps of Benevolent Standards” are problem statement, precise language and scope, pragmatic solutions and prescriptive minimization. “Buyers and practitioners must work hand in hand; standards and practice should be neck and neck, not miles apart.”

The plenary culminated with the Cybersecurity Panel, moderated by Dave Lounsbury, CTO, The Open Group. Panelists were Edna Conway, Chief Security Officer for Global Supply Chain, Cisco; Mary Ann Mezzapelle, Americas CTO for Enterprise Security Services, HP; Jim Hietala, VP Security, The Open Group; and Rance Delong, Security and High Assurance Systems, Santa Clara University. They all agreed the key to security is to learn the risks and vulnerabilities and what you can control. The technology and controls are out there, but organizations need to implement them effectively. Collaboration is also important: among the public and private sectors, industry and the supply chain, and across people, processes and technology.

Rance Delong, Jim Hietala, Edna Conway, Mary Ann Mezzapelle, Dave Lounsbury

The afternoon featured three tracks: Risk, Dependability and Trusted Technology; IT4IT™; and EA Practice and Professional Development. One of the sessions, “Services Model Backbone – the IT4IT™ Nerve System”, was presented by Lars Rossen, Distinguished Technologist and Chief Architect, IT4IT Initiative, HP.

In the evening, The Open Group hosted a lovely reception on the outside terrace at the Westin Gaslamp.

Most plenary sessions are available via: Livestream.com/opengroup

Please join the conversation – @theopengroup #ogSAN

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog and media relations. Loren has over 20 years’ experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.



Professional Training Trends (Part Two): A Q&A with Chris Armstrong, Armstrong Process Group

By The Open Group

This is part two in a two part series.

Professional development and training is a perpetually hot topic within the technology industry. After all, who doesn’t want to succeed at their job and perform better?

Ongoing education and training is particularly important for technology professionals who are already in the field. With new tech trends, programming languages and methodologies continuously popping up, most professionals can’t afford not to keep their skill sets up to date these days.

The Open Group member Chris Armstrong is well-versed in the obstacles that technology professionals face to do their jobs. President of Armstrong Process Group, Inc. (APG), Armstrong and his firm provide continuing education and certification programs for technology professionals and Enterprise Architects covering all aspects of the enterprise development lifecycle. We recently spoke with Armstrong about the needs of Architecture professionals and the skills and tools he thinks are necessary to do the job effectively today.

What are some of the tools that EAs can be using to do architecture right now?

There’s quite a bit out there. I’m kind of developing a perspective on how to lay them out across the landscape a bit better. I think there are two general classes of EA tools based on requirements, which is not necessarily the same as what is offered by commercial or open source solutions.

When you take a look at the EA Capability model and the value chain, the first two parts of it have to do with understanding and analyzing what’s going on in an enterprise. Those can be effectively implemented by what I call Enterprise Architecture content management tools, or EACM. Most of the modeling tools would fall within that categorization. Tools that we use? There’s Sparx Enterprise Architect. It’s a very effective modeling tool that covers all aspects of the architecture landscape top-to-bottom, left-to-right and it’s very affordable. Consequently, it’s one of the most popular tools in the world—I think there are upwards of 300,000 licenses active right now. There are lots of other modeling tools as well.

A lot of people find the price point for Sparx Enterprise Architect so appealing that, when the investment is only $5K, $10K or $15K instead of $100K or $250K, it’s a great way to come to grips with what it means to really build models. It really helps you build those fundamental modeling skills, which are best learned via on-the-job experience in your real business domain, without having to mortgage the farm.

Then there’s the other part of it, and this is where I think there needs to be a shift in emphasis to some extent. A lot of times the architect community gets caught up in modeling. We’ve been modeling for decades—modeling is not a new thing. Despite that—and this is just an anecdotal observation—the level of formal, rigorous modeling, at least in our client base in the U.S. market, is still very low. There are lots of Fortune 1000 organizations that have not made investments in some of these solutions yet, or are fractured or not well-unified. As a profession, we have a big history of modeling and I’m a big fan of that, but it sometimes seems a bit self-serving to some extent, in that a lot of times the people we model for are ourselves. It’s all good from an engineering perspective—helps us frame stuff up, produce views of our content that are meaningful to other stakeholders. But there’s a real missed opportunity in making those assets available and useful to the rest of the organization. Because if you build a model, irrespective of how good and relevant and pertinent it is, if nobody knows about it and nobody can use it to make good decisions or can’t use it to accelerate their project, there’s some legitimacy to the question of “So how much value is this really adding?” I see a chasm between the production of Enterprise Architecture content and the ease of accessing and using that content throughout the enterprise. The consumer market for Enterprise Architecture is much larger than the provider community.

But that’s a big part of the problem, which is why I mentioned cross-training earlier–most enterprises don’t have the self-awareness that they’ve made some investment in Enterprise Architecture and then often ironically end up making redundant, duplicative investments in repositories to keep track of inventories of things that EA is already doing or could already be doing. Making EA content as easily accessible to the enterprise as going to Google and searching for it would be a monumental improvement. One of the big barriers to re-use is finding if something useful has already been created, and there’s a lot of opportunity to deliver better capability through tooling to all of the consumers throughout an enterprise.

If we move a bit further along the EA value chain to what we call “Decide and Respond,” that’s a really good place for a different class of tools. Even though there are modeling tool vendors that try to do it, we need a second class of tools for EA Lifecycle Management (EALM), which is really getting into the understanding of “architecture-in-motion”. Once architecture content has been described as the current and future state, the real $64,000 question is how do we get there? How do we build a roadmap? How do we distribute the requirements of that roadmap across multiple projects and tie that to the strategic business decisions and critical assets over time? Then there’s how do I operate all of this stuff once I build it? That’s another part of lifecycle management—not just how do I get to this future state target architecture, but how do I continue to operate and evolve it incrementally and iteratively over time?

There are some tools that are emerging in the lifecycle management space and one of them is another product we partner with—that’s a solution from HP called Enterprise Maps. From our perspective it meets all the key requirements of what we consider enterprise architecture lifecycle management.

What tools do you recommend EAs use to enhance their skillsets?

Getting back to modeling—that’s a really good place to start as it relates to elevating the rigor of architecture. People are used to drawing pictures with something like Visio to graphically represent ”here’s how the business is arranged” or “here’s how the applications landscape looks,” but there’s a big difference in transitioning how to think about building a model. Because drawing a picture and building a model are not the same thing. The irony, though, is that to many consumers it looks the same, because you often look into a model through a picture. But the skill and the experience that the practitioner needs is very different. It’s a completely different way of looking at the world when you start building models as opposed to solely drawing pictures.

I see still, coming into 2015, a huge opportunity to uplift that skill set because I find a lot of people say they know how to model but they haven’t really had that experience. You just can’t simply explain it to somebody, you have to do it. It’s not the be-all and end-all—it’s part of the architect’s toolkit, but being able to think architecturally and from a model-driven approach is a key skill set that people are going to need to keep pace with all the rapid changes going on in the marketplace right now.

I also see that there’s still an opportunity to get people better educated on some formal modeling notations. We’re big fans of the Unified Modeling Language, UML. I still think uptake of some of those specifications is not as prevalent as it could be. I do see that there are architects with some familiarity with some of these modeling standards. For example, in our TOGAF® training we cover standards on one particular slide, and many architects have only heard of one or two of them. That points to a lack of awareness of the rich family of languages that are out there and how they can be used. If a community of architects can only identify one or two modeling languages on a list of 10 or 15, that is an indirect indication of their background in modeling, in my opinion. That’s anecdotal, but there’s a huge opportunity to uplift architects’ modeling skills.

How do you define the difference between models and pictures?

Modeling requires a theory—namely you have to postulate a theory first and then you build a model to test that theory. Picture drawing doesn’t require a theory—it just requires you to dump on a piece of paper a bunch of stuff that’s in your head. Modeling encourages more methodical approaches to framing the problem.

One of the anti-patterns that I’ve seen in many organizations is they often get overly enthusiastic, particularly when they get a modeling tool. They feel like they can suddenly do all these things they’ve never done before, all that modeling stuff, and they end up “over modeling” and not modeling effectively because one of the key things for modeling is modeling just enough because there’s never enough time to build the perfect thing. In my opinion, it’s about building the minimally sufficient model that’s useful. And in order to do that, you need to take a step back. TOGAF does acknowledge this in the ADM—you need to understand who your stakeholders are, what their concerns are and then use those concerns to frame how you look at this content. This is where you start coming up with the theory for “Why are we building a model?” Just because we have tools to build models doesn’t mean we should build models with those tools. We need to understand why we’re building models, because we can build infinite numbers of models forever, where none of them might be useful, and what’s the point of that?

The example I give is: there’s a CFO of an organization who needs to report quarterly earnings projections to Wall Street and needs details from the last quarter. The accounting people say, “We’ve got you covered, we know exactly what you need.” The next day the CFO comes in and on her desk is eight feet of green bar paper. She goes out to the accountants and says, “What the heck is this?” And they say, “This is a dump of the general ledger for the first quarter. Every financial transaction you need.” And she says, “Well, it’s been a while since I’ve been a CPA, and I believe it’s all in there, but there’s just no way I’ve got time to weed through all that stuff.”

There are generally accepted accounting principles where if I want to understand the relationship between revenue and expense that’s called a P&L and if I’m interested in understanding the difference between assets and liabilities that’s a balance sheet. We can think of the general ledger as the model of the finances of an organization. We need to be able to use intelligence to give people views of that model that are pertinent and help them understand things. So, the CFO says “Can you take those debits and credits in that double entry accounting system and summarize them into a one-pager called a P&L?”

The P&L would be an example of a view into a model, like a picture or diagram. The diagram comes from a model, the general ledger. So if you want to change the P&L in an accounting system you don’t change the financial statement, you change the general ledger. When you make an adjustment in your general ledger, you re-run your P&L with different content because you changed the model underneath it.

You can kind of think of it as the difference between doing accounting on register paper like we did up until the early 21st century and then saying “Why don’t we keep track of all the debits and credits based on a chart of accounts and then use reporting capabilities to synthesize any way of looking at the finances that we care to?” It allows a different way of thinking about the interconnectedness of things.
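The ledger-as-model, P&L-as-view distinction can be sketched in a few lines of Python (a purely illustrative example; the accounts and figures are hypothetical): the P&L is never stored separately, it is recomputed from the ledger, so changing the model changes every view derived from it.

```python
# Illustrative sketch: the general ledger is the model; the P&L is a derived view.
# Account names and amounts are made up for the example.

ledger = [
    # (account, kind, amount)
    ("Product sales", "revenue", 120_000),
    ("Consulting",    "revenue",  30_000),
    ("Salaries",      "expense",  90_000),
    ("Office rent",   "expense",  12_000),
]

def profit_and_loss(entries):
    """Summarize ledger entries into a one-page P&L view."""
    revenue = sum(amt for _, kind, amt in entries if kind == "revenue")
    expense = sum(amt for _, kind, amt in entries if kind == "expense")
    return {"revenue": revenue, "expense": expense, "net income": revenue - expense}

print(profit_and_loss(ledger))

# To change the P&L, you change the model, not the report:
# add a ledger entry and re-run the same view.
ledger.append(("Travel", "expense", 5_000))
print(profit_and_loss(ledger))
```

The same principle is what separates a model from a picture: the diagram (here, the P&L) is regenerated from the underlying model rather than edited directly.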

What are some of the most sought after classes at APG?

Of course TOGAF certification is one of the big ones. I’d say in addition to that we do quite a bit in systems engineering, application architecture, and requirements management. Sometimes those are done in the context of solution delivery but sometimes they’re done in the context of Enterprise Architecture. There’s still a lot of opportunity in supporting Enterprise Architecture in some of the fundamentals like requirements management and effective architectural modeling.

What kinds of things should EAs look for in training courses?

I guess the big thing to look for is offerings that get you as close to practical application as possible. A lot of people start with TOGAF, and that’s a great way to frame the problem space. I would set the expectation—and we always do when we deliver our TOGAF training—that this will not tell you “how” to do Enterprise Architecture; there’s just not enough time for that in four days. We talk about “what” Enterprise Architecture is and related emerging best practices. That needs to be followed up with “how do I actually do Enterprise Architecture modeling,” “how do I actually collect EA requirements,” “how do I actually do architecture trade-off analysis?” Then “how do I synthesize an architecture roadmap,” “how do I put together a migration plan,” and “how do I manage the lifecycle of applications in my portfolio over the long haul?” Training that gets you closer to those experiences will be the most valuable.

But a lot of this depends on the level of maturity within the organization, because in some places just getting everybody on the same page about what Enterprise Architecture means is a big victory. But I also think Enterprise Architects need to be very thoughtful about cross-training. It’s something I’m trying to invest in myself: becoming more attuned to what’s going on in other parts of the enterprise where Enterprise Architecture has some context but perhaps is not a known player. Getting training experiences in other places and engaging those parts of your organization to find out what problems they’re trying to solve, and how Enterprise Architecture might help them, is essential.

One of the best ways to demonstrate that is part of the organizational learning related to EA adoption. That may even be the bigger question. As individual architects, there are always opportunities for greater skill development, but really, organizational learning is where the real investment needs to be made so you can answer the question, “Why do I care?” One of the best ways to respond to that is to have an internal success. After a pilot project say, “We did EA on a limited scale for a specific purpose and look what we got out of it and how could you not want to do more of it?”

But ultimately the question usually should be “How can we make Enterprise Architecture indispensable? How can we create an environment where people can perform their duties more rapidly, more efficiently, more effectively and more sustainably based on Enterprise Architecture?” This is part of the problem, especially in larger organizations. In 2015, it’s not really the first time people have been making investments in Enterprise Architecture—it’s the second or third or fourth time, so it’s a reboot. You want EA to become indispensable by supporting those critical activities, and then, when stakeholders become dependent on it, you can say, “If you like that stuff, we need you to show up with some support for EA, with funding and resources, so we can continue to operate and sustain this capability.”

What we’ve found is that it’s a double-edged sword, ironically. If an organization has success in propping up their Architecture capability and sustaining and demonstrating some value there, it can be a snowball effect where you can become victims of your own success and suddenly people are starting to get wind of “Oh, I don’t have to do that if the EA’s already done it,” or “I can align myself with a part of the business where the EA has already been done.” The architecture community can get very busy—more busy than they’re prepared for—because of the momentum that might exist to really exploit those EA investments. But at the end of the day, it’s all good stuff because the more you can show the enterprise that it’s worth the investment, that it delivers value, the more likely you’ll get increased funding to sustain the capability.

Chris Armstrong is president of Armstrong Process Group, Inc. and an internationally recognized thought leader and expert in iterative software development, enterprise architecture, object-oriented analysis and design, the Unified Modeling Language (UML), use case driven requirements and process improvement.

Over the past twenty years, Chris has worked to bring modern software engineering best practices to practical application at many private companies and government organizations worldwide. Chris has spoken at over 30 conferences, including The Open Group Enterprise Architecture Practitioners Conference, Software Development Expo, Rational User Conference, OMG workshops and UML World. He has been published in such outlets as Cutter IT Journal, Enterprise Development and Rational Developer Network.

Join the conversation! @theopengroup #ogchat


Professional Training Trends (Part One): A Q&A with Chris Armstrong, Armstrong Process Group

By The Open Group

This is part one in a two part series.

Professional development and training is a perpetually hot topic within the technology industry. After all, who doesn’t want to succeed at their job and perform better?

Ongoing education and training is particularly important for technology professionals who are already in the field. With new tech trends, programming languages and methodologies continuously popping up, most professionals can’t afford not to keep their skill sets up to date these days.

The Open Group member Chris Armstrong is well-versed in the obstacles that technology professionals face to do their jobs. President of Armstrong Process Group, Inc. (APG), Armstrong and his firm provide continuing education and certification programs for technology professionals and Enterprise Architects covering all aspects of the enterprise development lifecycle. We recently spoke with Armstrong about the needs of Architecture professionals and the skills and tools he thinks are necessary to do the job effectively today.

What are some of the latest trends you’re seeing in training today?

If I look at the kinds of things we’ve been helping people with, we definitely continue to do professional certifications like TOGAF®. It appears that the U.S. is still lagging behind Europe with penetration of TOGAF certifications. For example, the trend has been that the U.K. is number one in certifications and the U.S. is number two. Based on sheer numbers of workers, there should actually be far more people certified in the U.S., but that could be related to cultural differences in regional markets as related to certification.

Another trend we’re seeing a lot of is “How do I do this in the real world?” TOGAF intentionally does not go to the level of detail that prescribes how you really do things. Many practitioners are looking for more focused, detailed training specific to different Enterprise Architecture (EA) domains. APG does quite a bit of that with our enterprise clients to help institutionalize EA practices. There are also many tool vendors that provide tools to help accomplish EA tasks and we help with training on those.

We also find that there’s a need for balance between how much to train someone in terms of formal training vs. mentoring and coaching them. As a profession, we do a lot of classroom training, but we need to follow up more with how we’re going to apply it in the real world and in our environment with on-the-job training. Grasping the concepts in an instructor-led class isn’t the same as doing it for real, when trying to solve a problem you actually care about.

When people are interested in becoming Enterprise Architects, what kind of training should they pursue?

That’s a pretty compelling question as it has to do with the state of the architecture profession, which is still in its infancy. From a milestone perspective, it’s still hard to call Enterprise Architecture a “true” profession if you can’t get educated on it. With other professions—attorneys or medical doctors—you can go to an accredited university, get a degree or a master’s and participate in continuing education. There are some indicators that things are progressing, though. Now there are master’s programs in Enterprise Architecture at institutions like Penn State. We’ve donated some of our architecture curriculum as a gift-in-kind to the program and have a seat on their corporate advisory board. It was pretty awesome to make that kind of contribution to support and influence their program.

We talk about this in our Enterprise Architecture training to help to make people aware of that milestone. However, do you think that getting a four-year degree in Computer Science or Math or Engineering and then going on to get a master’s is sufficient to be a successful Enterprise Architect? Absolutely not. So if that’s insufficient, we have to agree what additional experiences individuals should have in order to become Enterprise Architects.

It seems like we need the kind of post-graduate experience of a medical doctor, where there’s an internship and a residency based on on-the-ground experience in the real world with guidance from seasoned professionals. That’s been the approach in most professional trades—apprentice, journeyman, master—and they require on-the-job training. You become a master artisan after a certain period of time and experience. Now there are board-level certifications and some elements of a true profession, but we’re just not there yet in Enterprise Architecture. Len Fehskens at the Association of Enterprise Architects (AEA) has been working on this a lot recently. I think it’s still unclear what it will take to legitimize this as a profession, and while I’m not sure I know the answer, there may be some indicators to consider.

I think as Enterprise Architecture becomes more commonplace, there will be more of an expectation for it. Part of the uptake issue is that most people running today’s organizations likely have an MBA and when they got it 20, 30 or 40 years ago, EA was not recognized as a key business capability. Now that there are EA master’s programs, future MBA candidates will have been exposed to it in their education, which will remove some of the organizational barriers to adoption.

I think it will still be another 20 or 30 years for mass awareness. As more organizations become successful in showing how they have exploited Enterprise Architecture to deliver real business benefits (increased profitability and reduced risk), the call for qualified people will increase. And because of the consequences of the decisions Enterprise Architects are involved in, business leaders will want assurance that their people are qualified and have the requisite accreditation and experience that they’d expect from an attorney or doctor.

Maybe one other thing to call out—in order for us to overcome some of these barriers, we need to be thinking about what kind of education do we need to be providing our business leaders about Enterprise Architecture so that they are making the right kinds of investments. It’s not just Architect education that we need, but also business leader education.

What kind of architecture skills are most in demand right now?

Business Architecture has a lot of legs right now because it’s an essential part of alignment with the business. I do see some risk of bifurcation between the “traditional” EA community and the emerging Business Architecture community. The business is the enterprise, so it’s critical that the EA and BA communities are unified. There is more in common among us than differences as professionals, and I think there’s strength in numbers. And while Business Architecture seems to have good velocity right now, at the end of the day you still need to be able to support your business with IT Architecture.

There is an emerging trend I do wonder about, related to what TOGAF calls Technology Architecture; some people also call it Infrastructure Architecture. With the evolution of cloud as a platform, it’s becoming less and less necessary to care as much about the technology and the infrastructure (and this might just be because I’m looking at it from the perspective of a start-up IT company with APG), because in many cases people are investing in platforms where that’s taken care of by other people. I don’t want to say we don’t care at all about the technology, but many of the challenges organizations face in standardizing on technology, to keep things sustainable from a cost and risk perspective, may change as more organizations put things in the cloud. It could possibly mean that a lot of the investments organizations have made in technical architecture become less important.

Although, that will have to be compensated for from a different perspective, particularly an emerging domain that some people call Integration Architecture. And that also applies to Application Architecture as well—as many organizations move away from custom development to packaged solutions and SaaS solutions, when they think about where they want to make investments, it may be that when all these technologies and application offerings are being delivered to us via the cloud, we may need to focus more on how they’re integrated with one another.

But there’s still obviously a big case for the entirety of the discipline—Enterprise Architecture—and really being able to have that clear line of sight to the business.

What are some of the options currently available for ongoing continuing education for Enterprise Architects?

The Association of Enterprise Architects (AEA) provides a lot of programs to help out with that by supplementing the ecosystem with additional content. It’s a blend between formal classroom training and conference proceedings. We’re doing a monthly webinar series with the AEA entitled “Building an Architecture Platform,” which focuses on how to establish capabilities within the organization to deliver architecture services. The topics are about real-world concerns that have to do with the problems practitioners are trying to address. Complementing professional skills development with these types of offerings is another part of the APG approach.

One of the things APG is doing, and this is a project we’re working on with others at The Open Group, is defining an Enterprise Architecture capability model. One of the things that capability model will be used for is to decide where organizations need to make investments in education. The current capability model and value chain that we have is pretty broad and has a lot of different dimensions to it. When I take a look at it and think “How do people do those things?” I see an opportunity for education and development. Once we continue to elaborate the map of things that comprise Enterprise Architecture, I think we’ll see a lot of opportunity for getting into a lot of different dimensions of how Enterprise Architecture affects an organization.

And one of the things we need to think about is how we can deliver just-in-time training to a diverse, global community very rapidly and effectively. Exploiting online learning management systems and remote coaching are some of the avenues that APG is pursuing.

Are there particular types of continuing education programs that EAs should pursue from a career development standpoint?

One of the things I’ve found interesting is that I’ve seen a number of my associates in the profession going down the MBA path. My sense is that this represents an interest in better understanding how business executives see the enterprise from their world, and perhaps in framing the question “How can I best anticipate and understand where they’re coming from so that I can more effectively position Enterprise Architecture at a different level?” So that’s cross-disciplinary training. Of course that makes a lot of sense, because at the end of the day, that’s what Enterprise Architecture is all about: how to exploit the synergy that exists within an enterprise. A lot of times that’s about going horizontal within the organization, into places where people didn’t necessarily think you had any business. So raising that awareness and understanding of the relevance of EA is a big part of it.

Another thing that certainly is driving many organizations is regulatory compliance, particularly general risk management. A lot of organizations are becoming aware that Enterprise Architecture plays an essential role in supporting that, so getting cross-training in those related disciplines would make a lot of sense. At the end of the day, those parts of an organization typically have a lot more authority, and consequently a lot more funding, than Enterprise Architecture does, because the consequences of non-conformance are very punitive: the pulling of licenses to operate, heavy fines, bad publicity. We’re not yet at the point where an organization’s failure to do “good” Enterprise Architecture becomes front-page news in The New York Times. But when someone steals 30 million cardholders’ personal information, that does become headline news and the subject of regulatory punitive damages. That’s not to say that Enterprise Architecture is the savior of all things, but it is well accepted within the EA community that Enterprise Architecture is an essential part of building effective governance and a regulatory compliance environment.

Chris Armstrong is president of Armstrong Process Group, Inc. and an internationally recognized thought leader and expert in iterative software development, enterprise architecture, object-oriented analysis and design, the Unified Modeling Language (UML), use case driven requirements and process improvement.

Over the past twenty years, Chris has worked to bring modern software engineering best practices to practical application at many private companies and government organizations worldwide. Chris has spoken at over 30 conferences, including The Open Group Enterprise Architecture Practitioners Conference, Software Development Expo, Rational User Conference, OMG workshops and UML World. He has been published in such outlets as Cutter IT Journal, Enterprise Development and Rational Developer Network.

Join the conversation!  @theopengroup #ogchat

Filed under Uncategorized, Enterprise Architecture, TOGAF®, Standards, Business Architecture, Professional Development

A Historical Look at Enterprise Architecture with John Zachman

By The Open Group

John Zachman’s Zachman Framework is widely recognized as the foundation and historical basis for Enterprise Architecture. On Tuesday, Feb. 3, during The Open Group’s San Diego 2015 event, Zachman will be giving the morning’s keynote address entitled “Zachman on the Zachman Framework and How it Complements TOGAF® and Other Frameworks.”

We recently spoke to Zachman in advance of the event about the origins of his framework, the state of Enterprise Architecture and the skills he believes EAs need today.

As a discipline, Enterprise Architecture is still fairly young. It began getting traction in the mid to late 1980s after John Zachman published an article describing a framework for information systems architectures in the IBM Systems Journal. Zachman said he lived to regret initially calling his framework “A Framework for Information Systems Architecture,” instead of “Enterprise Architecture” because the framework actually has nothing to do with information systems.

Rather, he said, it was “A Framework for Enterprise Architecture.” But at the time of publication, the idea of Enterprise Architecture was such a foreign concept, Zachman said, that people didn’t understand what it was. Even so, the origins of his ontological framework were already almost 20 years old by the time he first published them.

In the late 1960s, Zachman was working as an account executive in the Marketing Division of IBM. His account responsibility was the Atlantic Richfield Company (better known as ARCO). ARCO had been newly formed from three separate companies: Atlantic Refining of Philadelphia and Richfield of California, which merged and then bought Sinclair Oil of New York in 1969.

“It was the biggest corporate merger in history at the time where they tried to integrate three separate companies into one company. They were trying to deal with an enterprise integration issue, although they wouldn’t have called it that at the time,” Zachman said.

With three large companies to merge, ARCO needed help in figuring out how to do the integration. When the client asked Zachman how they should handle such a daunting task, he said he’d try to get some help. So he turned to a group within IBM called the Information Systems Control and Planning Group and the group’s Director of Architecture, Dewey Walker, for guidance.

Historically, when computers were first used in commercial applications, there already were significant “Methods and Procedures” systems communities in most large organizations whose job was to formalize many manual systems in order to manage the organization, Zachman said. When computers came on the scene, they were used to improve organizational productivity by replacing the people performing the organizations’ processes. However, because manual systems defined and codified organizational responsibilities, when management made changes within an organization, as they often did, it would render the computer systems obsolete, which required major redevelopment.

Zachman recalled Walker’s observation that “organizational responsibilities” and “processes” were two different things. As such, he believed systems should be designed to automate the process, not to encode the organizational responsibilities, because the process and the organization change independently of one another. By separating these two independent variables, management could change organizational responsibilities without affecting or changing existing systems. Many years later, Michael Hammer and James Champy popularized this notion in their widely read 1993 book, “Reengineering the Corporation,” Zachman said.

According to Zachman, Walker created a methodology for defining processes as separate entities from the organizational structure. Walker came out to Los Angeles, where Zachman and ARCO were based, to help provide guidance on the merger. Zachman recalls Walker telling him that the key to defining systems for Enterprise purposes lay in the data, not necessarily in the process itself. In other words, the data across the company needed to be normalized so that the company could maintain visibility into the assets and structure of the enterprise.

“The secret to this whole thing lies in the coding and the classification of the data,” Zachman recalled Walker saying. Walker’s methodology, he said, began by classifying data by its existence not by its use.

Since all of this was happening well before anyone came up with the concept of data modeling, there were no data models from which to design their system. “Data-oriented words were not yet in anyone’s vocabulary,” Zachman said. Walker had difficulty articulating his concepts because the words he had at his disposal were inadequate, Zachman said.

Walker understood that to have structural control over the enterprise, they needed to look at both processes and data as independent variables, Zachman said. That would provide the flexibility and knowledge base to accommodate escalating change. This was critical, he said, because the system is the enterprise. Therefore, creating an integrated structure of independent variables and maintaining visibility into that structure are crucial if you want to be able to manage and change it. Otherwise, he says, the enterprise “disintegrates.”

Although Zachman says Walker was “onto this stuff early on,” Walker eventually left IBM, leaving Zachman with the methodology Walker had named “Business Systems Planning.” (Zachman said Walker knew that it wasn’t just about the information systems, but about the business systems.) According to Zachman, he inherited Walker’s methodology because he’d been working closely with Walker. “I was the only person that had any idea what Dewey was doing,” he said.

What he was left with, Zachman says, was what today he would call a “Row 1 methodology”—or the “Executive Perspective” and the “Scope Contexts” in what would eventually become his ontology.

According to Zachman, Walker had figured out how to transcribe enterprise strategy in such a fashion that engineering work could be derived from it. “What we didn’t know how to do,” Zachman said, “was to transform the strategy (Zachman Framework Row 1), which tends to be described at a somewhat abstract level of definition into the operating Enterprise (Row 6), which was comprised of very precise instructions (explicit or implicit) for behavior of people and/or machines.”

Zachman said that they knew “Architecture” had something to do with the Strategy-to-Instantiation transformation logic, but nobody knew what architecture for enterprises was in those days. His radical idea was to ask people who did architecture for things like buildings, airplanes, locomotives, computers or battleships what architecture was for those Industrial Age products. Zachman believed that if he could find out what they thought architecture was for those products, he might be able to figure out what architecture was for enterprises, and thereby how to transform the strategy into the operating enterprise so that the enterprise implementation aligns with the strategy.

With this in mind, Zachman began reaching out to people in other disciplines to see how they put together things like buildings or airplanes. He spoke to an architect friend and also to some of the aircraft manufacturers that were based in Southern California at the time. He began gathering different engineering specs and studying them.

One day while he was sitting at his desk, Zachman said, he began sorting the design artifacts he’d collected for buildings and airplanes into piles. Suddenly he noticed there was something similar in how the design patterns were described.

“Guess what?” he said. “The way you describe buildings is identical to the way you describe airplanes, which turns out to be identical to the way you describe locomotives, which is identical to the way you describe computers. Which is identical to the way you describe anything else that humanity has ever described.”

Zachman says he really just “stumbled across” the way to describe the enterprise and attributes his discovery to providence, calling it a miracle. Despite having kick-started the discipline of Enterprise Architecture with this recognition, Zachman claims he’s “actually not very innovative.”

“I just saw the pattern and put enterprise names on it,” he said.

Once he understood that Architectural design descriptions all used the same categories and patterns, he knew that he could also define Architecture for Enterprises. All it would take would be to apply the enterprise vocabulary to the same pattern and structure of the descriptive representations of everything else.

“All I did was, I saw the pattern of the structure of the descriptive representations for airplanes, buildings, locomotives and computers, and I put enterprise names on the same patterns,” he says. “Now you have the Zachman Framework, which basically is Architecture for Enterprises, just as it is Architecture for every other object known to humankind.”

Thus the Zachman Framework was born.

Ontology vs. Methodology

According to Zachman, what his Framework is ultimately intended for is describing a complex object, an Enterprise. In that sense, the Zachman Framework is the ontology for Enterprise Architecture, he says. What it doesn’t do, is tell you how to do Enterprise Architecture.

“Architecture is architecture is architecture. My framework is just the definition and structure of the descriptive representation for enterprises,” he said.

That’s where methodologies, such as TOGAF®, an Open Group standard, DoDAF or other methodological frameworks come in. To create and execute an Architecture, practitioners need both the ontology—to help them define, translate and place structure around the enterprise descriptive representations—and they need a methodology to populate and implement it. Both are needed—it’s an AND situation, not an OR, he said. The methodology simply needs to use (or reuse) the ontological constructs in creating the implementation instantiations in order for the enterprise to be “architected.”

The Need for Architecture

Unfortunately, Zachman says, there are still a lot of companies today that don’t understand the need to architect their enterprise. Enterprise Architecture is simply not on the radar of general management in most places.

“It’s not readily acknowledged on the general management agenda,” Zachman said.

Instead, he says, most companies focus their efforts on building and running systems, not engineering the enterprise as a holistic unit.

“We haven’t awakened to the concept of Enterprise Architecture,” he says. “The fundamental reason why is people think it takes too long and it costs too much. That is a shibboleth – it doesn’t take too long or cost too much if you know what you’re doing and have an ontological construct.”

Zachman believes many companies are particularly guilty of this type of thinking, which he attributes to a tendency to think that there isn’t any work being done unless the code is up and running. Never mind all the work it took to get that code up and running in the first place.

“Getting the code to run, I’m not arguing against that, but it ought to be in the context of the enterprise design. If you’re just providing code, you’re going to get exactly what you have right now—code. What does that have to do with management’s intentions or the Enterprise in its entirety?”

As such, Zachman compares today’s enterprises to log cabins rather than skyscrapers. Many organizations have not gotten beyond that “primitive” stage, he says, because they haven’t been engineered to be integrated or changed.

According to Zachman, the perception that Enterprise Architecture is too costly and time consuming must change. And, people also need to stop thinking that Enterprise Architecture belongs solely under the domain of IT.

“Enterprise Architecture is not about building IT models. It’s about solving general management problems,” he said. “If we change that perception, and we start with the problem and we figure out how to solve that problem, and then, oh by the way we’re doing Architecture, then we’re going to get a lot of Architecture work done.”

Zachman believes one way to do this is to build out the Enterprise Architecture iteratively and incrementally. By tackling one problem at a time, he says, general management may not even need to know whether you’re doing Enterprise Architecture or not, as long as their problem is being solved. The governance system controls the architectural coherence and integration of the increments. He expects that EA will trend in that direction over the next few years.

“We’re learning much better how to derive immediate value without having the whole enterprise engineered. If we can derive immediate value, that dispels the shibboleth—the misperception that architecture takes too long and costs too much. That’s the way to eliminate the obstacles for Enterprise Architecture.”

As far as the skills needed to do EA in the future, Zachman believes that enterprises will eventually need multiple types of architects with different skill sets to make sure everything is aligned. He speculates that someday there may need to be specialists for every cell in the framework, saying that there is potentially room for a lot of specialization, people with different skill sets and a lot of creativity. Just as aircraft manufacturers need a variety of engineers, from aeronautic to hydraulic and everywhere in between, to get a plane built, increasingly complex enterprises will likely need multiple types of engineering specialties. One engineer does not engineer an entire airplane, a hundred-story building, an ocean liner or, for that matter, a personal computer. No one person knows everything.

“Enterprises are far more complex than 747s. In fact, an enterprise doesn’t have to be very big before it gets really complex,” he said. “As enterprise systems increase in size, there is increased potential for failure if they aren’t architected to respond to that growth. And if they fail, the lives and livelihoods of hundreds of thousands of people can be affected, particularly if it’s a public sector Enterprise.”

Zachman believes it may ultimately take a generation or two for companies to understand the need to better architect the way they run. As things are today, he says, the paradigm of the “system process first” Industrial Age is still too ingrained in how systems are created. He believes it will be a while before that paradigm shifts to a more Information Age-centric way of thinking where the enterprise is the object rather than the system.

“Although this afternoon is not too early to start working on it, it is likely that it will be the next generation that will make Enterprise Architecture an essential way of life like it is for buildings and airplanes and automobiles and every other complex object,” he said.

John A. Zachman, Founder & Chairman, Zachman International, Executive Director of FEAC Institute, and Chairman of the Zachman Institute

Join the conversation – @theopengroup, #ogchat, #ogSAN

Filed under Enterprise Architecture, Standards, TOGAF®, Uncategorized

Catching Up with The Open Group Internet of Things Work Group

By The Open Group

The Open Group’s Internet of Things (IoT) Work Group is involved in developing open standards that will allow product and equipment management to evolve beyond the traditional limits of product lifecycle management. Meant to incorporate the larger systems management that will be required by the IoT, these standards will help to handle the communications needs of a network that may encompass products, devices, people and multiple organizations. Formerly known as the Quantum Lifecycle Management (QLM) Work Group, its name was recently changed to the Internet of Things Work Group to more accurately reflect its current direction and focus.

We recently caught up with Work Group Chairman Kary Främling to discuss its two new standards, both of which are geared toward the Internet of Things, and what the group has been focused on lately.

Over the past few years, The Open Group’s Internet of Things Work Group (formerly the Quantum Lifecycle Management Work Group) has been working behind the scenes to develop new standards related to the nascent Internet of Things and how to manage the lifecycle of these connected products, or as General Electric has referred to it, the “Industrial Internet.”

What their work ultimately aims to do is help manage all the digital information within a particular system—for example, vehicles, buildings or machines. By creating standard frameworks for handling this information, these systems and their related applications can be better run and supported during the course of their “lifetime,” with the information collected serving a variety of purposes, from maintenance to improved design and manufacturing to recycling and even refurbishing them.

According to Work Group Chairman Kary Främling, CEO of ControlThings and Professor of Practice in Building Information Modeling at Aalto University in Finland, the group has been working with companies such as Caterpillar and Fiat, as well as refrigerator and machine tool manufacturers, to enable machines and equipment to send sensor and status data on how machines are being used and maintained to their manufacturers. Data can also be provided to machine operators so they are also aware of how the machines are functioning in order to make changes if need be.

For example, Främling says that one application of this system management loop is in HVAC systems within buildings. By building Internet capabilities into the system, a ventilation system, or air-handling unit, can be controlled via a smartphone from the moment it’s turned on inside a building. The system can provide data and alerts about how well it’s operating, and whether there are any problems within the system, to facilities management or whomever else needs them. Främling also says that the system can provide information to both the maintenance company and the system manufacturer, so they can collect data from the machines on performance, operations and other indicators. This allows users to determine things as simple as when an air filter needs changing, or whether there are systematic problems with particular machine models.

According to Främling, the ability to monitor systems in this way has already helped ventilation companies make adjustments to their products.

“What we noticed was there was a certain problem with certain models of fans in these machines. Based on all the sensor readings on the machine, I could deduce that the air extraction fan had broken down,” he said.

The ability to detect such problems via sensor data as they are happening can be extremely beneficial to manufacturers because they can more easily and more quickly make improvements to their systems. Another advantage afforded by machines with Web connectivity, Främling says, is that errors can also be corrected remotely.

“There’s so much software in these machines nowadays, so just by changing parameters you can make them work better in many ways,” he says.

In fact, Främling says that the Work Group has been working on systems such as these for a number of years already—well before the term “Internet of Things” became part of industry parlance. They first worked on a system for a connected refrigerator in 2007 and even worked on systems for monitoring how vehicles were used before then.

One of the other things the Work Group is focused on is working with the Open Platform 3.0 Forum, since there are many synergies between the two groups. For instance, the Work Group provided a number of the use cases for the Forum’s recent business scenarios.

“I really see what we are doing is enabling the use cases and these information systems,” Främling says.

Two New Standards

In October, the Work Group also published two new standards, which are among the first standards developed for the Internet of Things (IoT). A number of companies and universities across the world have been instrumental in developing the standards, including Aalto University in Finland, BIBA, Cambridge University, Infineon, InMedias, Politecnico di Milano, Promise Innovation, SAP and Trackway Ltd.

Främling likens these early IoT standards to what the HTML and HTTP protocols did for the Internet. The Open Data Format (O-DF) Standard provides a common language for describing any kind of IoT object, much like HTML provided a language for the Web. The Open Messaging Interface (O-MI) Standard, on the other hand, describes a set of operations that lets users request information from particular systems, much like HTTP. Write operations then allow users to send information or new values to a system, for example, to update it.
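
To make the HTML/HTTP analogy concrete, the sketch below builds a minimal O-MI “read” envelope carrying an O-DF payload. The element and attribute names follow the published O-MI and O-DF 1.0 schemas as we understand them, but the namespace URIs are abbreviated and the object id and InfoItem name are purely hypothetical, so treat this as an illustration rather than a conformant message.

```python
import xml.etree.ElementTree as ET

# Abbreviated stand-ins for the O-MI and O-DF schema namespaces
# (a real message would use the full namespace URIs).
OMI = "omi.xsd"
ODF = "odf.xsd"

def omi_read_request(object_id: str, info_item: str) -> str:
    """Build an O-MI 'read' envelope asking one object for one value."""
    envelope = ET.Element(f"{{{OMI}}}omiEnvelope",
                          {"version": "1.0", "ttl": "0"})
    read = ET.SubElement(envelope, f"{{{OMI}}}read", {"msgformat": "odf"})
    msg = ET.SubElement(read, f"{{{OMI}}}msg")
    # The O-DF payload names *what* is being asked for: an object
    # hierarchy whose leaves (InfoItems) are the actual data points.
    objects = ET.SubElement(msg, f"{{{ODF}}}Objects")
    obj = ET.SubElement(objects, f"{{{ODF}}}Object")
    ET.SubElement(obj, f"{{{ODF}}}id").text = object_id
    ET.SubElement(obj, f"{{{ODF}}}InfoItem", {"name": info_item})
    return ET.tostring(envelope, encoding="unicode")

# Hypothetical example: ask an air-handling unit for its fan speed.
xml_text = omi_read_request("AirHandlingUnit-17", "FanSpeed")
print(xml_text)
```

The split mirrors the analogy in the article: the O-DF `Objects` tree plays the role of HTML (describing the thing), while the O-MI envelope plays the role of HTTP (the read/write operation wrapped around it).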

Users can also subscribe to information contained in other systems. For instance, Främling described a scenario in which he was able to create a program that allowed him to ask his car what was wrong with it via a smartphone when the “check engine” light came on. He was then able to use a smartphone application to send an O-MI message to the maintenance company with the error code and his location. Using an O-MI subscription the maintenance company would be able to send a message back asking for additional information. “Send these five sensor values back to us for the next hour and you should send them every 10 seconds, every 5 seconds for the temperature, and so on,” Främling said. Once that data is collected, the service center can analyze what’s wrong with the vehicle.
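
The “send these values every 10 seconds for the next hour” exchange maps naturally onto an O-MI subscription, which (as we read the spec) is simply a read operation carrying an `interval` attribute, with `ttl` bounding the subscription’s lifetime and an optional `callback` address for pushed values. The sketch below shows only the envelope; the callback URL is hypothetical and a real request would also carry an O-DF payload listing the sensors.

```python
import xml.etree.ElementTree as ET

OMI = "omi.xsd"  # abbreviated stand-in for the O-MI schema namespace

def omi_subscription(interval_s: int, ttl_s: int, callback: str) -> ET.Element:
    # ttl bounds how long the subscription lives; interval is how
    # often the server should push new values to the callback address.
    env = ET.Element(f"{{{OMI}}}omiEnvelope",
                     {"version": "1.0", "ttl": str(ttl_s)})
    ET.SubElement(env, f"{{{OMI}}}read",
                  {"msgformat": "odf",
                   "interval": str(interval_s),
                   "callback": callback})
    return env

# "Send these sensor values every 10 seconds for the next hour":
env = omi_subscription(interval_s=10, ttl_s=3600,
                       callback="http://service-center.example/receive")
print(ET.tostring(env, encoding="unicode"))
```
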

Främling says O-MI messages can easily be set up on the fly for a variety of connected systems with little programming. The standard also allows users to manage mobility and firewalls. To help prevent security issues, O-MI communications run over transports that are already secure, which can include anything from HTTP to USB sticks to SMTP, Främling says.

Främling expects that these standards can also be applied to multiple types of functionalities across different industries, for example for connected systems in the healthcare industry or to help manage energy production and consumption across smart grids. With both standards now available, the Work Group is beginning to work on defining extensions for the Data Format so that vocabularies specific to certain industries, such as healthcare or manufacturing, can also be developed.

In addition, Främling expects that as protocols such as O-MI make it easier for machines to communicate amongst themselves, they will also be able to begin to optimize themselves over time. Cars, in fact, are already using this kind of capability, he says. But for other systems, such as buildings, that kind of communication is not happening yet. He says in Finland, his company has projects underway with manufacturers of diesel engines, cranes, elevators and even in Volkswagen factories to establish information flows between systems. Smart grids are also another potential use. In fact his home is wired to provide consumption rates in real-time to the electric company, although he says he does not believe they are currently doing anything with the data.

“In the past we used to speak about these applications for pizza or whatever that can tell a microwave oven how long it should be heated and the microwave oven also checks that the food hasn’t expired,” Främling said.

And while your microwave may not yet be able to determine whether your food has reached its expiration date, these recent developments by the Work Group are helping to bring the IoT vision to fruition by making it easier for systems to begin the process of “talking” to each other through a standardized messaging system.

Kary Främling is currently CEO of the Finnish company ControlThings, as well as Professor of Practice in Building Information Modeling (BIM) at Aalto University, Finland. His main research topics are on information management practices and applications for BIM and product lifecycle management in general. His main areas of competence are distributed systems, middleware, multi-agent systems, autonomously learning agents, neural networks and decision support systems. He is one of the worldwide pioneers in the Internet of Things domain, where he has been active since 2000.

@theopengroup; #ogchat

Filed under digital technologies, Enterprise Transformation, Future Technologies, Internet of Things, Open Platform 3.0, Uncategorized

Putting Information Technology at the Heart of the Business: The Open Group San Diego 2015

By The Open Group

The Open Group is hosting the “Enabling Boundaryless Information Flow™” event February 2 – 5, 2015 in San Diego, CA at the Westin San Diego Gaslamp Quarter. The event is set to focus on the changing role of IT within the enterprise and how new IT trends are empowering improvements in businesses and facilitating Enterprise Transformation. Key themes include Dependability through Assuredness™ (The Cybersecurity Connection) and The Synergy of Enterprise Architecture Frameworks. Particular attention throughout the event will be paid to the need for continued development of an open TOGAF® Architecture Development Method and its importance and value to the wider business architecture community. The goal of Boundaryless Information Flow will be featured prominently in a number of tracks throughout the event.

Key objectives for this year’s event include:

  • Explore how cybersecurity and dependability issues are threatening business enterprises and critical infrastructure from an integrity and a security perspective
  • Show the need for Boundaryless Information Flow™, which would result in more interoperable, real-time business processes throughout all business ecosystems
  • Outline current challenges in securing the Internet of Things, and the ongoing work in the Security Forum and elsewhere that will help to address these issues
  • Reinforce the importance of architecture methodologies in assuring that your enterprise transforms its approach along with the ever-changing threat landscape
  • Discuss the key drivers and enablers of social business technologies in large organizations, which play an important role in the co-creation of business value, and the key building blocks of a social business transformation program

Plenary speakers at the event include:

  • Chris Forde, General Manager, Asia Pacific Region & VP, Enterprise Architecture, The Open Group
  • John A. Zachman, Founder & Chairman, Zachman International, and Executive Director of FEAC Institute

Full details on the range of track speakers at the event can be found here, with the following (among many others) contributing:

  • Dawn C. Meyerriecks, Deputy Director for Science and Technology, CIA
  • Charles Betz, Founder, Digital Management Academy
  • Leonard Fehskens, Chief Editor, Journal of Enterprise Architecture, AEA

Registration for The Open Group San Diego 2015 is open and available to members and non-members. Please register here.

Join the conversation via Twitter – @theopengroup #ogSAN


17 Comments

Filed under Uncategorized, TOGAF®, Standards, Professional Development, Dependability through Assuredness™, Boundaryless Information Flow™, Internet of Things, Security

Open FAIR Certification for People Program

By Jim Hietala, VP Security, and Andrew Josey, Director of Standards, The Open Group

In this final installment of our Open FAIR blog series, we will look at the Open FAIR Certification for People program.

In early 2012, The Open Group Security Forum began exploring the idea of creating a certification program for Risk Analysts. Discussions with large enterprises regarding their risk analysis programs led us to the conclusion that there was a need for a professional certification program for Risk Analysts. In addition, Risk Analyst professionals and Open FAIR practitioners expressed interest in a certification program. Security and risk training organizations also expressed interest in providing training courses based upon the Open FAIR standards and Body of Knowledge.

The Open FAIR People Certification Program was designed to meet the requirements of employers and risk professionals. The certification program is a knowledge-based certification, testing candidates' knowledge of the two standards, O-RA and O-RT. Candidates are free to acquire their knowledge through self-study, or to take a course from an accredited training organization. The program currently has a single level (Foundation), with a more advanced certification level (Certified) planned for 2015.

Several resources are available from The Open Group to assist Risk Analysts preparing to sit for the exam, including the following:

  • Open FAIR Pocket Guide
  • Open FAIR Study Guide
  • Risk Taxonomy (O-RT), Version 2.0 (C13K, October 2013) defines a taxonomy for the factors that drive information security risk – Factor Analysis of Information Risk (FAIR).
  • Risk Analysis (O-RA) (C13G, October 2013) describes process aspects associated with performing effective risk analysis.

All of these can be downloaded from The Open Group publications catalog at http://www.opengroup.org/bookstore/catalog.
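To give a flavor of what the O-RT taxonomy covers, risk is decomposed into the frequency of loss events and the magnitude of loss per event, with loss event frequency further factored into threat event frequency and vulnerability. The sketch below is a simplified, point-estimate illustration of that decomposition, not the standard's method: real Open FAIR analyses use calibrated ranges and distributions rather than single numbers, and all figures and function names here are hypothetical.

```python
# Illustrative sketch of the Open FAIR (O-RT) factor decomposition.
# Point estimates only; an actual analysis per O-RA would use calibrated
# ranges (min / most likely / max) and typically Monte Carlo simulation.

def loss_event_frequency(threat_event_frequency: float, vulnerability: float) -> float:
    """LEF = TEF x Vulnerability: how often a threat event becomes a loss event."""
    return threat_event_frequency * vulnerability

def annualized_risk(lef: float, loss_magnitude: float) -> float:
    """Annualized loss exposure = loss events per year x loss per event."""
    return lef * loss_magnitude

tef = 10.0                  # hypothetical: threat events per year
vuln = 0.2                  # hypothetical: 20% of threat events produce a loss
loss_per_event = 50_000.0   # hypothetical: primary + secondary loss, in dollars

lef = loss_event_frequency(tef, vuln)
print(f"Loss events per year: {lef}")                                    # 2.0
print(f"Annualized exposure: ${annualized_risk(lef, loss_per_event):,.0f}")  # $100,000
```

The value of the taxonomy is less in the arithmetic than in forcing each factor to be estimated and defended separately, which is what the Foundation exam tests.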

For training organizations, The Open Group accredits organizations wishing to offer training courses on Open FAIR. Testing of candidates is offered through Prometric test centers worldwide.

For more information on Open FAIR certification or accreditation, please contact us at: openfair-cert-auth@opengroup.org

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT Security, Risk Management and Healthcare programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on Information Security, Risk Management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

 

Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF® 9.1, ArchiMate® 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX® Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.


Leave a comment

Filed under Accreditations, Certifications, Cybersecurity, Enterprise Architecture, Information security, Open FAIR Certification, Professional Development, RISK Management, Security, Uncategorized