Tag Archives: Business Architecture

Professional Training Trends (Part Two): A Q&A with Chris Armstrong, Armstrong Process Group

By The Open Group

This is part two in a two-part series.

Professional development and training is a perpetually hot topic within the technology industry. After all, who doesn’t want to succeed at their job and perform better?

Ongoing education and training is particularly important for technology professionals who are already in the field. With new tech trends, programming languages and methodologies continuously popping up, most professionals can’t afford not to keep their skill sets up to date these days.

The Open Group member Chris Armstrong is well-versed in the obstacles that technology professionals face in doing their jobs. As president of Armstrong Process Group, Inc. (APG), Armstrong and his firm provide continuing education and certification programs for technology professionals and Enterprise Architects covering all aspects of the enterprise development lifecycle. We recently spoke with Armstrong about the needs of Architecture professionals and the skills and tools he thinks are necessary to do the job effectively today.

What are some of the tools that EAs can be using to do architecture right now?

There’s quite a bit out there. I’m kind of developing a perspective on how to lay them out across the landscape a bit better. I think there are two general classes of EA tools based on requirements, which is not necessarily the same as what is offered by commercial or open source solutions.

When you take a look at the EA Capability model and the value chain, the first two parts of it have to do with understanding and analyzing what’s going on in an enterprise. Those can be effectively implemented by what I call Enterprise Architecture content management tools, or EACM. Most of the modeling tools would fall within that categorization. Tools that we use? There’s Sparx Enterprise Architect. It’s a very effective modeling tool that covers all aspects of the architecture landscape top-to-bottom, left-to-right and it’s very affordable. Consequently, it’s one of the most popular tools in the world—I think there are upwards of 300,000 licenses active right now. There are lots of other modeling tools as well.

A lot of people find the price point for Sparx Enterprise Architect so appealing that, when the investment is only $5K, $10K or $15K instead of $100K or $250K, it becomes a great way to come to grips with what it means to really build models. It really helps you build those fundamental modeling skills, which are best learned via on-the-job experience in your real business domain, without having to mortgage the farm.

Then there’s the other part of it, and this is where I think there needs to be a shift in emphasis to some extent. A lot of times the architect community gets caught up in modeling. We’ve been modeling for decades—modeling is not a new thing. Despite that—and this is just an anecdotal observation—the level of formal, rigorous modeling, at least in our client base in the U.S. market, is still very low. There are lots of Fortune 1000 organizations that have not made investments in some of these solutions yet, or whose investments are fractured or not well-unified. As a profession, we have a long history of modeling and I’m a big fan of that, but it sometimes seems a bit self-serving, in that a lot of times the people we model for are ourselves. It’s all good from an engineering perspective—it helps us frame things up and produce views of our content that are meaningful to other stakeholders. But there’s a real missed opportunity in making those assets available and useful to the rest of the organization. Because if you build a model, irrespective of how good and relevant and pertinent it is, and nobody knows about it, nobody can use it to make good decisions and nobody can use it to accelerate their project, there’s some legitimacy to the question, “So how much value is this really adding?” I see a chasm between the production of Enterprise Architecture content and the ease of accessing and using that content throughout the enterprise. The consumer market for Enterprise Architecture is much larger than the provider community.

But that’s a big part of the problem, which is why I mentioned cross-training earlier. Most enterprises don’t have the self-awareness that they’ve already made some investment in Enterprise Architecture, and they often, ironically, end up making redundant, duplicative investments in repositories to keep track of inventories of things that EA is already doing or could already be doing. Making EA content as easily accessible to the enterprise as going to Google and searching for it would be a monumental improvement. One of the big barriers to re-use is finding out whether something useful has already been created, and there’s a lot of opportunity to deliver better capability through tooling to all of the consumers throughout an enterprise.

If we move a bit further along the EA value chain to what we call “Decide and Respond,” that’s a really good place for a different class of tools. Even though there are modeling tool vendors that try to do it, we need a second class of tools for EA Lifecycle Management (EALM), which is really getting into the understanding of “architecture-in-motion”. Once architecture content has been described as the current and future state, the real $64,000 question is how do we get there? How do we build a roadmap? How do we distribute the requirements of that roadmap across multiple projects and tie that to the strategic business decisions and critical assets over time? Then there’s how do I operate all of this stuff once I build it? That’s another part of lifecycle management—not just how do I get to this future state target architecture, but how do I continue to operate and evolve it incrementally and iteratively over time?

There are some tools that are emerging in the lifecycle management space and one of them is another product we partner with—that’s a solution from HP called Enterprise Maps. From our perspective it meets all the key requirements of what we consider enterprise architecture lifecycle management.

What tools do you recommend EAs use to enhance their skillsets?

Getting back to modeling—that’s a really good place to start as it relates to elevating the rigor of architecture. People are used to drawing pictures with something like Visio to graphically represent “here’s how the business is arranged” or “here’s how the applications landscape looks,” but there’s a big difference in transitioning to thinking about building a model. Because drawing a picture and building a model are not the same thing. The irony, though, is that to many consumers it looks the same, because you often look into a model through a picture. But the skill and the experience that the practitioner needs are very different. It’s a completely different way of looking at the world when you start building models as opposed to solely drawing pictures.

Coming into 2015, I still see a huge opportunity to uplift that skill set, because I find a lot of people say they know how to model but haven’t really had that experience. You can’t simply explain it to somebody; you have to do it. It’s not the be-all and end-all—it’s part of the architect’s toolkit, but being able to think architecturally and from a model-driven approach is a key skill set that people are going to need to keep pace with all the rapid changes going on in the marketplace right now.

I also see that there’s still an opportunity to get people better educated on some formal modeling notations. We’re big fans of the Unified Modeling Language (UML). I still think uptake of some of those specifications is not as prevalent as it could be. I do see that there are architects who have some familiarity with some of these modeling standards. For example, in our TOGAF® training we talk about standards in one particular slide, and many architects have only heard of one or two of them. That just points to a lack of awareness of the rich family of languages that are out there and how they can be used. If a community of architects can only identify one or two modeling languages on a list of 10 or 15, that is an indirect indication of their background in doing modeling, in my opinion. That’s anecdotal, but there’s a huge opportunity to uplift architects’ modeling skills.

How do you define the difference between models and pictures?

Modeling requires a theory—namely you have to postulate a theory first and then you build a model to test that theory. Picture drawing doesn’t require a theory—it just requires you to dump on a piece of paper a bunch of stuff that’s in your head. Modeling encourages more methodical approaches to framing the problem.

One of the anti-patterns that I’ve seen in many organizations is that they get overly enthusiastic, particularly when they get a modeling tool. They feel like they can suddenly do all these things they’ve never done before, all that modeling stuff, and they end up “over modeling” rather than modeling effectively. One of the keys is modeling just enough, because there’s never enough time to build the perfect thing. In my opinion, it’s about building the minimally sufficient model that’s useful. And in order to do that, you need to take a step back. TOGAF does acknowledge this in the ADM—you need to understand who your stakeholders are and what their concerns are, and then use those concerns to frame how you look at this content. This is where you start coming up with the theory for “Why are we building a model?” Just because we have tools to build models doesn’t mean we should build models with those tools. We need to understand why we’re building models, because we can build infinite numbers of models forever, where none of them might be useful, and what’s the point of that?

The example I give is, there’s a CFO of an organization who needs to report earnings to Wall Street for quarterly projections and needs details from the last quarter. And the accounting people say, “We’ve got you covered, we know exactly what you need.” Then the next day the CFO comes in and on her desk is eight feet of green bar paper. She goes out to the accountants and says, “What the heck is this?” And they say, “This is a dump of the general ledger for the first quarter. Every financial transaction you need.” And she says, “Well, it’s been a while since I’ve been a CPA, and I believe it’s all in there, but there’s just no way I’ve got time to weed through all that stuff.”

There are generally accepted accounting principles: if I want to understand the relationship between revenue and expense, that’s called a P&L, and if I’m interested in understanding the difference between assets and liabilities, that’s a balance sheet. We can think of the general ledger as the model of the finances of an organization. We need to be able to use intelligence to give people views of that model that are pertinent and help them understand things. So the CFO says, “Can you take those debits and credits in that double-entry accounting system and summarize them into a one-pager called a P&L?”

The P&L would be an example of a view into a model, like a picture or diagram. The diagram comes from a model, the general ledger. So if you want to change the P&L in an accounting system you don’t change the financial statement, you change the general ledger. When you make an adjustment in your general ledger, you re-run your P&L with different content because you changed the model underneath it.
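To make that model/view relationship concrete, here is a minimal Python sketch; the account names and figures are invented for illustration. The ledger is the model, the P&L is a view computed from it, and changing the report means changing the ledger and re-running the view.

```python
from collections import defaultdict

# Minimal sketch: the general ledger is the model; the P&L is a view derived
# from it. Account names and amounts are invented for illustration.
ledger = [
    # (account, type, amount) -- each entry stands in for a posted transaction
    ("sales",    "revenue", 120_000),
    ("services", "revenue",  30_000),
    ("salaries", "expense",  80_000),
    ("rent",     "expense",  20_000),
]

def profit_and_loss(entries):
    """Summarize ledger entries into a one-page P&L view."""
    totals = defaultdict(int)
    for _account, kind, amount in entries:
        totals[kind] += amount
    return {
        "revenue": totals["revenue"],
        "expense": totals["expense"],
        "net income": totals["revenue"] - totals["expense"],
    }

# To change the P&L you change the model (the ledger), then re-run the view.
ledger.append(("consulting", "revenue", 10_000))
print(profit_and_loss(ledger))
# {'revenue': 160000, 'expense': 100000, 'net income': 60000}
```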

You can kind of think of it as the difference between doing accounting on register paper, like we did up until the early 21st Century, and saying, “Why don’t we keep track of all the debits and credits based on a chart of accounts and then use reporting capabilities to synthesize any way of looking at the finances that we care to?” It allows a different way of thinking about the interconnectedness of things.

What are some of the most sought after classes at APG?

Of course TOGAF certification is one of the big ones. I’d say in addition to that we do quite a bit in systems engineering, application architecture, and requirements management. Sometimes those are done in the context of solution delivery but sometimes they’re done in the context of Enterprise Architecture. There’s still a lot of opportunity in supporting Enterprise Architecture in some of the fundamentals like requirements management and effective architectural modeling.

What kinds of things should EAs look for in training courses?

I guess the big thing is to look for offerings that get you as close to practical application as possible. A lot of people start with TOGAF, and that’s a great way to frame the problem space. I would set the expectation—and we always do when we deliver our TOGAF training—that this will not tell you “how” to do Enterprise Architecture; there’s just not enough time for that in four days. We talk about “what” Enterprise Architecture is and related emerging best practices. That needs to be followed up with “how do I actually do Enterprise Architecture modeling,” “how do I actually collect EA requirements,” “how do I actually do architecture trade-off analysis?” Then “how do I synthesize an architecture roadmap,” “how do I put together a migration plan,” and “how do I manage the lifecycle of applications in my portfolio over the long haul?” Training that gets you closer to those experiences will be the most valuable.

But a lot of this depends on the level of maturity within the organization, because in some places, just getting everybody on the same page about what Enterprise Architecture means is a big victory. But I also think Enterprise Architects need to be very thoughtful about cross-training. It’s something I’m trying to invest in myself: becoming more attuned to what’s going on in other parts of the enterprise where Enterprise Architecture has some context but perhaps is not a known player. Getting training experiences in other places and engaging those parts of your organization to really find out what problems they’re trying to solve, and how Enterprise Architecture might help them, is essential.

One of the best ways to demonstrate that is through the organizational learning related to EA adoption. That may even be the bigger question. As individual architects, there are always opportunities for greater skill development, but really, organizational learning is where the real investment needs to be made so you can answer the question, “Why do I care?” One of the best ways to respond to that is to have an internal success. After a pilot project, you can say, “We did EA on a limited scale for a specific purpose and look what we got out of it, and how could you not want to do more of it?”

But ultimately the question should be, “How can we make Enterprise Architecture indispensable? How can we create an environment where people can perform their duties more rapidly, more efficiently, more effectively and more sustainably based on Enterprise Architecture?” This is part of the problem, especially in larger organizations. In 2015, it’s not really the first time people have been making investments in Enterprise Architecture, it’s the second or third or fourth time, so it’s a reboot. You want to make EA indispensable by supporting those critical activities, and then, when the stakeholders become dependent on it, you can say, “If you like that stuff, we need you to show up with some support for EA and get some funding and resources, so we can continue to operate and sustain this capability.”

What we’ve found is that it’s a double-edged sword, ironically. If an organization has success in propping up its Architecture capability, sustaining it and demonstrating some value there, it can be a snowball effect where you become a victim of your own success: suddenly people start to get wind of “Oh, I don’t have to do that if the EA’s already done it,” or “I can align myself with a part of the business where the EA has already been done.” The architecture community can get very busy—busier than they’re prepared for—because of the momentum that might exist to really exploit those EA investments. But at the end of the day, it’s all good stuff, because the more you can show the enterprise that it’s worth the investment, that it delivers value, the more likely you’ll get increased funding to sustain the capability.

Chris Armstrong is president of Armstrong Process Group, Inc. and an internationally recognized thought leader and expert in iterative software development, enterprise architecture, object-oriented analysis and design, the Unified Modeling Language (UML), use case driven requirements and process improvement.

Over the past twenty years, Chris has worked to bring modern software engineering best practices to practical application at many private companies and government organizations worldwide. Chris has spoken at over 30 conferences, including The Open Group Enterprise Architecture Practitioners Conference, Software Development Expo, Rational User Conference, OMG workshops and UML World. He has been published in such outlets as Cutter IT Journal, Enterprise Development and Rational Developer Network.

Join the conversation! @theopengroup #ogchat


The Onion From The Inside Out

By Stuart Boardman, Senior Business Consultant, Business & IT Advisory, KPN Consulting and Ed Harrington, Senior Consulting Associate, Conexiam

The Open Group Open Platform 3.0™ (OP3.0) services often involve a complex network of interdependent parties[1]. Each party has its own concept of the value it expects from the service. One consequence of this is that each party depends on the value other parties place on the service. If it’s not core business for one of them, its availability and reliability could be in doubt. So the others need to be aware of this and have some idea of how much that matters to them.

In a previous post, we used the analogy of an onion to model various degrees of relationship between parties. At a high level the onion looks like this:

“Onion” (diagram by Stuart Boardman, KPN)

Every player has their own version of this onion. Every player’s own perspective is from the middle of it. The complete set of players will be distributed across different layers of the onion depending on whose onion we are looking at.

In a short series of blogs, we’re going to use a concrete use-case to explore what various players’ onions look like. To understand that onion involves working from the middle out. We all know that you can’t peel an onion starting in the middle, so let’s not get hung up on the metaphor. It’s only useful in as far as it fits with our real business objective. In this case the objective is to have the best possible chance of understanding and then realizing the potential value of a service.

Defining and Realizing Value

Earlier this year, The Open Group published a set of Open Platform 3.0 use cases. One of these use cases (#15) considers the energy market ecosystem involved in smart charging of electric vehicles. The players in this use case include:

  • The Vehicle User
  • Supplier/Charging Operator(s)
  • Distribution Service Operator (DSO)
  • Electricity Bulk Generators
  • Transmission (National Grid) Operator
  • Local Government

(Diagram by Stuart Boardman, KPN)

The use case describes a scenario involving these players:

A local controller (a device – known in OP3.0 as part of the Internet of Things) controls one or more charging stations. The Charging Operator informs the vehicle (and possibly the Vehicle User) via the local controller how much capacity is available to it. If the battery is nearly full the vehicle can inform the local controller that it needs less capacity and this capacity can then be made available to other vehicles at other charging stations.

The Charging Operator determines the capacity to be made available on the basis of information provided by the DSO (maximum allowable capacity at that time), possibly combined with commercial information (e.g., current spot prices, predicted trends, flexibility agreements with vehicle-owners/customers where applicable). The DSO has predicted available capacity on the basis of currently predicted weather conditions and long-term usage patterns in the relevant area. The DSO is able to adapt to unexpected changes in real-time and restrict or increase the locally available capacity.
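As a rough illustration of the scenario, the sketch below models how a local controller might share a DSO-imposed capacity cap across charging stations. The station names, figures and the equal-share (“water-filling”) policy are illustrative assumptions on our part, not part of the use case or any OP3.0 specification.

```python
def allocate_capacity(max_capacity_kw, requested_kw):
    """Share the DSO's capacity cap across charging stations.

    requested_kw maps station id -> capacity the vehicle says it can still use;
    a nearly full battery requests less, freeing capacity for other vehicles.
    """
    allocations = dict.fromkeys(requested_kw, 0.0)
    remaining = float(max_capacity_kw)
    pending = dict(requested_kw)
    # Water-filling: offer every pending station an equal share of what is left;
    # stations needing less than their share take only what they need, and the
    # surplus is redistributed to the others on the next pass.
    while pending and remaining > 1e-9:
        share = remaining / len(pending)
        for station, request in list(pending.items()):
            grant = min(request, share)
            allocations[station] += grant
            remaining -= grant
            if request - grant <= 1e-9:
                del pending[station]
            else:
                pending[station] = request - grant
    return allocations

# The DSO caps this site at 30 kW; the vehicle at station "a" is nearly full.
print(allocate_capacity(30, {"a": 5, "b": 20, "c": 20}))
# -> {'a': 5.0, 'b': 12.5, 'c': 12.5}: a's unused headroom flows to b and c
```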

Value For The Various Parties

The Vehicle User

For the sake of making it interesting, let’s say that the vehicle user is a taxi driver. For her, the value is primarily in being able to charge the vehicle at a convenient time, place, speed and cost. But the perception of what constitutes value in those categories may vary depending on whether she uses a public charging station or charges at home. In either case, the service she uses is centered on the Supplier/Charging Operator, because that is who she pays for the service. The bill includes generic DSO costs, but the customer has no direct relationship with a DSO and is only really aware of it when maintenance is carried out. Factors like convenient time and place may bring Local Government into the picture, because it is often the party that makes parking spaces for electric vehicles available.

“The Taxi Driver’s Onion” (diagram by Stuart Boardman, KPN)

Local Government

Local government is then also responsible for policing the proper use of these spaces. The importance assigned by local government to making these facilities available is a question of policy balanced by cost/gain (licenses and parking fees). Policy is influenced by the economy, by the convictions of the councilors, by lobbyists (especially those connected with the DSO, Bulk Generators and Transmission Operators), by innovation and natural resources and by the attitude of the public towards electric vehicles, which in turn may be influenced by national government policy. In some countries (e.g. The Netherlands) there are tax incentives for the acquisition of electric cars. If this policy changes in a country, the number of electric vehicles could increase or decrease dramatically. Local government has a dependency on and formal relationship with the Supplier that manages the Charging Stations. The relationship with the DSO is indirect unless they have been partners in an initiative to promote electric vehicles.

“Local Government’s Onion” (diagram by Stuart Boardman, KPN)

The Distribution Service Operator

Value for the DSO involves balancing its regulatory obligation to provide continuity of energy supply with the cost of investment to achieve that and with the public perception of the value of that service. The DSO also gains value in terms of reputation from investing in innovation and energy saving. That value is expressed in its own long-term future as an enterprise. The DSO, being very much the hub in this use case, is dependent on the Supplier and the Vehicle User (with the vehicle’s battery as proxy) to provide the information needed to ensure continuity – and of course on the Transmission Operator and the Bulk Generators to provide power. It does not, however, have any direct relationship with any Bulk Generator or even necessarily know who they are or where they are located.

 

“The Distribution Service Operator’s Onion” (diagram by Stuart Boardman, KPN)

The Bulk Generator

The Bulk Generator has no direct involvement in this use case but has an indirect dependency on anything affecting the level of usage of electricity, as this affects the market price and long-term future of its product. So there is generic value (or anti-value) in the use case if it is widely implemented.

To be continued…

Those were the basics of the approach. There’s a lot more to be done before you can say you have a grip on value realization in such a scenario.

In the next blog, we’ll dive deeper into the use case, identify other relevant stakeholders and look at other dependencies that may influence value across the chain.

[1] Open Platform 3.0 refers to this as a “wider business ecosystem”. In fact such ecosystems exist for all kinds of services. We just happen to be focusing on this kind of service.

Stuart Boardman is a Senior Business Consultant with KPN Consulting, where he leads the Enterprise Architecture practice and consults to clients on Cloud Computing, Enterprise Mobility and The Internet of Everything. He is Co-Chair of The Open Group Open Platform 3.0™ Forum and was Co-Chair of the Cloud Computing Work Group’s Security for the Cloud and SOA project, and a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by KPN, the Information Security Platform (PvIB) in The Netherlands and his previous employer, CGI, as well as several Open Group white papers, guides and standards. He is a frequent speaker at conferences on the topics of Open Platform 3.0 and Identity.

Ed Harrington is a Senior Consulting Associate with Conexiam, a Calgary, Canada-headquartered consultancy. He also heads his own consultancy, EPH Associates. Prior positions include Principal Consultant with Architecting the Enterprise, where he provided TOGAF and other Enterprise Architecture (EA) discipline training and consultancy; EVP and COO for Model Driven Solutions, an EA, SOA and Model Driven Architecture consulting and software development company; various positions for two UK-based companies, Nexor and ICL; and 18 years at General Electric in various marketing and financial management positions. Ed has been an active member of The Open Group since 2000, when the EMA became part of The Open Group, and is past chair of various Open Group Forums (including past Vice Chair of the Architecture Forum). Ed is TOGAF® 9 certified.


The Business of Managing IT: The Open Group IT4IT™ Forum

By The Open Group

At The Open Group London 2014 event in October, the launch of The Open Group IT4IT™ Forum was announced. The goal of the new Forum is to create a Reference Architecture and standard that will allow IT departments to take a more holistic approach to managing the business of IT with continuous insight and control, enabling Boundaryless Information Flow™ across the IT Value Chain.

We recently spoke to Forum member Charlie Betz, Founder, Digital Management Academy, LLC, about the new Forum, its origins and why it’s time for IT to be managed as if it were a business in itself.

As IT has become more central to organizations, its role has changed drastically from the days when companies had one large mainframe or just a few PCs. For many organizations today, particularly large enterprises, IT is becoming a business within the business.

The problem with most IT departments, though, is that IT has never really been run as if it were a business.

In order for IT to better cope with rapid technological change and become more efficient at transitioning to the service-based model that most businesses today require, IT departments need guidance as to how the business of IT can be run. What’s at stake are things such as how to better manage IT at scale, how to understand IT as a value chain in its own right and how organizations can get better visibility into the vast amount of economic activity that’s currently characterized in organizations through technology.

The Open Group’s latest Forum aims to do just that.

The Case for IT Management

In the age of digital transformation, IT has become an integral part of how business is done. So says Charlie Betz, one of the founding members of the IT4IT Forum. From the software in your car to the supply chain that brings you your bananas, IT has become an irreplaceable component of how things work.

Quoting industry luminary Marc Andreessen, Betz says “software is eating the world.” Similarly, Betz says, IT management is actually beginning to eat management, too. Although this might seem laughable, we have become increasingly dependent on computing systems in our everyday lives. With that dependence comes significant concerns about the complexity of those systems and the potential they carry for chaotic behaviors. Therefore, he says, as technology becomes pervasive, how IT is managed will increasingly dictate how businesses are managed.

“If IT is increasing in its proportion of all product management, and all markets are increasingly dependent on managing IT, then understanding pure IT management becomes critically important not just for IT but for all business management,” Betz says.

According to Betz, the conversation about running the business of IT has been going on in the industry for a number of years under the guise of ideas such as “enterprise resource planning for IT” and the like. Ultimately, though, Betz says managing IT comes down to determining what IT’s value chain is and how to deliver on it.

Betz compares modern IT departments to atoms, cells and bits, where atoms represent hardware, including servers, data centers and networks; cells represent people; and bits are represented by software. In this analogy, these three things comprise the fundamental resources that an IT department manages. When reduced to economic terms, Betz says, what is currently lacking in most IT departments is a sense of how much things are worth, what the total costs are for acquisition and maintenance of capabilities, and the supply and demand dynamics for IT services.

For example, in traditional IT management, workloads are defined by projects, tickets and also a middle ground characterized by work that is smaller than a project and larger than a ticket, Betz says. Often IT departments lack an understanding of how the three relate to each other and how they affect resources—particularly in the form of people—which becomes problematic because there is no holistic view of what the department is doing. Without that aggregate view, management is not only difficult but nearly impossible.
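As a thought experiment, normalizing those three kinds of work into a single work-item model makes the missing aggregate view straightforward to compute. The sketch below is a minimal illustration; the fields, work types and sample numbers are our own assumptions, not drawn from IT4IT.

```python
from dataclasses import dataclass

@dataclass
class WorkItem:
    name: str
    kind: str           # "project", "ticket" or "enhancement" (the middle ground)
    team: str
    effort_hours: float

# Illustrative backlog mixing all three kinds of work.
backlog = [
    WorkItem("CRM upgrade",       "project",     "apps", 1200),
    WorkItem("Password reset",    "ticket",      "ops",     1),
    WorkItem("Add report filter", "enhancement", "apps",   40),
    WorkItem("Patch web servers", "ticket",      "ops",     8),
]

def demand_by_team(items):
    """Aggregate effort across all work types: the holistic view most shops lack."""
    totals = {}
    for item in items:
        totals[item.team] = totals.get(item.team, 0) + item.effort_hours
    return totals

print(demand_by_team(backlog))  # {'apps': 1240, 'ops': 9}
```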

Betz says that to get a grasp on the whole, IT needs to take a cue from the lean management movement and first understand where the work originates and what its nature is so that activities and processes don’t continue to proliferate without being managed.

Betz believes part of the reason IT has not better managed itself to date is because the level of complexity within IT has grown so quickly. He likens it to the frog in the boiling water metaphor—if the heat is turned up incrementally, the frog doesn’t know what’s hit him until it’s too late.

“Back when you had one computer it just wasn’t a concern,” he said. “You had very few systems that you were automating. It’s not that way nowadays. You have thousands of them. The application portfolio in major enterprises—depending on how you count applications, which is not an easy question in and of itself—the range is between 5,000 and 10,000 applications. One hundred thousand servers is not unheard of. These are massive numbers, and the complexity is unimaginable. The potential for emergent chaotic behavior is unprecedented in human technological development.”

Betz believes the reason there is a perception that IT is poorly managed is also because it’s at the cutting-edge of every management question in business today. And because no one has ever dealt with systems and issues this complex before, it’s difficult to get a handle on them. Which is why the time for creating a framework for how IT can be managed has come.

IT4IT

The IT4IT Forum grew out of a joint initiative originally undertaken by Royal Dutch Shell and HP. What began as a high-level user group within HP grew to include companies such as Accenture, Achmea, Munich RE and PwC, which have also been integral in pulling together the initial work provided to The Open Group to create the Forum. As the group began to develop a framework, it became clear that what they were developing needed to become an open standard, Betz says, so the group turned to The Open Group.

“It was pretty clear that The Open Group was the best fit for this,” he says. “There was clearly recognition and understanding on the part of The Open Group senior staff that this was a huge opportunity. They were very positive about it from the get-go.”

Currently in development, the IT4IT standard will provide guidance and specifications for how IT departments can provide consistent end-to-end service across the IT Value Chain and lifecycle. The IT Value Chain is meant to provide a model for managing the IT services lifecycle and for how those services can be brokered with enterprises. By giving IT the same level of functionality that other critical business functions (such as finance or HR) enjoy, the standard enables IT to achieve better levels of predictability and efficiency.

(Diagram by The Open Group)

Betz says developing a Reference Architecture for IT4IT will be helpful for IT departments because it will provide a tested model for departments to begin the process of better management. And having that model be created by a vendor-neutral consortium helps provide credibility for users because no one company is profiting from it.

“It’s the community telling itself a story of what it wants to be,” he said.

The Reference Architecture will not only include prescriptive methods for how to design, procure and implement the functionality necessary to better manage IT departments but will also include real-world use cases related to current industry trends such as Cloud-sourcing, Agile, DevOps and service brokering. As an open standard, it will also be designed to work with existing industry standards that IT departments may already be using, including ITIL®, COBIT®, SAFe® and TOGAF®, an Open Group standard.

With almost 200 pages of material already developed toward a standard, the Forum made its initial Snapshot of the standard available in late November. From there the Forum will need to decide which sections should be included as normative parts of the standard. The hope is to have the first version of the IT4IT Reference Architecture standard available next summer, Betz says.

For more on The Open Group IT4IT Forum or to become a member, please visit http://www.opengroup.org/IT4IT.

 


The Open Group London 2014: Eight Questions on Retail Architecture

By The Open Group

If there’s any vertical sector that has been experiencing constant and massive transformation in the ages of the Internet and social media, it’s the retail sector. From the ability to buy goods whenever and however you’d like (in store, online and now, through mobile devices) to customers taking to social media to express their opinions about brands and service, retailers have a lot to deal with.

Glue Reply is a UK-based consulting firm that has worked with some of Europe’s largest retailers to help them plan their Enterprise Architectures and deal with the onslaught of constant technological change. Glue Reply Partner Daren Ward and Senior Consultant Richard Veryard sat down recently to answer our questions about the challenges of building architectures for the retail sector, the difficulties of seasonal business and the need to keep things simple and agile. Ward spoke at The Open Group London 2014 on October 20.

What are some of the biggest challenges facing the retail industry right now?

There are a number of well-documented challenges facing the retail sector. Retailers are facing new competitors, especially from discount chains, as well as online-only retailers such as Amazon. Retailers are also experiencing an increasing fragmentation of spend—for example, grocery customers buying smaller quantities more frequently.

At the same time, customer expectations are higher, especially across multiple channels. There is an increased intolerance of poor customer service, and people’s expectations of a prompt response are rising rapidly, especially via social media.

There is also increasing concern regarding cost. Many retailers have huge amounts invested in physical space and human resources. They can’t just keep increasing these costs; they must understand how to become more efficient and create new ways to make use of these resources.

What role is technology playing in those changes, and which technologies are forcing the most change?

New technologies are allowing us to provide shoppers with a personalized customer experience more akin to old-school service, when the store manager knew my name, my collar size, etc. Combining technologies such as mobile and iBeacons allows us not only to reach out to our customers, but also to provide context and increase relevance.

Some retailers are becoming extremely adept in using social media. The challenge here is to link the social media with the business process, so that the customer service agent can quickly check the relevant stock position and reserve the stock before posting a response on Facebook.

Big data is becoming one of the key technology drivers. Large retailers are able to mobilize large amounts of data, both from their own operations as well as external sources. Some retailers have become highly data-driven enterprises, with the ability to make rapid adjustments to marketing campaigns and physical supply chains. As we gather more data from more devices all plugged into the Internet of Things (IoT), technology can help us make sense of this data and spot trends we didn’t realize existed.

What role can Enterprise Architecture play in helping retailers, and what can retailers gain from taking an architectural approach to their business?

One of the key themes of the digital transformation is the ability to personalize the service, to really better understand our customers and to hold a conversation with them that is meaningful. We believe there are four key foundation blocks to achieving this seamless digital transformation: the ability to change, to integrate, to drive value from data and to understand the customer journey. Core to the ability to change is a business-driven roadmap. It provides all involved with a common language, a common set of goals and a target vision. This roadmap is not a series of hurdles that must be delivered, but rather a direction of travel towards the target, allowing us to assess the impact of course corrections as we go and ensure we are still capable of arriving at our destination. This is how we create an agile environment, where tactical changes are simple course corrections that continue in the right direction of travel.

Glue Reply provides a range of architecture services to our retail clients, from capability led planning to practical development of integration solutions. For example, we produced a five-year roadmap for Sainsbury’s, which allows IT investment to combine longer-term foundation projects with short-term initiatives that can respond rapidly to customer demand.

Are there issues specific to the retail sector that are particularly challenging to deal with in creating an architecture and why?

Retail is a very seasonal business—sometimes this leaves a very small window for business improvements. This also exaggerates the differences in the business and IT lifecycles. The business strategy can change at a pace often driven by external factors, whilst elements of IT have a lifespan of many years. This is why we need a roadmap—to assess the impact of these changes and re-plan and prioritize our activities.

Are there some retailers that you think are doing a good job of handling these technology challenges? Which ones are getting it right?

Our client John Lewis has just been named ‘Omnichannel Retailer of the Year’ at the World Retail Awards 2014. They have a vision, and they can assess the impact of change. We have seen similar success at Sainsbury’s, where initiatives such as brand match are brought to market with real pace and quality.

How can industry standards help to support the retail industry?

Where appropriate, we have used industry standards such as the ARTS (Association for Retail Technology Standards) data model to assist our clients in creating a version that is good enough. But mostly, we use our own business reference models, which we have built up over many years of experience working with a range of different retail businesses.

What can other industries learn from how retailers are incorporating architecture into their operations?

The principle of omnichannel has a lot of relevance for other consumer-facing organizations, as does retail’s focus on loyalty. It’s not about creating a sale stampede, it’s about the brand. Apple is clearly an excellent example—when people queue for hours to be the first to buy the new product, at a price that will only reduce over time. Some retailers are making great use of customer data and profiling. And above all, successful retailers understand three key architectural principles that will drive success in any other sector—keep it simple, drive value and execute well.

What can retailers do to continue to best meet customer expectations into the future?

It’s no longer about the channel, it’s about the conversation. We have worked with the biggest brands in Europe, helping them deliver multichannel solutions that consider the conversation. The retailer that enables this conversation will better understand their customers’ needs and build long-term relationships.

Daren Ward is a Partner at Reply in the UK. As well as being a practicing Enterprise Architect, Daren is responsible for the development of the Strategy and Architecture business and plays a key role in driving the growth of Reply in the UK. He is committed to helping organizations drive genuine business value from IT investments, working with both commercially focused business units and IT professionals. Daren has helped establish architecture practices at many organizations. Be it enterprise, solutions, integration or information architecture, he has helped these practices deliver real business value through capability-led architecture and business-driven roadmaps.

 

Richard Veryard is a Business Architect and author, specializing in capability-led planning, systems thinking and organizational intelligence. Last year, Richard joined Glue Reply as a senior consultant in the retail sector.

 


The Open Group London 2014 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

On a crisp October Monday in London yesterday, The Open Group hosted the first day of its event at Central Methodist Hall, Westminster. Almost 200 attendees from 32 countries explored how to “Empower Your Business; Enabling Boundaryless Information Flow™”.

Just across the way from another landmark in the form of Westminster Abbey, the day began with a welcome from Allen Brown, President and CEO of The Open Group, before Magnus Lindkvist, the Swedish trendspotter and futurologist, began his keynote on “Competition and Creation in Globulent Times”.

In a very thought-provoking talk, Magnus pondered how quickly the world now moves, declaring that we now live in a 47-hour world, where trends can spread quicker than ever before. Magnus argued that this was a result of an R&D process – rip off and duplicate – rather than organic innovation occurring in multiple places.

Magnus went on to consider the history of civilization, which he described as “nothing, nothing, a little bit, then everything”, as well as providing a comparison of vertical and horizontal growth. Magnus posited that while we are currently seeing a lot of horizontal growth globally (the replication of the same activity), there is very little vertical growth, or what he described as “magic”. Magnus argued that in business we are seeing companies less able to create because they are focusing so heavily on simply competing.

To counter this trend, Magnus told attendees that they should do the following in their day-to-day work:

  • Look for secrets – Whether it be for a certain skill or a piece of expertise that is as yet undiscovered but which could reap significant benefit
  • Experiment – Ensure that there is a place for experimentation within your organization, while practicing it yourself as well
  • Recycle failures – It’s not always the idea that is wrong, but the implementation, which you can try over and over again
  • Be patient and persistent – Give new ideas time and the good ones will eventually succeed

Following this session was the long anticipated launch of The Open Group IT4IT™ Forum, with Christopher Davis from the University of South Florida detailing the genesis of the group before handing over to Georg Bock from HP Software who talked about the Reference Architecture at the heart of the IT4IT Forum.

Hans Van Kesteren, VP & CIO of Global Functions at Shell, then went into detail about how his company has helped to drive the growth of the IT4IT Forum. Starting with an in-depth background to the company’s IT function, Hans described how as a provider of IT on a mass scale, the changing technology landscape has had a significant impact on Shell and the way it manages IT. He described how the introduction of the IT4IT Forum will help his organization and others like it to adapt to the convergence of technologies, allowing for a more dynamic yet structured IT department.

Subsequently Daniel Benton, Global Managing Director of IT Strategy at Accenture, and Georg Bock, Senior Director IT Management Software Portfolio Strategy at HP, provided their vision for the IT4IT Forum before a session where the speakers took questions from the floor. Those individuals heavily involved in the establishment of the IT4IT Forum received particular thanks from attendees for their efforts, as you can see in the accompanying picture.

In its entirety, the various presentations from the IT4IT Forum members provided a compelling vision for the future of the group. Watch this space for further developments now it has been launched.

(Photo: The Open Group IT4IT™ Forum Founding Members)

In the afternoon, the sessions were split into tracks illustrating the breadth of the material that The Open Group covers. On Monday this provided an opportunity for a range of speakers to present to attendees on topics from the architecture of banking to shaping business transformation. Key presenters included Thomas Obitz, Senior Manager, FSO Advisory Performance Improvement, EY, UK and Dr. Daniel Simon, Managing Partner, Scape Consulting, Germany.

The plenary and many of the track presentations are available at livestream.com.

The day concluded with an evening drinks reception within Central Hall Westminster, where attendees had the opportunity to catch up with acquaintances old and new. More to come on day two!

Join the conversation – @theopengroup #ogLON

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog and media relations. Loren has over 20 years of experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.


Discussing Enterprise Decision-Making with Penelope Everall Gordon

By The Open Group

Most enterprises today are in the process of jumping onto the Big Data bandwagon. The promise of Big Data, as we’re told, is that if every company collects as much data as they can—about everything from their customers to sales transactions to their social media feeds—executives will have “the data they need” to make important decisions that can make or break the company. Not collecting and using your data, as the conventional wisdom has it, can have deadly consequences for any business.

As is often the case with industry trends, the hype around Big Data contains both a fair amount of truth and a fair amount of fuzz. The problem is that within most organizations, that conventional wisdom about the power of data for decision-making is usually just the tip of the iceberg when it comes to how and why organizational decisions are made.

According to Penelope Gordon, a consultant for 1Plug Corporation who was recently a Cloud Strategist at Verizon and was formerly a Service Product Strategist at IBM, that’s why big “D” (Data) needs to be put back into the context of enterprise decision-making. Gordon, who spoke at The Open Group Boston 2014, in the session titled “Putting the D Back in Decision” with Jean-Francois Barsoum of IBM, argues that a focus on collecting a lot of data has the potential to get in the way of making quality decisions. This is, in part, due to the overabundance of data that’s being collected under the assumption that you never know where there’s gold to be mined in your data, so if you don’t have all of it at hand, you may just miss something.

Gordon says that assuming the data will make decisions obvious also ignores the fact that ultimately decisions are made by people—and people usually make decisions based on their own biases. According to Gordon, there are three natural decision-making styles—heart, head and gut—corresponding to different personality types. The greater the amount of data, the less likely it is to balance a person’s natural decision-making style.

Head types, Gordon says, naturally make decisions based on quantitative evidence. But head types often put off making a decision until more data can be collected, wanting more and more data so that they can make the best decision based on the facts. She cites former President Bill Clinton as a classic example of this type: during his presidency, he was famous for putting off decision-making in favor of gathering more and more data, she says. Relying solely on quantitative data also means you may miss out on other important factors in making optimal decisions based on either heart (qualitative) or instinct. Conversely, a gut type presented with too much data will likely just end up ignoring the data and acting on instinct, much like former President George W. Bush, Gordon says.

Gordon believes part of the reason that data and decisions are more disconnected than one might think is because IT and Marketing departments have become overly enamored with what technology can offer. These data providers need to step back and first examine the decision objectives as well as the governance behind those decisions. Without understanding the organization’s decision-making processes and the dynamics of the decision-makers, it can be difficult to make optimal and effective strategic recommendations, she says, because you don’t have the full picture of what the stakeholders will or will not accept in terms of a recommendation, data or no data.

Ideally, Gordon says, you want to get to a point where you can get to the best decision outcome possible by first figuring out the personal and organizational dynamics driving decisions within the organization, shifting the focus from the data to the decision for which the data is an input.

“…what you’re trying to do is get the optimal outcome, so your focus needs to be on the outcome, so when you’re collecting the data and assessing the data, you also need to be thinking about ‘how am I going to present this data in a way that it is going to be impactful in improving the decision outcomes?’ And that’s where the governance comes into play,” she said.

Governance is of particular importance now, Gordon says, because decisions are increasingly being made by individual departments, such as when departments buy their own cloud-enabled services, such as sales force automation. In that case, an organization needs to have a roadmap in place with compensation to incent decision-makers to adhere to that roadmap and decision criteria for buying decisions, she said.

Gordon recommends that companies put in place 3-5 top criteria for each decision that needs to be made so that you can ensure that the decision objectives are met. This distillation of the metrics gives decision-makers a more comprehensible picture of their data so that their decisions don’t become either too subjective or disconnected from the data. Lower levels of metrics can be used underneath each of those top-level criteria to facilitate a more nuanced valuation. For example, if an organization needing to find good partner candidates scored and ranked (preferably in tiers) potential partners using decision criteria based on the characteristics of the most attractive partner, rather than just assuming that companies with the best reputation or biggest brands will be the best, then they will expeditiously identify the optimal partner candidates.
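As a hedged illustration of that approach, the sketch below scores hypothetical partner candidates against a handful of weighted criteria and ranks them into tiers. The criteria, weights, scores and tier cut-offs are all invented for demonstration.

```python
# Sketch of scoring candidates against 3-5 explicit, weighted decision criteria
# and ranking them into tiers, so reputation alone does not decide the outcome.
# All names, weights and scores are illustrative assumptions.

CRITERIA_WEIGHTS = {          # top-level criteria; weights sum to 1.0
    "strategic fit": 0.4,
    "technical capability": 0.3,
    "cultural alignment": 0.2,
    "cost to engage": 0.1,
}

candidates = {                # scores on a 1-5 scale per criterion
    "Acme Corp": {"strategic fit": 4, "technical capability": 5,
                  "cultural alignment": 3, "cost to engage": 2},
    "BigBrand":  {"strategic fit": 2, "technical capability": 3,
                  "cultural alignment": 2, "cost to engage": 4},
    "NichePlay": {"strategic fit": 5, "technical capability": 4,
                  "cultural alignment": 4, "cost to engage": 3},
}

def tier(score):
    """Bucket a weighted score into a tier (cut-offs chosen arbitrarily)."""
    if score >= 4.0:
        return "Tier 1"
    if score >= 3.0:
        return "Tier 2"
    return "Tier 3"

for name, scores in candidates.items():
    total = sum(CRITERIA_WEIGHTS[c] * scores[c] for c in CRITERIA_WEIGHTS)
    print(f"{name}: {total:.1f} ({tier(total)})")
# NichePlay (4.3) outranks the bigger brand (2.5): the criteria, not
# reputation, decide.
```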

One of the reasons that companies have gotten so concerned with collecting and storing data rather than just making better decisions, Gordon believes, is that business decisions have become inherently more risky. The required size of investment is increasing in tandem with an increase in the time to return; time to return is a key determinant of risk. Data helps people feel like they are making competent decisions but in reality does little to reduce risk.

“If you’ve got lots of data, then the thinking is, ‘well, I did the best that I could because I got all of this data.’ People are worried that they might miss something,” she said. “But that’s where I’m trying to come around and say, ‘yeah, but going and collecting more data, if you’ve got somebody like President Clinton, you’re just feeding into their tendency to put off making decisions. If you’ve got somebody like President Bush, you’re feeding into their tendency to ignore it, and there may be some really good information, good recommendations, they’re ignoring.’”

Gordon also says that having all the data possible to work with isn’t usually necessary—generally a representative sample will do. For example, she says, the U.S. Census Bureau takes the approach of trying to count every citizen; consequently certain populations are chronically undercounted and inaccuracies pass undetected. The Canadian census, on the other hand, uses representative samples and thus tends to be much more accurate—and much less expensive to conduct. Organizations should also think about how they can find representative or “proxy” data in cases where collecting data that directly addresses a top-level decision criterion isn’t really practical.
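The sampling point is easy to demonstrate. In this minimal sketch, with a synthetic population and numbers assumed purely for illustration, a 1% representative sample estimates a population proportion to within a fraction of a percent of the full count.

```python
import random

random.seed(42)
# Synthetic population: roughly 30% of one million "citizens" have some trait.
population = [random.random() < 0.3 for _ in range(1_000_000)]

census = sum(population) / len(population)    # count everyone
sample = random.sample(population, 10_000)    # 1% representative sample
estimate = sum(sample) / len(sample)

print(f"census: {census:.3f}, sample estimate: {estimate:.3f}")
# The two typically agree to within a fraction of a percent; for most
# decisions the far cheaper sample is plenty.
```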

To begin shifting the focus from collecting data inputs to improving decision outcomes, Gordon recommends clearly stating the decision objectives for each major decision and then identifying and defining the 3-5 criteria that are most important for achieving the decision objectives. She also recommends ensuring that there is sufficient governance and a process in place for making decisions including mechanisms for measuring the performance of the decision-making process and the outcomes resulting from the execution of that process. In addition, companies need to consider whether their decisions are made in a centralized or decentralized manner and then adapt decision governance accordingly.

One way that Enterprise Architects can help to encourage better decision-making within the organizations in which they work is to help in developing that governance rather than just providing data or data architectures, Gordon says. They should help stakeholders identify and define the important decision criteria, determine when full population rather than representative sampling is justified, recognize better methods for data analysis, and form decision recommendations based on that analysis. By gauging the appropriate blend of quantitative and qualitative data for a particular decision maker, an Architect can moderate gut types’ reliance on instinct and stimulate head and heart types’ intuition – thereby producing an optimally balanced decision. Architects should help lead and facilitate execution of the decision process, as well as help determine how data is presented within organizations in order to support the recommendations with the highest potential for meeting the decision objectives.

Join the conversation – #ogchat

Penelope Gordon recently led the expansion of the channel and service packaging strategies for Verizon’s cloud network products. Previously she was an IBM Strategist and Product Manager bringing emerging technologies such as predictive analytics to market. She helped to develop one of the world’s first public clouds.


Filed under architecture, Conference, Data management, Enterprise Architecture, Governance, Professional Development, Uncategorized

Q&A with Allen Brown, President and CEO of The Open Group

By The Open Group

Last month, The Open Group hosted its San Francisco 2014 conference themed “Toward Boundaryless Information Flow™.” Boundaryless Information Flow has been the pillar of The Open Group’s mission since 2002 when it was adopted as the organization’s vision for Enterprise Architecture. We sat down at the conference with The Open Group President and CEO Allen Brown to discuss the industry’s progress toward that goal and the industries that could most benefit from it now as well as The Open Group’s new Dependability through Assuredness™ Standard and what the organization’s Forums are working on in 2014.

The Open Group adopted Boundaryless Information Flow as its vision in 2002, and the theme of the San Francisco conference has been “Toward Boundaryless Information Flow.” Where do you think the industry is at this point in progressing toward that goal?

Well, it’s progressing reasonably well, but the challenge is, of course, that when we established that vision back in 2002, life was a little less complex, a little bit less fast-moving, a little bit less fast-paced. Although organizations are improving the way that they act in a boundaryless manner – and of course that changes by industry – some industries still have big silos and stovepipes; they still have big boundaries. But generally speaking we are moving forward, and everyone understands the need for information to flow in a boundaryless manner, for people to be able to access and integrate information and to provide it to the teams that need it.

One of the keynotes on Day One focused on the opportunities within the healthcare industry, and The Open Group recently started a Healthcare Forum. Do you see the Healthcare industry as a test case for Boundaryless Information Flow, and why?

Healthcare is one of the verticals that we’ve focused on. It is not so much a test case as an area that absolutely needs information to flow in a boundaryless manner, so that everyone involved – from the patient through the administrator through the medical teams – has access to the right information at the right time. We know that in many situations there are shifts of medical teams, and from one medical team to another they don’t have access to the same information. Information isn’t easily shared between medical doctors, hospitals and payers. What we’re trying to do is focus on the needs of the patient and improve the information flow so that you get better outcomes for the patient.

Are there other industries where this vision might be enabled sooner rather than later?

I think that we’re already making significant progress in what we call the Exploration, Mining and Minerals industry. Our EMMM™ Forum has produced an industry-wide model that is being adopted throughout that industry. We’re also looking at whether we can have an influence in the airline, automotive and manufacturing industries. There are many, many others, government and retail included.

The plenary on Day Two of the conference focused on The Open Group’s Dependability through Assuredness standard, which was released last August. Why is The Open Group looking at dependability and why is it important?

Dependability is ultimately what you need from any system – you need to be able to rely on that system to perform when needed. Systems are becoming more complex and they’re becoming bigger. We’re not just thinking about the things that arrive on the desktop; we’re thinking about systems like the barriers at subway or Tube stations – systems that operate any number of complex activities. And they bring an awful lot of things together that you have to rely upon.

Now in all of these systems, what we’re trying to do is to minimize the amount of downtime, because downtime can result in financial loss or, at worst, the loss of human life, and we’re trying to focus on that. What is interesting about the Dependability through Assuredness Standard is that it brings together so many other aspects of what The Open Group is working on. Obviously the architecture is at the core, so it’s critical that there’s an architecture and that we understand the requirements of that system. It’s also critical that we understand the risks, so that fits in with the work of the Security Forum and the work they’ve done on Risk Analysis and Dependency Modeling. Out of the dependency modeling we can get the use cases, so that we can understand where the vulnerabilities are, what action has to be taken if we identify a vulnerability, and what action needs to be taken in the event of a failure of the system. If we do that and assign accountability to people – who will do what by when in the event of an anomaly being detected or a failure happening – we can actually minimize that downtime or remove it completely.
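
As a rough illustration of that ‘who will do what by when’ idea – the field names and the scenario below are invented, and this is not drawn from the published standard – an accountability assignment might be modeled like this:

```python
# Hypothetical sketch of assigning pre-agreed accountability when an
# anomaly is detected; all names and values are illustrative assumptions.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class AccountabilityAssignment:
    anomaly: str        # a vulnerability or failure mode from dependency modeling
    owner: str          # who is accountable
    action: str         # what they will do
    deadline: datetime  # by when

def on_anomaly_detected(anomaly: str) -> AccountabilityAssignment:
    """Map a detected anomaly to its pre-agreed owner, action and window."""
    return AccountabilityAssignment(
        anomaly=anomaly,
        owner="payments-ops team",
        action="fail over to the standby card-processing path",
        deadline=datetime.now() + timedelta(minutes=30),
    )

assignment = on_anomaly_detected("card-processing latency above threshold")
print(f"{assignment.owner} will '{assignment.action}' by {assignment.deadline:%H:%M}")
```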

The other great thing about this is that it’s not only a focus on the architecture for the actual system development – as the system changes over time, requirements change, legislation that might affect it changes, and external changes all feed into that system – but there’s also another cycle within the standard that deals with failure, analyzes it and makes sure it doesn’t happen again. And there have been so many instances of failure recently. In the UK, for example, a bank was recently unable to process debit cards or credit cards for customers for about three or four hours, and that was probably caused by work done on a routine basis over a weekend. If Dependability through Assuredness had been in place, that could have been averted, and it could have saved an awful lot of difficulty for an awful lot of people.

How does the Dependability through Assuredness Standard also move the industry toward Boundaryless Information Flow?

It’s part of it. With big systems it’s critical that the information flows. But this is not so much about the information itself as about how a system is going to work in a dependable manner.

Business Architecture was another featured topic in the San Francisco plenary. What role can Business Architecture play in enterprise transformation vis-à-vis Enterprise Architecture as a whole?

A lot of people in the industry are talking about Business Architecture right now and trying to focus on it as a separate discipline. We see it as a fundamental part of Enterprise Architecture. In fact, there are three legs to Enterprise Architecture: there’s Business Architecture; there’s the need for business analysts, who are critical to supplying the information; and then there are the solution architects and the other architects – data, applications and so on – that are needed. All three legs are needed.

We find that there are two or three different types of Business Architect. There are those who use the analysis to understand what the business is doing so that they can inform the solution architects and other architects for the development of solutions. There are those who are more integrated with the business, who can understand what is going on and provide input into how that might be improved through technology. And there are those who can actually go another step and say, ‘Here we have the advances in the technology, and here are the opportunities for advancing our competitiveness as an organization.’

What are some of the other key initiatives that The Open Group’s forum and work groups will be working on in 2014?

That kind of question is like accepting an award – you’ve got to thank your friends – so apologies to anyone I leave out. Let me start alphabetically, with the Architecture Forum. The Architecture Forum is obviously working on the evolution of TOGAF®, and they’re also working on the harmonization of TOGAF® with ArchiMate®; they have a number of projects within that, and of course Business Architecture is one of the projects going on in the Architecture space. The ArchiMate Forum is pushing ahead with ArchiMate®. They’ve got two interesting activities going on at the moment. One is called ArchiMetals, which is going to be a sister publication to the ArchiSurance case study: where ArchiSurance provides an example of how ArchiMate is used in the insurance industry, ArchiMetals is going to be set in a manufacturing context, so there will be a whitepaper on that and there will be examples and artifacts that we can use. They’re also working on an ArchiMate standard for interoperability between modeling tools. There are four tools that are accredited and certified by The Open Group right now, and we’re looking for that interoperability to help organizations that have multiple tools, as many of them do.

Going down the alphabet, there’s DirecNet™. Not many people know about DirecNet, but it is work that we do with the U.S. Navy on standards for long-range, high-bandwidth mobile networking. Then we can go to the FACE™ Consortium, the Future Airborne Capability Environment. The FACE Consortium is working on the next version of its standard and toward accreditation and a certification program, and the uptake of that through procurement is absolutely amazing – we’re thrilled about that.

Healthcare we’ve talked about. Then there’s The Open Group Trusted Technology Forum, which is working on how we can trust the supply chain for developed systems. They’ve released the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program, which was launched this week, and we already have one accredited vendor and two certified assessment labs. That is really exciting, because now we’ve got a way of helping any organization that has large, complex systems developed through a global supply chain to make sure that it can trust its supply chain. And that is going to be invaluable to many industries, but also to the safety of citizens and the infrastructure of many countries. The other part of this is that we are planning to move the O-TTPS standard toward ISO standardization shortly.

The next one moving down the list would be Open Platform 3.0™. This is a really exciting part of Boundaryless Information Flow, it really is. This is about the convergence of SOA, Cloud, Social, Mobile, the Internet of Things and Big Data; bringing all of those activities together is something that is really critical right now and that we need to focus on. In some of these areas, our Cloud computing standards have already gone to ISO and been adopted by ISO, and we’re working right now on the next products that are going to move through. We have a governance standard in process, and an ecosystem standard has recently been published. In the area of Big Data there’s a whitepaper that’s 25 percent complete, and there’s also a lot of work on the definition of what Open Platform 3.0 is – this week the members have been working on trying to define it. One of the really interesting activities that’s gone on is that the members of the Open Platform 3.0 Forum have produced something like 22 different use cases, and they’re really good. They’re concise and precise, and they cover a number of different industries, including healthcare and others. The next stage is to look at those and work on the ROI of those use cases – the monetization, the value from them. That’s really exciting, and I’m looking forward to peeking at that from time to time.

The Real-Time and Embedded Systems Forum (RTES) is next. RTES is where we incubated the Dependability through Assuredness Framework, and it is continuing to develop there, which is really good. The core focus of the RTES Forum is high assurance systems; they’re doing some work with ISO on that, as well as a lot of work in other areas such as multicore. And, of course, they have a number of EC projects where we’re partnering with others in the EC around RTES.

The Security Forum, as I mentioned earlier, has done a lot of work on risk and dependability. They’ve not only got their standards for the Risk Taxonomy and Risk Analysis, but they’ve now also developed the Open FAIR Certification for People, which is based on those two standards. And we’re already starting to see people being trained and certified under that Open FAIR Certification Program that the Security Forum developed.

A lot of other activities are going on. Like I said, I probably left a lot of things out, but I hope that gives you a flavor of what’s going on in The Open Group right now.

The Open Group will be hosting a summit in Amsterdam May 12-14, 2014. What can we look forward to at that conference?

In Amsterdam we have a summit that’s going to bring together a lot of things – it’s going to be a bigger conference than we had here. We’ve got a lot going on across all of our activities, and we’re going to bring together top-level speakers, so we’re looking forward to some interesting work during that week.


Filed under ArchiMate®, Boundaryless Information Flow™, Business Architecture, Conference, Cybersecurity, EMMMv™, Enterprise Architecture, FACE™, Healthcare, O-TTF, RISK Management, Standards, TOGAF®