Monthly Archives: July 2011

How strategic planning relates to Enterprise Architecture

By Serge Thorn, Architecting the Enterprise

TOGAF® often refers to Strategic Planning without specifying the details of what it consists of. This article explains why Strategic Planning and Enterprise Architecture fit together so well.

Strategic Planning means different things to different people. The one constant is its reference to Business Planning, which usually occurs annually in most companies. One of the activities of this exercise is the consideration of the portfolio of projects for the following financial year, also referred to as Project Portfolio Management (PPM). This activity may also be triggered when a company modifies its strategy or the priority of its current developments.

Drivers for Strategic Planning may include:

  • New products or services
  • A need for greater business flexibility and agility
  • Mergers and acquisitions
  • Company reorganization
  • Consolidation of manufacturing plants, lines of business, partners, information systems
  • Cost reduction
  • Risk mitigation
  • Business process management initiatives
  • Business process outsourcing
  • Facilities outsourcing or insourcing
  • Off-shoring

Strategic Planning as a process may include activities such as:

1. The definition of the mission and objectives of the enterprise

Most companies have a mission statement depicting the business vision, the purpose and value of the company and the visionary goals to address future opportunities. With that business vision, the board of the company defines the strategic (e.g. reputation, market share) and financial objectives (e.g. earnings growth, sales targets).

2. Environmental analysis

The environmental analysis may include the following activities:

  • Internal analysis of the enterprise
  • Analysis of the enterprise’s industry
  • A PEST Analysis (Political, Economic, Social, and Technological factors). It is very important that an organization considers its environment before beginning the marketing process. In fact, environmental analysis should be continuous and feed all aspects of planning, identifying strengths, weaknesses, opportunities, and threats (SWOT)

3. Strategy definition

Based on the previous activities, the enterprise matches strengths to opportunities, addresses its weaknesses and external threats, and elaborates a strategic plan. This plan may then be refined at different levels in the enterprise. Below is a diagram explaining the various levels of plans.

To build that strategy, an Enterprise Strategy Model may be used to represent the enterprise's situation accurately and realistically for both past and future views. This can be based on Business Motivation Modeling (BMM), which supports developing, communicating and managing a Strategic Plan. Another possibility is the Business Model Canvas, which allows a company to develop and sketch out new or existing business models (refer to the work of Alexander Osterwalder).

The model’s analyses should consider important strategic variables such as customer demand expectations, pricing and elasticity, competitor behavior, emissions regulations, and future input and labor costs.

These variables are then mapped to the most important business processes (capacity, business capabilities, constraints) and to economic performance, to determine the best decision for each scenario. The strategic model can be based on business processes such as customer, operation or background processes. Scenarios can then be segmented and analyzed by customer, product portfolio, network redesign, long-term recruiting and capacity, and mergers and acquisitions to describe Segment Business Plans.

4. Strategy Implementation

The selected strategy is implemented by means of programs, projects, budgets, processes and procedures. The way in which the strategy is implemented can have a significant impact on whether it will be successful, and this is where Enterprise Architecture may have a significant role to play. Often, the people formulating the strategy are different from those implementing it. The way the strategy is communicated is a key element of the success and should be clearly explained to the different layers of management including the Enterprise Architecture team.

To support that strategy, different levels of architecture can be considered, such as strategic, segment or capability architectures.

This diagram below illustrates different examples of new business capabilities linked to a Strategic Architecture.

It also illustrates how Strategic Architecture supports the enterprise’s vision and the strategic plan communicated to an Enterprise Architecture team.

Going to the next level allows better detailing of the various deliverables and the associated new business capabilities. The segment architecture maps perfectly to the Segment Business Plan.

5. Evaluation and monitoring

The implementation of the strategy must be monitored and adjustments made as required.

Evaluation and monitoring consists of the following steps:

  • Define KPIs, measurements and metrics
  • Define target values for these KPIs
  • Perform measurements
  • Compare measured results to the pre-defined targets
  • Make the necessary changes
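The five monitoring steps above amount to a simple compare-and-flag loop. The sketch below illustrates it; the KPI names, target values and measurements are purely illustrative assumptions, not taken from the article.

```python
def evaluate_kpis(targets, measurements):
    """Compare measured KPI values to their pre-defined targets and
    return the KPIs that need corrective action (steps 4 and 5)."""
    needs_change = []
    for kpi, target in targets.items():
        measured = measurements.get(kpi)
        # A missing measurement or a below-target value triggers a change
        if measured is None or measured < target:
            needs_change.append(kpi)
    return needs_change

# Steps 1-2: define KPIs and their target values (hypothetical figures)
targets = {"market_share_pct": 12.0, "earnings_growth_pct": 5.0}
# Step 3: perform measurements
measurements = {"market_share_pct": 13.1, "earnings_growth_pct": 3.8}
# Steps 4-5: compare to the pre-defined targets and identify changes
print(evaluate_kpis(targets, measurements))  # prints ['earnings_growth_pct']
```

In practice the comparison rule would vary per KPI (some metrics, such as cost, should stay below their target), but the monitoring loop keeps the same shape.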

Strategic Planning and Enterprise Architecture should ensure that information systems do not operate in a vacuum. At its core, TOGAF® 9 builds on a strong set of guidelines promoted in the previous version and surrounds them with guidance on how to adopt and apply TOGAF® to the enterprise for Strategic Planning initiatives. The ADM diagram below clearly indicates the integration between the two processes.

The company’s mission and vision must be communicated to the Enterprise Architecture team which then maps Business Capabilities to the different Business Plans levels.

Many Enterprise Architecture projects are focused at lower levels but should be aligned with Strategic Corporate Planning. Enterprise Architecture is a critical discipline, one Strategic Planning mechanism for structuring an enterprise. TOGAF® 9 is without doubt an effective framework for working with stakeholders through Strategic Planning and architecture work, especially for organizations that are actively transforming themselves.

This article has previously appeared in Serge Thorn’s personal blog and appears here with his permission.

Serge Thorn is CIO of Architecting the Enterprise.  He has worked in the IT Industry for over 25 years, in a variety of roles, which include; Development and Systems Design, Project Management, Business Analysis, IT Operations, IT Management, IT Strategy, Research and Innovation, IT Governance, Architecture and Service Management (ITIL). He has more than 20 years of experience in Banking and Finance and 5 years of experience in the Pharmaceuticals industry. Among various roles, he has been responsible for the Architecture team in an international bank, where he gained wide experience in the deployment and management of information systems in Private Banking, Wealth Management, and also in IT architecture domains such as the Internet, dealing rooms, inter-banking networks, and Middle and Back-office. He then took charge of IT Research and Innovation (a function which consisted of motivating, encouraging creativity, and innovation in the IT Units), with a mission to help to deploy a TOGAF based Enterprise Architecture, taking into account the company IT Governance Framework. He also chaired the Enterprise Architecture Governance worldwide program, integrating the IT Innovation initiative in order to identify new business capabilities that were creating and sustaining competitive advantage for his organization. Serge has been a regular speaker at various conferences, including those by The Open Group. His topics have included, “IT Service Management and Enterprise Architecture”, “IT Governance”, “SOA and Service Management”, and “Innovation”. Serge has also written several articles and whitepapers for different magazines (Pharma Asia, Open Source Magazine). He is the Chairman of the itSMF (IT Service Management forum) Swiss chapter and is based in Geneva, Switzerland.


Filed under Enterprise Architecture, TOGAF®

PODCAST: Industry moves to fill gap for building trusted supply chain technology accreditation

By Dana Gardner, Interarbor Solutions

Listen to this recorded podcast here: BriefingsDirect-IT Industry Looks to Open Trusted Technology Forum to Help Secure Supply Chains That Support Technology Products

The following is the transcript of a sponsored podcast panel discussion on how the OTTF is developing an accreditation process for trusted technology, in conjunction with The Open Group Conference, Austin 2011.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect. Today, we present a sponsored podcast discussion in conjunction with The Open Group Conference in Austin, Texas, the week of July 18, 2011.

We’ve assembled a distinguished panel to update us on The Open Group Trusted Technology Forum, also known as the OTTF, and an accreditation process to help technology acquirers and buyers safely conduct global procurement and supply chain commerce. We’ll examine how the security risk for many companies and organizations has only grown, even as these companies form essential partnerships and integral supplier relationships. So, how can all the players in a technology ecosystem gain assurances that the other participants are adhering to best practices and taking the proper precautions?

Here to help us better understand how established standard best practices and an associated accreditation approach can help make supply chains stronger and safer is our panel. We’re here with Dave Lounsbury, the Chief Technical Officer at The Open Group. Welcome back, Dave.

Dave Lounsbury: Hello Dana. How are you?

Gardner: Great. We are also here with Steve Lipner, Senior Director of Security Engineering Strategy in the Trustworthy Computing Security Group at Microsoft. Welcome back, Steve.

Steve Lipner: Hi, Dana. Glad to be here.

Gardner: We’re here also with Joshua Brickman, Director of the Federal Certification Program Office at CA Technologies. Welcome, Joshua.

Joshua Brickman: Thanks for having me.

Gardner: And, we’re here too with Andras Szakal. He’s the Vice President and CTO of IBM’s Federal Software Group. Welcome back, Andras.

Andras Szakal: Thank you very much, Dana. I appreciate it.

Gardner: Dave, let’s start with you. We’ve heard so much lately about “hacktivism,” break-ins, and people being compromised. These are some very prominent big companies, both public and private. How important is it that we start to engage more with things like the OTTF?

No backup plan

Lounsbury: Dana, a great quote coming out of this week’s conference was that we have moved the entire world’s economy to being dependent on the Internet, without a backup plan. Anyone who looks at the world economy will see, not only are we dependent on it for exchange of value in many cases, but even information about how our daily lives are run, traffic, health information, and things like that. It’s becoming increasingly vitally important that we understand all the aspects of what it means to have trust in the chain of components that deliver that connectivity to us, not just as technologists, but as people who live in the world.

Gardner: Steve Lipner, your thoughts on how this problem seems to be only getting worse?

Lipner: Well, the attackers are becoming more determined and more visible across the Internet ecosystem. Vendors have stepped up to improve the security of their product offerings, but customers are concerned. A lot of what we’re doing in The Open Group and in the OTTF is about trying to give them additional confidence of what vendors are doing, as well as inform vendors what they should be doing.

Gardner: Joshua Brickman, this is obviously a big topic and a very large and complex area. From your perspective, what is it that the OTTF is good at? What is it focused on? What should we be looking to it for in terms of benefit in this overall security issue?

Brickman: One of the things that I really like about this group is that you have all of the leaders, everybody who is important in this space, working together with one common goal. Today, we had a discussion where one of the things we were thinking about is, whether there’s a 100 percent fail-safe solution to cyber? And there really isn’t. There is just a bar that you can set, and the question is how much do you want to make the attackers spend, before they can get over that bar? What we’re going to try to do is establish that level, and working together, I feel very encouraged that we are getting there, so far.

Gardner: Andras, we are not just trying to set the bar, but we’re also trying to enforce, or at least have clarity into, what other players in an ecosystem are doing. So that accreditation process seems to be essential.

Szakal: We’re going to develop a standard, or are in the process of developing a specification and ultimately an accreditation program, that will validate suppliers and providers against that standard. It’s focused on building trust into a technology provider organization through this accreditation program, facilitated through either one of several different delivery mechanisms that we are working on. We’re looking for this to become a global program, with global partners, as we move forward.

Gardner: It seems as if almost anyone is a potential target, and when someone decides to target you, you do seem to suffer. We’ve seen things with Booz Allen, RSA, and consumer organizations like Sony. Is this something that almost everyone needs to be more focused on? Are we at the point now where there is no such thing as turning back, Dave Lounsbury?

Global effort

Lounsbury: I think there is, and we have talked about this before. Any electronic or information system now is really built on components and software that are delivered from all around the globe. We have software that’s developed in one continent, hardware that’s developed in another, integrated in a third, and used globally. So, we really do need to have the kinds of global standards and engagement that Andras has referred to, so that there is that one bar for all to clear in order to be considered as a provider of trusted components.

Gardner: As we’ve seen, there is a weak link in any chain, and the hackers or the cyber criminals or the state sponsored organizations will look for those weak links. That’s really where we need to focus.

Lounsbury: I would agree with that. In fact, some of the other outcomes of this week’s conference have been the change in these attacks, from just nuisance attacks, to ones that are focused on monetization of cyber crimes and exfiltration of data. So the spectrum of threats is increasing a lot. More sophisticated attackers are looking for narrower and narrower attack vectors each time. So we really do need to look across the spectrum of how this IT technology gets produced in order to address it.

Gardner: Steve Lipner, it certainly seems that the technology supply chain is essential. If there is weakness there, then it’s difficult for the people who deploy those technologies to cover their bases. It seems that focusing on the technology providers, the ecosystems that support them, is a really necessary first step to taking this to a larger, either public or private, buyer side value.

Lipner: The tagline we have used for The Open Group TTF is “Build with Integrity, Buy with Confidence.” We certainly understand that customers want to have confidence in the hardware and software of the IT products that they buy. We believe that it’s up to the suppliers, working together with other members of the IT community, to identify best practices and then articulate them, so that organizations up and down the supply chain will know what they ought to be doing to ensure that customer confidence.

Gardner: Let’s take a step back and get a little bit of a sense of where this process that you are all involved with is. I know you’re all on working groups and in other ways involved in moving this forward, but it’s been about six months now since the OTTF was developed initially, and there was a white paper to explain that. Perhaps one of you will volunteer to give us sort of a state of affairs on where things are. Then, we’d also like to hear an update about what’s been going on here in Austin. Anyone?

Szakal: Well, as the chair, I have the responsibility of keeping track of our milestones, so I’ll take that one. We completed the white paper earlier this year, in the first quarter. The white paper was visionary in nature, and it was obviously designed to help our constituents understand the goals of the OTTF. However, in order to actually make this a normative specification and design a program, around which you would have conformance and be able to measure suppliers’ conformity to that specification, we have to develop a specification with normative language.

First draft

We’re finishing that up as we speak and we are going to have a first draft here within the next month. We’re looking to have that entire specification go through company review in the fourth quarter of this year.

Simultaneously, we’ll be working on the accreditation policy and conformance criteria and evidence requirements necessary to actually have an accreditation program, while continuing to liaise with other evaluation schemes that are interested in partnering with us. In a global international environment, that’s very important, because more than one of these regimes exists, and we will have to coexist and partner with them. Over the next year, we’ll have completed the accreditation program and have begun testing of the process, probably having to make some adjustments along the way. We’re looking at sometime within the first half of 2012 for having a completed program to begin ramping up.

Gardner: Is there an update on the public sector’s, or in the U.S., the federal government’s, role in this? Are they active? Are they leading? How would you characterize the public role or where you would like to see that go?

Szakal: The Forum itself continues to liaise with the government and all of our constituents. As you know, we have several government members that are part of the TTF and they are just as important as any of the other members. We continue to provide updates to many of the governments that we are working with globally to ensure they understand the goals of the OTTF and how they can provide value synergistically with what we are doing, as we would to them.

Gardner: I’ll throw this back out to the panel. How about the activities this week at the conference? What progress or insights can you point to from that?

Brickman: We’ve been meeting for the first couple of days and we have made tremendous progress on wrapping up our framework and getting it ready for the first review. We’ve also been meeting with several government officials. I can’t say who they are, but what’s been good about it is that they’re very positive on the work that we’re doing, they support what we are doing and want to continue this discussion. It’s very much a partnership, and we do feel like it’s not just an industry-led project, where we have participation from folks who could very much be the consumers of this initiative.

Gardner: Clearly, there are a lot of stakeholders around the world, across both the public and private domains. Dave Lounsbury, what’s possible? What would we gain if this is done correctly? How would we tangibly look to improvements? I know that’s hard with security. It’s hard to point out what doesn’t happen, which is usually the result of proper planning, but how would you characterize the value of doing this all correctly say a year or two from now?

Awareness of security

Lounsbury: One of the trends we’ll see is that people are increasingly going to be making decisions about what technology to produce and who to partner with, based on more awareness of security.

A very clear possible outcome is that there will be a set of simple guidelines and ones that can be implemented by a broad spectrum of vendors, where a consumer can look and say, “These folks have followed good practices. They have baked secure engineering, secure design, and secure supply chain processes into their thing, and therefore I am more comfortable in dealing with them as a partner.”

Of course, what that means is that, not only do you end up with more confidence in your supply chain and the components for getting to that supply chain, but also it takes a little bit of work off your plate. You don’t have to invest as much in evaluating your vendors, because you can use commonly available and widely understood best practices.

From the vendor perspective, it’s helpful because we’re already seeing places where a company, like a financial services company, will go to a vendor and say, “We need to evaluate you. Here’s our checklist.” Of course, the vendor would have to deal with many different checklists in order to close the business, and this will give them some common starting point.

Of course, everybody is going to customize and build on top of what that minimum bar is, depending on what kind of business they’re in. But at least it gives everybody a common starting point, a common reference point, some common vocabulary for how they are going to talk about how they do those assessments and make those purchasing decisions.

Gardner: Steve Lipner, do you think that this is going to find its way into a lot of RFPs, beginning a sales process, looking to have a major checkbox around these issues? Is that sort of how you see this unfolding?

Lipner: If we achieve the sort of success that we are aiming for and anticipating, you’ll see requirements for the OTTF, not only in RFPs, but also potentially in government policy documents around the world, basically aiming to increase the trust of broad collections of products that countries and companies use.

Gardner: Joshua Brickman, I have to imagine that this is a living type of an activity that you never really finish. There’s always something new to be done, a type of threat that’s evolving that needs to be reacted to. Would the TTF over time take on a larger role? Do you see it expanding into larger set of requirements, even as it adjusts to the contemporary landscape?

Brickman: That’s possible. I think that we are going to try to get something achievable out there in a timeframe that’s useful and see what sticks. One of the things that will happen is that as companies start to go out and test this, as with any other standard, the 1.0 standard will evolve to something that will become more germane, and as Steve said, will hopefully be adopted worldwide.

Agile and useful

It’s absolutely possible. It could grow. I don’t think anybody wants it to become a behemoth. We want it to be agile, useful, and certainly something readable and achievable for companies that are not multinational billion dollar companies, but also companies that are just out there trying to sell their piece of the pie into the space. That’s ultimately the goal of all of us, to make sure that this is a reasonable achievement.

Lounsbury: Dana, I’d like to expand on what Joshua just said. This is another thing that has come out of our meetings this week. We’ve heard a number of times that governments, of course, feel the need to protect their infrastructure and their economies, but also have a realization that, because of the rapid evolution of technology and the rapid evolution of security threats, it’s hard for them to keep up, and regulation is not really the right vehicle.

There really is a strong preference. The U.S. strategy on this is to let industry take the lead. One of the reasons for that is the fact that industry can evolve, in fact must evolve, at the pace of the commercial marketplace. Otherwise, they wouldn’t be in business.

So, we really do want to get that first stake in the ground and get this working, as Joshua said. But there is some expectation that, over time, the industry will drive the evolution of security practices and security policies, like the ones OTTF is developing at the pace of commercial market, so that governments won’t have to do that kind of regulation which may not keep up.

Gardner: Andras, any thoughts from your perspective on this ability to keep up in terms of market forces? How do you see the dynamic nature of this being able to be proactive instead of reactive?

Szakal: One of our goals is to ensure that the viability of the specification itself, the best practices, are updated periodically. We’re talking about potentially yearly. And to include new techniques and the application of potentially new technologies to ensure that providers are implementing the best practices for development engineering, secure engineering, and supply chain integrity. It’s going to be very important for us to continue to evolve these best practices over a period of time and not allow them to fall into a state of static disrepair.

I’m very enthusiastic, because many of the members are very much in agreement that this is something that needs to be happening in order to actually raise the bar on the industry, as we move forward, and help the entire industry adopt the practices and then move forward in our journey to secure our critical infrastructure.

Gardner: Given that this has the potential of being a fairly rapidly evolving standard that may start really appearing in RFPs and be impactful for real world business success, how should enterprises get involved from the buy side? How should suppliers get involved from the sell side, given that this is seemingly a market driven, private enterprise driven activity?

I’ll throw this out to the crowd. What’s the responsibility from the buyers and the sellers to keep this active and to keep themselves up-to-date?

Lounsbury: Let me take the first stab at this. The reason we’ve been able to make the progress we have is that we’ve got the expertise in security from all of these major corporations and government agencies participating in the TTF. The best way to maintain that currency and maintain that drive is for people who have a problem, if you’re on the buy side or expertise from either side, to come in and participate.

Hands-on awareness

You have got the hands-on awareness of the market, and bringing that in and adding that knowledge of what is needed to the specification and helping move its evolution along is absolutely the best thing to do.

That’s our steady state, and of course the way to get started on that is to go and look at the materials. The white paper is out there. I expect we will be doing snapshots of early versions of this that would be available, so people can take a look at those. Or, come to an Open Group Conference and learn about what we are doing.

Gardner: Anyone else have a reaction to that? I’m curious. Given that we are looking to the private sector and market forces to be the drivers of this, will they also be the drivers in terms of enforcement? Is this voluntary? One would hope that market forces reward those who seek accreditation and demonstrate adhesion to the standard, and that those who don’t would suffer. Or is there a potential for more teeth and more enforcement? Again, I’ll throw this out to the panel at large.

Szakal: As vendors, we would like to see minimal regulation, and that’s simply the nature of the beast. In order for us to conduct our business and lower the cost of market entry, I think that’s important.

I think it’s important that we provide leadership within the industry to ensure that we’re following the best practices to ensure the integrity of the products that we provide. It’s through that industry leadership that we will avoid potential damaging regulations across different regional environments.

We certainly wouldn’t want to see different regulations pop-up in different places globally. It makes for very messy technology insertion opportunity for us. We’re hoping that by actually getting engaged and providing some self-regulation, we won’t see additional government or international regulation.

Lipner: One of the things that my experience has taught me is that customers are very aware these days of security, product integrity, and the importance of suppliers paying attention to those issues. Having a robust program like the TTF and the certifications that it envisions will give customers confidence, and they will pay attention to that. That will change their behavior in the market even without formal regulations.

Gardner: Joshua Brickman, any thoughts on the self-regulation benefits? If that doesn’t work, is it self-correcting? Is there a natural mechanism whereby, if this doesn’t work at first, a couple of highly publicized incidents at corporations that suffer for not regulating themselves properly would right that ship, so to speak?

Brickman: First of all, industry setting the standard is an idea that has been thrown around a while, and I think that it’s great to see us finally doing it in this area, because we know our stuff the best.

But as far as an incident indicating that it’s not working, I don’t think so. We’re going to try to set up a standard, whereby we’re providing public information about what our products do and what we do as far as best practices. At the end of the day the acquiring agency, or whatever, is going to have to make decisions, and they’re going to make intelligent decisions, based upon looking at folks that choose to go through this and folks that choose not to go through it.

It will continue

The bad news that continues to come out is going to continue to happen. The only thing that they’ll be able to do is to look to the companies that are the experts in this to try to help them with that, and they are going to get some of that with the companies that go through these evaluations. There’s no question about it.

At the end of the day, this accreditation program is going to shake out the products and companies that really do follow best practices for secure engineering and supply chain best practices.

Gardner: What should we expect next? As we heard, there has been a lot of activity here in Austin at the conference. We’ve got that white paper. We’re working towards more mature definitions and approaching certification and accreditation types of activities. What’s next? What milestone should we look to? Andras, this is for you.

Szakal: Around November, we’re going to be going through company review of the specification and we’ll be publishing that in the fourth quarter.

We’ll also be liaising with our government and international partners during that time and we’ll also be looking forward to several upcoming conferences within The Open Group where we conduct those activities. We’re going to solicit some of our partners to be speaking during those events on our behalf.

As we move into 2012, we’ll be working on the accreditation program, specifically the conformance criteria and the accreditation policy, and liaising again with some of our international partners on this particular issue. Hopefully we will, if all things go well and according to plan, come out of 2012 with a viable program.

Gardner: Dave Lounsbury, any further thoughts about next steps, what people should be looking for, or even where they should go for more information?

Lounsbury: Andras has covered it well. Of course, you can always learn more by going to www.opengroup.org and looking on our website for information about the OTTF. You can find drafts of all the documents that have been made public so far, and there will be our white paper and, of course, more information about how to become involved.

Gardner: Very good. We’ve been getting an update about The Open Group Trusted Technology Forum, OTTF, and seeing how this can have a major impact from a private sector perspective and perhaps head off issues about lack of trust and lack of clarity in a complex, evolving technology ecosystem.

I’d like to thank our guests. We’ve been joined by Dave Lounsbury, Chief Technical Officer at The Open Group. Thank you, sir.

Lounsbury: Thank you, Dana.

Gardner: Steve Lipner, the Senior Director of Security Engineering Strategy in the Trustworthy Computing Security Group at Microsoft. Thank you, Steve.

Lipner: Thanks, Dana.

Gardner: Joshua Brickman, Director of the Federal Certification Program Office at CA Technologies, has also joined us. Thank you.

Brickman: I enjoyed it very much.

Gardner: And Andras Szakal, Vice President and CTO of IBM’s Federal Software Group. Thank you, sir.

Szakal: It’s my pleasure. Thank you very much, Dana.

Gardner: This discussion has come to you as a sponsored podcast in conjunction with The Open Group Conference in Austin, Texas. We are here the week of July 18, 2011. I want to thank our listeners as well. This is Dana Gardner, Principal Analyst at Interarbor Solutions. Don’t forget to come back next time.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com.

Copyright The Open Group 2011. All rights reserved.

Dana Gardner is the Principal Analyst at Interarbor Solutions, which identifies and interprets the trends in Services-Oriented Architecture (SOA) and enterprise software infrastructure markets. Interarbor Solutions creates in-depth Web content and distributes it via BriefingsDirect™ blogs, podcasts and video-podcasts to support conversational education about SOA, software infrastructure, Enterprise 2.0, and application development and deployment strategies.

3 Comments

Filed under Cybersecurity, Supply chain risk

Google+, spiral galaxies and Louisa’s bright idea

By Stuart Boardman, Getronics

Even a social media lightweight like me could hardly avoid getting caught up in the Google+ hype. It got me thinking about the rate and unpredictability of change in the web world, and the effect on large enterprises of phenomena originating in the consumer and small business market.

The concept of the enterprise is experiencing a change – maybe a radical one. The role of technology is also changing (not for the first time). New business models are developing which, whilst not technological in nature, would never have been thought of without the technology developments of the last few years. Other business models, around for a bit longer and not technological in nature, are pushing technology in a different direction. Business models themselves are subject to increasingly frequent and not always predictable change. What does this mean for the practice of enterprise architecture?

Back to Google+. A few years ago, when Web 2.0 was the buzzword and everyone conveniently forgot that the web actually started out as a vehicle for user-generated content and collaboration (sorry, had to get that off my chest), there was quite a battery of social media providers all with their own specializations: Facebook, MySpace, LinkedIn, Plaxo, Flickr and a whole bunch of sites for gamers and metal fans, etc. In Holland, where I live, we had our own, very successful variant on Facebook. Had. In the period since then there’s been increasing consolidation with Facebook developing an astonishing hegemony. I’ll admit that I assumed that’s how it would stay until a new Zuckerberg came up with a totally new game changer. But now here comes Google with a new spin on a familiar story and they look set to chew a big chunk out of the market. Perhaps even the enterprise market.

What’s this have to do with enterprises? Well, the fact is that everyone in the enterprise is out there exchanging ideas via Twitter and LinkedIn and Facebook and Google+ (and whatever specialized sites they might use) and they’re even using those media to tell the rest of the enterprise that they published something internally – because otherwise no one will notice. And then there’s co-creation, which is becoming increasingly common – even in large enterprises. So like it or not, the enterprise is being irreversibly extended out into the blogosphere. And that means that the enterprise is far more exposed to the trends and rapid shifts in the world outside its own boundaries than it has ever been before.

In the meantime, a lot of other stuff has been changing for the enterprise. Extended Enterprise, the idea that some of an enterprise’s business processes are performed by third parties, who themselves are part of a broad value network, is pretty much established fact for many large and medium-sized organizations. And there are unexpected new business models emerging. Think about app stores. I can’t see inside Steve Jobs’ head but I suspect the app store was developed to support the iPhone – not the other way around. Just like iTunes was developed to support the iPod. But now everyone has app stores (even if Apple doesn’t want them to use the name). The end result of all this has been to create a whole new market, where new entrepreneurs can develop low-cost software and sell it in bulk across multiple platforms and where those platforms could hardly exist without the app developers. I’m even using an iPhone app (also available on Android) to drive my domestic hi-fi system (from a very respectable English high-end designer – not some uber-nerd). The app strengthens the business case for the equipment and makes money for the developer. The app didn’t come with the equipment; I bought it at the app store. App stores themselves are new value propositions for their owners (Apple, etc.). In some ways we could regard this as a commercial instantiation of the old Virtual Enterprise idea – an “enterprise” consisting of a loosely coupled, shifting alliance of unrelated legal entities. I like this recent quote from Verna Allee (@vernaallee): “Business models often assume the world revolves around our organization when we really revolve in spiral galaxy ecosystems”. Louisa Leontiades (@MoneyDecisions) is launching a web-based, social-media-driven consultancy, which provides a sort of app store where independent experts can sell tools and frameworks (and yes, get consultancy deals too). Brilliant.
And of course all this represents a very scattered field of players, business models and solutions.

How are these developments reflected in Enterprise Architecture? In particular, what is the effect on architecture vision and the idea of a target state? I came across another interesting discussion recently. Robert Phipps (@robert_phipps) suggested in a discussion with Tom Graves (@tetradian) that an enterprise consists of many vectors, each with its own direction and velocity and each potentially colliding with, and therefore affecting the direction and velocity of, the others. It sounds pretty abstract, but if you accept the metaphor you can see that the target state is going to be different depending on how the various collisions work out. In a “traditional” enterprise, the power relationships between the various vectors are pretty stable and the influence of external factors is limited to macro-economic effects. The metaphor is still valid, but the scale of the problem is much smaller (less entropy). If what I wrote above is correct, there aren’t too many “traditional” enterprises these days. Tom took the metaphor a bit further and made reference to quantum theory. That’s also interesting, because it focuses on a probabilistic situation. Architecting for uncertainty. Welcome to the real world. That doesn’t mean there is no value in a target. You have to have some idea what you want to achieve based on what you know now. It just doesn’t need to be too prescriptive. Or, put another way, it needs not to be too sensitive to unpredictability. Everything (not just the technology) is likely to have changed before you get there. It certainly increases the relative importance of the first steps on the road to that target. The fewer particle/vector collisions that take place within one step, the more chance of achieving something useful. After each step we re-evaluate both target and roadmap. Iterate. Agile EA. And guess what? This is what we’re supposed to do anyway – design for change, constant delivery of value. No “wait a year and we’ll have something for you”.
So if we’ve not been doing that, we’ve not been doing what the enterprise needed from us. All that’s changed is that we will become increasingly irrelevant, if we don’t do it.

Stuart Boardman is a Senior Business Consultant with Getronics Consulting where he co-leads the Enterprise Architecture practice as well as the Cloud Computing solutions group. He is co-lead of The Open Group Cloud Computing Work Group’s Security for the Cloud and SOA project and a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by the Information Security Platform (PvIB) in The Netherlands and of his previous employer, CGI. He is a frequent speaker at conferences on the topics of Cloud, SOA, and Identity.

Comments Off

Filed under Enterprise Architecture

Twtpoll results from The Open Group Conference, Austin

The Open Group set up two informal Twitter polls this week during The Open Group Conference, Austin. If you wondered about the results, or just want to see what our Twitter followers think about some topline issues in the industry in very simple terms, see our twtpoll.com results below.

On Day One of the Conference, when the focus of the discussions was on Enterprise Architecture, we polled our Twitter followers about the profession of EA: Do you think we will see a shortage of enterprise architects within the next decade? Why or why not?

The results were split right down the middle.  A sampling of responses:

  • “Yes, if you mean good enterprise architects. No, if you are just referring to those who take the training but have no clue.”
  • “Yes, retirement of Boomers; not enough professionalization.”
  • “Yes, we probably will. EA is becoming more and more important because of fast-changing economies which request fast company change.”
  • “No: budgets, not a priority.”
  • “No. Over just one year, I can see the significant increase of the number of people who are talking EA and realizing the benefits of EA practices.”
  • “No, a majority of companies will still be focusing on short-term improvement because of ongoing current economic status, etc. EA is not a priority.”

On Day Two, while we focused on security, we queried our Twitter followers about data security protection: What type of data security do you think provides the most comprehensive protection of PII (personally identifiable information)? Again, the results were split evenly into thirds.

What do you think of our informal poll results? Do you agree? Disagree? And why?

And let us know if you have thoughts on this one: Do you think SOA is essential for Cloud implementation?

Want some survey results you can really sink your teeth into? View the results of The Open Group’s State of the Industry Cloud Survey. Download the slide deck from The Open Group Bookstore, or read a previous blog post about it.

The Open Group Conference, Austin is now in member meetings. Join us in Taipei or San Francisco for our next Conferences! Hear best practices and case studies on Enterprise Architecture, Cloud, Security and more, presented by preeminent thought leaders in the industry.

Comments Off

Filed under Cloud/SOA, Cybersecurity, Enterprise Architecture

Improve Data Quality and Enable Semantic Interoperability by Adopting the UDEF

By Ron Schuldt, UDEF-IT, LLC

For many years I have been promoting UDEF as an enabler for semantic interoperability. The problem for early adopters is that the interoperability benefit is only realized once multiple systems have adopted UDEF. That benefit comes from the UDEF ID, which is language- and application-independent.

Within the last seven or eight months, I realized that UDEF also provides an immediate benefit: when you follow the six basic steps of mapping your data to the UDEF, you improve the clarity of the name associated with the data. The UDEF name adds substantial clarity compared to the typically cryptic names assigned to fields within an application. The garbage-in, garbage-out problem is likely aggravated by those poor field names, and UDEF is a means of correcting them – giving early adopters an immediate benefit while also preparing their systems for interoperability.
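The renaming step above can be sketched in a few lines of code. Note that this is an illustrative sketch only: the UDEF IDs and descriptive names below are hypothetical placeholders, not entries from the actual UDEF registry.

```python
# Illustrative sketch: replacing cryptic application field names with
# UDEF-style descriptive names. The UDEF IDs and names here are
# hypothetical examples, not real UDEF registry entries.

CRYPTIC_TO_UDEF = {
    # cryptic field -> (hypothetical UDEF ID, descriptive UDEF name)
    "CUST_NM": ("a.2_1.8", "Customer.Person_Name"),
    "ORD_DT":  ("b.4_3.2", "Order.Document_Date"),
    "INV_AMT": ("c.7_9.1", "Invoice.Monetary_Amount"),
}

def clarify_record(record):
    """Return a copy of the record keyed by descriptive UDEF names,
    tagging each value with its language-independent UDEF ID."""
    out = {}
    for field, value in record.items():
        udef_id, udef_name = CRYPTIC_TO_UDEF.get(field, (None, field))
        out[udef_name] = {"udef_id": udef_id, "value": value}
    return out

row = {"CUST_NM": "Ada Lovelace", "ORD_DT": "2011-07-18"}
print(clarify_record(row)["Customer.Person_Name"]["value"])  # Ada Lovelace
```

Even before any second system adopts UDEF, the mapped names are self-describing, which is exactly the immediate clarity benefit described above; the attached IDs are what later enable cross-system interoperability.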

Semantic interoperability is one of the topics being discussed at The Open Group Conference, Austin, currently underway this week.

Ron Schuldt is a Senior Partner of UDEF-IT, LLC. He has more than twenty years of experience with national and international data standards, covering the gamut from Electronic Data Interchange (EDI) to the National Information Exchange Model (NIEM). He is Chairman of The Open Group UDEF Project.

Comments Off

Filed under Semantic Interoperability

The Open Group releases O-ACEML standard, automates compliance configuration

By Jim Hietala, The Open Group

The Open Group recently published the Open Automated Compliance Expert Markup Language (O-ACEML) standard. This new technical standard addresses the need to automate the process of configuring IT environments to meet compliance requirements. O-ACEML will also enable customer organizations and their auditors to streamline data gathering and reporting on compliance postures.

O-ACEML is aimed at helping organizations reduce the cost of compliance by easing manual compliance processes. The standard is an open, simple, and well-defined XML schema that allows compliance requirements to be described in machine-understandable XML, as opposed to requiring humans to interpret text from documents. The standard also allows for a remediation element, which enables multiple requirements (from different compliance regulations) to be blended into a single policy. An example of where this is needed is password length and complexity requirements, which may differ between regulations. O-ACEML allows the most secure setting to be selected and applied, enabling all of the regulations to be met or exceeded.
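The password-length example above might look something like the following fragment. This is a hypothetical sketch only: the element and attribute names are illustrative and are not taken from the published O-ACEML schema.

```xml
<!-- Hypothetical sketch: element names are illustrative, not from the
     published O-ACEML schema. Two regulations impose different minimum
     password lengths; remediation selects the stricter one. -->
<complianceRule id="password-min-length">
  <requirement source="Regulation-A">
    <setting name="minPasswordLength" operator="atLeast" value="8"/>
  </requirement>
  <requirement source="Regulation-B">
    <setting name="minPasswordLength" operator="atLeast" value="12"/>
  </requirement>
  <!-- The most secure setting satisfies both regulations -->
  <remediation>
    <setting name="minPasswordLength" value="12"/>
  </remediation>
</complianceRule>
```

Because the requirement is expressed once in machine-understandable form, a tool can apply the remediation value across platforms rather than having humans re-interpret each regulation’s text.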

O-ACEML is intended to allow platform vendors, compliance management providers, and IT-GRC providers to utilize a common language for exchanging compliance information. The existence of a single common standard will benefit platform vendors and compliance management tool vendors by reducing development costs and providing a single data interchange format. Customer organizations will benefit through lower costs for managing compliance in complex IT environments, and through increased effectiveness. Where previously organizations might have polled only a small but representative sample of their environment to assess compliance, a standard allowing automated compliance checking makes it feasible to survey the entire environment rather than just a sample. Organizations publishing government compliance regulations, as well as the de facto standard compliance organizations that have emerged in many industries, will benefit from more cost-effective adoption of, and simpler compliance with, their regulations and standards.

In terms of how O-ACEML relates to other compliance-related standards and content frameworks, it has similarities to and differences from NIST’s Security Content Automation Protocol (SCAP) and the Unified Compliance Framework (UCF). One of the main differences is that O-ACEML was architected so that a Compliance Organization could author its IT security requirements in a high-level language, without needing to understand the specific configuration commands and settings an OS or device will use to implement the requirement. A distinguishing capability of O-ACEML is that it gathers artifacts as it moves from the Compliance Organization’s directive, through implementation on a particular device, to the result of the configuration command. The final step of this automation not only produces a computer system configured to meet or exceed the compliance requirements, it also produces an XML document from which compliance reporting can be simplified. The Open Group plans to work with NIST and the creators of the UCF to ensure interoperability and integration between O-ACEML, SCAP and UCF.

If you have responsibility for managing compliance in your organization, or if you are a vendor whose software product involves compliance or security configuration management, we invite you to learn more about O-ACEML.

An IT security industry veteran, Jim Hietala is Vice President of Security at The Open Group, where he is responsible for security programs and standards activities. He holds the CISSP and GSEC certifications. Jim is based in the U.S.

8 Comments

Filed under Cybersecurity, Standards

What’s in a name? A change of name for our ITAC and ITSC professional certifications

By Steve Philp, The Open Group

With the launch of the new Open Group website this week, we have taken the opportunity to rebrand our two skills- and experience-based certification programs. The IT Architect Certification (ITAC) program has now become The Open Group Certified Architect (Open CA) program. The IT Specialist Certification (ITSC) program has now become The Open Group Certified IT Specialist (Open CITS) program.

The new website (and our new logo for that matter) places much more emphasis on the word “Open”. This is one of the reasons for changing the names away from something that is not readily associated with The Open Group (i.e., ITAC) to something that is more recognizable as an Open Group certification, i.e., Open CA. However, besides the name change, there haven’t been any changes made to the way in which either program operates. For example, the Open CA program still requires candidates to submit a comprehensive certification package detailing the skills and experience they have gained working on architecture-related projects, followed by a rigorous peer review process.

The Open CA program still currently focuses on IT-related work. However, the architecture profession is constantly evolving and to reflect this, The Open Group will incorporate dedicated Business Architecture and Enterprise Architecture streams into the Open CA program at some point in the near future. Our members are working on defining the core skills that an architect needs to have and the specific competencies one needs for each of these three specialist areas. Therefore, going forward, applicants will be able to become an Open CA in:

  • IT Architecture
  • Business Architecture
  • Enterprise Architecture

There are approximately 3,200 individuals certified in our Open CA program, and by broadening the scope of the program we hope to certify many more architects. There are more than 2,300 certified IT Specialists in the Open CITS program, and many organizations around the world have identified this type of skills- and experience-based program as a necessary part of the process of developing their own internal IT profession frameworks.

Open CA and Open CITS can be used in the recruitment process and help to guarantee a consistent, quality-assured service on project proposals, procurements and service level agreements. They can also help in the assessment of individuals in specific IT domains and provide a roadmap for their future career development. You can find out more about our programs by visiting the professional certification area of our website.

Steve Philp is the Marketing Director for the Open CA and Open CITS certification programs at The Open Group. Over the past 20 years, Steve has worked predominantly in sales, marketing and general management roles within the IT training industry. Based in Reading, UK, he joined The Open Group in 2008 to promote and develop the organization’s skills- and experience-based IT certifications.

Comments Off

Filed under Certifications