
Operational Resilience through Managing External Dependencies

By Ian Dobson & Jim Hietala, The Open Group

These days, organizations are rarely self-contained. Businesses collaborate through partnerships and close links with suppliers and customers. Outsourcing services and business processes, including to Cloud Computing, means that key operations an organization depends on are often fulfilled outside its control.

The challenge here is how to manage the dependencies your operations have on factors that are outside your control. The goal is to perform your risk management so that it optimizes operational success by making your operations resilient to failures in those external dependencies.

The Open Group’s Dependency Modeling (O-DM) standard specifies how to construct a dependency model to manage risk and build trust over organizational dependencies between enterprises – and between operational divisions within a large organization. The standard involves constructing a model of the operations necessary for an organization’s success, including the dependencies that can affect each operation. Applying quantitative risk sensitivities to each dependency then reveals the operations with the highest exposure to the risk of not being successful, informing business decision-makers where investment in reducing the organization’s exposure to external risks will yield the best return.
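
To make this concrete, the sketch below (a minimal illustration, not part of the O-DM standard; the operation names and failure probabilities are invented) shows one way such a model might be expressed in Python: each operation lists the external dependencies it relies on, each dependency carries an estimated probability of failing, and an operation’s exposure is the chance that at least one of its dependencies fails, computed here under the simplifying assumption that failures are independent. Ranking operations by exposure points to where risk-reduction investment pays off most.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Dependency:
        name: str
        failure_probability: float  # estimated chance this dependency lets the operation down

    @dataclass
    class Operation:
        name: str
        dependencies: List[Dependency]

        def exposure(self) -> float:
            # Chance that at least one dependency fails, assuming
            # (simplistically) that dependency failures are independent.
            survives = 1.0
            for dep in self.dependencies:
                survives *= 1.0 - dep.failure_probability
            return 1.0 - survives

    operations = [
        Operation("Order fulfilment", [
            Dependency("Logistics partner", 0.05),
            Dependency("Payment processor", 0.02),
        ]),
        Operation("Customer support", [
            Dependency("Outsourced call centre", 0.10),
            Dependency("Cloud CRM provider", 0.03),
        ]),
    ]

    # Rank operations by exposure so decision-makers can see where investment
    # in reducing external risk gives the best return.
    for op in sorted(operations, key=lambda o: o.exposure(), reverse=True):
        print(f"{op.name}: {op.exposure():.1%} exposure to dependency failure")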

O-DM helps you to plan for success through operational resilience, assured business continuity, and effective new controls and contingencies, enabling you to:

  • Cut costs without losing capability
  • Make the most of tight budgets
  • Build a resilient supply chain
  • Lead programs and projects to success
  • Measure, understand and manage risk from outsourcing relationships and supply chains
  • Deliver complex event analysis

The O-DM analytical process facilitates organizational agility by allowing you to easily adjust and evolve your organization’s operations model, and produces rapid results to illustrate how reducing the sensitivity of your dependencies improves your operational resilience. O-DM also allows you to drill as deep as you need to go to reveal your organization’s operational dependencies.

Training on developing operational dependency models that conform to this standard is available, as are software computation tools that automate speedy delivery of actionable results in graphic formats to facilitate informed business decision-making.

The O-DM standard represents a significant addition to our existing Open Group Risk Management publications.

The O-DM standard may be accessed here.

Ian Dobson is the director of the Security Forum and the Jericho Forum for The Open Group, coordinating and facilitating members’ work to achieve their goals in our challenging information security world. In the Security Forum, his focus is on supporting development of open standards and guides on security architectures and management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in the future.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Cybersecurity, Security Architecture

Cybersecurity Threats Key Theme at Washington, D.C. Conference – July 16-20, 2012

By The Open Group Conference Team

Identifying risks and eliminating vulnerabilities that could undermine the integrity and security of the supply chain is a significant global challenge and a top priority for governments, vendors, component suppliers, integrators and commercial enterprises around the world.

The Open Group Conference in Washington, D.C. will bring together leading minds in technology and government policy to discuss issues around cybersecurity and how enterprises can establish and maintain the necessary levels of integrity in a global supply chain. In addition to tutorial sessions on TOGAF and ArchiMate, the conference offers approximately 60 sessions on a variety of topics, including:

  • Cybersecurity threats and key approaches to defending critical assets and securing the global supply chain
  • Information security and Cloud security for global, open network environments within and across enterprises
  • Enterprise transformation, including Enterprise Architecture, TOGAF and SOA
  • Cloud Computing for business, collaborative Cloud frameworks and Cloud architectures
  • Transforming DoD avionics software through the use of open standards

Keynote sessions and speakers include:

  • America the Vulnerable: Inside the New Threat Matrix of Digital Espionage, Crime and Warfare - Keynote Speaker: Joel Brenner, author and attorney at Cooley LLP
  • Meeting the Challenge of Cybersecurity Threats through Industry-Government Partnerships - Keynote Speaker: Kristin Baldwin, principal deputy, deputy assistant secretary of defense for Systems Engineering
  • Implementation of the Federal Information Security Management Act (FISMA) - Keynote Speaker: Dr. Ron Ross, project leader at NIST (TBC)
  • Supply Chain: Mitigating Tainted and Counterfeit Products - Keynote Panel: Andras Szakal, VP and CTO at IBM Federal; Daniel Reddy, consulting product manager in the Product Security Office at EMC Corporation; John Boyens, senior advisor in the Computer Security Division at NIST; Edna Conway, chief security strategist of supply chain at Cisco; and Hart Rossman, VP and CTO of Cyber Security Services at SAIC
  • The New Role of Open Standards – Keynote Speaker: Allen Brown, CEO of The Open Group
  • Case Study: Ontario Healthcare - Keynote Speaker: Jason Uppal, chief enterprise architect at QRS
  • Future Airborne Capability Environment (FACE): Transforming the DoD Avionics Software Industry Through the Use of Open Standards - Keynote Speaker: Judy Cerenzia, program director at The Open Group; Kirk Avery of Lockheed Martin; and Robert Sweeney of Naval Air Systems Command (NAVAIR)

The full program can be found here: http://www3.opengroup.org/events/timetable/967

For more information on the conference tracks or to register, please visit our conference registration page. Please stay tuned throughout the next month as we continue to release blog posts and information leading up to The Open Group Conference in Washington, D.C. and be sure to follow the conference hashtag on Twitter – #ogDCA!


Filed under ArchiMate®, Cloud, Cloud/SOA, Conference, Cybersecurity, Enterprise Architecture, Information security, OTTF, Standards, Supply chain risk

PODCAST: Why data and information management remain elusive after decades of deployments; and how to fix it

By Dana Gardner, Interarbor Solutions

Listen to this recorded podcast here: BriefingsDirect-Effective Data Management Remains Elusive Even After Decades of Deployments

The following is the transcript of a sponsored podcast panel discussion on the state of data and information management strategies, in conjunction with The Open Group Conference, Austin 2011.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect. Today, we present a sponsored podcast discussion in conjunction with the latest Open Group Conference in Austin, Texas, the week of July 18, 2011. We’ve assembled a distinguished panel to update us on the state of data and information management strategies. We’ll examine how it remains difficult for businesses to get the information they want in a form they can use, and why this has been a persistent problem. We’ll uncover the latest in the framework approach to information and data and look at how an information architect can make a big difference.

Here to help us better understand the role and impact of the information architect, and also how to implement a successful data and information strategy, is our panel. We’re here with Robert Weisman. He is CEO of Build The Vision Incorporated. Welcome to BriefingsDirect, Robert.

Robert Weisman: Thank you.

Gardner: We’re also here with Eugene Imbamba. He is an Information Management Architect in IBM’s Software Group. Welcome, Eugene.

Eugene Imbamba: Thank you very much.

Gardner: And we’re here also with Mei Selvage. She is the Lead in the IBM Community of Information Architects. Welcome to the show, Mei.

Mei Selvage: Thank you for having us.

Gardner: Tell me, Robert, why it is that it’s so hard for IT to deliver information access in the way that businesses really want.

Weisman: It’s the general insensitivity to information management concerns within the industry itself, which is becoming much more technology- and tool-driven, with the actual information not being taken into consideration. As a consequence, a lot of the solutions might work, but they don’t last, and they don’t, generally speaking, get the right information to the right person at the right time. Within The Open Group, we recognized this split about four years ago and that’s one reason that in TOGAF® 9 we redefined information technology as “The lifecycle management of information and related technology within an organization.” We didn’t want to see an IM/IT split in organizations. We wanted to make sure that the architecture addressed the needs of the entire community, especially those requiring information and knowledge.

Gardner: Eugene, do you think if we focus more on the lifecycle management of information and the architecture frameworks like TOGAF, that we’ll get more to this requirement that business has that single view of reality?

Imbamba: Definitely, focusing on reference architectures and methodologies is a good way to get going in the right direction. I don’t think it’s the be-all and end-all of getting there. But, in terms of leveraging what’s been done, some of the architectures that have been developed, whether it’s TOGAF or some of the other artifacts out there, would help organizations, instead of spinning their wheels and reinventing the wheel, start building some of the foundational capabilities needed to have an enterprise information architecture.

Getting to the finish line

As a result, each year in information management we see projects starting up and projects collapsing for various reasons, whether it’s cost or just the process or the people in place. Leveraging some of these artifacts, methods, and reference architectures is a way to help get started, and of course employing other areas of the information management disciplines helps get to the finish line.

Gardner: Mei, when it comes to learning from those that have done this well, what do we know about what works when it comes to data and information management? What can we point to and say, “Without question, moving in this direction is allowing us to be inclusive, move beyond just the data and databases, and get that view that the business is really looking for?”

Selvage: Eugene and I had a long debate over how we know that we’ve delivered a successful information architecture. Our conclusion comes out to three parts plus one. The first piece is just like any strategy roadmap: you need to have a vision and strategy. To have a successful information architecture vision you really have to understand your business problem and your business vision. Then, you use applicable, proven reference architecture and methodology to support that.

Once you have the vision, then you come to execution. How do you leverage your existing IT environments, integrate with them, keep good communication, and use best practices? Finally, you have to implement on time and on schedule, within the budget — and the end user has to be satisfied.

Those are three parts. Then, the plus part is data governance, not just one-time project delivery. You’ll have to make sure that data governance is getting consistently implemented across the projects.

Gardner: How about in the direction of this organizational definition of what works and what doesn’t work? How important is it for an information architect role to emerge? Let’s start with you, Robert. Then, I’d like to take this to all of you. What is it about the information architect role that can play an important part here?

Weisman: The information architect will soon be called the knowledge architect to start realizing some of the promise that was seen in the 1980s and in the 1990s. The information architect’s role is essentially to harmonize all manner of information and make sure it’s properly managed and accessible to the people who are authorized to see it. It’s not just the information architect. He has to be a team player, working closely with technology, because more and more information will be not just machine-readable, but machine-processable and interpretable. So he has to work with the people not only in technology, but with those developing applications, and especially those dealing with security, because we’re creating more homogenous enterprise information-sharing environments with consolidated information holdings.

The paradigm is going to be changing. It’s going to be much more information-centric. The object-oriented paradigm, from a technical perspective, meant the encapsulation of the information. It’s happened, but at the process level.

When you have a thousand processes in the organization, you’ve got problems. Whereas, now we’d be looking at encapsulation of the information much more at the enterprise level so that information can be reused throughout the organization. It will be put in once and used many times.

Quality of information

The quality of the information will also be addressed through governance, particularly incorporating something called data stewardship, where people would be accountable, not only for the structure of the information but for the actual quality of the informational holdings.

Gardner: Thank you. Eugene, how do you see the role of the information architect as important in solidifying people’s thinking about this at that higher level, and as Robert said, being an advocate for the information across these other disciplines?

Imbamba: It’s inevitable that this role will emerge and is going to take a higher-level position within organizations. Back to my earlier comment about information really becoming an issue: we have lots of information, we have a variety of information, and we have varied velocity of information requirements.

We don’t have enough folks today who are really involved in this discipline, and some of the projections we have say that within the next 20 years we’re going to have a lot more information that needs to be managed. We need folks who are engaged in this space, folks who understand the space and really can think outside the box, but also understand what the business users want, what they are trying to drive to, and who are able to provide solutions that not only look at the business problem at hand but also at what the organization is trying to do.

The role is definitely emerging, and within the next couple of years, as Robert said, the term might change from information architects to knowledge architects, based on where information is and what information provides to business.

Gardner: Mei, how far along are we actually on this definition and even professionalization of the information architect role?

Selvage: I’d like to share a little bit of what IBM is doing internally. We have made a major change to our professional programs and certification programs. We’ve removed “IT” from the architect title; we just say “architect.” Under architect we have business architecture, IT architecture, and enterprise architecture. Information architecture falls under IT architecture.

Even though we are categorized as one of the subcomponents of IT architecture, information architects, in my opinion, are more business-friendly than other professionals. I’m not trying to put others down, but a lot of new folks come from data modeling backgrounds; they really have to understand business language, business processes, and their roles.

When we have this advantage, we need to leverage it and not just keep thinking about how I create database structures and how I make my database perform better. Rather, I should ask how my tasks today contribute to my business. I want to be doing the right thing, rather than doing the wrong things sooner.

IBM reflects an industry shift. The architect is a profession and we all need to change our mindsets to be even broader.

Delivering business value

Weisman: I’d like to add to that. I fully agree. As I said, The Open Group has created TOGAF 9 as a capability-based planning paradigm for business planning. IM and IT are just two dimensions of that overall capability, and everything is pushed toward the delivery of business value.

You don’t have to align IM/IT with the business. IM and IT become an integral part of the business. This came out of the defense world in many cases and it has proven very successful.

IM, IT, and all of the architecture domains are going to have to really understand the business for that. It’ll be an interesting time in the next couple of years in the organizations that really want to derive competitive advantage from their information holdings, which is certainly becoming a key differentiator amongst large companies.

Gardner: Robert, perhaps while you’re talking about The Open Group, you could update us a bit on what took place at the Austin Conference, particularly vis-à-vis the workgroups. What was the gist of the development and perhaps any maturation that you can point to?

Weisman: We had some super presentations, in particular the one that Eugene and Mei gave that addressed information architecture and various associated processes and different types of sub-architectures/frameworks as well.

The Information Architecture Working Group, which is winding down after two years, has created a series of whitepapers. The first one addressed the concerns of the data management architecture and maps the data management body of knowledge processes to The Open Group Architecture Framework. That whitepaper went through final review in the Information Architecture Working Group in Austin.

We have an Information Architecture Vision paper, which is an overall rethinking of how information within an organization is going to be addressed in a holistic manner, incorporating what we’d like to think of as all of the modern trends and all types of information, and figuring out some sort of holistic way that we can represent that in an architecture. The vision paper is right now in final review. Following that, we’re preparing a consolidated request for change to the TOGAF 9 specification. The whitepapers should be ready and available within the next three months for public consultation. This work should address many significant concerns in the domain of information architecture and management. I’m really confident that the work the working group has done has been very productive.

Gardner: Now, you mentioned that Mei and Eugene delivered a presentation. I wonder if we can get an overview, a quick summary of the main points. Mei, would you care to go first?

Selvage: We’ve already talked a lot about what we have described in our presentation. Essentially, we need to understand what it means to have a successful solution information architecture. We need to leverage all those best practices, which come in a form of either a proven reference architecture or methodology, and use that to achieve alignment within the business. Eugene, do you have anything you want to specifically point out in our presentation?

Three keys

Imbamba: No, just to add to what you said. The three keys that we brought were the alignment of business and IT, the use of reference architectures to successfully implement information architectures, and, last, the adoption of proven methodology.

In our presentation, we defined these constructs, or topics, based on our understanding and to make sure that the audience had a common understanding of what these components meant. Then, we gave examples and actually gave some use cases of where we’ve seen this actually happen in organizations, and where there has been some success in developing successful projects through the implementation of these methods. That’s some of what we touched on.

Weisman: Just as a postscript from The Open Group, we’re coming out with an Information Architecture and Planning Model. We have a comprehensive definition of data, information, and knowledge; we’ve come up with a good generic lifecycle that can be used by all organizations. And we addressed all the issues associated with them in a holistic way with respect to the information management functions of governance, planning, operations, decision support and business intelligence, records and archiving, and accessibility and privacy.

One of the main contributions that these whitepapers are going to provide is a good planning basis for the holistic management of all manner of information in the form of a complete model.

Gardner: We’ve heard about how the amount of data is going to be growing exponentially, perhaps 44 times in less than 10 years, and we’ve also heard that knowledge, information, and your ability to exploit it could be a huge differentiator in how successful you are in business. I even expect that many businesses will make knowledge and information of data part of their business, part of their major revenue capabilities — a product in itself.

Let’s look into the future. Why will the data and information management professionalization, this role of the information architect, be more important based on some of the trends that we expect? Let’s start with you, Robert. What’s going to happen in the next few years that’s going to make it even more important to have a holistic framework and strategic view of data and information?

Weisman: Right now, it’s competitive advantage upon which companies may rise and fall. Harvard Business School Press, Davenport in particular, has produced some excellent books on competitive analytics and the like, with good case studies. For example, a factory halfway through construction is stopped because the company didn’t have timely access to their information, which indicated the factory didn’t even need to be constructed. This speaks to information quality.

In the new service-based rather than industry-based economic paradigm, information will become absolutely key. With respect to the projected increase of information available, I actually see a decrease in information holdings within the enterprise itself.

This will be achieved through: a) information management techniques, whereby you will actually get rid of information; b) consolidation of information; and c) paradigms such as cloud, where you don’t necessarily have to hold information within the organization itself.

More with less

So you will be dealing with information holdings that are accessible by the enterprise, and not necessarily just those that are held by the enterprise. There will also be further issues, such as knowledge representation and the like, that will become absolutely key, especially with demographics as they stand now. We have to do more with less.

The training and professionalization of information architecture, or knowledge architecture, I anticipate will become key. However, knowledge architects cannot be educated totally in a silo; they also have to have a good understanding of the other architecture domains. A successful enterprise architect must understand all the other architecture domains.

Gardner: Eugene, how about you, in terms of future trends that impact the increased importance of this role in this perspective on information?

Imbamba: From an IBM perspective, we’ve seen over the last 20 years organizations focusing on what I call an “application agenda,” really trying to implement enterprise resource planning (ERP) systems, supply chain management systems, and these systems have been very valuable for various reasons, reducing cost, bringing efficiencies within the business.

But, as you know, over the last 20 years, a lot of companies now have these systems in place, so the competitive advantage has been lost. So what we’re seeing right now is companies focusing on an information agenda, and the reason is that each organization has information about its customers, its products, its accounts like no other business would have.

So, what we’re seeing today is companies leveraging that information for competitive advantage, optimizing the business, gleaning the information they have so they can understand the relationships among their customers, their partners, and their suppliers, and using that to deliver the kinds of services the business wants and the customers need. It’s a shift from an application agenda to an information agenda to try and push what’s going on in that space.

Gardner: Mei, last word to you, future trends and why would they increase the need for the information architecture role?

Selvage: I like to see that from two perspectives. One is the vendor perspective, taking IBM as an example. The information management brand is the one with the largest portfolio of software products, which reflects market needs and demands. So there is a need for information architects who are able to look across all those different software offerings, from IBM and from other major vendors too.

From the customer perspective, the trend I see is that many organizations outsource basic database administration, essentially a commodity activity, to a third party, while they keep the information architects in-house. That’s where we can add the value. We can talk to the business, we can talk to the other components of IT, and really bring things together. That’s a trend I see more organizations adopting.

Gardner: Very good. We’ve been discussing the role and impact of an information architect and perhaps how to begin to implement a more successful data and information strategy.

This comes to you as a sponsored podcast in conjunction with The Open Group Conference in Austin, Texas in the week of July 18, 2011. I’d like to thank our guests. We’ve been joined by Robert Weisman, CEO of Build The Vision Incorporated. Thanks so much, Robert.

Weisman: You’re very welcome. Thank you for inviting me.

Gardner: And we’ve been here with Eugene Imbamba. He is Information Management Architect in IBM Software Group. Thank you, Eugene.

Imbamba: Thank you for having me.

Gardner: And Mei Selvage, she is Lead of the IBM Community of Information Architects. Thanks to you as well.

Selvage: You’re welcome. Thank you too.

Gardner: This is Dana Gardner, Principal Analyst at Interarbor Solutions. Thanks to our viewers and listeners as well, and come back next time.

Listen to the podcast. Find it on iTunes/iPod and Podcast.com.

Copyright The Open Group 2011. All rights reserved.

Dana Gardner is the Principal Analyst at Interarbor Solutions, which identifies and interprets the trends in Services-Oriented Architecture (SOA) and enterprise software infrastructure markets. Interarbor Solutions creates in-depth Web content and distributes it via BriefingsDirect™ blogs, podcasts and video-podcasts to support conversational education about SOA, software infrastructure, Enterprise 2.0, and application development and deployment strategies.


Filed under Data management

PODCAST: Exploring the role and impact of the Open Trusted Technology Framework (OTTF)

By Dana Gardner, Interarbor Solutions

Listen to this recorded podcast here: BriefingsDirect-Discover the Open Trusted Technology Provider Framework

The following is the transcript of a sponsored podcast panel discussion from The Open Group Conference, San Diego 2011 on The Open Trusted Technology Forum and its impact on business and government.

Dana Gardner: Hi, this is Dana Gardner, Principal Analyst at Interarbor Solutions, and you’re listening to BriefingsDirect.

Today, we present a sponsored podcast discussion in conjunction with The Open Group Conference held in San Diego, the week of February 7, 2011. We’ve assembled a panel to examine The Open Group’s new Open Trusted Technology Forum (OTTF), which was established in December.

The forum is tasked with finding ways to better conduct global procurement and supply-chain commerce among and between technology acquirers and buyers and across the ecosystem of technology providers. By providing transparency, collaboration, innovation, and more trust among the partners and market participants in the IT environment, the OTTF aims to reduce business risk in global supply chain activities in the IT field.

We’ll examine how the OTTF will function, what its new framework will be charged with providing, and ways that participants in the global IT commerce ecosystem can become involved with and perhaps use the OTTF’s work to their advantage.

Here with us to delve into the mandate and impact of the Trusted Technology Forum is Dave Lounsbury. He is the Chief Technology Officer for The Open Group. Welcome, Dave.

Dave Lounsbury: Hi, Dana. How are you?

Gardner: I’m great. We’re also here with Steve Lipner, the Senior Director of Security Engineering Strategy in Microsoft’s Trustworthy Computing Group. Welcome, Steve.

Steve Lipner: Hi, Dana. Glad to be here.

Gardner: And, we’re also here with Andras Szakal, the Chief Architect in IBM’s Federal Software Group and an IBM distinguished engineer. Welcome.

Andras Szakal: Welcome. Thanks for having me.

Gardner: We’re also here with Carrie Gates, Vice President and Research Staff Member at CA Labs. Welcome.

Carrie Gates: Thank you.

Gardner: Let’s start with you, Dave. Tell us in a nutshell what the OTTF is and why it came about?

Lounsbury: The OTTF is a group that came together under the umbrella of The Open Group to identify and develop standards and best practices for trusting the supply chain. It’s about how one consumer in a supply chain can trust their partners, and how those partners will be able to indicate their use of best practices in the market, so that people who are buying from the supply chain, or buying from a specific vendor, will know that they can procure with a high level of confidence.

Gardner: Clearly, people have been buying these sorts of products for some time. What’s new? What’s changed that makes this necessary?

Concerns by DoD

Lounsbury: There are a couple of dimensions to it, and I will start this off because the other folks in the room are far more expert in this than I am.

This actually started a while ago at The Open Group by a question from the U.S. Department of Defense (DoD), which faced the challenge of buying commercial off-the-shelf product. Obviously, they wanted to take advantage of the economies of scale and the pace of technology in the commercial supply chain, but realized that means they’re not going to get purpose-built equipment, that they are going to buy things from a global supply chain.

They asked, “What would we look for in these things that we are buying to know that people have used good engineering practices and good supply chain management practices? Do they have a good software development methodology? What would be those indicators?”

Now, that was a question from the DoD, but everybody is on somebody’s supply chain. People buy components. The big vendors buy components from smaller vendors. Integrators bring multiple systems together.

So, this is a really broad question in the industry. Because of that, we felt the best way to address it was to bring together a broad spectrum of industry to come in, identify the practices that they have been using — their real, practical experience — and bring that together within a framework to create a standard for how we would do that.

Gardner: And this is designed with that word “open” being important to being inclusive. This is about a level playing field, not any sort of exclusionary affair.

Lounsbury: Absolutely. Not only is the objective of all The Open Group activities to produce open standards and conformance programs that are available to everyone, but in this case, because we are dealing with a global supply chain, we know that we are going to have not only vendors at all scales, but also vendors from all around the world.

If you pick up any piece of technology, it will be designed in the USA, assembled in Mexico, and built in China. So we need that international and global dimension in production of this set of standards as well.

Gardner: Andras, you’ve been involved with this quite a bit. For the edification of our listeners, is this mostly software we’re talking about? Is it certain components? Can we really put a bead on what will be the majority of technologies that would probably be affected?

Szakal: That’s a great question, Dana. I’d like to provide a little background. In today’s environment, we’re seeing a bit of a paradigm shift. We’re seeing technology move out of the traditional enterprise infrastructure. We’re seeing these very complex value chains be created. We’re seeing cloud computing.

Smarter infrastructures

We’re actually working to create smarter infrastructures that are becoming more intelligent, automated, and instrumented, and they are very much becoming open-loop systems. Traditionally, they were closed loop systems, in other words, closed environments, for example, the energy and utility (E&U) industry, the transportation industry, and the health-care industry.

As technology becomes more pervasive and gets integrated into these environments, into the critical infrastructure, we have to consider whether they are vulnerable and how the components that have gone into these solutions are trustworthy.

Governments worldwide are asking that question. They’re worried about critical infrastructure and the risk of using commercial, off-the-shelf technology — software and hardware — in a myriad of ways, as it gets integrated into these more complex solutions.

That’s part of the worry internationally from a government and policy perspective, and part of our focus here is to help our constituents, government customers, and critical infrastructure customers understand how the commercial technology manufacturers and software development manufacturers go about engineering and managing their supply chain integrity.

Gardner: I got the impression somehow, listening to some of the presentations here at the Conference, that this was mostly about software. Maybe at the start, would that be the case?

Szakal: No, it’s about all types of technology. Software obviously is a particularly important focus, because it’s at the center of most technology anyway. Even if you’re developing a chip, a chip has some sort of firmware, which is ultimately software. So that perception is valid to a certain extent, but no, not just software, hardware as well.

Gardner: Steve, I heard also the concept of “build with integrity,” as applied to the OTTF. What does that mean, build with integrity?

Lipner: Build with integrity really means that the developer who is building a technology product, whether it be hardware or software, applies best practices and understood techniques to prevent the inclusion of security problems, holes, bugs, in the product — whether those problems arise from some malicious act in the supply chain or whether they arise from inadvertent errors. With the complexity of modern software, it’s likely that security vulnerabilities can creep in.

So, what build with integrity really means is that the developer applies best practices to reduce the likelihood of security problems arising, as much as commercially feasible.

And not only that, but any given supplier has processes for convincing himself that upstream suppliers, component suppliers, and people or organizations that he relies on, do the same, so that ultimately he delivers as secure a product as possible.

Gardner: Carrie, one of the precepts of good commerce is a lack of friction between borders, where more markets can become involved and where highest-quality-at-lowest-cost effects can take place. This notion of trust, when applied to IT resources and assets, seems important for keeping this a global market and for allowing the efficiencies that are inherent in an open market to take place. How do you see this as a borderless technology ecosystem? How does this help?

International trust

Gates: This helps tremendously in improving trust internationally. We’re looking at developing a framework that can be applied regardless of which country you’re coming from. So, it is not a US-centric framework that we’ll be using and adhering to.

We’re looking for a framework so that each country, regardless of its government, regardless of the consumers within that country, all of them have confidence in what it is that we’re building, that we’re building with integrity, and that we are concerned, as Steve mentioned, about both malicious acts and inadvertent errors.

And each country has its own bad guys, so by adhering to an international standard we can say we’re looking out for the bad guys of every country and ensuring that what we provide is the best possible software.

Gardner: Let’s look a little bit at how this is going to shape up as a process. Dave, let’s explain the idea of The Open Group being involved as a steward. What is The Open Group’s role in this?

Lounsbury: The Open Group provides the framework under which both buyers and suppliers at any scale could come together to solve a common problem — in this case, the question of providing trusted technology best practices and standards. We operate a set of proven processes that ensure that everyone has a voice and that all these standards go forward in an orderly manner.

We provide infrastructure for doing that in the meetings and things like that. The third leg is that The Open Group operates industry-based conformance programs, the certification programs, that allow someone who is not a member to come in, indicate their conformance to the standard, and give evidence that they’re using the best practices there.

Gardner: That’s important. I think there is a milestone set that you were involved with. You’ve created the forum. You’ve done some gathering of information. Now, you’ve come out right here at this conference with the framework, with the first step towards a framework, that could be accepted across the community. There is also a white paper that explains how that’s all going to work. But, eventually, you’re going to get to an accreditation capability. What does that mean? Is that a stamp of approval?

Lounsbury: Let me back up just a little bit. The white paper actually lays out the framework. The work of the forum is to turn that framework into an Open Group standard and populate it. That will provide the standards and best practice foundation for this conformance program.

We’re just getting started on the vision for a conformance program. One of the challenges here is that first, not only do we have to come up with the standard and then come up with the criteria by which people would submit evidence, but you also have to deal with the problem of scale.

If we really want to address this problem of global supply chains, we’re talking about a very large number of companies around the world. It’s a part of the challenge that the forum faces.

Accrediting vendors

Part of the work that they’ve embarked on is, in fact, to figure out how we wouldn’t necessarily do that kind of conformance one on one, but how we would accredit either vendors themselves, who have their own quality processes as a big vendor would, or third parties who can do assessments and then help provide the evidence for that conformance.

We’re getting ahead of ourselves here, but there would be a certification authority that would verify that all the evidence is correct and grant some certificate that says that they have met some or all of the standards.

Szakal: Our vision is that we want to leverage some of the capability that’s already out there. Most of us go through Common Criteria evaluations, and that is actually listed as a best practice for validating security functions and products.

Where we are focused, from an accreditation point of view, affects more than just security products. That’s important to know. However, we definitely believe that the community of assessment labs that exists out there and already conducts security evaluations, whether they are country-specific or Common Criteria, needs to be leveraged. We’ll endeavor to do that and integrate them into both the membership and the thinking of the accreditation process.

Gardner: Thank you, Andras. Now, for a company that is facing some hurdles — and we heard some questions in our sessions earlier about “What do I have to do? Is this going to be hard for an SMB?” — the upside could be pretty significant. If you’re a company and you do get that accreditation, you’re going to have some business value. Steve Lipner, what from your perspective is the business rationale for these players to go about this accreditation to get this sort of certification?

Lipner: To the extent that the process is successful, customers will really value the certification, and that will open markets or create preferences in markets for organizations that have sought and achieved the certification.

Obviously, there will be effort involved in achieving the certification, but that will be related to real value, more trust, more security, and the ability of customers to buy with confidence.

The challenge that we’ll face as a forum going forward is to make the processes deterministic and cost-effective. I can understand what I have to do. I can understand what it will cost me. I won’t get surprised in the certification process and I can understand that value equation. Here’s what I’m going to have to do and then here are the markets and the customer sets, and the supply chains it’s going to open up to me.

Gardner: So, we understand that there is this effort afoot that the idea is to create more trust and a set of practices in place, so that everyone understands that certain criteria have been met and vulnerabilities have been reduced. And, we understand that this is going to be community effort and you’re going to try to be inclusive.

What I’m now curious about is what this actually consists of — a list of best practices, technology suggestions? Are there certain tests and requirements that are already in place that one would have to tick off? Let me take that to you, Carrie, and we’ll go around the panel. How do you actually assure that this is safe stuff?

Different metrics

Gates: If you refer to our white paper, we start to address that there. We were looking at a number of different metrics across the board. For example, what do you have for documentation practices? Do you do code reviews? There are a number of different best practices that are already in the field that people are using. Anyone who wants to be certified can go and look at this document and say, “Yes, we are following these best practices” or “No, we are missing this. Is it something that we really need to add? What kind of benefit will it provide to us beyond the certification?”

Gardner: Dave, anything to add as to how a company would go about this? What are some of the main building blocks to a low-vulnerability technology creation and distribution process?

Lounsbury: Again, I refer everybody to the white paper, which is available on The Open Group website. You’ll see there that we’ve divided these kinds of best practices into four broad categories: product engineering and development methods, secure engineering development methods, supply chain integrity methods, and product evaluation methods.

Under those categories, we’ll be looking at the attributes that are necessary for each, and then identifying the underlying standards or bits of evidence that people can submit to indicate their conformance.

I want to underscore this point about the question of the cost to a vendor. Steve said it very well. The objective here is to raise best practices across the industry and make the best practice commonplace. One of the great things about an industry-based conformance program is that it gives you the opportunity to take the standards and those categories that we’ve talked about as they are developed by OTTF and incorporate those in your engineering and development processes.

So you’re baking in the quality as you go along, and not trying to have an expensive thing going on at the end.
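
As a purely illustrative aside (this is a sketch, not content from the OTTF white paper or standard), the shape Lounsbury describes, with categories containing attributes and each attribute backed by submitted evidence, could be represented roughly as follows in Python. The category names follow the white paper as quoted above; the attribute and evidence entries are invented placeholders that loosely echo examples mentioned elsewhere in this discussion, such as code reviews and threat modeling. A conformance check then amounts to asking which attributes still lack evidence.

    # Category names follow the white paper quoted above; the attributes and
    # evidence shown here are invented placeholders, not actual OTTF content.
    framework = {
        "Product engineering and development methods": {
            "Code review": ["peer review records for each release"],
        },
        "Secure engineering development methods": {
            "Threat assessment and modeling": ["threat model documents"],
        },
        "Supply chain integrity methods": {
            "Open-source lineage": ["component provenance reports"],
        },
        "Product evaluation methods": {
            "Security evaluation": ["evaluation or test reports"],
        },
    }

    def conformance_gaps(framework, submitted_evidence):
        """Return (category, attribute) pairs that still lack submitted evidence."""
        gaps = []
        for category, attributes in framework.items():
            for attribute in attributes:
                if not submitted_evidence.get(attribute):
                    gaps.append((category, attribute))
        return gaps

    # A hypothetical supplier submission covering only one attribute so far.
    submission = {"Code review": ["peer review records for each release"]}
    for category, attribute in conformance_gaps(framework, submission):
        print(f"Missing evidence: {category} -> {attribute}")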

Gardner: Andras, IBM is perhaps one of the largest providers to governments and defense agencies when it comes to IT and certainly, at the center of a large ecosystem around the world, you probably have some insights into best practices that satisfy governments and military and defense organizations.

Can you offer a few major building blocks that perhaps folks that have been in a completely commercial environment would need to start thinking more about as they try to think about reaching accreditation?

Szakal: We have three broad categories here and we’ve broken each of the categories into a set of principles, what we call best practice attributes. One of those is secure engineering. Within secure engineering, for example, one of the attributes is threat assessment and threat modeling.

Another would be to focus on lineage of open-source. So, these are some of the attributes that go into these large-grained categories.

Unpublished best practices

You’re absolutely right, we have thought about this before. Steve and I have talked a lot about this. We’ve worked on his secure engineering initiative, his SDLC initiative within Microsoft. I worked on and was co-author of the IBM Secure Engineering Framework. So, these are living examples of some of the best practices out there that have been published, but remain proprietary. There are others, and in many cases most companies have addressed this internally, as part of their practices, without having to publish them.

Part of the challenge that we are seeing, and part of the reason that Microsoft and IBM went to the length of publishing theirs, is that government customers and critical infrastructure providers were asking what the industry practice was and what the best practices were.

What we’ve done here is take the best practices in the industry and bring them together in a way that’s non-vendor-specific. So you’re not looking to IBM, you’re not having to look at other vendors’ methods of implementing these practices; it gives you a vendor-neutral way of addressing them based on outcomes.

These have all been realized in the field. We’ve observed these practices in the wild, and we believe that this is going to actually help vendors mature in these specific areas. Governments recognize that, to a certain degree, the industry is not drunk and disorderly; we do actually have a view on what it means to develop product in a secure engineering manner, and we do have supply chain integrity initiatives out there. So, those are very important.

Gardner: Somebody mentioned earlier that technology is ubiquitous across so many products and services. Software in particular is growing more important in how it affects all sorts of different aspects of different businesses around the world. It seems to me this is an inevitable step that you’re taking here, and that it might even be overdue.

If we can take the step of certification and agreement about technology best practices, does this move beyond just technology companies in the ecosystem to a wider set of products and services? Any thoughts about whether this is a framework for technology that could become more of a framework for general commerce, Dave?

Lounsbury: Well, Dana, you asked me a question I’m not sure I have an answer for. We’ve got quite a task in front of us doing some of these technology standards. I guess there might be cases where vertical industries that are heavy technology employers or have similar kinds of security problems might look to this, or there might be some overlap. The one that comes to my mind immediately is health care, but we will be quite happy if we get the technology industry standards and best practices in place in the near future.

Gardner: I didn’t mean to give you more work to do necessarily. I just wanted to emphasize how this is an important and inevitable step, and that standardization around best practices, trust, and credibility, around the absence of malware and other risks that come with technology, is probably going to become more prevalent across the economy and the globe. Would you agree with that, Andras?

Szakal: This approach is, by the way, our best practices approach to solving this problem. It’s an approach that’s been taken before by the industry or industries from a supply chain perspective. There are several frameworks out there that abstract the community practice into best practices and use it as a way to help global manufacturing and development practices, in general, ensure integrity.

Our approach is not all that unique, but it’s certainly the first time the technology industry has come together to make sure that we have an answer to some of these most important questions.

Gardner: Any thoughts, Steve?

Lipner: I think Andras was right in terms of the industry coming together to articulate best practices. You asked a few minutes ago about existing certifications and beyond in the trust and assurance space. Beyond common criteria for security features, security products, there’s really not much in terms of formal evaluation processes today.

Creating a discipline

One of the things we think that the forum can contribute is a discipline that governments and potentially other customers can use to say, “What is my supplier actually doing? What assurance do I have? What confidence do I have?”

Gardner: Dave?

Lounsbury: I want to expand on that point a little bit. The white paper’s name, “The Open Trusted Technology Provider Framework” was quite deliberately chosen. There are a lot of practices out there that talk about how you would establish specific security criteria or specific security practices for products. The Open Trusted Technology Provider Forum wants to take a step up and not look at the products, but actually look at the practices that the providers employ to do that. So it’s bringing together those best practices.

Now, good technology providers will use good practices, when they’re looking at their products, but we want to make sure that they’re doing all of the necessary standards and best practices across the spectrum, not just, “Oh, I did this in this product.”

Szakal: I have to agree 100 percent. We’re not simply focused on a bunch of security controls here. This is about industry practices for supply chain integrity, as well as our internal manufacturing practices around the actual practice and process of engineering or software development.

That’s a very important point to be made. This is not a traditional security standard, in the sense of a hundred security controls that you should always go out and implement. You’re going to have certain practices that make sense in certain situations, depending on the context of the product you’re manufacturing.

Gardner: Carrie, any suggestions for how people could get started at least from an educational perspective? What resources they might look to or what maybe in terms of a mindset they should start to develop as they move towards wanting to be a trusted part of a larger supply chain?

Gates: I would say an open mindset. In terms of getting started, the white paper is an excellent resource for understanding how the OTTF is thinking about the problem: how we are structuring things, what the high-level attributes are that we are looking at, and then digging down further and asking, “How are we actually addressing the problem?”

We had mentioned threat modeling, which for some — if you’re not security-focused — might be a new thing to think about, as an example, in terms of your supply chain. What are the threats to your supply chain? Who might be interested, if you’re looking at malicious attack, in inserting something into your code? Who are your customers and who might be interested in potentially compromising them? How might you go about protecting them?

I am going to contradict Andras a little bit, because there is a security aspect to this, and there is a security mindset that is required. The security mindset is a little bit different, in that you tend to be thinking about who is it that would be interested in doing harm and how do you prevent that?

It’s not a normal way of thinking about problems. Usually, people have a problem, they want to solve it, and security is an add-on afterwards. We’re asking that they start that thinking as part of their process now and then start including that as part of their process.

Szakal: But, you have to agree with me that this isn’t your hopelessly lost techie 150-paragraph list of security controls you have to do in all cases, right?

Gates: Absolutely, there is no checklist of, “Yes, I have a Firewall. Yes, I have an IDS.”

Gardner: Okay. It strikes me that this is really a unique form of insurance — insurance for the buyer, insurance for the seller — that they can demonstrate that they’ve taken proper steps — and insurance for the participants in a vast and complex supply chain of contractors and suppliers around the world. Do you think the word “insurance” makes sense or “assurance?” How would you describe it, Steve?

Lipner: We talk about security assurance, and assurance is really what the OTTF is about: providing developers and suppliers with ways to achieve that assurance, and providing their customers with ways to know that they have done so. Andras referred to installing the firewall, and so on. This is really not about adding some security band-aid onto a technology or a product. It’s really about the fundamental attributes or assurance of the product or technology that’s being produced.

Gardner: Very good. I think we’ll need to leave it there. We have been discussing The Open Group’s new Open Trusted Technology Forum, the associated Open Trusted Technology Provider Framework, and the movement towards more of an accreditation process for the global supply chains around technology products.

I want to thank our panel. We’ve been joined by Dave Lounsbury, the Chief Technology Officer of The Open Group. Thank you.

Lounsbury: Thank you, Dana.

Gardner: Also, Steve Lipner, the Senior Director of Security Engineering Strategy in Microsoft’s Trustworthy Computing Group. Thank you, Steve.

Lipner: Thank you, Dana.

Gardner: And also, Andras Szakal, he is the Chief Architect in the IBM Federal Software Group and an IBM Distinguished Engineer. Thank you.

Szakal: Thank you so much.

Gardner: And, also Carrie Gates, Vice President and Research Staff Member at CA Labs. Thank you.

Gates: Thank you.

Gardner: You’ve been listening to a sponsored podcast discussion in conjunction with The Open Group Conference here in San Diego, the week of February 7, 2011. I’m Dana Gardner, Principal Analyst at Interarbor Solutions. Thanks for joining and come back next time.

Copyright The Open Group and Interarbor Solutions, LLC, 2005-2011. All rights reserved.

Dana Gardner is the Principal Analyst at Interarbor Solutions, which identifies and interprets the trends in Services-Oriented Architecture (SOA) and enterprise software infrastructure markets. Interarbor Solutions creates in-depth Web content and distributes it via BriefingsDirect blogs, podcasts and video-podcasts to support conversational education about SOA, software infrastructure, Enterprise 2.0, and application development and deployment strategies.


Filed under Cybersecurity, Supply chain risk

Enterprise Architecture: Helping to unravel the emergence of new forces

By Raghuraman Krishnamurthy, Cognizant Technology Solutions

It is very interesting to see how society changes with time. Primitive society was predominantly agrarian. Society was closely knit; people worked within well-defined land boundaries. Trade was happening, but was limited to the adventurous few.

In the 1700s, as Europe began to explore the world in a well-organized manner, the need for industrial enterprises emerged. Industries offered greater stability in jobs, as opposed to being at the mercy of the vagaries of nature in the agrarian world. People slowly migrated to centres of industry, and cities emerged. The work was accomplished by well-established principles of time and materials: in one hour, a worker was supposed to assemble ‘x’ components following a set, mechanical way of working. As society further evolved, a lot of information was exchanged and thoughts of optimization began to surface. Ways of storing and processing information became predominant in the pursuits of creative minds. Computer systems ushered in new possibilities.

With the emergence of the Internet, unimagined new avenues opened up. In one stroke, it was possible to transcend the constraints of time and space. Free flow of information enabled ‘leveling’ of the world in some sense. Enterprises began to take advantage of it by following the principle of ‘get the best work from where it is possible’. Established notions for a big enterprise like well-organized labor, building of mammoth physical structures, etc., were challenged.

In the creative world of today, great emphasis is on innovation and new socially/environmentally-conscious ways of doing business.

At every turn from agrarian to industrial to information to creative, fundamental changes occurred in how business was done and in the principles of business management. These changes, although seemingly unrelated to a field like Enterprise Architecture, will help unravel the emergence of new forces and the need to adjust to (if you are reactive) or plan for (if you are proactive) these forces.

Learning from how manufacturing companies have adjusted their supply chain management to live in the flat world provides a valuable key to how Enterprise Architecture can be looked at afresh. I am very excited to explore more of this theme and other topics at The Open Group India Conference in Hyderabad (March 9), Pune (March 11) and Chennai (March 7). I look forward to seeing you there and to having interesting discussions amongst us.

Raghuraman Krishnamurthy works as a Principal Architect in Cognizant Technology Solutions and is based in India. He can be reached at Raghuraman.krishnamurthy2@cognizant.com.


Filed under Enterprise Architecture