Monthly Archives: March 2014

Q&A with Allen Brown, President and CEO of The Open Group

By The Open Group

Last month, The Open Group hosted its San Francisco 2014 conference, themed “Toward Boundaryless Information Flow™.” Boundaryless Information Flow has been the pillar of The Open Group’s mission since 2002, when it was adopted as the organization’s vision for Enterprise Architecture. We sat down at the conference with The Open Group President and CEO Allen Brown to discuss the industry’s progress toward that goal, the industries that could most benefit from it now, The Open Group’s new Dependability through Assuredness™ Standard, and what the organization’s Forums are working on in 2014.

The Open Group adopted Boundaryless Information Flow as its vision in 2002, and the theme of the San Francisco Conference has been “Towards Boundaryless Information Flow.” Where do you think the industry is at this point in progressing toward that goal?

Well, it’s progressing reasonably well, but the challenge is, of course, that when we established that vision back in 2002, life was a little less complex, a little less fast moving, a little less fast-paced. Although organizations are improving the way they act in a boundaryless manner – and of course that changes by industry – some industries still have big silos and stovepipes; they still have big boundaries. But generally speaking we are moving, and everyone understands the need for information to flow in a boundaryless manner, for people to be able to access and integrate information and to provide it to the teams that need it.

One of the keynotes on Day One focused on the opportunities within the healthcare industry, and The Open Group recently started a Healthcare Forum. Do you see the healthcare industry as a test case for Boundaryless Information Flow, and why?

Healthcare is one of the verticals that we’ve focused on. It is not so much a test case as an area that absolutely seems to need information to flow in a boundaryless manner, so that everyone involved – from the patient through the administrator to the medical teams – has access to the right information at the right time. We know that in many situations there are shifts of medical teams, and from one medical team to another they don’t have access to the same information. Information isn’t easily shared between medical doctors, hospitals and payers. What we’re trying to do is to focus on the needs of the patient and improve the information flow so that you get better outcomes for the patient.

Are there other industries where this vision might be enabled sooner rather than later?

I think that we’re already making significant progress in what we call the Exploration, Mining and Minerals industry. Our EMMM™ Forum has produced an industry-wide model that is being adopted throughout that industry. We’re also looking at whether we can have an influence in the airline, automotive and manufacturing industries. There are many, many others, government and retail included.

The plenary on Day Two of the conference focused on The Open Group’s Dependability through Assuredness standard, which was released last August. Why is The Open Group looking at dependability and why is it important?

Dependability is ultimately what you need from any system: you need to be able to rely on that system to perform when needed. Systems are becoming more complex and bigger. We’re not just thinking about the things that arrive on the desktop; we’re thinking about systems like the barriers at subway or Tube stations, systems that operate any number of complex activities. And they bring together an awful lot of things that you have to rely upon.

Now in all of these systems, what we’re trying to do is to minimize downtime, because downtime can result in financial loss or, at worst, the loss of human life, and we’re trying to focus on that. What is interesting about the Dependability through Assuredness Standard is that it brings together so many other aspects of what The Open Group is working on. Obviously the architecture is at the core, so it’s critical that there’s an architecture and that we understand the requirements of that system. It’s also critical that we understand the risks, so that fits in with the work the Security Forum has done on Risk Analysis and Dependency Modeling. Out of the dependency modeling we can get the use cases, so that we can understand where the vulnerabilities are, what action has to be taken if we identify a vulnerability, and what action needs to be taken in the event of a failure of the system. If we do that, and assign accountability for who will do what by when in the event of an anomaly being detected or a failure happening, we can actually minimize that downtime or remove it completely.

Now the other great thing about this is that it’s not only a focus on the architecture for the actual system development – as the system changes over time, requirements change, legislation that might affect it changes, and external changes all feed into that system – but there’s also another circle within that system that deals with failure, analyzes it and makes sure it doesn’t happen again. There have been so many instances of failure recently. In the UK, for example, a bank was recently unable to process debit or credit cards for customers for about three or four hours, and that was probably caused by work done on a routine basis over a weekend. If Dependability through Assuredness had been in place, that could have been averted; it could have saved an awful lot of difficulty for an awful lot of people.

How does the Dependability through Assuredness Standard also move the industry toward Boundaryless Information Flow?

It’s part of it. With big systems it’s critical that the information flows, but this standard is not so much about the information itself as about how a system is going to work in a dependable manner.

Business Architecture was another featured topic in the San Francisco plenary. What role can Business Architecture play in enterprise transformation vis-à-vis Enterprise Architecture as a whole?

A lot of people in the industry are talking about Business Architecture right now and trying to focus on it as a separate discipline. We see it as a fundamental part of Enterprise Architecture. In fact, there are three legs to Enterprise Architecture: there’s Business Architecture; there’s the need for business analysts, who are critical to supplying the information; and then there are the solutions architects and other architects – data, applications architects and so on – that are needed. All three legs are needed.

We find that there are two or three different types of Business Architect. There are those that use the analysis to understand what the business is doing in order to inform the solutions architects and other architects for the development of solutions. There are those that are more integrated with the business, that can understand what is going on and provide input into how that might be improved through technology. And there are those that can go a step further and say, here are the advances in technology and here are the opportunities for advancing our competitiveness as an organization.

What are some of the other key initiatives that The Open Group’s forum and work groups will be working on in 2014?

That kind of question is like when you’ve won an award and have to thank your friends, so apologies to anyone that I leave out. Let me start alphabetically with the Architecture Forum. The Architecture Forum is obviously working on the evolution of TOGAF®; they’re also working on the harmonization of TOGAF with ArchiMate®, and they have a number of projects within that. Business Architecture is, of course, one of the projects going on in the Architecture space. The ArchiMate Forum is pushing ahead with ArchiMate. They’ve got two interesting activities going on at the moment. One is called ArchiMetals, which is going to be a sister publication to the ArchiSurance case study: where ArchiSurance provides an example of how ArchiMate is used in the insurance industry, ArchiMetals will be set in a manufacturing context, so there will be a whitepaper on that, along with examples and artifacts that we can use. They’re also working on an ArchiMate standard for interoperability among modeling tools. There are four tools accredited and certified by The Open Group right now, and we’re looking for that interoperability to help organizations that have multiple tools, as many of them do.

Going down the alphabet, there’s DirecNet™. Not many people know about DirecNet, but it’s work that we do with the U.S. Navy on standards for long-range, high-bandwidth mobile networking. Then we can go to the FACE™ Consortium, the Future Airborne Capability Environment. The FACE Consortium is working on the next version of its standard and toward accreditation and a certification program, and the uptake of that through procurement is absolutely amazing; we’re thrilled about that.

Healthcare we’ve talked about. The Open Group Trusted Technology Forum is working on how we can trust the supply chain for developed systems. They’ve released the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program, which was launched this week, and we already have one accredited vendor and two certified assessment labs. That is really exciting, because now we’ve got a way of helping any organization with large, complex systems developed through a global supply chain to make sure it can trust that supply chain. That is going to be invaluable to many industries, but also to the safety of citizens and the infrastructure of many countries. We are also planning to move the O-TTPS standard toward ISO standardization shortly.

The next one moving down the list would be Open Platform 3.0™. This is a really exciting part of Boundaryless Information Flow, it really is. This is about the convergence of SOA, Cloud, Social, Mobile, the Internet of Things and Big Data, and bringing all of those activities together is really something that is critical right now and that we need to focus on. In the different areas, some of our Cloud Computing standards have already gone to ISO and been adopted. We’re working right now on the next products that will move through: we have a governance standard in process, and an ecosystem standard has recently been published. In the area of Big Data there’s a whitepaper that’s 25 percent complete, and there’s also a lot of work on the definition of what Open Platform 3.0 is, so this week the members have been working on trying to define it. One of the really interesting activities that’s gone on is that the members of the Open Platform 3.0 Forum have produced something like 22 different use cases, and they’re really good. They’re concise and precise, and they cover a number of different industries, including healthcare and others. The next stage is to look at those and work on the ROI, the monetization, the value from those use cases, and that’s really exciting; I’m looking forward to peeking at that from time to time.

The Real Time and Embedded Systems Forum (RTES) is next. RTES is where we incubated the Dependability through Assuredness Framework, which is continuing to develop, and that’s really good. The core focus of the RTES Forum is high-assurance systems; they’re doing some work with ISO on that, along with a lot of other areas such as multicore, and, of course, they have a number of EC projects in which we’re partnering with others in the EC around RTES.

The Security Forum, as I mentioned earlier, has done a lot of work on risk and dependability. They have not only their standards for the Risk Taxonomy and Risk Analysis, but they’ve now also developed the Open FAIR Certification for People, which is based on those two standards. And we’re already starting to see people being trained and certified under that Open FAIR Certification Program that the Security Forum developed.

A lot of other activities are going on. Like I said, I probably left a lot of things out, but I hope that gives you a flavor of what’s going on in The Open Group right now.

The Open Group will be hosting a summit in Amsterdam May 12-14, 2014. What can we look forward to at that conference?

In Amsterdam we have a summit that’s going to bring together a lot of things; it’s going to be a bigger conference than we had here. We’ve got a lot of activity across all of our Forums, and we’re going to bring together top-level speakers, so we’re looking forward to some interesting work during that week.


Filed under ArchiMate®, Boundaryless Information Flow™, Business Architecture, Conference, Cybersecurity, EMMM™, Enterprise Architecture, FACE™, Healthcare, OTTF, RISK Management, Standards, TOGAF®

The Open Group Amsterdam Summit to Discuss Enabling Boundaryless Information Flow™

By The Open Group

The next Open Group Summit will cover the major issues and trends surrounding Boundaryless Information Flow™ on May 12-14 in Amsterdam. The event will feature presentations from leading companies, including IBM and Philips, on the key challenges facing effective information integration and enabling boundaryless information, as well as a day dedicated to ArchiMate®, a modeling language for Enterprise Architecture.

Boundaryless Information Flow™:

Boundaryless Information Flow, a shorthand representation of “access to integrated information to support business process improvements,” represents a desired state of an enterprise’s infrastructure that provides services to customers in an extended enterprise with the right information, at the right time and in the right context.

The Amsterdam Summit will bring together many individuals from throughout the globe to discuss key areas to enable Boundaryless Information Flow, including:

  • How EA and business processes can be used to facilitate integrated access to integrated information by staff, customers, suppliers and partners, to support the business
  • How organizations can achieve their business objectives by adopting new technologies and processes as part of the Enterprise Transformation management principles – making the whole process more a matter of design than of chance
  • How organizations move towards the interoperable enterprise, switching focus from IT-centric to enterprise-centric

ArchiMate Day:

On May 14, there will be an entire day dedicated to ArchiMate®, an Open Group standard. ArchiMate is an open and independent modelling language for enterprise architecture that is supported by different tool vendors and consulting firms. ArchiMate provides instruments to enable enterprise architects to describe, analyze and visualize the relationships among business domains in an unambiguous way. ArchiMate Day is appropriately located, as The Netherlands ranks as the number 1 country in the world for the number of ArchiMate® 2 certified individuals and as the number 3 country in the world for the number of TOGAF® 9 certified individuals.

The ArchiMate Day will provide the opportunity for attendees to:

  • Interact directly with other ArchiMate users and tool providers
  • Listen and understand how ArchiMate can be used to develop solutions to common industry problems
  • Learn about the future directions and meet with key users and developers of the language and tools
  • Interact with peers to broaden your expertise and knowledge in the ArchiMate language

Don’t wait to register! Early Bird registration ends March 30, 2014. Register now!


Filed under ArchiMate®, Boundaryless Information Flow™, Enterprise Architecture

CIOs to Leverage EA to Bolster their Presence in the C-Suite

by Naveed Rajput, Lead Enterprise Architect, Capco (Capital Markets)

Changing CIO landscape:    

The gulf between the business and IT has been recognised in enterprises for many years. However, this chronic organisational impairment has started causing landslides in the CIO landscape. A maturity mismatch exists between the business and other central functions as well, but IT capability has been brought under intense scrutiny due to increasingly frequent high-profile failures.

Global macro forces such as economic fragility, financial crisis, heightened regulation, investment shortfall, fierce competition and globalisation, to name a few, are constantly transforming the business landscape. Technology is one of the key considerations in untangling the complexity and volatility yielded by the above-mentioned macro commotion. The issue is further inflamed by the scarily high correlation between these factors, which translates into a direct impact on the bottom line of firms.

Cost reduction has become a strategic driver for businesses, and as a result tech-savvy finance chiefs are becoming popular choices to lead technology investment decisions. This paradigm shift in the CIO landscape is an important corporate event which presents an enterprise with burgeoning complexities and ultimately does more harm than good. Enterprise IT that is run as a cost centre struggles to provide sustainable competitiveness and in fact stifles innovation and future growth.

Investment Governance & CIO and CFO Alignment:              

The remedial actions mandate CIOs to assume ascendance to fill the void between IT and Business. The Enterprise Architecture function and governance can play a key strategic role for CIOs if structured and operated properly.

The biggest flashpoint to address is capital planning and IT investment governance. CIOs can leverage and adapt the investment decision-making frameworks and models used by their business counterparts. Using common factor models allows CIOs to analyse and calibrate IT investment proposals in the same way investment decisions are made by business lines – think risk-adjusted returns from deploying capital in a diversified portfolio.

Instead of using fairly lightweight valuation methods, CIOs should strengthen their capabilities in deploying sophisticated valuation techniques to demonstrate an understanding of the cost and benefit profiles of the proposals. Apart from using the standard NPV, IRR, Profitability Index and Payback Period, they should also consider quantifying the embedded options and flexibility in project proposals.
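
As a rough sketch of how the standard measures mentioned here are computed (all cash-flow figures are invented purely for illustration, not drawn from any real proposal):

```python
# Sketch of the standard valuation measures: NPV, IRR and payback
# period. Cash flows and the discount rate are hypothetical.

def npv(rate, cash_flows):
    """Net present value; cash_flows[0] is the year-0 outlay (negative)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return via bisection: the rate at which NPV = 0."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(cash_flows):
    """Years until cumulative cash flow first turns non-negative."""
    total = 0.0
    for t, cf in enumerate(cash_flows):
        total += cf
        if total >= 0:
            return t
    return None  # never pays back

flows = [-1000.0, 400.0, 400.0, 400.0, 400.0]  # hypothetical project
print(round(npv(0.10, flows), 2))  # NPV at a 10% discount rate
print(round(irr(flows), 4))        # rate at which NPV crosses zero
print(payback_period(flows))       # 3: cumulative turns positive in year 3
```

A proposal whose IRR comfortably exceeds the firm's hurdle rate, and whose payback falls within the planning horizon, is the usual first-pass screen before the more sophisticated techniques are applied.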

Proposal Calibration and Comparative Investment Analysis:

IT can liaise with Finance to establish quantitative attributes and ensure alignment of the parametric model. One of the key measures is the discount rate, for which the weighted average cost of capital (WACC) could be a reasonable starting proxy for the opportunity cost.
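
The WACC arithmetic itself is a one-liner; a minimal sketch with an invented capital structure:

```python
# Back-of-the-envelope WACC. All inputs are hypothetical, not taken
# from any real firm.

def wacc(equity, debt, cost_of_equity, cost_of_debt, tax_rate):
    """WACC = E/V * Re + D/V * Rd * (1 - Tc), with V = E + D.
    Debt cost is taken after tax because interest is deductible."""
    v = equity + debt
    return (equity / v) * cost_of_equity + (debt / v) * cost_of_debt * (1 - tax_rate)

# 60% equity at 12%, 40% debt at 6%, 30% corporate tax rate
rate = wacc(equity=600.0, debt=400.0,
            cost_of_equity=0.12, cost_of_debt=0.06, tax_rate=0.30)
print(round(rate, 4))  # 0.6*0.12 + 0.4*0.06*0.7 = 0.0888
```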

Once all the relevant input factors are established, proposals with different time horizons and risk profiles should be scaled and standardised for comparative analysis. The Equivalent Annual Annuity (EAA) or Least Common Multiple of Lives (LCML) method is often used to bring proposals to a comparable scale. Subsequently, discount rates are adjusted to reflect the riskiness of the projects – think project-specific beta.
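
A small sketch of the EAA approach, using two hypothetical projects with different lives (all figures invented):

```python
# Equivalent Annual Annuity: convert a project's NPV into a constant
# annual amount over its life, so projects with different horizons
# can be compared on a per-year footing.

def npv(rate, cash_flows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def eaa(rate, cash_flows):
    """EAA = NPV * r / (1 - (1 + r)^-n), n = periods after year 0."""
    n = len(cash_flows) - 1
    return npv(rate, cash_flows) * rate / (1 - (1 + rate) ** -n)

r = 0.08
short_project = [-500.0, 300.0, 300.0]               # 2-year life
long_project = [-800.0, 250.0, 250.0, 250.0, 250.0]  # 4-year life

# The longer project can have the higher raw NPV, yet the EAA may
# still favour the shorter one when projects are repeatable.
print(round(eaa(r, short_project), 2))
print(round(eaa(r, long_project), 2))
```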

Depending on the level of sophistication and maturity, CIOs can use scenario analysis or Monte Carlo simulation to gauge the sensitivity of the corporate portfolio’s outcome to various factors. The extent to which projects are aligned or misaligned with the strategy should also be part of the quantification.
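
A minimal Monte Carlo sketch of this kind of analysis: simulate a project's NPV under uncertain annual cash flows and report how often it ends up underwater. The distributions and figures are invented for illustration.

```python
# Simulate NPV of a hypothetical project whose annual benefits are
# uncertain, then summarise the downside risk.
import random

def npv(rate, cash_flows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def simulate(n_trials=20_000, rate=0.10, seed=42):
    random.seed(seed)
    results = []
    for _ in range(n_trials):
        # 4 years of benefits, each normally distributed around 400
        flows = [-1000.0] + [random.gauss(400.0, 120.0) for _ in range(4)]
        results.append(npv(rate, flows))
    results.sort()
    mean = sum(results) / len(results)
    p_loss = sum(1 for v in results if v < 0) / len(results)
    p5 = results[int(0.05 * len(results))]  # 5th-percentile ("bad case") NPV
    return mean, p_loss, p5

mean, p_loss, p5 = simulate()
print(f"mean NPV ~ {mean:.0f}, P(NPV < 0) ~ {p_loss:.1%}, 5th pct ~ {p5:.0f}")
```

The same loop extends naturally to correlated factors and whole-portfolio outcomes, which is where the strategic-alignment weighting would come in.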

Investment Fabric – Tying it all Together with EA 

Investment governance is more art than science for executives who ultimately use qualitative judgement for their decisions. Quantitative rigour applied to the process and mature analytics driven due-diligence process develops their gut feel and assists with alignment of views in the C-suite.

CIOs should creatively craft an investment fabric to realise the enterprise strategy roadmap. They should use EA to decipher the correlation between proposed projects and establish P&L attribution to formulate a portfolio balance sheet. Beyond investment decisions, an effective EA can provide adequate architectural insight to programmes and initiatives spanning multiple product lines and business functions.

There is a lot that CIOs can leverage from the rest of the C-suite. CIOs should reclaim more power on IT strategy and decision making and by doing so enable the rest of the C-suite to drive the enterprise forward in the most efficient way possible.

Naveed Rajput is an experienced strategy consultant and a business-savvy technologist. He specialises in driving CXO engagements and enterprise strategy initiatives. He is a TOGAF® practitioner with well over 15 years of experience in delivering a variety of complex, major global programmes, including cross-functional Target Operating Models (TOM), Architecture Maturity, IT Capital Planning, and Investment Governance initiatives. He has orchestrated technology- and regulatory-driven change agendas for top Fortune and FTSE financial and professional services firms including Logica, Deutsche Bank, Credit Suisse, Capco, Commerzbank, Mizuho Securities, Shell, and Wolters Kluwer-CCH.


Filed under Enterprise Architecture

Q&A with Jim Hietala on Security and Healthcare

By The Open Group

We recently spoke with Jim Hietala, Vice President, Security for The Open Group, at the 2014 San Francisco conference to discuss upcoming activities in The Open Group’s Security and Healthcare Forums.

Jim, can you tell us what the Security Forum’s priorities are going to be for 2014 and what we can expect to see from the Forum?

In terms of our priorities for 2014, we’re continuing to do work in Security Architecture and Information Security Management. In the area of Security Architecture, the big project is adding security to TOGAF®: we’re working on the next version of the TOGAF standard and specification, and there’s an active project involving folks from the Architecture Forum and the Security Forum to integrate security into, and stripe it through, TOGAF. So, on the Security Architecture side, that’s the priority. On the Information Security Management side, we’re continuing to do work in the area of Risk Management. We introduced a certification late last year, the Open FAIR certification, and we’ll continue to do work in the areas of Risk Management and Risk Analysis. We’re looking to add a second level to the certification program, and we’re doing some other work around the Risk Analysis standards that we’ve introduced.

The theme of this conference was “Towards Boundaryless Information Flow™,” and many of the tracks focused on convergence – the convergence of things like Big Data, mobile and Cloud, also known as Open Platform 3.0. How are those things affecting the realm of security right now?

I think they’re just beginning to. With Cloud, obviously, the security issues have been here as long as Cloud has, over the past four or five years. But if you look at things like the Internet of Things and some of the other things that comprise Open Platform 3.0, the security impacts are really just starting to be felt and considered. So I think information security professionals are really just starting to wrap their heads around what the new security risks that come with those technologies are and, more importantly, what we need to do about them. What do we need to do to mitigate risk around something like the Internet of Things, for example?

What kind of security threats do you think companies need to be most worried about over the next couple of years?

There’s a plethora of things out there right now that organizations need to be concerned about. Certainly the advanced persistent threat – the idea that nation states may be trying to attack other nations – is a big deal. It’s a very real threat, and it’s something that we have to think about: looking at the risks we’re facing, exactly who is that adversary and what are they capable of? I think profit-motivated criminals continue to be on everyone’s mind with all the credit card hacks that have just come out. We have to be concerned about cyber criminals who are profit-motivated, highly skilled and determined, and obviously there’s a lot at stake there. All of those are very real things in the security world and things we have to defend against.

The Security track at the San Francisco conference focused primarily on risk management. How can companies better approach and manage risk?

As I mentioned, we’ve done a lot of work over the last few years in the area of Risk Management, and the FAIR Standard that we introduced breaks risk down into the frequency of bad things happening and the impact if they do happen. So I would suggest taking that sort of approach – using the Risk Taxonomy Standard and the Risk Analysis Standard that we’ve introduced – and really looking at what the critical assets to protect are, who’s likely to attack them, and what’s the probable frequency of attacks that we’ll see. Then look at the impact side: what’s the consequence if somebody successfully attacks them? That’s really the key – breaking it down, looking at it that way and then taking the right mitigation steps to reduce risk on those assets that are really important.
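
As a deliberately simplified sketch of this frequency-times-impact view (the full Open FAIR taxonomy decomposes both factors much further; the assets and numbers here are entirely hypothetical):

```python
# Rank hypothetical assets by expected annual loss, in the spirit of
# the frequency x impact decomposition described above.

assets = [
    # (asset, expected loss events per year, expected loss per event in $)
    ("customer payment data", 0.5, 2_000_000),
    ("public web site", 6.0, 20_000),
    ("internal wiki", 2.0, 1_000),
]

def annualized_loss_exposure(frequency, magnitude):
    """Expected annual loss = loss event frequency x loss magnitude."""
    return frequency * magnitude

# Rank assets by exposure to decide where mitigation money goes first.
ranked = sorted(assets,
                key=lambda a: annualized_loss_exposure(a[1], a[2]),
                reverse=True)
for name, freq, impact in ranked:
    print(f"{name}: ${annualized_loss_exposure(freq, impact):,.0f}/year")
```

Even this crude ranking makes the point of the approach: a rare-but-severe event can dominate a frequent-but-cheap one, so mitigation spend follows exposure rather than raw incident counts.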

You’ve recently become involved in The Open Group’s new Healthcare Forum. Why a healthcare vertical forum for The Open Group?

In the area of healthcare, what we see is a highly fragmented ecosystem. You’ve got healthcare information that’s captured in various places, and the information doesn’t necessarily flow from provider to payer to other providers. In looking at industry verticals, the healthcare industry seemed like an area that could really benefit from the approaches we bring from The Open Group – TOGAF and our Enterprise Architecture approaches.

If you take it up to a higher level, it really needs the Boundaryless Information Flow that we talk about in The Open Group. We need to get to the point where our information as patients is readily available, in a secure manner, to the people who need to give us care, as well as to us, because in a lot of cases the information exists as islands in the healthcare industry. In looking at healthcare, it just seemed like a natural place where – in our economies, and it’s really a global problem – a lot of money is spent and there are a lot of opportunities for improvement, both in the economics and in the patient care delivered to individuals through the healthcare system. It just seemed like a great area for us to focus on.

As the new Healthcare Forum kicks off this year, what are the priorities for the Forum?

The Healthcare Forum has just published a whitepaper summarizing the findings of the workshop we held in Philadelphia last summer. We’re also working on a treatise, which will outline our views on the healthcare ecosystem and where standards and architecture work is most needed. We expect to have that produced over the next couple of months. Beyond that, we see a lot of opportunities for doing architecture and standards work in the healthcare sector, and our membership is going to determine which of those areas to focus on and which projects to initiate first.

For more on The Open Group Security Forum, please visit http://www.opengroup.org/subjectareas/security. For more on The Open Group Healthcare Forum, see http://www.opengroup.org/getinvolved/industryverticals/healthcare.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security, risk management and healthcare programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Cloud/SOA, Conference, Data management, Healthcare, Information security, Open FAIR Certification, Open Platform 3.0, RISK Management, TOGAF®, Uncategorized

What the C-Suite Needs to Prepare for in the Era of BYO Technology

By Allen Brown, President and CEO, The Open Group

IT today is increasingly being driven by end-users. This phenomenon, known as the “consumerization of IT,” is a result of how pervasive technology has become in daily life. Years ago, IT was primarily the realm of technologists and engineers. Most people, whether in business settings or at home, did not have the technical know-how to source their own applications, write code for a web page or even set up their own workstation.

Today’s technologies are more user-friendly than ever and they’ve become ubiquitous. The introduction of smartphones and tablets has ushered in the era of “BYO” with consumers now bringing the technologies they like and are most comfortable working with into the workplace, all with the expectation that IT will support them. The days where IT decided what technologies would be used within an organization are no more.

At the same time, IT has lost another level of influence due to Cloud computing and Big Data. Again, the “consumers” of IT within the enterprise – line of business managers, developers, marketers, etc. – are driving these changes. Just as users want the agility offered by the devices they know and love, they also want to be able to buy and use the technologies they need to do their jobs, and to do so on the fly rather than wait for an IT department to go through a months- or years-long process of requisitions and approvals. And it’s not just developers or IT staff that are sourcing their own applications – marketers are buying applications with their credit cards, and desktop users are sharing documents and spreadsheets via web-based office solutions.

When you can easily buy the processing capacity you need when you need it with your credit card or use applications online for free, why wait for approval?

The convergence of this next era of computing – we call it Open Platform 3.0™ – is creating a Balkanization of the traditional IT department. IT is no longer the control center for technology resources. As we’ve been witnessing over the past few years and as industry pundits have been prognosticating, IT is changing to become more of a service-based command central than a control center from which IT decisions are made.

These changes are happening within enterprises everywhere. The tides of change being brought about by Open Platform 3.0 cannot be held back. As I mentioned in my recent blog on Future Shock and the need for agile organizations, adaptation will be key for companies’ survival as constant change and immediacy become the “new normal” for how they operate.

These changes will, in fact, be positive for most organizations. As technologies converge and users drive the breakdown of traditional departmental silos and stovepipes, organizations will become more interoperable. More than ever, new computing models are driving the industry toward The Open Group’s vision of Boundaryless Information Flow™ within organizations. But the changes resulting from consumer-led IT are not just the problem of the IT department. They are on track to usher in a whole host of organizational changes that all executives must not only be aware of, but must also prepare and plan for.

One of the core issues around consumerized IT that must be considered is the control of resources. Resource planning, in terms of enabling business processes through technology, must now be the concern of every person within the C-Suite, from the CEO to the CIO and even the CMO.

Take, for example, the financial controls that must be considered in a BYO world. This issue, in particular, hits two very distinct centers of operations most closely—the offices of both the CIO and the CFO.

In the traditional IT paradigm, technology has been a cost center for most businesses, with CFOs usually having the final say in what technologies can be bought and used based on budget. There have been very specific controls placed on purchases, each leaving an audit trail that the finance department could easily track and handle. With the Open Platform 3.0 paradigm, those controls go straight out the window. When someone in marketing buys and uses an application on their own, without the CIO approving its use or the CFO having a paper trail for the purchase, accounting and financial or technology auditing can become a potential corporate nightmare.

Alternatively, when users share information over the Web using online documents, the CIO, CTO or CSO may have no idea what information is going in and out of the organization or how secure it is. But sharing information through web-based documents—or a CRM system—might be the best way for the CMO to work with vendors or customers or keep track of them. The CMO may also need to begin tracking IT purchases within their own department.

The audit trail that must be considered in this new computing era can extend in many directions. IT may need an accounting of technical and personal assets. Legal may need information for e-Discovery purposes—how does one account for information stored on tablets or smartphones brought from home, or work-related emails sent from personal accounts? The CSO may require risk assessments to be performed on all devices or may need to determine how far an organization’s “perimeter” extends for security purposes. The trail is potentially as large as the organization itself and its entire extended network of employees, vendors, customers, etc.

What can organizations do to help mitigate the potential chaos of a consumer-led IT revolution?

Adapt. Be flexible and nimble. Plan ahead. Strategize. Start talking about what these changes will mean for your organization—and do it sooner rather than later. Work together. Help create standards that can help organizations maintain flexible but open parameters (and perimeters) for sourcing and sharing resources.

Executive teams, in particular, will need to know more about the functions of other departments than ever before. IT departments—including CTOs and EAs—will need to know more about other business functions—such as finance—if they are to become IT service centers. CFOs will need to know more about technology, security, marketing and strategic planning. CMOs and CIOs will need to understand regulatory guidelines not only around securing information but around risk and data privacy.

Putting enterprise and business architectures and industry standards in place can go a long way toward helping to create structures that maintain a healthy balance between providing the flexibility needed for Open Platform 3.0 and BYO while allowing enough organizational control to prevent chaos. With open architectures and standards, organizations will be better able to decide where controls are needed and when and how information should be shared among departments. Interoperability and Boundaryless Information Flow—where and when they’re needed—will be key components of these architectures.

The convergence being brought about by Open Platform 3.0 is not just about technology. It’s about the convergence of many things—IT, people, operations, processes, information. It will require significant cultural changes for most organizations and within different departments and organizational functions that are not used to sharing, processing and analyzing information beyond the silos that have been built up around them.

In this new computing model, Enterprise Architectures, interoperability and standards can and must play a central role in guiding the C-Suite through this time of rapid change so that users have the tools they need to be able to innovate, executives have the information they need to steer the proverbial ship and organizations don’t get left behind.

Allen Brown is the President and CEO of The Open Group. For more than ten years, he has been responsible for driving the organization’s strategic plan and day-to-day operations; he was also instrumental in the creation of the Association of Enterprise Architects (AEA). Allen is based in the U.K.


Filed under Business Architecture, Cloud/SOA, Enterprise Architecture, Enterprise Transformation, Standards, Uncategorized

Accrediting the Global Supply Chain: A Conversation with O-TTPS Recognized Assessors Fiona Pattinson and Erin Connor

By The Open Group 

At the recent San Francisco 2014 conference, The Open Group Trusted Technology Forum (OTTF) announced the launch of the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program.

The program is one of the first accreditation programs worldwide aimed at assuring the integrity of commercial off-the-shelf (COTS) information and communication technology (ICT) products and the security of their supply chains.

In the three short years since OTTF launched, the forum has grown to include more than 25 member companies dedicated to using standards to safeguard the global supply chain against increasingly sophisticated cybersecurity attacks. Accreditation is yet another step in the process of protecting global technology supply chains from maliciously tainted and counterfeit products.

As part of the program, third-party assessor companies will be employed to assess organizations applying for accreditation, with The Open Group serving as the vendor-neutral Accreditation Authority that operates the program. Prior to the launch, the forum conducted a pilot program with a number of member companies. It was announced at the conference that IBM is the first company to become accredited, earning accreditation for the product integrity and supply chain practices of its Application, Infrastructure and Middleware (AIM) software business division.

We recently spoke with OTTF members Fiona Pattinson, director of strategy and business development at Atsec Information Security, and Erin Connor, director at EWA-Canada, at the San Francisco conference to learn more about the assessment process and the new program.

The O-TTPS focus is on securing the technology supply chain. What would you say are the biggest threats facing the supply chain today?

Fiona Pattinson (FP): I think in the three years since the forum began, certainly all the members have discussed the various threats quite a lot. It was one of the things we discussed as an important topic early on, and I don’t know if it’s the ‘biggest threat,’ but certainly the most important threats that we needed to address initially were those of counterfeit and maliciously tainted products. We came to that through both discussion with all the industry experts in the forum and also through research into some of the requirements from government, so that’s exactly how we knew which threats [to start with].

Erin Connor (EC): And the forum benefits from having both sides of the acquisition process – the acquirers as well as the suppliers and vendors – so it gets both perspectives.

How would you define maliciously tainted and counterfeit products?

FP:  They are very carefully defined in the standard—we needed to do that because people’s understanding of that can vary so much.

EC: And actually the concept of ‘maliciously’ tainted was incorporated close to the end of the development process for the standard at the request of members on the acquisition side of the process.

[Note: The standard precisely defines maliciously tainted and counterfeit products as follows:

“The two major threats that acquirers face today in their COTS ICT procurements, as addressed in this Standard, are defined as:

1. Maliciously tainted product – the product is produced by the provider and is acquired through a provider’s authorized channel, but has been tampered with maliciously.

2. Counterfeit product – the product is produced other than by, or for, the provider, or is supplied to the provider by other than a provider’s authorized channel and is presented as being legitimate even though it is not.”]

The OTTF announced the Accreditation Program for the OTTP Standard at the recent San Francisco conference. Tell us about the standard and how the accreditation program will help ensure conformance to it?

EC: The program is intended to provide organizations with a way to accredit their lifecycle processes for their product development so they can prevent counterfeit or maliciously tainted components from getting into the products they are selling to an end user or into somebody else’s supply chain. It was determined that a third-party type of assessment program would be used. Organizations will know that we Assessors have gone through a qualification process with The Open Group and that we have in place all that’s required on the management side to properly do an assessment. From the consumer side, they have confidence the assessment has been completed by an independent third party, so they know we aren’t beholden to the organizations to give them a passing grade when perhaps they don’t deserve it. And then of course The Open Group is in a position to oversee the whole process and award the final accreditation based on the recommendation we provide. The Open Group will also be the arbiter of the process between the assessors and organizations if necessary.

FP:  So The Open Group’s accreditation authority is validating the results of the assessors.

EC: It’s a model that is employed in many, many other product or process assessment and evaluation programs, where the actual accreditation authority steps back and has third parties do the assessment.

FP: It is important that the assessor companies are working to the same standard so that there’s no advantage in taking one assessor over the other in terms of the quality of the assessments that are produced.

How does the accreditation program work?

FP: Well, it’s brand new so we don’t know if it is perfect yet, but having said that, we have worked over several months on defining the process, and we have drawn from The Open Group’s existing accreditation programs, as well as from the forum experts who have worked in the accreditation field for many years. We have been performing pilot accreditations in order to check out how the process works. So it is already tested.

How does it actually work? Well, first of all an organization will feel the need to become accredited and at that point will apply to The Open Group to get the accreditation underway. Once their scope of accreditation is defined – which may be as small as one product or theoretically as large as a whole global company – and once the application is reviewed and approved by The Open Group, then they engage an assessor.

There is a way of sampling a large scope to identify its process variations, using something we term ‘selective representative products.’ It’s basically a way of logically sampling a big scope so that we capture the process variations within the scope and make sure that the assessment is kept to a reasonable size for the organization undergoing the assessment, while still giving good assurance to the consumers that it is a representative sample. The assessment is performed by the Recognized Assessor company, and a final report is written and provided to The Open Group for their validation. If everything is in order, then the company will be accredited and their scope of conformance will be added to the accreditation register and trademarked.
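[Note: Purely as an editorial illustration, and not part of the standard itself, the ‘selective representative products’ idea described above – group products that share the same lifecycle-process profile, then assess one representative per group – might be sketched like this. The product names and process labels are hypothetical:]

```python
# Illustrative sketch of 'selective representative products' sampling.
# Products sharing the same set of lifecycle processes form one group;
# assessing one product per group covers every process variation in scope.
from collections import defaultdict

def select_representatives(products):
    """products: dict mapping product name -> frozenset of process identifiers.
    Returns one representative product name per distinct process profile."""
    groups = defaultdict(list)
    for name, processes in products.items():
        groups[processes].append(name)
    # Pick the alphabetically first product from each group for determinism.
    return sorted(min(group) for group in groups.values())

# Hypothetical scope of three products with two distinct process profiles.
catalog = {
    "ProductA": frozenset({"secure-build", "signed-release"}),
    "ProductB": frozenset({"secure-build", "signed-release"}),  # same profile as A
    "ProductC": frozenset({"secure-build", "third-party-audit"}),
}
print(select_representatives(catalog))  # → ['ProductA', 'ProductC']
```

Only two of the three products need assessment here, since ProductA and ProductB share an identical process profile, which mirrors how the sampling keeps the assessment workload proportional to process variation rather than to catalog size.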

EC: So the customers of that organization can go and check the registration for exactly what products are covered by the scope.

FP: Yes, the register is public and anybody can check. So if IBM says WebSphere is accredited, you can go and check that claim on The Open Group web site.

How long does the process take or does it vary?

EC: It will vary depending on how large the scope to be accredited is in terms of the size of the representative set and the documentation evidence. It really does depend on what the variations in the processes are among the product lines as to how long it takes the assessor to go through the evidence and then to produce the report. The other side of the coin is how long it takes the organization to produce the evidence. It may well be that they might not have it totally there at the outset and will have to create some of it.

FP: As Erin said, it varies by the complexity and the variation of the processes, and hence the number of selected representative products. There are other factors that can influence the duration. There are three parties influencing that: the applicant Organization, The Open Group’s Accreditation Authority and the Recognized Assessor.

For example, we found that the initial work by the Organization and the Accreditation Authority in checking the scope and the initial documentation can take a few weeks for a complex scope – though of course for the pilots we were all new at doing that. In this early part of the project it is vital to get the scope both clearly defined and approved, since it is key to a successful accreditation.

It is important that an Organization assigns adequate resources to help keep this to the shortest time possible, both during the initial scope discussions, and during the assessment. If the Organization can provide all the documentation before they get started, then the assessors are not waiting for that and the duration of the assessment can be kept as short as possible.

Of course the resources assigned by the Recognized Assessor also influence how long an assessment takes. A variable for the assessors is how much documentation they have to read and review – it might be small or it might be a mountain.

The Open Group’s final review and oversight of the assessment takes some time and is influenced by resource availability within that organization. If they have any questions it may take a little while to resolve.

What kind of safeguards does the accreditation program put in place for enforcing the standard?

FP: It is a voluntary standard—there’s no requirement to comply. Currently some of the U.S. government organizations are recommending it. For example, NASA in their SEWP contract and some of the draft NIST documents on Supply Chain refer to it, too.

EC: In terms of actual oversight, we review what their processes are as assessors, and the report and our recommendations are based on that review. The accreditation expires after three years so before the three years is up, the organization should actually get the process underway to obtain a re-accreditation.  They would have to go through the process again but there will be a few more efficiencies because they’ve done it before. They may also wish to expand the scope to include the other product lines and portions of the company. There aren’t any periodic ‘spot checks’ after accreditation to make sure they’re still following the accredited processes, but part of what we look at during the assessment is that they have controls in place to ensure they continue doing the things they are supposed to be doing in terms of securing their supply chain.

FP: And then the key part is that the agreement the organization signs with The Open Group includes the fact that the organization warrants and represents that it remains in conformance with the standard throughout the accreditation period. So there is that assurance too, which builds on the more formal assessment checks.

What are the next steps for The Open Group Trusted Technology Forum?  What will you be working on this year now that the accreditation program has started?

FP: Reviewing the lessons we learned through the pilot!

EC: And reviewing comments from members on the standard now that it’s publicly available and working on version 1.1 to make any corrections or minor modifications. While that’s going on, we’re also looking ahead to version 2 to make more substantial changes, if necessary. The standard is definitely going to be evolving for a couple of years and then it will reach a steady state, which is the normal evolution for a standard.

For more details on the O-TTPS accreditation program, to apply for accreditation, or to learn more about becoming an O-TTPS Recognized Assessor, visit the O-TTPS Accreditation page.

For more information on The Open Group Trusted Technology Forum please visit the OTTF Home Page.

The O-TTPS standard and the O-TTPS Accreditation Policy are freely available from the Trusted Technology Section in The Open Group Bookstore.

For information on joining the OTTF membership please contact Mike Hickey – m.hickey@opengroup.org

Fiona Pattinson is responsible for developing new and existing atsec service offerings. Under the auspices of The Open Group’s OTTF, alongside many expert industry colleagues, Fiona has helped develop The Open Group’s O-TTPS, including developing the accreditation program for supply chain security. In the past, Fiona has led service developments which have included establishing atsec’s US Common Criteria laboratory, the CMVP cryptographic module testing laboratory, the GSA FIPS 201 TP laboratory, TWIC reader compliance testing, NPIVP, SCAP, PCI, biometrics testing and penetration testing. Fiona has responsibility for understanding a broad range of information security topics and the application of security in a wide variety of technology areas, from low-level design to the enterprise level.

Erin Connor is the Director at EWA-Canada responsible for EWA-Canada’s Information Technology Security Evaluation & Testing Facility, which includes a Common Criteria Test Lab, a Cryptographic & Security Test Lab (FIPS 140 and SCAP), a Payment Assurance Test Lab (device testing for PCI PTS POI & HSM, Australian Payment Clearing Association and Visa mPOS) and an O-TTPS Assessor lab Recognized by The Open Group. Erin participated with other expert members of The Open Group Trusted Technology Forum (OTTF) in the development of The Open Group Trusted Technology Provider Standard for supply chain security and its accompanying Accreditation Program. Erin joined EWA-Canada in 1994, and his initial activities in the IT Security and Infrastructure Assurance field included working on the team fielding a large-scale Public Key Infrastructure system, Year 2000 remediation and studies of wireless device vulnerabilities. Since 2000, Erin has been working on evaluations of a wide variety of products including hardware security modules, enterprise security management products, firewalls, mobile device and management products, as well as system and network vulnerability management products. He was also the only representative of an evaluation lab in the Biometric Evaluation Methodology Working Group, which developed a proposed methodology for the evaluation of biometric technologies under the Common Criteria.


Filed under Accreditations, Cybersecurity, OTTF, Professional Development, Standards, Supply chain risk