CIOs to Leverage EA to Bolster their Presence in the C-Suite

by Naveed Rajput, Lead Enterprise Architect, Capco (Capital Markets)

The Changing CIO Landscape:

The gulf between business and IT has been recognised in enterprises for many years. However, this chronic organisational impairment has started causing landslides in the CIO landscape. A maturity mismatch exists between the business and other central functions as well, but IT capability has come under especially intense scrutiny due to increasingly frequent high-profile failures.

Global macro forces such as economic fragility, financial crises, heightened regulation, investment shortfalls, fierce competition and globalisation, to name a few, are constantly transforming the business landscape. Technology is one of the key considerations in untangling the complexity and volatility yielded by this macro commotion. The issue is further inflamed by the alarmingly high correlation between these factors, which translates into a direct impact on firms’ bottom lines.

Cost reduction has become a strategic driver for businesses and, as a result, tech-savvy finance chiefs are increasingly being chosen to lead technology investment decisions. This paradigm shift in the CIO landscape is a significant corporate event, one that presents an enterprise with burgeoning complexities and ultimately does more harm than good. Enterprise IT that is run as a cost centre struggles to provide sustainable competitiveness and in fact stifles innovation and future growth.

Investment Governance & CIO–CFO Alignment:

The remedial actions require CIOs to assume ascendance and fill the void between IT and the business. The Enterprise Architecture function and its governance can play a key strategic role for CIOs if structured and operated properly.

The biggest flashpoints to address are capital planning and IT investment governance. CIOs can leverage and adapt the investment decision-making frameworks and models used by their business counterparts. Using common factor models allows CIOs to analyse and calibrate IT investment proposals in much the same way business lines make investment decisions – think risk-adjusted returns earned by deploying capital in a diversified portfolio.

Instead of relying on fairly lightweight valuation methods, CIOs should strengthen their capability to deploy sophisticated valuation techniques that demonstrate a genuine understanding of each proposal’s cost and benefit profile. Apart from the standard NPV, IRR, Profitability Index, and Payback Period, they should also consider quantifying the embedded options and flexibility in project proposals.
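To make the four standard measures concrete, here is a minimal sketch in Python of how a single IT proposal’s cash flows might be run through them. The cash-flow figures and discount rate are illustrative assumptions, not prescribed values.

```python
# Standard valuation measures applied to a hypothetical IT project.
# All figures are illustrative assumptions.

def npv(rate, cashflows):
    """Net present value; cashflows[0] is the upfront (negative) investment."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-6):
    """Internal rate of return via bisection (assumes one sign change)."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cashflows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

def payback_period(cashflows):
    """Years until cumulative cash flow turns non-negative (None if never)."""
    cumulative = 0.0
    for t, cf in enumerate(cashflows):
        cumulative += cf
        if cumulative >= 0:
            return t
    return None

def profitability_index(rate, cashflows):
    """Present value of future inflows divided by the initial outlay."""
    pv_inflows = npv(rate, [0] + cashflows[1:])
    return pv_inflows / -cashflows[0]

project = [-1_000_000, 300_000, 400_000, 450_000, 350_000]  # illustrative
rate = 0.10  # assumed discount rate

print(f"NPV: {npv(rate, project):,.0f}")
print(f"IRR: {irr(project):.1%}")
print(f"Payback: {payback_period(project)} years")
print(f"Profitability Index: {profitability_index(rate, project):.2f}")
```

Real options analysis – valuing the flexibility to defer, expand or abandon a project – builds on the same discounted cash flows but requires option-pricing techniques beyond this sketch.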

Proposal Calibration and Comparative Investment Analysis:

IT can liaise with Finance to establish quantitative attributes and ensure alignment of the parametric models. One of the key measures is the discount rate, for which the weighted average cost of capital (WACC) can be a reasonable starting proxy for the opportunity cost of capital.
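As a rough illustration of that starting proxy, the textbook WACC calculation blends the costs of equity and debt, with a tax shield on the debt side. All inputs below are illustrative assumptions that Finance would normally supply.

```python
# Weighted average cost of capital with the debt tax shield.
# All inputs are illustrative assumptions.

def wacc(equity, debt, cost_of_equity, cost_of_debt, tax_rate):
    """WACC = E/V * Re + D/V * Rd * (1 - Tc)."""
    total = equity + debt
    return (equity / total) * cost_of_equity + \
           (debt / total) * cost_of_debt * (1 - tax_rate)

rate = wacc(equity=6_000, debt=4_000,  # market values, in millions
            cost_of_equity=0.11, cost_of_debt=0.05, tax_rate=0.25)
print(f"WACC: {rate:.2%}")  # -> 8.10%
```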

Once all the relevant input factors are established, proposals with different time horizons and risk profiles should be scaled and standardised for comparative analysis. The Equivalent Annual Annuity (EAA) or Least Common Multiple of Lives (LCML) method is often used to bring proposals onto a comparable scale. Subsequently, discount rates are adjusted to reflect the riskiness of each project – think project-specific beta.
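A minimal sketch of how EAA and a project-specific beta might work together follows; the betas, NPVs and project lives are illustrative assumptions. EAA spreads each proposal’s NPV into a level annual payment over its life, so projects of different durations can be compared like for like.

```python
# Equivalent Annual Annuity comparison with a CAPM-style risk adjustment.
# All inputs are illustrative assumptions.

def risk_adjusted_rate(risk_free, beta, market_premium):
    """CAPM-style rate: a 'project-specific beta' scales the risk premium."""
    return risk_free + beta * market_premium

def eaa(npv, rate, years):
    """Convert a project's NPV into a level annual annuity over its life."""
    annuity_factor = (1 - (1 + rate) ** -years) / rate
    return npv / annuity_factor

# Two proposals with different lives and risk profiles:
rate_a = risk_adjusted_rate(0.03, beta=0.8, market_premium=0.06)  # 7.8%
rate_b = risk_adjusted_rate(0.03, beta=1.4, market_premium=0.06)  # 11.4%

print(f"Project A EAA: {eaa(npv=900_000, rate=rate_a, years=7):,.0f}")
print(f"Project B EAA: {eaa(npv=700_000, rate=rate_b, years=4):,.0f}")
# Here B wins on EAA even though A has the larger raw NPV.
```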

Depending on their level of sophistication and maturity, CIOs can use scenario analysis or Monte Carlo simulation to gauge the sensitivity of the corporate portfolio’s outcome to the various input factors. The extent to which projects are aligned or misaligned with the strategy should also be part of the quantification.
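As a hedged illustration of the Monte Carlo approach, the sketch below simulates the NPV distribution of one project with uncertain annual benefits. The distribution and parameters are assumptions chosen for illustration; a real model would calibrate each factor with Finance.

```python
# Monte Carlo sketch: NPV distribution when annual benefits are uncertain.
# Distribution and parameters are illustrative assumptions.

import random
import statistics

def npv(rate, cashflows):
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def simulate_npv(trials=10_000, rate=0.08):
    results = []
    for _ in range(trials):
        # Assumed uncertain benefits: mean 350k/yr, sd 120k, over 5 years
        benefits = [random.gauss(350_000, 120_000) for _ in range(5)]
        results.append(npv(rate, [-1_200_000] + benefits))
    return results

npvs = simulate_npv()
prob_loss = sum(1 for v in npvs if v < 0) / len(npvs)
print(f"Mean NPV: {statistics.mean(npvs):,.0f}")
print(f"P(NPV < 0): {prob_loss:.1%}")  # downside risk, not just a point estimate
```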

Investment Fabric – Tying It All Together with EA

Investment governance is more art than science for executives, who ultimately apply qualitative judgement to their decisions. Quantitative rigour and a mature, analytics-driven due-diligence process sharpen that gut feel and assist with the alignment of views in the C-suite.

CIOs should creatively craft an investment fabric to realise the enterprise strategy roadmap. They should use EA to decipher the correlations between proposed projects and establish P&L attribution in order to formulate a portfolio balance sheet. Beyond investment decisions, an effective EA function can provide adequate architectural insight to programmes and initiatives spanning multiple product lines and business functions.

There is a lot that CIOs can leverage from the rest of the C-suite. CIOs should reclaim more authority over IT strategy and decision making and, by doing so, enable the rest of the C-suite to drive the enterprise forward in the most efficient way possible.

Naveed Rajput is an experienced strategy consultant and a business-savvy technologist. He specialises in driving CXO engagements and enterprise strategy initiatives. He is a TOGAF® practitioner with well over 15 years of experience in delivering a variety of complex, major global programmes, including cross-functional Target Operating Models (TOM), Architecture Maturity, IT Capital Planning, and Investment Governance initiatives. He has orchestrated technology- and regulatory-driven change agendas for top Fortune and FTSE financial and professional services firms including Logica, Deutsche Bank, Credit Suisse, Capco, Commerzbank, Mizuho Securities, Shell, and Wolters Kluwer-CCH.


Q&A with Jim Hietala on Security and Healthcare

By The Open Group

We recently spoke with Jim Hietala, Vice President, Security for The Open Group, at the 2014 San Francisco conference to discuss upcoming activities in The Open Group’s Security and Healthcare Forums.

Jim, can you tell us what the Security Forum’s priorities are going to be for 2014 and what we can expect to see from the Forum?

In terms of our priorities for 2014, we’re continuing to do work in Security Architecture and Information Security Management. In the area of Security Architecture, the big project that we’re doing is adding security to TOGAF®, so we’re working on the next version of the TOGAF standard and specification and there’s an active project involving folks from the Architecture Forum and the Security Forum to integrate security into and stripe it through TOGAF. So, on the Security Architecture side, that’s the priority. On the Information Security Management side, we’re continuing to do work in the area of Risk Management. We introduced a certification late last year, the OpenFAIR certification, and we’ll continue to do work in the area of Risk Management and Risk Analysis. We’re looking to add a second level to the certification program, and we’re doing some other work around the Risk Analysis standards that we’ve introduced.

The theme of this conference was “Towards Boundaryless Information Flow™” and many of the tracks focused on convergence – the convergence of Big Data, mobile and Cloud, also known as Open Platform 3.0. How are those things affecting the realm of security right now?

I think they’re just beginning to. Cloud—obviously the security issues around Cloud have been here as long as Cloud has been over the past four or five years. But if you look at things like the Internet of Things and some of the other things that comprise Open Platform 3.0, the security impacts are really just starting to be felt and considered. So I think information security professionals are really just starting to wrap their hands around, what are those new security risks that come with those technologies, and, more importantly, what do we need to do about them? What do we need to do to mitigate risk around something like the Internet of Things, for example?

What kind of security threats do you think companies need to be most worried about over the next couple of years?

There’s a plethora of things out there right now that organizations need to be concerned about. Certainly advanced persistent threat, the idea that maybe nation states are trying to attack other nations, is a big deal. It’s a very real threat, and it’s something that we have to think about – looking at the risks we’re facing, exactly what is that adversary and what are they capable of? I think profit-motivated criminals continue to be on everyone’s mind with all the credit card hacks that have just come out. We have to be concerned about cyber criminals who are profit motivated and who are very skilled and determined and obviously there’s a lot at stake there. All of those are very real things in the security world and things we have to defend against.

The Security track at the San Francisco conference focused primarily on risk management. How can companies better approach and manage risk?

As I mentioned, we did a lot of work over the last few years in the area of Risk Management, and the FAIR Standard that we introduced breaks risk down into the frequency of bad things happening and the impact if they do happen. So I would suggest taking that sort of approach – using the Risk Taxonomy Standard and the Risk Analysis Standard that we’ve introduced – and really looking at what are the critical assets to protect, who’s likely to attack them, and what’s the probable frequency of attacks that we’ll see? And then looking at the impact side: what’s the consequence if somebody successfully attacks them? That’s really the key – breaking it down, looking at it that way and then taking the right mitigation steps to reduce risk on those assets that are really important.
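As a rough illustration of that frequency-times-impact decomposition (a sketch in the spirit of Open FAIR, not an official calculation from the standard), the snippet below simulates annual loss exposure for one asset. The frequency and magnitude ranges are illustrative assumptions.

```python
# Sketch of a frequency x magnitude risk decomposition for one asset.
# Ranges below are illustrative assumptions, not calibrated estimates.

import random

def simulate_annual_loss(trials=100_000):
    losses = []
    for _ in range(trials):
        # Loss event frequency: how often bad things happen per year (assumed)
        events = random.randint(0, 4)
        # Loss magnitude: impact per event, between min and max estimates (assumed)
        loss = sum(random.uniform(50_000, 400_000) for _ in range(events))
        losses.append(loss)
    return losses

losses = sorted(simulate_annual_loss())
print(f"Expected annual loss: {sum(losses) / len(losses):,.0f}")
print(f"95th percentile loss: {losses[int(0.95 * len(losses))]:,.0f}")
```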

You’ve recently become involved in The Open Group’s new Healthcare Forum. Why a healthcare vertical forum for The Open Group?

In the area of healthcare, what we see is that there’s just a highly fragmented aspect to the ecosystem. You’ve got healthcare information that’s captured in various places, and the information doesn’t necessarily flow from provider to payer to other providers. In looking at industry verticals, the healthcare industry seemed like an area that really needed a lot of approaches that we bring from The Open Group—TOGAF and Enterprise Architecture approaches that we have.

If you take it up to a higher level, it really needs the Boundaryless Information Flow that we talk about in The Open Group. We need to get to the point where our information as patients is readily available in a secure manner to the people who need to give us care, as well as to us because in a lot of cases the information exists as islands in the healthcare industry. In looking at healthcare it just seemed like a natural place where, in our economies – and it’s really a global problem – a lot of money is spent on healthcare and there’s a lot of opportunities for improvement, both in the economics but in the patient care that’s delivered to individuals through the healthcare system. It just seemed like a great area for us to focus on.

As the new Healthcare Forum kicks off this year, what are the priorities for the Forum?

The Healthcare Forum has just published a whitepaper summarizing the findings of the workshop we held in Philadelphia last summer. We’re also working on a treatise, which will outline our views about the healthcare ecosystem and where standards and architecture work is most needed. We expect to have that produced over the next couple of months. Beyond that, we see a lot of opportunities for doing architecture and standards work in the healthcare sector, and our membership is going to determine which of those areas to focus on and which projects to initiate first.

For more on The Open Group Security Forum, please visit http://www.opengroup.org/subjectareas/security. For more on The Open Group Healthcare Forum, see http://www.opengroup.org/getinvolved/industryverticals/healthcare.

Jim Hietala, CISSP, GSEC, is Vice President, Security for The Open Group, where he manages all IT security, risk management and healthcare programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


What the C-Suite Needs to Prepare for in the Era of BYO Technology

By Allen Brown, President and CEO, The Open Group

IT today is increasingly being driven by end-users. This phenomenon, known as the “consumerization of IT,” is a result of how pervasive technology has become in daily life. Years ago, IT was primarily the realm of technologists and engineers. Most people, whether in business settings or at home, did not have the technical know-how to source their own applications, write code for a web page or even set up their own workstation.

Today’s technologies are more user-friendly than ever, and they’ve become ubiquitous. The introduction of smartphones and tablets has ushered in the era of “BYO,” with consumers now bringing the technologies they like and are most comfortable working with into the workplace, all with the expectation that IT will support them. The days when IT decided what technologies would be used within an organization are no more.

At the same time, IT has lost another level of influence due to Cloud computing and Big Data. Again, the “consumers” of IT within the enterprise—line of business managers, developers, marketers, etc.—are driving these changes. Just as users want the agility offered by the devices they know and love, they also want to be able to buy and use the technologies they need to do their job on the fly rather than wait for an IT department to go through a months-long (or years-long) process of requisitions and approvals. And it’s not just developers or IT staff who are sourcing their own applications—marketers are buying applications with their credit cards, and desktop users are sharing documents and spreadsheets via web-based office solutions.

When you can easily buy the processing capacity you need when you need it with your credit card or use applications online for free, why wait for approval?

The convergence of this next era of computing – we call it Open Platform 3.0™ – is creating a Balkanization of the traditional IT department. IT is no longer the control center for technology resources. As we’ve been witnessing over the past few years and as industry pundits have been prognosticating, IT is changing to become more of a service-based command central than a control center from which IT decisions are made.

These changes are happening within enterprises everywhere. The tides of change being brought about by Open Platform 3.0 cannot be held back. As I mentioned in my recent blog on Future Shock and the need for agile organizations, adaptation will be key for companies’ survival as constant change and immediacy become the “new normal” for how they operate.

These changes will, in fact, be positive for most organizations. As technologies converge and users drive the breakdown of traditional departmental silos and stovepipes, organizations will become more interoperable. More than ever, new computing models are driving the industry toward The Open Group’s vision of Boundaryless Information Flow™ within organizations. But the changes resulting from consumer-led IT are not just the problem of the IT department. They are on track to usher in a whole host of organizational changes that all executives must not only be aware of, but must also prepare and plan for.

One of the core issues around consumerized IT that must be considered is the control of resources. Resource planning in terms of enabling business processes through technology must now be the concern of every person within the C-Suite, from the CEO to the CIO and even the CMO.

Take, for example, the financial controls that must be considered in a BYO world. This issue, in particular, hits two very distinct centers of operations most closely—the offices of both the CIO and the CFO.

In the traditional IT paradigm, technology has been a cost center for most businesses, with CFOs usually having the final say in what technologies can be bought and used based on budget. There have been very specific controls placed on purchases, each leaving an audit trail that the finance department could easily track and handle. With the Open Platform 3.0 paradigm, those controls go straight out the window. When someone in marketing buys and uses an application on their own without the CIO approving its use or the CFO having a paper trail for the purchase, accounting and financial or technology auditing can become a potential corporate nightmare.

Alternatively, when users share information over the Web using online documents, the CIO, CTO or CSO may have no idea what information is going in and out of the organization or how secure it is. But sharing information through web-based documents—or a CRM system—might be the best way for the CMO to work with vendors or customers or keep track of them. The CMO may also need to begin tracking IT purchases within their own department.

The audit trail that must be considered in this new computing era can extend in many directions. IT may need an accounting of technical and personal assets. Legal may need information for e-Discovery purposes—how does one account for information stored on tablets or smartphones brought from home, or work-related emails sent from personal accounts? The CSO may require risk assessments to be performed on all devices or may need to determine how far an organization’s “perimeter” extends for security purposes. The trail is potentially as large as the organization itself and its entire extended network of employees, vendors, customers, etc.

What can organizations do to help mitigate the potential chaos of a consumer-led IT revolution?

Adapt. Be flexible and nimble. Plan ahead. Strategize. Start talking about what these changes will mean for your organization—and do it sooner rather than later. Work together. Help create standards that can help organizations maintain flexible but open parameters (and perimeters) for sourcing and sharing resources.

Executive teams, in particular, will need to know more about the functions of other departments than ever before. IT departments—including CTOs and EAs—will need to know more about other business functions—such as finance—if they are to become IT service centers. CFOs will need to know more about technology, security, marketing and strategic planning. CMOs and CIOs will need to understand regulatory guidelines not only around securing information but around risk and data privacy.

Putting enterprise and business architectures and industry standards in place can go a long way toward helping to create structures that maintain a healthy balance between providing the flexibility needed for Open Platform 3.0 and BYO while allowing enough organizational control to prevent chaos. With open architectures and standards, organizations will better be able to decide where controls are needed and when and how information should be shared among departments. Interoperability and Boundaryless Information Flow—where and when they’re needed—will be key components of these architectures.

The convergence being brought about by Open Platform 3.0 is not just about technology. It’s about the convergence of many things—IT, people, operations, processes, information. It will require significant cultural changes for most organizations and within different departments and organizational functions that are not used to sharing, processing and analyzing information beyond the silos that have been built up around them.

In this new computing model, Enterprise Architectures, interoperability and standards can and must play a central role in guiding the C-Suite through this time of rapid change so that users have the tools they need to be able to innovate, executives have the information they need to steer the proverbial ship and organizations don’t get left behind.

Allen Brown is the President and CEO of The Open Group. For more than ten years, he has been responsible for driving the organization’s strategic plan and day-to-day operations; he was also instrumental in the creation of the Association of Enterprise Architects (AEA). Allen is based in the U.K.


Accrediting the Global Supply Chain: A Conversation with O-TTPS Recognized Assessors Fiona Pattinson and Erin Connor

By The Open Group 

At the recent San Francisco 2014 conference, The Open Group Trusted Technology Forum (OTTF) announced the launch of the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program.

The program is one of the first accreditation programs worldwide aimed at assuring the integrity of commercial off-the-shelf (COTS) information and communication technology (ICT) products and the security of their supply chains.

In the three short years since OTTF launched, the forum has grown to include more than 25 member companies dedicated to safeguarding the global supply chain, through standards, against increasingly sophisticated cybersecurity attacks. Accreditation is yet another step in the process of protecting global technology supply chains from maliciously tainted and counterfeit products.

As part of the program, third-party assessor companies will be employed to assess organizations applying for accreditation, with The Open Group serving as the vendor-neutral Accreditation Authority that operates the program. Prior to the launch, the forum conducted a pilot program with a number of member companies. It was announced at the conference that IBM is the first company to become accredited, earning accreditation for its Application, Infrastructure and Middleware (AIM) software business division for its product integrity and supply chain practices.

We recently spoke with OTTF members Fiona Pattinson, director of strategy and business development at Atsec Information Security, and Erin Connor, director at EWA-Canada, at the San Francisco conference to learn more about the assessment process and the new program.

The O-TTPS focus is on securing the technology supply chain. What would you say are the biggest threats facing the supply chain today?

Fiona Pattinson (FP): I think in the three years since the forum began, certainly all the members have discussed the various threats quite a lot. It was one of the things we discussed as an important topic early on, and I don’t know if it’s the ‘biggest threat,’ but certainly the most important threats that we needed to address initially were those of counterfeit and maliciously tainted products. We came to that through both discussion with all the industry experts in the forum and also through research into some of the requirements from government, so that’s exactly how we knew which threats [to start with].

Erin Connor (EC):  And the forum benefits from having both sides of the acquisition process, both acquirers, and the suppliers and vendors. So they get both perspectives.

How would you define maliciously tainted and counterfeit products?

FP:  They are very carefully defined in the standard—we needed to do that because people’s understanding of that can vary so much.

EC: And actually the concept of ‘maliciously’ tainted was incorporated close to the end of the development process for the standard at the request of members on the acquisition side of the process.

[Note: The standard precisely defines maliciously tainted and counterfeit products as follows:

"The two major threats that acquirers face today in their COTS ICT procurements, as addressed in this Standard, are defined as:

1. Maliciously tainted product – the product is produced by the provider and is acquired through a provider’s authorized channel, but has been tampered with maliciously.

2. Counterfeit product – the product is produced other than by, or for, the provider, or is supplied to the provider by other than a provider’s authorized channel and is presented as being legitimate even though it is not."]

The OTTF announced the Accreditation Program for the OTTP Standard at the recent San Francisco conference. Tell us about the standard and how the accreditation program will help ensure conformance to it?

EC: The program is intended to provide organizations with a way to accredit their lifecycle processes for their product development so they can prevent counterfeit or maliciously tainted components from getting into the products they are selling to an end user or into somebody else’s supply chain. It was determined that a third-party type of assessment program would be used. For the organizations, they will know that we Assessors have gone through a qualification process with The Open Group and that we have in place all that’s required on the management side to properly do an assessment. From the consumer side, they have confidence the assessment has been completed by an independent third-party, so they know we aren’t beholden to the organizations to give them a passing grade when perhaps they don’t deserve it. And then of course The Open Group is in position to oversee the whole process and award the final accreditation based on the recommendation we provide.  The Open Group will also be the arbiter of the process between the assessors and organizations if necessary. 

FP:  So The Open Group’s accreditation authority is validating the results of the assessors.

EC: It’s a model that is employed in many, many other product or process assessment and evaluation programs, where the actual accreditation authority steps back and has third parties do the assessment.

FP: It is important that the assessor companies are working to the same standard so that there’s no advantage in taking one assessor over the other in terms of the quality of the assessments that are produced.

How does the accreditation program work?

FP: Well, it’s brand new so we don’t know if it is perfect yet, but having said that, we have worked over several months on defining the process, and we have drawn from The Open Group’s existing accreditation programs, as well as from the forum experts who have worked in the accreditation field for many years. We have been performing pilot accreditations in order to check out how the process works. So it is already tested.

How does it actually work? Well, first of all an organization will feel the need to become accredited and at that point will apply to The Open Group to get the accreditation underway. Once their scope of accreditation – which may be as small as one product or theoretically as large as a whole global company – is agreed, and the application is reviewed and approved by The Open Group, then they engage an assessor.

There is a way of sampling a large scope to identify its process variations, using something we term ‘selective representative products.’ It’s basically a way of logically sampling a big scope so that we capture the process variations within the scope and make sure that the assessment is kept to a reasonable size for the organization undergoing the assessment, while still giving good assurance to the consumers that it is a representative sample. The assessment is performed by the Recognized Assessor company, and a final report is written and provided to The Open Group for their validation. If everything is in order, then the company will be accredited and their scope of conformance will be added to the accreditation register and trademarked.

EC: So the customers of that organization can go and check the registration for exactly what products are covered by the scope.

FP: Yes, the register is public and anybody can check. So if IBM says WebSphere is accredited, you can go and check that claim on The Open Group web site.

How long does the process take or does it vary?

EC: It will vary depending on how large the scope to be accredited is in terms of the size of the representative set and the documentation evidence. It really does depend on what the variations in the processes are among the product lines as to how long it takes the assessor to go through the evidence and then to produce the report. The other side of the coin is how long it takes the organization to produce the evidence. It may well be that they might not have it totally there at the outset and will have to create some of it.

FP: As Erin said, it varies by the complexity and the variation of the processes and hence the number of selected representative products. There are other factors that can influence the duration. There are three parties influencing that: The applicant Organization, The Open Group’s Accreditation Authority and the Recognized Assessor.

For example, we found that the initial work by the Organization and the Accreditation Authority in checking the scope and the initial documentation can take a few weeks for a complex scope; of course, for the pilots we were all new at doing that. In this early part of the project it is vital to get the scope both clearly defined and approved, since it is key to a successful accreditation.

It is important that an Organization assigns adequate resources to help keep this to the shortest time possible, both during the initial scope discussions, and during the assessment. If the Organization can provide all the documentation before they get started, then the assessors are not waiting for that and the duration of the assessment can be kept as short as possible.

Of course the resources assigned by the Recognized Assessor also influences how long an assessment takes. A variable for the assessors is how much documentation do they have to read and review? It might be small or it might be a mountain.

The Open Group’s final review and oversight of the assessment takes some time and is influenced by resource availability within that organization. If they have any questions it may take a little while to resolve.

What kind of safeguards does the accreditation program put in place for enforcing the standard?

FP: It is a voluntary standard—there’s no requirement to comply. Currently some of the U.S. government organizations are recommending it. For example, NASA in their SEWP contract and some of the draft NIST documents on Supply Chain refer to it, too.

EC: In terms of actual oversight, we review what their processes are as assessors, and the report and our recommendations are based on that review. The accreditation expires after three years so before the three years is up, the organization should actually get the process underway to obtain a re-accreditation.  They would have to go through the process again but there will be a few more efficiencies because they’ve done it before. They may also wish to expand the scope to include the other product lines and portions of the company. There aren’t any periodic ‘spot checks’ after accreditation to make sure they’re still following the accredited processes, but part of what we look at during the assessment is that they have controls in place to ensure they continue doing the things they are supposed to be doing in terms of securing their supply chain.

FP: And then the key part is that the agreement the organization signs with The Open Group includes the fact that the organization warrants and represents that it remains in conformance with the standard throughout the accreditation period. So there is that assurance too, which builds on the more formal assessment checks.

What are the next steps for The Open Group Trusted Technology Forum?  What will you be working on this year now that the accreditation program has started?

FP: Reviewing the lessons we learned through the pilot!

EC: And reviewing comments from members on the standard now that it’s publicly available and working on version 1.1 to make any corrections or minor modifications. While that’s going on, we’re also looking ahead to version 2 to make more substantial changes, if necessary. The standard is definitely going to be evolving for a couple of years and then it will reach a steady state, which is the normal evolution for a standard.

For more details on the O-TTPS accreditation program, to apply for accreditation, or to learn more about becoming an O-TTPS Recognized Assessor visit the O-TTPS Accreditation page.

For more information on The Open Group Trusted Technology Forum please visit the OTTF Home Page.

The O-TTPS standard and the O-TTPS Accreditation Policy are freely available from the Trusted Technology Section in The Open Group Bookstore.

For information on joining the OTTF membership please contact Mike Hickey – m.hickey@opengroup.org

Fiona Pattinson is responsible for developing new and existing atsec service offerings. Under the auspices of The Open Group’s OTTF, alongside many expert industry colleagues, Fiona has helped develop The Open Group’s O-TTPS, including developing the accreditation program for supply chain security. In the past, Fiona has led service developments which have included establishing atsec’s US Common Criteria laboratory, the CMVP cryptographic module testing laboratory, the GSA FIPS 201 TP laboratory, TWIC reader compliance testing, NPIVP, SCAP, PCI, biometrics testing and penetration testing. Fiona has responsibility for understanding a broad range of information security topics and the application of security in a wide variety of technology areas, from low-level design to the enterprise level.

Erin Connor is the Director at EWA-Canada responsible for EWA-Canada’s Information Technology Security Evaluation & Testing Facility, which includes a Common Criteria Test Lab, a Cryptographic & Security Test Lab (FIPS 140 and SCAP), a Payment Assurance Test Lab (device testing for PCI PTS POI & HSM, Australian Payment Clearing Association and Visa mPOS) and an O-TTPS Assessor lab recognized by The Open Group. Erin participated with other expert members of The Open Group Trusted Technology Forum (OTTF) in the development of The Open Group Trusted Technology Provider Standard for supply chain security and its accompanying Accreditation Program. Erin joined EWA-Canada in 1994, and his initial activities in the IT Security and Infrastructure Assurance field included working on the team fielding a large-scale Public Key Infrastructure system, Year 2000 remediation and studies of wireless device vulnerabilities. Since 2000, Erin has been working on evaluations of a wide variety of products including hardware security modules, enterprise security management products, firewalls, mobile device and management products, as well as system and network vulnerability management products. He was also the only representative of an evaluation lab in the Biometric Evaluation Methodology Working Group, which developed a proposed methodology for the evaluation of biometric technologies under the Common Criteria.


The Open Group and APMG Work Together to Promote TOGAF® and ArchiMate®

The APM Group (APMG) and The Open Group have announced a new partnership whereby APMG will support the accreditation services of The Open Group’s products. The arrangement will initially focus on TOGAF® and ArchiMate®, both standards of The Open Group.

APMG’s team of global assessors will be supporting The Open Group’s internal accreditation team in conducting their assessment activities. The scope of the assessments will focus on organizations, materials and training delivery.

“A significant value to The Open Group in this new venture is the ability to utilize APMG’s team of experienced multi-lingual assessors who are based throughout the world. This will help The Open Group establish new markets and ensure quality support of existing markets,” said James de Raeve, Vice President of Certification at The Open Group.

Richard Pharro, CEO of APMG, said, “This agreement presents an excellent opportunity to APMG Accredited Training Organizations which are interested in training in The Open Group’s products, as their existing APMG accredited status will be recognized by The Open Group. We believe our global network will significantly enhance the awareness and take-up of TOGAF and ArchiMate.”

About The Open Group

The Open Group is an international vendor- and technology-neutral consortium upon which organizations rely to lead the development of IT standards and certifications, and to provide them with access to key industry peers, suppliers and best practices. The Open Group provides guidance and an open environment in order to ensure interoperability and vendor neutrality. Further information on The Open Group can be found at http://opengroup.org.

About APM Group

The APM Group is one of the world’s largest certification bodies for knowledge based workers. As well as the certifications mentioned above, we offer competency-based assessments for specialist roles in the security and aerospace industries. We work with government agencies to help develop people who can achieve great things for the organizations they work for.


One Year Later: A Q&A Interview with Chris Harding and Dave Lounsbury about Open Platform 3.0™

By The Open Group

The Open Group launched its Open Platform 3.0™ Forum nearly one year ago at the 2013 Sydney conference. Open Platform 3.0 refers to the convergence of new and emerging technology trends such as Mobile, Social, Big Data, Cloud and the Internet of Things, as well as the new business models and system designs these trends are pushing organizations toward due to the consumerization of IT and evolving user behaviors. The Forum was created to help organizations address the architectural and structural considerations that businesses must consider to take advantage of and benefit from this evolutionary shift in how technology is used.

We sat down with The Open Group CTO Dave Lounsbury and Open Platform 3.0 Director Dr. Chris Harding at the recent San Francisco conference to catch up on the Forum’s activities and progress since launch and what they’ll be working on during 2014.

The Open Group’s Forum, Open Platform 3.0, was launched almost a year ago in April of 2013. What has the Forum been working on over the past year?

Chris Harding (CH): We launched at the Sydney conference in April of last year. What we’ve done since then first of all was to look at the requirements for the platform, and we did this using the proven TOGAF® technique of the Business Scenario. So over the course of last summer, the summer of 2013, we developed a Business Scenario capturing the requirements for Open Platform 3.0 and that was published just before The Open Group conference in October. Following that conference, the main activity that we’ve been doing is in fact furthering the requirements space. We’ve been developing analysis of use cases, so currently we have 22 different use cases that members of the forum have put together which are illustrating the use of the convergent technologies and most importantly the use of them in combination with each other.

What we’re doing here in this meeting in San Francisco is to obtain from that basis of requirements and use cases an understanding of what the platform fundamentally should be because it is our intention to produce a Snapshot definition of the platform by the end of March. So in the first year of the Forum, we hope that we will finish that year by producing a Snapshot definition of Open Platform 3.0.

Dave Lounsbury (DL): First, the roots of the Open Platform go deeper. Previous to that we had a number of work groups in the areas of Cloud, SOA and Semantic Interoperability. All of those were early pieces, and what we saw at the beginning of 2013 was a coalescing of that into this concept that businesses were looking for a new platform for their operations that combined aspects of Social, Mobile, Cloud computing, Big Data and the analytics that go along with it. We saw that emerging in the marketplace, and we formed the Forum to develop that direction. The Open Group always takes an end-to-end view of any problem – we like to look at the whole ecosystem. We want to make sure that the technical standards aren’t just point targets and actually address a business need.

Some of the work groups within The Open Group, such as Quantum Lifecycle Management (QLM) and Semantic Interoperability, have been brought under the umbrella of Open Platform 3.0, most notably the Cloud Work Group. How will the work of these groups continue under Platform 3.0?

CH: Some of the work already going on in The Open Group was directly or indirectly relevant to Open Platform 3.0. First and most importantly, that was the work of the Cloud Work Group – Cloud being one of the convergent technologies – and the Cloud Work Group became a part of Open Platform 3.0. Two other activities also became a part of Open Platform 3.0. One of these was the Semantic Interoperability Work Group, and that is because we recognized that Semantic Interoperability has to be an important part of how these technologies work with each other. It may not be that we have a full definition of that in the first version of the standard – it’s a notoriously difficult area – but over the course of time, we hope to incorporate a Semantic Interoperability component in the Platform definition, and that may well build on the work that we’ve been doing with the Universal Data Element Framework, the UDEF project, which is currently undergoing a major restructuring. The key thing from the Open Platform 3.0 perspective is how the semantic convention relates to the convergence of the technologies in the platform.

In terms of QLM, that became part of Open Platform 3.0 because one of the key convergent technologies is the Internet of Things, and QLM overlaps significantly with that. QLM is not about the Internet of Things, as such, but it does have a strong component of understanding the way networked sensors and controls work, so that’s become an important contribution to the new Forum.

DL: Like in any platform there’s going to be multiple components. In Open Platform 3.0, one of the big drivers for this change is Big Data. Big Data is very trendy, right? But where does Big Data come from? Well, it comes from increased connectivity, increased use of mobile devices, increased use of sensors – the ‘Internet of Things.’ All of these things are generating data about usage patterns, where people are, what they’re doing, what they’re buying, what they’re interested in and what their likes and dislikes are, creating a massive flood of data. Now the question becomes ‘how do you compute on that data?’ You need to handle that massively scalable stream of data. You need massively scalable computing underneath it, and you need the ability to move large amounts of information from one place to another. When you think about the analysis of data like that, you have algorithms that do a lot of data access and they’ll have big spikes of computation as they create some model of it. If you’re going to look at 10 zillion records, you don’t want to buy enough computers so you can always look at 10 zillion records; you want to be able to turn that on, do your analysis and turn it back off. That’s, of course, why Cloud is a critical component of Open Platform 3.0.
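[A back-of-the-envelope calculation illustrates the economics of turning capacity on and off; the node counts and prices below are illustrative assumptions, not quotes from any provider.]

```python
# Elasticity sketch: renting burst capacity only while an analysis runs
# vs. owning enough capacity to run it at any time. All figures assumed.

nodes_needed = 200    # cluster size the analysis spike requires (assumed)
hourly_rate = 1.50    # assumed on-demand price per node-hour
hours_per_run = 6
runs_per_month = 4

on_demand = nodes_needed * hourly_rate * hours_per_run * runs_per_month
always_on = nodes_needed * hourly_rate * 24 * 30  # keeping it running

print(f"Burst only: ${on_demand:,.0f}/month")   # -> $7,200
print(f"Always on:  ${always_on:,.0f}/month")   # -> $216,000
```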

Open Platform 3.0 encompasses a lot of different technologies as well as how they are converging. How do you piece apart everything that Platform 3.0 entails to begin to formulate a standard for it?

CH: I mentioned that we developed 22 use cases. The way that we’re addressing this is to look at use cases and the business and technical ecosystems that those use cases exemplify and to abstract from that some fundamental architectural patterns. These we believe will be the basis for the initial definition of the platform.

DL: That gets back to this question about how we’re starting up. Again, it’s The Open Group’s mantra that we look at a business problem as an end-to-end problem. So what you’ll see in Open Platform 3.0 is that we’ve done the Business Scenario to figure out what the business motivator is and what business people need to get this done, and we’re fleshing that out with these detailed use cases.

One of the things that we’re very careful about in The Open Group is that we don’t replicate what’s going on in other standards bodies. If you look at what’s going on in Cloud, and what continues to go on in Cloud under the Open Platform 3.0 banner, we really focused in on what business people really need in the Cloud guides – those are about how business people really use it. We’ve stayed away for a long time from the bits and bytes – we’re now doing a Cloud Reference Architecture – but we’ve also created the Cloud Ecosystem Reference Model, which was just published. That Cloud Ecosystem Reference Model, if you read through it, isn’t about how bits flow around, it’s about how partners interact with each other – what to look for in your Cloud partner, who are the players? When you go to use Cloud in your business, what players do you have to engage with? What are the roles that you have to engage with them on? So again it’s really that business level of guidance that The Open Group is really good at, and we do liaise with other organizations in order to get technical stuff if we need it – or if not, we’ll create it ourselves because we’ve got very competent technical people – but again, it’s that balanced business approach that distinguishes The Open Group way.

Many industry pundits have said that Open Platform 3.0 is ultimately about a shift toward user-driven IT. How does that change the standards making process when most standards are ultimately put in place by technologists not necessarily end-users?

CH: It’s an interesting question. I mentioned the Business Scenario that we developed over the summer – one of the key things that came out of that was that there is this shift towards a more direct use of the technologies by business users. And that is partly because it’s becoming more possible. Cloud is one of the key factors that has shortened the cycle of procuring and putting IT in place to support business use, and made it more possible to manage IT directly. At the same time [users are] becoming impatient with delay and wanting to gain the benefits of technology directly and not at arm’s length through the IT department. In connection with this we’re seeing phenomena such as the business technologist – the technical specialist who works with, or is employed by, the business department rather than within a separate IT department, and one of whose key strengths is an understanding of the business. So that is certainly an important dimension that we’re seeing, and one of the requirements for the Platform is that it should be usable in an environment where business is using IT more directly.

But that wasn’t the question you asked. The question was, ‘isn’t it a problem that the standards are defined by technologists?’ We don’t believe it’s a problem, provided that the technologists do have an understanding of the business environment. That was why, in the Business Scenario activity that we conducted, one of the key inputs was a roundtable workshop with CIO-level people, and that is where a lot of our perspective on why things are changing comes from. Open Platform 3.0 certainly does have a dimension of fundamental architecture patterns, and part of that is business architecture patterns, but it also has a technical dimension, and obviously you do really need the technical people to explore that dimension, though they do always need to keep in mind that the technology is there to serve the business.

DL: If you actually look at trends in the marketplace about how IT is done, and in fact if you look at the last blog post that Allen [Brown] did about agile, the whole thrust of agile methodologies and its successor DevOps is to really get the implementers right next to the business people and have a very tight arrangement in order to get fast iteration and really have the implementer do what the business person needs. I actually view consumerization not as some outside threat but actually a logical extension of that trend. What’s happening in my opinion is that people who are not technologists, who are not part of the IT department, are getting comfortable using and managing their own technology. And so they’re making decisions that used to be made by the IT department years ago – or what used to be the IT department. First there was the big mainframe, and you handed in your cards at a window and you got your printout in your little cubby hole. Then the IT department bought your PC, and now we bring our own devices. There’s nothing wrong with that, that’s people getting comfortable with technology and making decisions. I think that’s one of the reasons we have need for an Open Platform 3.0 approach – to develop business guidance and eventually technical standards on how we keep up with that trend. Because it’s a very natural trend – people want to control the resources they need to get their job done, and if those resources are technical resources, and they’re comfortable doing that, great!

Convergence and Open Platform 3.0 seem to take us closer and closer to The Open Group’s vision of Boundaryless Information Flow™.  Is Open Platform 3.0 the fulfillment of that vision?

DL: I think I’d be crazy to say that it’s the endpoint of that vision. I think being able to move large amounts of data and make decisions on it is a significant step forward in Boundaryless Information Flow, but this is a two-edged sword. I talked about all that data being generated by mobile devices and sensors and retail networks and social networks and things like that. That data is growing exponentially. The number of people who can make decisions on that data is growing at best linearly, and not very quickly. So if there’s all this data out there and nobody to look at it, we need to ask: have we lowered the boundary for communications, or have we actually raised it by creating a pile of data that no one can climb? That’s why I think a next step is, in fact, more machine-assisted analytics and predictive analytics and machine learning that will help humans digest and understand that data. That will be, I think, yet another step toward Boundaryless Information Flow. Moving bits around does not equate to information flow – it’s only information when it moves from data to being information in a human’s brain. Until we lower that barrier as well, we’re not there. And even beyond that, there are still lots of things that can be done, in terms of breaking down human language barriers and things like that, or social networks in more intuitive ways. I think there’s a long way to go. I think this is a really important step forward, but fulfillment is too strong a word.

CH: Not in itself, I don’t believe. It is a major contribution towards the vision of Boundaryless Information Flow, but it is not the complete fulfillment of that vision. Since we formulated the problem statement of Boundaryless Information Flow, there have been a number of developments that have impacted on it and maybe helped to bring it closer. So you might think of SOA as an important enabling technology for Boundaryless Information Flow, replacing the information silos with interacting services. Now we’re seeing Open Platform 3.0, which is certainly going to have a service-oriented flavor, shall we say, although it probably will not look exactly like traditional SOA. The Boundaryless Information Flow requirement was a very far-reaching problem statement. The Interoperable Business Scenario was where it was first set out, and since then we’ve been gradually making progress toward it. Open Platform 3.0 will bring it closer, but I’m sure there will be other things still needed to make it happen.

One of the key things for Boundaryless Information Flow is Enterprise Architecture. So within a particular enterprise, the business and IT need to be architected to enable Boundaryless Information Flow, and TOGAF is the method that is defined and maintained by The Open Group for how enterprises define enterprise architectures. Open Platform 3.0 will complement that by providing a ‘this is what an architecture looks like that enables the business to take advantage of these new converging technologies.’ But there will still be a need for the Enterprise Architect to put that together with the other particular factors involved in an enterprise to create an architecture for Boundaryless Information Flow within that enterprise.

When can we expect the first standard from Open Platform 3.0?

DL: Well, we published the Cloud Ecosystem Reference Guide, and again the understanding of how business partners relate in the Cloud world is a key component of Open Platform 3.0. The Forum has a roadmap, and will start publishing the case studies still in process.

The message I would say is there’s already early value in the Cloud Ecosystem Reference Model, which is a logical continuation of cloud work that had already gone on in the Work Group, but is now part of the Forum as part of Open Platform 3.0.

CH: That’s always a tricky question; however, I can tell you what is planned. The intention, as I said, was to produce a Snapshot definition by the end of March and, given that we are a quarter of the way through the meeting at this conference – which is the key meeting that will define the basis for that – the progress has been good so far, so I’m optimistic. A Snapshot is not a Standard. A Snapshot is a statement of ‘this is what we are thinking and might be what it will look like,’ but it’s not guaranteed in any way that the Standard will follow the Snapshot. We are intending to produce the first Standard definition of the platform in about a year’s time after the Snapshot. That will give the opportunity for people not only within The Open Group but outside The Open Group to give us input and further understanding of the way people intend to use the platform as feedback on the Snapshot, which should be the basis for the first published standard.

For more on the Open Platform 3.0 Forum, please visit: http://www3.opengroup.org/subjectareas/platform3.0.

If you have any questions about Open Platform 3.0 or if you would like to join the new Forum, please contact Chris Harding (c.harding@opengroup.org) for queries regarding the Forum or Chris Parnell (c.parnell@opengroup.org) for queries regarding membership.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Open Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.

Dave Lounsbury is Chief Technical Officer (CTO) and Vice President, Services for The Open Group. As CTO, he ensures that The Open Group’s people and IT resources are effectively used to implement the organization’s strategy and mission. As VP of Services, Dave leads the delivery of The Open Group’s proven processes for collaboration and certification, both within the organization and in support of third-party consortia. Dave holds a degree in Electrical Engineering from Worcester Polytechnic Institute, and is the holder of three U.S. patents.


How to Build a Smarter City – Join The Open Group Tweet Jam on February 26

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

On Wednesday, February 26, The Open Group will host a Tweet Jam examining smart cities and how Real-time and Embedded Systems can seamlessly integrate inputs from various agencies and locations. That collective data allows local governments to better adapt to change by implementing an analytics-based approach to measure:

  • Economic activity
  • Mobility patterns
  • Resource consumption
  • Waste management and sustainability measures
  • Inclement weather
  • And much more!

These metrics allow smart cities to do much more than just coordinate responses to traffic jams: they can forecast and coordinate safety measures in advance of physical disasters and inclement weather; calculate where offices and shops can be laid out most efficiently; and work out how all the parts of urban life – including energy, sustainability, and infrastructure repairs, planning and development – should fit together.

Smart cities are already very much a reality in the Middle East and in Korea, and those have become a model for developers in China and for redevelopment in Europe. Market research firm IDC Government Insights projects that 2014 is the year cities around the world start getting smart. It predicts a $265 billion spend by cities worldwide this year alone to implement new technology and integrate agency data. Part of the reason for that spend is likely spurred by the fact that more than half the world’s population currently lives in urban areas. With urbanization rates rapidly increasing, the Brookings Institution estimates that number could swell to 75 percent of the global populace by 2050.

While the awe-inspiring Rio de Janeiro is proving to be an interesting smart city model for cities across the world, are smart cities always the best option for informing city decisions? Could the beauty of a self-regulating open grid allow people to decide how best to use spaces in the city?

Please join us on Wednesday, February 26 at 9:00 am PT/12:00 pm ET/5:00 pm GMT for a tweet jam that will discuss the issues around smart cities. We welcome The Open Group members and interested participants from all backgrounds to join the discussion and interact with our panel of thought-leaders, including David Lounsbury, CTO, and Chris Harding, Director of Interoperability, from The Open Group. To access the discussion, please follow the #ogchat hashtag during the allotted discussion time.

What Is a Tweet Jam?

A tweet jam is a one-hour “discussion” hosted on Twitter. The purpose of the tweet jam is to share knowledge and answer questions on relevant and thought-provoking issues. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone using Twitter interested in the topic) is encouraged to join the discussion.

Participation Guidance

Whether you’re a newbie or veteran Twitter user, here are a few tips to keep in mind:

Have your first #ogchat tweet be a self-introduction: name, affiliation, occupation.

Start all other tweets with the question number you’re responding to and add the #ogchat hashtag.

Sample: “A1: There are already a number of cities implementing tech to get smarter. #ogchat”

Please refrain from product or service promotions. The goal of a tweet jam is to encourage an exchange of knowledge and stimulate discussion.

While this is a professional get-together, we don’t have to be stiff! Informality will not be an issue.

A tweet jam is akin to a public forum, panel discussion or Town Hall meeting – let’s be focused and thoughtful.

If you have any questions prior to the event or would like to join as a participant, please contact Rob Checkal (@robcheckal or rob.checkal@hotwirepr.com). We anticipate a lively chat and hope you will be able to join!
