Why Technology Must Move Toward Dependability through Assuredness™

By Allen Brown, President and CEO, The Open Group

In early December, a technical problem at the U.K.’s central air traffic control center in Swanwick, England caused significant delays felt at airports throughout Britain and Ireland, affecting flights between the U.K. and destinations from Europe to the U.S. At Heathrow alone, one of the world’s largest airports, there were a reported 228 cancellations, affecting 15 percent of the 1,300 daily flights to and from the airport. With a ripple effect that also disturbed flight schedules at airports in Birmingham, Dublin, Edinburgh, Gatwick, Glasgow and Manchester, Britain’s National Air Traffic Services (NATS) was reported to have handled 20 percent fewer flights that day as a result of the glitch.

According to The Register, the problem occurred when a touch-screen telephone system that allows air traffic controllers to talk to each other failed to update during what should have been a routine shift change from the nighttime to the daytime system. News reports noted that the NATS system is the largest of its kind in Europe, containing more than a million lines of code, and that it took the engineering and manufacturing teams nearly a day to fix the problem. As a result of the snafu, Irish airline Ryanair went so far as to call on Britain’s Civil Aviation Authority to intervene to prevent further delays and to ensure that better contingency measures are put in place so that such failures do not happen again.

Increasingly complex systems

As businesses have come to rely more and more on technology, the systems used to keep operations running smoothly from day to day have grown not only larger but also far more complex. We are long past the days when a single mainframe was used to handle a few batch calculations.

Today, large global organizations in particular have systems spread across multiple centers of technical operations, often scattered around the globe. And with industries becoming ever more interrelated, individual company systems are often connected to larger extended networks, as when trading firms are connected to stock exchanges or, as was the case with the Swanwick failure, airlines are affected by NATS’ network problems. When systems become so large that they are part of even larger interconnected systems, the boundaries of the entire system are often no longer known.

The Open Group’s vision for Boundaryless Information Flow™ has never been closer to fruition than it is today. Systems have become increasingly open out of necessity because commerce takes place on a more global scale than ever before. This is a good thing. But as these systems have grown in size and complexity, there is more at stake when they fail than ever before.

The ripple effect felt when technical problems shut down major commercial systems cuts far, wide and deep. Problems such as what happened at Swanwick can affect the entire extended system. In this case, NATS, for example, suffers from damage to its reputation for maintaining good air traffic control procedures. The airlines suffer in terms of cancelled flights, travel vouchers that must be given out and angry passengers blasting them on social media. The software manufacturers and architects of the system are blamed for shoddy planning and for not having the foresight to prevent failures. And so on and so on.

Looking for blame

When large technical failures happen, stakeholders, customers, the public and now governments are beginning to look for accountability, for someone to assign blame. When the Obamacare website didn’t operate as expected, the U.S. Congress went looking for blame and jobs were lost. In the NATS fiasco, Ryanair asked the government to intervene. Risk.net has reported that after the Royal Bank of Scotland experienced a batch processing glitch last summer, the U.K. Financial Services Authority wrote to large banks in the U.K. requesting they identify the people in their organizations responsible for business continuity. And when U.S. trading company Knight Capital lost $440 million in 40 minutes after a trading software upgrade failed in August, U.S. Securities and Exchange Commission Chairman Mary Schapiro was quoted in the same article as stating: “If there is a financial loss to be incurred, it is the firm committing the error that should suffer that loss, not its customers or other investors. That more than anything sends a wake-up call to the entire industry.”

As governments, in particular, look to lay blame for IT failures, companies—and individuals—will no longer be safe from the consequences of these failures. And it won’t just be reputations that are lost. Lawsuits may ensue. Fines will be levied. Jobs will be lost. Today’s organizations are at risk, and that risk must be addressed.

Avoiding catastrophic failure through assuredness

As any IT person or Enterprise Architect well knows, completely preventing system failure is impossible. But mitigating system failure is not. Increasingly, the task of keeping systems from failing, rather than merely keeping them up and running, will be the job of CTOs and Enterprise Architects.

When systems grow to a level of massive complexity that encompasses everything from old legacy hardware to Cloud infrastructures to worldwide data centers, how can we make sure those systems are reliable, highly available and secure, and that they maintain optimal information flow while still operating cost effectively at maximum capacity?

In August, The Open Group introduced the first industry standard to address the risks associated with large complex systems, the Dependability through Assuredness™ (O-DA) Framework. This new standard is meant to help organizations both assess system risk and prevent failure as far as possible.

O-DA provides guidelines to ensure that large, complex, boundaryless systems run according to the requirements set out for them, while also providing contingencies for minimizing damage when a stoppage occurs. O-DA can be used as a standalone standard or in conjunction with an existing architecture development method (ADM), such as the TOGAF® ADM.

O-DA encompasses lessons learned within a number of The Open Group’s forums and work groups. It borrows from the work of the Security Forum’s Dependency Modeling (O-DM) and Risk Taxonomy (O-RT) standards, as well as from work done within The Open Group Trusted Technology Forum and the Real-Time and Embedded Systems Forum. Much of the work on this standard was completed thanks to the efforts of The Open Group Japan and its members.

This standard addresses the issue of responsibility for technical failures by providing a model for accountability throughout any large system. Accountability is at the core of O-DA because without accountability there is no way to create dependability or assuredness. The standard is also meant to address and account for the constant change that most organizations experience on a daily basis. Its two underlying principles provide models for a change accommodation cycle and a failure response cycle. Each cycle, in turn, provides guidance for creating a dependable and adaptable architecture, with accountability maintained along the way.
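To make the accountability idea a little more tangible, here is a purely illustrative Python sketch of recording who is answerable at each step of the two cycles. The cycle names come from the standard as described above; the phase labels, class and field names are hypothetical and are not taken from the O-DA text.

```python
# Purely illustrative sketch (not drawn from the O-DA standard): every step in a
# change accommodation or failure response cycle carries an accountable owner,
# so responsibility is traceable after the fact. Phase names are hypothetical.
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class CycleStep:
    cycle: str          # "change accommodation" or "failure response"
    phase: str          # hypothetical phase label
    accountable: str    # who is answerable for this step
    evidence: str       # what was produced or decided
    timestamp: datetime = field(default_factory=datetime.utcnow)

log: list[CycleStep] = []

def record(cycle: str, phase: str, accountable: str, evidence: str) -> None:
    """Append an accountability record so responsibility can be traced later."""
    log.append(CycleStep(cycle, phase, accountable, evidence))

# A hypothetical walk through a failure response
record("failure response", "detect", "operations team", "alert raised on shift-change telephone system")
record("failure response", "diagnose", "engineering team", "root cause: failed configuration update")
record("failure response", "recover", "engineering team", "rollback applied, service restored")

for step in log:
    print(f"[{step.cycle}] {step.phase}: {step.accountable} -> {step.evidence}")
```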

Ultimately, O-DA will help organizations identify potential anomalies and create contingencies for dealing with problems before or as they happen. The more organizations can do to build dependability into large, complex systems, the fewer technical disasters should occur. As systems continue to grow and their boundaries continue to blur, assuredness through dependability and accountability will be an integral part of managing complex systems into the future.

Allen Brown is President and CEO, The Open Group – a global consortium that enables the achievement of business objectives through IT standards.  For over 14 years Allen has been responsible for driving The Open Group’s strategic plan and day-to-day operations, including extending its reach into new global markets, such as China, the Middle East, South Africa and India. In addition, he was instrumental in the creation of the AEA, which was formed to increase job opportunities for all of its members and elevate their market value by advancing professional excellence.

Filed under Dependability through Assuredness™, Standards

The Open Group San Francisco 2014 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications

The Open Group’s San Francisco conference, held at the Marriott Union Square, began today, highlighting the theme of how the industry is moving “Toward Boundaryless Information Flow™.”

The morning plenary began with a welcome from The Open Group President and CEO Allen Brown. He began the day’s sessions by discussing the conference theme, reminding the audience that The Open Group’s vision of Boundaryless Information Flow began in 2002 as a means to break down the silos within organizations and provide better communications within, throughout and beyond organizational walls.

Heather Kreger, Distinguished Engineer and CTO of International Standards at IBM, presented the first session of the day, “Open Technologies Fuel the Business and IT Renaissance.” Kreger discussed how converging technologies such as social, mobile, Big Data, the Internet of Things and analytics, all powered by the cloud and open architectures, are forcing a renaissance within both IT and the businesses it serves. Fueling this renaissance is a combination of open standards and open source technologies, which can be used to build out the platforms needed to support these technologies at the speed innovation demands. To adapt to these new circumstances, architects should broaden their skillsets, developing deeper skills and competencies in multiple disciplines, technologies and cultures in order to better navigate this world of open source based development platforms.

The second keynote of the morning, “Enabling the Opportunity to Achieve Boundaryless Information Flow™,” was presented by Larry Schmidt, HP Fellow at Hewlett-Packard, and Eric Stephens, Enterprise Architect at Oracle. Schmidt and Stephens addressed how to cultivate a culture within healthcare ecosystems that enables better information flow. Because healthcare ecosystems are now primarily digital (encompassing not just individuals but also technology architectures and the Internet of Things), boundaryless communication is imperative so that individuals can become the managers of their own health and the healthcare ecosystem can be better defined. This in turn will help in creating standards that solve the architectural problems currently hindering information flow in healthcare systems, driving down costs and improving outcomes.

Following the first two morning keynotes, Schmidt provided a brief overview of The Open Group’s new Healthcare Forum. The forum plans to leverage existing Open Group best practices such as harmonization and existing standards (such as TOGAF®), and to work with other forums and verticals to create new standards that address the problems facing the healthcare industry today.

Mike Walker, Enterprise Architect at Hewlett-Packard, and Mark Dorfmueller, Associate Director of Global Business Services at Procter & Gamble, presented the morning’s final keynote, entitled “Business Architecture: The Key to Enterprise Transformation.” According to Walker, business architecture is beginning to change how enterprise architecture is done within organizations. To make that shift, Walker believes business architects must be able to understand business processes, communicate ideas and engage with others (including other architects) within the business, and offer services in order to implement and deliver successful programs. Dorfmueller illustrated business architecture in action by presenting how Procter & Gamble uses its business architecture to change how business is done within the company, based on three primary principles: being relevant, being practical and making the work consumable for those within the company who implement the architectures.

The morning plenary sessions culminated with a panel discussion on “Future Technology and Enterprise Transformation,” led by Dave Lounsbury, VP and CTO of The Open Group. The panel, which included all of the morning’s speakers, took a high-level view of how emerging technologies are eroding traditional boundaries within organizations. Capabilities within IT that were once highly specialized are becoming commoditized to the point where they offer new opportunities for companies, both because of how commonplace they have become and because we are getting smarter about how we use and extract value from our technologies, as well as because of the rapid pace of technology innovation we are experiencing today.

Finally, wrapping up the morning was the Open Trusted Technology Forum (OTTF), a forum of The Open Group, with forum director Sally Long presenting an overview of the new Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program, which launched today. The program is the first such accreditation to provide third-party certification that companies’ supply chain practices conform to the Open Trusted Technology Provider™ Standard (O-TTPS) for mitigating maliciously tainted and counterfeit products. IBM is the first company to earn the accreditation, and at least two other companies are currently going through the accreditation process.

Monday’s afternoon sessions were split between two tracks: Enterprise Architecture (EA) & Enterprise Transformation, and Open Platform 3.0.

In the EA & Enterprise Transformation track, Purna Roy and John Raspen, both Directors of Consulting at Cognizant Technology Solutions, discussed the need to take a broad view and consider factors beyond just IT architectures in their session, “Enterprise Transformation: More than an Architectural Transformation.”  In contrast, Kirk DeCosta, Solution Architect at PNC Financial Services, argued that existing architectures can indeed serve as the foundation for transformation in “The Case for Current State – A Contrarian Viewpoint.”

The Open Platform 3.0 track addressed issues around the convergence of technologies based on cloud platforms, including a session by Helen Sun, Enterprise Architect at Oracle, on Big Data as an enabler of information architectures and predictive analytics. Dipanjan Sengupta, Principal Architect at Cognizant Technology Solutions, discussed why integration platforms are critical for managing distributed application portfolios in “The Need for a High Performance Integration Platform in the Cloud Era.”

Today’s plenary sessions and many of the track sessions can be viewed on The Open Group’s Livestream channel at http://new.livestream.com/opengroup.

The day ended with an opportunity for everyone to share cocktails and conversation at a networking reception held at the hotel.

Andras Szakal, VP & CTO of IBM U.S. Federal and Chair of the OTTF, was presented with a plaque in honor of IBM’s contribution to the O-TTPS Accreditation Program, alongside the panel who were key to the success of the launch.

Filed under Business Architecture, Conference, Enterprise Architecture, Enterprise Transformation, Uncategorized

New Accreditation Program Raises the Bar for Securing Global Supply Chains

By Sally Long, Director of The Open Group Trusted Technology Forum (OTTF)™

In April 2013, The Open Group announced the release of the Open Trusted Technology Provider™ Standard (O-TTPS) 1.0 – Mitigating Maliciously Tainted and Counterfeit Products. Now we are announcing the O-TTPS Accreditation Program, launched on February 3, 2014, which enables organizations that conform to the standard to be accredited as Open Trusted Technology Providers™.

The O-TTPS, a standard of The Open Group, provides a set of guidelines, recommendations and requirements that help assure against maliciously tainted and counterfeit products throughout commercial off-the-shelf (COTS) information and communication technology (ICT) product lifecycles. The standard includes best practices throughout all phases of a product’s life cycle: design, sourcing, build, fulfillment, distribution, sustainment, and disposal, thus enhancing the integrity of COTS ICT products and the security of their global supply chains.

This accreditation program is one of the first of its kind in providing accreditation for conforming to standards for product integrity coupled with supply chain security.

The standard and the accreditation program are the result of a collaboration between government, third party evaluators and some of industry’s most mature and respected providers who came together and, over a period of four years, shared their practices for integrity and security, including those used in-house and those used with their own supply chains.

Applying for O-TTPS Accreditation

When the OTTF started this initiative, one of its many mantras was “raise all boats.” The  objective was to raise the security bar across the full spectrum of the supply chain, from small component suppliers to the providers who include those components in their products and to the integrators who incorporate those providers’ products into customers’ systems.

The O-TTPS Accreditation Program is open to all component suppliers, providers and integrators. The holistic aspect of this program’s potential, as illustrated in the diagram below, should not be underestimated, but it will take a concerted effort to reach and encourage all constituents in the supply chain to become involved.

The importance of mitigating the risk of maliciously tainted and counterfeit products

The focus on mitigating the risks of tainted and counterfeit products by increasing the security of the supply chain is critical in today’s global economy. Virtually nothing is made from one source.

COTS ICT supply chains are complex. A single product can be composed of hundreds of components from multiple component suppliers in numerous different areas around the world, and providers can change their component suppliers frequently depending on the going rate for a particular component. If bad things happen along the supply chain, such as counterfeit components being inserted in place of authentic ones, maliciously tainted code being introduced, or the double-hammer of maliciously tainted counterfeit parts, then terrible things can happen when that product is installed at a customer site.

With the threat of tainted and counterfeit technology products posing a major risk to global organizations, it is increasingly important for those organizations to take what steps they can to mitigate these risks. The O-TTPS Accreditation Program is one of those steps. Can an accreditation program completely eliminate the risk of tainted and counterfeit components? No!  Does it reduce the risk? Absolutely!

How the Accreditation Program works

The Open Group, with over 25 years’ experience managing vendor- and technology-neutral certification programs, will assume the role of the Accreditation Authority over the entire program. Additionally the program will utilize third-party assessors to assess conformance to the O-TTPS requirements.

Companies seeking accreditation will declare their Scope of Accreditation, which means they can choose to be accredited for conforming to the O-TTPS standard and adhering to the best practice requirements across their entire enterprise, within a specific product line or business unit or within an individual product.  Organizations applying for accreditation are then required to provide evidence of conformance for each of the O-TTPS requirements, demonstrating they have the processes in place to secure in-house development and their supply chains across the entire COTS ICT product lifecycle. O-TTPS accredited organizations will then be able to identify themselves as Open Trusted Technology Providers™ and will become part of a public registry of trusted providers.
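As a rough illustration of what declaring a scope and assembling evidence might look like in data terms, here is a hypothetical sketch. The lifecycle phase names are those listed in the O-TTPS description above; the scope string, requirement identifiers and record fields are invented for the example and are not part of the standard or the accreditation program.

```python
# Hypothetical sketch of tracking conformance evidence per lifecycle phase for a
# declared Scope of Accreditation. The phase names are those listed in the O-TTPS
# description above; requirement IDs and fields are illustrative only.
from collections import defaultdict

LIFECYCLE_PHASES = [
    "design", "sourcing", "build", "fulfillment",
    "distribution", "sustainment", "disposal",
]

scope = "Product line: example-router-family"   # hypothetical scope declaration
evidence = defaultdict(list)                    # phase -> list of evidence records

def add_evidence(phase: str, requirement_id: str, artifact: str) -> None:
    if phase not in LIFECYCLE_PHASES:
        raise ValueError(f"unknown lifecycle phase: {phase}")
    evidence[phase].append({"requirement": requirement_id, "artifact": artifact})

# Hypothetical entries of the kind an assessor might review
add_evidence("sourcing", "REQ-EXAMPLE-01", "approved-supplier vetting records")
add_evidence("build", "REQ-EXAMPLE-02", "signed build and code-integrity logs")

for phase in LIFECYCLE_PHASES:
    status = "evidence on file" if evidence[phase] else "no evidence yet"
    print(f"{scope} | {phase}: {status}")
```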

The Open Group has also instituted the O-TTPS Recognized Assessor Program, which assures that Recognized Assessor companies meet certain criteria as assessor organizations, and that their individual assessors meet an additional set of criteria and have passed the O-TTPS Assessor exam, before they can be assigned to an O-TTPS assessment. The Open Group will operate this program, grant O-TTPS Recognized Assessor certificates and list qualifying organizations on a public registry of recognized assessor companies.

Efforts to increase awareness of the program

The Open Group understands that to achieve global uptake we need to reach out to other countries across the globe for market adoption, as well as to other standards groups for harmonization. The forum has a very active outreach and harmonization work group, and the OTTF is increasingly being recognized for its efforts. A number of prominent U.S. government agencies, including the Government Accountability Office and NASA, have recognized the standard as an important supply chain security effort. Dave Lounsbury, the CTO of The Open Group, has testified before Congress on the value of this initiative from the industry-government partnership perspective. The Open Group has also met with President Obama’s Cybersecurity Coordinators (past and present) to apprise them of our work. We continue to work closely with NIST from the perspective of the Cybersecurity Framework, which recognizes the supply chain as a critical area for the next version, and the OTTF work is acknowledged in NIST Special Publication 800-161. We have liaisons with ISO and are working internally on mapping our standard and accreditation program to Common Criteria. The O-TTPS has also been discussed with government agencies in China, India, Japan and the UK.

The initial version of the standard and the accreditation program are just the beginning. OTTF members will continue to evolve both the standard and the accreditation program to provide additional versions that refine existing requirements, introduce additional requirements, and cover additional threats. And the outreach and harmonization efforts will continue to strengthen so that we can reach that holistic potential of Open Trusted Technology Providers™ throughout all global supply chains.

For more details on the O-TTPS accreditation program, to apply for accreditation, or to learn more about becoming an O-TTPS Recognized Assessor visit the O-TTPS Accreditation page.

For more information on The Open Group Trusted Technology Forum please visit the OTTF Home Page.

The O-TTPS standard and the O-TTPS Accreditation Policy are freely available from the Trusted Technology Section in The Open Group Bookstore.

For information on joining the OTTF membership please contact Mike Hickey – m.hickey@opengroup.org

Sally Long is the Director of The Open Group Trusted Technology Forum (OTTF). She has managed customer-supplier forums and collaborative development projects for over twenty years. She was the release engineering section manager for all multi-vendor collaborative technology development projects at The Open Software Foundation (OSF) in Cambridge, Massachusetts. Following the merger of the OSF and X/Open under The Open Group, she served as director for multiple forums in The Open Group. Sally has a Bachelor of Science degree in Electrical Engineering from Northeastern University in Boston, Massachusetts.

Filed under Cybersecurity, OTTF, Supply chain risk

What I learnt at The Open Group Bangalore Conference last weekend

By Sreekanth Iyer, Executive IT Architect, IBM

It was quite a lot of learning for a Saturday, attending The Open Group conference in Bangalore. Actually, it was a two-day program this year. I could not make it on Friday because of other work commitments, but I heard from the people who attended that Friday’s sessions were great. I did know about fellow IBMer Jithesh Kozhipurath’s presentation on Friday, and I had the chance to look at his excellent material on applying TOGAF® practices to integrated IT Operations Enterprise Architecture, in which he shared his experience from the lab infrastructure optimization work he has been leading.

I started a bit late on Saturday, thinking the event was happening at the Leela Palace, which is near my home (ah, that was in 2008). I realized late that it was at the Philips Innovation Campus at Manyata, but managed to reach just in time for the start of the sessions.

The day started with an Architecture as a Service discussion. The presentation was short, but there were a lot of interesting questions and interactions after the session. I was curious to know more about the “self-service” aspect of that topic.

Then we had Jason Uppal of ClinicalMessage Inc. on stage (see picture below), who gave a wonderful presentation on the human side of architecture and how to leverage EA to make disruptive changes without disrupting working systems.

Lots of take-aways from that session, importantly the typical reasons why certain architectures fail: many times we already have a solution in mind and we try to fit it to the requirement. And most of the time, if we look at the requirements artifact, we will see that the problems are not captured correctly. I couldn’t agree more with the good practices that he discussed.

Starting with “Identifying the Problem Right”: I thought that is definitely the first and most important step in architecture. Jason then talked about the significance of communicating with and engaging people and stakeholders in the architecture, a point he drove home with a good example from the healthcare industry; engagement, of course, improves quality. Building the right levers into the architecture and solving the whole problem were some of the other key points I noted down. Most importantly, the key message was that, as architects, we have to go beyond drawing lines and boxes to deliver change, perhaps looking to deliver things that can create an impact in 30 days while balancing short-term and long-term goals.

I got the stage for a couple of minutes to give an update on the AEA Bangalore Chapter activities. My request to the attendees was to leverage the chapter for their own professional development, using it as a platform to share expertise, get answers to queries, connect with other professionals with similar interests and build their network. Hopefully we will see more participation in the Bangalore chapter events this year.

The security track had multiple interesting sessions. It began with Jim Hietala of The Open Group discussing the Risk Management Framework. I have been attending a course on the subject, but this session provided a lot of insight into the taxonomy (O-RT) and the analysis part, taking a quantitative rather than a qualitative approach. Though the example was based on the risk of laptop theft, there is no reason we can’t apply the same principles to real issues like quantifying the threats of moving workloads to the cloud (that’s another to-do added to my list).

Then it was my session on best practices for moving workloads to the cloud for Indian banks. I talked about the progress so far with the whitepaper. Attendance was limited, as Jason’s EA workshop was happening in parallel, but those who attended were really interested in the subject. We had a good discussion on the benefits, challenges and regulations around Indian banking workloads and their movement to the cloud, and went through a few interesting case studies. There are areas that need more content, and I have asked the people who attended the session to participate in the workgroup. We are looking at getting a first draft done in the next 30 days.

Finally, I also sat in on the presentation by Ajit A. Matthew on the security implementation at Intel. Everywhere the message is clear: you need to implement context-based security and security intelligence to enable new-age innovation while at the same time protecting your core assets.

It was a Saturday well spent. In addition, I had some opportunities to connect with a few new folks and understand their security challenges with the cloud. I am looking to keep the dialog going and to hold an AEA Bangalore chapter event sometime during Q1. In that direction, I took the first step by writing this up and sharing it with my network.

Event Details:
The Open Group Bangalore, India
January 24-25, 2014

Sreekanth Iyer is an Executive IT Architect in the IBM Security Systems CTO office and works on developing IBM’s Cloud Security Technical Strategy. He is an Open Group Certified Distinguished Architect and is a core member of the Bangalore Chapter of the Association of Enterprise Architects. He has over 18 years’ industry experience and has led several client solutions across multiple industries. His key areas of work include Information Security, Cloud Computing, SOA, Event Processing, and Business Process Management. He has authored several technical articles and blogs and is a core contributor to multiple Open Group as well as IBM publications. He works out of the IBM India Software Lab, Bangalore, and you can follow him on Twitter @sreek.

Filed under Conference, Enterprise Architecture, Healthcare, TOGAF®

The ArchiMate® Certification for People Program 2014 Updates

By Andrew Josey, The Open Group

Following on from the news in December of the 1,000th certification in the ArchiMate certification program, The Open Group has made some changes that make the certification program more accessible. As of January 2014, it is now possible to self-study for both certification levels. Previously, to achieve the Level 2 certification, known as ArchiMate 2 Certified, attendance at a course was mandatory.

To accommodate this, a revised examination structure has been introduced, as shown in the diagram below.

There are two levels of certification:

  • ArchiMate 2 Foundation: knowledge of the notation, terminology, structure, and concepts of the ArchiMate modeling language.
  • ArchiMate 2 Certified: in addition to knowledge and comprehension, the ability to analyze and apply the ArchiMate modeling language.

Candidates can choose to become certified in a stepwise manner, starting with ArchiMate 2 Foundation and moving on to ArchiMate 2 Certified at a later date, or to bypass ArchiMate 2 Foundation and go directly to ArchiMate 2 Certified.

For those going directly to ArchiMate 2 Certified, there is a choice of taking the two examinations separately or taking a single Combined examination. The advantage of taking the two examinations over the single Combined examination is that if you pass Part 1 but fail Part 2, you can still qualify for ArchiMate 2 Foundation.

The ArchiMate 2 Part 1 examination comprises 40 questions in simple multiple-choice format. The ArchiMate 2 Part 2 examination comprises 8 questions using a gradient-scored, scenario-based format. Practice examinations are included as part of an Accredited ArchiMate Training course and are also available with the Study Guide.

The examinations are delivered either at Prometric test centers or by Accredited Training Course Providers through The Open Group Internet Based Testing portal.

You can find an available accredited training course either by viewing the public Calendar of Accredited Training Courses or by contacting a provider using the Register of Accredited Training Courses.

The ArchiMate 2 Certification Self-Study Pack is available at http://www.opengroup.org/bookstore/catalog/b132.htm.

The hardcopy of the ArchiMate 2 Certification Study Guide is available to order from Van Haren Publishing at http://www.vanharen.net/9789401800020

ArchiMate is a registered trademark of The Open Group.

 Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.1, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.

Filed under ArchiMate®, Certifications, Enterprise Architecture

Future Shock: Why IT Needs to Embrace Agile Development or Be Left Behind

By Allen Brown, President and CEO, The Open Group

In his 1970 bestseller, Future Shock, futurist Alvin Toffler predicted that the rate of technological change and progress was beginning to accelerate at a rate faster than what people are often ready for or can handle. Looking over the course of history—from agrarian societies through the industrial age to our current post-industrial age in which more people work in service-oriented fields than agricultural ones—Toffler noted that the rapid changes brought on by technology often leave people in a state of “future shock.”

Future shock, Toffler argued, can cause not only disorientation for those who are caught up in it, but it can also induce a kind of paralysis brought on by being confronted with too many choices. In its worst form, future shock can lead to alienation and a breakdown of the social order due to “information overload” (a term originally coined by Toffler).

Toffler’s predictions were amazingly accurate for the time. We are most certainly in an era where we can barely keep up with the constant technological changes we are faced with on a daily basis. This is certainly true of how consumer technologies have changed our lives. Not only is the laptop or phone you buy today practically obsolete by the time you get it home, but we constantly struggle to keep up with our technologies and the volume of information—via email, text messages, the Internet, Twitter, etc.—that we consume on a daily basis. We are all likely suffering from some degree of information overload.

Similarly, technology change is accelerating business drivers at rates that are increasingly difficult for organizations to keep up with, not only for IT, but also for management. Trends such as Cloud, BYOD, Big Data and the Internet of Things are driving information overload for today’s enterprises putting intense pressure on lines of business to respond quickly to market drivers, data-driven imperatives and internal demands. Organizations are being forced to change—whether they are ready or not.

According to Toffler, the only way to combat future shock is to learn to adapt and to do so constantly. Toffler likens an inability to adapt to a new kind of illiteracy, with those who cannot adapt being left behind. “The illiterate of the 21st Century are not those who cannot read and write but those who cannot learn, unlearn and relearn” he said.

The problem is that most organizations today are not in a position to handle rapid change or to adapt quickly. In The Open Group Convergent Technologies Survey, only 52 percent of organizations surveyed felt they were equipped to deal with the convergence of new technologies, while 27 percent said they were ill-prepared. Prepared or not, the tide of convergence is coming. To survive in our current economy, companies must learn to architect themselves in the moment.

Using Agile Development as a Model
Over the past ten years, agile software development has emerged as one of the ways for IT developers to adapt to the requirements of constant change. Based on a definition coined in the Manifesto for Agile Software Development in 2001, agile development is characterized by iterative, incremental and rapid development that evolves through collaboration. Rather than making development a process that takes years of painstaking planning before execution, the agile method puts a product out into the market as early as possible, tests it with users and then adapts it accordingly. The process is then repeated with feedback and upgrades on a constant, iterative loop.

Agile development is driven by flexibility and the ability to respond rapidly to keep up with endless change in the market. With agile principles, organizational focus shifts from processes and tools to individuals and interactions, from the organization to the customer, from negotiation to collaboration, and from sticking to rigid plans to responding to change.

For many readers, “agile” is a loaded term, largely associated with solutions rather than with enterprise architecture, but there are some appealing aspects to it. An adaptation of the twelve principles of Agile Development to the discipline of Enterprise Architecture would be an interesting place to start. I have picked out half of the principles and adapted them here by way of example, using parentheses to show possible deletions and adding a few words here and there in italics.

  • Our highest priority is to satisfy the customer through early and continuous delivery of (valuable software) valuable architecture guidance to the enterprise.
  • Welcome changing requirements, even late in development. Agile processes harness change for the customer’s competitive advantage.
  • Business people and (developers) architects must work together (daily) throughout the project.
  • Simplicity–the art of maximizing the amount of work not done–is essential.
  • The best architectures, requirements, and designs emerge from self-organizing teams.
  • At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behavior accordingly.

What Organizations Can Learn from Agile Development
As Toffler predicted, the rate of business change is happening so quickly that if you take too much time to do anything, your organization is likely to lose out. Business schools have even taken a page from IT development and have begun teaching the principles of agile development to graduate students. As Toffler noted, “if you don’t have a strategy, you’re part of someone else’s strategy.” In business today, as in combatting “future shock,” adaptability must be at the core of every organization’s strategy.

Organizations that want to survive and thrive in this paradigm will need to take a page from IT, agile development and start-up cultures to become more nimble and move more quickly. Becoming agile will take significant shifts in culture for many organizations. Business and IT must work together in order to facilitate changes that will work for each organization. Enterprise Architects and IT leaders can help lead the charge for change within their organizations by helping the C-Suite not only understand how to apply agile development principles to the business, but also see the potential consequences of being slow to adapt and how business imperatives are the real drivers for these changes.

Architecting things as you go is difficult for most organizations and most industries. Most of us are not used to that level of flexibility or the need to adapt that quickly. We are more comfortable planning ahead and sticking to a well-thought-out plan. Agility does not preclude planning or forethought; rather, planning becomes part of the process and action plan instead of a precursor to action. Although many organizations are likely in for a large dose of “future” or culture shock, adaptation and business transformation are necessary if today’s organizations don’t want to be left behind in the face of constant change.

Allen Brown is President and CEO, The Open Group – a global consortium that enables the achievement of business objectives through IT standards. For over 15 years Allen has been responsible for driving The Open Group’s strategic plan and day-to-day operations, including extending its reach into new global markets, such as China, the Middle East, South Africa and India. In addition, he was instrumental in the creation of the AEA, which was formed to increase job opportunities for all of its members and elevate their market value by advancing professional excellence.

Filed under Open Platform 3.0

ArchiMate® 2 Certification reaches the 1000th certification milestone

By Andrew Josey, The Open Group

We’re pleased to announce that the ArchiMate Certification for People program has reached the significant milestone of 1,000 individual certifications, with individuals certified in 30 different countries, as shown in the world map below.

The top 10 countries, by number of certified individuals and share of the total, are:

Netherlands: 458 (45.8%)
UK: 104 (10.4%)
Belgium: 76 (7.6%)
Australia: 35 (3.5%)
Germany: 32 (3.2%)
Norway: 30 (3.0%)
Sweden: 30 (3.0%)
USA: 27 (2.7%)
Poland: 16 (1.6%)
Slovakia: 13 (1.3%)

The vision for the ArchiMate 2 Certification Program is to define and promote a market-driven education and certification program to support the ArchiMate modeling language Standard.

More information on the program is available at the ArchiMate 2 Certification site at http://www.opengroup.org/certifications/archimate/

Details of the ArchiMate 2 Examinations are available at: http://www.opengroup.org/certifications/archimate/docs/exam

The calendar of Accredited ArchiMate 2 Training courses is available at: http://www.opengroup.org/archimate/training-calendar/

The ArchiMate 2 Certification register can be found at https://archimate-cert.opengroup.org/certified-individuals

ArchiMate is a registered trademark of The Open Group.

 Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.

Filed under ArchiMate®, Certifications, Enterprise Architecture

Measuring the Immeasurable: You Have More Data Than You Think You Do

By Jim Hietala, Vice President, Security, The Open Group

According to a recent study by the Ponemon Institute, the average U.S. company experiences more than 100 successful cyber-attacks each year at a cost of $11.6M. By enabling security technologies, those companies can reduce losses by nearly $4M and instituting security governance reduces costs by an average of $1.5M, according to the study.

In light of increasing attacks and security breaches, executives are increasingly asking security and risk professionals to provide analyses of individual company risk and loss estimates. For example, the U.S. healthcare sector has been required by the HIPAA Security rule to perform annual risk assessments for some time now. The recent HITECH Act also added security breach notification and disclosure requirements, increased enforcement in the form of audits and increased penalties in the form of fines. Despite federal requirements, the prospect of measuring risk and doing risk analyses can be a daunting task that leaves even the best of us with a case of “analysis paralysis.”

Many IT experts agree that we are nearing a time where risk analysis is not only becoming the norm, but when those risk figures may well be used to cast blame (or be used as part of a defense in a lawsuit) if and when there are catastrophic security breaches that cost consumers, investors and companies significant losses.

In the past, many companies have been reluctant to perform risk analyses due to the perception that measuring IT security risk is too difficult because it’s intangible. But if IT departments could soon become accountable for breaches, don’t you want to be able to determine your risk and the threats potentially facing your organization?

In his book How to Measure Anything, Douglas Hubbard, the father of Applied Information Economics, points out that immeasurability is an illusion and that organizations do, in fact, usually have the information they need to create good risk analyses. Part of the misperception of immeasurability stems from a lack of understanding of what measurement is actually meant to be. According to Hubbard, most people, and executives in particular, expect measurement and analysis to produce an “exact” number, as in, “our organization has a 64.5 percent chance of having a denial of service attack next year.”

Hubbard argues that, as risk analysts, we need to look at measurement the way scientists do: measurement is meant to reduce uncertainty about a quantity based on observation, not to produce certainty. Proper measurement should not produce an exact number, but rather a range of possibility, as in “our organization has a 30-60 percent chance of having a denial of service attack next year.” Realistic measurement of risk is far more likely when it is expressed as a probability distribution with a range of outcomes than in terms of one number or one outcome.
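As a small illustration of producing a range rather than a point estimate, the sketch below turns a handful of hypothetical industry observations into a 90 percent interval for the chance of a denial of service attack. Neither the numbers nor the choice of a uniform-prior beta model come from Hubbard’s book; they are simply one plausible way of doing it.

```python
# Minimal sketch: express a risk estimate as a range rather than a point value.
# The incident counts are hypothetical, standing in for industry data gathered
# from breach reports; the uniform-prior beta model is one simple choice.
from scipy.stats import beta

attacked = 12   # hypothetical: peer organizations hit by a DoS attack last year
sampled = 40    # hypothetical: peer organizations in the sample

# Beta posterior with a uniform prior: Beta(hits + 1, misses + 1)
posterior = beta(attacked + 1, sampled - attacked + 1)
low, high = posterior.ppf(0.05), posterior.ppf(0.95)

print(f"Estimated chance of a DoS attack next year: {low:.0%} to {high:.0%} (90% interval)")
```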

The problem that most often produces “analysis paralysis” is not just the question of how to derive those numbers but also how to get to the information that will help produce those numbers. If you’ve been tasked, for instance, with determining the risk of a breach that has never happened to your organization before, perhaps a denial of service attack against your web presence, how can you make an accurate determination about something that hasn’t happened in the past? Where do you get your data to do your analysis? How do you model that analysis?

In an article published in CSO Magazine, Hubbard argues that organizations have far more data than they think they do, and that they actually need less data than they believe in order to do proper analyses. Hubbard says that IT departments, in particular, have become so used to having information stored in databases they can easily query that they forget there are many other sources to gather data from. Just because something hasn’t happened yet, and you haven’t been gathering historical data on it and socking it away in your database, doesn’t mean you don’t have any data or that you can’t find what you need to measure your risk. Even in the age of Big Data, there is plenty of useful data outside of the big database.

You will still need to gather that data, but you only need enough to be able to measure it accurately, not necessarily precisely. In our recently published Open Group Risk Analysis (O-RA) standard, this is called calibration of estimates. Calibration provides a method for making good estimates, which are necessary for deriving a measured range of probability for risk. Section 3 of the O-RA standard provides a comprehensive look at how best to come up with calibrated estimates, as well as how to determine other risk factors using the FAIR (Factor Analysis of Information Risk) model.
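To show what a calibrated, range-based analysis can look like in practice, here is a small Monte Carlo sketch that combines range estimates for loss event frequency and loss magnitude into a distribution of annualized loss. It is only an illustration of the general FAIR-style idea; the numbers, the triangular distributions and the simplified frequency-times-magnitude calculation are assumptions, not material from the O-RA standard.

```python
# Illustrative Monte Carlo over calibrated ranges, in the spirit of FAIR:
# loss event frequency x loss magnitude -> a distribution of annualized loss.
# All figures are hypothetical; a full analysis decomposes these factors further.
import random
import statistics

random.seed(42)
SIMULATIONS = 10_000

# Calibrated estimates expressed as (low, most likely, high)
freq_low, freq_mode, freq_high = 0.5, 2.0, 6.0            # loss events per year
loss_low, loss_mode, loss_high = 20_000, 80_000, 400_000  # dollars per event

annual_losses = []
for _ in range(SIMULATIONS):
    frequency = random.triangular(freq_low, freq_high, freq_mode)  # args: (low, high, mode)
    magnitude = random.triangular(loss_low, loss_high, loss_mode)
    annual_losses.append(frequency * magnitude)  # simplification: one magnitude per year

annual_losses.sort()
p10, p90 = annual_losses[int(0.10 * SIMULATIONS)], annual_losses[int(0.90 * SIMULATIONS)]
print(f"Median annualized loss exposure: ${statistics.median(annual_losses):,.0f}")
print(f"80% of simulated years fall between ${p10:,.0f} and ${p90:,.0f}")
```

The point is not the specific figures but the shape of the output: a range that communicates uncertainty honestly instead of a single, falsely precise number.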

So where do you get your data if it’s not already stored and easily accessible in a database? There are numerous sources you can turn to, both external and internal; you just have to do the research to find them. For example, even if your company hasn’t experienced a denial of service attack, many others have: what was their experience when it happened? This information is out there online; you just need to search for it. Industry reports are another source of information. Verizon, for one, publishes its annual Data Breach Investigations Report. DatalossDB publishes an open data breach incident database that provides information on data loss incidents worldwide. Many vendors publish annual security reports and issue regular security advisories. Security publications and analyst firms such as CSO, Gartner, Forrester or Securosis all have research reports that data can be gleaned from.

Then there’s your internal information. Chances are your IT department has records you can use—they likely count how many laptops are lost or stolen each year. You should also look to the experts within your company to help. Other people can provide a wealth of valuable information for use in your analysis. You can also look to the data you do have on related or similar attacks as a gauge.

Chances are, you already have the data you need or you can easily find it online. Use it.

With the ever-growing list of threats and risks organizations face today, we are fast reaching a time when failing to measure risk will no longer be acceptable—in the boardroom or even by governments.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Filed under Cybersecurity, Data management, Information security, Open FAIR Certification, RISK Management, Uncategorized

ArchiMate® 2.1 Specification Maintenance Release

By Andrew Josey, The Open Group

We’re pleased to announce the latest release of the ArchiMate modeling language specification.

ArchiMate® 2.1, an Open Group standard, is a fully updated release of the ArchiMate Specification, addressing comments raised since the introduction of Issue 2.0 in 2012. It retains the major features and structure of ArchiMate 2.0, adding further detail and clarification, thereby preserving existing investment in the ArchiMate modeling language. In this blog, we take a brief look at what has changed[1].

The changes in this release are as follows:

  1. Additional explanatory text has been added in section 2.6 describing the ArchiMate Framework, its layers and aspects.
  2. Corrections have been made to figures throughout the specification for consistency with the text, including metamodel diagrams, concept diagrams and example models.
  3. An explanation has been added describing the use of colors within the specification. This makes it clear that the metamodel diagrams use colors to distinguish the different aspects of the ArchiMate Framework, and that within the models there are no formal semantics assigned to colors.
  4. Within the three layers, the concepts are now classified according to the aspects of the ArchiMate Framework: Active Structure Concepts (instead of Structural Concepts), Behavioral Concepts, and Passive Structure Concepts (instead of Informational Concepts).
  5. Duplicate text has been removed from the layers (for example, meaning was defined in both Section 3.4 and Section 3.4.2).
  6. In the Layers, a number of concept diagrams have been corrected to show all the permitted symbols for the concept; for example, Business Interface, Application Service, and Infrastructure Service.
  7. In the Architecture Viewpoints, the aspects for each viewpoint are now classified as per the ArchiMate Framework into Active Structure, Behavior, or Passive Structure.
  8. In the Architecture Viewpoints, a number of Concepts and Relationships diagrams have been updated to correct the relationships shown; similarly, a number of example diagrams have been corrected (for example, the use of a Communication Path to connect two nodes).
  9. In the Language Extension Mechanisms chapter, it has been made clear that specialization can also be applied to Relationships.
  10. In the Motivation Extension, it has been made clear that the association relationship can be used to connect motivation elements.
  11. The status of the appendices has been made clear; Appendix A is informative, whereas Appendix B is normative.
  12. Appendix B, the Relationship Tables has a number of corrections applied.

More information on the ArchiMate 2.1 Specification, including additional resources, can be obtained from The Open Group website here: http://www.opengroup.org/subjectareas/enterprise/archimate

[1] A detailed listing of the changes is available separately as Document U132, ArchiMate® 2.0 Technical Corrigendum 1 http://www.opengroup.org/bookstore/catalog/U132

Andrew Josey is Director of Standards within The Open Group. He is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF 9.1, ArchiMate 2.0, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4. Previously, he has led the development and operation of many of The Open Group certification development projects, including industry-wide certification programs for the UNIX system, the Linux Standard Base, TOGAF, and IEEE POSIX. He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects.

Filed under ArchiMate®, Enterprise Architecture

Open Platform 3.0™ to Help Rally Organizations in Innovation and Development

by Andy Mulholland, Former Global CTO, Capgemini

The Open Platform 3.0™ initiative, launched by The Open Group, provides a forum in which organizations, including standards bodies as well as users and product vendors, can coordinate their approach to new business models and new practices for the use of IT, define or identify common vendor-neutral standards, and foster the adoption of those standards in a cohesive and aligned manner to ensure an integrated, commercially viable set of solutions.

The goal is to enable effective business architectures that support a new generation of interoperable business solutions, quickly and at low cost, using new technologies and provisioning methods, while integrating with existing IT environments.

Acting on behalf of its core franchise base of CIOs, and in association with the US and European CIO associations, Open Platform 3.0 will serve as a rallying point for all involved in developing the technology solutions that new, innovative business models and practices require.

There is a distinct sea change in the way organizations are adopting and using a range of new technologies, mostly relating to a front-office revolution in how business is conducted with their customers and suppliers, and even within their markets. More than ever, The Open Group mission of Boundaryless Information Flow™, through the mutual development of technology standards and methods, is relevant to this change.

The competitive benefits are driving rapid business adoption, but mostly through a series of pilots owned and driven by business management, usually with little long-term thought about scale, compliance, security or even data integrity. The CIO is rightly concerned about these issues, but too often, in the absence of experience in this new environment and the ability to offer constructive approaches, those concerns are viewed as unacceptable barriers.

This situation is further inflamed by the sheer variety of products, and by the number of different groups, both technological and business-oriented, trying to develop workable standards for particular elements. Currently there is little, if any, overall coordination and alignment of these individually valuable efforts toward a true ‘system’ approach, with understandable methods to deliver a comprehensive enterprise approach that truly serves the full business purpose.

The business imperatives, reinforced by the teaching of business schools, are focused on time as a key issue and advocate small, fast projects built on externally provisioned, pay-per-use cloud services. These are elements of the sea change that have to be accepted, and indeed will grow, as society overall expects to do business and meet its own requirements in the same way.

Many of these changes are outside the knowledge, experience or often the power of current IT departments, yet those departments rightly understand that, to continue in their essential role of maintaining core internal operations and commercial stability, this change must introduce a new generation of deployment, integration and management methods. The risk is a continuation of the polarization that has already started to develop between internal IT operations based on client-server enterprise applications and the external operations of sales and marketing using browser- and cloud-based apps and services.

At best this will result in an increasingly decentralized and difficult-to-manage business; at worst, audit and compliance management will report the business as being in breach of financial and commercial rules. This is being recognized by organizations introducing a new type of role, supported by business schools and universities, termed the Business Architect. Their role in the application of new technology is to determine how to orchestrate complex business processes, through Big Data and Big Process, from the ‘services’ available to users. This is in many ways a direct equivalent, though with different skills, of the Enterprise Architect in conventional IT, who focuses on data integrity in the design of applications and their integration.

The Open Group’s extensive experience in the development of TOGAF®, together with the standard’s widespread global acceptance, has led to a deep understanding of the problem, the issues, and how to develop a solution both for Business Architecture and for its integration with Enterprise Architecture.

The Open Group believes that it is uniquely positioned to play this role because of its extensive experience in developing standards on behalf of user enterprises to enable Boundaryless Information Flow, including its globally recognized Enterprise Architecture standard, TOGAF. Moreover, feedback received from many directions suggests that this move will be welcomed by many of those involved in the various aspects of this exciting period of change.

Andy joined Capgemini in 1996, bringing with him thirteen years of experience from previous senior IT roles across all major industry sectors.

In his former role as Global Chief Technology Officer, Andy was a member of the Capgemini Group management board and advised on all aspects of technology-driven market changes, as well as serving on the technology advisory boards of several organizations and enterprises.

A popular speaker with many appearances at major events around the world, and frequently quoted by the press, Andy was voted one of the top 25 most influential CTOs in the world by InfoWorld USA in 2009, and in 2010 his CTOblog was voted best blog for business managers and CIOs for the third year running by Computing Weekly UK. Andy retired in June 2012 but still maintains an active association with the Capgemini Group, and his activities across the industry led to his achieving 29th place in the 2012 USA ExecRank ‘Top CTOs’ ratings.

Comments Off

Filed under Open Platform 3.0, TOGAF

Do Androids Dream of Electric Sheep?

By Stuart Boardman, KPN

What does the apocalyptic vision of Blade Runner have to do with The Open Group’s Open Platform 3.0™ Forum?

Throughout history, from the ancient Greeks and the Talmud, through The Future Eve and Metropolis to I, Robot and Terminator, we seem to have been both fascinated and appalled by the prospect of an autonomous “being” with its own consciousness and aspirations.


But right now it’s not the machines that bother me. It’s how we try to do what we try to do with them. What we try to do is to address problems of increasingly critical economic, social and environmental importance. It bothers me because, like it or not, these problems can only be addressed by a partnership of man and (intelligent) machine and yet we seem to want to take the intelligence out of both partners.

Two recent posts that came my way via Twitter this week provoked me to write this blog. One is a GE report that looks very thoroughly, if somewhat uncritically, at what it calls the Industrial Internet. The other, by Forrester analyst Sarah Rotman Epps, appeared in Forbes under the title There Is No Internet of Things and laments the lack of interconnectedness in most “Smart” technologies.

What disturbs me about both of those pieces is the suggestion that if we sort out some interoperability and throw masses of computing power and smart algorithms at a problem, everything will be dandy.

Actually it could just make things worse. Technically everything will work but the results will be a matter of chance. The problem lies in the validity of the models we use. And our ability to effectively model complex problems is at best unproven. If the model is faulty and the calculation perfect, the results will be wrong. In fact, when the systems we try to model are complex or chaotic, no deterministic model can deliver correct results other than by accident. But we like deterministic models, because they make us feel like we’re in control. I discussed this problem and its effects in more detail in my article on Ashby’s Law Of Requisite Variety. There’s also an important article by Joyce Hostyn, which explains how a simplistic view of objectivity leads to (at best) biased results. “Data does not lie. It just does not (always) mean what you think it does” (Claudia Perlich, Chief Scientist at Dstillery via CMSWire).

Now that doesn’t detract from the fact that developing a robot vacuum cleaner that actually “learns” the layout of a room is pretty impressive. That doesn’t mean that the robot is aware that it is a vacuum cleaner and that it has a (single) purpose in life. And just as well. It might get upset about us continually moving the furniture and decide to get revenge by crashing into our best antique glass cabinet.

With the Internet of Things (IoT) and Big Data in particular, we’re deploying machines to carry out analyses and take decisions that can be critical for the success of some human endeavor. If the models are wrong or only sometimes right, the consequences can be disastrous for health, the environment or the economy. In my Ashby piece I showed how unexpected events can result in an otherwise good model leading to fundamentally wrong reactions. In a world where IoT and Big Data combine with Mobility (multiple device types, locations and networks) and Cloud, the level of complexity is obviously high and there’s scope for a large number of unexpected events.

If we are to manage the volume of information coming our way and the speed with which it comes or with which we must react, we need to harness the power of machine intelligence. In an intelligent manner. Which brings me to Cognitive Computing Systems.

On the IBM Research Cognitive Computing page I found this statement: “Far from replacing our thinking, cognitive systems will extend our cognition and free us to think more creatively.”  Cognitive Computing means allowing the computer to say “listen guys, I’m not really sure about this but here are the options”. Or even “I’ve actually never seen one of these before, so maybe you’d like to see what you can make of it”. And if the computer is really really not sure, maybe we’d better ride the storm for a while and figure out what this new thing is. Cognitive Computing means that we can, in a manner of speaking, discuss this with the computer.

It’s hard to say how far we are from commercially viable implementations of this technology. Watson has a few children but the family is still at the stage of applied research. But necessity is the mother of invention and, if the technologies we’re talking about in Platform 3.0 really do start collectively to take on the roles we have envisaged for them, that could just provide the necessary incentive to develop economically feasible solutions.

In the meantime, we need to put ourselves more in the centre of things, making optimal use of the technologies we do have available to us, while not shirking our responsibility as intelligent human beings to use that intelligence rather than seek easy answers to wicked problems.

 

 

I’ll leave you with 3 minutes and 12 seconds of genius:
Marshall Davis Jones: “Touchscreen”


Stuart Boardman is a Senior Business Consultant with KPN where he co-leads the Enterprise Architecture practice as well as the Cloud Computing solutions group. He is co-lead of The Open Group Cloud Computing Work Group’s Security for the Cloud and SOA project and a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by the Information Security Platform (PvIB) in The Netherlands and of his previous employer, CGI. He is a frequent speaker at conferences on the topics of Cloud, SOA, and Identity.

Comments Off

Filed under Cloud/SOA, Cloud, Platform 3.0, Open Platform 3.0

Evolving Business and Technology Toward an Open Platform 3.0™

By Dave Lounsbury, Chief Technical Officer, The Open Group

The role of IT within the business is one that constantly evolves and changes. If you’ve been in the technology industry long enough, you’ve likely had the privilege of seeing IT grow to become integral to how businesses and organizations function.

In his recent keynote “Just Exactly What Is Going On in Business and Technology?” at The Open Group London Conference in October, Andy Mulholland, former Global Chief Technology Officer at Capgemini, discussed how the role of IT has changed from being traditionally internally focused (inside the firewall, proprietary, a few massive applications, controlled by IT) to one that is increasingly externally focused (outside the firewall, open systems, lots of small applications, increasingly controlled by users). This is due to the rise of a number of disruptive forces currently affecting the industry, such as BYOD, Cloud, social media tools, Big Data, the Internet of Things and cognitive computing. As Mulholland pointed out, IT today is about how people are using technology in the front office. They are bringing their own devices, they are using apps to get outside of the firewall, and they are moving further and further away from traditional “back office” IT.

Due to the rise of the Internet, the client/server model of the 1980s and 1990s that kept everything within the enterprise is no more. That model has been subsumed by a model in which development is fast and iterative and information is constantly being pushed and pulled primarily from outside organizations. The current model is also increasingly mobile, allowing users to get the information they need anytime and anywhere from any device.

At the same time, there is a push from business and management for increasingly rapid turnaround times and smaller scale projects that are, more often than not, being sourced via Cloud services. The focus of these projects is on innovating business models and acting in areas where the competition does not act. These forces are causing polarization within IT departments between internal IT operations based on legacy systems and new external operations serving buyers in business functions that are sourcing their own services through Cloud-based apps.

Just as UNIX® provided a standard platform for applications on single computers and the combination of servers, PCs and the Internet provided a second platform for web apps and services, we now need a new platform to support the apps and services that use cloud, social, mobile, big data and the Internet of Things. Rather than merely aligning with business goals or enabling business, the next platform will be embedded within the business as an integral element bringing together users, activity and data. To work properly, this must be a standard platform so that these things can work together effectively and at low cost, providing vendors a worthwhile market for their products.

Industry pundits have already begun to talk about this layer of technology. Gartner calls it the “Nexus of Forces.” IDC calls it the “third platform.” At The Open Group, we refer to it as Open Platform 3.0™, and earlier this year we announced a new Forum to address how organizations can adopt and support these technologies. Open Platform 3.0 is meant to enable organizations (including standards bodies, users and vendors) to coordinate their approaches to the new business models and IT practices driving the new platform, in order to support a new generation of interoperable business solutions.

As is always the case with technologies, a point is reached where technical innovation must transition to business benefit. Open Platform 3.0 is, in essence, the next evolution of computing. To help the industry sort through these changes and create vendor-neutral standards that foster the cohesive adoption of new technologies, The Open Group must also evolve its focus and standards to respond to where the industry is headed.

The work of the Open Platform 3.0 Forum has already begun. Initial actions for the Forum have been identified and were shared during the London conference. Our recent survey on Convergent Technologies confirmed the need to address these issues. Of those surveyed, 95 percent of respondents felt that converged technologies were an opportunity for business, and 84 percent of solution providers are already dealing with two or more of these technologies in combination. Respondents also saw vendor lock-in as a potential hindrance to using these technologies, underscoring the need for an industry standard that will address interoperability. In addition to the survey, the Forum has also produced an initial Business Scenario to begin to address these industry needs and formulate requirements for this new platform.

If you have any questions about Open Platform 3.0 or if you would like to join the new Forum, please contact Chris Harding (c.harding@opengroup.org) for queries regarding the Forum or Chris Parnell (c.parnell@opengroup.org) for queries regarding membership.

 

Dave is Chief Technical Officer (CTO) and Vice President, Services for The Open Group. As CTO, he ensures that The Open Group’s people and IT resources are effectively used to implement the organization’s strategy and mission. As VP of Services, Dave leads the delivery of The Open Group’s proven processes for collaboration and certification, both within the organization and in support of third-party consortia. Dave holds a degree in Electrical Engineering from Worcester Polytechnic Institute and is the holder of three U.S. patents.

 

 

1 Comment

Filed under Cloud, Data management, Future Technologies, Open Platform 3.0, Standards, Uncategorized, UNIX

Three Things We Learned at The Open Group, London

By Manuel Ponchaux, Senior Consultant, Corso

The Corso team recently visited London for The Open Group’s “Business Transformation in Finance, Government & Healthcare” conference (#ogLON). The event was predominantly for learning how experts address organisational change when aligning business needs with information technology – something very relevant in today’s climate. Nonetheless, there were a few other things we learnt as well…

1. Lean Enterprise Architecture

We were told that Standard Frameworks are too complex and multidimensional – people were interested in how we use them to provide simple working guidelines to the architecture team.

There were a few themes that frequently popped up, one of them being the measurement of Enterprise Architecture (EA) complexity. There seemed to be a lot of talk about Lean Enterprise Architecture as a solution to complexity issues.

2. Risk Management was popular

Clearly the events of the past few years (e.g., the financial crisis, banking regulations and other business transformations) mean that managing risk is increasingly important. So, it was no surprise that the Risk Management and EA sessions were very popular and probably attracted the biggest crowd. The Corso session showcasing our IBM/CIO case study was a success, with 40+ attending!

3. Business challenges

People visited our stand and told us they were having trouble generating up-to-date heat maps. There was also a large number of attendees interested in Software as a Service as an alternative to traditional on-premise licensing.

So what did we learn from #ogLON?

Attendees are attracted to the ease of use of Corso’s ArchiMate plugin. http://www.corso3.com/products/archimate/

Together with the configurable nature of System Architect, ArchiMate® is a simple framework to use and makes a good starting point for supporting Lean Architecture.

Roadmapping and impact analysis reduce risk when executing any business transformation initiative.

We also learnt that customers in the industry are starting to embrace SaaS offerings, as they provide a solution that can get them up and running quickly and easily – something we’re keen to pursue – which is why we’re now offering IBM Rational tools on the Corso cloud. Visit our website at http://www.corsocloud.com

http://info.corso3.com/blog/bid/323481/3-interesting-things-we-learned-at-The-Open-Group-London

Manuel Ponchaux, Senior Consultant, Corso

1 Comment

Filed under ArchiMate®, Enterprise Architecture, Standards, Uncategorized

Introducing Two New Security Standards for Risk Analysis—Part II – Risk Analysis Standard

By Jim Hietala, VP Security, The Open Group

Last week we took a look at one of the new risk standards recently introduced by The Open Group® Security Forum at The Open Group London Conference 2013, the Risk Taxonomy Technical Standard 2.0 (O-RT). Today’s blog looks at its sister standard, the Risk Analysis (O-RA) Standard, which gives risk professionals the tools they need to perform thorough risk analyses within their organizations for better decision-making about risk.

Risk Analysis (O-RA) Standard

The new Risk Analysis Standard provides a comprehensive guide for performing effective risk analyses within organizations using the Factor Analysis of Information Risk (FAIR™) framework. O-RA is geared toward managing the frequency and magnitude of loss that can arise from a threat, whether human, animal or a natural event – in other words, how often bad things happen and how bad they are when they occur. Used together, the O-RT and O-RA Standards give organizations a way to perform consistent risk modeling that can not only help explain risk factors thoroughly to stakeholders but also allow information security professionals to strengthen existing analysis methods or create better ones. O-RA may also be used in conjunction with other risk frameworks to perform risk analysis.

The O-RA Standard is also meant to provide something more than a mere assessment of risk. Many professionals within the security industry fail to distinguish between “assessing” risk and “analyzing” risk. This standard goes beyond assessment by supporting effective analyses, so that risk statements are less vulnerable to problems, and are more meaningful and defensible than the broad risk ratings (“this is a 4 on a scale of 1 to 5”) normally produced by assessments.

O-RA also lays out a standard process for approaching risk analysis that can help organizations streamline the way they approach risk measurement. By focusing on these four core process elements, organizations are able to perform more effective analyses:

  • Clearly identifying and characterizing the assets, threats, controls and impact/loss elements at play within the scenario being assessed
  • Understanding the organizational context for analysis (i.e. what’s at stake from an organizational perspective)
  • Measuring/estimating various risk factors
  • Calculating risk using a model that represents a logical, rational, and useful view of what risk is and how it works.

Because measurement and calculation are essential elements of properly analyzing risk variables, an entire chapter of the standard is dedicated to how to measure and calibrate risk. This chapter lays out a number of useful approaches for establishing risk variables, including establishing baseline risk estimates and ranges; creating distribution ranges and most likely values; using Monte Carlo simulations; accounting for uncertainty; determining accuracy vs. precision and subjective vs. objective criteria; deriving vulnerability; using ordinal scales; and determining diminishing returns.
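
To make these measurement concepts a little more concrete, here is a minimal, hypothetical sketch (not taken from the standard) of how calibrated range estimates and a Monte Carlo simulation might be combined: Loss Event Frequency and Loss Magnitude are each estimated as (minimum, most likely, maximum) ranges, sampled many times, and multiplied to produce a distribution of annualized loss exposure. The distribution choice, variable names and example ranges are all assumptions made for illustration.

```python
import random

def sample_range(minimum, most_likely, maximum):
    """Rough stand-in for a calibrated estimate: a triangular draw
    between the estimator's minimum, most-likely and maximum values."""
    return random.triangular(minimum, maximum, most_likely)

def simulate_loss_exposure(lef_range, lm_range, iterations=10_000):
    """Monte Carlo simulation of annualized loss exposure.

    lef_range: (min, most likely, max) loss events per year
    lm_range:  (min, most likely, max) loss per event, in currency units
    """
    outcomes = []
    for _ in range(iterations):
        lef = sample_range(*lef_range)   # how often loss events occur
        lm = sample_range(*lm_range)     # how bad each event is
        outcomes.append(lef * lm)        # annualized exposure for this trial
    outcomes.sort()
    return {
        "10th percentile": outcomes[int(0.10 * iterations)],
        "median": outcomes[int(0.50 * iterations)],
        "90th percentile": outcomes[int(0.90 * iterations)],
    }

# Illustrative ranges only: 0.1-2 loss events per year, $5k-$250k per event.
print(simulate_loss_exposure((0.1, 0.5, 2.0), (5_000, 40_000, 250_000)))
```

Reporting the result as percentiles of a distribution, rather than a single ordinal rating, is one way the chapter’s ideas about uncertainty and accuracy versus precision show up in practice.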

Finally, a practical, real-world example is provided to take readers through an actual risk analysis scenario. Using the FAIR model, the example outlines the process for dealing with a threat in which an HR executive at a large bank has left the user name and password that give him access to all the company’s HR systems on a Post-It note tacked onto his computer in his office, in clear view of anyone (other employees, cleaning crews, etc.) who comes into the office.

The scenario outlines four stages in assessing this risk:

  1. Stage 1: Identify Scenario Components (Scope the Analysis)
  2. Stage 2: Evaluate Loss Event Frequency (LEF)
  3. Stage 3: Evaluate Loss Magnitude (LM)
  4. Stage 4: Derive and Articulate Risk

Each step of the risk analysis process is thoroughly outlined for the scenario, giving Risk Analysts an example of how to perform an analysis using the FAIR framework. Considerable guidance is provided for stages 2 and 3 in particular, as those are the most critical elements in determining organizational risk.
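
As a rough, hypothetical illustration of how those four stages fit together, the skeleton below structures a FAIR-style analysis as scope, LEF estimation, LM estimation and risk derivation. The class and function names, and the placeholder estimates, are assumptions of this sketch rather than anything prescribed by the standard.

```python
from dataclasses import dataclass

@dataclass
class Scenario:
    """Stage 1 output: the scoped components of the analysis."""
    asset: str
    threat_agent: str
    threat_effect: str  # e.g. confidentiality, integrity, availability

def evaluate_lef(scenario: Scenario) -> tuple[float, float, float]:
    """Stage 2: estimate Loss Event Frequency as a (min, most likely, max)
    range of loss events per year. Values here are placeholders."""
    return (0.1, 0.5, 2.0)

def evaluate_lm(scenario: Scenario) -> tuple[float, float, float]:
    """Stage 3: estimate Loss Magnitude per event as a calibrated range."""
    return (5_000, 40_000, 250_000)

def derive_risk(scenario: Scenario) -> dict:
    """Stage 4: combine the LEF and LM estimates into an articulated
    statement of risk (here, just the raw ranges)."""
    return {
        "scenario": scenario,
        "loss_event_frequency": evaluate_lef(scenario),
        "loss_magnitude": evaluate_lm(scenario),
    }

# The Post-It note example from the text, scoped as a scenario.
print(derive_risk(Scenario("HR system credentials", "office visitors", "confidentiality")))
```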

Ultimately, O-RA is a guide to help organizations make better decisions about which risks are the most critical to prioritize and pay attention to, versus those that are less important and may not warrant attention. It is critical for Risk Analysts and organizations to become more consistent in this practice, because a lack of consistency in determining risk has been a major obstacle to information security professionals gaining a more legitimate “seat at the table” in the boardroom alongside other business functions (finance, HR, etc.).

For our profession to evolve and grow, consistency and accurate measurement are key. Issues and solutions must be identified consistently, and comparisons and measurement must be based on solid foundations, as illustrated below.

[Figure: Chained Dependencies]

O-RA can help organizations arrive at better decisions through consistent analysis techniques, as well as provide more legitimacy within the profession. Without a foundation from which to manage information risk, Risk Analysts and information security professionals may rely too heavily on intuition, bias, or commercial or personal agendas for their analyses and decision-making. By outlining a thorough foundation for risk analysis, O-RA provides not only a common foundation for performing risk analyses but also the opportunity to make better decisions and advance the security profession.

For more on the O-RA Standard or to download it, please visit: https://www2.opengroup.org/ogsys/catalog/C13G.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Comments Off

Filed under Conference, Open FAIR Certification, RISK Management, Security Architecture

Introducing Two New Security Standards for Risk Analysis—Part I – Risk Taxonomy Technical Standard 2.0

By Jim Hietala, VP Security, The Open Group

At The Open Group London 2013 Conference, The Open Group® announced three new initiatives related to the Security Forum’s work around Risk Management. The first of these was the establishment of a new certification program for Risk Analysts working within the security profession, the Open FAIR Certification Program. Aimed at providing a professional certification for Risk Analysts, the program will bring a much-needed level of assuredness to companies looking to hire Risk Analysts, certifying that analysts who have completed the Open FAIR program understand the fundamentals of risk analysis and are qualified to perform that analysis.

Forming the basis of the Open FAIR certification program are two new Open Group standards, version 2.0 of the Risk Taxonomy (O-RT) standard originally introduced by the Security Forum in 2009, and a new Risk Analysis (O-RA) Standard, both of which were also announced at the London conference. These standards are the result of ongoing work around risk analysis that the Security Forum has been conducting for a number of years now in order to help organizations better understand and identify their exposure to risk, particularly when it comes to information security risk.

The Risk Taxonomy and Risk Analysis standards not only form the basis and body of knowledge for the Open FAIR certification, but provide practical advice for security practitioners who need to evaluate and counter the potential threats their organization may face.

Today’s blog will look at the first standard, the Risk Taxonomy Technical Standard, version 2.0. Next week, we’ll look at the other standard for Risk Analysis.

Risk Taxonomy (O-RT) Technical Standard 2.0

Originally published in January 2009, the O-RT is intended to provide a common language and set of references for security and business professionals who need to understand or analyze risk conditions, giving them a shared vocabulary to use when discussing those risks. Version 2.0 of the standard contains a number of updates, based both on feedback provided by professionals who have been using the standard and on research conducted by Security Forum member CXOWARE.

The majority of the changes to Version 2.0 are refinements in terminology, including changes in language that better reflect what each term encompasses. For example, the term “Control Strength” in the original standard has now been changed to “Resistance Strength” to reflect that controls used in that part of the taxonomy must be resistive in nature.

More substantive changes were made to the portion of the taxonomy that discusses how Loss Magnitude is evaluated.

Why create a taxonomy for risk? For two reasons. First, the taxonomy provides a foundation from which risk analysis can be performed and talked about. Second, a tightly defined taxonomy reduces the ambiguity that makes risk scenarios difficult to measure or estimate effectively, leading to better decision-making, as illustrated by the following “risk management stack.”

Effective Management
↑
Well-informed Decisions
↑
Effective Comparisons
↑
Meaningful Measurements
↑
Accurate Risk Model

The complete Risk Taxonomy comprises two branches: Loss Event Frequency (LEF) and Loss Magnitude (LM), illustrated here:

[Figure: Risk Taxonomy, showing the Loss Event Frequency and Loss Magnitude branches]

Focusing solely on pure risk (which only results in loss) rather than speculative risk (which might result in either loss or profit), the O-RT is meant to help estimate the probable frequency and magnitude of future loss.

Traditionally, LM has been far more difficult to determine than LEF, in part because organizations don’t always perform analyses on their losses, or they stick to evaluating “low-hanging fruit” variables rather than delving into more complex risk factors. The new taxonomy takes a deep dive into the Loss Magnitude branch of the risk analysis taxonomy, providing guidance that will allow Risk Analysts to better tackle the difficult task of determining LM. It includes terminology outlining six specific forms of loss an organization can experience (productivity, response, replacement, fines and judgments, competitive advantage, reputation) as well as how to determine Loss Flow, a new concept in this standard.

The Loss Flow analysis helps identify how a loss may affect both primary (owners, employees, etc.) and secondary (customers, stockholders, regulators, etc.) stakeholders as a result of a threat agent’s action on an asset. The new standard provides a thorough overview on how to assess Loss Flow and identify the loss factors of any given threat.
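
For readers who like to see structure written down, here is a small, hypothetical data-structure sketch of the six forms of loss and a loss flow toward primary and secondary stakeholders. Only the six loss forms come from the standard; the class and field names and the example figures are assumptions made for illustration.

```python
from dataclasses import dataclass, field
from enum import Enum

class LossForm(Enum):
    """The six forms of loss named in the Risk Taxonomy."""
    PRODUCTIVITY = "productivity"
    RESPONSE = "response"
    REPLACEMENT = "replacement"
    FINES_AND_JUDGMENTS = "fines and judgments"
    COMPETITIVE_ADVANTAGE = "competitive advantage"
    REPUTATION = "reputation"

@dataclass
class LossFlow:
    """Traces how a threat agent's action on an asset flows into losses
    for primary and secondary stakeholders."""
    asset: str
    threat_agent_action: str
    primary_losses: dict[LossForm, float] = field(default_factory=dict)    # e.g. owners, employees
    secondary_losses: dict[LossForm, float] = field(default_factory=dict)  # e.g. customers, regulators

# Illustrative only: a data breach scenario with placeholder magnitudes.
breach = LossFlow(
    asset="customer records",
    threat_agent_action="exfiltration",
    primary_losses={LossForm.RESPONSE: 75_000, LossForm.PRODUCTIVITY: 20_000},
    secondary_losses={LossForm.REPUTATION: 150_000, LossForm.FINES_AND_JUDGMENTS: 50_000},
)
print(sum(breach.primary_losses.values()) + sum(breach.secondary_losses.values()))
```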

Finally, the standard also includes a practical, real-world scenario to help analysts understand how to put the taxonomy to use within their organizations. O-RT provides a common linguistic foundation that allows security professionals to then perform the risk analyses outlined in the O-RA Standard.

For more on the Risk Taxonomy Standard or to download it, visit: https://www2.opengroup.org/ogsys/catalog/C13K.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Comments Off

Filed under Conference, Open FAIR Certification, RISK Management, Security Architecture

Jericho Forum declares “success” and sunsets

By Ian Dobson & Jim Hietala, The Open Group
Ten years ago, the Jericho Forum set out on a mission to evangelise the issues, problems and solutions around the emerging business and security challenge of de-perimeterisation, and to provide thought-leadership in that area, with the aim of one day being able to declare “job done”.

That day has now arrived.  Today, de-perimeterisation is an established “fact” – touching not just information security but all areas of modern business, including the bring your own IT phenomenon (devices, IDs, services) as well as all forms of cloud computing. It’s widely understood and quoted by the entire industry.  It has become part of today’s computing and security lexicon.

With our de-perimeterisation mission accomplished, the Jericho Forum has decided the time has come to “declare success”, celebrate it as a landmark victory in the evolution of information security, and sunset as a separate Forum in The Open Group.

Our “declare success and sunset” victory celebration on Monday 21st Oct 2013 at the Central Hall Westminster, London UK, was our valedictory announcement that the Jericho Forum will formally sunset on 1st Nov 2013.  The event included many past leading Jericho Forum members attending as guests, with awards of commemorative plaques to those whose distinctive leadership steered the information security mind-set change success that the Jericho Forum has now achieved.

For those who missed the live-streamed event, you can watch it on the livestream recording at http://new.livestream.com/opengroup/Lon13

We are fortunate to be able to pass our Jericho Forum legacy of de-perimeterisation achievements and publications into the good care of The Open Group’s Security Forum, which has undertaken to maintain the Jericho Forum’s deliverables, protect its legacy from misrepresentation, and perhaps adopt and evolve Jericho’s thought-leadership approach to future information security challenges.

Ian Dobson, Director Jericho Forum
Jim Hietala, VP Security
The Open Group
21st October 2013


Ian Dobson is the Director of the Security Forum and the Jericho Forum for The Open Group, coordinating and supporting the members in achieving their goals in our challenging information security world. In the Security Forum, his focus is on supporting the development of open standards and guides on security architectures and the management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in future.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Comments Off

Filed under Conference, Security Architecture

Secure Integration of Convergent Technologies – a Challenge for Open Platform 3.0™

By Dr. Chris Harding, The Open Group

The results of The Open Group Convergent Technologies survey point to secure integration of the technologies as a major challenge for Open Platform 3.0. This and other input forms the basis for the definition of the platform, which was discussed at The Open Group conference in London.

Survey Highlights

Here are some of the highlights from The Open Group Convergent Technologies survey.

  • 95% of respondents felt that the convergence of technologies such as social media, mobility, cloud, big data, and the Internet of things represents an opportunity for business
  • Mobility currently has the greatest take-up of these technologies, and the Internet of things has the least.
  • 84% of those from companies creating solutions want to deal with two or more of the technologies in combination.
  • Developing the understanding of the technologies by potential customers is the first problem that solution creators must overcome. This is followed by integrating with products, services and solutions from other suppliers, and using more than one technology in combination.
  • Respondents saw security, vendor lock-in, integration and regulatory compliance as the main problems for users of software that enables use of these convergent technologies for business purposes.
  • When users are considered separately from other respondents, security and vendor lock-in show particularly strongly as issues.

The full survey report is available at: https://www2.opengroup.org/ogsys/catalog/R130

Open Platform 3.0

Analysts forecast that convergence of technical phenomena including mobility, cloud, social media, and big data will drive the growth in use of information technology through 2020. Open Platform 3.0 is an initiative that will advance The Open Group vision of Boundaryless Information Flow™ by helping enterprises to use these technologies.

The survey confirms the value of an open platform to protect users of these technologies from vendor lock-in. It also shows that security is a key concern that must be addressed, that the platform must make the technologies easy to use, and that it must enable them to be used in combination.

Understanding the Requirements

The Open Group is conducting other work to develop an understanding of the requirements of Open Platform 3.0. This includes:

  • The Open Platform 3.0 Business Scenario, which was recently published and is available from https://www2.opengroup.org/ogsys/catalog/R130
  • A set of business use cases, currently in development
  • A high-level round-table meeting to gain the perspective of CIOs, who will be key stakeholders.

These requirements inputs were part of the discussion at The Open Group Conference, which took place in London this week. Monday’s keynote presentation by Andy Mulholland, former Global CTO at Capgemini, on “Just Exactly What Is Going on in Business and Technology?” included the conclusions from the round-table meeting. This week’s presentation and panel discussion on the requirements for Open Platform 3.0 covered all the inputs.

Delivering the Platform

Review of the inputs at the conference was followed by a members meeting of the Open Platform 3.0 Forum, to start developing the architecture of Open Platform 3.0 and to plan the delivery of the platform definition. The aim is to have a snapshot of the definition early in 2014, and to deliver the first version of the standard a year later.

Meeting the Challenge

Open Platform 3.0 will be crucial to establishing openness and interoperability in the new generation of information technologies. This is of first importance for everyone in the IT industry.

Following the conference, there will be an opportunity for everyone to input material and ideas for the definition of the platform. If you want to be part of the community that shapes the definition, to work on it with like-minded people in other companies, and to gain early insight of what it will be, then your company must join the Open Platform 3.0 Forum. (For more information on this, contact Chris Parnell – c.parnell@opengroup.org)

Providing for secure integration of the convergent technologies, and meeting the other requirements of Open Platform 3.0, will be a difficult but exciting challenge. I’m looking forward to continuing to tackle that challenge with the Forum members.

Dr. Chris Harding

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Open Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.

1 Comment

Filed under Cloud/SOA, Conference, Data management, Future Technologies, Open Platform 3.0, Semantic Interoperability, Service Oriented Architecture, Standards

The Open Group London – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

We eagerly jumped into the second day of our Business Transformation conference in London on Tuesday, October 22nd! The setting was the magnificent Central Hall Westminster.

Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects (AEA), started off the morning by introducing our plenary on Healthcare Transformation. Steve noted that the numbers in healthcare spending are huge and that bringing Enterprise Architecture (EA) to healthcare will help with efficiencies.

The renowned Dr. Peter Sudbury, Healthcare Specialist with HP Enterprise Services, discussed the healthcare crisis (dollars, demand, demographics), the new healthcare paradigm, and barriers to change and innovation. Dr. Sudbury also commented on the real drivers of healthcare costs: healthcare inflation is intrinsically higher; innovation increases cost; and productivity improvements lag other industries.

Dr. Peter Sudbury

Dr. Sudbury, Larry Schmidt (Chief Technologist, HP) and Roar Engen (Head of Enterprise Architecture, Helse Sør-Øst RHF, Norway) participated in the Healthcare Transformation Panel, moderated by Steve Nunn.  The group discussed opportunities for improvement by applying EA in healthcare.  They mentioned that physicians, hospitals, drug manufacturers, nutritionists, etc. should all be working together and using Boundaryless Information Flow™ to ensure data is smoothly shared across all entities.  It was also stated that TOGAF® is beneficial for efficiencies.

Following the panel, Dr. Mario Tokoro (Founder & Executive Advisor of Sony Computer Science Laboratories, Inc.; DEOS Project Leader, Japanese Science & Technology Agency) reviewed the Dependability through Assuredness™ standard, a standard of The Open Group.

The conference also offered many sessions in Finance/Commerce, Government and Tutorials/Workshops.

Margaret Ford, Consult Hyperion, UK and Henk Jonkers of BIZZdesign, Netherlands discussed “From Enterprise Architecture to Cyber Security Risk Assessment”.  The key takeaways were: complex cyber security risks require systematic, model-based risk assessment; attack navigators can provide this by linking ArchiMate® to the Risk Taxonomy.

“Applying Service-Oriented Architecture within a Business Technology Environment in the Finance Sector” was presented by Gerard Peters, Managing Consultant, Capgemini, The Netherlands. This case study is part of a white paper on Service-Oriented Architecture for Business Technology (SOA4BT).

You can view all of the plenary and many of the track presentations at livestream.com.  And for those who attended, full conference proceedings will be available.

The night culminated with a spectacular experience on the London Eye, the largest Ferris wheel in Europe located on the River Thames.

Comments Off

Filed under ArchiMate®, Cloud/SOA, Enterprise Architecture, Enterprise Transformation, Healthcare, Professional Development, Service Oriented Architecture, TOGAF®

The Open Group London 2013 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications

On Monday, October 21st, The Open Group kicked off the first day of our Business Transformation conference in London! Over 275 guests attended many engaging presentations by subject matter experts in finance, healthcare and government. Attendees from around the globe represented 28 countries, including those from as far away as Colombia, the Philippines, Australia, Japan and South Africa.

Allen Brown, President and CEO of The Open Group, welcomed the prestigious group.  Allen announced that The Open Group has 67 new member organizations so far this year!

The plenary launched with “Just Exactly What is Going On in Business and Technology?” by Andy Mulholland, Former Global CTO of Capgemini, who was named one of the top 25 influential CTOs by InfoWorld.  Andy’s key topics regarding digital disruption included real drivers of change, some big and fundamental implications, business model innovation, TOGAF® and the Open Platform 3.0™ initiative.

Next up was Judith Jones, CEO, Architecting the Enterprise Ltd., with a presentation entitled “One World EA Framework for Governments – The Way Forward”. Judith shared findings from the World Economic Forum, posing the question: “what keeps 1000 global leaders awake at night?” Many statistics were presented covering over 50 global risks – economic, societal, environmental, geopolitical and technological.

Jim Hietala, VP, Security of The Open Group announced the launch of the Open FAIR Certification for People Program.  The new program brings a much-needed certification to the market which focuses on risk analysis. Key partners include CXOWARE, Architecting the Enterprise, SNA Technologies and The Unit bv.

Richard Shreeve, Consultancy Director, IPL and Angela Parratt, Head of Transformation and joint CIO, Bath and North East Somerset Council presented “Using EA to Inform Business Transformation”.  Their case study addressed the challenges of modeling complexity in diverse organizations and the EA-led approach to driving out cost and complexity while maintaining the quality of service delivery.

Allen Brown announced that the Jericho Forum® leaders together with The Open Group management have concluded that the Jericho Forum has achieved its original mission – to establish “de-perimeterization” that touches all areas of modern business.  In declaring this mission achieved, we are now in the happy position to celebrate a decade of success and move to ensuring that the legacy of the Jericho Forum is both maintained within The Open Group and continues to be built upon.  (See photo below.)

Following the plenary, the sessions were divided into tracks – Finance/Commerce, Healthcare and Tutorials/Workshops.

During the Healthcare track, one of the presenters, Larry Schmidt, Chief Technologist with HP, discussed “Challenges and Opportunities for Big Data in Healthcare”. Larry elaborated on the 4 Vs of Big Data – value, velocity, variety and veracity.

Among the many presenters in the Finance/Commerce track, Omkhar Arasaratnam, Chief Security Architect, TD Bank Group, Canada, featured “Enterprise Architecture – We Do That?: How (not) to do Enterprise Architecture at a Bank”. Omkhar provided insight into how he took traditional, top-down, center-based architectural methodologies and applied them to a highly federated environment.

Tutorials/workshops consisted of EA Practice and Architecture Methods and Techniques.

You can view all of the plenary and many of the track presentations at livestream.com.  For those who attended, please stay tuned for the full conference proceedings.

The evening concluded with a networking reception at the beautiful and historic Central Hall Westminster. What an interesting, insightful, collaborative day it was!


Comments Off

Filed under Business Architecture, Certifications, Cloud, Cloud/SOA, Conference, Cybersecurity, Information security, Open Platform 3.0, Professional Development, RISK Management, Security Architecture, Standards, TOGAF®

Open FAIR Certification Launched

By Jim Hietala, The Open Group, VP of Security

The Open Group today announced the new Open FAIR Certification Program aimed at Risk Analysts, bringing a much-needed professional certification to the market that is focused on the practice of risk analysis. Both the Risk Taxonomy and Risk Analysis standards, standards of The Open Group, constitute the body of knowledge for the certification program, and they advance the risk analysis profession by defining a standard taxonomy for risk, and by describing the process aspects of a rigorous risk analysis.

We believe that this new risk analyst certification program will bring significant value to risk analysts, and to organizations seeking to hire qualified risk analysts. Adoption of these two risk standards from The Open Group will help produce more effective and useful risk analysis. This program clearly addresses the growing need in our industry for professionals who understand risk analysis fundamentals. Furthermore, the mature processes and due diligence The Open Group applies to our standards and certification programs will help make organizations comfortable with the groundbreaking concepts and methods underlying FAIR. This will also help professionals looking to differentiate themselves by demonstrating the ability to take a “business perspective” on risk.

In order to become certified, Risk Analysts must pass an Open FAIR certification exam. All certification exams are administered through Prometric, Inc. Exam candidates can start the registration process by visiting Prometric’s Open Group Test Sponsor Site www.prometric.com/opengroup.  With 4,000 testing centers in its IT channel, Prometric brings Open FAIR Certification to security professionals worldwide. For more details on the exam requirements visit http://www.opengroup.org/certifications/exams.

Training courses will be delivered through an Open Group accredited channel. The accreditation of Open FAIR training courses will be available from November 1st 2013.

Our thanks to all of the members of the risk certification working group who worked tirelessly over the past 15 months to bring this certification program, along with a new risk analysis standard and a revised risk taxonomy standard to the market. Our thanks also to the sponsors of the program, whose support is important to building this program. The Open FAIR program sponsors are Architecting the Enterprise, CXOWARE, SNA, and The Unit.

Lastly, if you are involved in risk analysis, we encourage you to consider becoming Open FAIR certified, and to get involved in the risk analysis program at The Open Group. We have plans to develop an advanced level of Open FAIR certification, and we also see a great deal of best practices guidance that is needed by the industry.

For more information on the Open FAIR certification program visit http://www.opengroup.org/certifications/openfair

You may also wish to attend a webcast scheduled for 7th November, 4pm BST that will provide an overview of the Open FAIR certification program, as well as an overview of the two risk standards. You can register here


Jim Hietala, CISSP, GSEC, is Vice President, Security for The Open Group, where he manages all security and risk management programs and standards activities, including the Security Forum and the Jericho Forum.  He has participated in the development of several industry standards including O-ISM3, O-ESA, Risk Taxonomy Standard, and O-ACEML. He also led the development of compliance and audit guidance for the Cloud Security Alliance v2 publication.

Jim is a frequent speaker at industry conferences. He has participated in the SANS Analyst/Expert program, having written several research white papers and participated in several webcasts for SANS. He has also published numerous articles on information security, risk management, and compliance topics in publications including CSO, The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

An IT security industry veteran, he has held leadership roles at several IT security vendors.

Jim holds a B.S. in Marketing from Southern Illinois University.

Comments Off

Filed under Conference, Cybersecurity, Open FAIR Certification, Standards