Category Archives: Conference

Facing the Challenges of the Healthcare Industry – An Interview with Eric Stephens of The Open Group Healthcare Forum

By The Open Group

The Open Group launched its new Healthcare Forum at the Philadelphia conference in July 2013. The forum’s focus is on bringing Boundaryless Information Flow™ to the healthcare industry to enable data to flow more easily throughout the complete healthcare ecosystem through a standardized vocabulary and messaging. Leveraging the discipline and principles of Enterprise Architecture, including TOGAF®, the forum aims to develop standards that will result in higher quality outcomes, streamlined business practices and innovation within the industry.

At the recent San Francisco 2014 conference, Eric Stephens, Enterprise Architect at Oracle, delivered a keynote address entitled “Enabling the Opportunity to Achieve Boundaryless Information Flow” along with Larry Schmidt, HP Fellow at Hewlett-Packard. A veteran of the healthcare industry, Stephens was Senior Director of Enterprise Architecture at Excellus BlueCross BlueShield prior to joining Oracle, and he is an active member of the Healthcare Forum.

We sat down after the keynote to speak with Stephens about the challenges of healthcare, how standards can help realign the industry and the goals of the forum. The opinions expressed here are Stephens’ own, not those of his employer.

What are some of the challenges currently facing the healthcare industry?

There are a number of challenges, and I think when we look at it as a U.S.-centric problem, there’s a disproportionate amount of spending that’s taking place in the U.S. For example, if you look at GDP or percentage of GDP expenditures, we’re looking at now probably 18 percent of GDP [in the U.S.], and other developed countries are spending a full 5 percent less than that of their GDP, and in some cases they’re getting better outcomes outside the U.S.

The mere existence of what we call “medical tourism” means there’s a real wide disparity there. If I need a hip replacement, I can get it done for a fraction of the cost in another country, with the same or better quality of care, and have a vacation – a rehab vacation – at the same time and bring along a spouse or significant other.

There’s also a lack of transparency. Having worked at an insurance company, I can tell you that with the advent of high deductible plans, there’s a need for additional cost information. When I go on Amazon or go to a local furniture store, I know what the cost is going to be for what I’m about to purchase. In the healthcare system, we don’t get that. With high deductible plans, if I’m going to be responsible for a portion or a larger portion of the fee, I want to know what it is. And what happens is, the incentives to drive costs down force the patient to be a consumer. The consumer now asks the tough questions. If my daughter’s going in for a tonsillectomy, show me a bill of materials that shows me what’s going to be done – if you are charging me $20/pill for Tylenol, I’ll bring my own. Increased transparency is what will in turn drive down the overall costs.

I think there’s one more thing, and this gets into the legal side of things. There is an exorbitant amount of legislation and regulation around what needs to be done. And because every time something goes sideways there’s going to be a lawsuit, doctors will prescribe an extra test or an extra X-ray for a patient whether they need it or not.

The healthcare system is designed around a vicious cycle of diagnose-treat-release. It’s not incentivized to focus on prevention and management. Oregon is promoting coordinated care organizations (CCOs) that act as intermediaries working with all medical professionals – whether physical, mental, dental, even social workers – to coordinate episodes of care for patients. This drives down inappropriate utilization – for example, using an ER as a primary care facility – and drives the medical system toward prevention and management of health.

Your keynote with Larry Schmidt of HP focused a lot on cultural changes that need to take place within the healthcare industry – what are some of the changes necessary for the healthcare industry to put standards into place?

I would say culturally, it goes back to those incentives, and it goes back to introducing this idea of patient-centricity. And for the medical community, it means really starting to recognize that these individuals are consumers and that increased choice is being introduced, just like you see in other industries. There are disruptive business models. For instance, medical tourism is a disruptive business model for United States-based healthcare. Another is pharmacies introducing clinical medicine for routine care, such as what you see at a CVS, Wal-Mart or Walgreens. I can get a flu shot, I can get a well-check visit, I can get a vaccine – routine stuff that doesn’t warrant a full-blown medical professional. It’s applying the right amount of medical care to a particular situation.

Why haven’t existing standards been adopted more broadly within the industry? What will help providers be more likely to adopt standards?

I think standards adoption is about “what’s in it for me” – the WIIFM idea. It’s demonstrating to providers that utilizing standards is going to help them get out of the medical administration business and focus on their core business, the same way that any other business would want to standardize its information through integration, processes and components. It reduces your overall maintenance costs going forward, and arguably you don’t need a team of billing folks sitting in a doctor’s office because you have standardized exchanges of information.

Why haven’t they been adopted? It’s still a question in my mind. Why would a doctor not want to do that? That’s perhaps a question we’re going to need to explore as part of the Healthcare Forum.

Is it doctors that need to adopt the standards or technologies, or a combination of different constituents within the ecosystem?

I think it’s a combination. We hear a lot about the Affordable Care Act (ACA) and the health exchanges. What we don’t hear about is legislation to drive standardization and increase interoperability. So unfortunately, it would seem the financial incentives and other things we’ve tried before haven’t worked, and we may simply have to resort to legislation, or at least legislative incentives, to make it happen – because part of the funding does cover information exchanges, so you can move health information between providers and other actors in the healthcare system.

You’re advocating putting the individual at the center of the healthcare ecosystem. What changes need to take place within the industry in order to do this?

I think it’s education – a lot of education has to take place. The incentive model around high deductible plans will force some of that for individuals, but it’s about taking responsibility and understanding the individual’s role in healthcare. It’s also a cultural and societal phenomenon.

I’m kind of speculating here, and going way beyond what enterprise architecture or what IT would deliver, but this is a philosophical thing: if I have an ailment, chances are there’s a pill to fix it. Look at the commercials – for every ailment, say hypertension, it’s easy, you just dial in the medication correctly and you don’t worry as much about diet and exercise. These sorts of things – our over-reliance on medication. I’m certainly not going to knock the medications that are needed for folks who absolutely need them – but I think we can become too dependent on pharmacological solutions for our health problems.

What responsibility will individuals then have for their healthcare? Will that also require a cultural and behavioral shift for the individual?

The individual has to start managing his or her own health. We manage our careers and families proactively. Now we need to focus on our health and not just float through the system. It may come to financial incentives for certain “individual KPIs” such as blood pressure, sugar levels, or BMI. Advances in medical technology may facilitate more personal management of one’s health.

One of the Healthcare Forum’s goals is to help establish Boundaryless Information Flow within the healthcare industry. You’ve said that understanding the healthcare ecosystem will be a key component of that – what does that ecosystem encompass, and why is it important to know that first?

Very simply, we’re talking about the member/patient/consumer, then we get into the payers and the providers, and we have to take into account government agencies and other non-medical agents. They all have to work in concert, and information needs to flow between those organizations in a very standardized way so that decisions can be made in a very timely fashion.

It can’t be bottled up; it’s got to be provided to the right provider at the right time. Otherwise, best case, it’s going to cost more to manage all the actors in the system. Worst case, somebody dies, or there is a “never event” due to misinformation or lack of information during the course of care. The idea of Boundaryless Information Flow gives us the opportunity to standardize and have easily accessible – and, by the way, secured – information that can really aid in that decision-making process going forward. It’s no different than Wal-Mart knowing what kind of merchandise sells well before and after a hurricane (beer and toaster pastries, by the way). It’s the same kind of real-time information that’s made available to a Google car so it can steer its way down the road. It’s that kind of velocity that’s needed to make the right decisions at the right time.

Healthcare is a highly regulated industry. How can Boundaryless Information Flow and data collection on individuals be achieved while still protecting patient privacy?

We can talk about standards and the flow and the technical side, but we also need to focus on the security and privacy side. And there’s going to be a legislative side, because we’re going to touch on a real fundamental data governance issue – who owns the patient record? Each actor in the system thinks they own the patient record. If we’re going to require more personal accountability for healthcare, then shouldn’t the consumer have more ownership?

We also need to address privacy disclosure regulations to avoid catastrophic data leaks of protected health information (PHI). We need bright IT talent to pull off the integration we are talking about here, and we also need folks who are well versed in privacy laws and regulations. I’ve seen project teams of 200 have up to eight folks just focusing on the security and privacy considerations. We can argue about headcount later, but my point is the same – one needs some focused resources around this topic.

What will standards bring to the healthcare industry that is missing now?

I think the standards, and more specifically the harmonization of the standards, are going to bring increased maintainability of solutions, increased interoperability and increased opportunities too. Look at mobile computing, or even Dropbox, which has API hooks into all sorts of tools and is well integrated – I can move files between devices and between apps because it has hooks that are easy to work with. It’s building these communities of developers, apps and technical capabilities that makes it easy to move the personal health record, for example, back and forth between providers, so that integrating the next version of an electronic health record (EHR) system is not a cataclysmic event. It’s this idea of standardization, but with some flexibility built into it.

Are you looking just at the U.S. or how do you make a standard that can go across borders and be international?

It is a concern. Much of my thinking and much of what I’ve conveyed today is U.S.-centric, based on our problems, but many of these interoperability problems are international. We’re going to need to address them; I couldn’t tell you what the sequence is right now. There are other considerations – for example, single- vs. multi-payer systems – that came up in the keynote. We tend to think that if we stay focused on the consumer/patient, we’re going to get it right for all constituencies. It will take time to go international with a standard, but it wouldn’t be the first time. We have a host of technical standards for the Internet (e.g., TCP/IP, HTTP), and the industry has been able to instill these standards across geographies and vendors. Admittedly, the harmonization of healthcare-related standards will be more difficult. However, as our world shrinks with globalization, an international lens will need to be applied to this challenge.

Eric Stephens (@EricStephens) is a member of Oracle’s executive advisory community, where he focuses on advancing clients’ business initiatives leveraging the practice of Business and Enterprise Architecture. Prior to joining Oracle he was Senior Director of Enterprise Architecture at Excellus BlueCross BlueShield, leading the organization with architecture design, innovation, and technology adoption capabilities within the healthcare industry.

 


Filed under Conference, Data management, Enterprise Architecture, Healthcare, Information security, Standards, TOGAF®

The Open Group San Francisco 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

Day two, February 4th, of The Open Group San Francisco conference kicked off with a welcome and opening remarks from Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects.

Nunn introduced Allen Brown, President and CEO of The Open Group, who provided highlights from The Open Group’s last quarter.  As of Q4 2013, The Open Group had 45,000 individual members in 134 countries hailing from 449 member companies in 38 countries worldwide. Ten new member companies have already joined The Open Group in 2014, and 24 members joined in the last quarter of 2013, with the first member company joining from Vietnam. In addition, 6,500 individuals attended events sponsored by The Open Group in Q4 2013 worldwide.

Updates on The Open Group’s ongoing work were provided, including updates on the FACE™ Consortium, DirectNet® Waveform Standard, Architecture Forum, ArchiMate® Forum, Open Platform 3.0™ Forum and Security Forum.

Of note was the ongoing development of TOGAF® and the introduction of a three-volume work, with individual volumes outlining the TOGAF framework, guidance, and tools and techniques for the standard, as well as collaborative work that allows the ArchiMate® modeling language to be used for risk management in enterprise architectures.

In addition, the Open Platform 3.0 Forum has already put together 22 business use cases outlining ROI and business value for various uses related to technology convergence. The Cloud Work Group’s Cloud Reference Architecture has also been submitted to ISO for international standards certification, and the Security Forum has introduced an Open FAIR risk management certification program for individuals.

The morning plenary centered on The Open Group’s Dependability through Assuredness™ (O-DA) Framework, which was released last August.

Speaking first about the framework was Dr. Mario Tokoro, Founder and Executive Advisor for Sony Computer Science Laboratories. Dr. Tokoro gave an overview of the Dependable Embedded OS project (DEOS), a large national project in Japan originally intended to strengthen the country’s embedded systems. After considerable research, the project leaders discovered they needed to consider whether large, open systems could be dependable when it came to business continuity, accountability and ensuring consistency throughout the systems’ lifecycle. Because the boundaries of large open systems are ever-changing, the project leaders knew they must put together dependability requirements that could accommodate constant change, allow for continuous service and provide continuous accountability for the systems based on consensus. As a result, they put together a framework to address both the change accommodation cycle and failure response cycles for large systems – this framework was donated to The Open Group’s Real-Time Embedded Systems Forum and released as the O-DA standard.

Dr. Tokoro’s presentation was followed by a panel discussion on the O-DA standard. Moderated by Dave Lounsbury, VP and CTO of The Open Group, the panel included Dr. Tokoro; Jack Fujieda, Founder and CEO of ReGIS, Inc.; T.J. Virdi, Senior Enterprise IT Architect at Boeing; and Bill Brierly, Partner and Senior Consultant, Conexiam. The panel discussed the importance of openness for systems, reiterating the conference theme of boundaries, and the realities of having standards that can ensure openness and dependability at the same time. They also discussed how the O-DA standard provides end-to-end requirements for system architectures that also account for accommodating changes within a system and accountability for it.

Lounsbury concluded the track by reiterating that assuring systems’ dependability is fundamental not only to The Open Group mission of Boundaryless Information Flow™ and interoperability but also to preventing large system failures.

Tuesday’s late morning sessions were split into two tracks, with one track continuing the Dependability through Assuredness theme hosted by Joe Bergmann, Forum Chair of The Open Group’s Real-Time and Embedded Systems Forum. In this track, Fujieda and Brierly furthered the discussion of O-DA, outlining the philosophy and vision of the standard and providing a roadmap for it.

In the morning Business Innovation & Transformation track, Alan Hakimi, Consulting Executive, Microsoft presented “Zen and the Art of Enterprise Architecture: The Dynamics of Transformation in a Complex World.” Hakimi emphasized that transformation needs to focus on a holistic view of an organization’s ecosystem and motivations, economics, culture and existing systems to help foster real change. Based on Buddhist philosophy, he presented an eightfold path to transformation that can allow enterprise architects to approach transformation and discuss it with other architects and business constituents in a way that is meaningful to them and allows for complexity and balance.

This was followed by “Building the Knowledge-Based Enterprise,” a session given by Bob Weisman, Head Management Consultant for Build the Vision.

Tuesday’s afternoon sessions centered on a number of topics including Business Innovation and Transformation, Risk Management, ArchiMate®, TOGAF® tutorials and case studies, and Professional Development.

In the ArchiMate track, Vadim Polyakov of Inovalon, Inc., presented “Implementing an EA Practice in an Agile Enterprise,” a case study centered on how his company integrated its enterprise architecture with the principles of agile development and how it customized the ArchiMate framework as part of the process.

The Risk Management track featured William Estrem, President, Metaplexity Associates, and Jim May of Windsor Software discussing how the Open FAIR Standard can be used in conjunction with TOGAF 9.1 to enhance risk management in organizations in their session, “Integrating Open FAIR Risk Analysis into the Enterprise Architecture Capability.” Jack Jones, President of CXOWARE, also discussed the best ways for “Communicating the Value Proposition” for cohesive enterprise architectures to business managers using risk management scenarios.

The plenary sessions and many of the track sessions from today’s tracks can be viewed on The Open Group’s Livestream channel at http://new.livestream.com/opengroup.

The day culminated with dinner and a Lion Dance performance in honor of Chinese New Year performed by Leung’s White Crane Lion & Dragon Dance School of San Francisco.

We would like to express our gratitude for the support of our sponsors: BIZZDesign, Corso, Good e-Learning, I-Server and Metaplexity Associates.


O-DA standard panel discussion with Dave Lounsbury, Bill Brierly, Dr. Mario Tokoro, Jack Fujieda and TJ Virdi


Filed under Conference, Enterprise Architecture, Enterprise Transformation, Standards, TOGAF®, Uncategorized

The Open Group San Francisco 2014 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications

The Open Group’s San Francisco conference, held at the Marriott Union Square, began today, highlighting the conference theme, “Toward Boundaryless Information Flow™.”

The morning plenary began with a welcome from The Open Group President and CEO Allen Brown.  He began the day’s sessions by discussing the conference theme, reminding the audience that The Open Group’s vision of Boundaryless Information Flow began in 2002 as a means to break down the silos within organizations and provide better communications within, throughout and beyond organizational walls.

Heather Kreger, Distinguished Engineer and CTO of International Standards at IBM, presented the first session of the day, “Open Technologies Fuel the Business and IT Renaissance.” Kreger discussed how converging technologies such as social and mobile, Big Data, the Internet of Things, analytics, etc.—all powered by the cloud and open architectures—are forcing a renaissance within both IT and companies. Fueling this renaissance is a combination of open standards and open source technologies, which can be used to build out the platforms needed to support these technologies at the speed that is enabling innovation. To adapt to these new circumstances, architects should broaden their skillsets so they have deeper skills and competencies in multiple disciplines, technologies and cultures in order to better navigate this world of open source based development platforms.

The second keynote of the morning, “Enabling the Opportunity to Achieve Boundaryless Information Flow™,” was presented by Larry Schmidt, HP Fellow at Hewlett-Packard, and Eric Stephens, Enterprise Architect, Oracle. Schmidt and Stephens addressed how to cultivate a culture within healthcare ecosystems to enable better information flow. Because healthcare ecosystems are now primarily digital (including not just individuals but technology architectures and the Internet of Things), boundaryless communication is imperative so that individuals can become the managers of their health and the healthcare ecosystem can be better defined. This in turn will help in creating standards that help solve the architectural problems currently hindering the information flow within current healthcare systems, driving better costs and better outcomes.

Following the first two morning keynotes, Schmidt provided a brief overview of The Open Group’s new Healthcare Forum. The forum plans to leverage existing Open Group best practices such as harmonization and existing standards (such as TOGAF®), and to work with other forums and verticals to create new standards that address the problems facing the healthcare industry today.

Mike Walker, Enterprise Architect at Hewlett-Packard, and Mark Dorfmueller, Associate Director Global Business Services for Procter & Gamble, presented the morning’s final keynote entitled “Business Architecture: The Key to Enterprise Transformation.” According to Walker, business architecture is beginning to change how enterprise architecture is done within organizations. In order to do so, Walker believes that business architects must be able to understand business processes, communicate ideas and engage with others (including other architects) within the business and offer services in order to implement and deliver successful programs. Dorfmueller illustrated business architecture in action by presenting how Procter & Gamble uses their business architecture to change how business is done within the company based on three primary principles—being relevant, practical and making their work consumable for those within the company that implement the architectures.

The morning plenary sessions culminated with a panel discussion on “Future Technology and Enterprise Transformation,” led by Dave Lounsbury, VP and CTO of The Open Group. The panel, which included all of the morning’s speakers, took a high-level view of how emerging technologies are eroding traditional boundaries within organizations. Things within IT that have been specialized in the past are now becoming commoditized to the point where they are now offering new opportunities for companies. This is due to how commonplace they’ve become and because we’re becoming smarter in how we use and get value out of our technologies, as well as the rapid pace of technology innovation we’re experiencing today.

Finally, wrapping up the morning was the Open Trusted Technology Forum (OTTF), a forum of The Open Group, with forum director Sally Long presenting an overview of a new Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program which launched today.  The program is the first such accreditation to provide third-party certification for companies guaranteeing their supply chains are free from maliciously tainted or counterfeit products and conformant to the Open Trusted Technology Provider™ Standard (O-TTPS). IBM is the first company to earn the accreditation and there are at least two other companies that are currently going through the accreditation process.

Monday’s afternoon sessions were split between two tracks, Enterprise Architecture (EA) and Enterprise Transformation and Open Platform 3.0.

In the EA & Enterprise Transformation track, Purna Roy and John Raspen, both Directors of Consulting at Cognizant Technology Solutions, discussed the need to take a broad view and consider factors beyond just IT architectures in their session, “Enterprise Transformation: More than an Architectural Transformation.”  In contrast, Kirk DeCosta, Solution Architect at PNC Financial Services, argued that existing architectures can indeed serve as the foundation for transformation in “The Case for Current State – A Contrarian Viewpoint.”

The Open Platform 3.0 track addressed issues around the convergence of technologies based on cloud platforms, including a session by Helen Sun, Enterprise Architect at Oracle, on the impact of Big Data and predictive analytics as enablers of information architectures. Dipanjan Sengupta, Principal Architect at Cognizant Technology Solutions, discussed why integration platforms are critical for managing distributed application portfolios in “The Need for a High Performance Integration Platform in the Cloud Era.”

Today’s plenary sessions and many of the track sessions can be viewed on The Open Group’s Livestream channel at http://new.livestream.com/opengroup.

The day ended with an opportunity for everyone to share cocktails and conversation at a networking reception held at the hotel.


Andras Szakal, VP & CTO, IBM U.S. Federal and Chair of the OTTF, presented with a plaque in honor of IBM’s contribution to the O-TTPS Accreditation Program, along with the esteemed panel who were key to the success of the launch.


Filed under Business Architecture, Conference, Enterprise Architecture, Enterprise Transformation, Uncategorized

What I learnt at The Open Group Bangalore Conference last weekend

By Sreekanth Iyer, Executive IT Architect, IBM

Attending The Open Group conference at Bangalore on Saturday involved quite a lot of learning. It was actually a two-day program this year, but I could not make it on Friday because of other work commitments. I heard from the people who attended that Friday was a great session, and I knew about fellow IBMer Jithesh Kozhipurath’s presentation that day. I had the chance to look at that excellent material on applying TOGAF® practices for integrated IT Operations Enterprise Architecture, in which he shared his experience with the lab infrastructure optimization work he was leading.

I started a bit late on Saturday, thinking it was happening at the Leela Palace, which was near my home (ah, that was in 2008). I realized late that it was at the Philips Innovation Campus at Manyata, but managed to reach just in time for the start of the sessions.

The day started with an Architecture as a Service discussion. The presentation was short, but there were a lot of interesting questions and interactions after the session. I was curious to know more about the “self-service” aspect of that topic.

Then we had Jason Uppal of ClinicalMessage Inc. on stage (see picture below), who gave a wonderful presentation on the human touch in architecture and how to leverage EA to make disruptive changes without disrupting working systems.

There were lots of take-aways from the session, importantly the typical reasons why certain architectures fail – often because we have a solution already in our mind and are trying to fit it to the requirement. Most of the time, if we look at the requirements artifact, we will see that the problems are not captured correctly. I couldn’t agree more with the good practices that he discussed.

Starting with “Identifying the Problem Right” – I thought that is definitely the first and most important step in architecture. Then Jason talked about the significance of communicating with and engaging people and stakeholders in the architecture – a point that he drove home with a good example from the healthcare industry; engagement, of course, improves quality. Building the right levers into the architecture and solving the whole problem were some of the other key points that I noted down. Most importantly, the key message was that as architects, we have to go beyond drawing the lines and boxes to deliver change – maybe look to deliver things that can create an impact in 30 days, balancing short-term and long-term goals.

I got the stage for a couple of minutes to give an update on the AEA Bangalore Chapter activities. My request to the attendees was to leverage the chapter for their own professional development – using it as a platform to share expertise, get answers to queries, connect with other professionals of similar interests and build their network. Hopefully we will see more participation in the Bangalore chapter events this year.

The security track had multiple interesting sessions. It began with Jim Hietala of The Open Group discussing the Risk Management Framework. I’ve been attending a course on the subject, but this one provided a lot of insight on the taxonomy (O-RT) and the analysis part – more of a quantitative approach than a qualitative one. Though the example was based on risks related to laptop thefts, there is no reason we can’t apply the principles to real issues like quantifying the threats of moving workloads to the cloud (that’s another to-do added to my list).

Then it was my session on best practices for moving workloads to the cloud for Indian banks. I talked about the progress so far with the whitepaper. Attendance was limited, as Jason’s EA workshop was happening in parallel, but those who attended were really interested in the subject. We had a good discussion on the benefits, challenges and regulations with regard to Indian banking workloads and their movement to the cloud, and we discussed a few interesting case studies. There are areas that need more content, and I’ve asked the people who attended the session to participate in the workgroup. We are looking at getting a first draft done in the next 30 days.

Finally, I also sat in on the presentation by Ajit A. Matthew on the security implementation at Intel. Everywhere the message is clear: you need to implement context-based security and security intelligence to enable new-age innovation while at the same time protecting your core assets.

It was a Saturday well spent. I also had some opportunities to connect with a few new folks and understand their security challenges with cloud. I’m looking to keep the dialog going and to have an AEA Bangalore chapter event sometime during Q1. In that direction, I took the first step by writing this up and sharing it with my network.

Event Details:
The Open Group Bangalore, India
January 24-25, 2014

Sreekanth Iyer is an Executive IT Architect in the IBM Security Systems CTO office and works on developing IBM’s Cloud Security Technical Strategy. He is an Open Group Certified Distinguished Architect and is a core member of the Bangalore Chapter of the Association of Enterprise Architects. He has over 18 years’ industry experience and has led several client solutions across multiple industries. His key areas of work include Information Security, Cloud Computing, SOA, Event Processing, and Business Process Management. He has authored several technical articles and blogs and is a core contributor to multiple Open Group as well as IBM publications. He works out of the IBM India Software Lab, Bangalore, and you can follow him on Twitter @sreek.


Filed under Conference, Enterprise Architecture, Healthcare, TOGAF®

Introducing Two New Security Standards for Risk Analysis—Part II – Risk Analysis Standard

By Jim Hietala, VP Security, The Open Group

Last week we took a look at one of the new risk standards recently introduced by The Open Group® Security Forum at The Open Group London 2013 Conference, the Risk Taxonomy Technical Standard 2.0 (O-RT). Today’s blog looks at its sister standard, the Risk Analysis (O-RA) Standard, which provides risk professionals the tools they need to perform thorough risk analyses within their organizations for better decision-making about risk.

Risk Analysis (O-RA) Standard

The new Risk Analysis Standard provides a comprehensive guide for performing effective risk analyses within organizations using the Factor Analysis of Information Risk (FAIR™) framework. O-RA is geared toward managing the frequency and magnitude of loss that can arise from a threat, whether human, animal or natural event – in other words, “how often bad things happen and how bad they are when they occur.” Used together, the O-RT and O-RA Standards give organizations a way to perform consistent risk modeling that can not only help explain risk factors thoroughly to stakeholders but also allow information security professionals to strengthen existing analysis methods or create better ones. O-RA may also be used in conjunction with other risk frameworks to perform risk analysis.

The O-RA Standard is also meant to provide something more than a mere assessment of risk. Many professionals within the security industry fail to distinguish between “assessing” risk and “analyzing” risk. This standard goes beyond assessment by supporting effective analyses, so that risk statements are more meaningful and defensible than the broad risk ratings (“this is a 4 on a scale of 1 to 5”) normally used in assessments.

O-RA also lays out a standard process for approaching risk analysis that can help organizations streamline the way they approach risk measurement. By focusing on these four core process elements, organizations are able to perform more effective analyses:

  • Clearly identifying and characterizing the assets, threats, controls and impact/loss elements at play within the scenario being assessed
  • Understanding the organizational context for analysis (i.e. what’s at stake from an organizational perspective)
  • Measuring/estimating various risk factors
  • Calculating risk using a model that represents a logical, rational, and useful view of what risk is and how it works.

Because measurement and calculation are essential elements of properly analyzing risk variables, an entire chapter of the standard is dedicated to how to measure and calibrate risk. This chapter lays out a number of useful approaches for establishing risk variables, including establishing baseline risk estimates and ranges; creating distribution ranges and most likely values; using Monte Carlo simulations; accounting for uncertainty; determining accuracy vs. precision and subjective vs. objective criteria; deriving vulnerability; using ordinal scales; and determining diminishing returns.
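To make the Monte Carlo approach concrete, here is a minimal sketch (illustrative only, not code from the standard) that turns calibrated minimum / most-likely / maximum estimates of Loss Event Frequency and per-event Loss Magnitude into a distribution of annualized loss exposure. All the numeric ranges below are hypothetical.

```python
import random

def simulate_ale(lef_range, lm_range, iterations=10_000, seed=42):
    """Monte Carlo sketch of annualized loss exposure (ALE).

    lef_range and lm_range are (min, most_likely, max) calibrated
    estimates; each iteration draws from a triangular distribution.
    Returns the sorted list of simulated ALE values.
    """
    rng = random.Random(seed)
    results = []
    for _ in range(iterations):
        lo, mode, hi = lef_range
        lef = rng.triangular(lo, hi, mode)   # loss events per year
        lo, mode, hi = lm_range
        lm = rng.triangular(lo, hi, mode)    # loss per event ($)
        results.append(lef * lm)
    return sorted(results)

# Hypothetical calibrated estimates for a single risk scenario.
ale = simulate_ale(lef_range=(0.1, 0.5, 2.0),
                   lm_range=(5_000, 25_000, 100_000))
print(f"median ALE: ${ale[len(ale) // 2]:,.0f}")
print(f"90th percentile: ${ale[int(len(ale) * 0.9)]:,.0f}")
```

Reporting a range (median plus a high percentile) rather than a single number is one way to account for uncertainty, as the standard’s calibration chapter recommends.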

Finally, a practical, real-world example is provided to take readers through an actual risk analysis scenario. Using the FAIR model, the example outlines the process for dealing with a threat in which an HR executive at a large bank has left the user name and password that give him access to all the company’s HR systems on a Post-It note tacked onto his computer, in clear view of anyone (other employees, cleaning crews, etc.) who comes into his office.

The scenario outlines four stages in assessing this risk:

  1. Stage 1: Identify Scenario Components (Scope the Analysis)
  2. Stage 2: Evaluate Loss Event Frequency (LEF)
  3. Stage 3: Evaluate Loss Magnitude (LM)
  4. Stage 4: Derive and Articulate Risk

Each step of the risk analysis process is thoroughly outlined for the scenario to provide Risk Analysts with an example of how to perform an analysis using the FAIR framework. Considerable guidance is provided for Stages 2 and 3 in particular, as those are the most critical elements in determining organizational risk.
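As a rough sketch of how the four stages fit together for the Post-It note scenario, the FAIR model derives LEF from Threat Event Frequency and Vulnerability, then combines it with Loss Magnitude. All numbers below are invented for illustration and are not taken from the standard.

```python
# Stage 1: Identify scenario components (scope the analysis).
scenario = {
    "asset": "HR system credentials on a Post-It note",
    "threat_community": "office visitors, cleaning crews, other employees",
}

# Stage 2: Evaluate Loss Event Frequency (LEF).
threat_event_frequency = 0.5   # attempted misuses per year (estimate)
vulnerability = 0.8            # probability an attempt succeeds (estimate)
lef = threat_event_frequency * vulnerability

# Stage 3: Evaluate Loss Magnitude (LM) across the relevant forms of loss.
loss_magnitude = 40_000 + 150_000   # response + reputation ($, estimates)

# Stage 4: Derive and articulate risk as annualized loss exposure.
risk = lef * loss_magnitude
print(f"{scenario['asset']}: ~${risk:,.0f}/year")
```

A real analysis would use calibrated ranges rather than point estimates at Stages 2 and 3, which is why the standard devotes so much guidance to those two stages.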

Ultimately, O-RA is a guide to help organizations make better decisions about which risks are the most critical to prioritize and which are less important and may not warrant attention. It is critical for Risk Analysts and organizations to become more consistent in this practice, because the lack of consistency in determining risk among information security professionals has been a major obstacle to security professionals gaining a legitimate “seat at the table” in the boardroom alongside other business functions (finance, HR, etc.).

For our profession to evolve and grow, consistency and accurate measurement are key. Issues and solutions must be identified consistently, and comparisons and measurements must be based on solid foundations, as illustrated below.

[Figure: Chained Dependencies]

O-RA can help organizations arrive at better decisions through consistent analysis techniques, as well as provide more legitimacy within the profession. Without a foundation from which to manage information risk, Risk Analysts and information security professionals may rely too heavily on intuition, bias, or commercial or personal agendas in their analyses and decision making. By outlining a thorough foundation for risk analysis, O-RA provides not only a common foundation for performing risk analyses but also the opportunity to make better decisions and advance the security profession.

For more on the O-RA Standard or to download it, please visit: https://www2.opengroup.org/ogsys/catalog/C13G.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Conference, Open FAIR Certification, RISK Management, Security Architecture

Introducing Two New Security Standards for Risk Analysis—Part I – Risk Taxonomy Technical Standard 2.0

By Jim Hietala, VP Security, The Open Group

At The Open Group London 2013 Conference, The Open Group® announced three new initiatives related to the Security Forum’s work around Risk Management. The first of these was the establishment of a new certification program for Risk Analysts working within the security profession, the Open FAIR Certification Program. Aimed at providing a professional certification for Risk Analysts, the program will bring a much-needed level of assurance to companies looking to hire Risk Analysts, certifying that analysts who have completed the Open FAIR program understand the fundamentals of risk analysis and are qualified to perform that analysis.

Forming the basis of the Open FAIR certification program are two new Open Group standards, version 2.0 of the Risk Taxonomy (O-RT) standard originally introduced by the Security Forum in 2009, and a new Risk Analysis (O-RA) Standard, both of which were also announced at the London conference. These standards are the result of ongoing work around risk analysis that the Security Forum has been conducting for a number of years now in order to help organizations better understand and identify their exposure to risk, particularly when it comes to information security risk.

The Risk Taxonomy and Risk Analysis standards not only form the basis and body of knowledge for the Open FAIR certification, but provide practical advice for security practitioners who need to evaluate and counter the potential threats their organization may face.

Today’s blog will look at the first standard, the Risk Taxonomy Technical Standard, version 2.0. Next week, we’ll look at the other standard for Risk Analysis.

Risk Taxonomy (O-RT) Technical Standard 2.0

Originally published in January 2009, the O-RT is intended to provide a common language and set of references for security and business professionals who need to understand or analyze risk conditions. Version 2.0 of the standard contains a number of updates based on feedback from professionals who have been using the standard and on research conducted by Security Forum member CXOWARE.

The majority of the changes to Version 2.0 are refinements in terminology, including changes in language that better reflect what each term encompasses. For example, the term “Control Strength” in the original standard has now been changed to “Resistance Strength” to reflect that controls used in that part of the taxonomy must be resistive in nature.

More substantive changes were made to the portion of the taxonomy that discusses how Loss Magnitude is evaluated.

Why create a taxonomy for risk?  For two reasons. First, the taxonomy provides a foundation from which risk analysis can be performed and talked about. Second, a tightly defined taxonomy makes risk scenarios easier to measure and estimate consistently, leading to better decision making, as illustrated by the following “risk management stack.”

Effective Management
↑
Well-informed Decisions
↑
Effective Comparisons
↑
Meaningful Measurements
↑
Accurate Risk Model

The complete Risk Taxonomy comprises two branches, Loss Event Frequency (LEF) and Loss Magnitude (LM), illustrated here:

[Figure: Risk Taxonomy – Loss Event Frequency (LEF) and Loss Magnitude (LM) branches]

Focusing solely on pure risk (which only results in loss) rather than speculative risk (which might result in either loss or profit), the O-RT is meant to help estimate the probable frequency and magnitude of future loss.

Traditionally, LM has been far more difficult to determine than LEF, in part because organizations don’t always perform analyses on their losses, or they stick to evaluating “low-hanging fruit” variables rather than delving into more complex risk factors. The new taxonomy takes a deep dive into the Loss Magnitude branch of the risk analysis taxonomy, providing guidance that will allow Risk Analysts to better tackle the difficult task of determining LM. It includes terminology outlining six specific forms of loss an organization can experience (productivity, response, replacement, fines and judgments, competitive advantage, reputation) as well as how to determine Loss Flow, a new concept in this standard.
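The six forms of loss give Loss Magnitude estimates a natural structure. As a hypothetical sketch (the class and method names are mine, not from the standard), per-form estimates for a single loss event might be tallied like this:

```python
from dataclasses import dataclass, field

# The six forms of loss named in the taxonomy.
LOSS_FORMS = ("productivity", "response", "replacement",
              "fines_and_judgments", "competitive_advantage", "reputation")

@dataclass
class LossMagnitudeEstimate:
    """Per-form loss estimates (in $) for one loss event."""
    losses: dict = field(default_factory=dict)

    def set_loss(self, form: str, amount: float) -> None:
        # Restricting entries to the six defined forms keeps
        # analyses comparable across scenarios.
        if form not in LOSS_FORMS:
            raise ValueError(f"unknown form of loss: {form}")
        self.losses[form] = amount

    def total(self) -> float:
        return sum(self.losses.values())

# Hypothetical estimates for a single scenario.
lm = LossMagnitudeEstimate()
lm.set_loss("response", 40_000)      # incident response effort
lm.set_loss("reputation", 150_000)   # secondary stakeholder impact
print(f"estimated loss magnitude: ${lm.total():,.0f}")
```

Forcing every estimate into one of the six named forms is the point of the taxonomy: it stops analysts from double-counting losses or inventing incompatible categories.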

The Loss Flow analysis helps identify how a loss may affect both primary (owners, employees, etc.) and secondary (customers, stockholders, regulators, etc.) stakeholders as a result of a threat agent’s action on an asset. The new standard provides a thorough overview on how to assess Loss Flow and identify the loss factors of any given threat.

Finally, the standard also includes a practical, real-world scenario to help analysts understand how to put the taxonomy to use within their organizations. O-RT provides a common linguistic foundation that allows security professionals to then perform the risk analyses outlined in the O-RA Standard.

For more on the Risk Taxonomy Standard or to download it, visit: https://www2.opengroup.org/ogsys/catalog/C13K.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Conference, Open FAIR Certification, RISK Management, Security Architecture

Jericho Forum declares “success” and sunsets

By Ian Dobson & Jim Hietala, The Open Group
Ten years ago, the Jericho Forum set out on a mission to evangelise the issues, problems and solutions around the emerging business and security challenges of de-perimeterisation, and to provide thought-leadership in that area, with the aim of one day being able to declare “job done”.

That day has now arrived.  Today, de-perimeterisation is an established “fact” – touching not just information security but all areas of modern business, including the bring-your-own-IT phenomenon (devices, IDs, services) as well as all forms of cloud computing. It is widely understood and quoted by the entire industry, and has become part of today’s computing and security lexicon.

With our de-perimeterisation mission accomplished, the Jericho Forum has decided the time has come to “declare success”, celebrate it as a landmark victory in the evolution of information security, and sunset as a separate Forum in The Open Group.

Our “declare success and sunset” victory celebration on Monday 21st Oct 2013 at Central Hall Westminster, London UK, was our valedictory announcement that the Jericho Forum will formally sunset on 1st Nov 2013. Many past leading Jericho Forum members attended the event as guests, and commemorative plaques were awarded to those whose distinctive leadership steered the change in information security mind-set that the Jericho Forum has now achieved.

For those who missed the live-streamed event, you can watch it on the livestream recording at http://new.livestream.com/opengroup/Lon13

We are fortunate to be able to pass our Jericho Forum legacy of de-perimeterisation achievements and publications into the good care of The Open Group’s Security Forum, which has undertaken to maintain the Jericho Forum’s deliverables, protect its legacy from misrepresentation, and perhaps adopt and evolve Jericho’s thought-leadership approach to future information security challenges.

Ian Dobson, Director Jericho Forum
Jim Hietala, VP Security
The Open Group
21st October 2013


Ian Dobson is the director of the Security Forum and the Jericho Forum for The Open Group, coordinating and facilitating the members to achieve their goals in our challenging information security world. In the Security Forum, his focus is on supporting development of open standards and guides on security architectures and management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in future.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Conference, Security Architecture