Category Archives: Cybersecurity

The Open Group Boston 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

Enabling Boundaryless Information Flow™ continued in Boston on Tuesday, July 22. Allen Brown, CEO and President of The Open Group, welcomed attendees with an overview of the company’s second quarter results.

The Open Group membership is at 459 organizations in 39 countries, including 16 new membership agreements in 2Q 2014.

Membership value is highlighted by the collaboration Open Group members experience. For example, over 4,000 individuals attended Open Group events, physically and virtually, whether member meetings, webinars, podcasts or tweet jams. The Open Group website had more than 1 million page views, and over 105,000 publication items were downloaded by members in 80 countries.

Brown also shared highlights from The Open Group Forums, which featured status on many upcoming white papers, snapshots, reference models and standards, as well as individual Forum Roadmaps. The Forums are busy developing and reviewing projects such as the next version of TOGAF®, an Open Group standard; an ArchiMate® white paper; The Open Group Healthcare Forum charter and treatise; Standard Mils™ APIs; and Open FAIR. Many publications are translated into multiple languages, including Chinese and Portuguese. Also, a new Forum will be announced in the third quarter at The Open Group London 2014, so stay tuned for that launch news!

Our first keynote of the day was Making Health Addictive by Joseph Kvedar, MD, Partners HealthCare, Center for Connected Health.

Dr. Kvedar described how Healthcare delivery is changing, with mobile technology playing a big part. Other factors pushing change are reimbursement paradigms and caregivers being paid to be more efficient and to keep people healthy and out of hospitals. The goal of Healthcare providers is to integrate care into the day-to-day lives of patients. Healthcare also aims for better technologies and architecture.

Mobile is a game-changer in Healthcare because people are “always on and connected”. Mobile technology allows for in-the-moment messaging, the ability to capture health data (GPS, accelerometer, etc.) and the display of information in real time as needed. Bottom line: smartphones are addictive, which makes them excellent tools for communication and engagement.

But there is a need to understand and address the implications of automating Healthcare: security, privacy, accountability, economics.

The plenary continued with Proteus Duxbury, CTO, Connect for Health Colorado, who presented From Build to Run at the Colorado Health Insurance Exchange – Achieving Long-term Sustainability through Better Architecture.

Duxbury stated that the keys to his organization’s success are the leadership and team’s shared vision; a flexible vendor able to stay agile with rapidly changing regulatory requirements; and a COTS solution that required minimal customization and custom development, with resilient architecture and security. Connect for Health faces many challenges, including budget constraints, regulation and operating in a “fish bowl”. Yet, they are on track with their three-year ‘build to run’ roadmap, stabilizing their foundation and gaining efficiencies.

During the Q&A with Allen Brown following each presentation, both speakers emphasized the need for standards, architecture and data security.

Allen Brown and Proteus Duxbury

During the afternoon, track sessions consisted of Healthcare, Enterprise Architecture (EA) & Business Value, Service-Oriented Architecture (SOA), Security & Risk Management, Professional Development and ArchiMate Tutorials. Chris Armstrong, President, Armstrong Process Group, Inc. discussed Architecture Value Chain and Capability Model. Laura Heritage, Principal Solution Architect / Enterprise API Platform, SOA Software, presented Protecting your APIs from Threats and Hacks.

The evening culminated with a reception at the historic Old South Meeting House, where the Boston Tea Party began in 1773.


Networking Reception at Old South Meeting House

A special thank you to our sponsors and exhibitors at The Open Group Boston 2014: BiZZdesign, Black Duck, Corso, Good e-Learning, Orbus and AEA.

Join the conversation #ogBOS!

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog and media relations. Loren has over 20 years of experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.


Filed under Accreditations, Boundaryless Information Flow™, Business Architecture, COTS, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, Information security, Open FAIR Certification, OTTF, RISK Management, Service Oriented Architecture, Standards, Uncategorized

New Health Data Deluges Require Secure Information Flow Enablement Via Standards, Says The Open Group’s New Healthcare Director

By The Open Group

Below is the transcript of The Open Group podcast on how new devices and practices have the potential to expand the information available to Healthcare providers and facilities.

Listen to the podcast here.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview coming to you in conjunction with The Open Group’s upcoming event, Enabling Boundaryless Information Flow™ July 21-22, 2014 in Boston.

I’m Dana Gardner, Principal Analyst at Interarbor Solutions, and I’ll be your host and moderator for the series of discussions from the conference on Boundaryless Information Flow, Open Platform 3.0™, Healthcare, and Security issues.

One area of special interest is the Healthcare arena, and Boston is a hotbed of innovation and adaption for how technology, Enterprise Architecture, and standards can improve the communication and collaboration among Healthcare ecosystem players.

And so, we’re joined by a new Forum Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes, and efficiencies is pushing the Healthcare industry to rapid change.

With that, please join me now in welcoming our guest. We’re here with Jason Lee, Healthcare and Security Forums Director at The Open Group. Welcome, Jason.

Jason Lee: Thank you so much, Dana. Good to be here.

Gardner: Great to have you. I’m looking forward to the Boston conference and want to remind our listeners and readers that it’s not too late to sign up. You can learn more at http://www.opengroup.org.

Jason, let’s start by talking about the relationship between Boundaryless Information Flow, which is a major theme of the conference, and healthcare. Healthcare perhaps is the killer application for Boundaryless Information Flow.

Lee: Interesting, I haven’t heard it referred to that way, but healthcare is 17 percent of the US economy. It’s upwards of $3 trillion. The costs of healthcare are a problem, not just in the United States, but all over the world, and there are a great number of inefficiencies in the way we practice healthcare.

We don’t necessarily intend to be inefficient, but there are so many places and people involved in healthcare, it’s very difficult to get them to speak the same language. It’s almost as if you’re in a large house with lots of different rooms, and every room you walk into they speak a different language. To get information to flow from one room to the other requires some active efforts and that’s what we’re undertaking here at The Open Group.

Gardner: What is it about the current collaboration approaches that don’t work? Obviously, healthcare has been around for a long time and there have been different players involved. What’s the hurdle? What prevents a nice, seamless, easy flow and collaboration in information that gets better outcomes? What’s the holdup?

Lee: There are many ways to answer that question, because there are many barriers. Perhaps the simplest is the transformation of healthcare from a paper-based industry to a digital industry. Everyone has walked into an office, looked behind the people at the front desk, and seen file upon file and row upon row of folders, information that’s kept in a written format.

When there’s been movement toward digitizing that information, not everyone has used the same system. It’s almost like trains running on a different gauge track. Obviously if the track going east to west is a different gauge than going north to south, then trains aren’t going to be able to travel on those same tracks. In the same way, healthcare information does not flow easily from one office to another or from one provider to another.

Gardner: So not only do we have disparate strategies for collecting and communicating health data, but we’re also seeing much larger amounts of data coming from a variety of new and different places. Some of them now even involve sensors inside of patients themselves or devices that people will wear. So is the data deluge, the volume, also an issue here?

Lee: Certainly. I heard recently that an integrated health plan, which has multiple hospitals involved, contains more elements of data than the Library of Congress. As information is collected at multiple points in time, over a relatively short period of time, you really do have a data deluge. Figuring out how to find your way through all the data and look at the most relevant for the patient is a great challenge.

Gardner: I suppose the bad news is that there is this deluge of data, but it’s also good news, because more data means more opportunity for analysis, a better ability to predict and determine best practices, and also provide overall lower costs with better patient care.

So it seems like the stakes are rather high here to get this right, to not just crumble under a volume or an avalanche of data, but to master it, because it’s perhaps the future. The solution is somewhere in there too.

Lee: No question about it. At The Open Group, our focus is on solutions. We, like others, put a great deal of effort into describing the problems, but our real work is figuring out how to bring IT technologies to bear on business problems, how to encourage different parts of organizations to speak to one another and different organizations to speak the same language, and how to operate using common standards. That’s really what we’re all about.

And it is, in a large sense, part of the process of helping to bring healthcare into the 21st Century. A number of industries are a couple of decades ahead of healthcare in the way they use large datasets — big data, some people refer to it as. I’m talking about companies like big department stores and large online retailers. They really have stepped up to the plate and are using that deluge of data in ways that are very beneficial to them, and healthcare can do the same. We’re just not quite at the same level of evolution.

Gardner: And to your point, the stakes are so much higher. Retail is, of course, a big deal in the economy, but as you pointed out, healthcare is such a much larger segment and portion. So just making modest improvements in communication, collaboration, or data analysis can reap huge rewards.

Lee: Absolutely true. There is the cost side of things, but there is also the quality side. So there are many ways in which healthcare can improve through standardization and coordinated development, using modern technology that cannot just reduce cost, but improve quality at the same time.

Gardner: I’d like to get into a few of the hotter trends, but before we do, it seems that The Open Group has recognized the importance here by devoting the entire second day of its Boston conference, July 22, to Healthcare.

Maybe you could give us a brief overview of what participants, and even those who tune in online or view recorded sessions of the conference at http://new.livestream.com/opengroup, should expect. What’s going to happen on July 22nd?

Lee: We have a packed day. We’re very excited to have Dr. Joe Kvedar, a physician at Partners HealthCare and Founding Director of the Center for Connected Health, as our first plenary speaker. The title of his presentation is “Making Health Addictive.” Dr. Kvedar is a widely respected expert on mobile health, which is currently the Healthcare Forum’s top work priority. As mobile medical devices become ever more available and diversified, they will enable consumers to know more about their own health and wellness. A great deal of potentially useful health data will be generated. How this information can be used, not just by consumers but also by the healthcare establishment that takes care of them as patients, will become a question of increasing importance. It will become an area where standards development and The Open Group can be very helpful.

Our second plenary speaker, Proteus Duxbury, Chief Technology Officer at Connect for Health Colorado, will discuss a major feature of the Affordable Care Act, the health insurance exchanges, which are designed to bring health insurance to tens of millions of people who previously did not have access to it. Mr. Duxbury is going to talk about how Enterprise Architecture, which is really about getting to solutions by helping the IT folks talk to the business folks and vice versa, has helped the State of Colorado develop its Health Insurance Exchange.

After the plenaries, we will break up into 3 tracks, one of which is Healthcare-focused. In this track there will be three presentations, all of which discuss how Enterprise Architecture and the approach to Boundaryless Information Flow can help healthcare and healthcare decision-makers become more effective and efficient.

One presentation will focus on the transformation of care delivery at the Visiting Nurse Service of New York. Another will address stewarding healthcare transformation using Enterprise Architecture, focusing on one of our Platinum members, Oracle, and a company called Intelligent Medical Objects, and how they’re working together in a productive way, bringing IT and healthcare decision-making together.

Then, the final presentation in this track will focus on the development of an Enterprise Architecture-based solution at an insurance company. The payers, or the insurers–the big companies that are responsible for paying bills and collecting premiums–have a very important role in the healthcare system that extends beyond administration of benefits. Yet, payers are not always recognized for their key responsibilities and capabilities in the area of clinical improvements and cost improvements.

With the increase in payer data brought on in large part by the adoption of a new coding system, ICD-10, which will come online this year, a huge amount of additional data, including clinical data, will become available. At The Open Group, we consider payers, that is, health insurance companies (some of which are integrated with providers), very important stakeholders in the big picture.

In the afternoon, we’re going to switch gears a bit and have a speaker talk about the challenges, the barriers, the “pain points” in introducing new technology into healthcare systems. The focus will return to remote or mobile medical devices and the predictable but challenging barriers to getting newly generated health information to flow to doctors’ offices and into patients’ records, electronic health records, and hospitals’ data-keeping and data-sharing systems.

We’ll have a panel of experts that responds to these pain points, these challenges, and then we’ll draw heavily from the audience, who we believe will be very, very helpful, because they bring a great deal of expertise in guiding us in our work. So we’re very much looking forward to the afternoon as well.

Gardner: It’s really interesting. A couple of these different plenaries and discussions in the afternoon come back to this user-generated data. Jason, we really seem to be on the cusp of a whole new level of information that people will be able to develop from themselves through their lifestyle, new devices that are connected.

We hear from folks like Apple, Samsung, Google, and Microsoft. They’re all pulling together information and making it easier for people to not only monitor their exercise, but their diet, and maybe even start to use sensors to keep track of blood sugar levels, for example.

In fact, a new Flurry Analytics survey showed a 62 percent increase in the use of health and fitness applications over the last six months on popular mobile devices. This compares to a 33 percent increase for applications in general, so health and fitness applications are seeing an 87 percent faster uptick.

Tell me a little bit how you see this factoring in. Is this a mixed blessing? Will so much data generated from people in addition to the electronic medical records, for example, be a bad thing? Is this going to be a garbage in, garbage out, or is this something that could potentially be a game-changer in terms of how people react to their own data and then bring more data into the interactions they have with care providers?

Lee: It’s always a challenge to predict what the market is going to do, but I think that’s a remarkable statistic that you cited. My prediction is that the increased volume of person-generated data from mobile health devices is going to be a game-changer. This view also reflects how the Healthcare Forum’s members (from companies including Capgemini, Philips, IBM, Oracle and HP) view the future.

The commercial demand for mobile medical devices, things that can be worn, embedded, or even swallowed as pills, as you mentioned, keeps growing. The software and applications that will be developed for use with these devices are going to grow by leaps and bounds. As you say, there are big players getting involved. Already, some of the pedometer-type devices that measure the number of steps taken in a day have captured the interest of many, many people. Even David Sedaris, serious guy that he is, was writing about it recently in The New Yorker.

What we will find is that many of the health indicators that we used to have to go to the doctor or nurse or lab to get information on will become available to us through these remote devices.

There will be a question, of course, as to the reliability and validity of the information, to your point about garbage in, garbage out, but I think standards development will help here. This, again, is where The Open Group comes in. We might also see the FDA exercising its role in ensuring safety here, as well as other organizations, in determining which devices are reliable.

The Open Group is working in the area of mobile data and the information systems that are developed around it, and their ability to (a) talk to one another and (b) talk to the devices and infrastructure used in doctors’ offices and hospitals. This is called interoperability, and it’s certainly lacking in this country.

There are already problems around interoperability and connectivity of information in the healthcare establishment as it is now. When patients and consumers start collecting their own data, and the patient is put at the center of the nexus of healthcare, then the question becomes how does that information that patients collect get back to the doctor/clinician in ways in which the data can be trusted and where the data are helpful?

After all, if a patient is wearing a medical device, there is the opportunity to collect data, about blood sugar level let’s say, throughout the day. And this is really taking healthcare outside of the four walls of the clinic and bringing information to bear that can be very, very useful to clinicians and beneficial to patients.

In short, the rapid market dynamic in mobile medical devices, and in the software and hardware that facilitate interoperability, begs for standards-based solutions that reduce costs and improve quality, all of which puts the patient at the center. This is the sweet spot of The Open Group’s Healthcare Forum.

Gardner: It seems to me a real potential game-changer as well, and that something like Boundaryless Information Flow and standards will play an essential role. Because one of the big question marks with many of the ailments in a modern society has to do with lifestyle and behavior.

So often, the providers of the care only really have the patient’s responses to questions, but imagine having a trove of data at their disposal, a 360-degree view of the patient to then further the cause of understanding what’s really going on, on a day-to-day basis.

But then, it’s also having a two-way street, being able to deliver perhaps in an automated fashion reinforcements and incentives, information back to the patient in real-time about behavior and lifestyles. So it strikes me as something quite promising, and I look forward to hearing more about it at the Boston conference.

Any other thoughts on this issue about patient flow of data, not just among and between providers and payers, for example, or providers in an ecosystem of care, but with the patient as the center of it all, as you said?

Lee: As more mobile medical devices come to the market, we’ll find that consumers own multiple types of devices at least some of which collect multiple types of data. So even for the patient, being at the center of their own healthcare information collection, there can be barriers to having one device talk to the other. If a patient wants to keep their own personal health record, there may be difficulties in bringing all that information into one place.

So the interoperability issue, the need for standards, guidelines, and voluntary consensus among stakeholders about how information is represented becomes an issue, not just between patients and their providers, but for individual consumers as well.
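Lee’s interoperability point can be made concrete with a small sketch. The two vendor payloads and field names below are invented for illustration; the idea is simply that device data must be normalized to a shared schema (common field names, units, and timestamps) before a personal health record, or a clinician’s system, can merge it:

```python
from datetime import datetime, timezone

# Hypothetical payloads from two wearable vendors reporting the same
# measurement (blood glucose) with incompatible field names and units.
vendor_a = {"ts": "2014-07-22T08:30:00Z", "glucose_mgdl": 110}
vendor_b = {"measured_at": 1406017800, "glucose_mmol_l": 6.1}

def normalize_a(reading):
    """Vendor A already uses mg/dL and ISO-8601 UTC; just rename fields."""
    return {"time": reading["ts"],
            "type": "blood_glucose",
            "value_mgdl": float(reading["glucose_mgdl"])}

def normalize_b(reading):
    """Vendor B: convert epoch seconds to ISO-8601 and mmol/L to mg/dL."""
    ts = datetime.fromtimestamp(reading["measured_at"], tz=timezone.utc)
    return {"time": ts.strftime("%Y-%m-%dT%H:%M:%SZ"),
            "type": "blood_glucose",
            "value_mgdl": round(reading["glucose_mmol_l"] * 18.0182, 1)}

# One consistent record stream, regardless of which device produced it.
records = [normalize_a(vendor_a), normalize_b(vendor_b)]
```

Standards work aims to make such ad hoc adapters unnecessary by agreeing on the representation up front, rather than forcing every pairing of devices and systems to write its own translation.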

Gardner: And also the cloud providers. There will be a variety of large organizations with cloud-modeled services, and they are going to need to be, in some fashion, brought together, so that a complete 360-degree view of the patient is available when needed. It’s going to be an interesting time.

Of course, we’ve also looked at many other industries and tried to have a cloud synergy, a cloud-of-clouds approach to data and also the transaction. So it’s interesting how what’s going on in multiple industries is common, but it strikes me that, again, the scale and the impact of the healthcare industry makes it a leader now, and perhaps a driver for some of these long overdue structured and standardized activities.

Lee: It could become a leader. There is no question about it. Moreover, there is a lot Healthcare can learn from other companies, from mistakes that other companies have made, from lessons they have learned, from best practices they have developed (both on the content and process side). And there are issues, around security in particular, where Healthcare will be at the leading edge in trying to figure out how much is enough, how much is too much, and what kinds of solutions work.

There’s a great future ahead here. It’s not going to be without bumps in the road, but organizations like The Open Group are designed and experienced to help multiple stakeholders come together and have the conversations that they need to have in order to push forward and solve some of these problems.

Gardner: Well, great. I’m sure there will be a lot more about how to actually implement some of those activities at the conference. Again, that’s going to be in Boston, beginning on July 21, 2014.

We’ll have to leave it there. We’re about out of time. We’ve been talking with a new Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes, and efficiencies is pushing the Healthcare industry toward rapid change. And, as we’ve heard, that might very well spill over into other industries as well.

So we’ve seen how innovation and adaptation around technology, Enterprise Architecture and standards can improve the communication and collaboration among Healthcare ecosystem players.

It’s not too late to register for The Open Group Boston 2014 (http://www.opengroup.org/boston2014) and join the conversation via Twitter #ogchat #ogBOS, where you will be able to learn more about Boundaryless Information Flow, Open Platform 3.0, Healthcare and other relevant topics.

So a big thank you to our guest. We’ve been joined by Jason Lee, Healthcare and Security Forums Director at The Open Group. Thanks so much, Jason.

Lee: Thank you very much.



Filed under Boundaryless Information Flow™, Cloud, Conference, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, Information security, Interoperability, Open Platform 3.0, Standards, Uncategorized

The Open Group Boston 2014 to Explore How New IT Trends are Empowering Improvements in Business

By The Open Group

The Open Group Boston 2014 will be held on July 21-22 and will cover the major issues and trends surrounding Boundaryless Information Flow™. Thought-leaders at the event will share their outlook on IT trends, capabilities, best practices and global interoperability, and how this will lead to improvements in responsiveness and efficiency. The event will feature presentations from representatives of prominent organizations on topics including Healthcare, Service-Oriented Architecture, Security, Risk Management and Enterprise Architecture. The Open Group Boston will also explore how cross-organizational collaboration and trends such as big data and cloud computing are helping to make enterprises more effective.

The event will consist of two days of plenaries and interactive sessions that will provide in-depth insight on how new IT trends are leading to improvements in business. Attendees will learn how industry organizations are seeking large-scale transformation and some of the paths they are taking to realize that.

The first day of the event will bring together subject matter experts in the Open Platform 3.0™, Boundaryless Information Flow™ and Enterprise Architecture spaces. The day will feature thought-leaders from organizations including Boston University, Oracle, IBM and Raytheon. One of the keynotes, from Marshall Van Alstyne, Professor at Boston University School of Management and Researcher at MIT Center for Digital Business, will reveal the secrets of internet-driven marketplaces. Other content:

• The Open Group Open Platform 3.0™ focuses on new and emerging technology trends converging with each other and leading to new business models and system designs. These trends include mobility, social media, big data analytics, cloud computing and the Internet of Things.
• Cloud security and the key differences in securing cloud computing environments vs. traditional ones as well as the methods for building secure cloud computing architectures
• Big Data as a service framework as well as preparing to deliver on Big Data promises through people, process and technology
• Integrated Data Analytics and using them to improve decision outcomes

The second day of the event will have an emphasis on Healthcare, with keynotes from Joseph Kvedar, MD, Partners HealthCare, Center for Connected Health, and Connect for Health Colorado CTO Proteus Duxbury. The day will also showcase speakers from Hewlett Packard and Blue Cross Blue Shield, multiple tracks on a wide variety of topics such as Risk and Professional Development, and ArchiMate® tutorials. Key learnings include:

• Improving healthcare’s information flow is a key enabler of better healthcare outcomes and of efficiencies within today’s delivery models
• Identifying the current state of IT standards, and future opportunities, across the healthcare ecosystem
• How ArchiMate® can be used by Enterprise Architects to drive business innovation with tried-and-true techniques and best practices
• How Security and Risk Management are evolving as software applications become more accessible through APIs, which can introduce vulnerabilities and increase the need for security while still preserving the business value of APIs

Member meetings will also be held on Wednesday and Thursday, July 23-24.

Don’t wait, register now to participate in these conversations and networking opportunities during The Open Group Boston 2014: http://www.opengroup.org/boston2014/registration

Join us on Twitter – #ogchat #ogBOS


Filed under ArchiMate®, Boundaryless Information Flow™, Business Architecture, Cloud/SOA, Conference, Enterprise Architecture, Enterprise Transformation, Healthcare, Information security, Open Platform 3.0, Professional Development, RISK Management, Service Oriented Architecture, Standards, Uncategorized

The Power of APIs – Join The Open Group Tweet Jam on Wednesday, July 9th

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

The face of technology is evolving at breakneck speed, driven by demand from consumers and businesses alike for more robust, intuitive and integrated service offerings. APIs (application programming interfaces) have made this possible by offering greater interoperability between otherwise disparate software and hardware systems. While there are clear benefits to their use, how do today’s security- and value-conscious enterprises take advantage of this new interoperability without exposing themselves?

On Wednesday, July 9th at 9:00 am PT/12:00 pm ET/5:00 pm GMT, please join us for a tweet jam that will explore how APIs are changing the face of business today, and how to prepare for their implementation in your enterprise.

APIs are at the heart of how today’s technologies communicate with one another, and have been influential in enabling new levels of development for social, mobility and beyond. The business benefits of APIs are extensive, as are the opportunities to explore how they can be effectively used and developed.

There is reason to maintain a certain level of caution, however, as recent security issues involving open APIs have impacted overall confidence and sustainability.

This tweet jam will look at the business benefits of APIs, as well as potential vulnerabilities and weak points that you should be wary of when integrating them into your Enterprise Architecture.

We welcome The Open Group members and interested participants from all backgrounds to join the discussion and interact with our panel of thought-leaders from The Open Group, including Jason Lee, Healthcare and Security Forums Director; Jim Hietala, Vice President of Security; David Lounsbury, CTO; and Dr. Chris Harding, Director for Interoperability and Director of the Open Platform 3.0™ Forum. To access the discussion, please follow the hashtag #ogchat during the allotted discussion time.

Interested in joining The Open Group Security Forum? Register your interest here.

What Is a Tweet Jam?

A tweet jam is a 45-minute “discussion” hosted on Twitter. The purpose of the tweet jam is to share knowledge and answer questions on relevant and thought-provoking issues. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone using Twitter who is interested in the topic) is encouraged to join the discussion.

Participation Guidance

Here are some helpful guidelines for taking part in the tweet jam:

  • Please introduce yourself (name, title and organization)
  • Use the hashtag #ogchat following each of your tweets
  • Begin your tweets with the question number to which you are responding
  • Please refrain from individual product/service promotions – the goal of the tweet jam is to foster an open and informative dialogue
  • Keep your commentary focused, thoughtful and on-topic

If you have any questions prior to the event or would like to join as a participant, please contact George Morin (@GMorin81 or george.morin@hotwirepr.com).

We look forward to a spirited discussion and hope you will be able to join!


3 Comments

Filed under Data management, digital technologies, Enterprise Architecture, Enterprise Transformation, Information security, Open Platform 3.0, real-time and embedded systems, Standards, Strategy, Tweet Jam, Uncategorized

Heartbleed: Tips and Lessons Learned

By Jim Hietala, VP, Security, The Open Group

During our upcoming event May 12-14, The Open Group Summit 2014 Amsterdam – Enabling Boundaryless Information Flow™ – one of the discussions will be around risk management and the development of open methodologies for managing risk.

Managing risk is an essential component of an information security program. Risk management is fundamental to effectively securing information, IT assets, and critical business processes. Risk management is also a challenge to get right. With numerous risk management frameworks and standards available, it can be difficult for practitioners to know where to start, and what methodologies to employ.

Recently, the Heartbleed bug has been wreaking havoc, not only for major websites and organizations, but also for the security confidence of the public in general. Even as patches are applied, systems will remain vulnerable for an extended period of time. Taking proactive steps and learning how to manage risk is imperative to securing your privacy.

With impacts on an estimated 60-70% of websites, Heartbleed is arguably the security vulnerability with the highest degree of potential impact to date. There is helpful guidance as to what end-users can do to insulate themselves from negative consequences.

Large organizations obviously need to determine where they have websites and network equipment that is vulnerable, in order to rapidly remediate this. Scanning your IP address range (both for internal addresses, and for IP addresses exposed to the Internet) should be done ASAP, to allow you to identify all sites, servers, and other equipment using OpenSSL, and needing immediate patching.
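While a full scan is underway, many servers advertise their OpenSSL version in HTTP `Server` headers or service banners, which can support a quick first-pass triage. The sketch below is a minimal illustration only – the banner-matching logic is our own assumption, not part of any official guidance, and a banner match (or its absence) is never conclusive; a definitive check requires an actual heartbeat probe such as nmap’s `ssl-heartbleed` NSE script.

```python
import re

# OpenSSL 1.0.1 through 1.0.1f (and the 1.0.2 betas) shipped the vulnerable
# heartbeat code; 1.0.1g and later releases are patched.
# This pattern is an illustrative assumption for triage, not a definitive test.
VULNERABLE = re.compile(r"OpenSSL/1\.0\.1(?![g-z])|OpenSSL/1\.0\.2-beta")

def looks_vulnerable(server_banner: str) -> bool:
    """First-pass triage from an advertised server banner.

    A banner is only a hint: absence of a match does not prove safety,
    since many servers do not expose their OpenSSL version at all.
    """
    return bool(VULNERABLE.search(server_banner))
```

Hosts flagged by this kind of banner check would still need to be confirmed with a real heartbeat probe and then patched and re-keyed.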

In the last few days, it has become clear that we are not just talking about websites/web servers. Numerous network equipment vendors have used OpenSSL in their networking products. Look closely at your routers, switches, and firewalls, and make sure you understand which of these also use OpenSSL. The impact of OpenSSL and Heartbleed on these infrastructure components is likely to be a bigger problem for organizations, as the top router manufacturers all have products affected by this vulnerability.

Taking a step back from the immediate frenzy of finding OpenSSL, and patching websites and network infrastructure to mitigate this security risk, it is pretty clear that we have a lot of work to do as a security community on numerous fronts:

• Open source security components that gain widespread use need much more serious attention, in terms of finding/fixing software vulnerabilities
• For IT hardware and software vendors, and for the organizations that consume their products, OpenSSL and Heartbleed will become the poster child for why we need more rigorous supply chain security mechanisms generally, and specifically for commonly used open source software.
• The widespread impacts from Heartbleed should also focus attention on the need for radically improved security for the emerging Internet of Things (IoT). As bad as Heartbleed is, try to imagine a similar situation when there are billions of IP devices connected to the internet. This is precisely where we are headed absent big changes in software assurance/supply chain security for IoT devices.

Finally, there is a deeper issue here: CIOs and IT people should realize that fundamental security barriers, such as SSL, are under constant attack – and these security walls won’t hold forever. So, it is important not to simply patch your SSL and reissue your certificates, but to rethink your strategies for security defense in depth, such as increased protection of critical data and multiple independent levels of security.

You also need to ensure that your suppliers are implementing security practices that are at least as good as yours – how many web sites got caught out by Heartbleed because of something their upstream supplier did?

Discussions during the Amsterdam Summit will outline important areas to be aware of when managing security risk, including how to be more effective against any copycat bugs. Be sure to sign up now for our summit http://www.opengroup.org/amsterdam2014 .

For more information on The Open Group Security Forum, please visit http://www.opengroup.org/subjectareas/security.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security, risk management and healthcare programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

2 Comments

Filed under Boundaryless Information Flow™, Cybersecurity, Information security, RISK Management

How the Open Trusted Technology Provider Standard (O-TTPS) and Accreditation Will Help Lower Cyber Risk

By Andras Szakal, Vice President and Chief Technology Officer, IBM U.S. Federal

Changing business dynamics and enabling technologies

In 2008, IBM introduced the concept of a “Smarter Planet.” The Smarter Planet initiative focused, in part, on the evolution of globalization against the backdrop of changing business dynamics and enabling technologies. A key concept was the need for infrastructure to be tightly integrated, interconnected, and intelligent, thereby facilitating collaboration between people, government and businesses in order to meet the world’s growing appetite for data and automation. Since then, many industries and businesses have adopted this approach, including the ICT (information and communications technology) industries that support the global technology manufacturing supply chain.

Intelligent and interconnected critical systems

This transformation has infused technology into virtually all aspects of our lives, and involves, for example, government systems, the electric grid and healthcare. Most of these technological solutions are made up of hundreds or even thousands of components that are sourced from the growing global technology supply chain.

In the global technology economy, no one technology vendor or integrator can always provide a single-source solution. It is no longer cost-competitive to design all of the electronic components, printed circuit boards, card assemblies, or other sub-assemblies in-house. Adapting to the changing marketplace and landscape by balancing response time and cost efficiency, in an expedient manner, drives more widespread use of OEM (original equipment manufacturer) products.

As a result, most technology providers procure from a myriad of global component suppliers, who very often require similarly complex supply chains to source their own components. Every enterprise has a supplier network, each of those suppliers has its own supply chain network, and those sub-tier suppliers have supply chain networks of their own. The resultant technology supply chain is manifested as a network of integrated suppliers.

Increasingly, the critical systems of the planet — telecommunications, banking, energy and others — depend on and benefit from the intelligence and interconnectedness enabled by existing and emerging technologies. As evidence, one need only look to the increase in enterprise mobile applications and BYOD strategies to support corporate and government employees.

Cybersecurity by design: Addressing risk in a sustainable way across the ecosystem

Whether these systems are trusted by the societies they serve depends in part on whether the technologies incorporated into them are fit for the purpose they are intended to serve. Fit for purpose is manifested in two essential ways:

- Does the product meet essential functional requirements?
- Has the product or component been produced by a trustworthy provider?

Of course, the leaders or owners of these systems have to do their part to achieve security and safety: e.g., to install, use and maintain technology appropriately, and to pay attention to people and process aspects such as insider threats. Cybersecurity considerations must be addressed in a sustainable way from the get-go, by design, and across the whole ecosystem — not after the fact, or in just one sector or another, or in reaction to crisis.

Assuring the quality and integrity of mission-critical technology

In addressing the broader cybersecurity challenge, however, buyers of mission-critical technology naturally seek reassurance as to the quality and integrity of the products they procure. In our view, the fundamentals of the institutional response to that need are similar to those that have worked in prior eras and in other industries — like food.

The very process of manufacturing technology is not immune to cyber-attack. Attacks on the supply chain are typically motivated by monetary gain: they aim to inflict massive economic damage in an effort to gain global economic advantage, or to seed targets with malware that provides unfettered access for attackers.

It is for this reason that the global technology manufacturing industry must establish practices that mitigate this risk by raising the cost barriers of launching such attacks and increasing the likelihood of being caught before the effects of such an attack are irreversible. As these threats evolve, the global ICT industry must deploy enhanced security through advanced automated cyber intelligence analysis. As critical infrastructure becomes more automated, integrated and essential to critical functions, the technology supply chain that surrounds it must be considered a principal theme of the overall global security and risk mitigation strategy.

A global, agile, and scalable approach to supply chain security

Certainly, the manner in which technologies are invented, produced, and sold requires a global, agile, and scalable approach to supply chain assurance, which is essential to achieving the desired results. Any technology supply chain security standard that hopes to be widely adopted must be flexible and country-agnostic. The very nature of the global supply chain (massively segmented and diverse) requires an approach that provides practicable guidance but avoids being overly prescriptive. Such an approach requires the aggregation of industry practices that have proven beneficial and effective at mitigating risk.

The OTTF (The Open Group Trusted Technology Forum) is an increasingly recognized and promising industry initiative to establish best practices to mitigate the risk of technology supply chain attack. Facilitated by The Open Group, a recognized international standards and certification body, the OTTF is working with governments and industry worldwide to create vendor-neutral open standards and best practices that can be implemented by anyone. Current membership includes many of the most well-known technology vendors, integrators, and technology assessment laboratories.

The benefits of O-TTPS for governments and enterprises

IBM is currently a member of the OTTF and has been honored to hold the Chair for the last three years.  Governments and enterprises alike will benefit from the work of the OTTF. Technology purchasers can use the Open Trusted Technology Provider™ Standard (O-TTPS) and Framework best-practice recommendations to guide their strategies.

A wide range of technology vendors can use O-TTPS approaches to build security and integrity into their end-to-end supply chains. The first version of the O-TTPS is focused on mitigating the risk of maliciously tainted and counterfeit technology components or products. Note that a maliciously tainted product is one that has been produced by the provider and acquired through reputable channels but which has been maliciously tampered with. A counterfeit product is produced other than by or for the provider, or is supplied by a non-reputable channel, and is represented as legitimate. The OTTF is currently working on a program that will accredit technology providers who conform to the O-TTPS. IBM expects to complete pilot testing of the program by 2014.

IBM has actively supported the formation of the OTTF and the development of the O-TTPS for several reasons. These include but are not limited to the following:

- The Forum was established within a trusted and respected international standards body – The Open Group.
- The Forum was founded, in part, through a true public-private partnership in which government members actively participate.
- The OTTF membership includes some of the most mature and trusted commercial technology manufacturers and vendors, and a primary objective of the OTTF has been harmonization with other standards groups such as ISO (International Organization for Standardization) and Common Criteria.

The O-TTPS defines a framework of organizational guidelines and best practices that enhance the security and integrity of COTS ICT. The first version of the O-TTPS is focused on mitigating certain risks of maliciously tainted and counterfeit products within the technology development / engineering lifecycle. These best practices are equally applicable for systems integrators; however, the standard is intended to primarily address the point of view of the technology manufacturer.

O-TTPS requirements

The O-TTPS requirements are divided into three categories:

1. Development / Engineering Process and Method
2. Secure Engineering Practices
3. Supply Chain Security Practices

The O-TTPS is intended to establish a normalized set of criteria against which a technology provider, component supplier, or integrator can be assessed. The standard is divided into categories that define best practices for engineering development practices, secure engineering, and supply chain security and integrity intended to mitigate the risk of maliciously tainted and counterfeit components.

The accreditation program

As part of the process for developing the accreditation criteria and policy, the OTTF established a pilot accreditation program. The purpose of the pilot was to take a handful of companies through the accreditation process and remediate any potential process or interpretation issues. IBM participated in the O-TTPS accreditation pilot to accredit a very significant segment of its software product portfolio: the Application Infrastructure Middleware Division (AIM), which includes the flagship WebSphere product line. The AIM pilot started in mid-2013, completed in the first week of 2014, and was formally recognized as accredited in the first week of February 2014.

IBM is currently leveraging the value of the O-TTPS and working to accredit additional development organizations. Some of the lessons learned during the IBM AIM initial O-TTPS accreditation include:

- A pre-assessment against the O-TTPS should be conducted before an organization formally enters accreditation. This allows for remediation of any gaps and reduces potential assessment costs and schedule impact.
- Start with a segment of your development portfolio that has mature secure engineering practices and processes. This helps an organization address accreditation requirements and facilitates interactions with the 3rd party lab.
- Using your first successful O-TTPS accreditation to create templates that will help drive data gathering and validate practices to establish a repeatable process as your organization undertakes additional accreditations.

Andras Szakal, VP and CTO, IBM U.S. Federal, is responsible for IBM’s industry solution technology strategy in support of the U.S. Federal customer. Andras was appointed IBM Distinguished Engineer and Director of IBM’s Federal Software Architecture team in 2005. He is an Open Group Distinguished Certified IT Architect, IBM Certified SOA Solution Designer and a Certified Secure Software Lifecycle Professional (CSSLP).  Andras holds undergraduate degrees in Biology and Computer Science and a Masters Degree in Computer Science from James Madison University. He has been a driving force behind IBM’s adoption of government IT standards as a member of the IBM Software Group Government Standards Strategy Team and the IBM Corporate Security Executive Board focused on secure development and cybersecurity. Andras represents the IBM Software Group on the Board of Directors of The Open Group and currently holds the Chair of the IT Architect Profession Certification Standard (ITAC). More recently, he was appointed chair of The Open Group Trusted Technology Forum and leads the development of The Open Trusted Technology Provider Framework.

1 Comment

Filed under Accreditations, Cybersecurity, government, O-TTF, O-TTPS, OTTF, RISK Management, Standards, supply chain, Supply chain risk

Q&A with Allen Brown, President and CEO of The Open Group

By The Open Group

Last month, The Open Group hosted its San Francisco 2014 conference themed “Toward Boundaryless Information Flow™.” Boundaryless Information Flow has been the pillar of The Open Group’s mission since 2002 when it was adopted as the organization’s vision for Enterprise Architecture. We sat down at the conference with The Open Group President and CEO Allen Brown to discuss the industry’s progress toward that goal and the industries that could most benefit from it now as well as The Open Group’s new Dependability through Assuredness™ Standard and what the organization’s Forums are working on in 2014.

The Open Group adopted Boundaryless Information Flow as its vision in 2002, and the theme of the San Francisco Conference has been “Towards Boundaryless Information Flow.” Where do you think the industry is at this point in progressing toward that goal?

Well, it’s progressing reasonably well, but the challenge is, of course, that when we established that vision back in 2002, life was a little less complex, a little bit less fast moving, a little bit less fast-paced. Although organizations are improving the way that they act in a boundaryless manner – and of course that changes by industry – some industries still have big silos and stovepipes, they still have big boundaries. But generally speaking we are moving, and everyone understands the need for information to flow in a boundaryless manner, for people to be able to access and integrate information and to provide it to the teams that need it.

One of the keynotes on Day One focused on the opportunities within the healthcare industry, and The Open Group recently started a Healthcare Forum. Do you see the Healthcare industry as a test case for Boundaryless Information Flow, and why?

Healthcare is one of the verticals that we’ve focused on. And it is not so much a test case, but it is an area that absolutely seems to need information to flow in a boundaryless manner so that everyone involved – from the patient through the administrator through the medical teams – have all got access to the right information at the right time. We know that in many situations there are shifts of medical teams, and from one medical team to another they don’t have access to the same information. Information isn’t easily shared between medical doctors, hospitals and payers. What we’re trying to do is to focus on the needs of the patient and improve the information flow so that you get better outcomes for the patient.

Are there other industries where this vision might be enabled sooner rather than later?

I think that we’re already making significant progress in what we call the Exploration, Mining and Minerals industry. Our EMMM™ Forum has produced an industry-wide model that is being adopted throughout that industry. We’re also looking at whether we can have an influence in the airline industry, automotive industry, manufacturing industry. There are many, many others, government and retail included.

The plenary on Day Two of the conference focused on The Open Group’s Dependability through Assuredness standard, which was released last August. Why is The Open Group looking at dependability and why is it important?

Dependability is ultimately what you need from any system. You need to be able to rely on that system to perform when needed. Systems are becoming more complex, they’re becoming bigger. We’re not just thinking about the things that arrive on the desktop, we’re thinking about systems like the barriers at subway stations or Tube stations, we’re looking at systems that operate any number of complex activities. And they bring an awful lot of things together that you have to rely upon.

Now in all of these systems, what we’re trying to do is to minimize the amount of downtime because downtime can result in financial loss or at worst human life, and we’re trying to focus on that. What is interesting about the Dependability through Assuredness Standard is that it brings together so many other aspects of what The Open Group is working on. Obviously the architecture is at the core, so it’s critical that there’s an architecture. It’s critical that we understand the requirements of that system. It’s also critical that we understand the risks, so that fits in with the work of the Security Forum, and the work that they’ve done on Risk Analysis, Dependency Modeling, and out of the dependency modeling we can get the use cases so that we can understand where the vulnerabilities are, what action has to be taken if we identify a vulnerability or what action needs to be taken in the event of a failure of the system. If we do that and assign accountability to people for who will do what by when, in the event of an anomaly being detected or a failure happening, we can actually minimize that downtime or remove it completely.

Now the other great thing about this is it’s not only a focus on the architecture for the actual system development – and as the system changes over time, requirements change, legislation changes that might affect it, external changes, that all goes into that system – but also there’s another circle within that system that deals with failure, analyzes it and makes sure it doesn’t happen again. And there have been so many instances of failure recently. In banking, for example, a bank in the UK recently was unable to process debit cards or credit cards for customers for about three or four hours. And that was probably caused by work done on a routine basis over a weekend. But if Dependability through Assuredness had been in place, that could have been averted; it could have saved an awful lot of difficulty for an awful lot of people.

How does the Dependability through Assuredness Standard also move the industry toward Boundaryless Information Flow?

It’s part of it. It’s critical that with big systems the information has to flow. But this is not so much the information but how a system is going to work in a dependable manner.

Business Architecture was another featured topic in the San Francisco plenary. What role can Business Architecture play in enterprise transformation vis-à-vis Enterprise Architecture as a whole?

A lot of people in the industry are talking about Business Architecture right now and trying to focus on that as a separate discipline. We see it as a fundamental part of Enterprise Architecture. And, in fact, there are three legs to Enterprise Architecture: there’s Business Architecture, there’s the need for business analysts, who are critical to supplying the information, and then there are the solutions architects and other architects – data, applications architects and so on – that are needed. So those three legs are needed.

We find that there are two or three different types of Business Architect. There are those that are using the analysis to understand what the business is doing in order that they can inform the solutions architects and other architects for the development of solutions. There are those that are more integrated with the business, that can understand what is going on and provide input into how that might be improved through technology. And there are those that can actually go another step and say: here we have the advances in technology, and here are the opportunities for advancing the competitiveness of our organization.

What are some of the other key initiatives that The Open Group’s forum and work groups will be working on in 2014?

That kind of question is like when you’ve got an award and you’ve got to thank your friends, so apologies to anyone that I leave out. Let me start alphabetically with the Architecture Forum. The Architecture Forum obviously is working on the evolution of TOGAF®; they’re also working on the harmonization of TOGAF with ArchiMate®, and they have a number of projects within that – of course, Business Architecture is one of the projects going on in the Architecture space. The ArchiMate Forum is pushing ahead with ArchiMate. They’ve got two interesting activities going on at the moment. One is called ArchiMetals, which is going to be a sister publication to the ArchiSurance case study: where ArchiSurance provides an example of how ArchiMate is used in the insurance industry, ArchiMetals is going to be used in a manufacturing context, so there will be a whitepaper on that and there will be examples and artifacts that we can use. They’re also working on an ArchiMate standard for interoperability of modeling tools. There are four tools that are accredited and certified by The Open Group right now, and we’re looking for that interoperability to help organizations that have multiple tools, as many of them do.

Going down the alphabet, there’s DirecNet. Not many people know about DirecNet, but DirecNet™ is work that we do around the U.S. Navy; they’re working on standards for long range, high bandwidth mobile networking. Then we can go to the FACE™ Consortium, the Future Airborne Capability Environment. The FACE Consortium is working on the next version of its standard and toward accreditation and a certification program, and the uptake of that through procurement is absolutely amazing; we’re thrilled about that.

Healthcare we’ve talked about. The Open Group Trusted Technology Forum, where they’re working on how we can trust the supply chain in developed systems, released the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program, which was launched this week, and we already have one accredited vendor and two certified test labs, assessment labs. That is really exciting because now we’ve got a way of helping any organization that has large complex systems developed through a global supply chain to make sure that they can trust their supply chain. And that is going to be invaluable to many industries but also to the safety of citizens and the infrastructure of many countries. The other part of the O-TTPS work is that we are planning to move the standard toward ISO standardization shortly.

The next one moving down the list would be Open Platform 3.0™. This is a really exciting part of Boundaryless Information Flow, it really is. This is talking about the convergence of SOA, Cloud, Social, Mobile, Internet of Things and Big Data; bringing all of those activities together is something that is really critical right now, and we need to focus on it. In the different areas, some of our Cloud computing standards have already gone to ISO and have been adopted by ISO. We’re working right now on the next products that are going to move through. We have a governance standard in process, and an ecosystem standard has recently been published. In the area of Big Data there’s a whitepaper that’s 25 percent completed; there’s also a lot of work on the definition of what Open Platform 3.0 is, so this week the members have been working on trying to define Open Platform 3.0. One of the really interesting activities that’s gone on: the members of the Open Platform 3.0 Forum have produced something like 22 different use cases, and they’re really good. They’re concise and precise, and they cover a number of different industries, including healthcare and others. The next stage is to look at those and work on the ROI of those – the monetization, the value from those use cases – and that’s really exciting; I’m looking forward to peeking at that from time to time.

The Real Time and Embedded Systems Forum (RTES) is next. Real-Time is where we incubated the Dependability through Assuredness Framework, and that work is continuing to develop, which is really good. The core focus of the RTES Forum is high assurance systems, and they’re doing some work with ISO on that, and a lot of other areas with multicore; of course, they have a number of EC projects in which we’re partnering with other partners in the EC around RTES.

The Security Forum, as I mentioned earlier, has done a lot of work on risk and dependability. So they’ve not only got their standards for the Risk Taxonomy and Risk Analysis, but they’ve now also developed the Open FAIR Certification for People, which is based on those two standards of Risk Analysis and Risk Taxonomy. And we’re already starting to see people being trained and certified under that Open FAIR Certification Program that the Security Forum developed.

A lot of other activities are going on. Like I said, I probably left a lot of things out, but I hope that gives you a flavor of what’s going on in The Open Group right now.

The Open Group will be hosting a summit in Amsterdam May 12-14, 2014. What can we look forward to at that conference?

In Amsterdam we have a summit that’s going to bring together a lot of things; it’s going to be a bigger conference than we had here. We’ve got a lot of activity across all of our Forums; we’re going to bring together top-level speakers, so we’re looking forward to some interesting work during that week.


1 Comment

Filed under ArchiMate®, Boundaryless Information Flow™, Business Architecture, Conference, Cybersecurity, EMMMv™, Enterprise Architecture, FACE™, Healthcare, O-TTF, RISK Management, Standards, TOGAF®

Q&A with Jim Hietala on Security and Healthcare

By The Open Group

We recently spoke with Jim Hietala, Vice President, Security for The Open Group, at the 2014 San Francisco conference to discuss upcoming activities in The Open Group’s Security and Healthcare Forums.

Jim, can you tell us what the Security Forum’s priorities are going to be for 2014 and what we can expect to see from the Forum?

In terms of our priorities for 2014, we’re continuing to do work in Security Architecture and Information Security Management. In the area of Security Architecture, the big project that we’re doing is adding security to TOGAF®, so we’re working on the next version of the TOGAF standard and specification, and there’s an active project involving folks from the Architecture Forum and the Security Forum to integrate security into and stripe it through TOGAF. So, on the Security Architecture side, that’s the priority. On the Information Security Management side, we’re continuing to do work in the area of Risk Management. We introduced a certification late last year, the Open FAIR certification, and we’ll continue to do work in the area of Risk Management and Risk Analysis. We’re looking to add a second level to the certification program, and we’re doing some other work around the Risk Analysis standards that we’ve introduced.

The theme of this conference was “Towards Boundaryless Information Flow™” and many of the tracks focused on convergence—the convergence of things like Big Data, mobile and Cloud, also known as Open Platform 3.0. How are those things affecting the realm of security right now?

I think they’re just beginning to. Cloud—obviously the security issues around Cloud have been here as long as Cloud has been, over the past four or five years. But if you look at things like the Internet of Things and some of the other things that comprise Open Platform 3.0, the security impacts are really just starting to be felt and considered. So I think information security professionals are really just starting to wrap their heads around what those new security risks that come with those technologies are and, more importantly, what we need to do about them. What do we need to do to mitigate risk around something like the Internet of Things, for example?

What kind of security threats do you think companies need to be most worried about over the next couple of years?

There’s a plethora of things out there right now that organizations need to be concerned about. Certainly advanced persistent threat, the idea that maybe nation states are trying to attack other nations, is a big deal. It’s a very real threat, and it’s something that we have to think about – looking at the risks we’re facing, exactly what is that adversary and what are they capable of? I think profit-motivated criminals continue to be on everyone’s mind with all the credit card hacks that have just come out. We have to be concerned about cyber criminals who are profit motivated and who are very skilled and determined and obviously there’s a lot at stake there. All of those are very real things in the security world and things we have to defend against.

The Security track at the San Francisco conference focused primarily on risk management. How can companies better approach and manage risk?

As I mentioned, we did a lot of work over the last few years in the area of Risk Management, and the FAIR Standard that we introduced breaks down risk into what’s the frequency of bad things happening and what’s the impact if they do happen. So I would suggest taking that sort of approach—using something like the Risk Taxonomy Standard and the Risk Analysis Standard that we’ve introduced—and really looking at what are the critical assets to protect, who’s likely to attack them, and what’s the probable frequency of attacks that we’ll see? And then looking at the impact side: what’s the consequence if somebody successfully attacks them? That’s really the key—breaking it down, looking at it that way and then taking the right mitigation steps to reduce risk on those assets that are really important.
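The FAIR-style breakdown Hietala describes—risk as a function of how often bad things happen and how much they cost when they do—can be sketched as a simple calculation. The asset names and dollar figures below are purely illustrative assumptions, not values or an API from the Open FAIR standards themselves:

```python
# Illustrative sketch of a FAIR-style risk calculation: annualized loss
# exposure = loss event frequency (events/year) x loss magnitude ($/event).
# Asset names and figures are hypothetical examples only.

def annualized_loss_exposure(frequency_per_year, loss_per_event):
    """Expected annual loss for one asset under a simple FAIR-style model."""
    return frequency_per_year * loss_per_event

# Hypothetical assets: (loss event frequency per year, loss magnitude per event)
assets = {
    "customer-db": (0.5, 400_000),   # breached roughly once every two years
    "public-web":  (4.0, 10_000),    # frequent, low-impact defacements
    "hr-laptops":  (1.0, 50_000),    # about one lost/stolen laptop per year
}

# Rank assets by expected annual loss to prioritize mitigation spending.
ranked = sorted(
    ((name, annualized_loss_exposure(f, m)) for name, (f, m) in assets.items()),
    key=lambda item: item[1],
    reverse=True,
)

for name, ale in ranked:
    print(f"{name}: ${ale:,.0f}/year")
```

Ranking by annualized loss exposure puts the infrequent-but-costly database breach first, which mirrors the interview's point: identify the critical assets, estimate frequency and impact, and spend mitigation effort where the expected loss is highest.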

You’ve recently become involved in The Open Group’s new Healthcare Forum. Why a healthcare vertical forum for The Open Group?

In the area of healthcare, what we see is that there’s just a highly fragmented aspect to the ecosystem. You’ve got healthcare information that’s captured in various places, and the information doesn’t necessarily flow from provider to payer to other providers. In looking at industry verticals, the healthcare industry seemed like an area that really needed a lot of approaches that we bring from The Open Group—TOGAF and Enterprise Architecture approaches that we have.

If you take it up to a higher level, it really needs the Boundaryless Information Flow that we talk about in The Open Group. We need to get to the point where our information as patients is readily available in a secure manner to the people who need to give us care, as well as to us because in a lot of cases the information exists as islands in the healthcare industry. In looking at healthcare it just seemed like a natural place where, in our economies – and it’s really a global problem – a lot of money is spent on healthcare and there’s a lot of opportunities for improvement, both in the economics but in the patient care that’s delivered to individuals through the healthcare system. It just seemed like a great area for us to focus on.

As the new Healthcare Forum kicks off this year, what are the priorities for the Forum?

The Healthcare Forum has just published a whitepaper summarizing the findings of the workshop that we held in Philadelphia last summer. We’re also working on a treatise, which will outline our views about the healthcare ecosystem and where standards and architecture work most needs to be done. We expect to have the treatise produced over the next couple of months. Beyond that, we see a lot of opportunities for doing architecture and standards work in the healthcare sector, and our membership is going to determine which of those areas to focus on, which projects to initiate first.

For more on The Open Group Security Forum, please visit http://www.opengroup.org/subjectareas/security. For more on The Open Group Healthcare Forum, see http://www.opengroup.org/getinvolved/industryverticals/healthcare.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security, risk management and healthcare programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Cloud/SOA, Conference, Data management, Healthcare, Information security, Open FAIR Certification, Open Platform 3.0, RISK Management, TOGAF®, Uncategorized

Accrediting the Global Supply Chain: A Conversation with O-TTPS Recognized Assessors Fiona Pattinson and Erin Connor

By The Open Group 

At the recent San Francisco 2014 conference, The Open Group Trusted Technology Forum (OTTF) announced the launch of the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program.

The program is one of the first accreditation programs worldwide aimed at assuring the integrity of commercial off-the-shelf (COTS) information and communication technology (ICT) products and the security of their supply chains.

In three short years since OTTF launched, the forum has grown to include more than 25 member companies dedicated to safeguarding the global supply chain against the increasing sophistication of cybersecurity attacks through standards. Accreditation is yet another step in the process of protecting global technology supply chains from maliciously tainted and counterfeit products.

As part of the program, third-party assessor companies will be employed to assess organizations applying for accreditation, with The Open Group serving as the vendor-neutral Accreditation Authority that operates the program. Prior to the launch, the forum conducted a pilot program with a number of member companies. It was announced at the conference that IBM is the first company to become accredited, earning accreditation for its Application, Infrastructure and Middleware (AIM) software business division for its product integrity and supply chain practices.

We recently spoke with OTTF members Fiona Pattinson, director of strategy and business development at Atsec Information Security, and Erin Connor, director at EWA-Canada, at the San Francisco conference to learn more about the assessment process and the new program.

The O-TTPS focus is on securing the technology supply chain. What would you say are the biggest threats facing the supply chain today?

Fiona Pattinson (FP): I think in the three years since the forum began, certainly all the members have discussed the various threats quite a lot. It was one of the things we discussed as an important topic early on, and I don’t know if it’s the ‘biggest threat,’ but certainly the most important threats that we needed to address initially were those of counterfeit and maliciously tainted products. We came to that through both discussion with all the industry experts in the forum and also through research into some of the requirements from government, so that’s exactly how we knew which threats [to start with].

Erin Connor (EC):  And the forum benefits from having both sides of the acquisition process, both acquirers, and the suppliers and vendors. So they get both perspectives.

How would you define maliciously tainted and counterfeit products?

FP:  They are very carefully defined in the standard—we needed to do that because people’s understanding of that can vary so much.

EC: And actually the concept of ‘maliciously’ tainted was incorporated close to the end of the development process for the standard at the request of members on the acquisition side of the process.

[Note: The standard precisely defines maliciously tainted and counterfeit products as follows:

"The two major threats that acquirers face today in their COTS ICT procurements, as addressed in this Standard, are defined as:

1. Maliciously tainted product – the product is produced by the provider and is acquired through a provider’s authorized channel, but has been tampered with maliciously.

2. Counterfeit product – the product is produced other than by, or for, the provider, or is supplied to the provider by other than a provider’s authorized channel and is presented as being legitimate even though it is not."]

The OTTF announced the Accreditation Program for the O-TTP Standard at the recent San Francisco conference. Can you tell us about the standard and how the accreditation program will help ensure conformance to it?

EC: The program is intended to provide organizations with a way to accredit their lifecycle processes for their product development so they can prevent counterfeit or maliciously tainted components from getting into the products they are selling to an end user or into somebody else’s supply chain. It was determined that a third-party type of assessment program would be used. For the organizations, they will know that we Assessors have gone through a qualification process with The Open Group and that we have in place all that’s required on the management side to properly do an assessment. From the consumer side, they have confidence the assessment has been completed by an independent third-party, so they know we aren’t beholden to the organizations to give them a passing grade when perhaps they don’t deserve it. And then of course The Open Group is in position to oversee the whole process and award the final accreditation based on the recommendation we provide.  The Open Group will also be the arbiter of the process between the assessors and organizations if necessary. 

FP:  So The Open Group’s accreditation authority is validating the results of the assessors.

EC: It’s a model that is employed in many, many other product or process assessment and evaluation programs, where the actual accreditation authority steps back and has third parties do the assessment.

FP: It is important that the assessor companies are working to the same standard so that there’s no advantage in taking one assessor over the other in terms of the quality of the assessments that are produced.

How does the accreditation program work?

FP: Well, it’s brand new so we don’t know if it is perfect yet, but having said that, we have worked over several months on defining the process, and we have drawn from The Open Group’s existing accreditation programs, as well as from the forum experts who have worked in the accreditation field for many years. We have been performing pilot accreditations in order to check out how the process works. So it is already tested.

How does it actually work? Well, first of all an organization will feel the need to become accredited and at that point will apply to The Open Group to get the accreditation underway. Once their scope of accreditation – which may be as small as one product or theoretically as large as a whole global company – is defined, and once the application is reviewed and approved by The Open Group, then they engage an assessor.

There is a way of sampling a large scope to identify its process variations, using something we term ‘selective representative products.’ It’s basically a way of logically sampling a big scope so that we capture the process variations within the scope and make sure that the assessment is kept to a reasonable size for the organization undergoing it, while still giving good assurance to consumers that it is a representative sample. The assessment is performed by the Recognized Assessor company, and a final report is written and provided to The Open Group for their validation. If everything is in order, then the company will be accredited and their scope of conformance will be added to the accreditation register and trademarked.

EC: So the customers of that organization can go and check the registration for exactly what products are covered by the scope.

FP: Yes, the register is public and anybody can check. So if IBM says WebSphere is accredited, you can go and check that claim on The Open Group web site.

How long does the process take or does it vary?

EC: It will vary depending on how large the scope to be accredited is in terms of the size of the representative set and the documentation evidence. It really does depend on what the variations in the processes are among the product lines as to how long it takes the assessor to go through the evidence and then to produce the report. The other side of the coin is how long it takes the organization to produce the evidence. It may well be that they might not have it totally there at the outset and will have to create some of it.

FP: As Erin said, it varies with the complexity and the variation of the processes, and hence the number of selected representative products. There are other factors that can influence the duration, and there are three parties influencing it: the applicant Organization, The Open Group’s Accreditation Authority and the Recognized Assessor.

For example, we found that the initial work by the Organization and the Accreditation Authority in checking the scope and the initial documentation can take a few weeks for a complex scope; of course, for the pilots we were all new at doing that. In this early part of the project it is vital to get the scope both clearly defined and approved, since it is key to a successful accreditation.

It is important that an Organization assigns adequate resources to help keep this to the shortest time possible, both during the initial scope discussions, and during the assessment. If the Organization can provide all the documentation before they get started, then the assessors are not waiting for that and the duration of the assessment can be kept as short as possible.

Of course the resources assigned by the Recognized Assessor also influence how long an assessment takes. A variable for the assessors is how much documentation they have to read and review—it might be small or it might be a mountain.

The Open Group’s final review and oversight of the assessment takes some time and is influenced by resource availability within that organization. If they have any questions it may take a little while to resolve.

What kind of safeguards does the accreditation program put in place for enforcing the standard?

FP: It is a voluntary standard—there’s no requirement to comply. Currently some U.S. government organizations are recommending it. For example, NASA refers to it in their SEWP contract, and some of the draft NIST documents on supply chain refer to it, too.

EC: In terms of actual oversight, we review what their processes are as assessors, and the report and our recommendations are based on that review. The accreditation expires after three years so before the three years is up, the organization should actually get the process underway to obtain a re-accreditation.  They would have to go through the process again but there will be a few more efficiencies because they’ve done it before. They may also wish to expand the scope to include the other product lines and portions of the company. There aren’t any periodic ‘spot checks’ after accreditation to make sure they’re still following the accredited processes, but part of what we look at during the assessment is that they have controls in place to ensure they continue doing the things they are supposed to be doing in terms of securing their supply chain.

FP: And then the key part is that the agreement the organization signs with The Open Group includes the fact that the organization warrants and represents that it remains in conformance with the standard throughout the accreditation period. So there is that assurance too, which builds on the more formal assessment checks.

What are the next steps for The Open Group Trusted Technology Forum?  What will you be working on this year now that the accreditation program has started?

FP: Reviewing the lessons we learned through the pilot!

EC: And reviewing comments from members on the standard now that it’s publicly available and working on version 1.1 to make any corrections or minor modifications. While that’s going on, we’re also looking ahead to version 2 to make more substantial changes, if necessary. The standard is definitely going to be evolving for a couple of years and then it will reach a steady state, which is the normal evolution for a standard.

For more details on the O-TTPS accreditation program, to apply for accreditation, or to learn more about becoming an O-TTPS Recognized Assessor visit the O-TTPS Accreditation page.

For more information on The Open Group Trusted Technology Forum please visit the OTTF Home Page.

The O-TTPS standard and the O-TTPS Accreditation Policy are freely available from the Trusted Technology Section in The Open Group Bookstore.

For information on joining the OTTF membership please contact Mike Hickey – m.hickey@opengroup.org

Fiona Pattinson is responsible for developing new and existing atsec service offerings. Under the auspices of The Open Group’s OTTF, alongside many expert industry colleagues, Fiona has helped develop The Open Group’s O-TTPS, including developing the accreditation program for supply chain security. In the past, Fiona has led service developments which have included establishing atsec’s US Common Criteria laboratory, the CMVP cryptographic module testing laboratory, the GSA FIPS 201 TP laboratory, TWIC reader compliance testing, NPIVP, SCAP, PCI, biometrics testing and penetration testing. Fiona has responsibility for understanding a broad range of information security topics and the application of security in a wide variety of technology areas from low-level design to the enterprise level.

Erin Connor is the Director at EWA-Canada responsible for EWA-Canada’s Information Technology Security Evaluation & Testing Facility, which includes a Common Criteria Test Lab, a Cryptographic & Security Test Lab (FIPS 140 and SCAP), a Payment Assurance Test Lab (device testing for PCI PTS POI & HSM, Australian Payment Clearing Association and Visa mPOS) and an O-TTPS Assessor lab Recognized by The Open Group. Erin participated with other expert members of The Open Group Trusted Technology Forum (OTTF) in the development of The Open Group Trusted Technology Provider Standard for supply chain security and its accompanying Accreditation Program. Erin joined EWA-Canada in 1994, and his initial activities in the IT Security and Infrastructure Assurance field included working on the team fielding a large-scale Public Key Infrastructure system, Year 2000 remediation and studies of wireless device vulnerabilities. Since 2000, Erin has been working on evaluations of a wide variety of products including hardware security modules, enterprise security management products, firewalls, mobile device and management products, as well as system and network vulnerability management products. He was also the only representative of an evaluation lab in the Biometric Evaluation Methodology Working Group, which developed a proposed methodology for the evaluation of biometric technologies under the Common Criteria.


Filed under Accreditations, Cybersecurity, OTTF, Professional Development, Standards, Supply chain risk

Facing the Challenges of the Healthcare Industry – An Interview with Eric Stephens of The Open Group Healthcare Forum

By The Open Group

The Open Group launched its new Healthcare Forum at the Philadelphia conference in July 2013. The forum’s focus is on bringing Boundaryless Information Flow™ to the healthcare industry to enable data to flow more easily throughout the complete healthcare ecosystem through a standardized vocabulary and messaging. Leveraging the discipline and principles of Enterprise Architecture, including TOGAF®, the forum aims to develop standards that will result in higher quality outcomes, streamlined business practices and innovation within the industry.

At the recent San Francisco 2014 conference, Eric Stephens, Enterprise Architect at Oracle, delivered a keynote address entitled, “Enabling the Opportunity to Achieve Boundaryless Information Flow” along with Larry Schmidt, HP Fellow at Hewlett-Packard. A veteran of the healthcare industry, Stephens was Senior Director of Enterprise Architecture at Excellus BlueCross BlueShield prior to joining Oracle, and he is an active member of the Healthcare Forum.

We sat down after the keynote to speak with Stephens about the challenges of healthcare, how standards can help realign the industry and the goals of the forum. The opinions expressed here are Stephens’ own, not of his employer.

What are some of the challenges currently facing the healthcare industry?

There are a number of challenges, and I think when we look at it as a U.S.-centric problem, there’s a disproportionate amount of spending that’s taking place in the U.S. For example, if you look at GDP or percentage of GDP expenditures, we’re looking at now probably 18 percent of GDP [in the U.S.], and other developed countries are spending a full 5 percent less than that of their GDP, and in some cases they’re getting better outcomes outside the U.S.

The mere fact of the existence of what we call “medical tourism,” where if I need a hip replacement, I can get it done for a fraction of the cost in another country, with the same or better quality care, and have a vacation—a rehab vacation—at the same time and bring along a spouse or significant other, means there’s a real wide range of disparity there.

There’s also a lack of transparency. Having worked at an insurance company, I can tell you that with the advent of high deductible plans, there’s a need for additional cost information. When I go on Amazon or go to a local furniture store, I know what the cost is going to be for what I’m about to purchase. In the healthcare system, we don’t get that. With high deductible plans, if I’m going to be responsible for a portion or a larger portion of the fee, I want to know what it is. And what happens is, the incentives to drive costs down force the patient to be a consumer. The consumer now asks the tough questions. If my daughter’s going in for a tonsillectomy, show me a bill of materials that shows me what’s going to be done – if you are charging me $20/pill for Tylenol, I’ll bring my own. Increased transparency is what will in turn drive down the overall costs.

I think there’s one more thing, and this gets into the legal side of things. There is an exorbitant amount of legislation and regulation around what needs to be done. And because every time something goes sideways, there’s going to be a lawsuit, doctors will prescribe an extra test, an extra X-ray for a patient whether they need it or not.

The healthcare system is designed around a vicious cycle of diagnose-treat-release. It’s not incentivized to focus on prevention and management. Oregon is promoting coordinated care organizations (CCOs) that would be the intermediary that works with all medical professionals – whether physical, mental, dental, even social workers – to coordinate episodes of care for patients. This drives down inappropriate utilization – for example, using an ER as a primary care facility – and drives the medical system towards prevention and management of health.

Your keynote with Larry Schmidt of HP focused a lot on cultural changes that need to take place within the healthcare industry – what are some of the changes necessary for the healthcare industry to put standards into place?

I would say culturally, it goes back to those incentives, and it goes back to introducing this idea of patient-centricity. And for the medical community, it’s really starting to recognize that these individuals are consumers and that increased choice is being introduced, just like you see in other industries. There are disruptive business models. As a for instance, medical tourism is a disruptive business model for United States-based healthcare. Take the idea of pharmacies introducing clinical medicine for routine care, such as what you see at a CVS, Wal-Mart or Walgreens. I can get a flu shot, I can get a well-check visit, I can get a vaccine – routine stuff that doesn’t warrant a full-blown medical professional. It’s applying the right amount of medical care to a particular situation.

Why haven’t existing standards been adopted more broadly within the industry? What will help providers be more likely to adopt standards?

I think the standards adoption is about “what’s in it for me,” the WIIFM idea. It’s demonstrating to providers that utilizing standards is going to help them get out of the medical administration business and focus on their core business, the same way that any other business would want to standardize its information through integration, processes and components. It reduces your overall maintenance costs going forward, and arguably you don’t need a team of billing folks sitting in a doctor’s office because you have standardized exchanges of information.

Why haven’t they been adopted? It’s still a question in my mind. Why a doctor would not want to do that is perhaps a question we’re going to need to explore as part of the Healthcare Forum.

Is it doctors that need to adopt the standards, or technologies, or a combination of different constituents within the ecosystem?

I think it’s a combination. We hear a lot about the Affordable Care Act (ACA) and the health exchanges. What we don’t hear about is the legislation to drive toward standardization to increase interoperability. So unfortunately it would seem the financial incentives or things we’ve tried before haven’t worked, and we may simply have to resort to legislation or at least legislative incentives to make it happen because part of the funding does cover information exchanges so you can move health information between providers and other actors in the healthcare system.

You’re advocating putting the individual at the center of the healthcare ecosystem. What changes need to take place within the industry in order to do this?

I think it’s education, a lot of education that has to take place. I think that individuals via the incentive model around high deductible plans will force some of that but it’s taking responsibility and understanding the individual role in healthcare. It’s also a cultural/societal phenomenon.

I’m kind of speculating here, and going way beyond what enterprise architecture or what IT would deliver, but this is a philosophical thing: if I have an ailment, chances are there’s a pill to fix it. Look at the commercials—for every ailment, say hypertension, it’s easy, you just dial the medication correctly and you don’t worry as much about diet and exercise. These sorts of things—our over-reliance on medication. I’m certainly not going to knock the medications that are needed for folks that absolutely need them, but I think we can become too dependent on pharmacological solutions for our health problems.

What responsibility will individuals then have for their healthcare? Will that also require a cultural and behavioral shift for the individual?

The individual has to start managing his or her own health. We manage our careers and families proactively. Now we need to focus on our health and not just float through the system. It may come to financial incentives for certain “individual KPIs” such as blood pressure, sugar levels, or BMI. Advances in medical technology may facilitate more personal management of one’s health.

One of the Healthcare Forum’s goals is to help establish Boundaryless Information Flow within the healthcare industry. You’ve said that understanding the healthcare ecosystem will be a key component for that. What does that ecosystem encompass, and why is it important to know that first?

Very simply we’re talking about the member/patient/consumer, then we get into the payers, the providers, and we have to take into account government agencies and other non-medical agents, but they all have to work in concert and information needs to flow between those organizations in a very standardized way so that decisions can be made in a very timely fashion.

It can’t be bottled up, it’s got to be provided to the right provider at the right time; otherwise, best case, it’s going to cost more to manage all the actors in the system. Worst case, somebody dies or there is a “never event” due to misinformation or lack of information during the course of care. The idea of Boundaryless Information Flow gives us the opportunity to standardize, to have easily accessible information – and by the way, secured – and it can really aid in that decision-making process going forward. It’s no different than Wal-Mart knowing what kind of merchandise sells well before and after a hurricane (i.e., beer and toaster pastries, BTW). It’s the same kind of real-time information that’s made available to a Google car so it can steer its way down the road. It’s that kind of viscosity needed to make the right decisions at the right time.

Healthcare is a highly regulated industry. How can Boundaryless Information Flow and data collection on individuals be achieved while still protecting patient privacy?

We can talk about standards and the flow and the technical side. We need to focus on the security and privacy side.  And there’s going to be a legislative side because we’re going to touch on real fundamental data governance issue – who owns the patient record? Each actor in the system thinks they own the patient record. If we’re going to require more personal accountability for healthcare, then shouldn’t the consumer have more ownership? 

We also need to address privacy disclosure regulations to avoid catastrophic data leaks of protected health information (PHI). We need bright IT talent to pull off the integration we are talking about here. We also need folks who are well versed in the privacy laws and regulations. I’ve seen project teams of 200 have up to eight folks just focusing on the security and privacy considerations. We can argue about headcount later but my point is the same – one needs some focused resources around this topic.

What will standards bring to the healthcare industry that is missing now?

I think the standards, and more specifically the harmonization of the standards, are going to bring increased maintainability of solutions, increased interoperability, and increased opportunities too. We see mobile computing, or even DropBox, that has API hooks into all sorts of tools, and it’s well integrated – so I can move files between devices, I can move files between apps, because they have hooks and it’s easy to work with. So it’s building these communities of developers, apps and technical capabilities that makes it easy to move the personal health record, for example, back and forth between providers, so it’s not a cataclysmic event to integrate a new version of electronic health records (EHR) or to integrate the next version of an EHR. It’s this idea of standardization, but with some flexibility built into it.

Are you looking just at the U.S. or how do you make a standard that can go across borders and be international?

It is a concern; much of my thinking and much of what I've conveyed today is U.S.-centric, based on our problems, but many of these interoperability problems are international. We're going to need to address it; I couldn't tell you what the sequence is right now. There are other considerations, for example, single vs. multi-payer, which came up in the keynote. We tend to think that if we stay focused on the consumer/patient we're going to get it right for all constituencies. It will take time to go international with a standard, but it wouldn't be the first time. We have a host of technical standards for the Internet (e.g., TCP/IP, HTTP). The industry has been able to instill these standards across geographies and vendors. Admittedly, the harmonization of healthcare-related standards will be more difficult. However, as our world shrinks with globalization, an international lens will need to be applied to this challenge.

Eric Stephens (@EricStephens) is a member of Oracle's executive advisory community, where he focuses on advancing clients' business initiatives by leveraging the practice of Business and Enterprise Architecture. Prior to joining Oracle, he was Senior Director of Enterprise Architecture at Excellus BlueCross BlueShield, leading the organization's architecture design, innovation, and technology adoption capabilities within the healthcare industry.

 


Filed under Conference, Data management, Enterprise Architecture, Healthcare, Information security, Standards, TOGAF®

New Accreditation Program – Raises the Bar for Securing Global Supply Chains

By Sally Long, Director of The Open Group Trusted Technology Forum (OTTF)™

In April 2013, The Open Group announced the release of the Open Trusted Technology Provider™ Standard (O-TTPS) 1.0 – Mitigating Maliciously Tainted and Counterfeit Products. Now we are announcing the O-TTPS Accreditation Program, launched on February 3, 2014, which enables organizations that conform to the standard to be accredited as Open Trusted Technology Providers™.

The O-TTPS, a standard of The Open Group, provides a set of guidelines, recommendations and requirements that help guard against maliciously tainted and counterfeit products throughout commercial off-the-shelf (COTS) information and communication technology (ICT) product lifecycles. The standard includes best practices for all phases of a product's life cycle: design, sourcing, build, fulfillment, distribution, sustainment, and disposal, thus enhancing the integrity of COTS ICT products and the security of their global supply chains.

This accreditation program is one of the first of its kind in providing accreditation for conforming to standards for product integrity coupled with supply chain security.

The standard and the accreditation program are the result of a collaboration between government, third party evaluators and some of industry’s most mature and respected providers who came together and, over a period of four years, shared their practices for integrity and security, including those used in-house and those used with their own supply chains.

Applying for O-TTPS Accreditation

When the OTTF started this initiative, one of its many mantras was "raise all boats." The objective was to raise the security bar across the full spectrum of the supply chain, from small component suppliers to the providers who include those components in their products and to the integrators who incorporate those providers' products into customers' systems.

The O-TTPS Accreditation Program is open to all component suppliers, providers and integrators. The holistic potential of this program, as illustrated in the diagram below, should not be underestimated, but it will take a concerted effort to reach and encourage all constituents in the supply chain to become involved.

The importance of mitigating the risk of maliciously tainted and counterfeit products

The focus on mitigating the risks of tainted and counterfeit products by increasing the security of the supply chain is critical in today’s global economy. Virtually nothing is made from one source.

COTS ICT supply chains are complex. A single product can be composed of hundreds of components from multiple suppliers in numerous regions around the world, and providers can change their component suppliers frequently depending on the going rate for a particular component. If bad things happen along the supply chain, such as substituting counterfeit components for authentic ones, inserting maliciously tainted code, or the double-hammer of maliciously tainted counterfeit parts, then terrible things can happen when that product is installed at a customer site.

With the threat of tainted and counterfeit technology products posing a major risk to global organizations, it is increasingly important for those organizations to take what steps they can to mitigate these risks. The O-TTPS Accreditation Program is one of those steps. Can an accreditation program completely eliminate the risk of tainted and counterfeit components? No!  Does it reduce the risk? Absolutely!

How the Accreditation Program works

The Open Group, with over 25 years' experience managing vendor- and technology-neutral certification programs, will assume the role of Accreditation Authority over the entire program. Additionally, the program will utilize third-party assessors to assess conformance to the O-TTPS requirements.

Companies seeking accreditation will declare their Scope of Accreditation, meaning they can choose to be accredited for conforming to the O-TTPS standard and adhering to its best practice requirements across their entire enterprise, within a specific product line or business unit, or within an individual product. Organizations applying for accreditation are then required to provide evidence of conformance for each of the O-TTPS requirements, demonstrating they have the processes in place to secure in-house development and their supply chains across the entire COTS ICT product lifecycle. O-TTPS-accredited organizations will then be able to identify themselves as Open Trusted Technology Providers™ and will become part of a public registry of trusted providers.

The Open Group has also instituted the O-TTPS Recognized Assessor Program, which assures that Recognized Assessor companies meet certain criteria as assessor organizations, and that their individual assessors meet an additional set of criteria and have passed the O-TTPS Assessor exam, before they can be assigned to an O-TTPS assessment. The Open Group will operate this program, grant O-TTPS Recognized Assessor certificates and list qualifying organizations on a public registry of recognized assessor companies.

Efforts to increase awareness of the program

The Open Group understands that to achieve global uptake we need to reach out to other countries across the globe for market adoption, as well as to other standards groups for harmonization. The Forum has a very active outreach and harmonization work group, and the OTTF is increasingly being recognized for its efforts. A number of prominent U.S. government agencies, including the Government Accountability Office (GAO) and NASA, have recognized the standard as an important supply chain security effort. Dave Lounsbury, the CTO of The Open Group, has testified before Congress on the value of this initiative from the industry-government partnership perspective. The Open Group has also met with President Obama's Cybersecurity Coordinators (past and present) to apprise them of our work. We continue to work closely with NIST from the perspective of the Cybersecurity Framework, which recognizes the supply chain as a critical area for the next version, and the OTTF work is acknowledged in NIST Special Publication 800-161. We have liaisons with ISO and are working internally on mapping our standards and accreditation to Common Criteria. The O-TTPS has also been discussed with government agencies in China, India, Japan and the UK.

The initial version of the standard and the accreditation program are just the beginning. OTTF members will continue to evolve both the standard and the accreditation program to provide additional versions that refine existing requirements, introduce additional requirements, and cover additional threats. And the outreach and harmonization efforts will continue to strengthen so that we can reach that holistic potential of Open Trusted Technology Providers™ throughout all global supply chains.

For more details on the O-TTPS accreditation program, to apply for accreditation, or to learn more about becoming an O-TTPS Recognized Assessor visit the O-TTPS Accreditation page.

For more information on The Open Group Trusted Technology Forum please visit the OTTF Home Page.

The O-TTPS standard and the O-TTPS Accreditation Policy are freely available from the Trusted Technology section of The Open Group Bookstore.

For information on joining the OTTF membership please contact Mike Hickey – m.hickey@opengroup.org

Sally Long is the Director of The Open Group Trusted Technology Forum (OTTF). She has managed customer-supplier forums and collaborative development projects for over twenty years. She was the release engineering section manager for all multi-vendor collaborative technology development projects at The Open Software Foundation (OSF) in Cambridge, Massachusetts. Following the merger of OSF and X/Open under The Open Group, she served as director for multiple forums in The Open Group. Sally has a Bachelor of Science degree in Electrical Engineering from Northeastern University in Boston, Massachusetts.


Filed under Cybersecurity, OTTF, Supply chain risk

Measuring the Immeasurable: You Have More Data Than You Think You Do

By Jim Hietala, Vice President, Security, The Open Group

According to a recent study by the Ponemon Institute, the average U.S. company experiences more than 100 successful cyber-attacks each year at a cost of $11.6M. By enabling security technologies, those companies can reduce losses by nearly $4M and instituting security governance reduces costs by an average of $1.5M, according to the study.

In light of increasing attacks and security breaches, executives are increasingly asking security and risk professionals to provide analyses of individual company risk and loss estimates. For example, the U.S. healthcare sector has for some time been required by the HIPAA Security Rule to perform annual risk assessments. The recent HITECH Act also added security breach notification and disclosure requirements, increased enforcement in the form of audits and increased penalties in the form of fines. Despite federal requirements, the prospect of measuring risk and doing risk analyses can be a daunting task that leaves even the best of us with a case of "analysis paralysis."

Many IT experts agree that we are nearing a time where risk analysis is not only becoming the norm, but when those risk figures may well be used to cast blame (or be used as part of a defense in a lawsuit) if and when there are catastrophic security breaches that cost consumers, investors and companies significant losses.

In the past, many companies have been reluctant to perform risk analyses due to the perception that measuring IT security risk is too difficult because it’s intangible. But if IT departments could soon become accountable for breaches, don’t you want to be able to determine your risk and the threats potentially facing your organization?

In his book, How to Measure Anything, father of Applied Information Economics Douglas Hubbard points out that immeasurability is an illusion and that organizations do, in fact, usually have the information they need to create good risk analyses. Part of the misperception of immeasurability stems from a lack of understanding of what measurement is actually meant to be. According to Hubbard, most people, and executives in particular, expect measurement and analysis to produce an “exact” number—as in, “our organization has a 64.5 percent chance of having a denial of service attack next year.”

Hubbard argues that, as risk analysts, we need to look at measurement more like how scientists look at things—measurement is meant to reduce uncertainty—not to produce certainty—about a quantity based on observation.  Proper measurement should not produce an exact number, but rather a range of possibility, as in “our organization has a 30-60 percent chance of having a denial of service attack next year.” Realistic measurement of risk is far more likely when expressed as a probability distribution with a range of outcomes than in terms of one number or one outcome.
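Hubbard's point about ranges can be made concrete with a small sketch. All the numbers below are invented for illustration: rather than producing a single point estimate, we derive a 5th-95th percentile range for the chance of an attack by bootstrap-resampling a handful of hypothetical attack/no-attack observations from comparable organizations.

```python
import random

random.seed(42)

# Hypothetical observations from 20 comparable organizations:
# 1 = experienced a denial of service attack last year, 0 = did not.
observations = [1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 1, 0, 0, 0, 1, 0, 0, 0, 1, 0]

def resample_rate(data, trials=10_000):
    """Bootstrap the attack rate and return its 5th-95th percentile range."""
    rates = []
    for _ in range(trials):
        sample = [random.choice(data) for _ in data]  # resample with replacement
        rates.append(sum(sample) / len(sample))
    rates.sort()
    return rates[int(0.05 * trials)], rates[int(0.95 * trials)]

low, high = resample_rate(observations)
print(f"Estimated chance of a DoS attack next year: {low:.0%} to {high:.0%}")
```

The output is a range ("15% to 45%"-style) rather than a false-precision point value, which is exactly the form of statement Hubbard recommends.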

The problem that most often produces “analysis paralysis” is not just the question of how to derive those numbers but also how to get to the information that will help produce those numbers. If you’ve been tasked, for instance, with determining the risk of a breach that has never happened to your organization before, perhaps a denial of service attack against your web presence, how can you make an accurate determination about something that hasn’t happened in the past? Where do you get your data to do your analysis? How do you model that analysis?

In an article published in CSO Magazine, Hubbard argues that organizations have far more data than they think they do and they actually need less data than they may believe they do in order to do proper analyses. Hubbard says that IT departments, in particular, have gotten so used to having information stored in databases that they can easily query, they forget there are many other sources to gather data from. Just because something hasn’t happened yet and you haven’t been gathering historical data on it and socking it away in your database doesn’t mean you either don’t have any data or that you can’t find what you need to measure your risk. Even in the age of Big Data, there is plenty of useful data outside of the big database.

You will still need to gather that data, but you just need enough to be able to measure it accurately, not necessarily precisely. In our recently published Open Group Risk Analysis Standard (O-RA), this is called calibration of estimates. Calibration provides a method for making good estimates, which are necessary for deriving a measured range of probability for risk. Section 3 of the O-RA standard provides a comprehensive look at how best to come up with calibrated estimates, as well as how to determine other risk factors using the FAIR (Factor Analysis of Information Risk) model.

So where do you get your data if it's not already stored and easily accessible in a database? There are numerous sources you can turn to, both external and internal. You just have to do the research to find them. For example, even if your company hasn't experienced a denial of service attack, many others have; what was their experience when it happened? This information is out there online, and you just need to search for it. Industry reports are another source: Verizon publishes its annual Data Breach Investigations Report, and DatalossDB maintains an open database of data breach incidents worldwide. Many vendors publish annual security reports and issue regular security advisories. Security publications and analyst firms such as CSO, Gartner, Forrester and Securosis all produce research reports that data can be gleaned from.

Then there’s your internal information. Chances are your IT department has records you can use—they likely count how many laptops are lost or stolen each year. You should also look to the experts within your company to help. Other people can provide a wealth of valuable information for use in your analysis. You can also look to the data you do have on related or similar attacks as a gauge.

Chances are, you already have the data you need or you can easily find it online. Use it.

With the ever-growing list of threats and risks organizations face today, we are fast reaching a time when failing to measure risk will no longer be acceptable—in the boardroom or even by governments.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Cybersecurity, Data management, Information security, Open FAIR Certification, RISK Management, Uncategorized

Introducing Two New Security Standards for Risk Analysis—Part II – Risk Analysis Standard

By Jim Hietala, VP Security, The Open Group

Last week we took a look at one of the new risk standards recently introduced by The Open Group® Security Forum at The Open Group London 2013 Conference, the Risk Taxonomy Technical Standard 2.0 (O-RT). Today's blog looks at its sister standard, the Risk Analysis (O-RA) Standard, which gives risk professionals the tools they need to perform thorough risk analyses within their organizations for better decision making about risk.

Risk Analysis (O-RA) Standard

The new Risk Analysis Standard provides a comprehensive guide for performing effective analyses of risk scenarios within organizations using the Factor Analysis of Information Risk (FAIR™) framework. O-RA is geared toward managing the frequency and magnitude of loss that can arise from a threat, whether human, animal or natural event; in other words, "how often bad things happen and how bad they are when they occur." Used together, the O-RT and O-RA standards give organizations a way to perform consistent risk modeling that can not only help thoroughly explain risk factors to stakeholders but also allow information security professionals to strengthen existing analysis methods or create better ones. O-RA may also be used in conjunction with other risk frameworks to perform risk analysis.

The O-RA standard is also meant to provide something more than a mere assessment of risk. Many professionals within the security industry fail to distinguish between "assessing" risk and "analyzing" risk. This standard goes beyond assessment by supporting effective analyses, so that risk statements are more meaningful and defensible than the broad risk ratings ("this is a 4 on a scale of 1 to 5") normally produced by assessments.

O-RA also lays out a standard process for approaching risk analysis that can help organizations streamline the way they approach risk measurement. By focusing on four core process elements, organizations can perform more effective analyses:

  • Clearly identifying and characterizing the assets, threats, controls and impact/loss elements at play within the scenario being assessed
  • Understanding the organizational context for analysis (i.e. what’s at stake from an organizational perspective)
  • Measuring/estimating various risk factors
  • Calculating risk using a model that represents a logical, rational, and useful view of what risk is and how it works.

Because measurement and calculation are essential elements of properly analyzing risk variables, an entire chapter of the standard is dedicated to measuring and calibrating risk. This chapter lays out a number of useful approaches for establishing risk variables, including baseline risk estimates and ranges; distribution ranges and most likely values; Monte Carlo simulations; accounting for uncertainty; accuracy vs. precision and subjective vs. objective criteria; deriving vulnerability; ordinal scales; and diminishing returns.
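As a rough illustration of the Monte Carlo approach the standard describes, the sketch below samples invented calibrated ranges (not figures from the standard) for Loss Event Frequency and Loss Magnitude repeatedly, producing a distribution of annualized loss rather than a single number:

```python
import random

random.seed(7)

# Invented 90% calibrated ranges, for illustration only.
LEF_LOW, LEF_HIGH = 0.5, 4.0        # loss events per year
LM_LOW, LM_HIGH = 50_000, 400_000   # dollars lost per event

def simulate_annual_loss(trials=20_000):
    """Sample LEF and LM from their ranges and return sorted annual losses."""
    losses = []
    for _ in range(trials):
        lef = random.uniform(LEF_LOW, LEF_HIGH)
        lm = random.uniform(LM_LOW, LM_HIGH)
        losses.append(lef * lm)
    losses.sort()
    return losses

losses = simulate_annual_loss()
median = losses[len(losses) // 2]
p90 = losses[int(0.9 * len(losses))]
print(f"Median annualized loss: ${median:,.0f}; 90th percentile: ${p90:,.0f}")
```

A real analysis would use calibrated distributions (e.g., PERT or lognormal) rather than uniform ranges, but the structure, combining ranges into an output distribution, is the same.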

Finally, a practical, real-world example takes readers through an actual risk analysis scenario. Using the FAIR model, the example outlines the process for dealing with a threat in which an HR executive at a large bank has left the user name and password that give him access to all the company's HR systems on a Post-It note tacked to his computer, in clear view of anyone (other employees, cleaning crews, etc.) who comes into his office.

The scenario outlines four stages in assessing this risk:

  1. Stage 1: Identify Scenario Components (Scope the Analysis)
  2. Stage 2: Evaluate Loss Event Frequency (LEF)
  3. Stage 3: Evaluate Loss Magnitude (LM)
  4. Stage 4: Derive and Articulate Risk

Each step of the risk analysis process is thoroughly outlined for the scenario to provide Risk Analysts an example of how to perform an analysis process using the FAIR framework. Considerable guidance is provided for stages 2 and 3, in particular, as those are the most critical elements in determining organizational risk.
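The arithmetic at the end of the process can be sketched in a few lines. The estimates below are invented for illustration, not taken from the standard's worked example: stages 2 and 3 produce LEF and LM estimates, and stage 4 derives annualized risk as their product.

```python
# Invented single-point estimates for the Post-It note scenario.
lef_events_per_year = 0.1            # Stage 2: one loss event per ten years
loss_magnitude_per_event = 250_000   # Stage 3: dollars lost per event

# Stage 4: annualized loss exposure is frequency times magnitude.
risk = lef_events_per_year * loss_magnitude_per_event
print(f"Annualized loss exposure: ${risk:,.0f}")
```

In practice each input would be a calibrated range, making the derived risk a distribution rather than the single figure shown here.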

Ultimately, the O-RA is a guide to help organizations make better decisions about which risks are the most critical for the organization to prioritize and pay attention to versus those that are less important and may not warrant attention. It is critical for Risk Analysts and organizations to become more consistent in this practice because lack of consistency in determining risk among information security professionals has been a major obstacle in allowing security professionals a more legitimate “seat at the table” in the boardroom with other business functions (finance, HR, etc.) within organizations.

For our profession to evolve and grow, consistency and accurate measurement is key. Issues and solutions must be identified consistently and comparisons and measurement must be based on solid foundations, as illustrated below.

(Figure: Chained Dependencies)

O-RA can help organizations arrive at better decisions through consistent analysis techniques as well as provide more legitimacy within the profession.  Without a foundation from which to manage information risk, Risk Analysts and information security professionals may rely too heavily on intuition, bias, commercial or personal agendas for their analyses and decision making. By outlining a thorough foundation for Risk Analysis, O-RA provides not only a common foundation for performing risk analyses but the opportunity to make better decisions and advance the security profession.

For more on the O-RA Standard or to download it, please visit: https://www2.opengroup.org/ogsys/catalog/C13G.



Filed under Conference, Open FAIR Certification, RISK Management, Security Architecture

Introducing Two New Security Standards for Risk Analysis—Part I – Risk Taxonomy Technical Standard 2.0

By Jim Hietala, VP Security, The Open Group

At The Open Group London 2013 Conference, The Open Group® announced three new initiatives related to the Security Forum's work on risk management. The first was the establishment of a new certification program for Risk Analysts working within the security profession, the Open FAIR Certification Program. Aimed at providing a professional certification for Risk Analysts, the program will bring a much-needed level of assurance to companies looking to hire Risk Analysts, certifying that analysts who have completed the Open FAIR program understand the fundamentals of risk analysis and are qualified to perform that analysis.

Forming the basis of the Open FAIR certification program are two new Open Group standards, version 2.0 of the Risk Taxonomy (O-RT) standard originally introduced by the Security Forum in 2009, and a new Risk Analysis (O-RA) Standard, both of which were also announced at the London conference. These standards are the result of ongoing work around risk analysis that the Security Forum has been conducting for a number of years now in order to help organizations better understand and identify their exposure to risk, particularly when it comes to information security risk.

The Risk Taxonomy and Risk Analysis standards not only form the basis and body of knowledge for the Open FAIR certification, but provide practical advice for security practitioners who need to evaluate and counter the potential threats their organization may face.

Today’s blog will look at the first standard, the Risk Taxonomy Technical Standard, version 2.0. Next week, we’ll look at the other standard for Risk Analysis.

Risk Taxonomy (O-RT) Technical Standard 2.0

Originally published in January 2009, the O-RT is intended to provide a common set of terms and references for security and business professionals who need to understand or analyze risk conditions, giving them a common language to use when discussing those risks. Version 2.0 of the standard contains a number of updates, based both on feedback from professionals who have been using the standard and on research conducted by Security Forum member CXOWARE.

The majority of the changes to Version 2.0 are refinements in terminology, including changes in language that better reflect what each term encompasses. For example, the term “Control Strength” in the original standard has now been changed to “Resistance Strength” to reflect that controls used in that part of the taxonomy must be resistive in nature.

More substantive changes were made to the portion of the taxonomy that discusses how Loss Magnitude is evaluated.

Why create a taxonomy for risk? For two reasons. First, the taxonomy provides a foundation from which risk analysis can be performed and discussed. Second, a tightly defined taxonomy reduces the ambiguity that makes risk scenarios difficult to measure or estimate, leading to better decision making, as illustrated by the following "risk management stack."

Effective Management

↑

Well-informed Decisions

↑

Effective Comparisons

↑

Meaningful Measurements

↑

Accurate Risk Model

The complete Risk Taxonomy comprises two branches, Loss Event Frequency (LEF) and Loss Magnitude (LM), illustrated here:

(Figure: the Risk Taxonomy, branching into Loss Event Frequency and Loss Magnitude)

Focusing solely on pure risk (which only results in loss) rather than speculative risk (which might result in either loss or profit), the O-RT is meant to help estimate the probable frequency and magnitude of future loss.

Traditionally, LM has been far more difficult to determine than LEF, in part because organizations don't always perform analyses on their losses, or they stick to evaluating "low-hanging fruit" variables rather than delving into more complex risk factors. The new taxonomy takes a deep dive into the Loss Magnitude branch of the risk analysis taxonomy, providing guidance that will allow Risk Analysts to better tackle the difficult task of determining LM. It includes terminology outlining six specific forms of loss an organization can experience (productivity, response, replacement, fines and judgments, competitive advantage, reputation), as well as how to determine Loss Flow, a concept new to this standard.
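To make the six forms of loss concrete, here is a minimal sketch of tallying Loss Magnitude for a single scenario. The dollar figures are invented for illustration and are not from the standard:

```python
# The six forms of loss named in the taxonomy, with invented estimates
# (in dollars) for one hypothetical loss event.
loss_forms = {
    "productivity": 40_000,           # lost work while systems are down
    "response": 15_000,               # incident response and investigation
    "replacement": 8_000,             # replacing a damaged asset
    "fines_and_judgments": 0,         # no regulatory penalty in this scenario
    "competitive_advantage": 25_000,  # leaked strategy or IP
    "reputation": 60_000,             # lost customers and goodwill
}

# Loss Magnitude for the scenario is the sum across all six forms.
loss_magnitude = sum(loss_forms.values())
print(f"Total Loss Magnitude for the scenario: ${loss_magnitude:,}")
```

Decomposing LM this way is what keeps analysts from stopping at the obvious "low-hanging fruit" costs; forms such as reputation and competitive advantage often dominate the total.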

The Loss Flow analysis helps identify how a loss may affect both primary (owners, employees, etc.) and secondary (customers, stockholders, regulators, etc.) stakeholders as a result of a threat agent’s action on an asset. The new standard provides a thorough overview on how to assess Loss Flow and identify the loss factors of any given threat.

Finally, the standard also includes a practical, real-world scenario to help analysts understand how to put the taxonomy to use within their organizations. The O-RT provides a common linguistic foundation that allows security professionals to then perform the risk analyses outlined in the O-RA Standard.

For more on the Risk Taxonomy Standard or to download it, visit: https://www2.opengroup.org/ogsys/catalog/C13K.



Filed under Conference, Open FAIR Certification, RISK Management, Security Architecture

Jericho Forum declares “success” and sunsets

By Ian Dobson & Jim Hietala, The Open Group
Ten years ago, the Jericho Forum set out on a mission to evangelise the issues, problems and solutions around the emerging business and security challenges of de-perimeterisation, and to provide thought-leadership on them, with the aim of one day being able to declare "job done".

That day has now arrived. Today, de-perimeterisation is an established "fact", touching not just information security but all areas of modern business, including the bring-your-own-IT phenomenon (devices, IDs, services) as well as all forms of cloud computing. It is widely understood and quoted across the industry, and has become part of today's computing and security lexicon.

With our de-perimeterisation mission accomplished, the Jericho Forum has decided the time has come to “declare success”, celebrate it as a landmark victory in the evolution of information security, and sunset as a separate Forum in The Open Group.

Our “declare success and sunset” victory celebration on Monday 21st Oct 2013 at the Central Hall Westminster, London UK, was our valedictory announcement that the Jericho Forum will formally sunset on 1st Nov 2013.  The event included many past leading Jericho Forum members attending as guests, with awards of commemorative plaques to those whose distinctive leadership steered the information security mind-set change success that the Jericho Forum has now achieved.

For those who missed the live-streamed event, you can watch it on the livestream recording at http://new.livestream.com/opengroup/Lon13

We are fortunate to be able to pass our Jericho Forum legacy of de-perimeterisation achievements and publications to the good care of The Open Group's Security Forum, which has undertaken to maintain the Jericho Forum's deliverables, protect its legacy from misrepresentation, and perhaps adopt and evolve Jericho's thought-leadership approach for future information security challenges.

Ian Dobson, Director Jericho Forum
Jim Hietala, VP Security
The Open Group
21st October 2013


Ian Dobson is the director of the Security Forum and the Jericho Forum for The Open Group, coordinating and facilitating the members to achieve their goals in our challenging information security world. In the Security Forum, his focus is on supporting development of open standards and guides on security architectures and management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in future.



Filed under Conference, Security Architecture

The Open Group London 2013 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications

On Monday October 21st, The Open Group kicked off the first day of our Business Transformation conference in London!  Over 275 guests attended many engaging presentations by subject matter experts in finance, healthcare and government.  Attendees represented 28 countries around the globe, including some from as far away as Colombia, the Philippines, Australia, Japan and South Africa.

Allen Brown, President and CEO of The Open Group, welcomed the prestigious group.  Allen announced that The Open Group has 67 new member organizations so far this year!

The plenary launched with “Just Exactly What is Going On in Business and Technology?” by Andy Mulholland, Former Global CTO of Capgemini, who was named one of the top 25 influential CTOs by InfoWorld.  Andy’s key topics regarding digital disruption included real drivers of change, some big and fundamental implications, business model innovation, TOGAF® and the Open Platform 3.0™ initiative.

Next up was Judith Jones, CEO, Architecting the Enterprise Ltd., with a presentation entitled “One World EA Framework for Governments – The Way Forward”.  Judith shared findings from the World Economic Forum, posing the question “what keeps 1000 global leaders awake at night?”  Many statistics were presented covering over 50 global risks – economic, societal, environmental, geopolitical and technological.

Jim Hietala, VP, Security of The Open Group announced the launch of the Open FAIR Certification for People Program.  The new program brings a much-needed certification to the market which focuses on risk analysis. Key partners include CXOWARE, Architecting the Enterprise, SNA Technologies and The Unit bv.

Richard Shreeve, Consultancy Director, IPL and Angela Parratt, Head of Transformation and joint CIO, Bath and North East Somerset Council presented “Using EA to Inform Business Transformation”.  Their case study addressed the challenges of modeling complexity in diverse organizations and the EA-led approach to driving out cost and complexity while maintaining the quality of service delivery.

Allen Brown announced that the Jericho Forum® leaders together with The Open Group management have concluded that the Jericho Forum has achieved its original mission – to establish “de-perimeterization” that touches all areas of modern business.  In declaring this mission achieved, we are now in the happy position to celebrate a decade of success and move to ensuring that the legacy of the Jericho Forum is both maintained within The Open Group and continues to be built upon.  (See photo below.)

Following the plenary, the sessions were divided into tracks – Finance/Commerce, Healthcare and Tutorials/Workshops.

During the Healthcare track, one of the presenters, Larry Schmidt, Chief Technologist with HP, discussed “Challenges and Opportunities for Big Data in Healthcare”. Larry elaborated on the 4 Vs of Big Data – volume, velocity, variety and veracity.

Among the many presenters in the Finance/Commerce track, Omkhar Arasaratnam, Chief Security Architect, TD Bank Group, Canada, featured “Enterprise Architecture – We Do That?: How (not) to do Enterprise Architecture at a Bank”.  Omkhar provided insight into how he took traditional, top-down, center-based architectural methodologies and applied them to a highly federated environment.

Tutorials/workshops consisted of EA Practice and Architecture Methods and Techniques.

You can view all of the plenary and many of the track presentations at livestream.com.  For those who attended, please stay tuned for the full conference proceedings.

The evening concluded with a networking reception at the beautiful and historic Central Hall Westminster.  What an interesting, insightful, collaborative day it was!

[Photo: Jericho Forum celebration]

Comments Off

Filed under Business Architecture, Certifications, Cloud, Cloud/SOA, Conference, Cybersecurity, Information security, Open Platform 3.0, Professional Development, RISK Management, Security Architecture, Standards, TOGAF®

Open FAIR Certification Launched

By Jim Hietala, The Open Group, VP of Security

The Open Group today announced the new Open FAIR Certification Program aimed at Risk Analysts, bringing a much-needed professional certification to the market that is focused on the practice of risk analysis. Both the Risk Taxonomy and Risk Analysis standards, standards of The Open Group, constitute the body of knowledge for the certification program, and they advance the risk analysis profession by defining a standard taxonomy for risk, and by describing the process aspects of a rigorous risk analysis.
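The Risk Taxonomy standard decomposes risk into Loss Event Frequency (LEF) and Loss Magnitude (LM). As a purely illustrative sketch (not part of the standards themselves; the function name, ranges, and uniform distributions are hypothetical simplifications — real Open FAIR analyses typically use calibrated, PERT-style estimates), a Monte Carlo estimate of annualized loss exposure might look like:

```python
import random

def simulate_ale(lef_min, lef_max, lm_min, lm_max, trials=10_000, seed=42):
    """Rough annualized loss exposure: risk derived from
    Loss Event Frequency (events/year) x Loss Magnitude ($/event)."""
    rng = random.Random(seed)
    losses = []
    for _ in range(trials):
        lef = rng.uniform(lef_min, lef_max)   # events per year
        lm = rng.uniform(lm_min, lm_max)      # loss per event, in dollars
        losses.append(lef * lm)
    losses.sort()
    return {
        "mean": sum(losses) / trials,               # expected annual loss
        "p90": losses[int(trials * 0.9)],           # 90th-percentile loss
    }

# Example scenario: 0.1-0.5 events/year, $10k-$50k per event
result = simulate_ale(0.1, 0.5, 10_000, 50_000)
```

Reporting a distribution (mean plus a high percentile) rather than a single number reflects the standard's emphasis on expressing risk as a range of probable loss.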

We believe that this new risk analyst certification program will bring significant value to risk analysts, and to organizations seeking to hire qualified risk analysts. Adoption of these two risk standards from The Open Group will help produce more effective and useful risk analysis. This program clearly represents the growing need in our industry for professionals who understand risk analysis fundamentals.  Furthermore, the mature processes and due diligence The Open Group applies to our standards and certification programs will help make organizations comfortable with the groundbreaking concepts and methods underlying FAIR. This will also help professionals looking to differentiate themselves by demonstrating the ability to take a “business perspective” on risk.

In order to become certified, Risk Analysts must pass an Open FAIR certification exam. All certification exams are administered through Prometric, Inc. Exam candidates can start the registration process by visiting Prometric’s Open Group Test Sponsor Site www.prometric.com/opengroup.  With 4,000 testing centers in its IT channel, Prometric brings Open FAIR Certification to security professionals worldwide. For more details on the exam requirements visit http://www.opengroup.org/certifications/exams.

Training courses will be delivered through an Open Group accredited channel. The accreditation of Open FAIR training courses will be available from November 1st 2013.

Our thanks to all of the members of the risk certification working group who worked tirelessly over the past 15 months to bring this certification program, along with a new risk analysis standard and a revised risk taxonomy standard to the market. Our thanks also to the sponsors of the program, whose support is important to building this program. The Open FAIR program sponsors are Architecting the Enterprise, CXOWARE, SNA, and The Unit.

Lastly, if you are involved in risk analysis, we encourage you to consider becoming Open FAIR certified, and to get involved in the risk analysis program at The Open Group. We have plans to develop an advanced level of Open FAIR certification, and we also see a great deal of best practices guidance that is needed by the industry.

For more information on the Open FAIR certification program visit http://www.opengroup.org/certifications/openfair

You may also wish to attend a webcast scheduled for 7th November, 4pm BST, that will provide an overview of the Open FAIR certification program, as well as an overview of the two risk standards. You can register here.


Jim Hietala, CISSP, GSEC, is Vice President, Security for The Open Group, where he manages all security and risk management programs and standards activities, including the Security Forum and the Jericho Forum.  He has participated in the development of several industry standards including O-ISM3, O-ESA, Risk Taxonomy Standard, and O-ACEML. He also led the development of compliance and audit guidance for the Cloud Security Alliance v2 publication.

Jim is a frequent speaker at industry conferences. He has participated in the SANS Analyst/Expert program, having written several research white papers and participated in several webcasts for SANS. He has also published numerous articles on information security, risk management, and compliance topics in publications including CSO, The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

An IT security industry veteran, he has held leadership roles at several IT security vendors.

Jim holds a B.S. in Marketing from Southern Illinois University.

Comments Off

Filed under Conference, Cybersecurity, Open FAIR Certification, Standards

The Open Group Philadelphia – Day Three Highlights

By Loren K. Baynes, Director, Global Marketing Communications at The Open Group.

We are winding down Day 3 and gearing up for the next two days of training and workshops.  Today’s subject areas included TOGAF®, ArchiMate®, Risk Management, Innovation Management, Open Platform 3.0™ and Future Trends.

The objective of the Future Trends session was to discuss “emerging business and technical trends that will shape enterprise IT”, according to Dave Lounsbury, Chief Technical Officer of The Open Group.

This track also featured a presentation by Dr. William Lafontaine, VP High Performance Computing, Analytics & Cognitive Markets, IBM Research, who gave an overview of the “Global Technology Outlook 2013”.  He stated the Mega Trends are:  Growing Scale/Lower Barrier of Entry; Increasing Complexity/Yet More Consumable; Fast Pace; Contextual Overload.  Mike Walker, Strategies & Enterprise Architecture Advisor for HP, noted the key disrupters that will affect our future are the business of IT, technology itself, expectation of consumers and globalization.

The session concluded with an in-depth Q&A with Bill, Dave, Mike (as shown below) and Allen Brown, CEO of The Open Group.

[Photo: Philly Day 3]

Other sessions included presentations by TJ Virdi (Senior Enterprise Architect, Boeing) on Innovation Management, Jack Jones (President, CXOWARE, Inc.) on Risk Management and Stephen Bennett (Executive Principal, Oracle) on Big Data.

A special thanks goes to our many sponsors during this dynamic conference: Windstream, Architecting the Enterprise, Metaplexity, BIZZdesign, Corso, Avolution, CXOWARE, Penn State – Online Program in Enterprise Architecture, and Association of Enterprise Architects.

Stay tuned for post-conference proceedings to be posted soon!  See you at our conference in London, October 21-24.

Comments Off

Filed under ArchiMate®, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Open Platform 3.0, RISK Management, Security Architecture, Standards, TOGAF®

The Open Group Philadelphia – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications at The Open Group.

Day 2 at The Open Group conference in the City of Brotherly Love, as Philadelphia is also known, was another busy and remarkable day.

The plenary started with a fascinating presentation, “Managing the Health of the Nation” by David Nash, MD, MBA, Dean of Jefferson School of Population Health.  Healthcare is the number one industry in the city of Philadelphia, which has the highest number of patients in hospital beds among the top 10 US cities. The key theme of his thought-provoking speech was “boundaryless information sharing” (sound familiar?), which will enable a healthcare system that is “safe, effective, patient-centered, timely, equitable, efficient”.

Following Dr. Nash’s presentation was the Healthcare Transformation Panel moderated by Allen Brown, CEO of The Open Group.  Participants were:  Gina Uppal (Fulbright-Killam Fellow, American University Program), Mike Lambert (Open Group Fellow, Architecting the Enterprise), Rosemary Kennedy (Associate Professor, Thomas Jefferson University), Blaine Warkentine, MD, MPH and Fran Charney (Pennsylvania Patient Safety Authority). The group brought different sets of experiences within the healthcare system and provided reaction to Dr. Nash’s speech.  All agreed on the need for fundamental change and that technology will be key.

The conference featured a spotlight on The Open Group’s newest forum, Open Platform 3.0™, by Dr. Chris Harding, Director of Interoperability.  Open Platform 3.0 was formed to advance The Open Group vision of Boundaryless Information Flow™ by helping enterprises make use of Cloud, Social, Mobile Computing and Big Data.  For more info: http://www.opengroup.org/getinvolved/forums/platform3.0

The Open Group flourishes because of people interaction and collaboration.  The accolades continued with several members being recognized for their outstanding contributions to The Open Group Trusted Technology Forum (OTTF) and the Service-Oriented Architecture (SOA) and Cloud Computing Work Groups.  To learn more about our Forums and Work Groups and how to get involved, please visit http://www.opengroup.org/getinvolved

Presentations and workshops were also held in the Healthcare, Finance and Government vertical industries. Presenters included Larry Schmidt (Chief Technologist, HP), Rajamanicka Ponmudi (IT Architect, IBM) and Robert Weisman (CEO, Build the Vision, Inc.).

2 Comments

Filed under ArchiMate®, Business Architecture, Cloud/SOA, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, O-TTF, Open Platform 3.0, Security Architecture, Standards, TOGAF®

The Open Group Philadelphia – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications at The Open Group.

On Monday, July 15th, we kicked off our conference in Philadelphia. As Allen Brown, CEO of The Open Group, commented in his opening remarks, Philadelphia is the birthplace of American democracy.  This is the first time The Open Group has hosted a conference in this historic city.

Today’s plenary sessions featured keynote speakers covering topics ranging from an announcement of a new Open Group standard, appointment of a new Fellow, Enterprise Architecture and Transformation, Big Data and spotlights on The Open Group forums, Real-time Embedded Systems and Open Trusted Technology, as well as a new initiative on Healthcare.

Allen Brown noted that The Open Group has 432 member organizations with headquarters in 32 countries and over 40,000 individual members in 126 countries.

The Open Group Vision is Boundaryless Information Flow™ achieved through global interoperability in a secure, reliable and timely manner.  But as stated by Allen, “Boundaryless does not mean there are no boundaries.  It means that boundaries are permeable to enable business.”

Allen also presented an overview of the new “Dependability Through Assuredness™” Standard.  The Open Group Real-time Embedded Systems Forum is the home of this standard. More news to come!

Allen introduced Dr. Mario Tokoro (CEO of Sony Computer Science Laboratories), who began this project in 2006. Dr. Tokoro stated, “Thank you from the bottom of my heart for understanding the need for this standard.”

Eric Sweden, MSIH MBA, Program Director, Enterprise Architecture & Governance, National Association of State CIOs (NASCIO), offered a presentation entitled “State of the States – NASCIO on Enterprise Architecture: An Emphasis on Cross-Jurisdictional Collaboration across States”.  Eric noted, “Enterprise Architecture is a blueprint for better government.” Furthermore, “Cybersecurity is a top priority for government”.

Dr. Michael Cavaretta, Technical Lead and Data Scientist with Ford Motor Company, discussed “The Impact of Big Data on the Enterprise”.  The five keys, according to Dr. Cavaretta, are “perform, analyze, assess, track and monitor”.  Please see the following transcript from a Big Data analytics podcast, hosted by The Open Group, in which Dr. Cavaretta participated earlier this year: http://blog.opengroup.org/2013/01/28/the-open-group-conference-plenary-speaker-sees-big-data-analytics-as-a-way-to-bolster-quality-manufacturing-and-business-processes/

The final presentation during Monday morning’s plenary was “Enabling Transformation Through Architecture” by Lori Summers (Director of Technology) and Amit Mayabhate (Business Architect Manager) with Fannie Mae Multifamily.

Lori stated that their organization had adopted Business Architecture and today they have an integrated team who will complete the transformation, realize value delivery and achieve their goals.

Amit noted “Traceability from the business to architecture principles was key to our design.”

In addition to the many interesting and engaging presentations, several awards were presented.  Joe Bergmann, Director, Real-time and Embedded Systems Forum, The Open Group, was appointed Fellow by Allen Brown in recognition of Joe’s major achievements over the past 20+ years with The Open Group.

Other special recognition recipients include members from Oracle, IBM, HP and Red Hat.

In addition to the plenary session, we hosted meetings on Finance, Government and Healthcare industry verticals. Today is only Day One of The Open Group conference in Philadelphia. Please stay tuned for more exciting conference highlights over the next couple days.

Comments Off

Filed under ArchiMate®, Business Architecture, Conference, Cybersecurity, Data management, Enterprise Architecture, Enterprise Transformation, Healthcare, O-TTF, Security Architecture, Standards, TOGAF®