The Open Group Boston 2014 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications

The Open Group kicked off Enabling Boundaryless Information Flow™ on July 21 at the spectacular setting of the Hyatt Boston Harbor. Allen Brown, CEO and President of The Open Group, welcomed over 150 people from 20 countries, including attendees from as far away as Australia, Japan, Saudi Arabia and India.

The first keynote speaker was Marshall Van Alstyne, Professor at Boston University School of Management & Researcher at MIT Center for Digital Business, known as a leading expert in business models. His presentation, entitled "Platform Shift – How New Open Business Models are Changing the Shape of Industry," posed the questions: "What does 'openness' mean? Why do platforms beat products every time?"

Marshall Van Alstyne

According to Interbrand's "2014 Best Global Brands" report, 13 of the top 31 companies are "platform companies". To be a platform, a company needs embeddable functions or services and must allow third-party access. Van Alstyne noted, "products have features, platforms have communities". Great standalone products are not sufficient. A company that becomes a platform experiences positive changes in pricing and profitability, supply chains, internal organization, innovation and strategy, along with decreased industry bottlenecks.

Platforms benefit from broad contributions, as long as there is control of the top several complements. Van Alstyne commented, "If you believe in the power of community, you need to embrace the platform."

The next presentation was Open Platform 3.0™ – An Integrated Approach to the Convergence of Technology Platforms, by Dr. Chris Harding, Director for Interoperability, The Open Group. Dr. Harding discussed how society has developed into a digital society.

1970 was considered the dawn of an epoch that saw the first RAM chip, IBM's introduction of System/370 and a new operating system, UNIX®. Examples of digital progress since that era include driverless cars and Smart Cities (management of traffic, energy, water and communication).

The enablers of digital society are digital structural change and corporate social media; the benefits include open innovation, open access, open culture, open government and the delivery of more business value.

Dr. Harding also noted that standards are essential to innovation and enable markets based on integration. The Open Group Open Platform 3.0™ is using ArchiMate®, an Open Group standard, to analyze the 30+ business use cases produced by the Forum. The development cycle runs from understanding through analysis and specification to iteration.

Dr. Harding emphasized the importance of Boundaryless Information Flow™ as an enabler of business objectives and efficiency through IT standards in the era of digital technology, designed for today's agile enterprise with direct involvement of business users.

Both sessions concluded with an interactive audience Q&A hosted by Allen Brown.

The last session of the morning’s plenary was a panel: The Internet of Things and Interoperability. Dana Gardner, Principal Analyst at Interarbor Solutions, moderated the panel. Participating in the panel were Said Tabet, CTO for Governance, Risk and Compliance Strategy, EMC; Penelope Gordon, Emerging Technology Strategist, 1Plug Corporation; Jean-Francois Barsoum, Senior Managing Consultant, Smarter Cities, Water & Transportation, IBM; and Dave Lounsbury, CTO, The Open Group.

IoT Panel – Gardner, Barsoum, Tabet, Lounsbury, Gordon

The panel explored the practical limits and opportunities of the Internet of Things (IoT). Areas discussed included obstacles to decision-making as big data becomes more prolific, along with the openness, governance and connectivity of things, data and people, issues that pertain to many industries such as smart cities, manufacturing and healthcare.

How do industries, organizations and individuals deal with IoT? This is not necessarily a new problem, but an accelerated one. There are new areas of interoperability but where does the data go and who owns the data? Openness is important and governance is essential.

What needs to change most to see the benefits of the IoT? The panel agreed on the need to push for innovation, increase education, move beyond models of humans managing the interface (i.e. toward machine-to-machine interaction) and determine what data is most important, rather than always collecting all the data.

A podcast and transcript of the Internet of Things and Interoperability panel will be posted soon.

The afternoon was divided into several tracks: Boundaryless Information Flow™, Open Platform 3.0™ and Enterprise Architecture (EA) & Enterprise Transformation. Best Practices for Enabling Boundaryless Information Flow across the Government was presented by Syed Husain, Consultant Enterprise Architecture, Saudi Arabia E-government Authority. Robert K. Pucci, CTO, Communications Practice, Cognizant Technology Solutions discussed Business Transformation Justification Leveraging Business and Enterprise Architecture.

The evening concluded with a lively networking reception at the hotel.

Join the conversation #ogBOS!

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog and media relations. Loren has over 20 years' experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.

The Open Group Boston 2014 Preview: Talking People Architecture with David Foote

By The Open Group

Among all the issues that CIOs, CTOs and IT departments are facing today, staffing is likely near the top of the list of what’s keeping them up at night. Sure, there’s dealing with constant (and disruptive) technological changes and keeping up with the latest tech and business trends, such as having a Big Data, Internet of Things (IoT) or a mobile strategy, but without the right people with the right skills at the right time it’s impossible to execute on these initiatives.

Technology jobs are notoriously difficult to fill–far more difficult than positions in other industries where roles and skillsets may be much more static. And because technology is rapidly evolving, the roles for tech workers are also always in flux. Last year you may have needed an Agile developer, but today you may need a mobile developer with secure coding ability and in six months you might need an IoT developer with strong operations or logistics domain experience—with each position requiring different combinations of tech, functional area, solution and “soft” skillsets.

According to David Foote, IT Industry Analyst and co-founder of IT workforce research and advisory firm Foote Partners, the mash-up of HR systems and ad hoc people management practices most companies have been using for years to manage IT workers has become frighteningly ineffective. He says that to cope in today's environment, companies need to architect their people infrastructure similar to how they have been architecting their technical infrastructure.

"People Architecture" is the term Foote has coined to describe taking traditional architectural principles and practices that may already be in place elsewhere within an organization and applying them to managing the IT workforce. This includes applying such things as strategy and capability roadmaps, phase gate blueprints, benchmarks, performance metrics, governance practices and stakeholder management to human capital management (HCM).

HCM components for People Architecture typically include job definition and design, compensation, incentives and recognition, skills demand and acquisition, job and career paths, professional development and work/life balance.

Part of the dilemma for employers right now, Foote says, is that there is very little job title standardization in the marketplace and too many job titles floating around IT departments today. “There are too many dimensions and variability in jobs now that companies have gotten lost from an HR perspective. They’re unable to cope with the complexity of defining, determining pay and laying out career paths for all these jobs, for example. For many, serious retention and hiring problems are showing up for the first time. Work-around solutions used for years to cope with systemic weaknesses in their people management systems have stopped working,” says Foote. “Recruiters start picking off their best people and candidates are suddenly rejecting offers and a panic sets in. Tensions are palpable in their IT workforce. These IT realities are pervasive.”

Twenty-five years ago, Foote says, defining roles in IT departments was easier. But then the Internet exploded and technology became far more customer-facing, shifting basic IT responsibilities from highly technical people deep within companies to roles requiring more visibility and transparency within and outside the enterprise. Large chunks of IT budgets moved into the business lines while traditional IT became more of a business itself.

According to Foote, IT roles became siloed not just by technology but by functional areas such as finance and accounting, operations and logistics, sales, marketing and HR systems, and by industry knowledge and customer familiarity. Then the IT professional services industry rapidly expanded to compete with its customers for talent in the marketplace. Even the architect role changed: an Enterprise Architect today can specialize in applications, security or data architecture, among other areas, or focus on a specific industry such as energy, retail or healthcare.

Foote likens the fragmentation of IT jobs and skillsets that’s happening now to the emergence of IT architecture 25 years ago. Just as technical architecture practices emerged to help make sense of the disparate systems rapidly growing within companies and how best to determine the right future tech investments, a people architecture approach today helps organizations better manage an IT workforce spread through the enterprise with roles ranging from architects and analysts to a wide variety of engineers, developers and project and program managers.

“Technical architecture practices were successful because—when you did them well—companies achieved an understanding of what they have systems-wise and then connected it to where they were going and how they were going to get there, all within a process inclusive of all the various stakeholders who shared the risk in the outcome. It helped clearly define enterprise technology capabilities and gave companies more options and flexibility going forward,” according to Foote.

"Right now employers desperately need to incorporate in human capital management systems and practice the same straightforward, inclusive architecture approaches companies are already using in other areas of their businesses. This can go a long way toward not just lessening staffing shortages but also executing more predictably and being more agile in the face of constant uncertainties and the accelerating pace of change. Ultimately this translates into a more effective workforce whether they are full-timers or the contingent workforce of part-timers, consultants and contractors.

“It always comes down to your people. That’s not a platitude but a fact,” insists Foote. “If you’re not competitive in today’s labor marketplace and you’re not an employer where people want to work, you’re dead.”

One industry that he says has gotten it right is the consulting industry. “After all, their assets walk out the door every night. Consulting groups within firms such as IBM and Accenture have been good at architecting their staffing because it’s their job to get out in front of what’s coming technologically. Because these firms must anticipate customer needs before they get the call to implement services, they have to be ahead of the curve in already identifying and hiring the bench strength needed to fulfill demand. They do many things right to hire, develop and keep the staff they need in place.”

Unfortunately, many companies take too much of a just-in-time approach to their workforce, so they are always managing staffing from a position of scarcity rather than looking ahead, Foote says. But this is changing, in part because companies are tired of never having the people they need and of not being able to execute predictably.

The key is to put a structure in place that addresses a strategy around what a company needs and when. This applies not just to the hiring process, but also to compensation, training and advancement.

“Architecting anything allows you to be able to, in a more organized way, be more agile in dealing with anything that comes at you. That’s the beauty of architecture. You plan for the fact that you’re going to continue to scale and continue to change systems, the world’s going to continue to change, but you have an orderly way to manage the governance, planning and execution of that, the strategy of that and the implementation of decisions knowing that the architecture provides a more agile and flexible modular approach,” he said.

Foote says organizations such as The Open Group can lend themselves to facilitating People Architecture in a couple of different ways. First, through extending the principles of architecture to human capital management, and second through vendor-independent, expertise- and experience-driven certifications, such as TOGAF®, Open CA and Open CITS, that help companies define core competencies for people and that provide opportunities for training and career advancement.

"I'm pretty bullish on many vendor-independent certifications in general, particularly where a defined book of knowledge exists that's achieved wide acceptance in the industry. And that's what you've got with The Open Group. Nobody's challenging the architectural framework supremacy of TOGAF that I'm aware of. In fact, large vendors with their own certifications participated actively in developing the framework and applying it very successfully to their business models," he said.

Although the process of implementing People Architecture can be difficult and may take several years to master (much like Enterprise Architecture), Foote says it is making a huge difference for companies that implement it.

To learn more about People Architecture and models for implementing it, plan to attend Foote’s session at The Open Group Boston 2014 on Tuesday July 22. Foote’s session will address how architectural principles are being applied to human capital so that organizations can better manage their workforces from hiring and training through compensation, incentives and advancement. He will also discuss how career paths for EAs can be architected. Following the conference, the session proceedings will be available to Open Group members and conference attendees at www.opengroup.org.

Join the conversation – #ogchat #ogBOS

David Foote is an IT industry research pioneer, innovator, and one of the most quoted industry analysts on global IT workforce trends and multiple facets of the human side of technology value creation. His two decades of groundbreaking research and analysis of IT-business cross-skilling and technology/business management integration, and his leadership in innovative IT skills demand and compensation benchmarking, have earned him a place on a short list of thought leaders in IT human capital management.

A former Gartner and META Group analyst, David leads the research and analytical practice groups at Foote Partners that reach 2,300 customers on six continents.


New Health Data Deluges Require Secure Information Flow Enablement Via Standards, Says The Open Group’s New Healthcare Director

By The Open Group

Below is the transcript of The Open Group podcast on how new devices and practices have the potential to expand the information available to Healthcare providers and facilities.

Listen to the podcast here.

Dana Gardner: Hello, and welcome to a special BriefingsDirect Thought Leadership Interview coming to you in conjunction with The Open Group’s upcoming event, Enabling Boundaryless Information Flow™ July 21-22, 2014 in Boston.

I'm Dana Gardner, Principal Analyst at Interarbor Solutions, and I'll be your host and moderator for the series of discussions from the conference on Boundaryless Information Flow, Open Platform 3.0™, Healthcare, and Security issues.

One area of special interest is the Healthcare arena, and Boston is a hotbed of innovation and adaptation for how technology, Enterprise Architecture, and standards can improve the communication and collaboration among Healthcare ecosystem players.

And so, we’re joined by a new Forum Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes, and efficiencies is pushing the Healthcare industry to rapid change.

With that, please join me now in welcoming our guest. We're here with Jason Lee, Healthcare and Security Forums Director at The Open Group. Welcome, Jason.

Jason Lee: Thank you so much, Dana. Good to be here.

Gardner: Great to have you. I’m looking forward to the Boston conference and want to remind our listeners and readers that it’s not too late to sign up. You can learn more at http://www.opengroup.org.

Jason, let’s start by talking about the relationship between Boundaryless Information Flow, which is a major theme of the conference, and healthcare. Healthcare perhaps is the killer application for Boundaryless Information Flow.

Lee: Interesting, I haven’t heard it referred to that way, but healthcare is 17 percent of the US economy. It’s upwards of $3 trillion. The costs of healthcare are a problem, not just in the United States, but all over the world, and there are a great number of inefficiencies in the way we practice healthcare.

We don’t necessarily intend to be inefficient, but there are so many places and people involved in healthcare, it’s very difficult to get them to speak the same language. It’s almost as if you’re in a large house with lots of different rooms, and every room you walk into they speak a different language. To get information to flow from one room to the other requires some active efforts and that’s what we’re undertaking here at The Open Group.

Gardner: What is it about the current collaboration approaches that don’t work? Obviously, healthcare has been around for a long time and there have been different players involved. What’s the hurdle? What prevents a nice, seamless, easy flow and collaboration in information that gets better outcomes? What’s the holdup?

Lee: There are many ways to answer that question, because there are many barriers. Perhaps the simplest is the transformation of healthcare from a paper-based industry to a digital industry. Everyone has walked into an office, looked behind the people at the front desk, and seen file upon file and row upon row of folders, information that’s kept in a written format.

When there's been movement toward digitizing that information, not everyone has used the same system. It's almost like trains running on different gauge tracks. Obviously, if the track going east to west is a different gauge than the track going north to south, then trains aren't going to be able to travel on those same tracks. In the same way, healthcare information does not flow easily from one office to another or from one provider to another.

Gardner: So not only do we have disparate strategies for collecting and communicating health data, but we’re also seeing much larger amounts of data coming from a variety of new and different places. Some of them now even involve sensors inside of patients themselves or devices that people will wear. So is the data deluge, the volume, also an issue here?

Lee: Certainly. I heard recently that an integrated health plan, which has multiple hospitals involved, contains more elements of data than the Library of Congress. As information is collected at multiple points in time, over a relatively short period of time, you really do have a data deluge. Figuring out how to find your way through all the data and look at the most relevant for the patient is a great challenge.

Gardner: I suppose the bad news is that there is this deluge of data, but it’s also good news, because more data means more opportunity for analysis, a better ability to predict and determine best practices, and also provide overall lower costs with better patient care.

So it seems like the stakes are rather high here to get this right, to not just crumble under a volume or an avalanche of data, but to master it, because it’s perhaps the future. The solution is somewhere in there too.

Lee: No question about it. At The Open Group, our focus is on solutions. We, like others, put a great deal of effort into describing the problems, but our real work is figuring out how to bring IT technologies to bear on business problems, how to encourage different parts of organizations to speak to one another and across organizations to speak the same language, and how to operate using common standards. That's really what we're all about.

And it is, in a large sense, part of the process of helping to bring healthcare into the 21st Century. A number of industries are a couple of decades ahead of healthcare in the way they use large datasets — what some people refer to as big data. I'm talking about companies like big department stores and large online retailers. They really have stepped up to the plate and are using that deluge of data in ways that are very beneficial to them, and healthcare can do the same. We're just not quite at the same level of evolution.

Gardner: And to your point, the stakes are so much higher. Retail is, of course, a big deal in the economy, but as you pointed out, healthcare is such a much larger segment and portion. So just making modest improvements in communication, collaboration, or data analysis can reap huge rewards.

Lee: Absolutely true. There is the cost side of things, but there is also the quality side. So there are many ways in which healthcare can improve through standardization and coordinated development, using modern technology that cannot just reduce cost, but improve quality at the same time.

Gardner: I'd like to get into a few of the hotter trends, but before we do, it seems that The Open Group has recognized the importance here by devoting the entire second day of its conference in Boston, July 22, to Healthcare.

Maybe you could give us a brief overview of what participants, and even those who come in online and view recorded sessions of the conference at http://new.livestream.com/opengroup, should expect? What's going to happen on July 22nd?

Lee: We have a packed day. We're very excited to have Dr. Joe Kvedar, a physician at Partners HealthCare and Founding Director of the Center for Connected Health, as our first plenary speaker. The title of his presentation is "Making Health Additive." Dr. Kvedar is a widely respected expert on mobile health, which is currently the Healthcare Forum's top work priority. As mobile medical devices become ever more available and diversified, they will enable consumers to know more about their own health and wellness. A great deal of potentially useful health data will be generated. How this information can be used, not just by consumers but also by the healthcare establishment that takes care of them as patients, will become a question of increasing importance, and an area where standards development and The Open Group can be very helpful.

Our second plenary speaker, Proteus Duxbury, Chief Technology Officer at Connect for Health Colorado, will discuss a major feature of the Affordable Care Act, the health insurance exchanges, which are designed to bring health insurance to tens of millions of people who previously did not have access to it. Mr. Duxbury is going to talk about how Enterprise Architecture, which is really about getting to solutions by helping the IT folks talk to the business folks and vice versa, has helped the State of Colorado develop its Health Insurance Exchange.

After the plenaries, we will break into three tracks, one of which is Healthcare-focused. In this track there will be three presentations, all of which discuss how Enterprise Architecture and the approach to Boundaryless Information Flow can help healthcare and healthcare decision-makers become more effective and efficient.

One presentation will focus on the transformation of care delivery at the Visiting Nurse Service of New York. Another will address stewarding healthcare transformation using Enterprise Architecture, focusing on one of our Platinum members, Oracle, and a company called Intelligent Medical Objects, and how they’re working together in a productive way, bringing IT and healthcare decision-making together.

Then, the final presentation in this track will focus on the development of an Enterprise Architecture-based solution at an insurance company. The payers, or the insurers (the big companies that are responsible for paying bills and collecting premiums), have a very important role in the healthcare system that extends beyond administration of benefits. Yet payers are not always recognized for their key responsibilities and capabilities in the area of clinical improvements and cost improvements.

With the increase in payer data brought on in large part by the adoption of a new coding system, the ICD-10, which will come online this year, a huge amount of additional data, including clinical data, will become available. At The Open Group, we consider payers, health insurance companies (some of which are integrated with providers), to be very important stakeholders in the big picture.

In the afternoon, we're going to switch gears a bit and have a speaker talk about the challenges, the barriers, the "pain points" in introducing new technology into healthcare systems. The focus will return to remote or mobile medical devices and the predictable but challenging barriers to getting newly generated health information to flow to doctors' offices and into patients' records, electronic health records, and hospitals' data-keeping and data-sharing systems.

We’ll have a panel of experts that responds to these pain points, these challenges, and then we’ll draw heavily from the audience, who we believe will be very, very helpful, because they bring a great deal of expertise in guiding us in our work. So we’re very much looking forward to the afternoon as well.

Gardner: It’s really interesting. A couple of these different plenaries and discussions in the afternoon come back to this user-generated data. Jason, we really seem to be on the cusp of a whole new level of information that people will be able to develop from themselves through their lifestyle, new devices that are connected.

We hear from folks like Apple, Samsung, Google, and Microsoft. They’re all pulling together information and making it easier for people to not only monitor their exercise, but their diet, and maybe even start to use sensors to keep track of blood sugar levels, for example.

In fact, a new Flurry Analytics survey showed a 62 percent increase in the use of health and fitness applications over the last six months on the popular mobile devices. This compares to a 33 percent increase in other applications in general. So there's an 87 percent faster uptick in the use of health and fitness applications.
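(To see where the 87 percent figure comes from, compare the two growth rates directly: (62 - 33) / 33 ≈ 0.88, so the use of health and fitness applications grew roughly 87-88 percent faster than other applications did. This arithmetic is our reconstruction, not part of the survey.)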

Tell me a little bit how you see this factoring in. Is this a mixed blessing? Will so much data generated from people in addition to the electronic medical records, for example, be a bad thing? Is this going to be a garbage in, garbage out, or is this something that could potentially be a game-changer in terms of how people react to their own data and then bring more data into the interactions they have with care providers?

Lee: It's always a challenge to predict what the market is going to do, but I think that's a remarkable statistic that you cited. My prediction is that the increased volume of person-generated data from mobile health devices is going to be a game-changer. This view also reflects how the members of the Healthcare Forum (which includes Capgemini, Philips, IBM, Oracle and HP) view the future.

The commercial demand for mobile medical devices (things that can be worn, embedded, or swallowed, as in pills, as you mentioned) is growing ever stronger. The software and the applications that will be developed to be used with these devices are going to grow by leaps and bounds. As you say, there are big players getting involved. Already some of the pedometer-type devices that measure the number of steps taken in a day have captured the interest of many, many people. Even David Sedaris, serious guy that he is, was writing about it recently in 'The New Yorker'.

What we will find is that many of the health indicators that we used to have to go to the doctor or nurse or lab to get information on will become available to us through these remote devices.

There will be a question, of course, as to the reliability and validity of the information, to your point about garbage in, garbage out, but I think standards development will help here. This, again, is where The Open Group comes in. We might also see the FDA exercising its role in ensuring safety here, as well as other organizations, in determining which devices are reliable.

The Open Group is working in the area of mobile health data and the information systems that are developed around it, and on the ability of these systems to (a) talk to one another and (b) talk to the devices and infrastructure used in doctors' offices and in hospitals. This is called interoperability, and it's certainly lacking in this country.

There are already problems around interoperability and connectivity of information in the healthcare establishment as it is now. When patients and consumers start collecting their own data, and the patient is put at the center of the nexus of healthcare, then the question becomes how does that information that patients collect get back to the doctor/clinician in ways in which the data can be trusted and where the data are helpful?

After all, if a patient is wearing a medical device, there is the opportunity to collect data, about blood sugar level let’s say, throughout the day. And this is really taking healthcare outside of the four walls of the clinic and bringing information to bear that can be very, very useful to clinicians and beneficial to patients.
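To make the interoperability point concrete, here is a minimal sketch of what a shared, canonical representation of a device reading might look like, with a translator for one vendor's native format. The field names, units and validation rule are invented for illustration; they are not drawn from any published healthcare standard or Open Group deliverable:

    from dataclasses import dataclass
    from datetime import datetime, timezone

    # Hypothetical canonical form: every producer (wearable, pump, phone app)
    # converts its native record into this one shape before sharing it.
    @dataclass
    class GlucoseReading:
        device_id: str        # stable identifier of the measuring device
        patient_id: str       # identifier in the receiving system
        taken_at: datetime    # timezone-aware timestamp, normalized to UTC
        value_mg_dl: float    # blood glucose, normalized to mg/dL

    MMOL_TO_MG_DL = 18.0182   # unit conversion factor for glucose

    def from_vendor_record(rec: dict) -> GlucoseReading:
        """Translate one vendor's native record into the shared shape.
        Each additional vendor format needs its own translator, which is
        exactly the per-gauge conversion work a common standard removes."""
        value = float(rec["glucose_mmol_l"]) * MMOL_TO_MG_DL
        taken = datetime.fromisoformat(rec["time"]).astimezone(timezone.utc)
        if not 10 <= value <= 1000:   # crude plausibility check
            raise ValueError(f"implausible glucose value: {value:.0f} mg/dL")
        return GlucoseReading(rec["device"], rec["patient"], taken, value)

    reading = from_vendor_record(
        {"device": "dev-42", "patient": "p-007",
         "time": "2014-07-21T09:30:00+02:00", "glucose_mmol_l": "5.4"}
    )
    print(reading)   # one record any downstream clinical system can consume

Once every reading, whatever its source, arrives in one agreed shape, the different-gauge-track problem disappears for the systems downstream.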

In short, the rapid market dynamic in mobile medical devices, and in the software and hardware that facilitate interoperability, begs for standards-based solutions that reduce costs and improve quality, all of which puts the patient at the center. This is The Open Group's Healthcare Forum's sweet spot.

Gardner: It seems to me a real potential game-changer as well, and that something like Boundaryless Information Flow and standards will play an essential role. Because one of the big question marks with many of the ailments in a modern society has to do with lifestyle and behavior.

So often, the providers of the care only really have the patient’s responses to questions, but imagine having a trove of data at their disposal, a 360-degree view of the patient to then further the cause of understanding what’s really going on, on a day-to-day basis.

But then, it’s also having a two-way street, being able to deliver perhaps in an automated fashion reinforcements and incentives, information back to the patient in real-time about behavior and lifestyles. So it strikes me as something quite promising, and I look forward to hearing more about it at the Boston conference.

Any other thoughts on this issue about patient flow of data, not just among and between providers and payers, for example, or providers in an ecosystem of care, but with the patient as the center of it all, as you said?

Lee: As more mobile medical devices come to the market, we'll find that consumers own multiple types of devices, at least some of which collect multiple types of data. So even for the patient, at the center of their own healthcare information collection, there can be barriers to having one device talk to another. If a patient wants to keep their own personal health record, there may be difficulties in bringing all that information into one place.

So the interoperability issue, the need for standards, guidelines, and voluntary consensus among stakeholders about how information is represented becomes an issue, not just between patients and their providers, but for individual consumers as well.

Gardner: And also the cloud providers. There will be a variety of large organizations with cloud-modeled services, and they are going to need to be, in some fashion, brought together, so that a complete 360-degree view of the patient is available when needed. It’s going to be an interesting time.

Of course, we’ve also looked at many other industries and tried to have a cloud synergy, a cloud-of-clouds approach to data and also the transaction. So it’s interesting how what’s going on in multiple industries is common, but it strikes me that, again, the scale and the impact of the healthcare industry makes it a leader now, and perhaps a driver for some of these long overdue structured and standardized activities.

Lee: It could become a leader. There is no question about it. Moreover, there is a lot Healthcare can learn from other companies: from mistakes they have made, from lessons they have learned, and from best practices they have developed (on both the content and process side). And there are issues, around security in particular, where Healthcare will be at the leading edge in trying to figure out how much is enough, how much is too much, and what kinds of solutions work.

There’s a great future ahead here. It’s not going to be without bumps in the road, but organizations like The Open Group are designed and experienced to help multiple stakeholders come together and have the conversations that they need to have in order to push forward and solve some of these problems.

Gardner: Well, great. I’m sure there will be a lot more about how to actually implement some of those activities at the conference. Again, that’s going to be in Boston, beginning on July 21, 2014.

We'll have to leave it there. We're about out of time. We've been talking with a new Director at The Open Group to learn how an expected continued deluge of data and information about patients, providers, outcomes and efficiencies is pushing the Healthcare industry to rapid change. And, as we've heard, that might very well spill over into other industries as well.

So we’ve seen how innovation and adaptation around technology, Enterprise Architecture and standards can improve the communication and collaboration among Healthcare ecosystem players.

It’s not too late to register for The Open Group Boston 2014 (http://www.opengroup.org/boston2014) and join the conversation via Twitter #ogchat #ogBOS, where you will be able to learn more about Boundaryless Information Flow, Open Platform 3.0, Healthcare and other relevant topics.

So a big thank you to our guest. We’ve been joined by Jason Lee, Healthcare and Security Forums Director at The Open Group. Thanks so much, Jason.

Lee: Thank you very much.


The Open Group Boston 2014 to Explore How New IT Trends are Empowering Improvements in Business

By The Open Group

The Open Group Boston 2014 will be held on July 21-22 and will cover the major issues and trends surrounding Boundaryless Information Flow™. Thought-leaders at the event will share their outlook on IT trends, capabilities, best practices and global interoperability, and how this will lead to improvements in responsiveness and efficiency. The event will feature presentations from representatives of prominent organizations on topics including Healthcare, Service-Oriented Architecture, Security, Risk Management and Enterprise Architecture. The Open Group Boston will also explore how cross-organizational collaboration and trends such as big data and cloud computing are helping to make enterprises more effective.

The event will consist of two days of plenaries and interactive sessions that will provide in-depth insight on how new IT trends are leading to improvements in business. Attendees will learn how industry organizations are seeking large-scale transformation and some of the paths they are taking to realize that.

The first day of the event will bring together subject matter experts in the Open Platform 3.0™, Boundaryless Information Flow™ and Enterprise Architecture spaces. The day will feature thought-leaders from organizations including Boston University, Oracle, IBM and Raytheon. One of the keynotes will be from Marshall Van Alstyne, Professor at Boston University School of Management & Researcher at MIT Center for Digital Business, who will reveal the secrets of internet-driven marketplaces. Other content includes:

• The Open Group Open Platform 3.0™ focuses on new and emerging technology trends converging with each other and leading to new business models and system designs. These trends include mobility, social media, big data analytics, cloud computing and the Internet of Things.
• Cloud security and the key differences in securing cloud computing environments vs. traditional ones as well as the methods for building secure cloud computing architectures
• Big Data as a service framework as well as preparing to deliver on Big Data promises through people, process and technology
• Integrated Data Analytics and using them to improve decision outcomes

The second day of the event will have an emphasis on Healthcare, with keynotes from Joseph Kvedar, MD, Partners HealthCare, Center for Connected Health, and Proteus Duxbury, CTO of Connect for Health Colorado. The day will also feature speakers from Hewlett Packard and Blue Cross Blue Shield, multiple tracks on a wide variety of topics such as Risk Management and Professional Development, and ArchiMate® tutorials. Key learnings include:

• Improving healthcare's information flow is a key enabler to improving healthcare outcomes and implementing efficiencies within today's delivery models
• Identifying the current state of IT standards and future opportunities covering the healthcare ecosystem
• How ArchiMate® can be used by Enterprise Architects to drive business innovation with tried and true techniques and best practices
• How Security and Risk Management are evolving as software applications become more accessible through APIs, which can lead to vulnerabilities and the potential need to increase security while still understanding the business value of APIs

Member meetings will also be held on Wednesday and Thursday, July 23-24.

Don’t wait, register now to participate in these conversations and networking opportunities during The Open Group Boston 2014: http://www.opengroup.org/boston2014/registration

Join us on Twitter – #ogchat #ogBOS


The Onion & The Open Group Open Platform 3.0™

By Stuart Boardman, Senior Business Consultant, KPN Consulting, and Co-Chair of The Open Group Open Platform 3.0™

The onion is widely used as an analogy for complex systems – from IT systems to mystical world views.

It’s a good analogy. From the outside it’s a solid whole but each layer you peel off reveals a new onion (new information) underneath.

And a slice through the onion looks quite different from the whole…

What (and how much) you see depends on where and how you slice it.

The Open Group Open Platform 3.0™ is like that. Use-cases for Open Platform 3.0 reveal multiple participants and technologies (Cloud Computing, Big Data Analytics, Social networks, Mobility and The Internet of Things) working together to achieve goals that vary by participant. Each participant’s goals represent a different slice through the onion.

The Ecosystem View
We commonly use the idea of peeling off layers to understand large ecosystems, which could be Open Platform 3.0 systems like the energy smart grid but could equally be the workings of a large cooperative or the transport infrastructure of a city. We want to know what is needed to keep the ecosystem healthy and what the effects could be of the actions of individuals on the whole and therefore on each other. So we start from the whole thing and work our way in.

The Service at the Centre of the Onion

If you’re the provider or consumer (or both) of an Open Platform 3.0 service, you’re primarily concerned with your slice of the onion. You want to be able to obtain and/or deliver the expected value from your service(s). You need to know as much as possible about the things that can positively or negatively affect that. So your concern is not the onion (ecosystem) as a whole but your part of it.

Right in the middle is your part of the service. The first level out from that consists of other participants with whom you have a direct relationship (contractual or otherwise). These are the organizations that deliver the services you consume directly to enable your own service.

One level out from that (level 2) are participants with whom you have no direct relationship but on whose services you are still dependent. It's common in Platform 3.0 that your partners too will consume other services in order to deliver their services (see the use cases we have documented). You need to know as much as possible about this level, because whatever happens here can have a positive or negative effect on you.

One level further from the centre we find indirect participants who don't necessarily deliver any part of the service but whose actions may well affect the rest. They could just be indirect materials suppliers. They could also be part of a completely different value network in which your level 1 or 2 "partners" participate. You can't expect to understand this level in detail, but you know that how that value network performs can affect your partners' strategy or even their very existence. The knock-on impact on your own strategy can be significant.

We can conceive of more levels but pretty soon a law of diminishing returns sets in. At each level further from your own organization you will see less detail and more variety. That in turn means that there will be fewer things you can actually know (with any certainty) and not much more that you can even guess at. That doesn’t mean that the ecosystem ends at this point. Ecosystems are potentially infinite. You just need to decide how deep you can usefully go.
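For readers who like to see an idea in code, the level structure can be sketched in a few lines. This is a toy illustration only: the participant names, the certainty decay factor and the usefulness threshold are all invented here and come from no Open Platform 3.0 deliverable.

    from collections import defaultdict

    # Hypothetical participants, keyed by how many relationship hops
    # they are from our own service (level 0 = us).
    participants = [
        ("our-service",        0),
        ("hosting-provider",   1),  # direct contractual relationship
        ("payment-gateway",    1),
        ("gateway's-bank",     2),  # partner of a partner
        ("materials-supplier", 3),  # indirect participant
    ]

    CERTAINTY_DECAY = 0.4   # arbitrary: each level out, we can know less
    USEFUL_THRESHOLD = 0.1  # below this, analysis yields diminishing returns

    by_level = defaultdict(list)
    for name, level in participants:
        by_level[level].append(name)

    for level in sorted(by_level):
        certainty = CERTAINTY_DECAY ** level
        note = "" if certainty >= USEFUL_THRESHOLD else "  <- beyond useful analysis"
        print(f"level {level}: certainty ~{certainty:.2f} {by_level[level]}{note}")

The numbers are arbitrary; the shape is the point. Each step outward adds participants while shrinking what you can reliably know, which is why you have to decide how deep you can usefully go.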

Limits of the Onion
At a certain point one hits the limits of an analogy. If everybody sees their own organization as the centre of the onion, what we actually have is a bunch of different, overlapping onions.

And you can’t actually make onions overlap, so let’s not take the analogy too literally. Just keep it in mind as we move on. Remember that our objective is to ensure the value of the service we’re delivering or consuming. What we need to know therefore is what can change that’s outside of our own control and what kind of change we might expect. At each visible level of the theoretical onion we will find these sources of variety. How certain of their behaviour we can be will vary – with a tendency to the less certain as we move further from the centre of the onion. We’ll need to decide how, if at all, we want to respond to each kind of variety.

But that will have to wait for my next blog. In the meantime, here are some ways people look at the onion.

[Figures: some ways people look at the onion]
Stuart Boardman is a Senior Business Consultant with KPN Consulting, where he leads the Enterprise Architecture practice and consults with clients on Cloud Computing, Enterprise Mobility and The Internet of Everything. He is Co-Chair of The Open Group Open Platform 3.0™ Forum, was Co-Chair of the Cloud Computing Work Group's Security for the Cloud and SOA project, and was a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by KPN, the Information Security Platform (PvIB) in The Netherlands and his previous employer CGI, as well as several Open Group white papers, guides and standards. He is a frequent speaker at conferences on the topics of Open Platform 3.0 and Identity.


ArchiMate® Users Group Meeting

By The Open Group

During a special ArchiMate® users group meeting on Wednesday, May 14 in Amsterdam, Andrew Josey, Director of Standards within The Open Group, presented on the ArchiMate certification program and adoption of the language. Andrew is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF® 9.1, ArchiMate 2.1, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4.

ArchiMate®, a standard of The Open Group, is an open and independent modeling language for Enterprise Architecture that is supported by different vendors and consulting firms. ArchiMate provides instruments to enable Enterprise Architects to describe, analyze and visualize the relationships among business domains in an unambiguous way. ArchiMate is not an isolated development. The relationships with existing methods and techniques, like modeling languages such as UML and BPMN, and methods and frameworks like TOGAF and Zachman, are well-described.

In this talk, Andrew provided an overview of the ArchiMate 2 certification program, including information on the adoption of the ArchiMate modeling language. He gave an overview of the major milestones in the development of ArchiMate and referred to the Dutch origins of the language. The Dutch Telematica Institute created the ArchiMate language in the period 2002-2004, and the language is now widespread. There have been over 41,000 downloads of different versions of the ArchiMate specification from more than 150 countries. At 52%, The Netherlands leads the "Top 10 Certifications by country"; the "Top 20 Downloads by country", however, is dominated by the USA (19%), followed by the UK (14%) and The Netherlands (12%).

One of the tools developed to support ArchiMate is Archi, a free open-source tool created by Phil Beauvoir at the University of Bolton in the UK. Since its development, Archi has grown from a relatively small, home-grown tool to a widely used open-source resource that averages 3,000 downloads per month and whose community ranges from independent practitioners to Fortune 500 companies. It is no surprise that, again, Archi is mostly downloaded in The Netherlands (17.67%), the United States (12.42%) and the United Kingdom (8.81%).

After these noteworthy facts and figures, Henk Jonkers took a deep dive into modeling risk and security. Henk Jonkers is a senior research consultant, involved in BiZZdesign’s innovations in the areas of Enterprise Architecture and engineering. He was one of the main developers of the ArchiMate language, an author of the ArchiMate 1.0 and 2.0 Specifications, and is actively involved in the activities of the ArchiMate Forum of The Open Group. In this talk, Henk showed several examples of how risk and security aspects can be incorporated in Enterprise Architecture models using the ArchiMate language. He also explained how the resulting models could be used to analyze risks and vulnerabilities in the different architectural layers, and to visualize the business impact that they have.

First, Henk described the limitations of current approaches: existing information security and risk management methods do not systematically identify potential attacks. They are based on checklists, heuristics and experience. Security controls are applied in a bottom-up way and are not based on a thorough analysis of risks and vulnerabilities. There is no explicit definition of security principles and requirements. Existing approaches focus only on IT security, and they have difficulties in dealing with complex attacks on socio-technical systems that combine physical and digital access with social engineering. They also focus on preventive security controls, while corrective and curative controls are not considered. Security by Design is a must, and there is always a trade-off between risk and process criticality.

Henk then gave some arguments as to why ArchiMate provides the right building blocks for a solid risk and security architecture. ArchiMate is widely accepted as an open standard for modeling Enterprise Architecture, and support is widely available. It is also suitable as a basis for qualitative and quantitative analysis. And last but not least, there is a good fit with other Enterprise Architecture and security frameworks (TOGAF, Zachman, SABSA).
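To give a feel for what analyzing the business impact of a vulnerability across the layers can mean, here is a deliberately tiny stand-in for such an analysis. The element names and the bare dependency graph are our invention; a real ArchiMate model distinguishes many more element and relationship types:

    # Simplified layered model: each element lists the elements it depends on.
    dependencies = {
        "Claims Handling (business process)": ["Claims App (application)"],
        "Claims App (application)":           ["App Server (technology)"],
        "Customer Portal (application)":      ["App Server (technology)"],
        "App Server (technology)":            [],
    }

    def impacted_by(vulnerable, deps):
        """Everything that directly or transitively depends on the
        vulnerable element is potentially impacted."""
        impacted = set()
        changed = True
        while changed:
            changed = False
            for element, needs in deps.items():
                if element not in impacted and (
                    vulnerable in needs or impacted & set(needs)
                ):
                    impacted.add(element)
                    changed = True
        return impacted

    print(impacted_by("App Server (technology)", dependencies))
    # -> both applications and, through them, the claims business process

The value of doing this on a full architecture model is exactly what Henk described: a technology-level vulnerability can be traced upward to the business processes whose impact the organization actually cares about.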

"The nice thing about standards is that there are so many to choose from", emeritus professor Andrew Stuart Tanenbaum once said. Using this quote as a starting point, Gerben Wierda focused his speech on the relationship between the ArchiMate language and Business Process Model and Notation (BPMN). In particular, he discussed Bruce Silver's BPMN Method and Style. He stated that ArchiMate and BPMN can exist side by side. Why would you link BPMN and ArchiMate? According to Gerben, there is a fundamental vision behind all of this: "There are unavoidably many 'models' of the enterprise that are used. We cannot reduce that to one single model because of fundamentally different uses. We cannot even reduce that to a single meta-model (or pattern/structure) because of fundamentally different requirements. Therefore, what we need to do is look at the documentation of the enterprise as a collection of models with different structures. And what we thus need to do is make this collection coherent."

Gerben is Lead Enterprise Architect of APG Asset Management, one of the largest Fiduciary Managers (± €330 billion Assets under Management) in the world, with offices in Heerlen, Amsterdam, New York, Hong Kong and Brussels. He has overseen the construction of one of the largest single ArchiMate models in the world to date and is the author of the book "Mastering ArchiMate", based on his experience in large-scale ArchiMate modeling. In his speech, Gerben showed how the leading standards ArchiMate and BPMN (an OMG standard) can be used together, creating one structured, logically coherent and automatically synchronized description that combines architecture and process details.

Marc Lankhorst, Managing Consultant and Service Line Manager Enterprise Architecture at BiZZdesign, presented on the topic of capability modeling in ArchiMate. As an internationally recognized thought leader on Enterprise Architecture, he guides the development of BiZZdesign's portfolio of services, methods, techniques and tools in this field. Marc is also active as a consultant in government and finance. In the past, he has managed the development of the ArchiMate language for Enterprise Architecture modeling, now a standard of The Open Group. Marc is a certified TOGAF 9 Enterprise Architect and holds an MSc in Computer Science from the University of Twente and a PhD from the University of Groningen in the Netherlands.

In his speech, Marc discussed different notions of "capability" and outlined the ways in which these might be modeled in ArchiMate. In short, a business capability is something an enterprise does or can do, given the various resources it possesses. Marc described the use of capability-based planning as a way of translating enterprise strategy into architectural choices, and looked ahead at potential extensions of ArchiMate for capability modeling. Business capabilities provide a high-level view of the current and desired abilities of the organization, in relation to its strategy and environment. Enterprise Architecture practitioners design extensive models of the enterprise, but these are often difficult to communicate to business leaders. Capabilities form a bridge between the business leaders and the Enterprise Architecture practitioners. They are very helpful in business transformation and are the rationale behind capability-based planning, he concluded.
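As a purely illustrative sketch of capability-based planning (the capability names, the 1-5 maturity scale and the gap rule are invented here and come from no ArchiMate or Open Group specification), the core idea reduces to scoring current against desired capability levels and investing where the gaps are largest:

    # Hypothetical capability register: name -> (current, desired) maturity
    # on an invented 1-5 scale, where 'desired' encodes the strategy.
    capabilities = {
        "Customer Onboarding":  (2, 4),
        "Claims Processing":    (3, 3),
        "Data Analytics":       (1, 4),
        "Regulatory Reporting": (4, 4),
    }

    def plan_investments(caps):
        """Rank capabilities by maturity gap, largest first: a crude
        translation of strategy into investment and architecture choices."""
        gaps = {name: desired - current
                for name, (current, desired) in caps.items()
                if desired > current}
        return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)

    for name, gap in plan_investments(capabilities):
        print(f"invest in {name}: close a gap of {gap} maturity level(s)")
    # -> Data Analytics first (gap 3), then Customer Onboarding (gap 2)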

For more information on ArchiMate, please visit:

http://www.opengroup.org/subjectareas/enterprise/archimate

For information on the Archi tool, please visit: http://www.archimatetool.com/

For information on joining the ArchiMate Forum, please visit: http://www.opengroup.org/getinvolved/forums/archimate


The Open Group Summit Amsterdam 2014 – Day Three Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

May 14, day three of The Open Group Summit Amsterdam, was another busy day for our attendees and presenters. Tracks included ArchiMate®, The Open Group Open Platform 3.0™ - Big Data, Open CITS, TOGAF®, Architecture Methods and Professional Development.

Mark Skilton, Professor of Practice, Information Systems Management, Warwick Business School, UK, presented "Creating Value in the Digital Economy". Skilton discussed how digital media in social networks, mobile devices, sensors and the explosion of big data and cloud computing networks is interconnecting potentially everything everywhere, amounting to a new digital ecosystem. These trends have significantly enhanced the importance of IT in its role and impact on business and market value locally, regionally and globally.

Other notable speakers included Thomas Obitz, Principal Advisor, KPMG LLP, UK, and Paul Bonnie, Head of Architecture Office, ING, The Netherlands, who shared how standards such as TOGAF®, an Open Group standard, are necessary and effective in the financial services industry.

During a special users group meeting in the evening, Andrew Josey, Director of Standards within The Open Group, presented the ArchiMate certification program and adoption of the language. Andrew is currently managing the standards process for The Open Group, and has recently led the standards development projects for TOGAF® 9.1, ArchiMate 2.1, IEEE Std 1003.1-2008 (POSIX), and the core specifications of the Single UNIX Specification, Version 4.

Andrew provided an overview of the ArchiMate 2 certification program, including information on the adoption of the ArchiMate modeling language. He discussed the major milestones in the development of ArchiMate and referred to the Dutch origins of the language. The ArchiMate language was developed beginning in 2002 and is now widespread. There have been over 41,000 downloads of ArchiMate specifications from more than 150 countries.

Henk Jonkers, senior research consultant involved in BiZZdesign’s innovations in Enterprise Architecture (EA) and one of the main developers of the ArchiMate language, took a deep dive into modeling risk and security.

Henk Jonkers, BiZZdesign

As a final farewell from Amsterdam, a special thanks goes to our sponsors and exhibitors during this dynamic summit: BiZZdesign, MEGA, ARCA Strategic Group, Good e-Learning, Orbus Software, Corso, Van Haren, Metaplexity, Architecting the Enterprise, Biner and the Association of Enterprise Architects (AEA).

For those of you who attended the Summit, please give us your feedback! https://www.surveymonkey.com/s/AMST2014

Stay tuned for Summit proceedings to be posted soon!  See you at our event in Boston, Massachusetts July 21-22!

Filed under ArchiMate®, Certifications, Conference, Enterprise Architecture, Enterprise Transformation, Open CITS, Open Platform 3.0, Standards, TOGAF®, Uncategorized

The Open Group Summit Amsterdam 2014 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

The Open Group Summit Amsterdam, held at the historic Hotel Krasnapolsky, began on Monday, May 12 by highlighting how the industry is moving further towards Boundaryless Information Flow™. After the successful introduction of The Open Group Healthcare Forum in San Francisco, the Governing Board is now considering other vertical Forums such as the airline industry and utilities sector.

The morning plenary began with a welcome from Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects (AEA). He mentioned that Amsterdam has a special place in his heart because of memories of the 2001 event, also held in Amsterdam just one month after the 9/11 attacks that shocked the world. Today, with almost 300 registrations and attendees from 29 different countries, The Open Group is still appealing to a wide range of nationalities.

Allen Brown, President and CEO of The Open Group, took the audience on a journey as he described the transformation that The Open Group has undergone over the last thirty years, from its inception in 1984. After a radical financial reorganization and the raising of new working capital, The Open Group is flourishing more than ever and is in good financial health.

Remarkably, 40 percent of the staff from 1984 are still working for The Open Group. What is the secret? Having the right people in the boat, with shared values and commitment. “In 2014, The Open Group runs a business, but stays a not-for-profit organization, a consortium”, Brown emphasized. “Enterprise Architecture is not a commercial vehicle or a ‘trendy’ topic. The Open Group always has a positive attitude and will never criticize other organizations. Our certification programs are a differentiator compared to other organizations. We collaborate with other consortia and standards bodies like ISO and ITIL”, Brown said.

Now the world is much more complex. Technology risk is increasing. A common language based on common standards is needed more than ever. TOGAF®, an Open Group standard, was in its infancy in 1998 and now it is the common standard for Enterprise Architects all over the world. In 1984, the UNIX® platform was the first platform of The Open Group. The Open Group Open Platform 3.0™, launched last year, focuses on new and emerging technology trends like mobility, big data, cloud computing and the Internet of Things converging with each other and leading to new business models and system designs. “The Open Group is all about building relationships and networking”, Brown concluded.

Leonardo Ramirez, CEO of ARCA SG and Chair of AEA Colombia, talked about the role of interoperability and Enterprise Architecture in Latin America. Colombia is now a safe country and has the strongest economy in the region. In 2011, Colombia promoted electronic government, and TOGAF was selected as the best choice for Enterprise Architecture. Ramirez is determined to stimulate social and economic development projects in Latin America with the help of Enterprise Architecture. A law in Colombia (Regulation Law 1712, 2014) gives every citizen the right to access all public information without boundaries.

Dr. Jonas Ridderstråle, Chairman, Mgruppen and Visiting Professor, Ashridge (UK) and IE Business Schools (Spain), said in his keynote speech, “Womenomics rules – the big winners of the personal freedom movement will be women. Women are far more risk averse. What would have happened with Lehman Brothers if it had been managed by women? ‘Lehman Sisters’ would probably have had the potential to survive. Now women can spend 80 percent of their time on things other than just raising kids.” Ridderstråle discussed life-changing and game-changing events throughout his presentation, noting that The Open Group Open Platform 3.0, for instance, is a good example of a successful reinvention.

“Towards a European Interoperability Architecture” was the title of one of the afternoon sessions, led by Mr. R. Abril Jimenez. Analysis during the first phase of the European Interoperability Strategy (EIS) found that, at the conceptual level, architecture guidelines were missing or inadequate. In particular, there are no architectural guidelines for cross-border interoperability of building blocks. Concrete, reusable interoperability guidelines, and rules and principles on standards and architecture, are also lacking. Based on the results achieved and direction set in the previous phases of the action, the EIA project has moved into a more practical phase that consists of two main parts: Conceptual Reference Architecture and Cartography.

Other tracks featured Healthcare, Professional Development and Dependability through Assuredness™.

The evening concluded with a lively networking reception in the hotel’s Winter Garden ballroom.

For those of you who attended the summit, please give us your feedback!  https://www.surveymonkey.com/s/AMST2014

Filed under Uncategorized, Enterprise Architecture, TOGAF®, Standards, Enterprise Transformation, Conference, Professional Development, Healthcare, Open Platform 3.0, Dependability through Assuredness™, Boundaryless Information Flow™

ArchiMate® Q&A with Phil Beauvoir

By The Open Group

The Open Group’s upcoming Amsterdam Summit in May will feature a full day on May 14 dedicated to ArchiMate®, an open and independent modeling language for Enterprise Architecture, supported by tools that allow Enterprise Architects to describe, analyze and visualize relationships among business domains in an unambiguous way.

One of the tools developed to support ArchiMate is Archi, a free, open-source tool created by Phil Beauvoir at the University of Bolton in the UK as part of a Jisc-funded Enterprise Architecture project that ran from 2009-2012. Since its development, Archi has grown from a relatively small, home-grown tool to become a widely used open-source resource that averages 3,000 downloads per month and whose community ranges from independent practitioners to Fortune 500 companies. Here we talk with Beauvoir about how Archi was developed, the problems inherent in sustaining an open source product, its latest features and whether it was named after the Archie comic strip.

Beauvoir will be a featured speaker during the ArchiMate Day in Amsterdam.

Tell us about the impetus for creating the Archi tool and how it was created…
My involvement with the ArchiMate language has mainly been through the development of the software tool, Archi. Archi has, I believe, acted as a driver and as a hub for activity around the ArchiMate language and Enterprise Architecture since it was first created.

I’ll tell you the story of how Archi came about. Let’s go back to the end of 2009. At that point, I think ArchiMate and Enterprise Architecture were probably being used quite extensively in the commercial sector, especially in The Netherlands. The ArchiMate language had been around for a while at that point but was a relatively new thing to many people, at least here in the UK. If you weren’t part of the EA scene, it would have been a new thing to you. In the UK, it was certainly new for many in higher education and universities, which is where I come in.

Jisc, the UK funding body, funded a number of programs in higher education exploring digital technologies and other initiatives. One of the programs being funded was to look at how to improve systems using Enterprise Architecture within the university sector. Some of the universities had already been led to ArchiMate and Enterprise Architecture and were trying it out for themselves – they were new to it and, of course, one of the first things they needed was tools. At that time, and I think it’s still true today, a lot of the tools were quite expensive. If you’re a big commercial organization, you might be able to afford the licensing costs for tools and support, but for a small university project it can be prohibitive, especially if you’re just dipping your toe into something like this. So some colleagues within Jisc and the university I worked at said, ‘well, what about creating a small, open source tool which isn’t over-complicated but does enough to get people started in ArchiMate? We can fund six months of work to do this as a proof-of-concept tool’.

That takes us into 2010, when I was working for the university that was approached to do this work. After six months, by June 2010, I had created the first 1.0 version of Archi and it was (and still is) free, open source and cross-platform. Some of the UK universities said ‘well, that’s great, because now the barrier to entry has been lowered, we can use this tool to start exploring the ArchiMate language and getting on board with Enterprise Architecture’. That’s really where it all started.

So some of the UK universities that were exploring ArchiMate and Enterprise Architecture had a look at this first version of Archi, version 1.0, and said ‘it’s good because it means that we can engage with it without committing at this stage to the bigger tooling solutions.’ You have to remember, of course, that universities were (and still are) a bit strapped for cash, so that’s a big issue for them. At the time, and even now, there really aren’t any other open-source or free tools doing this. That takes us to June 2010. At this point we got some more funding from the Jisc, and kept on developing the tool and adding more features to it. That takes us through 2011 and then up to the end of 2012, when my contract came to an end.

Since the official funding ended and my contract finished, I’ve continued to develop Archi and support the community that’s built up around it. I had to think about the sustainability of the software beyond the project, and sometimes this can be difficult, but I took it upon myself to continue to support and develop it and to engage with the Archi/ArchiMate community.

How did you get involved with The Open Group and bringing the tool to them?
I think it was inevitable really due to where Archi originated, and because the funding came from the Jisc, and they are involved with The Open Group. So, I guess The Open Group became aware of Archi through the Jisc program and then I became involved with the whole ArchiMate initiative and The Open Group. I think The Open Group is in favor of Archi, because it’s an open source tool that provides a neutral reference implementation of the ArchiMate language. When you have an open standard like ArchiMate, it’s good to have a neutral reference model implementation.

How is this tool different from other tools out there and what does it enable people to do?
Well, firstly Archi is a tool for modeling Enterprise Architecture using the ArchiMate language and notation, but what really makes it stand out from the other tools is its accessibility and the fact that it is free, open source and cross-platform. It can do a lot of, if not all of, the things that the bigger tools provide without any financial or other commitment. However, free is not much use if there’s no quality. One thing I’ve always strived for in developing Archi is to ensure that even if it only does a few things compared with the bigger tools, it does those things well. I think with a tool that is free and open-source, you have a lot of support and good-will from users who provide positive encouragement and feedback, and you end up with an interesting open development process.

I suppose you might regard Archi’s relationship to the bigger ArchiMate tools in the same way as you’d compare Notepad to Microsoft Word. Notepad provides the essential writing features, but if you want the real McCoy then you go and buy Microsoft Word. The funny thing is, this is where Archi was originally targeted – at beginners, getting people to start to use the ArchiMate language. But then I started to get emails — even just a few months after its first release — from big companies, insurance companies and the like saying things like ‘hey, we’re using this tool and it’s great’, ‘thanks for this, when are we going to add this or that feature?’ or ‘how many more features are you going to add?’ This surprised me somewhat, since I wondered why they hadn’t invested in one of the available commercial tools. Perhaps ArchiMate, and even Enterprise Architecture itself, was new to these organizations and they were using Archi as their first software tool before moving on to something else. Having said that, there are some large organizations out there that do use Archi exclusively.

Which leads to an interesting dilemma — if something is free, how do you continue developing and sustaining it? This is an issue that I’m contending with right now. There is a PayPal donation button on the front page of the website, but the software is open source and, in its present form, will remain open source; but how do you sustain something like this? I don’t have the complete answer right now.

Given that it’s a community product, it helps that the community contributes ideas and develops code, but at the same time you still need someone to give their time to coordinate all of the activity and support. I suppose the classic model is one of sponsorship, but we don’t have that right now, so at the moment I’m dealing with issues around sustainability.

How much has the community contributed to the tool thus far?
The community has contributed a lot in many different ways. Sometimes a user might find a bug and report it or they might offer a suggestion on how a feature can be improved. In fact, some of the better features have been suggested by users. Overall, community contributions seem to have really taken off more in the last few months than in the whole lifespan of Archi. I think this may be due to the new Archi website and a lot more renewed activity. Lately there have been more code contributions, corrections to the documentation and user engagement in the future of Archi. And then there are users who are happy to ask ‘when is Archi going to implement this big feature, and when is it going to have full support for repositories?’ and of course they want this for free. Sometimes that’s quite hard to accommodate, because you think ‘sure, but who’s going to do all this work and contribute the effort.’ That’s certainly an interesting issue for me.

How many downloads of the tool are you getting per month? Where is it being used?
At the moment we’re seeing around 3,000 downloads a month of the tool — I think that’s a lot actually. Also, I understand that some EA training organizations use Archi for their ArchiMate training, so there are quite a few users there, as well.

The number one country for downloading the app and visiting the website is the Netherlands, followed by the UK and the United States. In the past three months, the UK and The Netherlands have been about equal in numbers in their visits to the website and downloads, followed by the United States, France, Germany, Canada, then Australia, Belgium, and Norway. We have some interest from Russia too. Sometimes it depends on whether ArchiMate or Archi is in the news at any given time. I’ve noticed that when there’s a blog post about ArchiMate, for example, you’ll see a spike in the download figures and the number of people visiting the website.

How does the tool fit into the overall schema of the modeling language?
It supports all of the ArchiMate language concepts, and I think it offers the core functionality you’d want from an ArchiMate modeling tool — the ability to create diagrams, viewpoints, analysis of model objects, reporting, color schemes and so on. Of course, the bigger ArchiMate tools will let you manipulate the model in more sophisticated ways and create more detailed reports and outputs. This is an area that we are trying to improve, and the people who are now actively contributing to Archi are full-time Enterprise Architects who are able to contribute to these areas. For example, we have a user and contributor from France, and he and his team use Archi, so they are able to see first-hand where Archi falls short and they are able to say ‘well, OK, we would like it to do this, or that could be improved,’ so now they’re working towards strengthening any weak areas.

How did you come up with the name?
What happens is you have pet names for projects and I think it just came about that we started calling it “Archie,” like the guy’s name. When it was ready to be released I said, ‘OK, what should we really call the app?’ and by that point everyone had started to refer to it as “Archie.” Then somebody said ‘well, everybody’s calling it by that name so why don’t we just drop the “e” from the name and go with that?’ – so it became “Archi.” I suppose we could have spent more time coming up with a different name, but by then the name had stuck and everybody was calling it that. Funnily enough, there’s a comic strip called ‘Archie’ and an insurance company that was using the software at the time told me that they’d written a counterpart tool called ‘Veronica,’ named after a character in the comic strip.

What are you currently working on with the tool?
For the last few months, I’ve been adding new features – tweaks, improvements, tightening things up, engaging with the user community, listening to what’s needed and trying to implement these requests. I’ve also been adding new resources to the Archi website and participating on social media like Twitter, spreading the word. I think the use of social media is really important. Twitter, the User Forums and the Wikis are all points where people can provide feedback and engage with me and other Archi developers and users. On the development side of things, we host the code at GitHub, and again that’s an open resource that users and potential developers can go to. I think the key words are ‘open’ and ‘community driven.’ These social media tools, GitHub and the forums all contribute to that. In this way everyone, from developer to user, becomes a stakeholder – everyone can play their part in the development of Archi and its future. It’s a community product and my role is to try and manage it all.

What will you be speaking about in Amsterdam?
I think the angle I’m interested in is what can be achieved by a small number of people taking the open source approach to developing software and building and engaging with the community around it. For me, the interesting part of the Archi story is not so much about the software itself and what it does, but rather the strong community that’s grown around it, the extent of the uptake of the tool and the way in which it has enabled people to get on board with Enterprise Architecture and ArchiMate. It’s the accessibility and agility of this whole approach that I like and also the activity and buzz around the software and from the community – that for me is the interesting thing about this process.

For more information on ArchiMate, please visit:
http://www.opengroup.org/subjectareas/enterprise/archimate

For information on the Archi tool, please visit: http://www.archimatetool.com/

For information on joining the ArchiMate Forum, please visit: http://www.opengroup.org/getinvolved/forums/archimate

Phil Beauvoir has been developing, writing, and speaking about software tools and development for over 25 years. He was Senior Researcher and Developer at Bangor University, and, later, the Institute for Educational Cybernetics at Bolton University, both in the UK. During this time he co-developed a peer-to-peer learning management and groupware system, a suite of software tools for authoring and delivery of standards-compliant learning objects and meta-data, and tooling to create IMS Learning Design compliant units of learning. In 2010, working with the Institute for Educational Cybernetics, Phil created the open source ArchiMate Modelling Tool, Archi. Since 2013 he has been curating the development of Archi independently. Phil holds a degree in Medieval English and Anglo-Saxon Literature.

Filed under ArchiMate®, Certifications, Conference, Enterprise Architecture, Uncategorized

Q&A with Allen Brown, President and CEO of The Open Group

By The Open Group

Last month, The Open Group hosted its San Francisco 2014 conference themed “Toward Boundaryless Information Flow™.” Boundaryless Information Flow has been the pillar of The Open Group’s mission since 2002 when it was adopted as the organization’s vision for Enterprise Architecture. We sat down at the conference with The Open Group President and CEO Allen Brown to discuss the industry’s progress toward that goal and the industries that could most benefit from it now as well as The Open Group’s new Dependability through Assuredness™ Standard and what the organization’s Forums are working on in 2014.

The Open Group adopted Boundaryless Information Flow as its vision in 2002, and the theme of the San Francisco Conference has been “Towards Boundaryless Information Flow.” Where do you think the industry is at this point in progressing toward that goal?

Well, it’s progressing reasonably well, but the challenge is, of course, that when we established that vision back in 2002, life was a little less complex and a little less fast-paced. Although organizations are improving the way they act in a boundaryless manner – and of course that varies by industry – some industries still have big silos and stovepipes; they still have big boundaries. But generally speaking we are moving forward, and everyone understands the need for information to flow in a boundaryless manner, for people to be able to access and integrate information and to provide it to the teams that need it.

One of the keynotes on Day One focused on the opportunities within the healthcare industry, and The Open Group recently started a Healthcare Forum. Do you see the healthcare industry as a test case for Boundaryless Information Flow, and why?

Healthcare is one of the verticals that we’ve focused on. And it is not so much a test case as an area that absolutely needs information to flow in a boundaryless manner, so that everyone involved – from the patient through the administrator to the medical teams – has access to the right information at the right time. We know that in many situations there are shifts of medical teams, and from one medical team to another they don’t have access to the same information. Information isn’t easily shared between medical doctors, hospitals and payers. What we’re trying to do is to focus on the needs of the patient and improve the information flow so that you get better outcomes for the patient.

Are there other industries where this vision might be enabled sooner rather than later?

I think that we’re already making significant progress in what we call the Exploration, Mining and Minerals industry. Our EMMM™ Forum has produced an industry-wide model that is being adopted throughout that industry. We’re also looking at whether we can have an influence in the airline industry, automotive industry, manufacturing industry. There are many, many others, government and retail included.

The plenary on Day Two of the conference focused on The Open Group’s Dependability through Assuredness standard, which was released last August. Why is The Open Group looking at dependability and why is it important?

Dependability is ultimately what you need from any system. You need to be able to rely on that system to perform when needed. Systems are becoming more complex, they’re becoming bigger. We’re not just thinking about the things that arrive on the desktop, we’re thinking about systems like the barriers at subway stations or Tube stations, we’re looking at systems that operate any number of complex activities. And they bring an awful lot of things together that you have to rely upon.

Now in all of these systems, what we’re trying to do is minimize the amount of downtime, because downtime can result in financial loss or, at worst, the loss of human life, and we’re trying to focus on that. What is interesting about the Dependability through Assuredness Standard is that it brings together so many other aspects of what The Open Group is working on. Obviously the architecture is at the core, so it’s critical that there’s an architecture, and it’s critical that we understand the requirements of that system. It’s also critical that we understand the risks – that fits in with the work of the Security Forum and the work they’ve done on Risk Analysis and Dependency Modeling. Out of the dependency modeling we can get the use cases, so that we can understand where the vulnerabilities are, what action has to be taken if we identify a vulnerability, and what action needs to be taken in the event of a failure of the system. If we do that and assign accountability to people – who will do what by when in the event of an anomaly being detected or a failure happening – we can actually minimize that downtime or remove it completely.
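
To make that last idea concrete, here is a toy sketch of our own – not drawn from the Dependability through Assuredness™ Standard – in which the ‘who will do what by when’ assignments live in a pre-agreed table that anomaly detection can dispatch against immediately:

    # Toy sketch (invented for illustration, not from the standard): for each
    # anomaly, know in advance who does what, by when.
    ACCOUNTABILITY = {
        # anomaly: (owner, action, deadline in minutes)
        "primary-db-unreachable": ("DBA on call", "fail over to replica", 5),
        "card-auth-latency-high": ("payments team", "shed non-critical load", 10),
    }

    def on_anomaly(anomaly: str) -> None:
        """Dispatch a detected anomaly to its pre-assigned owner and action."""
        owner, action, deadline = ACCOUNTABILITY.get(
            anomaly, ("duty manager", "escalate for analysis", 15))
        print(f"{anomaly}: {owner} must '{action}' within {deadline} minutes")

    on_anomaly("primary-db-unreachable")
    on_anomaly("unknown-sensor-fault")   # unmapped anomalies still get an owner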

Now the other great thing about this is that it’s not only a focus on the architecture for the actual system development – and as the system changes over time, requirements change, legislation changes that might affect it, external changes, all of that goes into the system – but there’s also another cycle within that system that deals with failure, analyzes it and makes sure it doesn’t happen again. There have been so many instances of failure recently. In the UK, for example, a bank was recently unable to process debit or credit cards for customers for about three or four hours, probably because of routine work done over a weekend. If Dependability through Assuredness had been in place, that could have been averted; it could have saved an awful lot of difficulty for an awful lot of people.

How does the Dependability through Assuredness Standard also move the industry toward Boundaryless Information Flow?

It’s part of it. It’s critical that with big systems the information has to flow. But this is not so much about the information itself as about how a system is going to work in a dependable manner.

Business Architecture was another featured topic in the San Francisco plenary. What role can Business Architecture play in enterprise transformation vis-à-vis Enterprise Architecture as a whole?

A lot of people in the industry are talking about Business Architecture right now and trying to treat it as a separate discipline. We see it as a fundamental part of Enterprise Architecture. In fact, there are three legs to Enterprise Architecture: there’s Business Architecture; there’s the need for business analysts, who are critical to supplying the information; and then there are the solution architects and other architects – data, applications and so on – that are needed. All three legs are needed.

We find that there are two or three different types of Business Architect: those that use the analysis to understand what the business is doing in order to inform the solutions architects and other architects for the development of solutions; those that are more integrated with the business, who can understand what is going on and provide input into how that might be improved through technology; and those that can go a step further and say, ‘here are the advances in technology and here are the opportunities for advancing the competitiveness of our organization.’

What are some of the other key initiatives that The Open Group’s forum and work groups will be working on in 2014?

That kind of question is like getting an award – you’ve got to thank your friends, so apologies to anyone that I leave out. Let me start alphabetically with the Architecture Forum. The Architecture Forum is working on the evolution of TOGAF®; they’re also working on the harmonization of TOGAF with ArchiMate®, and they have a number of projects within that – Business Architecture, of course, is one of the projects going on in the Architecture space. The ArchiMate Forum is pushing ahead with ArchiMate – they’ve got two interesting activities going on at the moment. One is called ArchiMetals, which is going to be a sister publication to the ArchiSurance case study: where ArchiSurance provides an example of ArchiMate used in the insurance industry, ArchiMetals will be set in a manufacturing context, so there will be a whitepaper on that and there will be examples and artifacts that we can use. They’re also working on an ArchiMate standard for interoperability of modeling tools. There are four tools that are accredited and certified by The Open Group right now, and we’re looking for that interoperability to help organizations that have multiple tools, as many of them do.

Going down the alphabet, there’s DirecNet™. Not many people know about DirecNet, but it is work that we do with the U.S. Navy on standards for long range, high bandwidth mobile networking. Then we can go to the FACE™ Consortium, the Future Airborne Capability Environment. The FACE Consortium is working on the next version of its standard and toward accreditation and a certification program, and the uptake of that through procurement is absolutely amazing – we’re thrilled about that.

Healthcare we’ve talked about. The Open Group Trusted Technology Forum is working on how we can trust the supply chain for developed systems. They’ve released the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program, which was launched this week, and we already have one accredited vendor and two certified assessment labs. That is really exciting, because now we’ve got a way of helping any organization that has large complex systems developed through a global supply chain to make sure that they can trust their supply chain. And that is going to be invaluable to many industries, but also to the safety of citizens and the infrastructure of many countries. We are also planning to move the O-TTPS standard toward ISO standardization shortly.

The next one moving down the list would be Open Platform 3.0™. This is a really exciting part of Boundaryless Information Flow, it really is. This is talking about the convergence of SOA, Cloud, Social, Mobile, the Internet of Things and Big Data – bringing all of those activities together is really something that is critical right now, and we need to focus on it. In some of these areas, our Cloud computing standards have already gone to ISO and been adopted by ISO. We’re working right now on the next products that are going to move through. We have a governance standard in process, and an ecosystem standard has recently been published. In the area of Big Data there’s a whitepaper that’s 25 percent completed, and there’s also a lot of work on the definition of what Open Platform 3.0 is – this week the members have been working on trying to define Open Platform 3.0. One of the really interesting activities that’s gone on is that the members of the Open Platform 3.0 Forum have produced something like 22 different use cases, and they’re really good. They’re concise and precise, and they cover a number of different industries, including healthcare and others. The next stage is to look at those and work on the ROI of those use cases – the monetization, the value from them – and that’s really exciting; I’m looking forward to peeking at that from time to time.

The Real Time and Embedded Systems Forum (RTES) is next. Real-Time is where we incubated the Dependability through Assuredness™ Framework, which is continuing to develop, and that’s really good. The core focus of the RTES Forum is high assurance systems; they’re doing some work with ISO on that and on a lot of other areas such as multicore, and, of course, they have a number of EC projects around RTES in which we’re partnering with other organizations.

The Security Forum, as I mentioned earlier, has done a lot of work on risk and dependability. They have not only their standards for the Risk Taxonomy and Risk Analysis, but they’ve now also developed the Open FAIR Certification for People, which is based on those two standards. And we’re already starting to see people being trained and certified under the Open FAIR Certification Program that the Security Forum developed.

A lot of other activities are going on. Like I said, I probably left a lot of things out, but I hope that gives you a flavor of what’s going on in The Open Group right now.

The Open Group will be hosting a summit in Amsterdam May 12-14, 2014. What can we look forward to at that conference?

In Amsterdam we have a summit – that’s going to bring together a lot of things; it’s going to be a bigger conference than the one we had here. We’ve got a lot of activity going on across all of our Forums and Work Groups, and we’re going to bring together top-level speakers, so we’re looking forward to some interesting work during that week.

Filed under ArchiMate®, Boundaryless Information Flow™, Business Architecture, Conference, Cybersecurity, EMMMv™, Enterprise Architecture, FACE™, Healthcare, O-TTF, RISK Management, Standards, TOGAF®

Q&A with Jim Hietala on Security and Healthcare

By The Open Group

We recently spoke with Jim Hietala, Vice President, Security for The Open Group, at the 2014 San Francisco conference to discuss upcoming activities in The Open Group’s Security and Healthcare Forums.

Jim, can you tell us what the Security Forum’s priorities are going to be for 2014 and what we can expect to see from the Forum?

In terms of our priorities for 2014, we’re continuing to do work in Security Architecture and Information Security Management. In the area of Security Architecture, the big project that we’re doing is adding security to TOGAF®, so we’re working on the next version of the TOGAF standard and specification, and there’s an active project involving folks from the Architecture Forum and the Security Forum to integrate security into and stripe it through TOGAF. So, on the Security Architecture side, that’s the priority. On the Information Security Management side, we’re continuing to do work in the area of Risk Management. We introduced a certification late last year, the Open FAIR certification, and we’ll continue to do work in the area of Risk Management and Risk Analysis. We’re looking to add a second level to the certification program, and we’re doing some other work around the Risk Analysis standards that we’ve introduced.

The theme of this conference was “Towards Boundaryless Information Flow™”, and many of the tracks focused on convergence – the convergence of things like Big Data, mobile and Cloud, also known as Open Platform 3.0. How are those things affecting the realm of security right now?

I think they’re just beginning to. Cloud – obviously the security issues around Cloud have been with us for as long as Cloud has, over the past four or five years. But if you look at things like the Internet of Things and some of the other things that comprise Open Platform 3.0, the security impacts are really just starting to be felt and considered. So I think information security professionals are really just starting to wrap their heads around what those new security risks that come with those technologies are and, more importantly, what we need to do about them. What do we need to do to mitigate risk around something like the Internet of Things, for example?

What kind of security threats do you think companies need to be most worried about over the next couple of years?

There’s a plethora of things out there right now that organizations need to be concerned about. Certainly advanced persistent threat, the idea that maybe nation states are trying to attack other nations, is a big deal. It’s a very real threat, and it’s something that we have to think about – looking at the risks we’re facing, exactly what is that adversary and what are they capable of? I think profit-motivated criminals continue to be on everyone’s mind with all the credit card hacks that have just come out. We have to be concerned about cyber criminals who are profit motivated and who are very skilled and determined and obviously there’s a lot at stake there. All of those are very real things in the security world and things we have to defend against.

The Security track at the San Francisco conference focused primarily on risk management. How can companies better approach and manage risk?

As I mentioned, we did a lot of work over the last few years in the area of Risk Management, and the FAIR Standard that we introduced breaks down risk into what’s the frequency of bad things happening and what’s the impact if they do happen. So I would suggest taking that sort of approach – using the Risk Taxonomy Standard and the Risk Analysis Standard that we’ve introduced – and really looking at what the critical assets to protect are, who’s likely to attack them, and what’s the probable frequency of attacks that we’ll see. And then looking at the impact side: what’s the consequence if somebody successfully attacks them? That’s really the key – breaking it down, looking at it that way and then taking the right mitigation steps to reduce risk on those assets that are really important.
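
As a back-of-the-envelope illustration of that frequency-times-impact decomposition – a sketch only, with invented numbers, and not the Open FAIR standard itself – the expected annual loss for a single asset and threat can be estimated like this:

    # Illustrative sketch only -- not the Open FAIR standard itself. It
    # expresses "frequency of bad things happening" times "impact if they
    # do happen" as a simple Monte Carlo estimate; all numbers are invented.
    import random

    def expected_annual_loss(events_per_year, loss_low, loss_high, trials=100_000):
        """Estimate annual loss exposure for one asset and one threat."""
        total = 0.0
        for _ in range(trials):
            # Approximate event count: split the year into months and ask,
            # each month, whether a loss event occurs (valid for low rates).
            events = sum(random.random() < events_per_year / 12 for _ in range(12))
            # Each event causes a loss somewhere in the estimated range.
            total += sum(random.uniform(loss_low, loss_high) for _ in range(events))
        return total / trials

    # A hypothetical asset: attacked successfully about once every two years,
    # with losses estimated between $50,000 and $500,000 per event.
    print(f"Expected annual loss: ${expected_annual_loss(0.5, 50_000, 500_000):,.0f}")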

You’ve recently become involved in The Open Group’s new Healthcare Forum. Why a healthcare vertical forum for The Open Group?

In the area of healthcare, what we see is that there’s just a highly fragmented aspect to the ecosystem. You’ve got healthcare information that’s captured in various places, and the information doesn’t necessarily flow from provider to payer to other providers. In looking at industry verticals, the healthcare industry seemed like an area that really needed the approaches we bring from The Open Group – TOGAF® and the Enterprise Architecture approaches that we have.

If you take it up to a higher level, it really needs the Boundaryless Information Flow that we talk about in The Open Group. We need to get to the point where our information as patients is readily available in a secure manner to the people who need to give us care, as well as to us, because in a lot of cases the information exists as islands in the healthcare industry. In looking at healthcare, it just seemed like a natural place where – in our economies, and it’s really a global problem – a lot of money is spent on healthcare and there are a lot of opportunities for improvement, both in the economics and in the patient care that’s delivered to individuals through the healthcare system. It just seemed like a great area for us to focus on.

As the new Healthcare Forum kicks off this year, what are the priorities for the Forum?

The Healthcare Forum has just published a whitepaper summarizing the findings of the workshop that we held in Philadelphia last summer. We’re also working on a treatise, which will outline our views about the healthcare ecosystem and where standards and architecture work is most needed. We expect to have that produced over the next couple of months. Beyond that, we see a lot of opportunities for doing architecture and standards work in the healthcare sector, and our membership is going to determine which of those areas to focus on and which projects to initiate first.

For more on The Open Group Security Forum, please visit http://www.opengroup.org/subjectareas/security. For more on The Open Group Healthcare Forum, see http://www.opengroup.org/getinvolved/industryverticals/healthcare.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security, risk management and healthcare programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.

Filed under Cloud/SOA, Conference, Data management, Healthcare, Information security, Open FAIR Certification, Open Platform 3.0, RISK Management, TOGAF®, Uncategorized

One Year Later: A Q&A Interview with Chris Harding and Dave Lounsbury about Open Platform 3.0™

By The Open Group

The Open Group launched its Open Platform 3.0™ Forum nearly one year ago at the 2013 Sydney conference. Open Platform 3.0 refers to the convergence of new and emerging technology trends such as Mobile, Social, Big Data, Cloud and the Internet of Things, as well as the new business models and system designs these trends are pushing organizations toward due to the consumerization of IT and evolving user behaviors. The Forum was created to help organizations address the architectural and structural considerations that businesses must consider to take advantage of and benefit from this evolutionary shift in how technology is used.

We sat down with The Open Group CTO Dave Lounsbury and Open Platform 3.0 Director Dr. Chris Harding at the recent San Francisco conference to catch up on the Forum’s activities and progress since launch and what they’ll be working on during 2014.

The Open Group’s Forum, Open Platform 3.0, was launched almost a year ago in April of 2013. What has the Forum been working on over the past year?

Chris Harding (CH): We launched at the Sydney conference in April of last year. What we’ve done since then, first of all, was to look at the requirements for the platform, and we did this using the proven TOGAF® technique of the Business Scenario. Over the course of the summer of 2013, we developed a Business Scenario capturing the requirements for Open Platform 3.0, and that was published just before The Open Group conference in October. Following that conference, the main activity we’ve been doing is furthering the requirements space. We’ve been developing analysis of use cases; currently we have 22 different use cases that members of the Forum have put together, which illustrate the use of the convergent technologies and, most importantly, their use in combination with each other.

What we’re doing here at this meeting in San Francisco is deriving, from that basis of requirements and use cases, an understanding of what the platform fundamentally should be, because it is our intention to produce a Snapshot definition of the platform by the end of March. So in the first year of the Forum, we hope to finish that year by producing a Snapshot definition of Open Platform 3.0.

Dave Lounsbury (DL): First, the roots of the Open Platform go deeper. Prior to that we had a number of work groups in the areas of Cloud, SOA and Semantic Interoperability. All of those were early pieces, and what we saw at the beginning of 2013 was a coalescing of that into this concept that businesses were looking for a new platform for their operations – one that combined aspects of Social, Mobile, Cloud computing, Big Data and the analytics that go along with it. We saw that emerging in the marketplace, and we formed the Forum to develop that direction. The Open Group always takes an end-to-end view of any problem – we like to look at the whole ecosystem. We want to make sure that the technical standards aren’t just point targets and actually address a business need.

Some of the work groups within The Open Group – such as Quantum Lifecycle Management (QLM), Semantic Interoperability and, most notably, the Cloud Work Group – have been brought under the umbrella of Open Platform 3.0. How will the work of these groups continue under Platform 3.0?

CH: Some of the work already going on in The Open Group was directly or indirectly relevant to Open Platform 3.0. First and most important was the work of the Cloud Work Group – Cloud being one of the convergent technologies – and the Cloud Work Group became a part of Open Platform 3.0. Two other activities also became part of Open Platform 3.0. One of these was the Semantic Interoperability Work Group, because we recognized that Semantic Interoperability has to be an important part of how these technologies work with each other. We may not have a full definition of that in the first version of the standard – it’s a notoriously difficult area – but over the course of time we hope to incorporate a Semantic Interoperability component in the Platform definition, and that may well build on the work that we’ve been doing with the Universal Data Element Framework, the UDEF project, which is currently undergoing a major restructuring. The key thing from the Open Platform 3.0 perspective is how the semantic convention relates to the convergence of the technologies in the platform.

In terms of QLM, that became part of Open Platform 3.0 because one of the key convergent technologies is the Internet of Things, and QLM overlaps significantly with that. QLM is not about the Internet of Things as such, but it does have a strong component of understanding the way networked sensors and controls work, so that’s become an important contribution to the new Forum.

DL: Like in any platform, there are going to be multiple components. In Open Platform 3.0, one of the big drivers for this change is Big Data. Big Data is very trendy, right? But where does Big Data come from? Well, it comes from increased connectivity, increased use of mobile devices, increased use of sensors – the ‘Internet of Things.’ All of these things are generating data about usage patterns: where people are, what they’re doing, what they’re buying, what they’re interested in and what their likes and dislikes are, creating a massive flood of data. Now the question becomes ‘how do you compute on that data?’ You need to handle that massively scalable stream of data. You need massively scalable computing underneath it, and the ability to move large amounts of information from one place to another. When you think about the analysis of data like that, you have algorithms that do a lot of data access and have big spikes of computation as they create some model of it. If you’re going to look at 10 zillion records, you don’t want to buy enough computers so you can always look at 10 zillion records; you want to be able to turn that on, do your analysis and turn it back off. That’s, of course, why Cloud is a critical component of Open Platform 3.0.
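
That ‘turn it on, do your analysis and turn it back off’ pattern is straightforward to sketch. The following is illustrative only – FakeCloud is a stand-in for whatever provider SDK would actually be used, and all names are invented:

    # Schematic sketch of elastic, pay-per-use analysis; not a real SDK.
    from contextlib import contextmanager

    class FakeCloud:
        def provision(self, nodes):
            print(f"provisioned {nodes} nodes")   # start paying for capacity
            return {"nodes": nodes}
        def release(self, cluster):
            print("released cluster")             # stop paying when done

    @contextmanager
    def elastic_cluster(client, nodes):
        """Hold compute capacity only for the duration of the analysis."""
        cluster = client.provision(nodes)
        try:
            yield cluster
        finally:
            client.release(cluster)

    # Spin up capacity, run the big analysis, and let it go immediately,
    # instead of owning enough machines to scan "10 zillion records" 24/7.
    with elastic_cluster(FakeCloud(), nodes=100) as cluster:
        print(f"analyzing on {cluster['nodes']} nodes...")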

Open Platform 3.0 encompasses a lot of different technologies as well as how they are converging. How do you piece apart everything that Platform 3.0 entails to begin to formulate a standard for it?

CH: I mentioned that we developed 22 use cases. The way that we’re addressing this is to look at use cases and the business and technical ecosystems that those use cases exemplify and to abstract from that some fundamental architectural patterns. These we believe will be the basis for the initial definition of the platform.

DL: That gets back to the question about how we’re starting up. Again, it’s The Open Group’s mantra that we look at a business problem as an end-to-end problem. So what you’ll see in Open Platform 3.0 is that we’ve done the Business Scenario to figure out what the business motivator is and what business people need to get this done, and we’re fleshing that out with these detailed use cases.

One of the things that we’re very careful about in The Open Group is that we don’t replicate what’s going on in other standards bodies. If you look at what’s going on in Cloud, and what continues to go on in Cloud under the Open Platform 3.0 banner, we really focused in on what business people need – the Cloud guides address how business people really use it. We’ve stayed away for a long time from the bits and bytes – we’re now doing a Cloud Reference Architecture – but we’ve also created the Cloud Ecosystem Reference Model, which was just published. That Cloud Ecosystem Reference Model, if you read through it, isn’t about how bits flow around; it’s about how partners interact with each other – what to look for in your Cloud partner, who are the players? When you go to use Cloud in your business, what players do you have to engage with? What are the roles that you have to engage with them on? So again, it’s really that business level of guidance that The Open Group is really good at, and we liaise with other organizations to get technical material if we need it – or, if not, we’ll create it ourselves because we’ve got very competent technical people – but again, it’s that balanced business approach that distinguishes The Open Group way.

Many industry pundits have said that Open Platform 3.0 is ultimately about a shift toward user-driven IT. How does that change the standards-making process, when most standards are ultimately put in place by technologists, not end-users?

CH: It’s an interesting question. I mentioned the Business Scenario that we developed over the summer – one of the key things that came out of that was that there is this shift towards a more direct use of the technologies by business users. And that is partly because it’s becoming more possible. Cloud is one of the key factors that has shortened the cycle of procuring and putting IT in place to support business use, and has made it more possible to manage IT directly. At the same time, [users are] becoming impatient with delay and wanting to gain the benefits of technology directly rather than at arm’s length through the IT department. In connection with this, we’re seeing phenomena such as the business technologist – the technical specialist who works with, or is employed by, the business department rather than within a separate IT department, and one of whose key strengths is an understanding of the business. So that is certainly an important dimension that we’re seeing, and one of the requirements for the Platform is that it should be usable in an environment where business is using IT more directly.

But that wasn’t the question you asked. The question was, ‘isn’t it a problem that the standards are defined by technologists?’ We don’t believe it’s a problem, provided that the technologists have an understanding of the business environment. That is why, in the Business Scenario activity that we conducted, one of the key inputs was a roundtable workshop with CIO-level people, and that is where a lot of our perspective on why things are changing comes from. Open Platform 3.0 certainly does have a dimension of fundamental architecture patterns, part of which is business architecture patterns, but it also has a technical dimension, and obviously you do need the technical people to explore that dimension – though they always need to keep in mind that the technology is there to serve the business.

DL: If you actually look at trends in the marketplace about how IT is done – and in fact if you look at the last blog post that Allen [Brown] did about agile – the whole thrust of agile methodologies and their successor, DevOps, is to get the implementers right next to the business people and have a very tight arrangement in order to get fast iteration and really have the implementer do what the business person needs. I view consumerization not as some outside threat but as a logical extension of that trend. What’s happening, in my opinion, is that people who are not technologists, who are not part of the IT department, are getting comfortable using and managing their own technology. And so they’re making decisions that used to be made by the IT department years ago – or by what used to be the IT department. First there was the big mainframe, and you handed in your cards at a window and you got your printout in your little cubby hole. Then the IT department bought your PC, and now we bring our own devices. There’s nothing wrong with that – that’s people getting comfortable with technology and making decisions. I think that’s one of the reasons we need an Open Platform 3.0 approach – to develop business guidance and eventually technical standards on how we keep up with that trend. Because it’s a very natural trend – people want to control the resources they need to get their job done, and if those resources are technical resources, and they’re comfortable doing that, great!

Convergence and Open Platform 3.0 seem to take us closer and closer to The Open Group’s vision of Boundaryless Information Flow™.  Is Open Platform 3.0 the fulfillment of that vision?

DL: I think I’d be crazy to say that it’s the endpoint of that vision. Being able to move large amounts of data and make decisions on it is a significant step forward in Boundaryless Information Flow, but this is a two-edged sword. I talked about all that data being generated by mobile devices and sensors and retail networks and social networks and things like that. That data is growing exponentially. The number of people who can make decisions on that data is growing at best linearly, and not very quickly. So if there’s all this data out there and nobody to look at it, we need to ask whether we have lowered the boundary for communications or actually raised it by creating a pile of data that no one can climb. That’s why I think a next step is, in fact, more machine-assisted analytics, predictive analytics and machine learning that will help humans digest and understand that data. That will be, I think, yet another step toward Boundaryless Information Flow. Moving bits around does not equate to information flow – it’s only information when it moves from data to being information in a human’s brain. Until we lower that barrier as well, we’re not there. And even beyond that, there are still lots of things that can be done in terms of breaking down human language barriers and using social networks in more intuitive ways. I think there’s a long way to go. This is a really important step forward, but fulfillment is too strong a word.

CH: Not in itself, I don’t believe. It is a major contribution towards the vision of Boundaryless Information Flow, but it is not the complete fulfillment of that vision. Since we formulated the problem statement of Boundaryless Information Flow, there have been a number of developments that have impacted it and maybe helped to bring it closer. You might think of SOA as an important enabling technology for Boundaryless Information Flow, replacing the information silos with interacting services. Now we’re seeing Open Platform 3.0, which is certainly going to have a service-oriented flavor, shall we say, although it probably will not look exactly like traditional SOA. The Boundaryless Information Flow requirement was a very far-reaching problem statement. The Interoperable Business Scenario was where it was first set out, and since then we’ve been gradually making progress toward it. Open Platform 3.0 will bring it closer, but I’m sure there will be other things still needed to make it happen.

One of the key things for Boundaryless Information Flow is Enterprise Architecture. Within a particular enterprise, the business and IT need to be architected to enable Boundaryless Information Flow, and TOGAF is the method defined and maintained by The Open Group for how enterprises define Enterprise Architectures. Open Platform 3.0 will complement that by providing a ‘this is what an architecture looks like that enables the business to take advantage of these new converging technologies.’ But there will still be a need for the Enterprise Architect to put that together with the other particular factors involved in an enterprise to create an architecture for Boundaryless Information Flow within that enterprise.

When can we expect the first standard from Open Platform 3.0?

DL: Well, we published the Cloud Ecosystem Reference Model, and again, the understanding of how business partners relate in the Cloud world is a key component of Open Platform 3.0. The Forum has a roadmap and will start publishing the case studies still in process.

The message, I would say, is that there’s already early value in the Cloud Ecosystem Reference Model, which is a logical continuation of Cloud work that had already gone on in the Work Group but is now part of the Forum, as part of Open Platform 3.0.

CH: That’s always a tricky question however I can tell you what is planned. The intention, as I said, was to produce a Snapshot definition by the end of March and, given we are a quarter of the way through the meeting at this conference, which is the key meeting that will define the basis for that, the progress has been good so far, so I’m optimistic. A Snapshot is not a Standard. A Snapshot is a statement of ‘this is what we are thinking and might be what it will look like,’ but it’s not guaranteed in any way that the Standard will follow the Snapshot. We are intending to produce the first Standard definition of the platform in about a year’s time after the Snapshot.  That will give the opportunity for people not only within The Open Group but outside The Open Group to give us input and further understanding of the way people intend to use the platform as feedback on the snapshot, which should be the basis for the first published standard.

For more on the Open Platform 3.0 Forum, please visit: http://www3.opengroup.org/subjectareas/platform3.0.

If you have any questions about Open Platform 3.0 or if you would like to join the new Forum, please contact Chris Harding (c.harding@opengroup.org) for queries regarding the Forum or Chris Parnell (c.parnell@opengroup.org) for queries regarding membership.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Open Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.

Dave Lounsbury is Chief Technical Officer (CTO) and Vice President, Services for The Open Group. As CTO, he ensures that The Open Group’s people and IT resources are effectively used to implement the organization’s strategy and mission. As VP of Services, Dave leads the delivery of The Open Group’s proven processes for collaboration and certification both within the organization and in support of third-party consortia. Dave holds a degree in Electrical Engineering from Worcester Polytechnic Institute, and is holder of three U.S. patents.


Filed under Cloud, Cloud/SOA, Conference, Open Platform 3.0, Standards, TOGAF®

Facing the Challenges of the Healthcare Industry – An Interview with Eric Stephens of The Open Group Healthcare Forum

By The Open Group

The Open Group launched its new Healthcare Forum at the Philadelphia conference in July 2013. The forum’s focus is on bringing Boundaryless Information Flow™ to the healthcare industry to enable data to flow more easily throughout the complete healthcare ecosystem through a standardized vocabulary and messaging. Leveraging the discipline and principles of Enterprise Architecture, including TOGAF®, the forum aims to develop standards that will result in higher quality outcomes, streamlined business practices and innovation within the industry.

At the recent San Francisco 2014 conference, Eric Stephens, Enterprise Architect at Oracle, delivered a keynote address entitled “Enabling the Opportunity to Achieve Boundaryless Information Flow” along with Larry Schmidt, HP Fellow at Hewlett-Packard. A veteran of the healthcare industry, Stephens was Senior Director of Enterprise Architecture at Excellus BlueCross BlueShield prior to joining Oracle, and he is an active member of the Healthcare Forum.

We sat down after the keynote to speak with Stephens about the challenges of healthcare, how standards can help realign the industry and the goals of the forum. The opinions expressed here are Stephens’ own, not those of his employer.

What are some of the challenges currently facing the healthcare industry?

There are a number of challenges, and I think when we look at it as a U.S.-centric problem, there’s a disproportionate amount of spending taking place in the U.S. For example, if you look at healthcare expenditures as a percentage of GDP, we’re now looking at probably 18 percent of GDP [in the U.S.], while other developed countries are spending a full 5 percent of GDP less than that, and in some cases they’re getting better outcomes outside the U.S.

The mere existence of what we call “medical tourism” – where, if I need a hip replacement, I can get it done for a fraction of the cost in another country with the same or better quality of care, have a vacation – a rehab vacation – at the same time, and bring along a spouse or significant other – means there’s a really wide range of disparity there.

There’s also a lack of transparency. Having worked at an insurance company, I can tell you that with the advent of high deductible plans, there’s a need for additional cost information. When I go on Amazon or go to a local furniture store, I know what the cost is going to be for what I’m about to purchase. In the healthcare system, we don’t get that. With high deductible plans, if I’m going to be responsible for a portion or a larger portion of the fee, I want to know what it is. And what happens is, the incentives to drive costs down force the patient to be a consumer. The consumer now asks the tough questions. If my daughter’s going in for a tonsillectomy, show me a bill of materials that shows me what’s going to be done – if you are charging me $20/pill for Tylenol, I’ll bring my own. Increased transparency is what will in turn drive down the overall costs.

I think there’s one more thing, and this gets into the legal side of things. There is an exorbitant amount of legislation and regulation around what needs to be done. And because every time something goes sideways, there’s going to be a lawsuit, doctors will prescribe an extra test, and extra X-ray for a patient whether they need it or not.

The healthcare system is designed around a vicious cycle of diagnose-treat-release. It’s not incentivized to focus on prevention and management. Oregon is promoting coordinated care organizations (CCOs) that act as an intermediary working with all medical professionals – whether physical, mental, dental, even social workers – to coordinate episodes of care for patients. This drives down inappropriate utilization – for example, using an ER as a primary care facility – and drives the medical system towards prevention and management of health.

Your keynote with Larry Schmidt of HP focused a lot on cultural changes that need to take place within the healthcare industry – what are some of the changes necessary for the healthcare industry to put standards into place?

I would say culturally, it goes back to those incentives, and it goes back to introducing this idea of patient-centricity. And for the medical community, it means really starting to recognize that these individuals are consumers and increased choice is being introduced, just like you see in other industries. There are disruptive business models. For instance, medical tourism is a disruptive business model for United States-based healthcare. There’s also the idea of pharmacies introducing clinical medicine for routine care, such as what you see at a CVS, Wal-Mart or Walgreens. I can get a flu shot, I can get a well-check visit, I can get a vaccine – routine stuff that doesn’t warrant a full-blown medical professional. It’s applying the right amount of medical care to a particular situation.

Why haven’t existing standards been adopted more broadly within the industry? What will help providers be more likely to adopt standards?

I think standards adoption is about “what’s in it for me,” the WIIFM idea. It’s demonstrating to providers that utilizing standards is going to help them get out of the medical administration business and focus on their core business, the same way that any other business would want to standardize its information through integration, processes and components. It reduces your overall maintenance costs going forward, and arguably you don’t need a team of billing folks sitting in a doctor’s office because you have standardized exchanges of information.

Why haven’t they been adopted? It’s still a question in my mind. Why would a doctor not want to do that is perhaps a question we’re going to need to explore as part of the Healthcare Forum.

Is it doctors that need to adopt the standards or technologies, or a combination of different constituents within the ecosystem?

I think it’s a combination. We hear a lot about the Affordable Care Act (ACA) and the health exchanges. What we don’t hear about is the legislation to drive toward standardization to increase interoperability. So unfortunately it would seem the financial incentives or things we’ve tried before haven’t worked, and we may simply have to resort to legislation or at least legislative incentives to make it happen because part of the funding does cover information exchanges so you can move health information between providers and other actors in the healthcare system.

You’re advocating putting the individual at the center of the healthcare ecosystem. What changes need to take place within the industry in order to do this?

I think it’s education, a lot of education that has to take place. I think that individuals via the incentive model around high deductible plans will force some of that but it’s taking responsibility and understanding the individual role in healthcare. It’s also a cultural/societal phenomenon.

I’m kind of speculating here, and going way beyond what enterprise architecture or what IT would deliver, but this is a philosophical thing around if I have an ailment, chances are there’s a pill to fix it. Look at the commercials, every ailment say hypertension, it’s easy, you just dial the medication correctly and you don’t worry as much about diet and exercise. These sorts of things – our over-reliance on medication. I’m certainly not going to knock the medications that are needed for folks that absolutely need them – but I think we can become too dependent on pharmacological solutions for our health problems.   

What responsibility will individuals then have for their healthcare? Will that also require a cultural and behavioral shift for the individual?

The individual has to start managing his or her own health. We manage our careers and families proactively. Now we need to focus on our health and not just float through the system. It may come to financial incentives for certain “individual KPIs” such as blood pressure, sugar levels, or BMI. Advances in medical technology may facilitate more personal management of one’s health.

One of the Healthcare Forum’s goals is to help establish Boundaryless Information Flow within the healthcare industry. You’ve said that understanding the healthcare ecosystem will be a key component for that. What does that ecosystem encompass, and why is it important to know that first?

Very simply we’re talking about the member/patient/consumer, then we get into the payers, the providers, and we have to take into account government agencies and other non-medical agents, but they all have to work in concert and information needs to flow between those organizations in a very standardized way so that decisions can be made in a very timely fashion.

It can’t be bottled up, it’s got to be provided to the right provider at the right time, otherwise, best case, it’s going to cost more to manage all the actors in the system. Worst case, somebody dies or there is a “never event due to misinformation or lack of information during the course of care. The idea of Boundaryless Information Flow gives us the opportunity to standardize, have easily accessible information – and by the way secured – it can really aide in that decision-making process going forward. It’s no different than Wal-Mart knowing what kind of merchandise sells well before and after a hurricane (i.e., beer and toaster pastries, BTW). It’s the same kind of real-time information that’s made available to a Google car so it can steer its way down the road. It’s that kind of viscosity needed to make the right decisions at the right time.

Healthcare is a highly regulated industry. How can Boundaryless Information Flow and data collection on individuals be achieved while still protecting patient privacy?

We can talk about standards and the flow and the technical side, but we also need to focus on the security and privacy side. And there’s going to be a legislative side, because we’re going to touch on a real fundamental data governance issue – who owns the patient record? Each actor in the system thinks they own the patient record. If we’re going to require more personal accountability for healthcare, then shouldn’t the consumer have more ownership?

We also need to address privacy disclosure regulations to avoid catastrophic data leaks of protected health information (PHI). We need bright IT talent to pull off the integration we are talking about here. We also need folks who are well versed in the privacy laws and regulations. I’ve seen project teams of 200 have up to eight folks just focusing on the security and privacy considerations. We can argue about headcount later but my point is the same – one needs some focused resources around this topic.

What will standards bring to the healthcare industry that is missing now?

I think the standards, and more specifically the harmonization of standards, are going to bring increased maintainability of solutions, increased interoperability, and increased opportunities too. Look at mobile computing, or even DropBox, which has API hooks into all sorts of tools and is well integrated – I can move files between devices and between apps because they have hooks, so it’s easy to work with. So it’s building these communities of developers, apps and technical capabilities that makes it easy to move the personal health record, for example, back and forth between providers, so that it’s not a cataclysmic event to integrate a new version of electronic health records (EHR) or to integrate the next version of an EHR. It’s this idea of standardization, but with some flexibility built into it.

Are you looking just at the U.S. or how do you make a standard that can go across borders and be international?

It is a concern. Much of my thinking, and much of what I’ve conveyed today, is U.S.-centric, based on our problems, but many of these interoperability problems are international. We’re going to need to address that; I couldn’t tell you what the sequence is right now. There are other considerations, for example, single vs. multi-payer – that came up in the keynote. We tend to think that if we stay focused on the consumer/patient, we’re going to get it right for all constituencies. It will take time to go international with a standard, but it wouldn’t be the first time. We have a host of technical standards for the Internet (e.g., TCP/IP, HTTP). The industry has been able to instill these standards across geographies and vendors. Admittedly, the harmonization of healthcare-related standards will be more difficult. However, as our world shrinks with globalization, an international lens will need to be applied to this challenge.

Eric Stephens (@EricStephens) is a member of Oracle’s executive advisory community, where he focuses on advancing clients’ business initiatives leveraging the practice of Business and Enterprise Architecture. Prior to joining Oracle, he was Senior Director of Enterprise Architecture at Excellus BlueCross BlueShield, leading the organization with architecture design, innovation, and technology adoption capabilities within the healthcare industry.

 


Filed under Conference, Data management, Enterprise Architecture, Healthcare, Information security, Standards, TOGAF®

The Open Group San Francisco 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

Day two, February 4th, of The Open Group San Francisco conference kicked off with a welcome and opening remarks from Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects.

Nunn introduced Allen Brown, President and CEO of The Open Group, who provided highlights from The Open Group’s last quarter.  As of Q4 2013, The Open Group had 45,000 individual members in 134 countries hailing from 449 member companies in 38 countries worldwide. Ten new member companies have already joined The Open Group in 2014, and 24 members joined in the last quarter of 2013, with the first member company joining from Vietnam. In addition, 6,500 individuals attended events sponsored by The Open Group in Q4 2013 worldwide.

Updates on The Open Group’s ongoing work were provided including updates on the FACE™ Consortium, DirectNet® Waveform Standard, Architecture Forum, Archimate® Forum, Open Platform 3.0™ Forum and Security Forum.

Of note was the ongoing development of TOGAF® and the introduction of a three-volume work comprising individual volumes outlining the TOGAF framework, guidance, and tools and techniques for the standard, as well as collaborative work that allows the Archimate modeling language to be used for risk management in enterprise architectures.

In addition, Open Platform 3.0 Forum has already put together 22 business use cases outlining ROI and business value for various uses related to technology convergence. The Cloud Work Group’s Cloud Reference Architecture has also been submitted to ISO for international standards certification, and the Security Forum has introduced certification programs for OpenFAIR risk management certification for individuals.

The morning plenary centered on The Open Group’s Dependability through Assuredness™ (O-DA) Framework, which was released last August.

Speaking first about the framework was Dr. Mario Tokoro, Founder and Executive Advisor for Sony Computer Science Laboratories. Dr. Tokoro gave an overview of the Dependable Embedded OS project (DEOS), a large national project in Japan originally intended to strengthen the country’s embedded systems. After considerable research, the project leaders discovered they needed to consider whether large, open systems could be dependable when it came to business continuity, accountability and ensuring consistency throughout the systems’ lifecycle. Because the boundaries of large open systems are ever-changing, the project leaders knew they must put together dependability requirements that could accommodate constant change, allow for continuous service and provide continuous accountability for the systems based on consensus. As a result, they put together a framework to address both the change accommodation cycle and failure response cycles for large systems – this framework was donated to The Open Group’s Real-Time Embedded Systems Forum and released as the O-DA standard.

Dr. Tokoro’s presentation was followed by a panel discussion on the O-DA standard. Moderated by Dave Lounsbury, VP and CTO of The Open Group, the panel included Dr. Tokoro; Jack Fujieda, Founder and CEO ReGIS, Inc.; T.J. Virdi, Senior Enterprise IT Architect at Boeing; and Bill Brierly, Partner and Senior Consultant, Conexiam. The panel discussed the importance of openness for systems, iterating the conference theme of boundaries and the realities of having standards that can ensure openness and dependability at the same time. They also discussed how the O-DA standard provides end-to-end requirements for system architectures that also account for accommodating changes within the system and accountability for it.

Lounsbury concluded the track by reiterating that assuring systems’ dependability is not only fundamental to The Open Group mission of Boundaryless Information Flow™ and interoperability but also to preventing large system failures.

Tuesday’s late morning sessions were split into two tracks, with one track continuing the Dependability through Assuredness theme hosted by Joe Bergmann, Forum Chair of The Open Group’s Real-Time and Embedded Systems Forum. In this track, Fujieda and Brierly furthered the discussion of O-DA outlining the philosophy and vision of the standard, as well as providing a roadmap for the standard.

In the morning Business Innovation & Transformation track, Alan Hakimi, Consulting Executive, Microsoft presented “Zen and the Art of Enterprise Architecture: The Dynamics of Transformation in a Complex World.” Hakimi emphasized that transformation needs to focus on a holistic view of an organization’s ecosystem and motivations, economics, culture and existing systems to help foster real change. Based on Buddhist philosophy, he presented an eightfold path to transformation that can allow enterprise architects to approach transformation and discuss it with other architects and business constituents in a way that is meaningful to them and allows for complexity and balance.

This was followed by “Building the Knowledge-Based Enterprise,” a session given by Bob Weisman, Head Management Consultant for Build the Vision.

Tuesday’s afternoon sessions centered on a number of topics including Business Innovation and Transformation, Risk Management, Archimate, TOGAF tutorials and case studies and Professional Development.

In the Archimate track, Vadim Polyakov of Inovalon, Inc., presented “Implementing an EA Practice in an Agile Enterprise,” a case study centered on how his company integrated its enterprise architecture with the principles of agile development and how it customized the Archimate framework as part of the process.

The Risk Management track featured William Estrem, President, Metaplexity Associates, and Jim May of Windsor Software discussing how the Open FAIR Standard can be used in conjunction with TOGAF 9.1 to enhance risk management in organizations in their session, “Integrating Open FAIR Risk Analysis into the Enterprise Architecture Capability.” Jack Jones, President of CXOWARE, also discussed the best ways for “Communicating the Value Proposition” for cohesive enterprise architectures to business managers using risk management scenarios.

The plenary sessions and many of the track sessions from today’s tracks can be viewed on The Open Group’s Livestream channel at http://new.livestream.com/opengroup.

The day culminated with dinner and a Lion Dance performance in honor of Chinese New Year performed by Leung’s White Crane Lion & Dragon Dance School of San Francisco.

We would like to express our gratitude to the following sponsors for their support: BIZZDesign, Corso, Good e-Learning, I-Server and Metaplexity Associates.


O-DA standard panel discussion with Dave Lounsbury, Bill Brierly, Dr. Mario Tokoro, Jack Fujieda and TJ Virdi


Filed under Conference, Enterprise Architecture, Enterprise Transformation, Standards, TOGAF®, Uncategorized

The Open Group San Francisco 2014 – Day One Highlights

By Loren K. Baynes, Director, Global Marketing Communications

The Open Group’s San Francisco conference, held at the Marriott Union Square, began today highlighting the theme of how the industry is moving Toward Boundaryless Information Flow™.”

The morning plenary began with a welcome from The Open Group President and CEO Allen Brown. He began the day’s sessions by discussing the conference theme, reminding the audience that The Open Group’s vision of Boundaryless Information Flow began in 2002 as a means to break down the silos within organizations and provide better communications within, throughout and beyond organizational walls.

Heather Kreger, Distinguished Engineer and CTO of International Standards at IBM, presented the first session of the day, “Open Technologies Fuel the Business and IT Renaissance.” Kreger discussed how converging technologies such as social and mobile, Big Data, the Internet of Things, analytics, etc.—all powered by the cloud and open architectures—are forcing a renaissance within both IT and companies. Fueling this renaissance is a combination of open standards and open source technologies, which can be used to build out the platforms needed to support these technologies at the speed that is enabling innovation. To adapt to these new circumstances, architects should broaden their skillsets so they have deeper skills and competencies in multiple disciplines, technologies and cultures in order to better navigate this world of open source based development platforms.

The second keynote of the morning, “Enabling the Opportunity to Achieve Boundaryless Information Flow™,” was presented by Larry Schmidt, HP Fellow at Hewlett-Packard, and Eric Stephens, Enterprise Architect, Oracle. Schmidt and Stephens addressed how to cultivate a culture within healthcare ecosystems to enable better information flow. Because healthcare ecosystems are now primarily digital (including not just individuals but technology architectures and the Internet of Things), boundaryless communication is imperative so that individuals can become the managers of their health and the healthcare ecosystem can be better defined. This in turn will help in creating standards that help solve the architectural problems currently hindering the information flow within current healthcare systems, driving better costs and better outcomes.

Following the first two morning keynotes, Schmidt provided a brief overview of The Open Group’s new Healthcare Forum. The forum plans to leverage existing Open Group best practices such as harmonization and existing standards (such as TOGAF®), and to work with other forums and verticals to create new standards that address the problems facing the healthcare industry today.

Mike Walker, Enterprise Architect at Hewlett-Packard, and Mark Dorfmueller, Associate Director Global Business Services for Procter & Gamble, presented the morning’s final keynote entitled “Business Architecture: The Key to Enterprise Transformation.” According to Walker, business architecture is beginning to change how enterprise architecture is done within organizations. In order to do so, Walker believes that business architects must be able to understand business processes, communicate ideas and engage with others (including other architects) within the business and offer services in order to implement and deliver successful programs. Dorfmueller illustrated business architecture in action by presenting how Procter & Gamble uses their business architecture to change how business is done within the company based on three primary principles—being relevant, practical and making their work consumable for those within the company that implement the architectures.

The morning plenary sessions culminated with a panel discussion on “Future Technology and Enterprise Transformation,” led by Dave Lounsbury, VP and CTO of The Open Group. The panel, which included all of the morning’s speakers, took a high-level view of how emerging technologies are eroding traditional boundaries within organizations. Things within IT that have been specialized in the past are now becoming commoditized to the point where they are now offering new opportunities for companies. This is due to how commonplace they’ve become and because we’re becoming smarter in how we use and get value out of our technologies, as well as the rapid pace of technology innovation we’re experiencing today.

Finally, wrapping up the morning was the Open Trusted Technology Forum (OTTF), a forum of The Open Group, with forum director Sally Long presenting an overview of a new Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program which launched today.  The program is the first such accreditation to provide third-party certification for companies guaranteeing their supply chains are free from maliciously tainted or counterfeit products and conformant to the Open Trusted Technology Provider™ Standard (O-TTPS). IBM is the first company to earn the accreditation and there are at least two other companies that are currently going through the accreditation process.

Monday’s afternoon sessions were split between two tracks, Enterprise Architecture (EA) and Enterprise Transformation and Open Platform 3.0.

In the EA & Enterprise Transformation track, Purna Roy and John Raspen, both Directors of Consulting at Cognizant Technology Solutions, discussed the need to take a broad view and consider factors beyond just IT architectures in their session, “Enterprise Transformation: More than an Architectural Transformation.”  In contrast, Kirk DeCosta, Solution Architect at PNC Financial Services, argued that existing architectures can indeed serve as the foundation for transformation in “The Case for Current State – A Contrarian Viewpoint.”

The Open Platform 3.0 track addressed issues around the convergence of technologies based on cloud platforms, including a session by Helen Sun, Enterprise Architect at Oracle, on Big Data as an enabler of information architectures and predictive analytics. Dipanjan Sengupta, Principal Architect at Cognizant Technology Solutions, discussed why integration platforms are critical for managing distributed application portfolios in “The Need for a High Performance Integration Platform in the Cloud Era.”

Today’s plenary sessions and many of the track sessions can be viewed on The Open Group’s Livestream channel at http://new.livestream.com/opengroup.

The day ended with an opportunity for everyone to share cocktails and conversation at a networking reception held at the hotel.


Andras Szakal, VP & CTO, IBM U.S. Federal and Chair of the OTTF, was presented with a plaque in honor of IBM’s contribution to the O-TTPS Accreditation Program, along with the esteemed panel who were key to the success of the launch.


Filed under Business Architecture, Conference, Enterprise Architecture, Enterprise Transformation, Uncategorized

What I learnt at The Open Group Bangalore Conference last weekend

By Sreekanth Iyer, Executive IT Architect, IBM

It was quite a lot of learning for a Saturday, attending The Open Group conference at Bangalore. It was actually a two-day program this year; I could not make it on Friday because of other work commitments, but I heard from the people who attended that it was a great session. I did know about fellow IBMer Jithesh Kozhipurath’s presentation on Friday, and I had the chance to look at that excellent material on applying TOGAF® practices for integrated IT Operations Enterprise Architecture, in which he shared his experience of the lab infrastructure optimization work he has been leading.

I started a bit late on Saturday, thinking the event was happening at the Leela Palace, which is near my home (ah, that was in 2008), and realized late that it was at the Philips Innovation Campus at Manyata. But I managed to reach just in time for the start of the sessions.

The day started with an Architecture as a Service discussion. The presentation was short, but there were lots of interesting questions and interactions after the session. I was curious to know more about the “self-service” aspect of that topic.

Then we had Jason Uppal of ClinicalMessage Inc. on stage (see picture below), who gave a wonderful presentation on the human touch to architecture and how to leverage EA to make disruptive changes without disrupting working systems.

There were lots of take-aways from the session, most importantly the typical reasons why certain architectures fail: many times we have a solution already in our mind and are trying to fit that into the requirement, and most of the time, if we look at the requirements artifact, we will see that the problems are not captured correctly. I couldn’t agree more with the good practices that he discussed.

Starting with “Identifying the Problem Right” – I thought that is definitely the first and most important step in architecture. Then Jason talked about the significance of communicating with and engaging people and stakeholders in the architecture – a point that he drove home with a good example from the healthcare industry; engagement, of course, improves quality. Building the right levers into the architecture and solving the whole problem were some of the other key points that I noted down. More importantly, the key message was that as architects, we have to go beyond drawing the lines and boxes to deliver the change – maybe look to deliver things that can create an impact in 30 days, balancing the short-term and long-term goals.

I got the stage for a couple of minutes to give an update on the AEA Bangalore Chapter activities. My request to the attendees was to leverage the chapter for their own professional development – using it as a platform to share expertise, get answers to queries, connect with other professionals of similar interests and build their network. Hopefully we will see more participation in the Bangalore chapter events this year.

The security track had multiple interesting sessions. It began with Jim Hietala of The Open Group discussing the Risk Management Framework. I’ve been attending a course on the subject, but this one provided a lot of insight into the taxonomy (O-RT) and the analysis part – taking a quantitative rather than a qualitative approach. Though the example was based on risks with regard to laptop thefts, there is no reason we can’t apply the principles to real issues like quantifying the threats of moving workloads to cloud. (That’s another to-do added to my list.)

Then it was my session on best practices for moving workloads to cloud for Indian banks. I talked about the progress so far with the whitepaper. The attendees were few, as Jason’s EA workshop was happening in parallel, but those who attended were really interested in the subject. We had a good discussion on the benefits, challenges and regulations with regard to Indian banking workloads and their movement to cloud, and we discussed a few interesting case studies. There are areas that need more content, and I’ve requested the people who attended the session to participate in the workgroup. We are looking at getting a first draft done in the next 30 days.

Finally, I also sat in on the presentation by Ajit A. Matthew on the security implementation at Intel. Everywhere the message is clear: you need to implement context-based security and security intelligence to enable new-age innovation while at the same time protecting your core assets.

It was a Saturday well spent. I also had some opportunities to connect with a few new folks and understand their security challenges with cloud. I’m looking to keep the dialog going and have an AEA Bangalore chapter event sometime during Q1. In that direction, I took the first step by writing this up and sharing it with my network.

Event Details:
The Open Group Bangalore, India
January 24-25, 2014

Sreekanth Iyer is an Executive IT Architect in the IBM Security Systems CTO office and works on developing IBM’s Cloud Security Technical Strategy. He is an Open Group Certified Distinguished Architect and is a core member of the Bangalore Chapter of the Association of Enterprise Architects. He has over 18 years’ industry experience and has led several client solutions across multiple industries. His key areas of work include Information Security, Cloud Computing, SOA, Event Processing, and Business Process Management. He has authored several technical articles and blogs and is a core contributor to multiple Open Group as well as IBM publications. He works out of the IBM India Software Lab, Bangalore, and you can follow him on Twitter @sreek.


Filed under Conference, Enterprise Architecture, Healthcare, TOGAF®

Introducing Two New Security Standards for Risk Analysis—Part II – Risk Analysis Standard

By Jim Hietala, VP Security, The Open Group

Last week we took a look at one of the new risk standards recently introduced by The Open Group® Security Forum at The Open Group London Conference 2013, the Risk Taxonomy Technical Standard 2.0 (O-RT). Today’s blog looks at its sister standard, the Risk Analysis (O-RA) Standard, which provides risk professionals the tools they need to perform thorough risk analyses within their organizations for better decision-making about risk.

Risk Analysis (O-RA) Standard

The new Risk Analysis Standard provides a comprehensive guide for performing effective risk analyses within organizations using the Factor Analysis of Information Risk (FAIR™) framework. O-RA is geared toward managing the frequency and magnitude of loss that can arise from a threat, whether human, animal or natural event – in other words, “how often bad things happen and how bad they are when they occur.” Used together, the O-RT and O-RA Standards provide organizations with a way to perform consistent risk modeling that can not only help thoroughly explain risk factors to stakeholders but also allow information security professionals to strengthen existing analysis methods or create better ones. O-RA may also be used in conjunction with other risk frameworks to perform risk analysis.

The O-RA Standard is also meant to provide something more than a mere assessment of risk. Professionals within the security industry often fail to distinguish between “assessing” risk and “analyzing” risk. This standard goes beyond assessment by supporting effective analyses, so that risk statements are less vulnerable to problems and are more meaningful and defensible than the broad risk ratings (“this is a 4 on a scale of 1-to-5”) normally produced by assessments.

O-RA also lays out a standard process for approaching risk analysis that can help organizations streamline the way they approach risk measurement. By focusing on four core process elements, organizations are able to perform more effective analyses:

  • Clearly identifying and characterizing the assets, threats, controls and impact/loss elements at play within the scenario being assessed
  • Understanding the organizational context for analysis (i.e. what’s at stake from an organizational perspective)
  • Measuring/estimating various risk factors
  • Calculating risk using a model that represents a logical, rational, and useful view of what risk is and how it works.

Because measurement and calculation are essential elements of properly analyzing risk variables, an entire chapter of the standard is dedicated to how to measure and calibrate risk. This chapter lays out a number of useful approaches for establishing risk variables, including establishing baseline risk estimates and ranges; creating distribution ranges and most likely values; using Monte Carlo simulations; accounting for uncertainty; determining accuracy vs. precision and subjective vs. objective criteria; deriving vulnerability; using ordinal scales; and determining diminishing returns.
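
To make the calibration guidance concrete, here is a minimal sketch – in Python, purely illustrative and not part of the O-RA Standard – of how an analyst might turn minimum / most-likely / maximum estimates into a Monte Carlo simulation of annualized loss exposure. The triangular distributions, function names and all figures are assumptions made for this example.

    import random

    def simulate_annual_loss(lef_range, lm_range, trials=100_000):
        """Sample Loss Event Frequency (events/year) and Loss Magnitude
        (dollars/event) from triangular (min, max, most-likely) ranges and
        return the simulated annual losses, sorted for percentile lookup."""
        losses = []
        for _ in range(trials):
            lef = random.triangular(*lef_range)  # events per year
            lm = random.triangular(*lm_range)    # dollars per event
            losses.append(lef * lm)              # annualized loss exposure
        return sorted(losses)

    # Illustrative estimates only: 0.1-2.0 loss events/year (most likely 0.5),
    # $10k-$500k per event (most likely $50k).
    losses = simulate_annual_loss((0.1, 2.0, 0.5), (10_000, 500_000, 50_000))
    print(f"median annual loss: ${losses[len(losses) // 2]:,.0f}")
    print(f"95th percentile:    ${losses[int(len(losses) * 0.95)]:,.0f}")

Reporting a median and a high percentile, rather than a single number, is one way to express the uncertainty that the standard’s calibration guidance is designed to capture.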

Finally, a practical, real-world example is provided to take readers through an actual risk analysis scenario. Using the FAIR model, the example outlines the process for dealing with a threat in which an HR executive at a large bank has left the user name and password that allow him access to all the company’s HR systems on a Post-It note tacked onto his computer in his office, in clear view of anyone (other employees, cleaning crews, etc.) who comes into the office.

The scenario outlines four stages in assessing this risk:

  1. Stage 1: Identify Scenario Components (Scope the Analysis)
  2. Stage 2: Evaluate Loss Event Frequency (LEF)
  3. Stage 3: Evaluate Loss Magnitude (LM)
  4. Stage 4: Derive and Articulate Risk

Each step of the risk analysis process is thoroughly outlined for the scenario to provide Risk Analysts an example of how to perform an analysis process using the FAIR framework. Considerable guidance is provided for stages 2 and 3, in particular, as those are the most critical elements in determining organizational risk.
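
As a rough illustration of how the four stages chain together, the sketch below (Python, with hypothetical numbers throughout) walks the Post-It note scenario from scoping through risk derivation. It deliberately simplifies: O-RA works with calibrated ranges and distributions rather than the single point estimates used here, and deriving LEF from threat event frequency and vulnerability is only one of the paths the FAIR model supports.

    # Stage 1: Identify scenario components (scope the analysis).
    scenario = {"asset": "HR system credentials on a Post-It note",
                "threat": "unauthorized office visitor"}

    # Stage 2: Evaluate Loss Event Frequency (LEF).
    threat_event_frequency = 0.5  # threat events per year (assumed)
    vulnerability = 0.4           # probability a threat event becomes a loss event
    lef = threat_event_frequency * vulnerability  # loss events per year

    # Stage 3: Evaluate Loss Magnitude (LM), summed across forms of loss.
    lm = 25_000 + 150_000  # e.g., response costs + fines/judgments, in dollars

    # Stage 4: Derive and articulate risk as an annualized loss exposure.
    print(f"{scenario['asset']}: roughly ${lef * lm:,.0f} expected loss per year")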

Ultimately, O-RA is a guide to help organizations make better decisions about which risks are the most critical for the organization to prioritize and pay attention to versus those that are less important and may not warrant attention. It is critical for Risk Analysts and organizations to become more consistent in this practice, because lack of consistency in determining risk has been a major obstacle in giving security professionals a more legitimate “seat at the table” in the boardroom alongside other business functions (finance, HR, etc.).

For our profession to evolve and grow, consistency and accurate measurement are key. Issues and solutions must be identified consistently, and comparisons and measurement must be based on solid foundations, as illustrated below.


Chained Dependencies

O-RA can help organizations arrive at better decisions through consistent analysis techniques as well as provide more legitimacy within the profession.  Without a foundation from which to manage information risk, Risk Analysts and information security professionals may rely too heavily on intuition, bias, commercial or personal agendas for their analyses and decision making. By outlining a thorough foundation for Risk Analysis, O-RA provides not only a common foundation for performing risk analyses but the opportunity to make better decisions and advance the security profession.

For more on the O-RA Standard or to download it, please visit: https://www2.opengroup.org/ogsys/catalog/C13G.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Conference, Open FAIR Certification, RISK Management, Security Architecture

Introducing Two New Security Standards for Risk Analysis—Part I – Risk Taxonomy Technical Standard 2.0

By Jim Hietala, VP Security, The Open Group

At The Open Group London 2013 Conference, The Open Group® announced three new initiatives related to the Security Forum’s work around Risk Management. The first of these was the establishment of a new certification program for Risk Analysts working within the security profession, the Open FAIR Certification Program. Aimed at providing a professional certification for Risk Analysts, the program will bring a much-needed level of assuredness to companies looking to hire, certifying that analysts who have completed the Open FAIR program understand the fundamentals of risk analysis and are qualified to perform that analysis.

Forming the basis of the Open FAIR certification program are two new Open Group standards, version 2.0 of the Risk Taxonomy (O-RT) standard originally introduced by the Security Forum in 2009, and a new Risk Analysis (O-RA) Standard, both of which were also announced at the London conference. These standards are the result of ongoing work around risk analysis that the Security Forum has been conducting for a number of years now in order to help organizations better understand and identify their exposure to risk, particularly when it comes to information security risk.

The Risk Taxonomy and Risk Analysis standards not only form the basis and body of knowledge for the Open FAIR certification, but provide practical advice for security practitioners who need to evaluate and counter the potential threats their organization may face.

Today’s blog will look at the first standard, the Risk Taxonomy Technical Standard, version 2.0. Next week, we’ll look at the other standard for Risk Analysis.

Risk Taxonomy (O-RT) Technical Standard 2.0

Originally published in January 2009, the O-RT is intended to provide a common language and set of references for security and business professionals who need to understand or analyze risk conditions. Version 2.0 of the standard contains a number of updates, based both on feedback provided by professionals who have been using the standard and on research conducted by Security Forum member CXOWARE.

The majority of the changes to Version 2.0 are refinements in terminology, including changes in language that better reflect what each term encompasses. For example, the term “Control Strength” in the original standard has now been changed to “Resistance Strength” to reflect that controls used in that part of the taxonomy must be resistive in nature.

More substantive changes were made to the portion of the taxonomy that discusses how Loss Magnitude is evaluated.

Why create a taxonomy for risk? For two reasons. First, the taxonomy provides a foundation from which risk analysis can be performed and talked about. Second, a tightly defined taxonomy improves the ability to effectively measure or estimate risk scenarios, leading to better decision making, as illustrated by the following “risk management stack.”

Effective Management
↑
Well-informed Decisions
↑
Effective Comparisons
↑
Meaningful Measurements
↑
Accurate Risk Model

The complete Risk Taxonomy comprises two branches, Loss Event Frequency (LEF) and Loss Magnitude (LM), illustrated here:

[Figure: the Risk Taxonomy’s two branches – Loss Event Frequency and Loss Magnitude]

Focusing solely on pure risk (which only results in loss) rather than speculative risk (which might result in either loss or profit), the O-RT is meant to help estimate the probable frequency and magnitude of future loss.

Traditionally, LM has been far more difficult to determine than LEF, in part because organizations don’t always perform analyses on their losses, or they stick to evaluating “low-hanging fruit” variables rather than delving into more complex risk factors. The new taxonomy takes a deep dive into the Loss Magnitude branch of the risk analysis taxonomy, providing guidance that will allow Risk Analysts to better tackle the difficult task of determining LM. It includes terminology outlining six specific forms of loss an organization can experience (productivity, response, replacement, fines and judgments, competitive advantage, reputation) as well as how to determine Loss Flow, a new concept in this standard.

The Loss Flow analysis helps identify how a loss may affect both primary (owners, employees, etc.) and secondary (customers, stockholders, regulators, etc.) stakeholders as a result of a threat agent’s action on an asset. The new standard provides a thorough overview on how to assess Loss Flow and identify the loss factors of any given threat.
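
For readers who prefer code to prose, here is one possible way – an illustrative sketch, not a structure defined by O-RT – to represent the six forms of loss and a loss flow as Python data structures. The stakeholder lists echo the examples in the text above; the field names, class names and dollar figures are hypothetical.

    from dataclasses import dataclass, field
    from enum import Enum

    class FormOfLoss(Enum):
        """The six forms of loss named in the Risk Taxonomy."""
        PRODUCTIVITY = "productivity"
        RESPONSE = "response"
        REPLACEMENT = "replacement"
        FINES_AND_JUDGMENTS = "fines and judgments"
        COMPETITIVE_ADVANTAGE = "competitive advantage"
        REPUTATION = "reputation"

    @dataclass
    class LossFlow:
        """Who is affected when a threat agent acts on an asset.
        Field names are illustrative, not taken from the standard."""
        primary_stakeholders: list[str] = field(default_factory=list)
        secondary_stakeholders: list[str] = field(default_factory=list)
        losses: dict[FormOfLoss, float] = field(default_factory=dict)

        def total_loss(self) -> float:
            return sum(self.losses.values())

    flow = LossFlow(
        primary_stakeholders=["owners", "employees"],
        secondary_stakeholders=["customers", "stockholders", "regulators"],
        losses={FormOfLoss.RESPONSE: 25_000, FormOfLoss.REPUTATION: 100_000},
    )
    print(f"total estimated loss: ${flow.total_loss():,.0f}")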

Finally, the standard also includes a practical, real-world scenario to help analysts understand how to put the taxonomy to use within their organizations. O-RT provides a common linguistic foundation that allows security professionals to then perform the risk analyses outlined in the O-RA Standard.

For more on the Risk Taxonomy Standard or to download it, visit: https://www2.opengroup.org/ogsys/catalog/C13K.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Conference, Open FAIR Certification, RISK Management, Security Architecture

Jericho Forum declares “success” and sunsets

By Ian Dobson & Jim Hietala, The Open Group
Ten years ago, the Jericho Forum set out on a mission to evangelise the issues, problems, solutions and provide thought-leadership around the emerging business and security issues of de-perimeterisation, with the aim of one day being able to declare “job-done”.

That day has now arrived.  Today, de-perimeterisation is an established “fact” – touching not just information security but all areas of modern business, including the bring your own IT phenomenon (devices, IDs, services) as well as all forms of cloud computing. It’s widely understood and quoted by the entire industry.  It has become part of today’s computing and security lexicon.

With our de-perimeterisation mission accomplished, the Jericho Forum has decided the time has come to “declare success”, celebrate it as a landmark victory in the evolution of information security, and sunset as a separate Forum in The Open Group.

Our “declare success and sunset” victory celebration on Monday 21st Oct 2013 at the Central Hall Westminster, London UK, was our valedictory announcement that the Jericho Forum will formally sunset on 1st Nov 2013.  The event included many past leading Jericho Forum members attending as guests, with awards of commemorative plaques to those whose distinctive leadership steered the information security mind-set change success that the Jericho Forum has now achieved.

For those who missed the live-streamed event, you can watch it on the livestream recording at http://new.livestream.com/opengroup/Lon13

We are fortunate to be able to pass our Jericho Forum legacy of de-perimeterisation achievements and publications into the good care of The Open Group’s Security Forum, which has undertaken to maintain the Jericho Forum’s deliverables, protect its legacy from misrepresentation, and perhaps adopt and evolve Jericho’s thought-leadership approach on future information security challenges.

Ian Dobson, Director Jericho Forum
Jim Hietala, VP Security
The Open Group
21st October 2013


Ian Dobson is the director of the Security Forum and the Jericho Forum for The Open Group, coordinating and facilitating the members to achieve their goals in our challenging information security world. In the Security Forum, his focus is on supporting development of open standards and guides on security architectures and management of risk and security, while in the Jericho Forum he works with members to anticipate the requirements for the security solutions we will need in future.

Jim Hietala, CISSP, GSEC, is the Vice President, Security for The Open Group, where he manages all IT security and risk management programs and standards activities. He participates in the SANS Analyst/Expert program and has also published numerous articles on information security, risk management, and compliance topics in publications including The ISSA Journal, Bank Accounting & Finance, Risk Factor, SC Magazine, and others.


Filed under Conference, Security Architecture

Secure Integration of Convergent Technologies – a Challenge for Open Platform 3.0™

By Dr. Chris Harding, The Open Group

The results of The Open Group Convergent Technologies survey point to secure integration of the technologies as a major challenge for Open Platform 3.0. This and other input forms the basis for the definition of the platform, which was discussed at The Open Group conference in London.

Survey Highlights

Here are some of the highlights from The Open Group Convergent Technologies survey.

  • 95% of respondents felt that the convergence of technologies such as social media, mobility, cloud, big data, and the Internet of things represents an opportunity for business.
  • Mobility currently has the greatest take-up of these technologies, and the Internet of things has the least.
  • 84% of those from companies creating solutions want to deal with two or more of the technologies in combination.
  • Developing the understanding of the technologies by potential customers is the first problem that solution creators must overcome. This is followed by integrating with products, services and solutions from other suppliers, and using more than one technology in combination.
  • Respondents saw security, vendor lock-in, integration and regulatory compliance as the main problems for users of software that enables use of these convergent technologies for business purposes.
  • When users are considered separately from other respondents, security and vendor lock-in show particularly strongly as issues.

The full survey report is available at: https://www2.opengroup.org/ogsys/catalog/R130

Open Platform 3.0

Analysts forecast that convergence of technical phenomena including mobility, cloud, social media, and big data will drive the growth in use of information technology through 2020. Open Platform 3.0 is an initiative that will advance The Open Group vision of Boundaryless Information Flow™ by helping enterprises to use these technologies.

The survey confirms the value of an open platform to protect users of these technologies from vendor lock-in. It also shows that security is a key concern that must be addressed, that the platform must make the technologies easy to use, and that it must enable them to be used in combination.

Understanding the Requirements

The Open Group is conducting other work to develop an understanding of the requirements of Open Platform 3.0. This includes:

  • The Open Platform 3.0 Business Scenario, that was recently published, and is available from https://www2.opengroup.org/ogsys/catalog/R130
  • A set of business use cases, currently in development
  • A high-level round-table meeting to gain the perspective of CIOs, who will be key stakeholders.

These requirements inputs were part of the discussion at The Open Group Conference, which took place in London this week. Monday’s keynote presentation by Andy Mulholland, Former Global CTO at Capgemini, on “Just Exactly What Is Going on in Business and Technology?” included the conclusions from the round-table meeting. This week’s presentation and panel discussion on the requirements for Open Platform 3.0 covered all the inputs.

Delivering the Platform

Review of the inputs in the conference was followed by a members meeting of the Open Platform 3.0 Forum, to start developing the architecture of Open Platform 3.0, and to plan the delivery of the platform definition. The aim is to have a snapshot of the definition early in 2014, and to deliver the first version of the standard a year later.

Meeting the Challenge

Open Platform 3.0 will be crucial to establishing openness and interoperability in the new generation of information technologies. This is of first importance for everyone in the IT industry.

Following the conference, there will be an opportunity for everyone to input material and ideas for the definition of the platform. If you want to be part of the community that shapes the definition, to work on it with like-minded people in other companies, and to gain early insight of what it will be, then your company must join the Open Platform 3.0 Forum. (For more information on this, contact Chris Parnell – c.parnell@opengroup.org)

Providing for secure integration of the convergent technologies, and meeting the other requirements for Open Platform 3.0, will be a difficult but exciting challenge. I’m looking forward to continuing to tackle it with the Forum members.

Dr. Chris Harding

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Open Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.


Filed under Cloud/SOA, Conference, Data management, Future Technologies, Open Platform 3.0, Semantic Interoperability, Service Oriented Architecture, Standards