
How the Open Trusted Technology Provider Standard (O-TTPS) and Accreditation Will Help Lower Cyber Risk

By Andras Szakal, Vice President and Chief Technology Officer, IBM U.S. Federal

Changing business dynamics and enabling technologies

In 2008, IBM introduced the concept of a “Smarter Planet.” The Smarter Planet initiative focused, in part, on the evolution of globalization against the backdrop of changing business dynamics and enabling technologies. A key concept was the need for infrastructure to be tightly integrated, interconnected, and intelligent, thereby facilitating collaboration between people, government and businesses in order to meet the world’s growing appetite for data and automation. Since then, many industries and businesses have adopted this approach, including the ICT (information and communications technology) industries that support the global technology manufacturing supply chain.

Intelligent and interconnected critical systems

This transformation has infused technology into virtually all aspects of our lives, and involves, for example, government systems, the electric grid and healthcare. Most of these technological solutions are made up of hundreds or even thousands of components that are sourced from the growing global technology supply chain.

In the global technology economy, no single technology vendor or integrator can always provide a single-source solution. It is no longer cost-competitive to design all of the electronic components, printed circuit boards, card assemblies, or other sub-assemblies in-house. Adapting to the changing marketplace by balancing response time and cost efficiency drives more widespread use of OEM (original equipment manufacturer) products.

As a result, most technology providers procure from a myriad of global component suppliers, who very often rely on similarly complex supply chains to source their own components. Every enterprise has a supplier network, each of those suppliers has its own supply chain, and those sub-tier suppliers have supply chain networks of their own. The resulting technology supply chain is thus a network of integrated suppliers.

Increasingly, the critical systems of the planet — telecommunications, banking, energy and others — depend on and benefit from the intelligence and interconnectedness enabled by existing and emerging technologies. As evidence, one need only look to the increase in enterprise mobile applications and BYOD strategies to support corporate and government employees.

Cybersecurity by design: Addressing risk in a sustainable way across the ecosystem

Whether these systems are trusted by the societies they serve depends in part on whether the technologies incorporated into them are fit for the purpose they are intended to serve. Fit for purpose is manifested in two essential ways:

- Does the product meet essential functional requirements?
- Has the product or component been produced by a trustworthy provider?

Of course, the leaders or owners of these systems have to do their part to achieve security and safety: e.g., to install, use and maintain technology appropriately, and to pay attention to people and process aspects such as insider threats. Cybersecurity considerations must be addressed in a sustainable way from the get-go, by design, and across the whole ecosystem — not after the fact, or in just one sector or another, or in reaction to crisis.

Assuring the quality and integrity of mission-critical technology

In addressing the broader cybersecurity challenge, however, buyers of mission-critical technology naturally seek reassurance as to the quality and integrity of the products they procure. In our view, the fundamentals of the institutional response to that need are similar to those that have worked in prior eras and in other industries — like food.

The very process of manufacturing technology is not immune to cyber-attack. Attacks on the supply chain are typically motivated by monetary gain. The primary goals of a technology supply chain attack are to inflict massive economic damage in an effort to gain global economic advantage, or to seed targets with malware that provides unfettered access for attackers.

It is for this reason that the global technology manufacturing industry must establish practices that mitigate this risk by increasing the cost barriers of launching such attacks and increasing the likelihood of being caught before the effects of such an attack are irreversible. As these threats evolve, the global ICT industry must deploy enhanced security through advanced automated cyber intelligence analysis. As critical infrastructure becomes more automated, integrated and essential to critical functions, the technology supply chain that surrounds it must be treated as a principal theme of the overall global security and risk mitigation strategy.

A global, agile, and scalable approach to supply chain security

Certainly, the manner in which technologies are invented, produced, and sold demands a global, agile, and scalable approach to supply chain assurance. Any technology supply chain security standard that hopes to be widely adopted must be flexible and country-agnostic. The very nature of the global supply chain (massively segmented and diverse) requires an approach that provides practicable guidance but avoids being overly prescriptive. Such an approach requires the aggregation of industry practices that have proven beneficial and effective at mitigating risk.

The OTTF (The Open Group Trusted Technology Forum) is an increasingly recognized and promising industry initiative to establish best practices that mitigate the risk of technology supply chain attacks. Facilitated by The Open Group, a recognized international standards and certification body, the OTTF is working with governments and industry worldwide to create vendor-neutral open standards and best practices that can be implemented by anyone. Current membership includes many of the most well-known technology vendors, integrators, and technology assessment laboratories.

The benefits of O-TTPS for governments and enterprises

IBM is currently a member of the OTTF and has been honored to hold the Chair for the last three years.  Governments and enterprises alike will benefit from the work of the OTTF. Technology purchasers can use the Open Trusted Technology Provider™ Standard (O-TTPS) and Framework best-practice recommendations to guide their strategies.

A wide range of technology vendors can use O-TTPS approaches to build security and integrity into their end-to-end supply chains. The first version of the O-TTPS focuses on mitigating the risk of maliciously tainted and counterfeit technology components or products. Note that a maliciously tainted product is one that has been produced by the provider and acquired through reputable channels but has been tampered with maliciously. A counterfeit product is produced other than by or for the provider, or is supplied through a non-reputable channel, and is represented as legitimate. The OTTF is currently working on a program that will accredit technology providers who conform to the O-TTPS. IBM expects to complete pilot testing of the program by 2014.

IBM has actively supported the formation of the OTTF and the development of the O-TTPS for several reasons. These include but are not limited to the following:

- The Forum was established within a trusted and respected international standards body – The Open Group.
- The Forum was founded, in part, through active participation by governments, making it a true public-private partnership.
- The OTTF membership includes some of the most mature and trusted commercial technology manufacturers and vendors, and a primary objective of the OTTF has been harmonization with other standards efforts such as ISO (International Organization for Standardization) and the Common Criteria.

The O-TTPS defines a framework of organizational guidelines and best practices that enhance the security and integrity of COTS ICT. The first version of the O-TTPS is focused on mitigating certain risks of maliciously tainted and counterfeit products within the technology development / engineering lifecycle. These best practices are equally applicable for systems integrators; however, the standard is intended to primarily address the point of view of the technology manufacturer.

O-TTPS requirements

The O-TTPS requirements are divided into three categories:

1. Development / Engineering Process and Method
2. Secure Engineering Practices
3. Supply Chain Security Practices

The O-TTPS is intended to establish a normalized set of criteria against which a technology provider, component supplier, or integrator can be assessed. The standard is divided into categories that define best practices for development and engineering processes, secure engineering, and supply chain security and integrity, all intended to mitigate the risk of maliciously tainted and counterfeit components.
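To make the three categories above concrete, here is a minimal sketch, in Python, of how an organization preparing for assessment might track conformance evidence against them. The category names follow the article; the Evidence structure, practice names, and file names are purely illustrative assumptions and are not taken from the O-TTPS itself.

```python
# Illustrative sketch of a conformance-evidence tracker for the three O-TTPS
# requirement categories named above. The category names come from the article;
# everything else (Evidence, practice names, file names) is hypothetical.
from dataclasses import dataclass, field

O_TTPS_CATEGORIES = (
    "Development / Engineering Process and Method",
    "Secure Engineering Practices",
    "Supply Chain Security Practices",
)

@dataclass
class Evidence:
    practice: str                      # e.g., "supplier vetting" (hypothetical name)
    documents: list = field(default_factory=list)

@dataclass
class ConformanceChecklist:
    categories: dict = field(
        default_factory=lambda: {name: [] for name in O_TTPS_CATEGORIES}
    )

    def add_evidence(self, category: str, item: Evidence) -> None:
        self.categories[category].append(item)

    def gaps(self):
        # Categories with no recorded evidence are candidates for remediation
        # before a formal assessment begins.
        return [name for name, items in self.categories.items() if not items]

checklist = ConformanceChecklist()
checklist.add_evidence(
    "Supply Chain Security Practices",
    Evidence("supplier vetting", ["supplier_audit_2013.pdf"]),
)
print(checklist.gaps())  # -> the two categories still lacking evidence
```

Surfacing gaps early in this way mirrors the pre-assessment lesson IBM describes later in this article.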

The accreditation program

As part of the process for developing the accreditation criteria and policy, the OTTF established a pilot accreditation program. The purpose of the pilot was to take a handful of companies through the accreditation process and remediate any potential process or interpretation issues. IBM participated in the O-TTPS accreditation pilot to accredit a very significant segment of its software product portfolio: the Application Infrastructure Middleware (AIM) Division, which includes the flagship WebSphere product line. The AIM pilot started in mid-2013, completed in the first week of 2014, and was formally recognized as accredited in the first week of February 2014.

IBM is currently leveraging the value of the O-TTPS and working to accredit additional development organizations. Some of the lessons learned during the IBM AIM initial O-TTPS accreditation include:

- Conducting a pre-assessment against the O-TTPS before formally entering accreditation, which allows remediation of any gaps and reduces potential assessment costs and schedule.
- Starting with a segment of your development portfolio that has mature secure engineering practices and processes, which helps an organization address accreditation requirements and facilitates interactions with the third-party lab.
- Using your first successful O-TTPS accreditation to create templates that help drive data gathering and validate practices, establishing a repeatable process as your organization undertakes additional accreditations.

Andras Szakal, VP and CTO, IBM U.S. Federal, is responsible for IBM’s industry solution technology strategy in support of the U.S. Federal customer. Andras was appointed IBM Distinguished Engineer and Director of IBM’s Federal Software Architecture team in 2005. He is an Open Group Distinguished Certified IT Architect, IBM Certified SOA Solution Designer and a Certified Secure Software Lifecycle Professional (CSSLP). Andras holds undergraduate degrees in Biology and Computer Science and a master’s degree in Computer Science from James Madison University. He has been a driving force behind IBM’s adoption of government IT standards as a member of the IBM Software Group Government Standards Strategy Team and the IBM Corporate Security Executive Board focused on secure development and cybersecurity. Andras represents the IBM Software Group on the Board of Directors of The Open Group and currently holds the Chair of the IT Architect Profession Certification Standard (ITAC). More recently, he was appointed chair of The Open Group Trusted Technology Forum and leads the development of The Open Trusted Technology Provider Framework.


Filed under Accreditations, Cybersecurity, government, O-TTF, O-TTPS, OTTF, RISK Management, Standards, supply chain, Supply chain risk

Q&A with Allen Brown, President and CEO of The Open Group

By The Open Group

Last month, The Open Group hosted its San Francisco 2014 conference themed “Toward Boundaryless Information Flow™.” Boundaryless Information Flow has been the pillar of The Open Group’s mission since 2002, when it was adopted as the organization’s vision for Enterprise Architecture. We sat down at the conference with The Open Group President and CEO Allen Brown to discuss the industry’s progress toward that goal, the industries that could most benefit from it now, The Open Group’s new Dependability through Assuredness™ Standard, and what the organization’s Forums are working on in 2014.

The Open Group adopted Boundaryless Information Flow as its vision in 2002, and the theme of the San Francisco Conference has been “Towards Boundaryless Information Flow.” Where do you think the industry is at this point in progressing toward that goal?

Well, it’s progressing reasonably well but the challenge is, of course, when we established that vision back in 2002, life was a little less complex, a little bit less fast moving, a little bit less fast-paced. Although organizations are improving the way that they act in a boundaryless manner – and of course that changes by industry – some industries still have big silos and stovepipes, they still have big boundaries. But generally speaking we are moving and everyone understands the need for information to flow in a boundaryless manner, for people to be able to access and integrate information and to provide it to the teams that they need.

One of the keynotes on Day One focused on the opportunities within the healthcare industry, and The Open Group recently started a Healthcare Forum. Do you see the healthcare industry as a test case for Boundaryless Information Flow, and why?

Healthcare is one of the verticals that we’ve focused on. And it is not so much a test case, but it is an area that absolutely seems to need information to flow in a boundaryless manner so that everyone involved – from the patient through the administrator through the medical teams – have all got access to the right information at the right time. We know that in many situations there are shifts of medical teams, and from one medical team to another they don’t have access to the same information. Information isn’t easily shared between medical doctors, hospitals and payers. What we’re trying to do is to focus on the needs of the patient and improve the information flow so that you get better outcomes for the patient.

Are there other industries where this vision might be enabled sooner rather than later?

I think that we’re already making significant progress in what we call the Exploration, Mining and Minerals industry. Our EMMM™ Forum has produced an industry-wide model that is being adopted throughout that industry. We’re also looking at whether we can have an influence in the airline industry, automotive industry, manufacturing industry. There are many, many others, government and retail included.

The plenary on Day Two of the conference focused on The Open Group’s Dependability through Assuredness standard, which was released last August. Why is The Open Group looking at dependability and why is it important?

Dependability is ultimately what you need from any system. You need to be able to rely on that system to perform when needed. Systems are becoming more complex, they’re becoming bigger. We’re not just thinking about the things that arrive on the desktop, we’re thinking about systems like the barriers at subway stations or Tube stations, we’re looking at systems that operate any number of complex activities. And they bring an awful lot of things together that you have to rely upon.

Now in all of these systems, what we’re trying to do is to minimize the amount of downtime because downtime can result in financial loss or at worst human life, and we’re trying to focus on that. What is interesting about the Dependability through Assuredness Standard is that it brings together so many other aspects of what The Open Group is working on. Obviously the architecture is at the core, so it’s critical that there’s an architecture. It’s critical that we understand the requirements of that system. It’s also critical that we understand the risks, so that fits in with the work of the Security Forum, and the work that they’ve done on Risk Analysis, Dependency Modeling, and out of the dependency modeling we can get the use cases so that we can understand where the vulnerabilities are, what action has to be taken if we identify a vulnerability or what action needs to be taken in the event of a failure of the system. If we do that and assign accountability to people for who will do what by when, in the event of an anomaly being detected or a failure happening, we can actually minimize that downtime or remove it completely.

Now the other great thing about this is it’s not only a focus on the architecture for the actual system development, and as the system changes over time, requirements change, legislation changes that might affect it, external changes, that all goes into that system, but also there’s another circle within that system that deals with failure and analyzes it and makes sure it doesn’t happen again. But there have been so many instances of failure recently. In the UK, for example, a bank was recently unable to process debit cards or credit cards for customers for about three or four hours. And that was probably caused by work done on a routine basis over a weekend. But if Dependability through Assuredness had been in place, that could have been averted, and it could have saved an awful lot of difficulty for an awful lot of people.

How does the Dependability through Assuredness Standard also move the industry toward Boundaryless Information Flow?

It’s part of it. It’s critical that with big systems the information has to flow. But this is not so much the information but how a system is going to work in a dependable manner.

Business Architecture was another featured topic in the San Francisco plenary. What role can business architecture play in enterprise transformation vis a vis the Enterprise Architecture as a whole?

A lot of people in the industry are talking about Business Architecture right now and trying to focus on that as a separate discipline. We see it as a fundamental part of Enterprise Architecture. And, in fact, there are three legs to Enterprise Architecture, there’s Business Architecture, there’s the need for business analysts, which are critical to supplying the information, and then there are the solutions, and other architects, data, applications architects and so on that are needed. So those three legs are needed.

We find that there are two or three different types of Business Architect. There are those that are using the analysis to understand what the business is doing in order that they can inform the solutions architects and other architects for the development of solutions. There are those that are more integrated with the business, that can understand what is going on and provide input into how that might be improved through technology. And there are those that can actually go another step and talk about the advances in technology and the opportunities for advancing our competitiveness as an organization.

What are some of the other key initiatives that The Open Group’s forum and work groups will be working on in 2014?

That kind of question is like when you’ve got an award: you’ve got to thank your friends, so apologies to anyone that I leave out. Let me start alphabetically with the Architecture Forum. The Architecture Forum obviously is working on the evolution of TOGAF®; they’re also working on the harmonization of TOGAF with ArchiMate®, and they have a number of projects within that; of course, Business Architecture is one of the projects going on in the Architecture space. The ArchiMate Forum is pushing ahead with ArchiMate, and they’ve got two interesting activities going on at the moment. One is called ArchiMetals, which is going to be a sister publication to the ArchiSurance case study: where ArchiSurance provides an example of how ArchiMate is used in the insurance industry, ArchiMetals is going to be used in a manufacturing context, so there will be a whitepaper on that and there will be examples and artifacts that we can use. They’re also working in ArchiMate on a standard for interoperability of modeling tools. There are four tools that are accredited and certified by The Open Group right now, and we’re looking for that interoperability to help organizations that have multiple tools, as many of them do.

Going down the alphabet, there’s DirecNet. Not many people know about DirecNet, but DirecNet™ is work that we do around the U.S. Navy; they’re working on standards for long-range, high-bandwidth mobile networking. We can go to the FACE™ Consortium, the Future Airborne Capability Environment. The FACE Consortium is working on the next version of its standard and toward accreditation and a certification program, and the uptake of that through procurement is absolutely amazing; we’re thrilled about that.

Healthcare we’ve talked about. The Open Group Trusted Technology Forum is working on how we can trust the supply chain in developed systems. They’ve released the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program, which was launched this week, and we already have one accredited vendor and two certified test labs, or assessment labs. That is really exciting because now we’ve got a way of helping any organization that has large complex systems developed through a global supply chain to make sure that they can trust their supply chain. And that is going to be invaluable to many industries but also to the safety of citizens and the infrastructure of many countries. The other part of the O-TTPS work is that we are planning to move the standard toward ISO standardization shortly.

The next one moving down the list would be Open Platform 3.0™. This is a really exciting part of Boundaryless Information Flow, it really is. This is talking about the convergence of SOA, Cloud, Social, Mobile, the Internet of Things and Big Data, and bringing all of those activities together is really something that is critical right now and that we need to focus on. In the different areas, some of our Cloud computing standards have already gone to ISO and have been adopted by ISO. We’re working right now on the next products that are going to move through. We have a governance standard in process, and an ecosystem standard has recently been published. In the area of Big Data there’s a whitepaper that’s 25 percent completed, and there’s also a lot of work on the definition of what Open Platform 3.0 is, so this week the members have been working on trying to define Open Platform 3.0. One of the really interesting activities that’s gone on is that the members of the Open Platform 3.0 Forum have produced something like 22 different use cases, and they’re really good. They’re concise and precise, and they cover a number of different industries, including healthcare and others. The next stage is to look at those and work on the ROI, the monetization, the value from those use cases, and that’s really exciting; I’m looking forward to peeping at that from time to time.

The Real Time and Embedded Systems Forum (RTES) is next. Real-Time is where we incubated the Dependability through Assuredness Framework, and it is continuing to develop there, which is really good. The core focus of the RTES Forum is high assurance systems, and they’re doing some work with ISO on that and in a lot of other areas such as multicore, and, of course, they have a number of EC projects where we’re partnering with other organizations in the EC around RTES.

The Security Forum, as I mentioned earlier, has done a lot of work on risk and dependability. They’ve not only got their standards for the Risk Taxonomy and Risk Analysis, but they’ve now also developed the Open FAIR Certification for People, which is based on those two standards of Risk Analysis and Risk Taxonomy. And we’re already starting to see people being trained and certified under that Open FAIR Certification Program that the Security Forum developed.

A lot of other activities are going on. Like I said, I probably left a lot of things out, but I hope that gives you a flavor of what’s going on in The Open Group right now.

The Open Group will be hosting a summit in Amsterdam May 12-14, 2014. What can we look forward to at that conference?

In Amsterdam we have a summit – that’s going to bring together a lot of things; it’s going to be a bigger conference than we had here. We’ve got a lot of activity in all of our areas, and we’re going to bring together top-level speakers, so we’re looking forward to some interesting work during that week.


Filed under ArchiMate®, Boundaryless Information Flow™, Business Architecture, Conference, Cybersecurity, EMMMv™, Enterprise Architecture, FACE™, Healthcare, O-TTF, RISK Management, Standards, TOGAF®

Accrediting the Global Supply Chain: A Conversation with O-TTPS Recognized Assessors Fiona Pattinson and Erin Connor

By The Open Group 

At the recent San Francisco 2014 conference, The Open Group Trusted Technology Forum (OTTF) announced the launch of the Open Trusted Technology Provider™ Standard (O-TTPS) Accreditation Program.

The program is one of the first accreditation programs worldwide aimed at assuring the integrity of commercial off-the-shelf (COTS) information and communication technology (ICT) products and the security of their supply chains.

In three short years since OTTF launched, the forum has grown to include more than 25 member companies dedicated to safeguarding the global supply chain against the increasing sophistication of cybersecurity attacks through standards. Accreditation is yet another step in the process of protecting global technology supply chains from maliciously tainted and counterfeit products.

As part of the program, third-party assessor companies will be employed to assess organizations applying for accreditation, with The Open Group serving as the vendor-neutral Accreditation Authority that operates the program. Prior to the launch, the forum conducted a pilot program with a number of member companies. It was announced at the conference that IBM is the first company to become accredited, earning accreditation for its Application, Infrastructure and Middleware (AIM) software business division for its product integrity and supply chain practices.

We recently spoke with OTTF members Fiona Pattinson, director of strategy and business development at Atsec Information Security, and Erin Connor, director at EWA-Canada, at the San Francisco conference to learn more about the assessment process and the new program.

The O-TTPS focus is on securing the technology supply chain. What would you say are the biggest threats facing the supply chain today?

Fiona Pattinson (FP): I think in the three years since the forum began, certainly all the members have discussed the various threats quite a lot. It was one of the things we discussed as an important topic early on, and I don’t know if it’s the ‘biggest threat,’ but certainly the most important threats that we needed to address initially were those of counterfeit and maliciously tainted products. We came to that through both discussion with all the industry experts in the forum and also through research into some of the requirements from government, so that’s exactly how we knew which threats [to start with].

Erin Connor (EC):  And the forum benefits from having both sides of the acquisition process, both acquirers, and the suppliers and vendors. So they get both perspectives.

How would you define maliciously tainted and counterfeit products?

FP:  They are very carefully defined in the standard—we needed to do that because people’s understanding of that can vary so much.

EC: And actually the concept of ‘maliciously’ tainted was incorporated close to the end of the development process for the standard at the request of members on the acquisition side of the process.

[Note: The standard precisely defines maliciously tainted and counterfeit products as follows:

"The two major threats that acquirers face today in their COTS ICT procurements, as addressed in this Standard, are defined as:

1. Maliciously tainted product – the product is produced by the provider and is acquired through a provider’s authorized channel, but has been tampered with maliciously.

2. Counterfeit product – the product is produced other than by, or for, the provider, or is supplied to the provider by other than a provider’s authorized channel and is presented as being legitimate even though it is not."]

The OTTF announced the Accreditation Program for the O-TTPS at the recent San Francisco conference. Tell us about the standard and how the accreditation program will help ensure conformance to it.

EC: The program is intended to provide organizations with a way to accredit their lifecycle processes for their product development so they can prevent counterfeit or maliciously tainted components from getting into the products they are selling to an end user or into somebody else’s supply chain. It was determined that a third-party type of assessment program would be used. For the organizations, they will know that we Assessors have gone through a qualification process with The Open Group and that we have in place all that’s required on the management side to properly do an assessment. From the consumer side, they have confidence the assessment has been completed by an independent third-party, so they know we aren’t beholden to the organizations to give them a passing grade when perhaps they don’t deserve it. And then of course The Open Group is in position to oversee the whole process and award the final accreditation based on the recommendation we provide.  The Open Group will also be the arbiter of the process between the assessors and organizations if necessary. 

FP:  So The Open Group’s accreditation authority is validating the results of the assessors.

EC: It’s a model that is employed in many, many other product or process assessment and evaluation programs where the actual accreditation authority steps back and have third parties do the assessment.

FP: It is important that the assessor companies are working to the same standard so that there’s no advantage in taking one assessor over the other in terms of the quality of the assessments that are produced.

How does the accreditation program work?

FP: Well, it’s brand new so we don’t know if it is perfect yet, but having said that, we have worked over several months on defining the process, and we have drawn from The Open Group’s existing accreditation programs, as well as from the forum experts who have worked in the accreditation field for many years. We have been performing pilot accreditations in order to check out how the process works. So it is already tested.

How does it actually work? Well, first of all an organization will feel the need to become accredited and at that point will apply to The Open Group to get the accreditation underway. Once their scope of accreditation – which may be as small as one product or theoretically as large as a whole global company – is defined, and once the application is reviewed and approved by The Open Group, then they engage an assessor.

There is a way of sampling a large scope to identify its process variations, using something we term ‘selective representative products.’ It’s basically a way of logically sampling a big scope so that we capture the process variations within the scope and make sure that the assessment is kept to a reasonable size for the organization undergoing the assessment, but it also gives good assurance to the consumers that it is a representative sample. The assessment is performed by the Recognized Assessor company, and a final report is written and provided to The Open Group for their validation. If everything is in order, then the company will be accredited and their scope of conformance will be added to the accreditation register and trademarked.

EC: So the customers of that organization can go and check the registration for exactly what products are covered by the scope.

FP: Yes, the register is public and anybody can check. So if IBM says WebSphere is accredited, you can go and check that claim on The Open Group web site.
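As a rough illustration of the ‘selective representative products’ sampling Pattinson describes above, the sketch below groups a product scope by its development-process profile and picks one representative per distinct variation. It is a simplified assumption-based example, not part of the O-TTPS accreditation documents; the product names and process profiles are invented.

```python
# Hypothetical sketch of "selective representative products": sample a large
# accreditation scope so that each distinct process variation is assessed once.
# Product names and process profiles below are invented for illustration.
from collections import defaultdict

def representative_sample(products):
    """products: iterable of (product_name, process_profile) pairs, where
    process_profile is any hashable summary of how the product is built
    (build pipeline, sourcing channels, etc.)."""
    by_profile = defaultdict(list)
    for name, profile in products:
        by_profile[profile].append(name)
    # One representative product per distinct process variation.
    return {profile: names[0] for profile, names in by_profile.items()}

scope = [
    ("Product A", ("in-house build", "vetted suppliers")),
    ("Product B", ("in-house build", "vetted suppliers")),
    ("Product C", ("outsourced build", "broker-sourced components")),
]
print(representative_sample(scope))
# Two distinct process variations in a three-product scope -> two products assessed.
```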

How long does the process take or does it vary?

EC: It will vary depending on how large the scope to be accredited is in terms of the size of the representative set and the documentation evidence. It really does depend on what the variations in the processes are among the product lines as to how long it takes the assessor to go through the evidence and then to produce the report. The other side of the coin is how long it takes the organization to produce the evidence. It may well be that they might not have it totally there at the outset and will have to create some of it.

FP: As Erin said, it varies by the complexity and the variation of the processes and hence the number of selected representative products. There are other factors that can influence the duration. There are three parties influencing that: The applicant Organization, The Open Group’s Accreditation Authority and the Recognized Assessor.

For example, we found that the initial work by the Organization and the Accreditation Authority in checking the scope and the initial documentation can take a few weeks for a complex scope, of course for the pilots we were all new at doing that. In this early part of the project it is vital to get the scope both clearly defined and approved since it is key to a successful accreditation.

It is important that an Organization assigns adequate resources to help keep this to the shortest time possible, both during the initial scope discussions, and during the assessment. If the Organization can provide all the documentation before they get started, then the assessors are not waiting for that and the duration of the assessment can be kept as short as possible.

Of course the resources assigned by the Recognized Assessor also influences how long an assessment takes. A variable for the assessors is how much documentation do they have to read and review? It might be small or it might be a mountain.

The Open Group’s final review and oversight of the assessment takes some time and is influenced by resource availability within that organization. If they have any questions it may take a little while to resolve.

What kind of safeguards does the accreditation program put in place for enforcing the standard?

FP: It is a voluntary standard—there’s no requirement to comply. Currently some of the U.S. government organizations are recommending it. For example, NASA in their SEWP contract and some of the draft NIST documents on Supply Chain refer to it, too.

EC: In terms of actual oversight, we review what their processes are as assessors, and the report and our recommendations are based on that review. The accreditation expires after three years so before the three years is up, the organization should actually get the process underway to obtain a re-accreditation.  They would have to go through the process again but there will be a few more efficiencies because they’ve done it before. They may also wish to expand the scope to include the other product lines and portions of the company. There aren’t any periodic ‘spot checks’ after accreditation to make sure they’re still following the accredited processes, but part of what we look at during the assessment is that they have controls in place to ensure they continue doing the things they are supposed to be doing in terms of securing their supply chain.

FP: And then the key part is that the agreement the organization signs with The Open Group includes the fact that the organization warrants and represents that it remains in conformance with the standard throughout the accreditation period. So there is that assurance too, which builds on the more formal assessment checks.

What are the next steps for The Open Group Trusted Technology Forum?  What will you be working on this year now that the accreditation program has started?

FP: Reviewing the lessons we learned through the pilot!

EC: And reviewing comments from members on the standard now that it’s publicly available and working on version 1.1 to make any corrections or minor modifications. While that’s going on, we’re also looking ahead to version 2 to make more substantial changes, if necessary. The standard is definitely going to be evolving for a couple of years and then it will reach a steady state, which is the normal evolution for a standard.

For more details on the O-TTPS accreditation program, to apply for accreditation, or to learn more about becoming an O-TTPS Recognized Assessor visit the O-TTPS Accreditation page.

For more information on The Open Group Trusted Technology Forum please visit the OTTF Home Page.

The O-TTPS standard and the O-TTPS Accreditation Policy are freely available from the Trusted Technology Section in The Open Group Bookstore.

For information on joining the OTTF membership please contact Mike Hickey – m.hickey@opengroup.org

Fiona Pattinson is responsible for developing new and existing atsec service offerings. Under the auspices of The Open Group’s OTTF, alongside many expert industry colleagues, Fiona has helped develop The Open Group’s O-TTPS, including developing the accreditation program for supply chain security. In the past, Fiona has led service developments which have included establishing atsec’s US Common Criteria laboratory, the CMVP cryptographic module testing laboratory, the GSA FIPS 201 TP laboratory, TWIC reader compliance testing, NPIVP, SCAP, PCI, biometrics testing and penetration testing. Fiona has responsibility for understanding a broad range of information security topics and the application of security in a wide variety of technology areas from low-level design to the enterprise level.

Erin Connor is the Director at EWA-Canada responsible for EWA-Canada’s Information Technology Security Evaluation & Testing Facility, which includes a Common Criteria Test Lab, a Cryptographic & Security Test Lab (FIPS 140 and SCAP), a Payment Assurance Test Lab (device testing for PCI PTS POI & HSM, Australian Payment Clearing Association and Visa mPOS) and an O-TTPS Assessor lab recognized by The Open Group. Erin participated with other expert members of The Open Group Trusted Technology Forum (OTTF) in the development of The Open Group Trusted Technology Provider Standard for supply chain security and its accompanying Accreditation Program. Erin joined EWA-Canada in 1994, and his initial activities in the IT Security and Infrastructure Assurance field included working on the team fielding a large-scale Public Key Infrastructure system, Year 2000 remediation and studies of wireless device vulnerabilities. Since 2000, Erin has been working on evaluations of a wide variety of products including hardware security modules, enterprise security management products, firewalls, mobile device and management products, as well as system and network vulnerability management products. He was also the only representative of an evaluation lab in the Biometric Evaluation Methodology Working Group, which developed a proposed methodology for the evaluation of biometric technologies under the Common Criteria.


Filed under Accreditations, Cybersecurity, OTTF, Professional Development, Standards, Supply chain risk

How to Build a Smarter City – Join The Open Group Tweet Jam on February 26

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

On Wednesday, February 26, The Open Group will host a Tweet Jam examining smart cities and how Real-time and Embedded Systems can seamlessly integrate inputs from various agencies and locations. That collective data allows local governments to better adapt to change by implementing an analytics-based approach to measure:

  • Economic activity
  • Mobility patterns
  • Resource consumption
  • Waste management and sustainability measures
  • Inclement weather
  • And much more!

These metrics allow smart cities to do much more than just coordinate responses to traffic jams: they can forecast and coordinate safety measures in advance of physical disasters and inclement weather; calculate where offices and shops can be laid out most efficiently; and work out how all the parts of urban life should fit together, including energy, sustainability, infrastructure repairs, and planning and development.

Smart cities are already very much a reality in the Middle East and in Korea, and those have become a model for developers in China and for redevelopment in Europe. Market research firm IDC Government Insights projects that 2014 is the year cities around the world start getting smart. It predicts a $265 billion spend by cities worldwide this year alone to implement new technology and integrate agency data. Part of the reason for that spend is likely spurred by the fact that more than half the world’s population currently lives in urban areas. With urbanization rates rapidly increasing, the Brookings Institution estimates that number could swell to 75 percent of the global populace by 2050.

While the awe-inspiring example of Rio de Janeiro is proving to be an interesting smart city model for cities across the world, are smart cities always the best option for informing city decisions? Could the beauty of a self-regulating open grid allow people to decide how best to use spaces in the city?

Please join us on Wednesday, February 26 at 9:00 am PT/12:00 pm ET/5:00 pm GMT for a tweet jam that will discuss the issues around smart cities. We welcome The Open Group members and interested participants from all backgrounds to join the discussion and interact with our panel of thought leaders, including David Lounsbury, CTO, and Chris Harding, Director of Interoperability, from The Open Group. To access the discussion, please follow the #ogchat hashtag during the allotted discussion time.

What Is a Tweet Jam?

A tweet jam is a one-hour “discussion” hosted on Twitter. The purpose of the tweet jam is to share knowledge and answer questions on relevant and thought-provoking issues. Each tweet jam is led by a moderator and a dedicated group of experts to keep the discussion flowing. The public (or anyone using Twitter interested in the topic) is encouraged to join the discussion.

Participation Guidance

Whether you’re a newbie or veteran Twitter user, here are a few tips to keep in mind:

- Have your first #ogchat tweet be a self-introduction: name, affiliation, occupation.
- Start all other tweets with the question number you’re responding to and add the #ogchat hashtag. Sample: “A1: There are already a number of cities implementing tech to get smarter. #ogchat”
- Please refrain from product or service promotions. The goal of a tweet jam is to encourage an exchange of knowledge and stimulate discussion.
- While this is a professional get-together, we don’t have to be stiff! Informality will not be an issue.
- A tweet jam is akin to a public forum, panel discussion or Town Hall meeting – let’s be focused and thoughtful.

If you have any questions prior to the event or would like to join as a participant, please contact Rob Checkal (@robcheckal or rob.checkal@hotwirepr.com). We anticipate a lively chat and hope you will be able to join!


Filed under real-time and embedded systems, Tweet Jam

Facing the Challenges of the Healthcare Industry – An Interview with Eric Stephens of The Open Group Healthcare Forum

By The Open Group

The Open Group launched its new Healthcare Forum at the Philadelphia conference in July 2013. The forum’s focus is on bringing Boundaryless Information Flow™ to the healthcare industry to enable data to flow more easily throughout the complete healthcare ecosystem through a standardized vocabulary and messaging. Leveraging the discipline and principles of Enterprise Architecture, including TOGAF®, the forum aims to develop standards that will result in higher quality outcomes, streamlined business practices and innovation within the industry.

At the recent San Francisco 2014 conference, Eric Stephens, Enterprise Architect at Oracle, delivered a keynote address entitled “Enabling the Opportunity to Achieve Boundaryless Information Flow” along with Larry Schmidt, HP Fellow at Hewlett-Packard. A veteran of the healthcare industry, Stephens was Senior Director of Enterprise Architecture at Excellus BlueCross BlueShield prior to joining Oracle, and he is an active member of the Healthcare Forum.

We sat down after the keynote to speak with Stephens about the challenges of healthcare, how standards can help realign the industry and the goals of the forum. The opinions expressed here are Stephens’ own, not of his employer.

What are some of the challenges currently facing the healthcare industry?

There are a number of challenges, and I think when we look at it as a U.S.-centric problem, there’s a disproportionate amount of spending that’s taking place in the U.S. For example, if you look at GDP or percentage of GDP expenditures, we’re looking at now probably 18 percent of GDP [in the U.S.], and other developed countries are spending a full 5 percent less than that of their GDP, and in some cases they’re getting better outcomes outside the U.S.

The mere fact that there’s the existence of what we call “medical tourism,” where if I need a hip replacement, I can get it done for a fraction of the cost in another country, with the same or better quality care, and have a vacation—a rehab vacation—at the same time and bring along a spouse or significant other, means there’s a real wide range of disparity there.

There’s also a lack of transparency. Having worked at an insurance company, I can tell you that with the advent of high deductible plans, there’s a need for additional cost information. When I go on Amazon or go to a local furniture store, I know what the cost is going to be for what I’m about to purchase. In the healthcare system, we don’t get that. With high deductible plans, if I’m going to be responsible for a portion or a larger portion of the fee, I want to know what it is. And what happens is, the incentives to drive costs down force the patient to be a consumer. The consumer now asks the tough questions. If my daughter’s going in for a tonsillectomy, show me a bill of materials that shows me what’s going to be done – if you are charging me $20/pill for Tylenol, I’ll bring my own. Increased transparency is what will in turn drive down the overall costs.

I think there’s one more thing, and this gets into the legal side of things. There is an exorbitant amount of legislation and regulation around what needs to be done. And because every time something goes sideways there’s going to be a lawsuit, doctors will prescribe an extra test, an extra X-ray for a patient whether they need it or not.

The healthcare system is designed around a vicious cycle of diagnose-treat-release. It’s not incentivized to focus on prevention and management. Oregon is promoting these coordinated care organizations (CCOs) that would be this intermediary that works with all medical professionals – whether it was physical, mental, dental, even social worker – to coordinate episodes of care for patients. This drives down inappropriate utilization – for example, using an ER as a primary care facility – and drives the medical system towards prevention and management of health.

Your keynote with Larry Schmidt of HP focused a lot on cultural changes that need to take place within the healthcare industry – what are some of the changes necessary for the healthcare industry to put standards into place?

I would say culturally, it goes back to those incentives, and it goes back to introducing this idea of patient-centricity. And for the medical community, to really start recognizing that these individuals are consumers and increased choice is being introduced, just like you see in other industries. There are disruptive business models. As a for instance, medical tourism is a disruptive business model for United States-based healthcare. The idea of pharmacies introducing clinical medicine for routine care, such as what you see at a CVS, Wal-Mart or Walgreens. I can get a flu shot, I can get a well-check visit, I can get a vaccine – routine stuff that doesn’t warrant a full-blown medical professional. It’s applying the right amount of medical care to a particular situation.

Why haven’t existing standards been adopted more broadly within the industry? What will help providers be more likely to adopt standards?

I think the standards adoption is about “what’s in it for me,” the WIIFM idea. It’s demonstrating to providers that utilizing standards is going to help them get out of the medical administration business and focus on their core business, the same way that any other business would want to standardize its information through integration, processes and components. It reduces your overall maintenance costs going forward, and arguably you don’t need a team of billing folks sitting in a doctor’s office because you have standardized exchanges of information.

Why haven’t they been adopted? It’s still a question in my mind. Why would a doctor not want to do that is perhaps a question we’re going to need to explore as part of the Healthcare Forum.

Is it doctors that need to adopt the standards or technologies, or a combination of different constituents within the ecosystem?

I think it’s a combination. We hear a lot about the Affordable Care Act (ACA) and the health exchanges. What we don’t hear about is the legislation to drive toward standardization to increase interoperability. So unfortunately it would seem the financial incentives or things we’ve tried before haven’t worked, and we may simply have to resort to legislation or at least legislative incentives to make it happen because part of the funding does cover information exchanges so you can move health information between providers and other actors in the healthcare system.

You’re advocating putting the individual at the center of the healthcare ecosystem. What changes need to take place within the industry in order to do this?

I think it’s education, a lot of education that has to take place. I think that individuals via the incentive model around high deductible plans will force some of that but it’s taking responsibility and understanding the individual role in healthcare. It’s also a cultural/societal phenomenon.

I’m kind of speculating here, and going way beyond what enterprise architecture or what IT would deliver, but this is a philosophical thing around if I have an ailment, chances are there’s a pill to fix it. Look at the commercials, every ailment say hypertension, it’s easy, you just dial the medication correctly and you don’t worry as much about diet and exercise. These sorts of things – our over-reliance on medication. I’m certainly not going to knock the medications that are needed for folks that absolutely need them – but I think we can become too dependent on pharmacological solutions for our health problems.   

What responsibility will individuals then have for their healthcare? Will that also require a cultural and behavioral shift for the individual?

The individual has to start managing his or her own health. We manage our careers and families proactively. Now we need to focus on our health and not just float through the system. It may come to financial incentives for certain “individual KPIs” such as blood pressure, sugar levels, or BMI. Advances in medical technology may facilitate more personal management of one’s health.

One of the Healthcare Forum’s goals is to help establish Boundaryless Information Flow within the healthcare industry. You’ve said that understanding the healthcare ecosystem will be a key component for that. What does that ecosystem encompass, and why is it important to know that first?

Very simply we’re talking about the member/patient/consumer, then we get into the payers, the providers, and we have to take into account government agencies and other non-medical agents, but they all have to work in concert and information needs to flow between those organizations in a very standardized way so that decisions can be made in a very timely fashion.

It can’t be bottled up, it’s got to be provided to the right provider at the right time; otherwise, best case, it’s going to cost more to manage all the actors in the system. Worst case, somebody dies or there is a “never event” due to misinformation or lack of information during the course of care. The idea of Boundaryless Information Flow gives us the opportunity to standardize and have easily accessible information – and by the way secured – and it can really aid in that decision-making process going forward. It’s no different than Wal-Mart knowing what kind of merchandise sells well before and after a hurricane (i.e., beer and toaster pastries, BTW). It’s the same kind of real-time information that’s made available to a Google car so it can steer its way down the road. It’s that kind of viscosity needed to make the right decisions at the right time.

Healthcare is a highly regulated industry. How can Boundaryless Information Flow and data collection on individuals be achieved while still protecting patient privacy?

We can talk about standards and the flow and the technical side. We need to focus on the security and privacy side. And there’s going to be a legislative side, because we’re going to touch on a real fundamental data governance issue – who owns the patient record? Each actor in the system thinks they own the patient record. If we’re going to require more personal accountability for healthcare, then shouldn’t the consumer have more ownership?

We also need to address privacy disclosure regulations to avoid catastrophic data leaks of protected health information (PHI). We need bright IT talent to pull off the integration we are talking about here. We also need folks who are well versed in the privacy laws and regulations. I’ve seen project teams of 200 have up to eight folks just focusing on the security and privacy considerations. We can argue about headcount later but my point is the same – one needs some focused resources around this topic.

What will standards bring to the healthcare industry that is missing now?

I think the standards, and more specifically the harmonization of the standards, is going to bring increased maintainability of solutions, I think it’s going to bring increased interoperability, and I think it’s going to bring increased opportunities too. We see mobile computing or even DropBox, which has API hooks into all sorts of tools, and it’s well integrated – so I can move files between devices, I can move files between apps, because they have hooks and it’s easy to work with. So it’s building these communities of developers, apps and technical capabilities that makes it easy to move the personal health record, for example, back and forth between providers, so that it’s not a cataclysmic event to integrate a new version of electronic health records (EHR) or to integrate the next version of an EHR. It’s this idea of standardization, but also some flexibility that goes into it.

Are you looking just at the U.S. or how do you make a standard that can go across borders and be international?

It is a concern, much of my thinking and much of what I’ve conveyed today is U.S.-centric, based on our problems, but many of these interoperability problems are international. We’re going to need to address it; I couldn’t tell you what the sequence is right now. There are other considerations, for example, single vs. multi-payer—that came up in the keynote. We tend to think that if we stay focused on the consumer/patient we’re going to get it for all constituencies. It will take time to go international with a standard, but it wouldn’t be the first time. We have a host of technical standards for the Internet (e.g., TCP/IP, HTTP). The industry has been able to instill these standards across geographies and vendors. Admittedly, the harmonization of health care-related standards will be more difficult. However, as our world shrinks with globalization an international lens will need to be applied to this challenge. 

Eric Stephens (@EricStephens) is a member of Oracle’s executive advisory community where he focuses on advancing clients’ business initiatives leveraging the practice of Business and Enterprise Architecture. Prior to joining Oracle he was Senior Director of Enterprise Architecture at Excellus BlueCross BlueShield, leading the organization with architecture design, innovation, and technology adoption capabilities within the healthcare industry.

 


Filed under Conference, Data management, Enterprise Architecture, Healthcare, Information security, Standards, TOGAF®

The Open Group San Francisco 2014 – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

Day two, February 4th, of The Open Group San Francisco conference kicked off with a welcome and opening remarks from Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects.

Nunn introduced Allen Brown, President and CEO of The Open Group, who provided highlights from The Open Group’s last quarter.  As of Q4 2013, The Open Group had 45,000 individual members in 134 countries hailing from 449 member companies in 38 countries worldwide. Ten new member companies have already joined The Open Group in 2014, and 24 members joined in the last quarter of 2013, with the first member company joining from Vietnam. In addition, 6,500 individuals attended events sponsored by The Open Group in Q4 2013 worldwide.

Updates were provided on The Open Group’s ongoing work, including the FACE™ Consortium, the DirectNet® Waveform Standard, the Architecture Forum, the ArchiMate® Forum, the Open Platform 3.0™ Forum and the Security Forum.

Of note were the ongoing development of TOGAF® and the introduction of a three-volume work comprising individual volumes on the TOGAF framework, guidance, and tools and techniques for the standard, as well as collaborative work that allows the ArchiMate modeling language to be used for risk management in enterprise architectures.

In addition, the Open Platform 3.0 Forum has already put together 22 business use cases outlining ROI and business value for various uses related to technology convergence. The Cloud Work Group’s Cloud Reference Architecture has also been submitted to ISO for international standards certification, and the Security Forum has introduced an Open FAIR risk management certification program for individuals.

The morning plenary centered on The Open Group’s Dependability through Assuredness™ (O-DA) Framework, which was released last August.

Speaking first about the framework was Dr. Mario Tokoro, Founder and Executive Advisor for Sony Computer Science Laboratories. Dr. Tokoro gave an overview of the Dependable Embedded OS project (DEOS), a large national project in Japan originally intended to strengthen the country’s embedded systems. After considerable research, the project leaders discovered they needed to consider whether large, open systems could be dependable when it came to business continuity, accountability and ensuring consistency throughout the systems’ lifecycle. Because the boundaries of large open systems are ever-changing, the project leaders knew they must put together dependability requirements that could accommodate constant change, allow for continuous service and provide continuous accountability for the systems based on consensus. As a result, they put together a framework to address both the change accommodation cycle and failure response cycles for large systems – this framework was donated to The Open Group’s Real-Time Embedded Systems Forum and released as the O-DA standard.

Dr. Tokoro’s presentation was followed by a panel discussion on the O-DA standard. Moderated by Dave Lounsbury, VP and CTO of The Open Group, the panel included Dr. Tokoro; Jack Fujieda, Founder and CEO of ReGIS, Inc.; T.J. Virdi, Senior Enterprise IT Architect at Boeing; and Bill Brierly, Partner and Senior Consultant at Conexiam. The panel discussed the importance of openness for systems, reiterating the conference theme of boundaries and the realities of having standards that can ensure openness and dependability at the same time. They also discussed how the O-DA standard provides end-to-end requirements for system architectures that account both for accommodating changes within the system and for accountability for it.

Lounsbury concluded the track by reiterating that assuring systems’ dependability is fundamental not only to The Open Group mission of Boundaryless Information Flow™ and interoperability, but also to preventing large system failures.

Tuesday’s late morning sessions were split into two tracks, with one track continuing the Dependability through Assuredness theme, hosted by Joe Bergmann, Forum Chair of The Open Group’s Real-Time and Embedded Systems Forum. In this track, Fujieda and Brierly furthered the discussion of O-DA, outlining the philosophy and vision of the standard and providing a roadmap for it.

In the morning Business Innovation & Transformation track, Alan Hakimi, Consulting Executive at Microsoft, presented “Zen and the Art of Enterprise Architecture: The Dynamics of Transformation in a Complex World.” Hakimi emphasized that transformation needs to focus on a holistic view of an organization’s ecosystem and motivations, economics, culture and existing systems to help foster real change. Drawing on Buddhist philosophy, he presented an eightfold path to transformation that allows enterprise architects to approach transformation and discuss it with other architects and business constituents in a way that is meaningful to them and allows for complexity and balance.

This was followed by “Building the Knowledge-Based Enterprise,” a session given by Bob Weisman, Head Management Consultant for Build the Vision.

Tuesday’s afternoon sessions centered on a number of topics including Business Innovation and Transformation, Risk Management, ArchiMate®, TOGAF® tutorials and case studies, and Professional Development.

In the ArchiMate track, Vadim Polyakov of Inovalon, Inc., presented “Implementing an EA Practice in an Agile Enterprise,” a case study centered on how his company integrated its enterprise architecture with the principles of agile development and how it customized the ArchiMate framework as part of the process.

The Risk Management track featured William Estrem, President of Metaplexity Associates, and Jim May of Windsor Software, who discussed how the Open FAIR standard can be used in conjunction with TOGAF® 9.1 to enhance risk management in organizations in their session, “Integrating Open FAIR Risk Analysis into the Enterprise Architecture Capability.” Jack Jones, President of CXOWARE, also discussed the best ways of “Communicating the Value Proposition” for cohesive enterprise architectures to business managers using risk management scenarios.

The plenary sessions and many of today’s track sessions can be viewed on The Open Group’s Livestream channel at http://new.livestream.com/opengroup.

The day culminated with dinner and a Lion Dance performance in honor of Chinese New Year performed by Leung’s White Crane Lion & Dragon Dance School of San Francisco.

We would like to express our gratitude to the following sponsors for their support: BIZZDesign, Corso, Good e-Learning, I-Server and Metaplexity Associates.


O-DA standard panel discussion with Dave Lounsbury, Bill Brierly, Dr. Mario Tokoro, Jack Fujieda and TJ Virdi


Filed under Conference, Enterprise Architecture, Enterprise Transformation, Standards, TOGAF®, Uncategorized

Open Platform 3.0™ to Help Rally Organizations in Innovation and Development

by Andy Mulholland, Former Global CTO, Capgemini

The Open Platform 3.0™ initiative, launched by The Open Group, provides a forum in which organizations, including standards bodies as well as users and product vendors, can coordinate their approach to new business models and new practices for the use of IT, can define or identify common vendor-neutral standards, and can foster the adoption of those standards in a cohesive and aligned manner to ensure an integrated, commercially viable set of solutions.

The goal is to enable effective business architectures that support a new generation of interoperable business solutions quickly and at low cost, using new technologies and provisioning methods while integrating with existing IT environments.

Acting on behalf of its core franchise base of CIOs, and in association with the US and European CIO associations, Open Platform 3.0 will serve as a rallying point for all involved in the development of the technology solutions that new, innovative business models and practices require.

There is a distinct sea change in the way that organizations are adopting and using a range of new technologies, mostly relating to a front-office revolution in how business is performed with their customers, suppliers and even within their markets. More than ever, The Open Group mission of Boundaryless Information Flow™ through the mutual development of technology standards and methods is relevant to this change.

The competitive benefits are driving rapid business adoption, but mostly through a series of pilots owned and driven by business management, usually with little long-term thought as to scale, compliance, security or even data integrity. The CIO is rightly concerned about these issues, but too often, in the absence of experience in this new environment and the ability to offer constructive approaches, these concerns are viewed as unacceptable barriers.

This situation is further inflamed by the sheer variety of products and the different groups, both technological and business, trying to develop workable standards for particular elements. Currently there is little, if any, overall coordination or alignment of these individually valuable elements toward a true ‘system’ approach, with understandable methods to deliver a comprehensive enterprise approach that truly serves the full business purpose.

The business imperatives supported by the teaching of business schools are focused on time as a key issue and advocate small, fast projects built on externally provisioned, pay-per-use cloud services. These are elements of the sea change that have to be accepted, and indeed they will grow as society overall expects to do business and meet its own requirements in the same way.

Many of these changes lie outside the knowledge, experience or often the power of current IT departments, yet those departments rightly understand that, to continue in their essential role of maintaining core internal operations and commercial stability, this change must introduce a new generation of deployment, integration and management methods. The risk is to continue the polarization that has already started to develop between internal IT operations based on client-server enterprise applications and the external operations of sales and marketing using browser- and Cloud-based apps and services.

At best this will result in an increasingly decentralized and difficult-to-manage business; at worst, audit and compliance management will report the business as being in breach of financial and commercial rules. Organizations are recognizing this by introducing a new type of role, supported by business schools and universities, termed the Business Architect. Their role in the application of new technology is to determine how to orchestrate complex business processes through Big Data and Big Process from the ‘services’ available to users. This is in many ways a direct equivalent, though with different skills, to an Enterprise Architect in conventional IT, who focuses on data integrity in designing applications and their integration.

The Open Group’s massive experience in the development of TOGAF®, together with its widespread global acceptance, leads to a deep understanding of the problem, the issues, and how to develop a solution both for Business Architecture and for its integration with Enterprise Architecture.

The Open Group believes that it is uniquely positioned to play this role due to its extensive experience in the development of standards on behalf of user enterprises to enable Boundaryless Information Flow, including its globally recognized Enterprise Architecture standard, TOGAF. Moreover, it believes, based on feedback received from many directions, that this move will be welcomed by many of those involved in the various aspects of this exciting period of change.

Andy Mulholland joined Capgemini in 1996, bringing with him thirteen years of experience from previous senior IT roles across all major industry sectors.

In his former role as Global Chief Technology Officer, Andy was a member of the Capgemini Group management board and advised on all aspects of technology-driven market changes, as well as serving on the technology advisory boards of several organizations and enterprises.

A popular speaker with many appearances at major events around the world, and frequently quoted by the press, Andy was voted one of the top 25 most influential CTOs in the world by InfoWorld USA in 2009, and in 2010 his CTOblog was voted best blog for business managers and CIOs for the third year running by Computing Weekly UK. Andy retired in June 2012 but still maintains an active association with the Capgemini Group, and his activities across the industry led to his achieving 29th place in the prestigious 2012 USA ExecRank ratings in the ‘Top CTOs’ category.


Filed under Open Platform 3.0, TOGAF

Evolving Business and Technology Toward an Open Platform 3.0™

By Dave Lounsbury, Chief Technical Officer, The Open Group

The role of IT within the business is one that constantly evolves and changes. If you’ve been in the technology industry long enough, you’ve likely had the privilege of seeing IT grow to become integral to how businesses and organizations function.

In his recent keynote “Just Exactly What Is Going On in Business and Technology?” at The Open Group London Conference in October, Andy Mulholland, former Global Chief Technology Officer at Capgemini, discussed how the role of IT has changed from being traditionally internally focused (inside the firewall, proprietary, a few massive applications, controlled by IT) to one that is increasingly externally focused (outside the firewall, open systems, lots of small applications, increasingly controlled by users). This is due to the rise of a number of disruptive forces currently affecting the industry, such as BYOD, Cloud, social media tools, Big Data, the Internet of Things and cognitive computing. As Mulholland pointed out, IT today is about how people are using technology in the front office. They are bringing their own devices, they are using apps to get outside of the firewall, and they are moving further and further away from traditional “back office” IT.

Due to the rise of the Internet, the client/server model of the 1980s and 1990s that kept everything within the enterprise is no more. That model has been subsumed by a model in which development is fast and iterative and information is constantly being pushed and pulled primarily from outside organizations. The current model is also increasingly mobile, allowing users to get the information they need anytime and anywhere from any device.

At the same time, there is a push from business and management for increasingly rapid turnaround times and smaller scale projects that are, more often than not, being sourced via Cloud services. The focus of these projects is on innovating business models and acting in areas where the competition does not act. These forces are causing polarization within IT departments between internal IT operations based on legacy systems and new external operations serving buyers in business functions that are sourcing their own services through Cloud-based apps.

Just as UNIX® provided a standard platform for applications on single computers and the combination of servers, PCs and the Internet provided a second platform for web apps and services, we now need a new platform to support the apps and services that use cloud, social, mobile, big data and the Internet of Things. Rather than merely aligning with business goals or enabling business, the next platform will be embedded within the business as an integral element bringing together users, activity and data. To work properly, this must be a standard platform so that these things can work together effectively and at low cost, providing vendors a worthwhile market for their products.

Industry pundits have already begun to talk about this layer of technology. Gartner calls it the “Nexus of Forces.” IDC calls it the “third platform.” At The Open Group, we refer to it as Open Platform 3.0™, and earlier this year we announced a new Forum to address how organizations can adopt and support these technologies. Open Platform 3.0 is meant to enable organizations (including standards bodies, users and vendors) to coordinate their approaches to the new business models and IT practices driving the new platform, in order to support a new generation of interoperable business solutions.

As is always the case with technologies, a point is reached where technical innovation must transition to business benefit. Open Platform 3.0 is, in essence, the next evolution of computing. To help the industry sort through these changes and create vendor-neutral standards that foster the cohesive adoption of new technologies, The Open Group must also evolve its focus and standards to respond to where the industry is headed.

The work of the Open Platform 3.0 Forum has already begun. Initial actions for the Forum have been identified and were shared during the London conference. Our recent survey on Convergent Technologies confirmed the need to address these issues. Of those surveyed, 95 percent of respondents felt that converged technologies were an opportunity for business, and 84 percent of solution providers are already dealing with two or more of these technologies in combination. Respondents also saw vendor lock-in as a potential hindrance to using these technologies, underscoring the need for an industry standard to address interoperability. In addition to the survey, the Forum has also produced an initial Business Scenario to begin to address these industry needs and formulate requirements for this new platform.

If you have any questions about Open Platform 3.0 or if you would like to join the new Forum, please contact Chris Harding (c.harding@opengroup.org) for queries regarding the Forum or Chris Parnell (c.parnell@opengroup.org) for queries regarding membership.

 

Dave Lounsbury is Chief Technical Officer (CTO) and Vice President, Services for The Open Group. As CTO, he ensures that The Open Group’s people and IT resources are effectively used to implement the organization’s strategy and mission. As VP of Services, Dave leads the delivery of The Open Group’s proven processes for collaboration and certification, both within the organization and in support of third-party consortia. Dave holds a degree in Electrical Engineering from Worcester Polytechnic Institute and is the holder of three U.S. patents.

 

 


Filed under Cloud, Data management, Future Technologies, Open Platform 3.0, Standards, Uncategorized, UNIX

The Open Group London – Day Two Highlights

By Loren K. Baynes, Director, Global Marketing Communications

We eagerly jumped into the second day of our Business Transformation conference in London on Tuesday, October 22nd! The setting was the magnificent Central Hall Westminster.

Steve Nunn, COO of The Open Group and CEO of the Association of Enterprise Architects (AEA), started off the morning by introducing our plenary, which was based on Healthcare Transformation. Steve noted that the numbers in healthcare spending are huge and that bringing Enterprise Architecture (EA) to healthcare will help with efficiencies.

The renowned Dr. Peter Sudbury, Healthcare Specialist with HP Enterprise Services, discussed the healthcare crisis (dollars, demand, demographics), the new healthcare paradigm, and barriers to change and innovation. Dr. Sudbury also commented on the real drivers of healthcare costs: healthcare inflation is intrinsically higher, innovation increases cost, and productivity improvements lag other industries.

Dr. Peter Sudbury

Dr. Sudbury, Larry Schmidt (Chief Technologist, HP) and Roar Engen (Head of Enterprise Architecture, Helse Sør-Øst RHF, Norway) participated in the Healthcare Transformation Panel, moderated by Steve Nunn.  The group discussed opportunities for improvement by applying EA in healthcare.  They mentioned that physicians, hospitals, drug manufacturers, nutritionists, etc. should all be working together and using Boundaryless Information Flow™ to ensure data is smoothly shared across all entities.  It was also stated that TOGAF® is beneficial for efficiencies.

Following the panel, Dr. Mario Tokoro (Founder & Executive Advisor of Sony Computer Science Laboratories, Inc., and DEOS Project Leader at the Japanese Science & Technology Agency) reviewed the Dependability through Assuredness™ standard, a standard of The Open Group.

The conference also offered many sessions in Finance/Commerce, Government and Tutorials/Workshops.

Margaret Ford of Consult Hyperion, UK, and Henk Jonkers of BIZZdesign, Netherlands, discussed “From Enterprise Architecture to Cyber Security Risk Assessment”. The key takeaways were that complex cyber security risks require systematic, model-based risk assessment, and that attack navigators can provide this by linking ArchiMate® to the Risk Taxonomy.
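As a rough illustration of how a model-based assessment can quantify risk once architecture elements are linked to the Risk Taxonomy, the sketch below uses simple point estimates. The element name and figures are invented, and the actual Open FAIR approach works with calibrated ranges and Monte Carlo simulation rather than single values.

```python
# Illustrative point-estimate sketch of an Open FAIR-style risk calculation
# for a single (hypothetical) ArchiMate application component.
# The real Risk Taxonomy works with ranges and simulation, not single numbers.

def loss_event_frequency(threat_event_frequency: float, vulnerability: float) -> float:
    # Vulnerability here: probability (0..1) that a threat event becomes a loss event
    return threat_event_frequency * vulnerability

def expected_annual_loss(lef: float, loss_magnitude: float) -> float:
    # Risk summarized as loss event frequency x average loss magnitude
    return lef * loss_magnitude

# Hypothetical figures for an application component "Claims Portal"
tef = 4.0                   # threat events per year
vulnerability = 0.25        # fraction of threat events expected to succeed
loss_magnitude = 50_000.0   # average loss per event, in currency units

lef = loss_event_frequency(tef, vulnerability)
print(f"Claims Portal expected annual loss: {expected_annual_loss(lef, loss_magnitude):,.0f}")
```

In practice the inputs would be estimated per modeled element or threat scenario and then aggregated across the architecture model.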

“Applying Service-Oriented Architecture within a Business Technology Environment in the Finance Sector” was presented by Gerard Peters, Managing Consultant, Capgemini, The Netherlands. This case study is part of a white paper on Service-Oriented Architecture for Business Technology (SOA4BT).

You can view all of the plenary and many of the track presentations at livestream.com.  And for those who attended, full conference proceedings will be available.

The night culminated with a spectacular experience on the London Eye, the largest Ferris wheel in Europe, located on the River Thames.


Filed under ArchiMate®, Cloud/SOA, Enterprise Architecture, Enterprise Transformation, Healthcare, Professional Development, Service Oriented Architecture, TOGAF®

New Brunswick Leverages TOGAF®

The OCIO of GNB Announces an Ambitious EA Roadmap using TOGAF® and Capability-Based Thinking

On Wednesday, September 25th, the Office of the Chief Information Officer (OCIO) for the Government of New Brunswick (GNB) held an Enterprise Architecture (EA) Symposium for the vendor community at the Delta Fredericton. The event drew well over a hundred attendees from the vendor community across the province, the Atlantic region and other parts of Canada.

During this event, Christian Couturier, GNB CIO, announced an EA roadmap across the domains of Information, Application, Technology and Security, the areas of mandate for the OCIO. He presented a vision for transformation at GNB that would make its departments more efficient and effective by standardizing their practices and services around TOGAF® and capability-based thinking. Christian also offered valuable insights into how the vendor community can engage with GNB and support the OCIO in realizing its EA vision and roadmap.

TOGAF® and capability-based thinking were prominent themes throughout the symposium, referenced and demonstrated throughout the presentations by Christian and his extended EA team. The OCIO has also created a strong governance structure that positions it as an influential stakeholder in provisioning solutions across its domains. In the near term, vendors will need to show not only how their solutions meet functional requirements but also how they explicitly improve capability performance. This will help GNB improve the definition and management of contracts with third-party vendors.

Each Architecture Domain Chief presented the roadmap for their area in breakout sessions and answered questions from vendors. These sessions offered further insight into the EA roadmap and its impact on particular areas within GNB, such as current efforts in Service Oriented Architecture.

Here is a summary of the benefits Christian Couturier strives to achieve:

  • Improve transparency and accountability of investment in information technology across government departments
  • Rationalize portfolios of technologies and applications across GNB departments
  • Improve GNB’s ability to respond to citizen needs faster and more cost effectively
  • Develop internal resource competencies for achieving self-sufficiency

QRS has been working with the OCIO and GNB departments since March 2013 to enhance their TOGAF and capability-based thinking competencies. QRS will continue to work with the OCIO and GNB, and looks forward to their successes both as a corporate citizen and as individual residents who benefit from GNB’s services.

Originally posted on the QRS blog. See http://www.qrs3e.com/gnb_ocio_togaf/

Christian Couturier is Chief Information Officer of the Government of New Brunswick (GNB), where he leads, enables and oversees the Information Management and Information Communication Technology (IM&ICT) investments for the enterprise. Christian’s leadership has been recognized by several awards, including Canada’s “Top 40 Under 40.” His research team’s success continues to be celebrated through many international, national and local awards, including the 2007 Canadian Information Productivity Awards (CIPA) Gold Award of Excellence for innovation in the Health Care Sector.

LinkedIn Profile <http://ca.linkedin.com/pub/christian-couturier/46/b55/713/>


Filed under Enterprise Architecture, Service Oriented Architecture, Standards, TOGAF®

The Open Group TweetJam on Digital-Disruption – by Tom Graves

On 2 October 2013, The Open Group ran one of its occasional ‘TweetJam’ Twitter-discussions – also known as an #ogChat. This time it was on digital disruption – disruption to existing business-models, typically (but, as we will see, not only) by changes in technology.

I think I captured almost all of the one-hour conversation – all tweets tagged with the #ogChat hashtag – but I may well have missed a few here and there. I’ve also attempted to bring the cross-chat (@soandso references) into correct sense-order, but I’ll admit I’m likely to have made more errors there. Each text-line is essentially as published on Twitter, minus the RT @ prefix and the identifying #ogChat tag.

The legal bit: Copyright of each statement is as per Twitter’s published policy: I make no claim whatsoever to any of the tweets here other than my own (i.e. tetradian). The material is re-published here under ‘fair-use’ rules for copyright, as a public service to the enterprise-architecture community.

The TweetJam was split into seven sections, each guided by a question previously summarised on the Open Group website – see Open Group, ‘Leading Business Disruption Strategy with Enterprise Architecture‘. I’ve also added a few extra comments of my own after each section.

Introductions

(The TweetJam started with a request for each person to introduce themselves, which also serves as a useful cross-reference between name and Twitter-ID. Not everyone who joined in the TweetJam did this, but most did so – enough to help make sense of the conversation, anyway.)

  • theopengroup: Please introduce yourself and get ready for question 1, identified by “Q1″ …and so on. You may respond with “A1″ and so on using #ogChat // And do tweet your agreement/disagreement with other participants’ views using #ogChat, we’re interested to hear from all sides #EntArch
  • enterprisearchs: Hi all, from Hugh Evans, Enterprise Architects (@enterprisearchs), CEO and Founder
  • tetradian: Tom Graves (tetradian)
  • eatraining: Craig Martin
  • TheWombatWho: Andrew Gallagher – Change Strategy / Business Architect
  • chrisjharding: Hi from Chris Harding, The Open Group Forum Director for Open Platform 3.0
  • dianedanamac: Good day! Social Media Manager, Membership & Events at @theopengroup   I’m your contact if you have questions on The Open Group.
  • InfoRacer: Chris Bradley
  • David_A_OHara: Hi all, Dave O’Hara here, enteprise/biz architect
  • TalmanAJ: Aarne Talman – IT Startegy/EA consultant at Accenture
  • zslayton: Good morning.  Zach Slayton here from Collaborative Consulting @consultcollab
  • efeatherston: Good morning. Ed Featherston, Enterprise Arch from Collaborative Consulting
  • filiphdr: Filip Hendrickx, business architect @AE_NV
  • Frustin_Jetwell: Hello, I’m late, Justin Fretwell here, technical enterprise architecture

Question 1: What is ‘disruption’?

  • theopengroup: Let’s kick things off: Q1 What is #Disruption? #EntArch
  • TheWombatWho: A1 Disruption is normality
  • enterprisearchs: A1 Disruptors offer a new #BizModel that defines a different frontier of value
  • enterprisearchs: A1 Disruptors often introduce new technologies or processes that set them apart
  • chrisjharding: A1 Could be many things. Cloud, mobile, social, and other new technologies are disrupting the relation between business and IT
  • tetradian: A1: anything that changes business-as-usual (scale from trivial to world-shaking)
  • enterprisearchs: A1 Disruptors offer equal or better performance at prices incumbents can’t match
  • TheWombatWho: A1 agree with @tetradian but add that it is normal state of things.
  • David_A_OHara: A1,  not just tech-led disruption, but consumers actively driving innovation by finding new ways to use tech in work & social lives
  • zslayton: A1:  Disruptors are anything that breaks a norm or widely-held paradigm
  • enterprisearchs: A1 #Disruption begins when the entrant catches up to incumbents
  • InfoRacer: A1 Disruption is inevitable & BAU for many organisations these day
  • chrisjharding: @TheWombatWho Yes we live in disruptive (and interesting) times.
  • enterprisearchs: A1 Thanks to disruptive forces business models now have a much shorter shelf-life
  • DadaBeatnik: A1: To disrupt doesn’t mean more of the same. Example – iPhone was a true disrupter – no more Blackberry!
  • TalmanAJ: A1: Business disruptors offer new business model(s).
  • eatraining: A1 Innovation that creates a new value network or reorganized value system
  • TheWombatWho: A1 Disruptors can be global mega trends but can be localised.  Localised can provide ‘canary down the mine’ opportunity
  • TalmanAJ: A1: IT disruptors fundamentally change the way IT supports business models or change the business model
  • tetradian: .@TheWombatWho: A1 “…but add that [disruption] is normal state of things” – problem is that many folks don’t recognise that! :-)
  • chrisjharding: @David_A_OHara and disrupting traditional organization because they want to use it hands on, not through IT department
  • efeatherston: @chrisjharding good point on the bypassing IT, thats the #mobile disruption in full force
  • DadaBeatnik: Re: “disruption” read http://t.co/y0HrM3fcKH
  • eatraining: A1 Digital allows a far more effective entrepreneur and innovator environment, putting disruptive pressures on incumbents

Note an important point that’s perhaps easily missed (as some responders in fact do): that ‘disruption’ may include technology, or may be driven by technology – but that’s not always the case at all. Consider, for example, the huge disruption – on a literally global scale – caused by financial deregulation in the US in the 1980s and beyond: changes in law, not technology.

And, yes, as several people commented above, significant disruptions are becoming more common and more intense – a trend that most of us in EA would probably accept is only accelerating. As some might suggest, “you ain’t seen nothin’ yet…”: certainly the old stable-seeming business-models and seeming-guaranteed ‘sustainable competitive-advantage’ and the like would seem to be like pleasant fantasies from a fast-fading past…

Question 2: What is ‘digital disruption’?

  • theopengroup: Q2 Some interesting views on disruption, but what then, is #DigitalDisruption?
  • efeatherston: A2: disruption that is focused/based on technology issue, changes in technology, how things are done
  • enterprisearchs: A2 Disruptive business models that leverage digital capabilities to create, distribute or market their offerings
  • enterprisearchs: A2 Commonly applies #Cloud, #Mobile, #Social and or #BigData capabilities
  • efeatherston: A2: yes, #SMAC is the latest #digitaldisruption
  • TheWombatWho: A2 key with digital is not the medium it is the shift of power & control to the end user.  Digital enables it but its power shift
  • tetradian: A2: ‘digital’ used to mean technology, also to mean e.g. social/mobile (i.e. not solely technology) or more open business generally
  • enterprisearchs: A2 Many incumbents defend #digitaldisruption by moving to customer centric #BizModel
  • chrisjharding: A2 Disruption caused by digital technology – the main source of enterprise disruption today
  • enterprisearchs: A2 #digitaldisruption is seeing a convergence of business, technology and marketing disciplines
  • eatraining: A1&A2 Disruption not always digital but is it always technological? JEEP disruption on modern warfare
  • zslayton: @TheWombatWho Agreed…excellent point.  Shift towards user is key for #SMAC especially
  • Technodad: @TalmanAJ Agree – but digital disruption also invalidates existing business models.
  • enterprisearchs: A2 #Cloud enables ubiquitous access and effortless scalability
  • enterprisearchs: A2 #Mobile offers access anywhere, anytime and opens up previously untapped socioeconomic segments
  • enterprisearchs: A2 #Social accelerates viral uptake of demand and opinion, creating brand opportunities and threats
  • chrisjharding: @efeatherston They do what works for the business
  • TheWombatWho: A2 @enterprisearchs is it really marketing?  That discipline is going through fundamental change – hardly recognisable old vs new
  • David_A_OHara: @eatraining  real disruption now social rather than purely technical but enabled by seamless integration of tech in daily life
  • zslayton: @enterprisearchs #cloud = effortless scalability…a bit of an over-simplification but I do get your point.
  • efeatherston: @chrisjharding agree completely, just changes the paradigm for IT who are struggling to adapt
  • enterprisearchs: A2 #BigData enables ultra-personalisation of customer experience and powerful market insights
  • InfoRacer: A2 Digital Disruption also means avoiding blind alleys & the “me too” chase after some trends.  Eg #BigData isn’t necessarily…
  • TheWombatWho: @enterprisearchs its where work of Marshall McLuhan is worth a revisit.
  • chrisjharding: @David_A_OHara @eatraining social disruption caused by tech-based social media
  • DadaBeatnik: Some of these answers sound like they come from one of those buzzword phrase generators!
  • InfoRacer: @DadaBeatnik Like Predictive big cloud master data governance ;-) Surely the next big thang!
  • David_A_OHara: @enterprisearchs easier to deploy mobile internet vs fixed in growing economies: demand from developing world is uncharted territory
  • eatraining: A2 Digital reduces barriers to entry and blurs category boundaries
  • efeatherston: @David_A_OHara @eatraining #socialmedia definitely having impact, how people interact with tech in personal now fully into business
  • zslayton: @David_A_OHara @enterprisearchs Business models in developing world also uncharted.  New opportunities and challenges
  • David_A_OHara: @chrisjharding @eatraining yup, we have lived through a rapid (tech-enabled) social revolution almost without realising!
  • TheWombatWho: A2 its not the ‘technology’ it’s what ‘they do with it’ that changes everything.  Old IT paradigms are yet to adapt to this
  • Technodad: @David_A_OHara Agree – Near-ubiquitous global-scale communication channels changes balance between customer and enterprise.
  • chrisjharding: @David_A_OHara @eatraining yes – and it’s not finished yet!
  • InfoRacer: @TheWombatWho Right, it’s not just the technology.  #BigData 3 Vs but without 4th V (value) then big data = little information
  • David_A_OHara: @TheWombatWho Bang on!  so there’s the real challenge for EA, right? Changing the traditional IT mindset…?
  • afigueiredo: A2 Development that transforms lives, businesses, causing impact to global economy

To me there are two quite different things going on, but which are often blurred together:

– ‘digital-disruption proper’ – disruptions within which existing and/or new digital-based technologies are explicitly the core drivers

– ‘disruption-with-digital’: ‘digital’ as a catch-all for sociotechnical changes in which digital-based technologies are, at most, an important yet never the sole enabler – in other words, where the social side of ‘sociotechnical’ is more central than the technology itself

In my experience and understanding, most of so-called ‘digital disruption’ is more correctly in the latter category, not the former. Hence, for example, my comment about the [UK] Government Digital Service: it’s actually far more about changes in the nature of government-services itself – in effect, a much more ‘customer-centric’ view of service – rather than a focus on ‘going digital’ for digital’s sake. This is not to say that the technology doesn’t matter – for example, I do understand and agree with Andrew McAfee’s complaint about critiques of his ‘Enterprise 2.0′ concept, that “it’s not not about the technology” – but again, it’s more sociotechnical, not merely technical as such, and that distinction is often extremely important.

Interestingly, most of the examples cited above as ‘digital-disruptions’ – the often-overhyped ‘cloud’ and ‘big-data’ and suchlike – are ultimately more sociotechnical issues than technical. By contrast, most of the themes I’d see as ‘digital-disruption proper’ – for example, the rapidly-expanding developments around ‘smart-materials’, ‘smart-cities’ and ‘the internet of things’ – don’t get a mention here at all. Odd…

Question 3: What are good examples of disruptive business-models?

  • theopengroup: Q3 Bearing these points in mind, what are good examples of disruptive #Bizmodels? #EntArch
  • enterprisearchs: A3 @Airbnb: Disrupting the hotel industry with a #Cloud & #Social based model to open up lodging capacity for people seeking accom
  • enterprisearchs: A3 @Uber: leveraging #Cloud and #Mobile to release existing capacity in the personal transport industry http://t.co/31Xmj7LwQ6
  • enterprisearchs: A3 @99designs: Rethinking how we access good design through #Social, #Cloud and competitive #crowdsourcing
  • enterprisearchs: A3 @Groupon: re-architecting retail to provide #Social buying power, reducing cost per unit and increasing vendor volumes
  • eatraining: @zslayton Reverse innovation in developing countries producing disruption in developed nations
  • chrisjharding: A3: marketing using social media
  • TheWombatWho: @InfoRacer and combined with behavioural sciences & predictive analytics
  • efeatherston: A3: Netflix is a disruptive business model, they threw the whole cable/broadcast/rental industry on its ears
  • enterprisearchs: A3 @iTunesMusic: creating a #Cloud based platform to lock in customers and deliver #Digital content
  • eatraining: Reverse Innovation in Tech Startups: The Story of Capillary Technologies – @HarvardBiz http://t.co/Ud7UN7ZxzQ
  • TheWombatWho: @David_A_OHara not just mindset but also disciplines around portfolio & programme planning, aspects of project mgmt etc
  • enterprisearchs: A3 @facebook: Using #Social #Cloud #Mobile and #BigData to get you & 1 billion other people to generate their product: your updates
  • David_A_OHara: @enterprisearchs @Groupon Here’s retail disruption: why cant I just walk into store, scan stuff on my phone and walk out with it?
  • zslayton: @efeatherston Absolutely.  Discussed this in a recent blog posts:  http://t.co/zWzzAN4Fsn
  • Technodad: @David_A_OHara @TheWombatWho Don’t assume enterprises lead or control change. Many examples imposed externally, e.g. Music industry
  • eatraining: @efeatherston Agree – @netflix: Shifting the #ValueProposition to low-cost on demand video content from the #Cloud
  • tetradian: A3 (also A2): UK Government Digital Service (GDS) – is ‘digital’, but change of business-service/paradigm is even more important
  • mjcavaretta: Value from #BigData primarily from…  RT @TheWombatWho: @InfoRacer behavioural sciences & #predictive #analytics
  • zslayton: @Technodad @David_A_OHara @TheWombatWho Spot on.  External event triggers change.  Org treats as opportunity/threat. IT must adapt
  • InfoRacer: A3 Expedia, Travelocity etc … where are High st travel agents now?
  • enterprisearchs: A3 ING Direct: delivering a simple #ValueProposition of no-frills and trusted high returns for depositors
  • Technodad: @enterprisearchs Disagree. ITunes was the enterprise consolidation -original disruption was peer-to-peer delivery of ripped music.
  • chrisjharding: @David_A_OHara @enterprisearchs @Groupon or plan a mixed bus/train journey on my ‘phone and download tickets to it?
  • eatraining: A3 DELL – game changing cost structures
  • TheWombatWho: @tetradian Great example.  UK Gov digital is fascinating.  Take that approach & apply it to competitive commercial enviro.
  • eatraining: A3 MOOC Platforms disrupting education? Scalability disruption
  • eatraining: A3 Nespresso – getting us to pay 8 times more for a cup of coffee.
  • tetradian: A3: many non-IT-oriented technologies – nanotechnology, micro-satellites, materials-science (water-filtration etc)
  • filiphdr: @chrisjharding @David_A_OHara @enterprisearchs @Groupon bus/train combo: yes – download tickets: no
  • zslayton: @Technodad @enterprisearchs Maybe.  But now with Google, spotify etc, a new model has emerged.

Some good examples, but I’ll admit that I find it disappointing that almost all of them focus primarily on shunting data around in the ‘social/local/mobile’ space – yes, all of them valid, but a very narrow subset of the actual ‘digital-disruption’ that’s going on these days. (Near the end, there is a good example of the broader view: “Nespresso – getting us to pay 8 times more for a cup of coffee”.)

As enterprise-architects and business-architects, we really do need to break out of the seemingly-reflex assumptions of IT-centrism, and learn instead to look at the contexts from a much broader perspective. For example, a common illustration I use is that the key competition for Netflix is not some other streaming-video provider, but booksellers, bars and restaurants – other types of services entirely, but that compete for the same social/time-slots in potential-customers’ lives.

Question 4: What is the role of enterprise-architecture in driving and responding to disruption?

  • enterprisearchs: A4 #EntArch will identify which capabilities will be needed, and when, to enable disruptive strategies
  • efeatherston: A4: #entArch is key to surviving tech disruption, need the high level view/impact on the business
  • chrisjharding: A4: #EntArch must be business-led, not technology-led
  • InfoRacer: A4 #EntArch can play an orchestration, impact analysis and sanity check role
  • efeatherston: Agree 100%, its all about the impact to the business RT @chrisjharding: A4: #EntArch must be business-led, not technology-led
  • enterprisearchs: A4 #EntArch will lead enterprise response to #disruption by plotting the execution path to winning strategies http://t.co/FdgqXOVKug
  • chrisjharding: A4: and #Entarch must be able to focus on business differentiation not common technology
  • afigueiredo: A4 #entarch should be flexible to accommodate/support #disruption caused by new advances and changes
  • TheWombatWho: A4 help clarify & stick to intent of business.  It is key in choosing the critical capabilities vs non essential capabilities
  • enterprisearchs: A4 #EntArch will provide the strategic insights to identify what business changes are viable
  • chrisjharding: @InfoRacer or enable business users to orchestrate – give them the tools
  • enterprisearchs: A4 #EntArch will provide the strategic infrastructure to bring cohesion to business change
  • TalmanAJ: A4: identify existing and needed business and IT capabilities and ensure agility to respond to disruption #entarch
  • efeatherston: a4: #entarch needs to work with business to determine how to leverage/use/survive  #disruption to help the business processes
  • David_A_OHara: @enterprisearchs so you need very business-savvy and creative EAs (no longer a tech discipline but sustainable biz innovation role?)
  • InfoRacer: RT @enterprisearchs: A4 #EntArch will provide the strategic insights to identify what business changes are viable
  • TheWombatWho: A4 have to travel light so linking intent to critical capability is essential if Biz is to remain flexible & adaptable
  • zslayton: A4 #EntArch must steer the IT ship to adapt in the new world.  steady hand on the tiller!
  • TheWombatWho: @enterprisearchs agree
  • chrisjharding: @David_A_OHara @enterprisearchs Yup!
  • efeatherston: @David_A_OHara @enterprisearchs Agree, EA’S need both business and tech, act as the bridge for the business to help them respond
  • enterprisearchs: A4 #EntArch will assist in managing lifecycles at the #BizModel, market model, product & service and operating model levels
  • zslayton: @chrisjharding Absolutely.  Focus on commoditized tech will lead to lagging IT.  Focus on differentiators is key.
  • eatraining: A4 Business design and architecture will facilitate a more structured approach to business prototyping
  • tetradian: A4: identifying/describing the overall shared-enterprise space (tech + human); also lean-startup style ‘jobs to be done’ etc
  • Technodad: @TheWombatWho yes, but a tough job- how would #entarch have advised Tower Records in face of digital music disruption, loss of ROE?
  • David_A_OHara: @Technodad @TheWombatWho good challenge: same question can be posed re: Game and HMV in the UK…
  • eatraining: A4 Business model prototyping is the conversation we have with our ideas – @tomwujec
  • tetradian: @eatraining re business-prototyping – yes, strong agree
  • tetradian: A4 for ‘digital disruption’, crucial that #entarch covers a much broader space than just IT – pref. out to entire shared-enterprise
  • enterprisearchs: @tetradian agree – the boundaries of the enterprise are defined by the value discipline orientation, not by the balance sheet

In contradiction to what I said just above, that too-common predominance of IT-centrism in current EA is not so much in evidence here. It’s a pleasant contrast, but it doesn’t last…

Question 5: Why is enterprise-architecture well placed to respond to disruption?

  • theopengroup: Q5 And on a similar note, what is the role of #EntArch in driving and responding to #disruption?
  • enterprisearchs: A5 #EntArch has a unique appreciation of existing and required business capabilities to execute strategy
  • enterprisearchs: A5 Speed to change is now a competitive advantage. #EntArch can map the shortest path to deliver business outcomes
  • filiphdr: A5 Keep short term decisions in line w/ long term vision
  • enterprisearchs: A5 #EntArch provides the tools to better manage investment lifecycles, helping to time capability deployment and divestment
  • InfoRacer: A5 Advising, giving informed analysis, recommendations & impact so the Business officers can make decision with their eyes open!
  • enterprisearchs: A5 #EntArch is the only discipline that stitches strategic and business management disciplines together in a coherent manner
  • enterprisearchs: A5 Speed of response requires a clear mandate and execution plan. #EntArch will deliver this
  • zslayton: @enterprisearchs Agreed.  Toss in leadership and we may have something!
  • TheWombatWho: @Technodad key is “why was tower special?”  Advice, passion & knowledge…..still relevant?  Not the music – was the knowledge.
  • efeatherston: @enterprisearchs well said #entarch
  • enterprisearchs: A5 #EntArch provides vital information about which capabilities currently exist and which need to be acquired or built
  • chrisjharding: A5: Set principles and standards to give consistent use of disruptive technologies in enterprise
  • eatraining: @Technodad @TheWombatWho A few cycles of business model prototyping might have revealed a an opportunity to respond better
  • zslayton: @Technodad @TheWombatWho Netflix again a good example.  Cannibalized their soon to be dying biz to innovate in new biz.
  • TalmanAJ: A5: #entarch should be the tool to drive/respond to disruptions in a controlled manner
  • enterprisearchs: A5 #ArchitectureThinking provides a robust approach to optimise change initiatives and accelerate delivery
  • David_A_OHara: @Technodad @TheWombatWho consider future of games consoles i.e. there will be NO consoles: smart TV will access all digital content
  • TheWombatWho: @David_A_OHara @Technodad HMV interesting – wasn’t  retail store a response to original disruption?
  • chrisjharding: A5: and ensure solutions comply with legal constraints and enterprise obligations
  • zslayton: @David_A_OHara @Technodad @TheWombatWho SmartTV is just a big ole, vertical tablet. #mobile
  • TheWombatWho: @zslayton @David_A_OHara @Technodad and value opportunity is how to keep finger prints off the screen!!!!
  • TheWombatWho: @David_A_OHara @Technodad so accessing content is not where value is?  Where is the value in that arena?
  • enterprisearchs: A5 #EntArch offers insight into which technology capabilities can be strategically applied
  • eatraining: A5 #EntArch can offer an extended value proposition not just into capability mixes but product and market mixes as well
  • TheWombatWho: @enterprisearchs @Technodad yes, yes, yes and yes.  I agree
  • Technodad: @zslayton Exactly. Decision to dump physical & go all-in on digital delivery & content was key. Wonder if #entarch led change?
  • David_A_OHara: @TheWombatWho @Technodad not much if U R console manuf!  Content IS the value, right? Smart TV democratises access to content
  • mjcavaretta: Value from #BigData primarily from…  RT @TheWombatWho: @InfoRacer behavioural sciences & #predictive #analytics
  • TheWombatWho: @enterprisearchs @Technodad getting Biz to talk through canvas & over-laying their discussions with IT choices is essential
  • zslayton: @Technodad I’m guessing product but #entarch had to rapidly adapt IT enviro to enable the product e.g. respond to the disruption
  • efeatherston: @zslayton @Technodad  Netflix seems to thrive on disruption, look at their testing model, chaos monkey , hope #entarch is involved

In a sense, the same as for Question 4: the too-usual IT-centrism is not so much in apparent evidence. Yet actually it is: I don’t think there’s a single example that moves more than half a step outside of some form of IT. Where are the references to EA for smart-materials, smart-sensors, nanotechnologies, changes in law, custom, even religion? – they’re conspicuous only by their absence. Again, we need to stop using IT as ‘the centre of everything’, because it really isn’t in the real-world: instead, we need to rethink our entire approach to architecture, shifting towards a more realistic awareness that “everything and nothing is ‘the centre’ of the architecture, all at the same time”.

Question 6: Who are the key stakeholders enterprise-architecture needs to engage when developing a disruption strategy?

  • theopengroup: Q6 So who are the key stakeholders #EntArch needs to engage when developing a #Disruption strategy?
  • filiphdr: A6 Customers
  • enterprisearchs: A6 #Disruption is the concern of the entire executive team and the board of directors – this is where #EntArch should be aiming
  • TalmanAJ: A6: Business leaders first, IT leaders second
  • chrisjharding: A6: CIOs
  • InfoRacer: A6 Customers, Shareholders, Investors, Partners
  • enterprisearchs: A6 Clearly the CEO is the key stakeholder for #EntArch to reach when contemplating new #BizModels
  • eatraining: A6 Welcome the arrival of the CDO. The chief digital officer. Is this the new sponsor for EA?
  • efeatherston: A6: As has been said, the C-level (not just CIO), as the focus must always be the business drivers, and what impact that has
  • zslayton: @Technodad emphasizing partnership and alignment between Tech #entarch and Biz entarch.
  • eatraining: A6 The Customer!!??
  • Technodad: @mjcavaretta Do you think replacement of knowledge workers by machine learning is next big disruption?
  • InfoRacer: @eatraining Hmm Chief Data Officer, because lets be honest the CIO mostly isn’t a Chief INFORMATION Officer anymore
  • TheWombatWho: A6 starts with biz, increasingly should include customers & suppliers & then IT
  • tetradian: A6: _all_ stakeholder-groups – that’s the whole point! (don’t centre it around any single stakeholder – all are ‘equal citizens’)
  • TheWombatWho: @tetradian A6 agree with Tom.  My bent is Biz 1st but you mine intel from all – whenever opportunity arrives.  Continual engagement

I’ll say straight off that I was shocked at most of the above: a sad mixture of IT-centrism and/or organisation-centrism, with only occasional indications – such as can be seen in Craig Martin’s plea of “The Customer!!??” – of much of a wider awareness. What we perhaps need to hammer home to the entire EA/BA ‘trade’ is that whilst we create an architecture for an organisation, it must be about the ‘enterprise’ or ecosystem within which that organisation operates. Crucial to this is the awareness that the enterprise is much larger than the organisation, and hence we’d usually be wise to start ‘outside-in’ or even ‘outside-out’, rather than the literally self-centric ‘inside-in’ or ‘inside-out’.

Question 7: What current gaps in enterprise-architecture must be filled to effectively lead disruption strategy?

  • theopengroup: Q7, last one guys! What current gaps in #EntArch must be filled to effectively lead #Disruption strategy?
  • enterprisearchs: #EntArch should engage the biz to look at what sustaining & disruptive innovations are viable with the existing enterprise platform
  • zslayton: @efeatherston @Technodad Proactive disruption!  Technical tools to enable and anticipate change.  Great example.
  • enterprisearchs: A7 #EntArch needs to move beyond an IT mandate
  • enterprisearchs: A7 #EntArch needs to be recognised as a key guide in strategic business planning
  • InfoRacer: A7 Engage with biz.  Get away from tech.  Treat Information as real asset, get CDO role
  • eatraining: A7 The #EntArch mandate needs to move out of the IT space
  • chrisjharding: A7: #EntArch needs a new platform to deploy disruptive technologies – Open Platform 3.0
  • zslayton: A7 #entarch involvement during the idea stage of biz, not just the implementation.  True knight at the round table.
  • TheWombatWho: @enterprisearchs @Technodad its one of my best friends.  Evan the discipline of thought process sans formality of canvas
  • enterprisearchs: A7 #EntArchs need to improve their business engagement skills and vocabulary
  • zslayton: @eatraining Agreed!  Balance Biz #entarch with Tech #entarch.
  • efeatherston: A7: #entarch MUST be part of the business planning process, they are the connecting tissue between business drivers and IT
  • David_A_OHara: @theopengroup creative business modelling inc. hypothetical models, not simple IT response to mid term view based on today’s probs
  • TalmanAJ: A7: #entArch needs to move from its IT and technical focus to more business strategy focus
  • eatraining: @efeatherston Agreed
  • efeatherston: A7: #entarch  needs to get business to understand, they are not just the tech guys
  • eatraining: A7 There is room to expand into the products and services space as well as market model space
  • InfoRacer: A7 Common vocabulary eg by exploiting Conceptual model; Information is the lingua franca
  • enterprisearchs: A7 #EntArchs need to be more business-outcome oriented
  • chrisjharding: A7: Open Platform 3.0 #ogP3 will let architects worry about the business, not the technology
  • enterprisearchs: A7 #EntArchs need to be recruited from business domains and taught robust architecture practises
  • Technodad: A7 #EntArch can’t lose role of tracking/anticipating tech change, or business will be blindsided by next disruption.
  • filiphdr: @efeatherston Very true, and that’s a skills & communication challenge
  • eatraining: A7 Architects must focus more on becoming super mixers than on architecture utility development
  • enterprisearchs: A7 #EntArchs need to be experts in the application of #Cloud, #Mobile, #Social, #BigData and #Digital strategy
  • zslayton: @enterprisearchs Agreed.  We tend to have to push process more than models.  That is often the “ah ha”.  #entarch
  • eatraining: A7 Architecture must focus on actual change in helping design solutions that shift and change behavior as well
  • tetradian: A7: kill off the obsession with IT!!! :-) #entarch needs to cover the whole scope, not the trivial subset that is ‘digital’ alone…
  • enterprisearchs: @tetradian Disagree – Digital is a huge accelerant to #Disruption and #EntArchs in the near term need to have a v strong grip
  • tetradian: RT @enterprisearchs: A7 #EntArch needs to move beyond an IT mandate -> yes yes yes!!!
  • TalmanAJ: @tetradian Yes. Technology is just one aspect of the enterprise. Processes, strategies and people etc. are too.
  • scmunk: @tetradian this shows non-IT importance of #EntArch, also a pipeline for changes http://t.co/O4Cm4D5G7q
  • enterprisearchs: A7 #EntArchs need to be able to clearly articulate business context and motivation http://t.co/Sf4Ci8Ob7P
  • eatraining: @TheWombatWho Roadmap and plans implemented don’t show the true value because stakeholders shift back to old behavior habits.
  • TheWombatWho: A7 need to be evangelist for the ‘value’ in the Biz model not the hierarchy or structure or status quo
  • TheWombatWho: @eatraining agree.  Roadmap is point in time.  Need to establish principles, & links across value chain rather than structural links
  • DadaBeatnik: Never did understand the obsession with IT in #Entarch. Why is this? Not all biz IT-centric. Because of tools/language?
  • TheWombatWho: @DadaBeatnik accident of history?
  • TalmanAJ: @DadaBeatnik Could be historical. Origins of EA are in IT, EA function usually is in IT and EA people usually have IT background.

At least here we did see more awareness of the need to break out of the IT-centric box: it’s just that so many of the responses to the previous questions indicated that much of EA is still very much stuck there. Oh well. But, yeah, good signs that some moves are solidly underway now, at least.

One point I do need to pick up on from the tweets above. Yes, I’ll admit I somewhat dropped back to my usual rant – “kill off the obsession with IT!!! :-) ” – but please, please note that I do still very much include all forms of IT within the enterprise-architecture. I’m not objecting to IT at all: all that I’m saying is that we should not reflexively elevate IT above everything else. In other words, we need to start from an awareness – a strictly conventional, mainstream systemic-awareness – that in a viable ‘architecture of the enterprise’, everything in that ‘ecosystem-as-system’ is necessary to that system, and hence necessarily an ‘equal citizen’ with everything else. Hence I do understand where Hugh Evans (@enterprisearchs) is coming from, in his riposte of “Disagree – Digital is a huge accelerant to #Disruption and #EntArchs in the near term need to have a v strong grip”: in a sense, he’s absolutely right. But the danger – and I’m sorry, but it is a huge danger – is that there’s still such a strong pull towards IT-centrism in current EA that we do need to be explicit in mitigating it at every step of the way. Yes, “digital is a huge accelerant to disruption”, and yes, we do need to be aware of the potential affordances offered by each new technology, yet we must always start from the overall potential-disruption opportunity/risk first – and not from the technology.

Wrap-up

(This consisted of various people saying ‘thank you’, and ‘goodbye’, which is nice and socially-important and suchlike, yet not particularly central to the content of the TweetJam itself: I’ve dropped them from the record here, but you can chase them up on Twitter if you really need them. However, there were a couple of tweets pointing to further resources that might be helpful to some folks, so I’ll finish here with those.)

  • enterprisearchs: Look out for our upcoming webinar: http://t.co/lWvJ630BVJ ‘Leading Business Disruption Strategy with #EntArch’ Oct 10
  • dianedanamac: Thanks for joining! Continue the conversation at #ogLON, The Open Group London event Oct. 21-24

That’s it. Hope that’s been useful, anyways: over to you?

Tom Graves has been an independent consultant for more than three decades, in business transformation, enterprise architecture and knowledge management. His clients in Europe, Australasia and the Americas cover a broad range of industries including banking, utilities, manufacturing, logistics, engineering, media, telecoms, research, defence and government. He has a special interest in architecture for non-IT-centric enterprises, and integration between IT-based and non-IT-based services.

1 Comment

Filed under Business Architecture, Cloud, Enterprise Architecture, Open Platform 3.0

Talking Points on Rationale for Vendors Participation in Standards Efforts

By Terence Blevins, Portfolio Manager, The MITRE Corporation

The following are simple communication messages responding to two high-level questions: what messages are useful when trying to get a company to decide on engaging in a standards effort, and what type of people should get engaged?

Regarding the first question – what messages are useful when trying to get a company to decide on engaging in a standards effort – I have used five key points:

  1. Marketing – if a company has openness and/or interoperability as part of its messaging, it is important to be seen as participating in open standards consortia. It demonstrates corporate commitment – without participation, customers will not really believe the company is walking the talk.
  2. Cost of selling – a company that does not support industry standards is constantly asked to explain why not, which increases the time and cost to sell. Sometimes it doesn’t even make the short list. When products are certified and carry the label, this becomes less of an issue.
  3. Cost of developing interface standards – interface standards are just plain expensive to develop; by becoming involved with an industry group like The Open Group, you leverage others to get a standard done that is likely to have a long, healthy shelf life, at a lower cost than developing it yourself.
  4. Cost of developing solutions – again, implementing a standard is cheaper than developing and testing an interface on your own.
  5. Ability to set the standard – if a company already has a product suite connected through sound interface specifications, as a Platinum member of The Open Group it can submit those specifications on the fast track and be seen as setting the standard. This positions the company as a leader and underlines its commitment to the interoperability message.

On the other question – what type of people should get engaged – I typically emphasize the following:

  • There are two roles – high-level participation at the board, and the standards detail role.
  • The high-level role needs to be filled by someone with a strategic view of the company, who promotes true interoperability and demonstrates a willingness to cooperate with others.
  • The standards detail role needs to be filled by someone close to the architecture and engineering side of the house, someone who can contribute to the standards process – whether in standards development or in establishing the certification program.

©2013 The MITRE Corporation. All rights reserved.

Mr. Blevins is a department head at MITRE. He is a Board Member of The Open Group, representing the Customer Council.

He has been involved with the architecture discipline since the 1980s, much of that time while he was Director of Strategic Architecture at NCR Corporation. He has been involved with The Open Group since 1996, when he was first introduced to the Architecture Forum. He was co-chair of the Architecture Forum and a frequent contributor of content to TOGAF, including the Business Scenario Method.

Mr. Blevins was Vice President and CIO of The Open Group, where he contributed to The Open Group vision of Boundaryless Information Flow™.

He holds undergraduate and Masters degrees in Mathematics from Youngstown State University. He is TOGAF 8 certified.

Comments Off

Filed under Certifications, Standards

Are You Ready for the Convergence of New, Disruptive Technologies?

By Chris Harding, The Open Group

The convergence of technical phenomena such as cloud, mobile and social computing, big data analysis, and the Internet of things that is being addressed by The Open Group’s Open Platform 3.0 Forum™ will transform the way that you use information technology. Are you ready? Take our survey at https://www.surveymonkey.com/s/convergent_tech

What the Technology Can Do

Mobile and social computing are leading the way. Recently, the launch of new iPhone models and the announcement of the Twitter stock flotation were headline news, reflecting the importance that these technologies now have for business. For example, banks use mobile text messaging to alert customers to security issues. Retailers use social media to understand their markets and communicate with potential customers.

Other technologies are close behind. In Formula One motor racing, sensors monitor vehicle operation and feed real-time information to the support teams, leading to improved design, greater safety, and lower costs. This approach could soon become routine for cars on the public roads too.

Many exciting new applications are being discussed. Stores could use sensors to capture customer behavior while browsing the goods on display, and give them targeted information and advice via their mobile devices. Medical professionals could monitor hospital patients and receive alerts of significant changes. Researchers could use shared cloud services and big data analysis to detect patterns in this information, and develop treatments, including for complex or uncommon conditions that are hard to understand using traditional methods. The potential is massive, and we are only just beginning to see it.

What the Analysts Say

Market analysts agree on the importance of the new technologies.

Gartner uses the term “Nexus of Forces” to describe the convergence and mutual reinforcement of social, mobility, cloud and information patterns that drive new business scenarios, and says that, although these forces are innovative and disruptive on their own, together they are revolutionizing business and society, disrupting old business models and creating new leaders.

IDC predicts that a combination of social, cloud, mobile, and big data technologies will drive around 90% of all the growth in the IT market through 2020, and uses the term “third platform” to describe this combination.

The Open Group will identify the standards that will make Gartner’s Nexus of Forces and IDC’s Third Platform commercial realities. This will be the definition of Open Platform 3.0.

Disrupting Enterprise Use of IT

The new technologies are bringing new opportunities, but their use raises problems. In particular, end users find that working through IT departments in the traditional way is not satisfactory. The delays are too great for rapid, innovative development. They want to use the new technologies directly – “hands on”.

Increasingly, business departments are buying technology directly, by-passing their IT departments. Traditionally, the bulk of an enterprise’s IT budget was spent by the IT department and went on maintenance. A significant proportion is now spent by the business departments, on new technology.

Business and IT are not different worlds any more. Business analysts are increasingly using technical tools, and even doing application development, using exposed APIs. For example, marketing folk do search engine optimization, use business information tools, and analyze traffic on Twitter. Such operations require less IT skill than formerly because the new systems are easy to use. Also, users are becoming more IT-savvy. This is a revolution in business use of IT, comparable to the use of spreadsheets in the 1980s.

Also, business departments are hiring traditional application developers, who would once have only been found in IT departments.

Are You Ready?

These disruptive new technologies are changing, not just the IT architecture, but also the business architecture of the enterprises that use them. This is a sea change that affects us all.

The introduction of the PC had a dramatic impact on the way enterprises used IT, taking much of the technology out of the computer room and into the office. The new revolution is taking it out of the office and into the pocket. Cell phones and tablets give you windows into the world, not just your personal collection of applications and information. Through those windows you can see your friends, your best route home, what your customers like, how well your production processes are working, or whatever else you need to conduct your life and business.

This will change the way you work. You must learn how to tailor and combine the information and services available to you, to meet your personal objectives. If your role is to provide or help to provide IT services, you must learn how to support users working in this new way.

To negotiate this change successfully, and take advantage of it, each of us must understand what is happening, and how ready we are to deal with it.

The Open Group is conducting a survey of people’s reactions to the convergence of Cloud and other new technologies. Take the survey, to input your state of readiness, and get early sight of the results, to see how you compare with everyone else.

To take the survey, visit https://www.surveymonkey.com/s/convergent_tech

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.

Comments Off

Filed under Cloud, Future Technologies, Open Platform 3.0, Platform 3.0

NASCIO Defines State of Enterprise Architecture at The Open Group Conference in Philadelphia

By E.G. Nadhan, HP

I have attended and blogged about many Open Group conferences. The keynotes at these conferences, as at other conferences, provide valuable insight into the key messages and the underlying theme – which, for The Open Group Conference in Philadelphia, is Enterprise Architecture and Enterprise Transformation. Therefore, it is no surprise that Eric Sweden, Program Director, Enterprise Architecture & Governance at NASCIO, will be delivering one of the keynotes, “State of the States: NASCIO on Enterprise Architecture”. Sweden asserts that “Enterprise Architecture” provides an operating discipline for the creation, operation, continual re-evaluation and transformation of an “Enterprise.” Not only do I agree with this assertion, but I would add that the proper creation, operation and continuous evaluation of the “Enterprise” systemically drives its transformation. Let’s see how.

Creation. This phase involves the definition of the Enterprise Architecture (EA) in the first place. Most often, this involves defining an architecture that factors in what is in place today while taking into account the future direction. TOGAF® (The Open Group Architecture Framework) provides a framework for developing this architecture from a business, application, data, infrastructure and technology standpoint, in alignment with the overall Architecture Vision and with associated architectural governance.

Operation. EA is not a done deal once it has been defined. It is vital that the EA defined is sustained on a consistent basis with the advent of new projects, new initiatives, new technologies, and new paradigms. As the abstract states, EA is a comprehensive business discipline that drives business and IT investments. In addition to driving investments, the operation phase also includes making the requisite changes to the EA as a result of these investments.

Continuous Evaluation. We live in a landscape of continuous change with innovative solutions and technologies constantly emerging. Moreover, the business objectives of the enterprise are constantly impacted by market dynamics, mergers and acquisitions. Therefore, the EA defined and in operation must be continuously evaluated against the architectural principles, while exercising architectural governance across the enterprise.

Transformation. EA is an operating discipline for the transformation of an enterprise. Enterprise Transformation is not a destination — it is a journey that needs to be managed — as characterized by Twentieth Century Fox CIO, John Herbert. To Forrester Analyst Phil Murphy, Transformation is like the Little Engine That Could — focusing on the business functions that matter. (Big Data – highlighted in another keynote at this conference by Michael Cavaretta — is a paradigm gaining a lot of ground for enterprises to stay competitive in the future.)

Global organizations are enterprises of enterprises, undergoing transformation while facing the challenges of systemic architectural governance. NASCIO has valuable insight into the challenges faced by the 50 “enterprises” that the individual United States represent: challenges that balance the need for healthy co-existence of these states with the desire to retain a degree of autonomy. Therefore, I look forward to this keynote to see how EA done right can drive the transformation of the Enterprise.

By the way, remember when Enterprise Architecture was done wrong close to the venue of another Open Group conference?

How does Enterprise Architecture drive the transformation of your enterprise? Please let me know.

A version of this blog post originally appeared on the HP Journey through Enterprise IT Services Blog.

HP Distinguished Technologist and Cloud Advisor E.G. Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise-level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for The Open Group Cloud Computing Governance project.

3 Comments

Filed under Business Architecture, Cloud, Cloud/SOA, Conference, Enterprise Architecture, Enterprise Transformation, TOGAF®

As Platform 3.0 ripens, expect agile access and distribution of actionable intelligence across enterprises, says The Open Group panel

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here

This latest BriefingsDirect discussion, leading into The Open Group Conference on July 15 in Philadelphia, brings together a panel of experts to explore the business implications of the current shift to so-called Platform 3.0.

Known as the new model through which big data, cloud, and mobile and social — in combination — allow for advanced intelligence and automation in business, Platform 3.0 has so far lacked standards or even clear definitions.

The Open Group and its community are poised to change that, and we’re here now to learn more about how to leverage Platform 3.0 as more than an IT shift — and as a business game-changer. It will be a big topic at next week’s conference.

The panel: Dave Lounsbury, Chief Technical Officer at The Open Group; Chris Harding, Director of Interoperability at The Open Group; and Mark Skilton, Global Director in the Strategy Office at Capgemini. The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions.

This special BriefingsDirect thought leadership interview comes in conjunction with The Open Group Conference, which is focused on enterprise transformation in the finance, government, and healthcare sectors. Registration to the conference remains open. Follow the conference on Twitter at #ogPHL. [Disclosure: The Open Group is a sponsor of this and other BriefingsDirect podcasts.]

Here are some excerpts:

Gardner: A lot of people are still wrapping their minds around this notion of Platform 3.0, something that is a whole greater than the sum of the parts. Why is this more than an IT conversation or a shift in how things are delivered? Why are the business implications momentous?

Lounsbury: Well, Dana, there are a lot of IT changes or technical changes going on that are bringing together a lot of factors. They’re turning into this sort of super-saturated solution of ideas and possibilities and this emerging idea that this represents a new platform. I think it’s a pretty fundamental change.

If you look at history, not just the history of IT, but all of human history, you see that step changes in societies and organizations are frequently driven by communication or connectedness. Think about the evolution of speech or the invention of the alphabet or movable-type printing. These technical innovations that we’re seeing are bringing together these vast sources of data about the world around us and doing it in real time.

Further, we’re starting to see a lot of rapid evolution in how you turn data into information and present the information in a way such that people can make decisions on it. Given all that, we’re starting to realize we’re on the cusp of another step of connectedness and awareness.

Fundamental changes

This really is going to drive some fundamental changes in the way we organize ourselves. Part of what The Open Group is doing in trying to bring Platform 3.0 together is to get ahead of this and make sure that we understand not just what technical standards are needed, but also how businesses will need to adapt and evolve, and what business processes they will need to put in place, in order to take maximum advantage of this change in the way that we look at information.

Harding: Enterprises have to keep up with the way that things are moving in order to keep their positions in their industries. Enterprises can’t afford to be working with yesterday’s technology. It’s a case of being able to understand the information that they’re presented with, and make the best decisions.

We’ve always talked about computers being about input, process, and output. Years ago, the input might have been through a teletype, the processing on a computer in the back office, and the output on print-out paper.

Now, we’re talking about the input being through a range of sensors and social media, the processing is done on the cloud, and the output goes to your mobile device, so you have it wherever you are when you need it. Enterprises that stick in the past are probably going to suffer.

Gardner: Mark Skilton, the ability to manage data at greater speed and scale, the whole three Vs — velocity, volume, and value — on its own could perhaps be a game changing shift in the market. The drive of mobile devices into lives of both consumers and workers is also a very big deal.

Of course, cloud has been an ongoing evolution of emphasis towards agility and efficiency in how workloads are supported. But is there something about the combination of how these are coming together at this particular time that, in your opinion, substantiates The Open Group’s emphasis on this as a literal platform shift?

Skilton: It is exactly that in terms of the workloads. The world we’re now into is the multi-workload environment, where you have mobile workloads, storage and compute workloads, and social networking workloads. There are many different types of data and traffic today in different cloud platforms and devices.

It has to do with not just one solution, not one subscription model — because we’re now into this subscription-model era … the subscription economy, as one group tends to describe it. Now, we’re looking at not just providing the security and the infrastructure to deliver this kind of capability to a mobile device, as Chris was saying. The question is, how can you do this horizontally across other platforms? How can you integrate these things? This is something that is critical to the new order.

So Platform 3.0 is addressing this point by bringing all of this together. Just look at the numbers. Look at the scale that we’re dealing with — 1.7 billion mobile devices sold in 2012, and an estimated 6.8 billion mobile subscriptions according to the International Telecommunication Union (ITU), equivalent to 96 percent of the world population.

Massive growth

We have seen massive growth in the scale of mobile data traffic and internet data expansion. Mobile data is increasing 18-fold from 2011 to 2016, reaching 130 exabytes annually. We passed 1 zettabyte of global online data storage back in 2010, and IP data traffic is predicted to pass 1.3 zettabytes by 2016, with internet video accounting for 61 percent of total internet data, according to Cisco studies.

These studies also predict that data center traffic, combining network and internet-based storage, will reach 6.6 zettabytes annually, and that nearly two thirds of this will be cloud-based by 2016. This is only going to grow as social networking reaches nearly one in four people around the world, with 1.7 billion using at least one form of social networking in 2013, rising to one in three people, a 2.55 billion global audience, by 2017: another extraordinary figure, this one from an eMarketing.com study.

It is not surprising that many industry analysts are seeing growth in the converging technologies of mobility, social computing, big data and cloud at 30 to 40 percent, and that the shift to B2C commerce, which passed $1 trillion in 2012, is just the start of a wider digital transformation.

These numbers speak volumes in terms of the integration, interoperability, and connection of the new types of business and social realities that we have today.

Gardner: Why should IT be thinking about this as a fundamental shift, rather than a modest change?

Lounsbury: A lot depends on how you define your IT organization. It’s useful to separate the plumbing from the water. If we think of the water as the information that’s flowing, it’s how we make sure that the water is pure and getting to the places where you need to have the taps, where you need to have the water, etc.

But the plumbing also has to be up to the job. It needs to have the capacity. It needs to have new tools to filter out the impurities from the water. There’s no point giving someone data if it’s not been properly managed or if there’s incorrect information.

What’s going to happen in IT is that not only do we have to focus on the mechanics of the plumbing, where we see things like the big databases that have emerged in the open-source world and things of that nature, but there are also the analytics and the data stewardship aspects of it.

We need to bring in mechanisms so the data is valid and kept up to date. We need to indicate its freshness to the decision makers. Furthermore, IT is going to be called upon, whether as part of the enterprise IT function or where end users drive the selection, to provide the analytic tools and recommendation tools that take the data and turn it into information. One of the things you can’t do with business decision makers is overwhelm them with big rafts of data and expect them to figure it out.

You really need to present the information in a way that they can use to quickly make business decisions. That is an addition to the role of IT that may not have been there traditionally — how you think about the data and the role of what, in the beginning, was called data scientist and things of that nature.

Shift in constituency

Skilton: I’d just like to add to Dave’s excellent points about how the shape of data has changed, and also about why IT should get involved. We’re seeing that there’s a shift in the constituency of who is using this data.

We have the Chief Marketing Officer and the Chief Procurement Officer and other key line of business managers taking more direct control over the uses of information technology that enable their channels and interactions through mobile, social and data analytics. We’ve got processes that were previously managed just by IT and are now being consumed by significant stakeholders and investors in the organization.

We have to recognize in IT that we are the masters of our own destiny. The information needs to be sorted into new types of mobile devices, new types of data intelligence, and ways of delivering this kind of service.

I recently read an article in MIT Sloan Management Review that asked what the role of the CIO now is. There is still the critical role of managing the security, compliance, and performance of these systems. But there’s also a socialization of IT, and this is where positioning architectures that are cross-platform is key to delivering real value to the business users and the IT community.

Gardner: How do we prevent this from going off the rails?

Harding: This is a very important point. And to add to the difficulties, it’s not only that a whole set of different people are getting involved with different kinds of information, but there’s also a step change in the speed with which all this is delivered. It’s no longer the case that you can say, “Oh well, we need some kind of information system to manage this information. We’ll procure it and get a program written,” and a year later it would be in place, delivering reports.

Now, people are looking to make sense of this information on the fly if possible. It’s really a case of having the standard technology platform, and also the systems for using it and the business processes, understood and in place.

Then, you can do all these things quickly and build on learning from what people have done in the past, and not go out into all sorts of new experimental things that might not lead anywhere. It’s a case of building up the standard platform and the industry best practice. This is where The Open Group can really help things along, by being a recipient and a reflector of best practice and standards.

Skilton: Capgemini has been doing work in this area. I break it down into four levels of scalability. First, there is platform scalability: understanding what you can do with your current legacy systems when introducing cloud computing or big data, and the infrastructure that gives you this, what we call multiplexing of resources. We’re very much seeing this idea of introducing scalable platform resource management, and you see that a lot with the heritage of virtualization.

Going on to networking and network scalability: a lot of the customers who have inherited old telecommunications networks are looking to introduce new MPLS-type scalable networks. The reason for this is that it’s all about connectivity in the field. I meet a number of clients who are saying, “We’ve got this cloud service,” or “This service is in a certain area of my country. If I move to another part of the country, or I’m traveling, I can’t get connectivity.” That’s the big issue of scaling.

Another one is application programming interfaces (APIs). What we’re seeing now is an explosion of integration and application services using API connectivity, and these are creating huge opportunities for what Chris Anderson of Wired used to call the “long tail effect.” It is now a reality in terms of building the kind of social connectivity and data exchange that Dave was talking about.

Finally, there are the marketplaces. Companies need to think about what online marketplaces they need for digital branding, social branding, social networks, and awareness of their customers, suppliers, and employees. Customers can see that these four levels are where they need to start their thinking on IT strategy, and Platform 3.0 is right on target in trying to work out the strategies for each of these new levels of scalability.

Gardner: We’re coming up on The Open Group Conference in Philadelphia very shortly. What should we expect from that? What is The Open Group doing vis-à-vis Platform 3.0, and how can organizations benefit from seeing a more methodological or standardized approach to some way of rationalizing all of this complexity? [Registration to the conference remains open. Follow the conference on Twitter at #ogPHL.]

Lounsbury: We’re still in the formational stages of the “third platform”, or Platform 3.0, for The Open Group and for the industry. To some extent, we’re starting pretty much at the ground floor with that in the Platform 3.0 Forum. We’re leveraging a lot of the work done previously by the members of The Open Group in cloud, service-oriented architecture (SOA), and the Internet of things.

First step

Our first step is to bring those things together to make sure that we’ve got a foundation to depart from. The next thing is that, through our Platform 3.0 Forum and the Steering Committee, we can ask people to talk about what their scenarios are for adoption of Platform 3.0.

Those scenarios can range from the technological aspects and what standards are needed, to a cue we can take from our previous cloud working group: what are the best business practices for understanding and then adopting some of these Platform 3.0 concepts to get your business using them?

What we’re really working toward in Philadelphia is to set up an exchange of ideas among the people who can, from the buy side, bring in their use cases and, from the supply side, bring in their ideas about what the technology possibilities are, and then bring those together and start to shape a set of tracks where we can create business and technical artifacts that will help businesses adopt the Platform 3.0 concept.

Harding: We certainly also need to understand the business environment within which Platform 3.0 will be used. We’ve heard already about new players, new roles of various kinds that are appearing, and the fact that the technology is there and the business is adapting to this to use technology in new ways.

For example, we’ve heard about the data scientist. The data scientist is a new kind of role, a new kind of person, that is playing a particular part in all this within enterprises. We’re also hearing about marketplaces for services, new ways in which services are being made available and combined.

We really need to understand the actors in this new kind of business scenario. What are the pain points that people are having? What are the problems that need to be resolved in order to understand what kind of shape the new platform will have? That is one of the key things that the Platform 3.0 Forum members will be getting their teeth into.

Gardner: Looking to the future, we can think about how powerful the data can be when processed properly, when recommendations can be delivered to the right place at the right time, but we also recognize that there are limits to a manual, or even human-level, approach to that, scientist by scientist, analysis by analysis.

When we think about the implications of automation, it seems that there are already some early examples of how bringing cloud, data, social, mobile, and granular interactions together lets a recommendation engine be brought to bear. I’m thinking about the Siri capability at Apple and even some of the examples of the Watson technology at IBM.

So to our panel, are there unknown unknowns about where this will lead in terms of having extraordinary intelligence, a super computer or data center of super computers, brought to bear on almost any problem instantly and then the result delivered directly to a center, a smart phone, any number of end points?

It seems that the potential here is mind boggling. Mark Skilton, any thoughts?

Skilton: What we’re talking about is the next generation of the Internet. The advent of IPv6 and the explosion in multimedia services will start to drive it.

I think that in the future, we’ll be talking about a multiplicity of information that is not just about services at your location or your personal lifestyle or your working preferences. We’ll see a convergence of information and services across multiple devices and new types of “co-presence services” that interact with your needs and social networks to provide predictive augmented information value.

When you start to get much more information about the context of where you are, the insight into what’s happening, and the predictive nature of these, it becomes something that is much more embedded into everyday life, in real time, in the context of what you are doing.

I expect to see much more intelligent applications coming forward on mobile devices in the next 5 to 10 years, driven by this interconnected explosion of real-time processing of data, traffic, devices and social networking that we describe in the scope of Platform 3.0. This will add augmented intelligence and is something that’s really exciting and a complete game changer. I would call it the next killer app.

First-mover benefits

Gardner: There’s this notion of intelligence brought to bear rapidly in context, at a manageable cost. This seems to me a big change for businesses. We could, of course, go into the social implications as well, but just for businesses, that alone to me would be an incentive to get thinking and acting on this. So any thoughts about where businesses that do this well would be able to have significant advantage and first mover benefits?

Harding: Businesses always are taking stock. They understand their environments. They understand how the world that they live in is changing and they understand what part they play in it. It will be down to individual businesses to look at this new technical possibility and say, “So now this is where we could make a change to our business.” It’s the vision moment where you see a combination of technical possibility and business advantage that will work for your organization.

It’s going to be different for every business, and I’m very happy to say this, it’s something that computers aren’t going to be able to do for a very long time yet. It’s going to really be down to business people to do this as they have been doing for centuries and millennia, to understand how they can take advantage of these things.

So it’s a very exciting time, and we’ll see businesses understanding and developing their individual business visions as the starting point for a cycle of business transformation, which is what we’ll be very much talking about in Philadelphia. So yes, there will be businesses that gain advantage, but I wouldn’t point to any particular business, or any particular sector and say, “It’s going to be them” or “It’s going to be them.”

Gardner: Dave Lounsbury, a last word to you. In terms of some of the future implications and vision, where could this lead in the not-too-distant future?

Lounsbury: I’d disagree a bit with my colleagues on this, and this could probably be a podcast on its own, Dana. You mentioned Siri, and I believe IBM just announced the commercial version of its Watson recommendation and analysis engine for use in some customer-facing applications.

I definitely see these as the thin end of the wedge on filling that gap between the growth of data and the analysis of data. I can imagine, not in the next couple of years but in the next couple of technology cycles, that we’ll see the concept of recommendations and analysis as a service, to bring it full circle to cloud. And keep in mind that all of case law is data and all of the medical textbooks ever written are data. Pick your industry, and there is a huge knowledge base that humans must currently keep on top of.

This approach and these advances in recommendation engines, driven by the availability of big data, are going to produce profound changes in the way knowledge workers do their jobs. That’s something that businesses, including their IT functions, absolutely need to stay in front of to remain competitive in the next decade or so.

Comments Off

Filed under ArchiMate®, Business Architecture, Cloud, Cloud/SOA, Conference, Data management, Enterprise Architecture, Platform 3.0, Professional Development, TOGAF®

Why is Cloud Adoption Taking so Long?

By Chris Harding, The Open Group

At the end of last year, Gartner predicted that cloud computing would become an integral part of IT in 2013 (http://www.gartner.com/DisplayDocument?doc_cd=230929). This looks a pretty safe bet. The real question is, why is it taking so long?

Cloud Computing

Cloud computing is a simple concept. IT resources are made available, within an environment that enables them to be used, via a communications network, as a service. It is used within enterprises to enable IT departments to meet users’ needs more effectively, and by external providers to deliver better IT services to their enterprise customers.

There are established vendors of products to fit both of these scenarios. The potential business benefits are well documented. There are examples of real businesses gaining those benefits, such as Netflix as a public cloud user (see http://www.zdnet.com/the-biggest-cloud-app-of-all-netflix-7000014298/ ), and Unilever and Lufthansa as implementers of private cloud (see http://www.computerweekly.com/news/2240114043/Unilever-and-Lufthansa-Systems-deploy-Azure-Private-cloud ).

Slow Pace of Adoption

Yet we are still talking of cloud computing becoming an integral part of IT. In the 2012 Open Group Cloud ROI survey, less than half of the respondents’ organizations were using cloud computing, although most of the rest were investigating its use. (See http://www.opengroup.org/sites/default/files/contentimages/Documents/cloud_roi_formal_report_12_19_12-1.pdf ). Clearly, cloud computing is not being used for enterprise IT as a matter of routine.

Cloud computing is now at least seven years old. Amazon’s “Elastic Compute Cloud” was launched in August 2006, and there are services that we now regard as cloud computing, though they may not have been called that, dating from before then. Other IT revolutions – personal computers, for example – have reached the point of being an integral part of IT in half the time. Why has it taken Cloud so long?

The Reasons

One reason is that using Cloud requires a high level of trust. You can lock your PC in your office, but you cannot physically secure your cloud resources. You must trust the cloud service provider. Such trust takes time to earn.

Another reason is that, although it is a simple concept, cloud computing is described in a rather complex way. The widely-accepted NIST definition (see http://csrc.nist.gov/publications/nistpubs/800-145/SP800-145.pdf ) has three service models and four deployment models, giving a total of twelve distinct delivery combinations. Each combination has different business drivers, and the three service models are based on very different technical capabilities. Real products, of course, often do not exactly correspond to the definition, and their vendors describe them in product-specific terms. This complexity often leads to misunderstanding and confusion.

A third reason is that you cannot “mix and match” cloud services from different providers. The market is consolidating, with a few key players emerging as dominant at the infrastructure and platform levels. Each of them has its own proprietary interfaces. There are no real vendor-neutral standards. A recent Information Week article on Netflix (http://www.informationweek.co.uk/cloud-computing/platform/how-netflix-is-ruining-cloud-computing/240151650 ) describes some of the consequences. Customers are beginning to talk of “vendor lock-in” in a way that we haven’t seen since the days of mainframes.

The Portability and Interoperability Guide

The Open Group Cloud Computing Portability and Interoperability Guide addresses this last problem, by providing recommendations to customers on how best to achieve portability and interoperability when working with current cloud products and services. It also makes recommendations to suppliers and standards bodies on how standards and best practice should evolve to enable greater portability and interoperability in the future.

The Guide tackles the complexity of its subject by defining a simple Distributed Computing Reference Model. This model shows how cloud services fit into the mix of products and services used by enterprises in distributed computing solutions today. It identifies the major components of cloud-enabled solutions, and describes their portability and interoperability interfaces.
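
To make the lock-in concern concrete, here is a minimal sketch, purely illustrative and not taken from the Guide, of the kind of provider-neutral interface that such a reference model encourages: the application codes against a small abstraction, and provider-specific SDK calls are confined to adapter classes that can be swapped out. All class and method names below are hypothetical, and an in-memory map stands in for a real provider SDK so the example runs as-is.

```java
import java.util.HashMap;
import java.util.Map;

// Provider-neutral interface the application depends on; only adapters know about specific clouds.
interface ObjectStore {
    void put(String bucket, String key, byte[] data);
    byte[] get(String bucket, String key);
}

// Adapter for one hypothetical provider. In a real system this class would wrap that provider's
// proprietary SDK; here an in-memory map stands in for it so the sketch is self-contained.
class ProviderAObjectStore implements ObjectStore {
    private final Map<String, byte[]> objects = new HashMap<>();
    public void put(String bucket, String key, byte[] data) { objects.put(bucket + "/" + key, data); }
    public byte[] get(String bucket, String key) { return objects.get(bucket + "/" + key); }
}

// Application code: switching cloud providers means writing a new adapter, not rewriting this class.
public class InvoiceArchive {
    private final ObjectStore store;
    public InvoiceArchive(ObjectStore store) { this.store = store; }

    public void archive(String invoiceId, byte[] pdf) { store.put("invoices", invoiceId, pdf); }
    public byte[] retrieve(String invoiceId) { return store.get("invoices", invoiceId); }

    public static void main(String[] args) {
        InvoiceArchive archive = new InvoiceArchive(new ProviderAObjectStore());
        archive.archive("2013-001", new byte[] { 1, 2, 3 });
        System.out.println("retrieved " + archive.retrieve("2013-001").length + " bytes");
    }
}
```

How much behaviour a neutral interface of this kind can realistically capture is exactly the territory that the reference model and the Guide’s recommendations map out.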

Platform 3.0

Cloud is not the only new game in town. Enterprises are looking at mobile computing, social computing, big data, sensors, and controls as new technologies that can transform their businesses. Some of these – mobile and social computing, for example – have caught on faster than Cloud.

Portability and interoperability are major concerns for these technologies too. There is a need for a standard platform to enable enterprises to use all of the new technologies, individually and in combination, and “mix and match” different products. This is the vision of the Platform 3.0 Forum, recently formed by The Open Group. The distributed computing reference model is an important input to this work.

The State of the Cloud

It is now at least becoming routine to consider cloud computing when architecting a new IT solution. The chances of it being selected, however, appear to be less than fifty-fifty, in spite of its benefits. The reasons include those mentioned above: lack of trust, complexity, and potential lock-in.

The Guide removes some of the confusion caused by the complexity, and helps enterprises assess their exposure to lock-in, and take what measures they can to prevent it.

The growth of cloud computing is starting to be constrained by lack of standards to enable an open market with free competition. The Guide contains recommendations to help the industry and standards bodies produce the standards that are needed.

Let’s all hope that the standards do appear soon. Cloud is, quite simply, a good idea. It is an important technology paradigm that has the potential to transform businesses, to make commerce and industry more productive, and to benefit society as a whole, just as personal computing did. Its adoption really should not be taking this long.

The Open Group Cloud Computing Portability and Interoperability Guide is available from The Open Group bookstore at https://www2.opengroup.org/ogsys/catalog/G135

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF® practitioner.

3 Comments

Filed under Platform 3.0

Flexibility, Agility and Open Standards

By Jose M. Sanchez Knaack, IBM

Flexibility and agility are terms used almost interchangeably these days as attributes of IT architectures designed to cope with rapidly changing business requirements. Did you ever wonder if they are actually the same? Don’t you have the feeling that these terms remain abstract and without a concrete link to the design of an IT architecture?

This post seeks to provide clear definitions for both flexibility and agility, and to explain how both relate to the design of IT architectures that exploit open standards. A ‘real-life’ example will help make these concepts concrete and relevant to the Enterprise Architect’s daily job.

First, here is some context on why flexibility and agility are increasingly important for businesses. Today, the average smart phone has more computing power than the original Apollo mission to the moon. We live in times of exponential change; the next technological revolution always seems to be around the corner, and it is safe to say that the trend will continue, as nicely visualized in this infographic by TIME Magazine.

The average lifetime of a company in the S&P 500 has fallen by 80 percent since 1937. In other words, companies need to adapt fast to capitalize on the business opportunities created by new technologies, or risk losing their leadership position.

Thus, flexibility and agility have become ever present business goals that need to be supported by the underlying IT architecture. But, what is the precise meaning of these two terms? The online Merriam-Webster dictionary offers the following definitions:

Flexible: characterized by a ready capability to adapt to new, different, or changing requirements.

Agile: marked by ready ability to move with quick easy grace.

To understand how these terms relate to IT architecture, let us explore an example based on an Enterprise Service Bus (ESB) scenario.

An ESB can be seen as the foundation for a flexible IT architecture allowing companies to integrate applications (processes) written in different programming languages and running on different platforms within and outside the corporate firewall.

ESB products are normally equipped with a set of pre-built adapters that allow integrating 70-80 percent of applications ‘out-of-the-box’, without additional programming efforts. For the remaining 20-30 percent of integration requirements, it is possible to develop custom adapters so that any application can be integrated with any other if required.

In other words, an ESB covers requirements regarding integration flexibility, that is, it can cope with changing requirements in terms of integrating additional applications via adapters, ‘out-of-the-box’ or custom built. How does this integration flexibility correlate to integration agility?

Let’s think of a scenario where the IT team has been asked to integrate an old manufacturing application with a new business partner. The integration needs to be ready within one month; otherwise the targeted business opportunity will be missed.

The picture below shows the underlying IT architecture for this integration scenario.

[Diagram: the IT architecture for this integration scenario]

Although the ESB is able to integrate the old manufacturing application, it requires a custom-developed adapter, since the application does not support any of the communication protocols covered by the pre-built adapters. Custom developing, testing and deploying an adapter in a corporate environment is likely to take longer than a month, and the business opportunity will be lost because the IT architecture was not agile enough.

This is the subtle difference between flexible and agile.

Notice that if the manufacturing application had been able to communicate via open standards, the corresponding pre-built adapter would have significantly shortened the time required to integrate it. Applications that do not support open standards still exist in corporate IT landscapes, as the above scenario illustrates. Hence the importance of incorporating open standards when road-mapping your IT architecture.
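
As a rough illustration of why the open-standards case is so much cheaper, here is a minimal sketch of the adapter idea, using entirely hypothetical class, method and protocol names rather than the API of any particular ESB product: an application that speaks a standard protocol is handled by an adapter that ships with the bus, while a proprietary protocol forces a custom adapter to be designed, built and tested, which is the work that costs the month in the scenario above.

```java
// Hypothetical sketch of the ESB adapter pattern; names are illustrative only.

interface EsbAdapter {
    boolean supports(String protocol);        // e.g. "SOAP" (standard) or "LEGACY-MFG" (proprietary)
    String toCanonical(String rawMessage);    // normalize an inbound message for the bus
}

// Pre-built adapter shipped with the ESB: a standards-based application plugs in with no new code.
class SoapAdapter implements EsbAdapter {
    public boolean supports(String protocol) { return "SOAP".equals(protocol); }
    public String toCanonical(String rawMessage) { return "<order>" + rawMessage + "</order>"; }
}

// Custom adapter for the proprietary protocol: it has to be designed, built, tested and deployed.
class LegacyManufacturingAdapter implements EsbAdapter {
    public boolean supports(String protocol) { return "LEGACY-MFG".equals(protocol); }
    public String toCanonical(String rawMessage) {
        return "<order>" + rawMessage.replace('|', ',') + "</order>";  // stand-in for real protocol handling
    }
}

public class AdapterDemo {
    public static void main(String[] args) {
        EsbAdapter[] adapters = { new SoapAdapter(), new LegacyManufacturingAdapter() };
        String protocol = "LEGACY-MFG";       // the old manufacturing application's only protocol
        for (EsbAdapter adapter : adapters) {
            if (adapter.supports(protocol)) {
                System.out.println(adapter.toCanonical("42|WIDGET|2013-10-01"));
            }
        }
    }
}
```

The point is where the cost lands: with an open-standard protocol the equivalent of SoapAdapter already exists, whereas LegacyManufacturingAdapter, together with its testing and deployment, has to be created under deadline pressure.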

The key takeaway is that your architecture principles need to favor information technology built on open standards, and for that, you can leverage The Open Group Architecture Principle 20 on Interoperability.

Name: Interoperability
Statement: Software and hardware should conform to defined standards that promote interoperability for data, applications, and technology.

In summary, the accelerating pace of change requires corporate IT architectures to support the business goals of flexibility and agility. Establishing architecture principles that favor open standards, as part of your architecture governance framework, is one proven approach (although not the only one) to road-mapping your IT architecture in the pursuit of resiliency.

Jose M. Sanchez Knaack is a Senior Manager with IBM Global Business Services in Switzerland. His professional background covers business-aligned IT architecture strategy and complex system integration in global technology-enabled transformation initiatives.

Comments Off

Filed under Enterprise Architecture

Why Business Needs Platform 3.0

By Chris Harding, The Open Group

The Internet gives businesses access to ever-larger markets, but it also brings more competition. To prosper, they must deliver outstanding products and services. Often, this means processing the ever-greater, and increasingly complex, data that the Internet makes available. The question they now face is, how to do this without spending all their time and effort on information technology.

Web Business Success

The success stories of giants such as Amazon are well-publicized, but there are other, less well-known companies that have profited from the Web in all sorts of ways. Here’s an example. In 2000 an English illustrator called Jacquie Lawson tried creating greetings cards on the Internet. People liked what she did, and she started an e-business whose website is now ranked by Alexa as number 2712 in the world, and #1879 in the USA. This is based on website traffic and is comparable, to take a company that may be better known, with toyota.com, which ranks slightly higher in the USA (#1314) but somewhat lower globally (#4838).

A company with a good product can grow fast. This also means, though, that a company with a better product, or even just better marketing, can eclipse it just as quickly. Social networking site Myspace was once the most visited site in the US. Now it is ranked by Alexa as #196, way behind Facebook, which is #2.

So who ranks as #1? You guessed it – Google. Which brings us to the ability to process large amounts of data, where Google excels.

The Data Explosion

The World-Wide Web probably contains over 13 billion pages, yet you can often find the information that you want in seconds. This is made possible by technology that indexes this vast amount of data – measured in petabytes (millions of gigabytes) – and responds to users’ queries.

The data on the world-wide-web originally came mostly from people, typing it in by hand. In future, we will often use data that is generated by sensors in inanimate objects. Automobiles, for example, can generate data that can be used to optimize their performance or assess the need for maintenance or repair.

The world population is measured in billions. It is estimated that the Internet of Things, in which data is collected from objects, could enable us to track 100 trillion objects in real time – ten thousand times as many things as there are people, tirelessly pumping out information. The amount of available data of potential value to businesses is set to explode yet again.

A New Business Generation

It’s not just the amount of data to be processed that is changing. We are also seeing changes in the way data is used, the way it is processed, and the way it is accessed. Following The Open Group conference in January, I wrote about the convergence of social, Cloud, and mobile computing with Big Data. These are the new technical trends that are taking us into the next generation of business applications.

We don’t yet know what all those applications will be – who in the 1990’s would have predicted greetings cards as a Web application – but there are some exciting ideas. They range from using social media to produce market forecasts to alerting hospital doctors via tablets and cellphones when monitors detect patient emergencies. All this, and more, is possible with technology that we have now, if we can use it.

The Problem

But there is a problem. Although there is technology that enables businesses to use social, Cloud, and mobile computing, and to analyze and process massive amounts of data of different kinds, it is not necessarily easy to use. A plethora of products is emerging, with different interfaces, and with no ability to work with each other.  This is fine for geeks who love to play with new toys, but not so good for someone who wants to realize a new business idea and make money.

The new generation of business applications cannot be built on a mish-mash of unstable products, each requiring a different kind of specialist expertise. It needs a solid platform, generally understood by enterprise architects and software engineers, who can translate the business ideas into technical solutions.

The New Platform

Former VMware CEO and current Pivotal Initiative leader Paul Maritz describes the situation very well in his recent blog on GigaOM. He characterizes as “consumer grade” the new breed of enterprises that give customers what they want, when they want it and where they want it, by exploiting the opportunities provided by new technologies. Paul says that, “Addressing these opportunities will require new underpinnings; a new platform, if you like. At the core of this platform, which needs to be Cloud-independent to prevent lock-in, will be new approaches to handling big and fast (real-time) data.”

The Open Group has announced its new Platform 3.0 Forum to help the industry define a standard platform to meet this need. As The Open Group CTO Dave Lounsbury says in his blog, the new Forum will advance The Open Group vision of Boundaryless Information Flow™ by helping enterprises to take advantage of these convergent technologies. This will be accomplished by identifying a set of new platform capabilities, and architecting and standardizing an IT platform by which enterprises can reap the business benefits of Platform 3.0.

Business Focus

A business set up to design greetings cards should not spend its time designing communications networks and server farms. It cannot afford to: while it is distracted by such things, someone else will focus on its core business and take its market.

The Web provided a platform that businesses of its generation could build on to do what they do best without being overly distracted by the technology. Platform 3.0 will do this for the new generation of businesses.

Help It Happen!

To find out more about the Platform 3.0 Forum, and to take part in its formation, watch out for the Platform 3.0 web meetings that will be announced by e-mail, on Twitter, and on our home page.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing, and the Platform 3.0 Forum. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.

2 Comments

Filed under Platform 3.0

Gaining Greater Cohesion: Bringing Business Analysis and Business Architecture into Focus

By Craig Martin, Enterprise Architects

Having delivered many talks on Business Architecture over the years, I’m often struck by the common vision driving many members in the audience – a vision of building cohesion in a business, achieving the right balance between competing forces and bringing the business strategy and operations into harmony.  However, as with many ambitious visions, the challenge in this case is immense.  As I will explain, many of the people who envision this future state of nirvana are, in practice, inadvertently preventing it from happening.

Standards Silos
There is a host of standards and disciplines that enterprises bring into play to improve business performance and capabilities. For example, standards such as PRINCE2, BABOK, BIZBOK, TOGAF, COBIT, ITIL and PMBOK are designed to ensure reliability of team output and approach across various business activities. In many instances, however, these standards, operating together, present significant gaps and overlaps. One wonders whose job it is to integrate and unify them. Whose job is it to understand the business requirements, business processes, drivers, capabilities and so on?

Apples to Apples?
As these standards evolve they often introduce new jargon to support their view of the world. Have you ever had to ask your business to explain what they do on a single page? The diversity of the views and models can be quite astonishing:

  • The target operating model
  • The business model
  • The process model
  • The capability model
  • The value chain model
  • The functional model
  • The business services model
  • The component business model
  • The business reference model
  • The business anchor model

The list goes on and on…

Each has a purpose and brings value in isolation. However, in the common scenario where they are developed using differing tools, methods, frameworks and techniques, the result is usually greater fragmentation, not more cohesion – and consequently we can end up with some very confused and exasperated business stakeholders, who care less about which standard we use and more about finding the clarity to just get the job done.

The Convergence of Business Architecture and Business Analysis
Ask a room filled with business analysts and business architects how their jobs differ and relate, and I guarantee that you would receive a multitude of alternative and sometimes conflicting perspectives.

Both of these disciplines try to develop standardised methods and frameworks for the description of the building blocks of an organization. They also seek to standardise the means by which to string them together to create better outcomes.

In other words, they are the disciplines that seek to create balance between two important business goals:

  • To produce consistent, predictable outcomes
  • To produce outcomes that meet desired objectives

In his book, “The Design of Business: Why Design Thinking is the Next Competitive Advantage,” Roger Martin describes the relationships and trade-offs between analytical thinking and intuitive thinking in business. He refers to the “knowledge funnel,” which charts the movement of business focus from solving business mysteries using heuristics to creating algorithms that increase reliability, reducing business complexity and costs and improving business performance.

The disciplines of Business Architecture and business analysis are both currently seeking to address this challenge. Martin refers to this as “design thinking.”

[Figure: Thinking types]

Vision Vs. Reality For Business Analysts and Business Architects

When we examine the competency models for business analysis and Business Architecture, the desire is to position these two disciplines right across the spectrum from reliability to validity.

The reality is that both the business architect and the business analyst spend a large portion of their time in the reliability space, and I believe I’ve found the reason why.

Both the BABOK and the BIZBOK provide a body of knowledge focused predominantly around the reliability space. In other words, they look at how we define the building blocks of an organization, and less so at how we invent better building blocks within the organization.

Integrating the Disciplines

While we still have some way to go on integration, the Business Architecture and business analysis disciplines are already bringing great value to business through greater reliability and repeatability.

However, there is a significant opportunity to enable the intuitive thinkers to look at the bigger picture and identify opportunities to innovate their business models, their go-to-market, their product and service offerings and their operations.

Perhaps we might consider introducing a new function to bridge and unify the disciplines?

This newly created function might integrate a number of incumbent roles and functions and cover:

  • A holistic structural view covering the business model and the high-level relationships and interactions between all business systems
  • A market model view in which the focus is on understanding the market dynamics, segments and customer need
  • A products and services model view focusing on customer experience, value proposition, product and service mix and customer value
  • An operating model view – this is the current focus area of the business architect and business analyst. You need these building blocks defined in a reliable, repeatable and manageable structure. This enables agility within the organization and will support the assembly and mixing of building blocks to improve customer experience and value

At the end of the day, what matters most is not business analysis or Business Architecture themselves, but how the business will bridge the reliability and validity spectrum to reliably produce desired business outcomes.

I will discuss this topic in more detail at The Open Group Conference in Sydney, April 15-18, which will be the first Open Group event to be held in Australia.

Craig Martin is the Chief Operating Officer and Chief Architect at Enterprise Architects, which is a specialist Enterprise Architecture firm operating in the U.S., UK, Asia and Australia. He is presenting the Business Architecture plenary at the upcoming Open Group conference in Sydney.

1 Comment

Filed under Business Architecture

Beyond Big Data

By Chris Harding, The Open Group

The big bang that started The Open Group Conference in Newport Beach was, appropriately, a presentation related to astronomy. Chris Gerty gave a keynote on Big Data at NASA, where he is Deputy Program Manager of the Open Innovation Program. He told us how visualizing deep space and its celestial bodies created understanding and enabled new discoveries. Everyone who attended felt inspired to explore the universe of Big Data during the rest of the conference. And that exploration – as is often the case with successful space missions – left us wondering what lies beyond.

The Big Data Conference Plenary

The second presentation on that Monday morning brought us down from the stars to the nuts and bolts of engineering. Mechanical devices require regular maintenance to keep functioning. Processing the mass of data generated during their operation can improve safety and cut costs. For example, airlines can overhaul aircraft engines when the data shows it is needed, rather than on a fixed schedule that has to be frequent enough to prevent damage under most conditions, yet might still fail to anticipate failure in unusual circumstances. David Potter and Ron Schuldt lead two of The Open Group initiatives, Quantum Lifecycle Management (QLM) and the Universal Data Element Framework (UDEF). They explained how a semantic approach to product lifecycle management can facilitate the big-data processing needed to achieve this aim.
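To make the maintenance example concrete, here is a minimal sketch of condition-based maintenance in Python; the sensor readings, threshold and window are invented for illustration, and the sketch does not represent the QLM or UDEF specifications themselves.

```python
# Condition-based maintenance sketch: flag an engine for overhaul when the
# recent trend in a sensor reading crosses a threshold, instead of
# overhauling on a fixed schedule. All values are invented for illustration.

def needs_overhaul(vibration_readings, threshold=7.0, window=5):
    """Flag the engine if the average of the last `window` readings
    exceeds the threshold."""
    recent = vibration_readings[-window:]
    return sum(recent) / len(recent) > threshold

engine_log = [5.1, 5.3, 5.0, 6.8, 7.2, 7.9, 8.4]
if needs_overhaul(engine_log):
    print("Schedule overhaul: vibration trend is above threshold")
```

In practice the hard part is not the rule itself but collecting, integrating and interpreting operational data from many sources, which is where a semantic approach such as QLM with the UDEF comes in.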

Chris Gerty was then joined by Andras Szakal, vice-president and chief technology officer at IBM US Federal IMT, Robert Weisman, chief executive officer of Build The Vision, and Jim Hietala, vice-president of Security at The Open Group, in a panel session on Big Data that was moderated by Dana Gardner of Interarbor Solutions. As always, Dana facilitated a fascinating discussion. Key points made by the panelists included: the trend to monetize data; the need to ensure veracity and usefulness; the need for security and privacy; the expectation that data warehouse technology will exist and evolve in parallel with map/reduce “on-the-fly” analysis; the importance of meaningful presentation of the data; integration with cloud and mobile technology; and the new ways in which Big Data can be used to deliver business value.

More on Big Data

In the afternoons of Monday and Tuesday, and on most of Wednesday, the conference split into streams. These have presentations that are more technical than the plenary, going deeper into their subjects. It’s a pity that you can’t be in all the streams at once. (At one point I couldn’t be in any of them, as there was an important side meeting to discuss the UDEF, which is one of the areas that I support as forum director.) Fortunately, there were a few great stream presentations that I did manage to get to.

On the Monday afternoon, Tom Plunkett and Janet Mostow of Oracle presented a reference architecture that combined Hadoop and NoSQL with traditional RDBMS, streaming, and complex event processing, to enable Big Data analysis. One application that they described was to trace the relations between particular genes and cancer. This could have big benefits in disease prediction and treatment. Another was to predict the movements of protesters at a demonstration through analysis of communications on social media. The police could then concentrate their forces in the right place at the right time.
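At its heart, the gene-and-cancer example is a very large counting and correlation problem. As a toy sketch of the map/reduce style of processing involved (with invented records, and no claim to represent the actual Oracle reference architecture), counting gene and diagnosis co-occurrences might look like this:

```python
# Toy map/reduce-style counting of gene/diagnosis co-occurrences.
# The records are invented; a real pipeline would run the map and reduce
# steps in parallel across a cluster (for example with Hadoop).
from collections import Counter
from itertools import chain

records = [
    {"genes": ["BRCA1", "TP53"], "diagnosis": "breast cancer"},
    {"genes": ["TP53"], "diagnosis": "lung cancer"},
    {"genes": ["BRCA1"], "diagnosis": "breast cancer"},
]

def map_record(record):
    # Emit one ((gene, diagnosis), 1) pair for each gene in the record.
    return [((gene, record["diagnosis"]), 1) for gene in record["genes"]]

def reduce_pairs(pairs):
    # Sum the counts for each (gene, diagnosis) key.
    counts = Counter()
    for key, n in pairs:
        counts[key] += n
    return counts

mapped = chain.from_iterable(map_record(r) for r in records)
print(reduce_pairs(mapped).most_common(1))
# [(('BRCA1', 'breast cancer'), 2)]
```

The point of the reference architecture is that this kind of batch counting sits alongside NoSQL stores, traditional RDBMS data, streaming, and complex event processing, rather than replacing them.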

Jason Bloomberg, president of Zapthink – now part of Dovel – is always thought-provoking. His presentation featured the need for governance vitality to cope with ever-changing tools for handling Big Data of ever-increasing size, “crowdsourcing” to channel the efforts of many people into solving a problem, and business transformation that is continuous rather than a one-time step from “as is” to “to be.”

Later in the week, I moderated a discussion on Architecting for Big Data in the Cloud. We had a well-balanced panel made up of TJ Virdi of Boeing, Mark Skilton of Capgemini and Tom Plunkett of Oracle. They made some excellent points. Big Data analysis provides business value by enabling better understanding, leading to better decisions. The analysis is often an iterative process, with new questions emerging as answers are found. There is no single application that does this analysis and provides the visualization needed for understanding, but there are a number of products that can be used to assist. The role of the data scientist in formulating the questions and configuring the visualization is critical. Reference models for the technology are emerging but there are as yet no commonly-accepted standards.

The New Enterprise Platform

Jogging is a great way of taking exercise at conferences, and I was able to go for a run most mornings before the meetings started at Newport Beach. Pacific Coast Highway isn’t the most interesting of tracks, but on Tuesday morning I was soon up in Castaways Park, pleasantly jogging through the carefully-nurtured natural coastal vegetation, with views over the ocean and its margin of high-priced homes, slipways, and yachts. I reflected as I ran that we had heard some interesting things about Big Data, but it is now an established topic. There must be something new coming over the horizon.

The answer to what this might be was suggested in the first presentation of that day’s plenary. Mary Ann Mezzapelle, security strategist for HP Enterprise Services, talked about the need to get security right for Big Data and the Cloud. But her scope was actually wider. She spoke of the need to secure the “third platform” – the term coined by IDC to describe the convergence of social, cloud and mobile computing with Big Data.

Securing Big Data

Mary Ann’s keynote was not about the third platform itself, but about what should be done to protect it. The new platform brings with it a new set of security threats, and the increasing scale of operation makes it increasingly important to get the security right. Mary Ann presented a thoughtful analysis founded on a risk-based approach.

She was followed by Adrian Lane, chief technology officer at Securosis, who pointed out that Big Data processing using NoSQL has a different architecture from traditional relational data processing, and requires different security solutions. This does not necessarily mean new techniques; existing techniques can be used in new ways. For example, Kerberos may be used to secure inter-node communications in map/reduce processing. Adrian’s presentation completed the Tuesday plenary sessions.
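Adrian’s Kerberos point is about reusing a proven technique. The inter-node configuration itself is a cluster administration exercise, but the same mechanism is visible from a client. As a small sketch of the idea (the host name and path are hypothetical, and a valid Kerberos ticket from kinit plus the requests and requests-kerberos packages are assumed), a Kerberos-authenticated call to a secured Hadoop WebHDFS endpoint might look like this:

```python
# Sketch: calling a Kerberos-secured Hadoop WebHDFS endpoint from a client.
# Assumes a valid Kerberos ticket (e.g. obtained with `kinit`) and the
# requests and requests-kerberos packages; host name and path are hypothetical.
import requests
from requests_kerberos import HTTPKerberosAuth, OPTIONAL

auth = HTTPKerberosAuth(mutual_authentication=OPTIONAL)
response = requests.get(
    "http://namenode.example.com:9870/webhdfs/v1/data?op=LISTSTATUS",
    auth=auth,
)
response.raise_for_status()
print(response.json())
```

The security architecture is different from the relational world, but the authentication building block is the same.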

Service Oriented Architecture

The streams continued after the plenary. I went to the Distributed Services Architecture stream, which focused on SOA.

Bill Poole, enterprise architect at JourneyOne in Australia, described how to use the graphical architecture modeling language ArchiMate® to model service-oriented architectures. He illustrated this using a case study of a global mining organization that wanted to consolidate its two existing bespoke inventory management applications into a single commercial off-the-shelf application. It’s amazing how a real-world case study can make a topic come to life, and the audience certainly responded warmly to Bill’s excellent presentation.

Ali Arsanjani, chief technology officer for Business Performance and Service Optimization, and Heather Kreger, chief technology officer for International Standards, both at IBM, described the range of SOA standards published by The Open Group and available for use by enterprise architects. Ali was one of the brains that developed the SOA Reference Architecture, and Heather is a key player in international standards activities for SOA, where she has helped The Open Group’s Service Integration Maturity Model and SOA Governance Framework to become international standards, and is working on an international standard SOA reference architecture.

Cloud Computing

To start Wednesday’s Cloud Computing streams, TJ Virdi, senior enterprise architect at The Boeing Company, discussed use of TOGAF® to develop an Enterprise Architecture for a Cloud ecosystem. A large enterprise such as Boeing may use many Cloud service providers, enabling collaboration between corporate departments, partners, and regulators in a complex ecosystem. Architecting for this is a major challenge, and The Open Group’s TOGAF for Cloud Ecosystems project is working to provide guidance.

Stuart Boardman of KPN gave a different perspective on Cloud ecosystems, with a case study from the energy industry. An ecosystem may not necessarily be governed by a single entity, and the participants may not always be aware of each other. Energy generation and consumption in the Netherlands is part of a complex international ecosystem involving producers, consumers, transporters, and traders of many kinds. A participant may be involved in several ecosystems in several ways: a farmer, for example, might consume energy, have wind turbines to produce it, and also participate in food production and transport ecosystems.

Penelope Gordon of 1-Plug Corporation explained how the choice and use of business metrics can impact Cloud service providers. She worked through four examples: a start-up Software-as-a-Service provider requiring investment, an established company thinking of providing its products as cloud services, an IT department planning to offer an in-house private Cloud platform, and a government agency seeking budget for a government Cloud.

Mark Skilton, director at Capgemini in the UK, gave a presentation titled “Digital Transformation and the Role of Cloud Computing.” He covered a very broad canvas of business transformation driven by technological change, and illustrated his theme with a case study from the pharmaceutical industry. New technology enables new business models, giving competitive advantage. Increasingly, the introduction of this technology is driven by the business, rather than the IT side of the enterprise, and it poses major challenges for both sides. But what new technologies are in question? Mark’s presentation had Cloud in the title, but also featured social and mobile computing, and Big Data.

The New Trend

On Thursday morning I took a longer run, to and round Balboa Island. With only one road in or out, its main street of shops and restaurants is not a through route and the island has the feel of a real village. The SOA Work Group Steering Committee had found an excellent, and reasonably priced, Italian restaurant there the previous evening. There is a clear resurgence of interest in SOA, partly driven by the use of service orientation – the principle, rather than particular protocols – in Cloud Computing and other new technologies. That morning I took the track round the shoreline, and was reminded a little of Dylan Thomas’s “fishing boat bobbing sea.” Fishing here is for leisure rather than livelihood, but I suspected that the fishermen, like those of Thomas’s little Welsh village, spend more time in the bar than on the water.

I thought about how the conference sessions had indicated an emerging trend. This is not a new technology but the combination of four current technologies to create a new platform for enterprise IT: Social, Cloud, and Mobile computing, and Big Data. Mary Ann Mezzapelle’s presentation had referenced IDC’s “third platform.” Other discussions had mentioned Gartner’s “Nexus of Forces,” the combination of Social, Cloud and Mobile computing with information that Gartner says is transforming the way people and businesses relate to technology, and will become a key differentiator of business and technology management. Mark Skilton had included these same four technologies in his presentation. Great minds, and analyst corporations, think alike!

I thought also about the examples and case studies in the stream presentations. Areas as diverse as healthcare, manufacturing, energy and policing are using the new technologies. Clearly, they can deliver major business benefits. The challenge for enterprise architects is to maximize those benefits through pragmatic architectures.

Emerging Standards

On the way back to the hotel, I remarked again on what I had noticed before, how beautifully neat and carefully maintained the front gardens bordering the sidewalk are. I almost felt that I was running through a public botanical garden. Is there some ordinance requiring people to keep their gardens tidy, with severe penalties for anyone who leaves a lawn or hedge unclipped? Is a miserable defaulter fitted with a ball and chain, not to be removed until the untidy vegetation has been properly trimmed, with nail clippers? Apparently not. People here keep their gardens tidy because they want to. The best standards are like that: universally followed, without use or threat of sanction.

Standards are an issue for the new enterprise platform. Apart from the underlying standards of the Internet, there really aren’t any. The area isn’t even mapped out. Vendors of Social, Cloud, Mobile, and Big Data products and services are trying to stake out as much valuable real estate as they can. They have no interest yet in boundaries with neatly-clipped hedges.

This is a stage that every new technology goes through. Then, as it matures, the vendors understand that their products and services have much more value when they conform to standards, just as properties have more value in an area where everything is neat and well-maintained.

It may be too soon to define those standards for the new enterprise platform, but it is certainly time to start mapping out the area, to understand its subdivisions and how they inter-relate, and to prepare the way for standards. Following the conference, The Open Group has announced a new Forum, provisionally titled Open Platform 3.0, to do just that.

The SOA and Cloud Work Groups

Thursday was my final day of meetings at the conference. The plenary and streams presentations were done. This day was for working meetings of the SOA and Cloud Work Groups. I also had an informal discussion with Ron Schuldt about a new approach for the UDEF, following up on the earlier UDEF side meeting. The conference hallways, as well as the meeting rooms, often see productive business done.

The SOA Work Group discussed a certification program for SOA professionals, and an update to the SOA Reference Architecture. The Open Group is working with ISO and the IEEE to define a standard SOA reference architecture that will have consensus across all three bodies.

The Cloud Work Group had met earlier to further the TOGAF for Cloud ecosystems project. Now it worked on its forthcoming white paper on business performance metrics. It also – though this was not on the original agenda – discussed Gartner’s Nexus of Forces, and the future role of the Work Group in mapping out the new enterprise platform.

Mapping the New Enterprise Platform

At the start of the conference we looked at how to map the stars. Big Data analytics enables people to visualize the universe in new ways, reach new understandings of what is in it and how it works, and point to new areas for future exploration.

As the conference progressed, we found that Big Data is part of a convergence of forces. Social, mobile, and Cloud Computing are being combined with Big Data to form a new enterprise platform. The development of this platform, and its roll-out to support innovative applications that deliver more business value, is what lies beyond Big Data.

At the end of the conference we were thinking about mapping the new enterprise platform. This will not require sophisticated data processing and analysis. It will take discussions to create a common understanding, and detailed committee work to draft the guidelines and standards. This work will be done by The Open Group’s new Open Platform 3.0 Forum.

The next Open Group conference is in the week of April 15, in Sydney, Australia. I’m told that there’s some great jogging there. More importantly, we’ll be reflecting on progress in mapping Open Platform 3.0, and thinking about what lies ahead. I’m looking forward to it already.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.

2 Comments

Filed under Conference