Tag Archives: enterprise architecture

Thinking About Big Data

By Dave Lounsbury, The Open Group

“We cannot solve our problems with the same level of thinking that created them.”

- Albert Einstein

The growing consumerization of technology and convergence of technologies such as the “Internet of Things”, social networks and mobile devices are causing big changes for enterprises and the marketplace. They are also generating massive amounts of data related to behavior, environment, location, buying patterns and more.

Having massive amounts of data readily available is invaluable. More data means greater insight, which leads to more informed decision-making. So far, we have kept ahead of this data through smarter analytics and better ways of handling it. The question is, how long can we keep up? The rate of data production is increasing; as an example, an IDC report[1] predicts that the production of data will increase 50X in the coming decade. To magnify the problem, there’s an accompanying explosion of data about the data – cataloging information, metadata and the results of analytics are all data in themselves. At the same time, data scientists and engineers who can deal with such data are already a scarce commodity, and the number of such people is expected to grow only 1.5X in the same period.

It isn’t hard to draw the curve. Turning data into actionable insight is going to be a challenge – data flow is accelerating at a faster rate than the available humans can absorb, and our databases and data analytic systems can only help so much.

Markets never leave gaps like this unfilled, and because of this we should expect to see a fundamental shift in the IT tools we use to deal with the growing tide of data. In order to solve the challenges of managing data with the volume, variety and velocities we expect, we will need to teach machines to do more of the analysis for us and help to make the best use of scarce human talents.

The Study of Machine Learning

Machine Learning, sometimes called “cognitive computing”[2] or “intelligent computing”, is the study of building computers that can learn and perform tasks based on experience. Experience in this context includes examining vast data sets, using multiple “senses” or types of media, recognizing patterns from past history or precedent, and extrapolating this information to reason about the problem at hand. An example of machine learning currently underway in the healthcare sector is medical decision aids that learn to predict therapies or to help with patient management, based on correlating a vast body of medical and drug experience data with information about the patients under treatment.
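The core idea – predicting an outcome for a new case from accumulated past experience – can be illustrated with a toy sketch. The nearest-neighbour approach and the invented “therapy” data below are purely illustrative assumptions, not a description of any real medical system:

```python
# A minimal sketch of "learning from experience": a toy nearest-neighbour
# classifier that predicts an outcome for a new case by finding the most
# similar case in past data. Data and labels are invented for illustration.
import math

# Hypothetical past cases: (feature vector, known outcome)
history = [
    ((0.9, 0.1), "therapy A"),
    ((0.8, 0.2), "therapy A"),
    ((0.2, 0.9), "therapy B"),
    ((0.1, 0.8), "therapy B"),
]

def predict(case):
    """Return the outcome of the most similar past case (1-nearest neighbour)."""
    nearest = min(history, key=lambda rec: math.dist(rec[0], case))
    return nearest[1]

print(predict((0.85, 0.15)))  # resembles the past "therapy A" cases
```

Real systems like the ones described above work on vastly larger data sets and far richer models, but the principle – generalizing from precedent rather than following hand-written rules – is the same.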

A well-known example of this is Watson, a machine learning system IBM unveiled a few years ago. While Watson is best known for winning Jeopardy, that was just the beginning. IBM has since built six Watsons to assist with their primary objective: to help health care professionals find answers to complex medical questions and help with patient management[3]. Watson’s sophistication is a direct response to this explosion of data. Watson of course isn’t the only example in this field, with others ranging from Apple’s Siri intelligent voice-operated assistant to DARPA’s SyNAPSE program[4].

Evolution of the Technological Landscape

As the consumerization of technology continues to grow and converge, our ways of constructing business models and systems need to evolve as well. We need to let data drive the business process, and incorporate intelligent machines like Watson into our infrastructure to help us turn data into actionable results.

There is an opportunity for information technology and companies to help drive this forward. However, in order for us to properly teach computers how to learn, we first need to understand the environments in which they will be asked to learn – Cloud, Big Data, etc. Ultimately, though, any full consideration of these problems will require a look at how machine learning can help us make decisions – machine learning systems may be the real platform in these areas.

The Open Group is already laying the foundation to help organizations take advantage of these convergent technologies with its new forum, Platform 3.0. The forum brings together a community of industry thought leaders to analyze the use of Cloud, Social, Mobile computing and Big Data, and describe the business benefits that enterprises can gain from them. We’ll also be looking at trends like these at our Philadelphia conference this summer.  Please join us in the discussion.


2 Comments

Filed under Cloud, Cloud/SOA, Data management, Enterprise Architecture

Join us for The Open Group Conference in Sydney – April 15-18

By The Open Group Conference Team

The Open Group is busy gearing up for the Sydney conference, which will take place on April 15-18, 2013. With over 2,000 Association of Enterprise Architects (AEA) members in Australia, Sydney is an ideal setting for industry experts from around the world to gather and discuss the evolution of Enterprise Architecture and its role in transforming the enterprise. Be sure to register today!

The conference offers roughly 60 sessions on a variety of topics including:

  • Cloud infrastructure as an enabler of innovation in enterprises
  • Simplifying data integration in the government and defense sectors
  • Merger transformation with TOGAF® framework and ArchiMate® modeling language
  • Measuring and managing cybersecurity risks
  • Pragmatic IT road-mapping with ArchiMate modeling language
  • The value of Enterprise Architecture certification within a professional development framework

Plenary speakers will include:

  • Allen Brown, President & CEO, The Open Group
  • Peter Haviland, Chief Business Architect, with Martin Keywood, Partner, Ernst & Young
  • David David, EA Manager, Rio Tinto
  • Roger Venning, Chief IT Architect, NBN Co. Ltd
  • Craig Martin, COO & Chief Architect, Enterprise Architects
  • Chris Forde, VP Enterprise Architecture, The Open Group

The full conference agenda is available here. Tracks include:

  • Finance & Commerce
  • Government & Defense
  • Energy & Natural Resources

And topics of discussion include, but are not limited to:

  • Cloud
  • Business Transformation
  • Enterprise Architecture
  • Technology & Innovation
  • Data Integration/Information Sharing
  • Governance & Security
  • Architecture Reference Models
  • Strategic Planning
  • Distributed Services Architecture

Upcoming Conference Submission Deadlines

Would you like a chance to speak at an Open Group conference? There are upcoming deadlines for speaker proposal submissions for conferences in Philadelphia and London. To submit a proposal to speak, click here.

Venue | Industry Focus | Submission Deadline
Philadelphia (July 15-17) | Healthcare, Finance, Government & Defense | April 5, 2013
London (October 21-23) | Finance, Government, Healthcare | July 8, 2013


The agendas for Philadelphia and London are filling up fast, so it is important for proposals to be submitted as early as possible. Proposals received after the deadline dates will still be considered, space permitting; if not, they may be carried over to a future conference. Priority will be given to proposals received by the deadline dates and to proposals that include an end-user organization, at least as a co-presenter.

Comments Off

Filed under Conference

#ogChat Summary – Business Architecture

By Patty Donovan, The Open Group

The Open Group hosted a tweet jam (#ogChat) to discuss the evolution of Business Architecture and its role in enterprise transformation. In case you missed the conversation, here is a recap of the event.

The Participants

A total of 16 participants joined in the hour-long discussion, including:

The Discussion

Here is a high-level snapshot of yesterday’s #ogChat discussion:

Q1 How do you define #BizArch? #ogChat

While not everyone could agree on a single definition, all agreed that Business Architecture enables operational ease and business model innovation.

  • @Dana_Gardner: Q1 Aligning the strategies and operational priorities of all a business’s groups along a common, coorindated path. #ogChat #BizArch #EA
  • @enterprisearchs: Q1 At @enterprisearchs we also believe #BizArch is the design of business to enable business model innovation #ogChat
  • @bmichelson: #ogchat q1: in reality, business architecture is more the meta model of business, used to understand, measure, deliver capability #BizArch
  • @MartinGladwell: Q1 Orchestrating the delivery of changes needed to realise the strategy #ogchat


Q2 What is the role of the business architect? What real world #business problems does #BizArch solve? #ogChat

Most agreed that the lines are blurred between the roles of the Business Architect and the Enterprise Architect. Both manage complexity, agility and data proactively within a business or enterprise.

  • @bmichelson: #ogchat q2: so, I differ here. I think *true* business architect designs the business; in reality, we assign “architect” to business analyst
  • @Dana_Gardner: Q2 #BizArch allows for managing complexity, fostering agility, makes a data-driven enterprise more able to act in proactive manner #ogChat
  • @editingwhiz: So much software now is aimed at line-of-business people that acquiring IT business architect creds would be a huge attribute. #ogChat
  • @MartinGladwell: Q2 Is an MBA an advantage for a BA? Is it necessary? #ogchat
  • @enterprisearchs: A2 Ensures an org is correctly positioned and the environmental/industry factors are understood in order to achieve its strategy #ogChat
  • @DaveHornford: Q2: all my answers chase their tails into architecture – what must I have to get what I want – what must change  #ogchat #bizarch


Q3 How is the role of the Business Architect changing? What are the drivers of this change? #ogChat #BizArch

Some argued that the role of the Business Architect is not changing at all, but rather just emerging (or evolving?), and that Business Architects are differentiating themselves from other organizational roles. Others argued that the role is changing to accommodate emerging trends and areas of focus (e.g., customer experience).

  • @enterprisearchs: A3 Businesses are looking to differentiate, an increased focus on Customer Experience is raising questions on how to increase NPS #ogChat
  • @blake6677: #ogchat At the core of my Business Architecture practice is business capability modeling
  • @DaveHornford: Q3 – changing? Is just starting to appear – distinction between architect, strategist, analyst, change leader often hard to see  #ogchat


Q4 How does #BizArch differ from #EntArch? #ogChat

Similar to the discussion around question two, most participants agreed that the roles of Business and Enterprise Architects are difficult to separate, while some argued about the differences in scope of the two roles.

  • @NadhanAtHP: A4: @theopengroup Biz Architecture provides the business foundation for the Enterprise Architecture which is more holistic #ogChat
  • @DaveHornford: Q4: difference is in scope #BizArch is one of many domains comprising #EntArch #ogchat
  • @harryhendrickx: Q3 #BizArch evolves towards operational position serving many initiatives. Not sure how practice evolves #ogChat
  • Len Fehskens: Q4 “There is a lot of confusion about the meanings of #business and #enterprise, and many people use them synonymously” #Len #ogChat
  • @MartinGladwell @theopengroup Len I think there is no truth of the matter, we must choose to use these terms in a way that advances our common cause #ogchat
  • @enterprisearchs: A4 In TOGAF ADM we see #BizArch predominantly supporting the prelim and arch vision phases #ogchat


Q5 How can Business Architects and Enterprise Architects work together? #ogChat #BizArch #EntArch

All agreed that Business Architects and Enterprise Architects exist to support one another. When discussing the first step to establishing successful Business Architecture, participants suggested knowing its purpose first, then tapping professional accreditation and community involvement resources second.

  • @DaveHornford: Ethnography within the enterprise, it’s ecosystem or both? #ogchat
  • @Dana_Gardner: Q5 They make each other stronger, and can provide an example to the rest on how these methods and tools can work harmoniously. #ogChat
  • @bmichelson: “@theopengroup: What is the first step toward establishing a successful #BizArch? #ogChat” < knowing why you want to establish practice
  • @MartinGladwell: @theopengroup #ogchat professional accreditation, community, role models


Q6 What’s in store for #BizArch in the future? #ogChat

When looking towards the future, panelists suggested erasing ambiguity when it comes to the difference between Business and Enterprise Architects. Others also predicted that the rising demand for Business Architects will spark a need for certification and training programs.

  • Len Fehskens: Q6 I fear conventional wisdom contradictions and ambiguities will be ‘resolved’ by setting arbitrary distinctions in concrete #Len #ogChat
  • @Dana_Gardner: Q6 I hope to see more stature given to the role of #BizArch, so that it becomes an executive-tier requirement. #ogChat
  • @bmichelson: #ogchat q6: learning how to enable continuous change via: visibility, context, correctness & responsiveness #BizArch
  • @MartinGladwell: Q6 #ogchat We will see information as a design activity not an analysis activity
  • @enterprisearchs: A6 The demand for #BizArch will  generate a need for recognised certification and training #ogChat
  • @allenbrownopen: Business architecture like other functions such as legal and finance can inform C level decisions, it can’t make them #ogchat


A big thank you to all the participants who made this such a great discussion!  Join us for our next tweet jam on Platform 3.0!


Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

Comments Off

Filed under Business Architecture, Tweet Jam

Gaining Greater Cohesion: Bringing Business Analysis and Business Architecture into Focus

By Craig Martin, Enterprise Architects

Having delivered many talks on Business Architecture over the years, I’m often struck by the common vision driving many members in the audience – a vision of building cohesion in a business, achieving the right balance between competing forces and bringing the business strategy and operations into harmony.  However, as with many ambitious visions, the challenge in this case is immense.  As I will explain, many of the people who envision this future state of nirvana are, in practice, inadvertently preventing it from happening.

Standards Silos
There are a host of standards and disciplines that are brought into play by enterprises to improve business performance and capabilities. For example standards such as PRINCE2, BABOK, BIZBOK, TOGAF, COBIT, ITIL and PMBOK are designed to ensure reliability of team output and approach across various business activities. However, in many instances these standards, operating together, present important gaps and overlaps. One wonders whose job it is to integrate and unify these standards. Whose job is it to understand the business requirements, business processes, drivers, capabilities and so on?

Apples to Apples?
As these standards evolve they often introduce new jargon to support their view of the world. Have you ever had to ask your business to explain what they do on a single page? The diversity of the views and models can be quite astonishing:

  • The target operating model
  • The business model
  • The process model
  • The capability model
  • The value chain model
  • The functional model
  • The business services model
  • The component business model
  • The business reference model
  • The business anchor model

The list goes on and on…

Each has a purpose and brings value in isolation. However, in the common scenario where they are developed using differing tools, methods, frameworks and techniques, the result is usually greater fragmentation, not more cohesion – and consequently we can end up with some very confused and exasperated business stakeholders who care less about what standard we use and more about finding the clarity to just get the job done.

The Convergence of Business Architecture and Business Analysis
Ask a room filled with business analysts and business architects how their jobs differ and relate, and I guarantee that you would receive a multitude of alternative and sometimes conflicting perspectives.

Both of these disciplines try to develop standardised methods and frameworks for the description of the building blocks of an organization. They also seek to standardise the means by which to string them together to create better outcomes.

In other words, they are the disciplines that seek to create balance between two important business goals:

  • To produce consistent, predictable outcomes
  • To produce outcomes that meet desired objectives

In his book, “The Design of Business: Why Design Thinking is the Next Competitive Advantage,” Roger Martin describes the relationships and trade-offs between analytical thinking and intuitive thinking in business. He refers to the “knowledge funnel,” which charts the movement of business focus from solving business mysteries using heuristics to creating algorithms that increase reliability, reducing business complexity and costs and improving business performance.

The disciplines of Business Architecture and business analysis are both currently seeking to address this challenge. Martin refers to this as “design thinking.”


Vision Vs. Reality For Business Analysts and Business Architects

When examining the competency models for business analysis and Business Architecture, the desire is to position these two disciplines right across the spectrum of reliability and validity.

The reality is that both the business architect and the business analyst spend a large portion of their time in the reliability space, and I believe I’ve found the reason why.

Both the BABOK and the BIZBOK provide a body of knowledge focused predominantly around the reliability space. In other words, they look at how we define the building blocks of an organization, and less so at how we invent better building blocks within the organization.

Integrating the Disciplines

While we still have some way to go to integrate, the Business Architecture and business analysis disciplines are currently bringing great value to business through greater reliability and repeatability.

However, there is a significant opportunity to enable the intuitive thinkers to look at the bigger picture and identify opportunities to innovate their business models, their go-to-market, their product and service offerings and their operations.

Perhaps we might consider introducing a new function to bridge and unify the disciplines?

This newly created function might integrate a number of incumbent roles and functions and cover:

  • A holistic structural view covering the business model and the high-level relationships and interactions between all business systems
  • A market model view in which the focus is on understanding the market dynamics, segments and customer need
  • A products and services model view focusing on customer experience, value proposition, product and service mix and customer value
  • An operating model view – this is the current focus area of the business architect and business analyst. You need these building blocks defined in a reliable, repeatable and manageable structure. This enables agility within the organization and will support the assembly and mixing of building blocks to improve customer experience and value

At the end of the day, what matters most is not business analysis or Business Architecture themselves, but how the business will bridge the reliability and validity spectrum to reliably produce desired business outcomes.

I will discuss this topic in more detail at The Open Group Conference in Sydney, April 15-18, which will be the first Open Group event to be held in Australia.

Craig Martin is the Chief Operating Officer and Chief Architect at Enterprise Architects, a specialist Enterprise Architecture firm operating in the U.S., UK, Asia and Australia. He is presenting the Business Architecture plenary at the upcoming Open Group conference in Sydney.

1 Comment

Filed under Business Architecture

Questions for the Upcoming Business Architecture Tweet Jam – March 19

By Patty Donovan, The Open Group

Earlier this week, we announced our upcoming tweet jam on Tuesday, March 19 at 2:00 p.m. PT/9:00 p.m. GMT/Wednesday, March 20 at 8:00 a.m. AEDT (Sydney, Australia), which will examine the way in which Business Architecture is impacting enterprises and businesses of all sizes.

The discussion will be moderated by The Open Group (@theopengroup), and we welcome both members of The Open Group and interested participants alike to join the session.

The discussion will be guided by these six questions:

  1. How do you define Business Architecture?
  2. What is the role of the business architect? What real world business problems does Business Architecture solve?
  3. How is the role of the business architect changing? What are the drivers of this change?
  4. How does Business Architecture differ from Enterprise Architecture?
  5. How can business architects and enterprise architects work together?
  6. What’s in store for Business Architecture in the future?

To join the discussion, please follow the #ogChat hashtag during the allotted discussion time. Other hashtags we recommend you use during the event include:

  • Enterprise Architecture: #EntArch
  • Business Architecture: #BizArch
  • The Open Group Architecture Forum: #ogArch

For more information about the tweet jam, guidelines and general background information, please visit our previous blog post.

If you have any questions prior to the event or would like to join as a participant, please direct them to Rod McLeod (rmcleod at bateman-group dot com), or leave a comment below. We anticipate a lively chat and hope you will be able to join us!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.

2 Comments

Filed under Business Architecture, Tweet Jam

Beyond Big Data

By Chris Harding, The Open Group

The big bang that started The Open Group Conference in Newport Beach was, appropriately, a presentation related to astronomy. Chris Gerty gave a keynote on Big Data at NASA, where he is Deputy Program Manager of the Open Innovation Program. He told us how visualizing deep space and its celestial bodies created understanding and enabled new discoveries. Everyone who attended felt inspired to explore the universe of Big Data during the rest of the conference. And that exploration – as is often the case with successful space missions – left us wondering what lies beyond.

The Big Data Conference Plenary

The second presentation on that Monday morning brought us down from the stars to the nuts and bolts of engineering. Mechanical devices require regular maintenance to keep functioning. Processing the mass of data generated during their operation can improve safety and cut costs. For example, airlines can overhaul aircraft engines when it needs doing, rather than on a fixed schedule that has to be frequent enough to prevent damage under most conditions, but might still fail to anticipate failure in unusual circumstances. David Potter and Ron Schuldt lead two of The Open Group’s initiatives, Quantum Lifecycle Management (QLM) and the Universal Data Element Framework (UDEF). They explained how a semantic approach to product lifecycle management can facilitate the big-data processing needed to achieve this aim.

Chris Gerty was then joined by Andras Szakal, vice-president and chief technology officer at IBM US Federal IMT, Robert Weisman, chief executive officer of Build The Vision, and Jim Hietala, vice-president of Security at The Open Group, in a panel session on Big Data that was moderated by Dana Gardner of Interarbor Solutions. As always, Dana facilitated a fascinating discussion. Key points made by the panelists included: the trend to monetize data; the need to ensure veracity and usefulness; the need for security and privacy; the expectation that data warehouse technology will exist and evolve in parallel with map/reduce “on-the-fly” analysis; the importance of meaningful presentation of the data; integration with cloud and mobile technology; and the new ways in which Big Data can be used to deliver business value.

More on Big Data

In the afternoons of Monday and Tuesday, and on most of Wednesday, the conference split into streams. These feature presentations that are more technical than the plenary, going deeper into their subjects. It’s a pity that you can’t be in all the streams at once. (At one point I couldn’t be in any of them, as there was an important side meeting to discuss the UDEF, which is one of the areas that I support as forum director.) Fortunately, there were a few great stream presentations that I did manage to get to.

On the Monday afternoon, Tom Plunkett and Janet Mostow of Oracle presented a reference architecture that combined Hadoop and NoSQL with traditional RDBMS, streaming, and complex event processing, to enable Big Data analysis. One application that they described was to trace the relations between particular genes and cancer. This could have big benefits in disease prediction and treatment. Another was to predict the movements of protesters at a demonstration through analysis of communications on social media. The police could then concentrate their forces in the right place at the right time.
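The map/reduce style of analysis that recurs throughout these sessions can be sketched in plain Python. The three phases below mirror what frameworks such as Hadoop do at massive scale; the gene-count records are invented for illustration:

```python
# A minimal sketch of the map/reduce pattern: map each record to a
# (key, value) pair, shuffle (group) by key, then reduce each group.
from itertools import groupby
from operator import itemgetter

records = ["gene:BRCA1", "gene:TP53", "gene:BRCA1", "gene:BRCA1"]

# Map phase: emit (key, 1) for each record
mapped = [(r, 1) for r in records]

# Shuffle phase: group the pairs by key (requires sorting first)
mapped.sort(key=itemgetter(0))
grouped = groupby(mapped, key=itemgetter(0))

# Reduce phase: sum the counts within each group
counts = {key: sum(v for _, v in group) for key, group in grouped}
print(counts)  # {'gene:BRCA1': 3, 'gene:TP53': 1}
```

In a real deployment the map and reduce steps run in parallel across many nodes, which is what makes the pattern suitable for data at the volumes discussed here.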

Jason Bloomberg, president of Zapthink – now part of Dovel – is always thought-provoking. His presentation featured the need for governance vitality to cope with ever-changing tools to handle Big Data of ever-increasing size, “crowdsourcing” to channel the efforts of many people into solving a problem, and business transformation that is continuous rather than a one-time step from “as is” to “to be.”

Later in the week, I moderated a discussion on Architecting for Big Data in the Cloud. We had a well-balanced panel made up of TJ Virdi of Boeing, Mark Skilton of Capgemini and Tom Plunkett of Oracle. They made some excellent points. Big Data analysis provides business value by enabling better understanding, leading to better decisions. The analysis is often an iterative process, with new questions emerging as answers are found. There is no single application that does this analysis and provides the visualization needed for understanding, but there are a number of products that can be used to assist. The role of the data scientist in formulating the questions and configuring the visualization is critical. Reference models for the technology are emerging but there are as yet no commonly-accepted standards.

The New Enterprise Platform

Jogging is a great way of taking exercise at conferences, and I was able to go for a run most mornings before the meetings started at Newport Beach. Pacific Coast Highway isn’t the most interesting of tracks, but on Tuesday morning I was soon up in Castaways Park, pleasantly jogging through the carefully-nurtured natural coastal vegetation, with views over the ocean and its margin of high-priced homes, slipways, and yachts. I reflected as I ran that we had heard some interesting things about Big Data, but it is now an established topic. There must be something new coming over the horizon.

The answer to what this might be was suggested in the first presentation of that day’s plenary. Mary Ann Mezzapelle, security strategist for HP Enterprise Services, talked about the need to get security right for Big Data and the Cloud. But her scope was actually wider. She spoke of the need to secure the “third platform” – the term coined by IDC to describe the convergence of social, cloud and mobile computing with Big Data.

Securing Big Data

Mary Ann’s keynote was not about the third platform itself, but about what should be done to protect it. The new platform brings with it a new set of security threats, and the increasing scale of operation makes it increasingly important to get the security right. Mary Ann presented a thoughtful analysis founded on a risk-based approach.

She was followed by Adrian Lane, chief technology officer at Securosis, who pointed out that Big Data processing using NoSQL has a different architecture from traditional relational data processing, and requires different security solutions. This does not necessarily mean new techniques; existing techniques can be used in new ways. For example, Kerberos may be used to secure inter-node communications in map/reduce processing. Adrian’s presentation completed the Tuesday plenary sessions.

Service Oriented Architecture

The streams continued after the plenary. I went to the Distributed Services Architecture stream, which focused on SOA.

Bill Poole, enterprise architect at JourneyOne in Australia, described how to use the graphical architecture modeling language ArchiMate® to model service-oriented architectures. He illustrated this using a case study of a global mining organization that wanted to consolidate its two existing bespoke inventory management applications into a single commercial off-the-shelf application. It’s amazing how a real-world case study can make a topic come to life, and the audience certainly responded warmly to Bill’s excellent presentation.

Ali Arsanjani, chief technology officer for Business Performance and Service Optimization, and Heather Kreger, chief technology officer for International Standards, both at IBM, described the range of SOA standards published by The Open Group and available for use by enterprise architects. Ali was one of the brains that developed the SOA Reference Architecture, and Heather is a key player in international standards activities for SOA, where she has helped The Open Group’s Service Integration Maturity Model and SOA Governance Framework to become international standards, and is working on an international standard SOA reference architecture.

Cloud Computing

To start Wednesday’s Cloud Computing streams, TJ Virdi, senior enterprise architect at The Boeing Company, discussed use of TOGAF® to develop an Enterprise Architecture for a Cloud ecosystem. A large enterprise such as Boeing may use many Cloud service providers, enabling collaboration between corporate departments, partners, and regulators in a complex ecosystem. Architecting for this is a major challenge, and The Open Group’s TOGAF for Cloud Ecosystems project is working to provide guidance.

Stuart Boardman of KPN gave a different perspective on Cloud ecosystems, with a case study from the energy industry. An ecosystem may not necessarily be governed by a single entity, and the participants may not always be aware of each other. Energy generation and consumption in the Netherlands is part of a complex international ecosystem involving producers, consumers, transporters, and traders of many kinds. A participant may be involved in several ecosystems in several ways: a farmer for example, might consume energy, have wind turbines to produce it, and also participate in food production and transport ecosystems.

Penelope Gordon of 1-Plug Corporation explained how choice and use of business metrics can impact Cloud service providers. She worked through four examples: a start-up Software-as-a-Service provider requiring investment, an established company thinking of providing its products as cloud services, an IT department planning to offer an in-house private Cloud platform, and a government agency seeking budget for government Cloud.

Mark Skilton, director at Capgemini in the UK, gave a presentation titled “Digital Transformation and the Role of Cloud Computing.” He covered a very broad canvas of business transformation driven by technological change, and illustrated his theme with a case study from the pharmaceutical industry. New technology enables new business models, giving competitive advantage. Increasingly, the introduction of this technology is driven by the business, rather than the IT side of the enterprise, and it has major challenges for both sides. But what new technologies are in question? Mark’s presentation had Cloud in the title, but also featured social and mobile computing, and Big Data.

The New Trend

On Thursday morning I took a longer run, to and round Balboa Island. With only one road in or out, its main street of shops and restaurants is not a through route and the island has the feel of a real village. The SOA Work Group Steering Committee had found an excellent, and reasonably priced, Italian restaurant there the previous evening. There is a clear resurgence of interest in SOA, partly driven by the use of service orientation – the principle, rather than particular protocols – in Cloud Computing and other new technologies. That morning I took the track round the shoreline, and was reminded a little of Dylan Thomas’s “fishingboat-bobbing sea.” Fishing here is for leisure rather than livelihood, but I suspected that the fishermen, like those of Thomas’s little Welsh village, spend more time in the bar than on the water.

I thought about how the conference sessions had indicated an emerging trend. This is not a new technology but the combination of four current technologies to create a new platform for enterprise IT: Social, Cloud, and Mobile computing, and Big Data. Mary Ann Mezzapelle’s presentation had referenced IDC’s “third platform.” Other discussions had mentioned Gartner’s “Nexus of Forces,” the combination of Social, Cloud and Mobile computing with information that Gartner says is transforming the way people and businesses relate to technology, and will become a key differentiator of business and technology management. Mark Skilton had included these same four technologies in his presentation. Great minds, and analyst corporations, think alike!

I thought also about the examples and case studies in the stream presentations. Areas as diverse as healthcare, manufacturing, energy and policing are using the new technologies. Clearly, they can deliver major business benefits. The challenge for enterprise architects is to maximize those benefits through pragmatic architectures.

Emerging Standards

On the way back to the hotel, I remarked again on what I had noticed before, how beautifully neat and carefully maintained the front gardens bordering the sidewalk are. I almost felt that I was running through a public botanical garden. Is there some ordinance requiring people to keep their gardens tidy, with severe penalties for anyone who leaves a lawn or hedge unclipped? Is a miserable defaulter fitted with a ball and chain, not to be removed until the untidy vegetation has been properly trimmed, with nail clippers? Apparently not. People here keep their gardens tidy because they want to. The best standards are like that: universally followed, without use or threat of sanction.

Standards are an issue for the new enterprise platform. Apart from the underlying standards of the Internet, there really aren’t any. The area isn’t even mapped out. Vendors of Social, Cloud, Mobile, and Big Data products and services are trying to stake out as much valuable real estate as they can. They have no interest yet in boundaries with neatly-clipped hedges.

This is a stage that every new technology goes through. Then, as it matures, the vendors understand that their products and services have much more value when they conform to standards, just as properties have more value in an area where everything is neat and well-maintained.

It may be too soon to define those standards for the new enterprise platform, but it is certainly time to start mapping out the area, to understand its subdivisions and how they inter-relate, and to prepare the way for standards. Following the conference, The Open Group has announced a new Forum, provisionally titled Open Platform 3.0, to do just that.

The SOA and Cloud Work Groups

Thursday was my final day of meetings at the conference. The plenary and streams presentations were done. This day was for working meetings of the SOA and Cloud Work Groups. I also had an informal discussion with Ron Schuldt about a new approach for the UDEF, following up on the earlier UDEF side meeting. The conference hallways, as well as the meeting rooms, often see productive business done.

The SOA Work Group discussed a certification program for SOA professionals, and an update to the SOA Reference Architecture. The Open Group is working with ISO and the IEEE to define a standard SOA reference architecture that will have consensus across all three bodies.

The Cloud Work Group had met earlier to further the TOGAF for Cloud ecosystems project. Now it worked on its forthcoming white paper on business performance metrics. It also – though this was not on the original agenda – discussed Gartner’s Nexus of Forces, and the future role of the Work Group in mapping out the new enterprise platform.

Mapping the New Enterprise Platform

At the start of the conference we looked at how to map the stars. Big Data analytics enables people to visualize the universe in new ways, reach new understandings of what is in it and how it works, and point to new areas for future exploration.

As the conference progressed, we found that Big Data is part of a convergence of forces. Social, mobile, and Cloud Computing are being combined with Big Data to form a new enterprise platform. The development of this platform, and its roll-out to support innovative applications that deliver more business value, is what lies beyond Big Data.

At the end of the conference we were thinking about mapping the new enterprise platform. This will not require sophisticated data processing and analysis. It will take discussions to create a common understanding, and detailed committee work to draft the guidelines and standards. This work will be done by The Open Group’s new Open Platform 3.0 Forum.

The next Open Group conference is in the week of April 15, in Sydney, Australia. I’m told that there’s some great jogging there. More importantly, we’ll be reflecting on progress in mapping Open Platform 3.0, and thinking about what lies ahead. I’m looking forward to it already.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.


Filed under Conference

Complexity from Big Data and Cloud Trends Makes Architecture Tools like ArchiMate and TOGAF More Powerful, Says Expert Panel

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here: Complexity from Big Data and Cloud Trends Makes Architecture Tools like ArchiMate and TOGAF More Powerful, Says Expert Panel, or read the transcript here.

We recently assembled a panel of Enterprise Architecture (EA) experts to explain how such simultaneous and complex trends as big data, Cloud Computing, security, and overall IT transformation can be helped by the combined strengths of The Open Group Architecture Framework (TOGAF®) and the ArchiMate® modeling language.

The panel consisted of Chris Forde, General Manager for Asia-Pacific and Vice President of Enterprise Architecture at The Open Group; Iver Band, Vice Chair of The Open Group ArchiMate Forum and Enterprise Architect at The Standard, a diversified financial services company; Mike Walker, Senior Enterprise Architecture Adviser and Strategist at HP and former Director of Enterprise Architecture at Dell; Henry Franken, Chairman of The Open Group ArchiMate Forum and Managing Director at BIZZdesign; and Dave Hornford, Chairman of the Architecture Forum at The Open Group and Managing Partner at Conexiam. I served as the moderator.

This special BriefingsDirect thought leadership interview series comes to you in conjunction with The Open Group Conference recently held in Newport Beach, California. The conference focused on “Big Data – the transformation we need to embrace today.” [Disclosure: The Open Group and HP are sponsors of BriefingsDirect podcasts.]

Here are some excerpts:

Gardner: Is there something about the role of the enterprise architect that is shifting?

Walker: There is less of a focus on the traditional things we have come to associate with EA, such as standards, governance, and policies, and more on emerging areas such as soft skills, Business Architecture, and strategy.

To this end, I spend a lot of time working directly with the executive chain to understand the key value drivers for the company and to rationalize where they want to go with the business. So we’re moving into a business-transformation role in this practice.

At the same time, we’ve got to be mindful of the disruptive external technology forces coming in. EA can’t divorce itself from the other aspects of architecture either. So the role that enterprise architects play becomes more and more important and elevated in the organization.

Two examples of this disruptive technology being focused on at the conference are Big Data and Cloud Computing. Both are affecting our businesses, not because of some new business idea, but because technology is available to enhance our businesses or provide them with new capabilities. EAs still have to understand these new technology innovations and determine how they will apply to the business.

We need really good enterprise architects, and they are difficult to find. There is a shortage right now, especially given how much focus is being put on the EA department to deliver sound architectures.

Not standalone

Gardner: We’ve been talking a lot here about Big Data, but usually that’s not just a standalone topic. It’s Big Data and Cloud, mobile, and security.

So with these overlapping and complex relationships among multiple trends, why are EA and tools like the TOGAF framework and the ArchiMate modeling language especially useful?

Band: One of the things that has been clear for a while now is that people outside of IT don’t necessarily have to go through the technology function to avail themselves of these technologies any more. Whether they ever had to is really a question as well.

One of the things that EA is doing, especially in the practice that I work in, is using approaches like the ArchiMate modeling language to effect clear communication between the business, IT, partners, and other stakeholders. That’s what I do in my daily work, overseeing our major systems-modernization efforts. I work with major partners, some of which are offshore.

I’m increasingly called upon to make sure that we have clear processes for making decisions and clear ways of visualizing the different choices in front of us. We can’t always unilaterally dictate the choice, but we can make the conversation clearer by using frameworks like the TOGAF standard and the ArchiMate modeling language, which I use virtually every day in my work.

Hornford: The fundamental benefit of these tools is the organization realizing its capability and strategy. I just came from a session where a fellow quoted a Harvard study, which said that around a third of executives thought their company was good at executing on its strategy. He highlighted that this means that two-thirds are not good at executing on their strategy.

If you’re not good at executing on your strategy and you’ve got Big Data, mobile, consumerization of IT and Cloud, where are you going? What’s the correct approach? How does this fit into what you were trying to accomplish as an enterprise?

An enterprise architect who is doing the job brings together the strategy, goals, and objectives of the organization, and its capabilities, with the techniques that are available, whether offshoring, onshoring, Cloud, or Big Data, so that the organization is able to move forward to where it needs to be, as opposed to where it would randomly walk to.

Forde: One of the things that has come out in several of the presentations is capability-based planning, an EA technique for getting your arms around these trends from a business-driver perspective. Just to polish what Dave said a little bit, it’s connecting all of those things. We see enterprises taking a capability-based view of things on that basis.

Gardner: Let’s get a quick update. The TOGAF framework, where are we and what have been the highlights from this particular event?

Minor upgrade

Hornford: In the last year, we’ve published a minor upgrade, TOGAF version 9.1, based on cleaning up inconsistencies in the language of the TOGAF documentation. What we’re working on right now is a significant new release, the next version of the TOGAF standard, which is restructuring the TOGAF documentation to make it more consumable, more consistent, and more useful.

Today, the TOGAF standard mixes guidance on how to do something into the framework of what you should be doing. We’re peeling those apart. With that done, we won’t have guidance that is tied to classic application architecture in a world of Cloud.

What we have found in working with the Banking Industry Architecture Network (BIAN) on banking architecture, with Sherwood Applied Business Security Architecture (SABSA) on security architecture, and with the TeleManagement Forum is that the concepts in the TOGAF framework work across industries and across trends. We need to move the guidance into a place where we can be far nimbler about how to tie Cloud to my current strategy, or how to tie the consumerization of IT to onshoring.

Franken: The ArchiMate modeling language turned two last year. The ArchiMate 1.0 standard is the language for modeling the core of your EA; the ArchiMate 2.0 standard added two extensions that align it better with the process of EA as described in the TOGAF standard.

The first extension is being able to model motivation: why you’re doing EA, the stakeholders, and the goals that drive them. The second is being able to model planning and migration.

So with the core EA language and these two extensions, together with the TOGAF standard’s process, you have a good basis for getting EA to work in your organization.

Gardner: Mike, fill us in on some of your thoughts about the role of information architecture vis-à-vis the larger business architect and enterprise architect roles.

Walker: Information architecture is an interesting topic in that it hasn’t been getting a whole lot of attention until recently.

Information architecture is the aspect of Enterprise Architecture that enables an information strategy or business solution through the definition of the company’s business information assets: their sources, structure, classification, and associations, which in turn prescribe the required application architecture and technical capabilities.

Information architecture is the bridge between the Business Architecture world and the application and technology architecture activities.

The reason I say that is because information architecture is a business-driven discipline that details the information strategy of the company. As we know, and as we heard in conference keynotes such as the NASA, Big Data, and security presentations, the preservation and classification of information is vital to understanding what your architecture should be.

Least matured

From an industry perspective, this is one of the least mature areas as far as being incorporated into a formal discipline. The TOGAF standard actually has a phase dedicated to it: Data Architecture. Still, there are lots of opportunities to grow and to incorporate additional methods, models, and tools from the enterprise information management discipline.

Enterprise information management not only captures traditional topic areas like master data management (MDM), metadata, and unstructured information architecture, but also focuses on information governance and on the architecture patterns and styles implemented in MDM, Big Data, and so on. There is a great deal of opportunity there.

As for the role of information architects, I’m seeing more and more traction in the industry as a whole. I’ve worked with an entire group focused on information architecture, building up an enterprise information management practice so that we can take our top-line business strategies and understand what architectures we need to put in place.

This is a critical enabler for global companies, because oftentimes they’re constrained by regulation, typically imposed at a national or regional level. This means we have to understand that when we build our architecture, it’s not about the application, but rather the data that it processes, moves, or transforms.

Gardner: Up until not too long ago, the conventional thinking was that applications generate data. Then you treat the data in some way so that it can be used, perhaps by other applications, but that the data was secondary to the application.

But there’s some shift in that thinking now more toward the idea that the data is the application and that new applications are designed to actually expand on the data’s value and deliver it out to mobile tiers perhaps. Does that follow in your thinking that the data is actually more prominent as a resource perhaps on par with applications?

Walker: You’re spot on, Dana. Before the commoditization of these technologies, which resided on premises, we could get away with starting at the application layer and working our way back, because we had access to the source code or to the hardware behind our firewalls. We could throw servers at the problem and put firewalls in front of the data, solving it with infrastructure. So we didn’t have to treat information as a first-class citizen. Times have changed, though.

Information access and processing is now democratized, and information is being pushed as the first point of presentment. A lot of the time this is on a mobile device, and even then it’s not a corporate mobile device but your personal one. So how do you handle that data?

It’s the same way with Cloud, and I’ll give you a great example. I was working as an adviser for a company that was developing its Cloud strategy. They had made a big bet on one of the big infrastructure and Cloud-service providers. They looked first at the features and functions that the Cloud provider could offer, and not necessarily at their information requirements. They ran into two major issues that were essentially showstoppers, and they had to pull off that infrastructure.

The first was that specific Cloud provider’s terms of service around intellectual property (IP) ownership. Essentially, the company would have been forced to give up its IP rights.

Big business

As you know, IP is a big business these days, so that was a showstopper. The second issue was that the arrangement broke core regulatory requirements around being able to discover information.

So focusing on the application to make sure it meets your functional needs is important. However, we should take a step back, look at the information first, and make sure that the requirements of the people in your organization who can’t say no are satisfied.

Gardner: Is data architecture different from EA and Business Architecture, or is it a subset? What’s the relationship, Dave?

Hornford: Data architecture is part of an EA. I won’t use the word subset, because a subset starts to imply that it is a distinct thing that you can look at on its own. You cannot look at your Business Architecture without understanding your information architecture. When you think about Big Data, cool. We’ve got this pile of data in the corner. Where did it come from? Can we use it? Do we actually have legitimate rights, as Mike highlighted, to use this information? Are we allowed to mix it and who mixes it?

When we look at how our businesses are optimized, they normally optimize around work product: what the organization is delivering. That’s very easy; you can see who consumes your work product. With information, you often have no idea who consumes it. So now we have concerns of provenance and source, and, as we move to global companies, the trends around consumerization, Cloud, and simply tightening cycle time.

Gardner: Of course, the end game for a lot of the practitioners here is to create that feedback loop of a lifecycle approach, rapid information injection and rapid analysis that could be applied. So what are some of the ways that these disciplines and tools can help foster that complete lifecycle?

Band: The disciplines and tools can facilitate the right conversations among different stakeholders. One of the things that we’re doing at The Standard is building cadres equally balanced between people in business and IT.

We’re training them in information management, going through a particular curriculum, and having them study for an information management certification that introduces a lot of these different frameworks and standard concepts.

Creating cadres

We want to create these cadres to be able to solve tough and persistent information management problems that affect all companies in financial services, because information is a shared asset. The purpose of the frameworks is to ensure proper stewardship of that asset across disciplines and across organizations within an enterprise.

Hornford: The core comes from the two standards that we have, the ArchiMate standard and the TOGAF standard. The TOGAF standard has, from its early roots, focused on the components of EA and on building a consistent method for understanding what I’m trying to accomplish, where I am, and where I need to be to reach my goal.

When we bring in the ArchiMate standard, I have a language, a descriptor, a visual descriptor that allows me to cross all of those domains in a consistent description, so that I can do that traceability. When I pull in this lever or I have this regulatory impact, what does it hit me with, or if I have this constraint, what does it hit me with?

If I don’t do this, if I don’t use the framework of the TOGAF standard, or I don’t use the discipline of formal modeling in the ArchiMate standard, we’re going to do it anecdotally. We’re going to trip. We’re going to fall. We’re going to have a non-ending series of surprises, as Mike highlighted.

“Oh, terms of service. I am violating the regulations. Beautiful. Let’s take that to our executive and tell him right as we are about to go live that we have to stop, because we can’t get where we want to go, because we didn’t think about what it took to get there.” And that’s the core of EA in the frameworks.

Walker: To build on what Dave just talked about, and going back to your first question, Dana, about the value statement of TOGAF from a business perspective: the business value of TOGAF is a repeatable and predictable process for building out architectures that properly manages risk and reliably produces value.

The TOGAF framework provides a methodology for asking what problems you’re trying to solve and where you are trying to go with your business opportunities or challenges. That leads to Business Architecture, which is really the distillation of the corporate strategy, rationalized in technical or architectural terms.

From there, what you want to understand is information: how does the strategy translate, and what information architecture do we need to put in place? You get into all sorts of things around risk management, and it goes on from there to what we were talking about earlier: information architecture.

If the TOGAF standard is applied properly, you can achieve the same result every time. That is what interests business stakeholders, in my opinion. And the ArchiMate modeling language is great because, as we discussed, it provides very rich visualizations, so that people can not only show a picture but also tie information together. Unlike other aspects of architecture, information architecture is less about the boxes and more about the lines.

Quality of the individuals

Forde: Building on what Dave was saying earlier, and also what Iver was saying: while the process, the methodology, and the tools are of interest, it’s the discipline and the quality of the individuals doing the work that matter.

Iver talked about how the conversation is shifting and how the practice is improving by building communication groups that have a discipline to operate around. What I hear implied, and what I know specifically occurs, is that we end up with assets that are well described and reusable.

And there is a point at which you reach a critical mass that these assets become an accelerator for decision making. So the ability of the enterprise and the decision makers in the enterprise at the right level to respond is improved, because they have a well disciplined foundation beneath them.

Decision makers get a set of assets that are reasonably well-known, at the right level of granularity for them to absorb, and the conversation is structured so that the technical people and the business people are in the right room together to talk about the problems.

This is actually a fairly sophisticated set of operations that I am describing, and it doesn’t happen overnight, but it is definitely one of the things that we see occurring with our members in certain cases.

Hornford: I want to build on what Chris said, specifically the word “asset.” While he was talking, I was thinking about how people talk about information as an asset. Most of us don’t know what information we have, how it’s collected, or where it is, but we know we’ve got a valuable asset.

I’ll use an analogy. I have a factory some place in the world that makes stuff. Is that an asset? If I know that my factory is able to produce a particular set of goods and it’s hooked into my supply chain here, I’ve got an asset. Before that, I just owned a thing.

I was very encouraged listening to what Iver talked about: building cadres. I have seen this approach work, and though I haven’t been using that word, I’m stealing it now. The way to build effective teams is not to take a couple of specialists and put them in an ivory tower, but to provide the method and the discipline for how we converse, so that we can have a consistent conversation.

When I tie it with some of the tools from the Architecture Forum and the ArchiMate Forum, I’m able to consistently describe it, so that I now have an asset I can identify, consume and produce value from.

Business context

Forde: And this is very different from data modeling. We are not talking about entity relationships, third normal form, or that kind of junk at the technical-detail level. We’re talking about a conversation around the business context of what needs to go on, supported by the right level of technical detail when you need to go there in order to clarify.


Filed under ArchiMate®, Enterprise Architecture, TOGAF®

Welcome to Platform 3.0

By Dave Lounsbury, The Open Group

The space around us is forever changing.

As I write now, the planet’s molten core is in motion far beneath my feet, and way above my head, our atmosphere and the universe are in constant flux too.

Man makes his own changes as well. Innovation in technology and business constantly creates new ways to work together and to create economic value.

Over the past few years, we have witnessed the birth, evolution, and use of a number of such changes, each of which has the potential to fundamentally change the way we engage with one another. These include Mobile, Social (both Social Networks and the Social Enterprise), Big Data, the Internet of Things, and Cloud Computing, as well as new device and application architectures.

Now however, these once disparate forces are converging – united by the growing Consumerization of Technology and the resulting evolution in user behavior – to create new business models and system designs.

You can see evidence of this convergence of trends in the following key architectural shifts:

  • Exponential growth of data inside and outside organizations converging with end point usage in mobile devices, analytics, embedded technology and Cloud hosted environments
  • The speed of technology and business innovation is rapidly shifting the focus from asset ownership to the usage of services, and calls for more agile architecture models that can adapt to new technology changes and offerings
  • New value networks resulting from the interaction and growth of the Internet of Things and multi-devices and connectivity targeting specific vertical industry sector needs
  • Performance and security implications involving cross-technology platforms, and cache and bandwidth strategies, across federated environments
  • Social behavior and market channel changes resulting in multiple ways to search and select IT and business services
  • Cross device and user-centric driven service design and mainstream use of online marketplace platforms for a growing range of services

The analyst community was the first to recognize and define this evolution in the technological landscape which we are calling Platform 3.0.

At Gartner’s Symposium conference, the keynote touched on the emergence of what it called a ‘Nexus of Forces,’ warning that it would soon render existing Business Architectures “obsolete.”

However, for those organizations who could get it right, Gartner called the Nexus a “key differentiator of business and technology management” and recommended that “strategizing on how to take advantage of the Nexus should be a top priority for companies around the world.”[i]

Similarly, according to IDC Chief Analyst, Frank Gens, “Vendors’ ability (or inability) to compete on the 3rd Platform [Platform 3.0] right now — even at the risk of cannibalizing their own 2nd Platform franchises — will reorder leadership ranks within the IT market and, ultimately, every industry that uses IT.”[ii]

Of course, while organizations will be looking to make use of Platform 3.0 to create innovative new products and services, the transition will not be easy for many. Significantly, there will be architectural issues and structural considerations to overcome when using and combining these convergent technologies. Accomplishing this will in turn require cooperation among suppliers and users of these products and services.

That is why we’re excited to announce the formation of a new – as yet unnamed – forum, specifically designed to advance The Open Group vision of Boundaryless Information Flow™ by helping enterprises to take advantage of these convergent technologies. This will be accomplished by identifying a set of new platform capabilities, and architecting and standardizing an IT platform by which enterprises can reap the business benefits of Platform 3.0. It is our intention that these capabilities will enable enterprises to:

  • Process data “in the Cloud”
  • Integrate mobile devices with enterprise computing
  • Incorporate new sources of data, including social media and sensors in the Internet of Things
  • Manage and share data that has high volume, velocity, variety and distribution
  • Turn the data into usable information through correlation, fusion, analysis and visualization
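As a toy illustration of the “correlation, fusion, analysis” capability in the last bullet, the sketch below correlates and joins two hypothetical data streams. All values and source names here are invented for illustration; real correlation and fusion pipelines would of course operate at far larger scale.

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical hourly readings from two sources an enterprise might fuse:
# in-store foot traffic (from sensors) and units sold (from point of sale).
foot_traffic = [120, 150, 90, 200, 170, 80]
units_sold = [14, 18, 10, 25, 20, 9]

# Correlation: how strongly the two series move together (-1.0 to 1.0).
r = pearson(foot_traffic, units_sold)

# "Fusion": join both sources into one view, keyed by hour, for later analysis.
fused = [{"hour": h, "traffic": t, "sold": s}
         for h, (t, s) in enumerate(zip(foot_traffic, units_sold))]
```

The point of the sketch is only the shape of the capability: separate sources are brought together, and a simple analytic (here, correlation) turns raw numbers into a usable signal.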

The forum will bring together a community of industry experts and thought leaders whose purpose will be to meet these goals, initiate and manage programs to support them, and promote the results. Owing to its nature, the forum is expected to leverage work already underway in this area by The Open Group’s existing Cloud Work Group, and to coordinate with other forums on overlapping or cross-cutting activities.

Looking ahead, the first deliverables will analyze the use of Cloud, Social, Mobile Computing and Big Data, and describe the business benefits that enterprises can gain from them. The forum will then proceed to describe the new IT platform in the light of this analysis.

If this area is as exciting and important to you and your organization as it is to us, please join us in the discussion. We will use this blog and other communication channels of The Open Group to let you know how you can participate, and we’d of course welcome your comments and thoughts on this idea.


Filed under Enterprise Architecture, Professional Development

What are Words Worth?

By Stuart Boardman, KPN

“Words are stupid, words are fun 

Words can put you on the run.”*

Many years ago I learned, at my own cost, how easily words can be re- and/or misinterpreted. The story itself is not important. What matters is that a bunch of us were trying to achieve something we thought was worthwhile, thought we’d achieved it, but got conned by someone more cunning with words than we were. The result was pretty much the opposite of what we intended.

I’ve spent a lot of time since then trying to find ways of tying down meanings so that, if someone disagreed with me, it would at least be clear to everyone what we were disagreeing about. That basically involved looking for a very precise choice of words and offering a definition of what I was using them for. Nothing very original there. It’s the same motivation which leads us to create a glossary or taxonomy.

Which brings me to the problem I want to address: Definitions can actually get in the way of the discussion. In the professional world, inhabited by pretty much anyone likely to be reading this, we tend to borrow words from natural language to describe very specific concepts: concepts which we have made specific. Sometimes we borrow these words from other disciplines, which may themselves have specialized out of natural language. Sometimes the usage is a form of metaphor or analogy, but with familiarity that fact is forgotten and it becomes just another word we take for granted.

Recently I had a (friendly) public debate with Tom Graves about the meaning of the word entropy, which we had each used, independently of the other, to characterize related but different phenomena affecting enterprises. We both used it as an analogy or parallel, and we based our analogies on different definitions of the term within the world where it originated: physics. These definitions are not contradictory in physics but are pretty divergent when used as analogy or metaphor. Tom and I are friends, so the discussion didn’t become rancorous, but we have yet to achieve a satisfactory resolution – at least not an agreed definition.

Also recently, I have witnessed a debate in the Enterprise Architecture community (on LinkedIn) about the meaning of the words business and enterprise. These are words common in natural language whereas here they were being used in the context of our specific discipline. In that context it was a relevant and perhaps even important discussion. The meaning you associate with them, unless you believe they are semantically identical, has a significant impact on your view of Enterprise Architecture (EA).

Unfortunately, the debate rather quickly developed into a heated discussion about who had the correct definition of each of these words. All kinds of “experts” from the worlds of economics and management science were quoted along with various dictionaries, which only served to prove that almost any position could be justified. The net result was that the substantial discussion got lost in definition wars. And that’s a pity because there were some important differences in perspective, which could have been useful to explore and from which everyone could have learned something – even if we all stuck to our own definitions of the words.

We may not be doing anything obscure with these words in EA, but we’re still giving them a very specific context, which may not be identical to what the man on the number 9 bus (or a professor in a business school) thinks of. If even then we can give them different, reasonable definitions, it’s clear that we should focus on the underlying discussion, as intended/defined by the person who started it. Otherwise we’ll never get beyond a meta-discussion.

So how can we get away from the meta-discussions? To come back to Tom and me and entropy, the discussion about the definition of the word was useful to the extent that it helped me understand what he was getting at. (Beyond that it was of no value at all in the context of the substantive discussion, which is why we parked it.) Later on, Tom observed that the important thing in a discussion about terms is the process of discussion itself. Interestingly my partner made the identical point last night and she comes from an entirely different discipline as a healthcare professional: What’s useful in such a discussion is not the statement we make but the story we tell. A statement is static. A story is dynamic. So then, instead of saying “my definition of entropy is X. What’s yours?” we say, “I use the word entropy to refer to the following phenomena/behaviors. What things are you trying to capture?” We’ve pushed that definition out of the way. Later on we may come back to it, if we think at that point it would be useful to tie the term down.

Another recent discussion on Ruth Malan’s Requisite Variety site reminded me of the importance of visuals – sketching something. In fact I’m seeing an increasing number of people talking about visual thinking. You don’t have to be a great artist to sketch something out, which is a good thing because I can’t draw to save my life. You just need to realize that in your head you are very often visualizing something, and not necessarily a physical object. I think that’s particularly true when we use analogy or metaphor. And how often do we talk of seeing something in our “mind’s eye”? Let’s get that vision out there, show what we think is going on and how things affect each other. Take a look at that discussion on Ruth’s site and check out the links provided by Peter Bakker.

Of course definitions have their uses and are important if a group of people developing standards needs to agree on how terms will be used. The group also wants other people to understand what they’re trying to say, and hopes that those people, even if they know another reasonable definition, will accept this one for the purposes of the discussion. But sometimes people are sufficiently uncomfortable with your definition – with your use of the word – that it becomes a barrier to the discussion. That’s what happened in the enterprise/business argument I mentioned before.

Let’s think about the term enterprise again. TOGAF™ has a clear definition of enterprise, which I happily use in discussions with people who know TOGAF. There are, however, people who for perfectly good reasons have a problem with a government or non-profit organization being called an enterprise or who believe the term only applies to organizations above a certain size and complexity. There are also people for whom an enterprise is necessarily identical to an organization. I personally tend to a much more generous definition. What am I going to do when I’m talking to those whose definition of an enterprise is different from mine? Should I try to convince them my definition is right or should I say “OK, fine, we’ll use your definition but let’s talk about all those other things I wanted to include and try to understand how they affect our organization.”

I need to draw pictures. A picture doesn’t force anyone to agree on a definition. It provides a canvas (there we go, another common visual metaphor) on which to place the elements of the discussion. This picture, courtesy of Tom Graves, provides an example of such a canvas. You don’t have to agree on a definition to understand what is being said. And there’s an accompanying story. Then we can investigate what it was I was trying to say and whether we can agree about the what, how and why of mechanisms in play. That doesn’t mean they’re going to agree but at least we’ll be arguing about the actual substance and there’s a fair chance we’ll all learn from the process. The label we pin on it is then a secondary consideration.

“Words in papers, words in books

Words on tv, words for crooks

Words of comfort, words of peace

Words to make the fighting cease

Words to tell you what to do

Words are working hard for you

Eat your words but don’t go hungry

Words have always nearly hung me.”*

*From Wordy Rappinghood by Tom Tom Club (1981)

Stuart Boardman is a Senior Business Consultant with KPN where he co-leads the Enterprise Architecture practice as well as the Cloud Computing solutions group. He is co-lead of The Open Group Cloud Computing Work Group’s Security for the Cloud and SOA project and a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by the Information Security Platform (PvIB) in The Netherlands and of his previous employer, CGI. He is a frequent speaker at conferences on the topics of Cloud, SOA, and Identity. 


Filed under Enterprise Architecture

The Open Group Panel Explores How the Big Data Era Now Challenges the IT Status Quo

By Dana Gardner, Interarbor Solutions

Listen to the recorded podcast here: The Open Group panel explores how the Big Data era now challenges the IT status quo, or view the on-demand video recording on this discussion here: http://new.livestream.com/opengroup/events/1838807.

We recently assembled a panel of experts to explore how Big Data changes the status quo for architecting the enterprise. The bottom line from the discussion is that large enterprises should not just wade into Big Data as an isolated function, but should anticipate the strategic effects and impacts of Big Data — as well as the simultaneous complicating factors of Cloud Computing and mobile — as soon as possible.

The panel consisted of Robert Weisman, CEO and Chief Enterprise Architect at Build The Vision; Andras Szakal, Vice President and CTO of IBM’s Federal Division; Jim Hietala, Vice President for Security at The Open Group; and Chris Gerty, Deputy Program Manager at the Open Innovation Program at NASA. I served as the moderator.

And this special thought leadership interview series comes to you in conjunction with The Open Group Conference recently held in Newport Beach, California. The conference focused on “Big Data – the transformation we need to embrace today.”

Threaded factors

An interesting thread for me throughout the conference was to factor where Big Data begins and plain old data, if you will, ends. Of course, it’s going to vary quite a bit from organization to organization.

But Gerty from NASA, part of our panel, provided a good example: It’s when you run out of gas with your old data methods, and your ability to deal with the data — and it’s not just the size of the data itself.

Therefore, Big Data means doing things differently — not just managing the velocity, the volume and the variety of the data, but really thinking about data fundamentally and differently. And we need to think about security, risk and governance. If it’s a “boundaryless organization” when it comes to your data — as a product, a service or a resource — then the control and management of which data should be exposed, which should be opened, and which should be very closely guarded all need to be factored, determined and implemented.

Here are some excerpts from the on-stage discussion:

Dana Gardner: You mentioned that Big Data to you is not a factor of the size, because NASA’s dealing with so much. It’s when you run out of steam, as it were, with the methodologies. Maybe you could explain more. When do you know that you’ve actually run out of steam with the methodologies?

Gerty: When we collect data, we have some sort of goal in mind of what we might get out of it. When we put the pieces from the data together, either it maybe doesn’t fit as well as you thought, or you’re successful and you continue to do the same thing, gathering archives of information.

At that point, where you realize there might even be something else that you want to do with the data, different from what you planned originally, that’s when you have to pivot a little bit and say, “Now I need to treat this as a living archive. It’s an ‘it may live beyond me’ type of thing.” At that point, I think you treat it as setting up the infrastructure for being used later, whether it be by you or someone else. That’s an important transition to make, and might be what one could define as Big Data.

Gardner: Andras, does that square with where you are in your government interactions — that data now becomes a different type of resource, and that you need to know when to do things differently?

Szakal: The importance of data hasn’t changed. The data itself, the veracity of the data, is still important. Transactional data will always need to exist. The difference is that you have certainly the three or four Vs, depending on how you look at it, but the importance of data is in its veracity, and your ability to understand or to be able to use that data before the data’s shelf life runs out.

Some data has a shelf life that’s long lived. Other data has very little shelf life, and you would use different approaches to being able to utilize that information. It’s ultimately not about the data itself, but about gaining deep insight into that data. So it’s not storing data or manipulating data, but applying those analytical capabilities to data.

Gardner: Bob, we’ve seen the price points on storage go down so dramatically. We’ve seen people just decide to hold on to data that they wouldn’t have before, simply because they can and they can afford to do so. That means we need to try to extract value from and use that data. From the perspective of an enterprise architect, how are things different now, vis-à-vis this much larger set and variety of data, when it comes to planning and executing as architects?

Weisman: One of the major issues is that organizations normally hold two orders of magnitude more data than they need. It’s a huge overhead, both in terms of the applications architecture, with a code base larger than it should be, and the technology architecture, which is supporting a horrendous number of servers and a whole bunch of technology that they don’t need.

The issue for the architect is to figure out what data is useful, institute a governance process so that you can have data lifecycle management and proper disposition, focus the organization on the information, data and knowledge that is going to provide business value to the organization, and help them innovate and gain a competitive advantage.

Can’t afford it

And in terms of government, just improve service delivery, because there’s waste right now in information infrastructure, and we can’t afford it anymore.

Gardner: So it’s difficult to know what to keep and what not to keep. I’ve actually spoken to a few people lately who want to keep everything, just because they want to mine it, and they are willing to spend the money and effort to do that.

Jim Hietala, when people do get to this point of trying to decide what to keep, what not to keep, and how to architect properly for that, they also need to factor in security. It shouldn’t come later in the process; it should come early. What are some of the precepts that you think are important in applying good security practices to Big Data?

Hietala: One of the big challenges is that many of the big-data platforms weren’t built from the get-go with security in mind. So some of the controls that you’ve had available in your relational databases, for instance, you move over to the Big Data platforms and the access control authorizations and mechanisms are not there today.

Planning the architecture, and looking at bringing in third-party controls to give you the security mechanisms that you are used to in your older platforms, is something that organizations are going to have to do. It’s really an evolving and emerging thing at this point.

Gardner: There are a lot of unknown unknowns out there, as we discovered with our tweet chat last month. Some people think that data is just data, and you apply the same security to it. Do you think that’s the case with Big Data? Is it just another follow-through of what you always did with data in the first place?

Hietala: I would say yes, at a conceptual level, but it’s like what we saw with virtualization. When there was a mad rush to virtualize everything, many of those traditional security controls didn’t translate directly into the virtualized world. The same thing is true with Big Data.

When you’re talking about those volumes of data, applying encryption and various other security controls, you have to think about how those things are going to scale. That may require new solutions and new technologies.

Gardner: Chris Gerty, when it comes to that governance, security, and access control, are there any lessons that you’ve learned about getting the best of openness, while also being able to manage the spigot?

Gerty: Spigot is probably a dangerous term to use, because it implies that all data is treated the same. The sooner that you can tag the data as either sensitive or not, mostly coming from the person or team that’s developed or originated the data, the better.

Kicking the can

Once you have it on a hard drive, once you get crazy about storing everything, if you don’t know where it came from, you’re forced to put it into a secure environment. And that’s just kicking the can down the road. It’s really a disservice to people who might use the data in a useful way to address their problems.

We constantly have satellites that are made for one purpose. They send all the data down. It’s controlled either for security or for intellectual property (IP), so someone can write a paper. Then, after the project doesn’t get funded or it just comes to a nice graceful close, there is that extra step, which is almost a responsibility of the originators, to make it useful to the rest of the world.

Gardner: Let’s look at Big Data through the lens of some other major trends right now. Let’s start with Cloud. You mentioned that at NASA, you have your own private Cloud that you’re using a lot, of course, but you’re also now dabbling in commercial and public Clouds. Frankly, the price points that these Cloud providers are offering for storage and data services are pretty compelling.

So we should expect more data to go to the Cloud. Bob, from your perspective, as organizations and architects have to think about data in this hybrid on-premises/off-premises Cloud, moving back and forth, what do you think enterprise architects need to start thinking about in terms of managing that and planning for the right destination of data, based on the right mix of other requirements?

Weisman: It’s a good question. As you said, the price point is compelling, but the security and privacy of the information is something else that has to be taken into account. Where is that information going to reside? You have to have very stringent service-level agreements (SLAs) and in certain cases, you might say it’s a price point that’s compelling, but the risk analysis that I have done means that I’m going to have to set up my own private Cloud.

Right now, everybody’s saying the public Cloud is going to be the way to go. Vendors are going to have to be very sensitive to that, and many are, at this point in time, addressing a lot of the needs of some of their large client bases. So it’s not one-size-fits-all, and it’s more than just a price for service. Architecture can bring down the price pretty dramatically, even within an enterprise.

Gardner: Andras, how do the Cloud and Big Data come together in a way that’s intriguing to you?

Szakal: Actually it’s a great question. We could take the rest of the 22 minutes talking on this one question. I helped lead the President’s Commission on Big Data that Steve Mills from IBM and — I forget the name of the executive from SAP — led. We intentionally tried to separate Cloud from Big Data architecture, primarily because we don’t believe that, in all cases, Cloud is the answer to all things Big Data. You have to define the architecture that’s appropriate for your business needs.

However, it also depends on where the data is born. Take many of the investments IBM has made in enterprise market management — for example, Coremetrics and several of the services that we now offer for helping customers gain deep insight into how their retail market or supply chain behaves.

Born in the Cloud

All of that information is born in the Cloud. But if you’re talking about actually using Cloud as infrastructure and moving around huge sums of data or constructing some of these solutions on your own, then some of the ideas that Bob conveyed are absolutely applicable.

I think it becomes prohibitive to do that, and easier to stand up a hybrid environment for managing that amount of data. But you have to think about whether your data is real-time data, whether it’s data to which you could apply some of these new technologies, like Hadoop MapReduce-type solutions, or whether it’s traditional data warehousing.

Data warehouses are going to continue to exist and they’re going to continue to evolve technologically. You’re always going to use a subset of data in those data warehouses, and it’s going to be an applicable technology for many years to come.
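The MapReduce pattern Szakal refers to can be sketched in miniature: a pure-Python word count standing in for the map, shuffle and reduce phases that a framework like Hadoop distributes across machines. The input lines here are invented, and a real deployment would stream records from distributed storage rather than a list.

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (key, value) pairs -- here, (word, 1) for each word."""
    for record in records:
        for word in record.lower().split():
            yield (word, 1)

def shuffle(pairs):
    """Shuffle: group values by key, as the framework does between phases."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: combine each key's values -- here, summing the counts."""
    return {key: sum(values) for key, values in groups.items()}

lines = ["big data big insight", "data at rest data in motion"]
counts = reduce_phase(shuffle(map_phase(lines)))
```

The value of the pattern is that map and reduce are independent per key, so each phase can be spread over many machines with only the shuffle requiring data movement.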

Gardner: So suffice it to say, an enterprise architect who is well versed in both Cloud infrastructure requirements, technologies, and methods, as well as Big Data, will probably be in quite high demand. That specialization in one or the other isn’t as valuable as being able to cross-pollinate between them.

Szakal: Absolutely. It’s about enabling our architects, and finding individuals with deep skills in this unique combination: analytics, mathematics, and business. Those individuals are going to be the future architects of the IT world, because analytics and Big Data are going to be integrated into everything that we do and become part of the business processing.

Gardner: Well, that’s a great segue to the next topic that I am interested in, and it’s around mobility as a trend and also application development. The reason I lump them together is that I increasingly see developers being tasked with mobile first.

When you create a new app, you have to remember that this is going to run in the mobile tier and you want to make sure that the requirements, the UI, and the complexity of that app don’t go beyond the ability of the mobile app and the mobile user. This is interesting to me, because data now has a different relationship with apps.

We used to think of apps as creating data and then the data would be stored and it might be used or integrated. Now, we have applications that are simply there in order to present the data and we have the ability now to present it to those mobile devices in the mobile tier, which means it goes anywhere, everywhere all the time.

Let me start with you, Jim, because it’s security and risk, but it’s also just rethinking the way we use data in a mobile tier. If we can do it safely — and that’s a big IF — how important should it be for organizations to start thinking about making this data available to all of these devices, pouring as much as possible out into that mobile tier?

Hietala: In terms of enabling the business, it’s very important. There are a lot of benefits that accrue from accessing your data from whatever device you happen to be on. To me, it is that question of “if,” because now there’s a whole lot of problems to be solved relative to the data floating around anywhere on Android, iOS, whatever the platform is, and the organization being able to lock down their data on those devices, forgetting about whether it’s the organization device or my device. There’s a set of issues around that that the security industry is just starting to get their arms around today.

Mobile ability

Gardner: Chris, any thoughts about this mobile angle — that the data gets more valuable the more you can use and apply it, and the more you apply it, the more data you generate, which makes the data more valuable — so that we start getting into a positive feedback loop?

Gerty: Absolutely. It’s almost an appreciation of what more people could do and get to the problem. We’re getting to the point where, if it’s available on your desktop, you’re going to find a way to make it available on your device.

Those same security questions probably need to be answered anyway, but making it mobile compatible is almost an acknowledgment that there will be someone who wants to use it. So let me go that extra step to make it compatible and see what I get from them. It’s more of a cultural benefit that you get from making things compatible with mobile.

Gardner: Any thoughts about what developers should be thinking in trying to bring the fruits of Big Data, through these analytics, to more users rather than just the BI folks or those who are good at SQL queries? Does this change the game — an application on a mobile device that is simple and powerful, but accesses this real-time, updated treasure trove of data?

Gerty: I always think of the astronaut on the moon. He’s got a big, bulky glove, and he might have a heads-up display in front of him, but he really needs to know exactly a certain piece of information at the right moment — dealing with bandwidth issues, dealing with the environment, a foggy helmet, whatever.

It’s very analogous to what the day-to-day professional will use trying to find out that quick e-mail he needs to know or which meeting to go to — which one is more important — and it all comes down to putting your developer in the shoes of the user. So anytime you can get interaction between the two, that’s valuable.

Weisman: From an Enterprise Architecture point of view, my background is mainly defense and government, but defense mobile computing has been around for decades. So you’ve always been dealing with that.

The main thing is that in many cases, if they’re coming up with information, the whole presentation layer is turning into another architecture domain with information visualization and also with your security controls, with an integrated identity management capability.

It’s like you were saying about the astronaut getting it right. He doesn’t need to know everything that’s happening in the world. He needs to see, on his heads-up display, the stuff that’s relevant to him.

So it’s getting the right information to the person, in an authorized manner, in a way that he can visualize and make sense of — be it straight data, analytics, or whatever. The presentation layer, ergonomics and visual communication are going to become very important in the future for that. There are also a lot of problems to solve: rather than doing it at the application level, you’re doing it entirely in one layer.

Governance and security

Gardner: So clearly the implications of data are cutting across how we think about security, how we think about UI, how we factor in mobility. What we now think about in terms of governance and security, we have to do differently than we did with older data models.

Jim Hietala, what about the impact on spurring people towards more virtualized desktop delivery — if you don’t want to have the data on that end device, if you want to solve some of the issues around control and governance, and if you want to be able to manage just how much data gets into that UI, not too much, not too little?

Do you think that some of these concerns we’re addressing will push people to look even harder, maybe more aggressively, at desktop and application virtualization — as they say, keep it on the server and deliver out just the deltas?

Hietala: That’s an interesting point. I’ve run across a startup in the last month or two that is doing just that. The whole value proposition is to virtualize the environment. You get virtual gold images. You don’t have to worry about what’s actually happening on the physical device, and you know when the devices connect. The security threat goes away. So we may see more of that as a solution.

Gardner: Andras, do you see that some of the implications of Big Data, far-fetched as it may be, are propelling people to cultivate their servers more and virtualize their apps, their data, and their desktops right up to the end devices?

Szakal: Yeah, I do. I see IBM providing solutions for virtual desktop, but I think it was really a security question you were asking. You’re certainly going to see an additional number of virtualized desktop environments.

Ultimately, our networks still are not stable enough, or at a high enough bandwidth, to really make that a useful exercise for all but the most menial users in the enterprise. From a security point of view, there is a lot still to be solved.

And part of the challenge in the Cloud environment that we see today is the proliferation of virtual machines (VMs) and the inability to actually contain the security controls within those machines and across these machines from an enterprise perspective. So we’re going to see more solutions proliferate in this area and to try to solve some of the management issues, as well as the security issues, but we’re a long ways away from that.

Gerty: Big Data, by itself, isn’t magical. It doesn’t have the answers just by being big. If you need more, you need to pry deeper into it. That’s the example. They realized early enough that they were able to make something good.

Gardner: Jim Hietala, any thoughts about examples that illustrate where we’re going and why this is so important?

Hietala: Being a security guy, I tend to talk about scare stories, horror stories. One example from last year struck me: one of the major retailers here in the U.S. hit the news for having predicted, through customer purchase behavior, when people were pregnant.

They could look and see that if, out of 20 telling items, you’re buying 15 of them and your purchase behavior has changed, they could tell. The privacy implications of that are somewhat concerning.
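The kind of purchase-behavior prediction described here boils down to a propensity score over “signal” items. A deliberately toy sketch — the item list, threshold, and basket are all invented, and real retail models are far more sophisticated:

```python
# Hypothetical set of products whose purchase correlates with the
# condition being predicted; any real model would learn these from data.
SIGNAL_ITEMS = {"unscented lotion", "prenatal vitamins", "cotton balls",
                "zinc supplement", "large tote bag"}

def propensity(basket, signal=SIGNAL_ITEMS, threshold=0.5):
    """Score a basket by the fraction of signal items it contains,
    and flag it if the score reaches the threshold."""
    hits = len(signal & set(basket))
    score = hits / len(signal)
    return score, score >= threshold

basket = ["milk", "unscented lotion", "prenatal vitamins", "cotton balls"]
score, flagged = propensity(basket)
```

Even a crude score like this illustrates the privacy concern in the anecdote: a handful of individually innocuous purchases, combined, reveals something the customer never disclosed.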

An example was that this retailer was sending out coupons related to somebody being pregnant. The teenage girl, who was pregnant, hadn’t told her family yet. The father found the coupons. There was alarm in the household, and at the local retail store when the father went and confronted them.

Privacy implications

There are privacy implications from the use of Big Data. When you get powerful new technology in marketing people’s hands, things sometimes go awry. So I’d throw that out just as a cautionary tale that there is that aspect to this. When you can see across people’s buying transactions, things like that, there are privacy considerations that we’ll have to think about, and that we really need to think about as an industry and a society.


Filed under Conference

On Demand Broadcasts from Day One at The Open Group Conference in Newport Beach

By The Open Group Conference Team

Since not everyone could make the trip to The Open Group Conference in Newport Beach, we’ve put together a recap of day one’s plenary speakers. Stay tuned for more recaps coming soon!

Big Data at NASA

In his talk, "Big Data at NASA," Chris Gerty, deputy program manager of the Open Innovation Program at the National Aeronautics and Space Administration (NASA), discussed how Big Data is being interpreted by the next generation of rocket scientists. Chris presented a few lessons learned from his experiences at NASA:

  1. A traditional approach is not always the best approach. A tried and proven method may not translate: writing more programs to store more data on bigger hard drives is not always effective. We need to address the never-ending challenges that lie ahead as society shifts to the information age.
  2. Plan for openness. Following a government directive, Chris' team looked to answer questions by asking the right people. For example, NASA asked the people gathering data on a satellite to determine which data was most important, which enabled NASA to narrow its focus and solve problems. Furthermore, by realizing what can also be useful to the public and what tools have already been developed by the public, open source development can benefit the masses. Through collaboration, governments and citizens can work together to solve some of humanity's biggest problems.
  3. Embrace the enormity of the universe. Look for Big Data where no one else is looking by placing sensors and information-gathering tools there. If people continue to be scared of Big Data, we will be resistant to gathering more of it. By finding Big Data where it has yet to be discovered, we can solve problems and innovate.

To view Chris’s presentation, please watch the broadcasted session here: http://new.livestream.com/opengroup/Gerty-NPB13

Bringing Order to the Chaos

David Potter, chief technical officer at Promise Innovation, and Ron Schuldt, senior partner at UDEF-IT, LLC, discussed how The Open Group's evolving Quantum Lifecycle Management (QLM) standard, coupled with its complementary Universal Data Element Framework (UDEF) standard, helps bring order to the terminology chaos that faces Big Data implementations.

The QLM standard provides a framework for the aggregation of lifecycle data from a multiplicity of sources to add value to the decision-making process. Gathering massive amounts of data is useless if it cannot be analyzed, and the QLM framework provides a means to interpret the information gathered for business intelligence. The UDEF allows each piece of data to be paired with an unambiguous key to provide clarity. By pairing with the UDEF, the QLM framework is able to separate itself from domain-specific semantic models. The UDEF also provides a ready-made key for international language support. As an open standard, the UDEF is data-model independent and as such supports normalization across data models.
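The core UDEF idea, pairing each data element with an unambiguous key, can be sketched in a few lines. This is an illustrative sketch only: the field names and key values below are invented stand-ins, not real UDEF identifiers, which follow the framework's published naming tree.

```python
# Illustrative sketch: two systems use different local terminology for the
# same concepts. Mapping both onto a shared, unambiguous key lets their
# records be compared directly. Key values here are hypothetical stand-ins.

SYSTEM_A_FIELDS = {"cust_nm": "a.5_9", "inv_amt": "q.2_1"}
SYSTEM_B_FIELDS = {"customer_name": "a.5_9", "invoice_total": "q.2_1"}

def harmonize(record: dict, field_map: dict) -> dict:
    """Re-key a record from local field names to the shared keys."""
    return {field_map[name]: value
            for name, value in record.items()
            if name in field_map}

a_record = harmonize({"cust_nm": "Acme", "inv_amt": 100}, SYSTEM_A_FIELDS)
b_record = harmonize({"customer_name": "Acme", "invoice_total": 100},
                     SYSTEM_B_FIELDS)

# Once re-keyed, records from both systems line up without either side
# having to adopt the other's terminology.
assert a_record == b_record
```

Because both systems resolve to the same keys, data can be matched across organizational boundaries, which is essentially the harmonization role the UDEF plays between funding partners in the Compassion International example.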

One example of successful implementation is by Compassion International. The organization needed to find a balance between information that should be kept internal (e.g., payment information) and information that should be shared with its international sponsors. In this instance, UDEF was used as a structured process for harmonizing the terms used in IT systems between funding partners.

The beauty of the QLM framework and UDEF integration is that they are flexible and can be applied to any product, domain and industry.

To view David and Ron’s presentation, please watch the broadcasted session here: http://new.livestream.com/opengroup/potter-NPB13

Big Data – Panel Discussion

Moderated by Dana Gardner of Interarbor Solutions, a panel of Robert Weisman (Build The Vision), Andras Szakal (IBM), Jim Hietala (The Open Group) and Chris Gerty (NASA) discussed the implications of Big Data and what it means for business architects and enterprise architects.

Big Data is not about the size of the data but about analyzing it. Robert mentioned that most organizations store more data than they need or use; from an enterprise architect's perspective, it's important to focus on the analysis of the data and to provide information that will ultimately aid the business in some way. When it comes to security, Jim explained that newer Big Data platforms are not built with security in mind. While data is data, many security controls don't translate to new platforms or scale with the influx of data.

Cloud Computing is Big Data-ready, and price can be compelling, but there are significant security and privacy risks. Robert brought up the argument over public and private Cloud adoption, and said, “It’s not one size fits all.” But can Cloud and Big Data come together? Andras explained that Cloud is not the almighty answer to Big Data. Every organization needs to find the Enterprise Architecture that fits its needs.

The fruits of Big Data can be useful to more than just business intelligence professionals. With the trend of mobility and application development in mind, Chris suggested that developers keep users in mind. Big Data can be used to tell us many different things, but it’s about finding out what is most important and relevant to users in a way that is digestible.

Finally, the panel discussed how Big Data is bringing about big changes in almost every aspect of an organization. It is important not to generalize, but to customize: every enterprise needs an architecture that fits its needs. Each organization finds importance in different facets of the data gathered, and security is different at every organization. With all that in mind, the panel agreed that focusing on the analytics is the key.

To view the panel discussion, please watch the broadcast session here: http://new.livestream.com/opengroup/events/1838807


Filed under Conference

Three Best Practices for Successful Implementation of Enterprise Architecture Using the TOGAF® Framework and the ArchiMate® Modeling Language

By Henry Franken, Sven van Dijk and Bas van Gils, BiZZdesign

The discipline of Enterprise Architecture (EA) was developed in the 1980s with a strong focus on the information systems landscape of organizations. Since those days, the scope of the discipline has slowly widened to include more and more aspects of the enterprise as a whole. This holistic perspective takes into account the concerns of a wide variety of stakeholders. Architects, especially at the strategic level, attempt to answer the question: “How should we organize ourselves in order to be successful?”

An architecture framework is a foundational structure or set of structures for developing a broad range of architectures and consists of a process and a modeling component. The TOGAF® framework and the ArchiMate® modeling language – both maintained by The Open Group – are two leading and widely adopted standards in this field.


While both the TOGAF framework and the ArchiMate modeling language have a broad (enterprise-wide) scope and provide a practical starting point for an effective EA capability, a key factor is the successful embedding of EA standards and tools in the organization. From this perspective, the implementation of EA means that an organization adopts processes for the development and governance of EA artifacts and deliverables. Standards need to be tailored, and tools need to be configured in the right way in order to create the right fit. Or more popularly stated, “For an effective EA, it has to walk the walk, and talk the talk of the organization!”

EA touches on many aspects such as business, IT (and especially the alignment of these two), strategic portfolio management, project management and risk management. EA is by definition about cooperation and therefore it is impossible to operate in isolation. Successful embedding of an EA capability in the organization is typically approached as a change project with clearly defined goals, metrics, stakeholders, appropriate governance and accountability, and with assigned responsibilities in place.

With this in mind, we share three best practices for the successful implementation of Enterprise Architecture:

Think big, start small

The potential footprint of a mature EA capability is as big as the entire organization, but one of the key success factors for EA is to deliver value early on. Experience from our consultancy practice shows that a "think big, start small" approach has the most potential for success. This means that implementing an EA capability is a process of iterative and incremental steps, based on a long-term vision. Each step in the process must add measurable value to the EA practice, and priorities should be based on the needs and the change capacity of the organization.

Combine process and modeling

The TOGAF framework and the ArchiMate modeling language are a powerful combination. Deliverables in the architecture process are more effective when based on an approach that combines formal models with powerful visualization capabilities.

The TOGAF standard describes the architecture process in detail. The Architecture Development Method (ADM) is the core of the TOGAF standard. The ADM is a customer-focused and value-driven process for the sustainable development of a business capability. The ADM specifies deliverables throughout the architecture life-cycle with a focus on the effective communication to a variety of stakeholders. ArchiMate is fully complementary to the content as specified in the TOGAF standard. The ArchiMate standard can be used to describe all aspects of the EA in a coherent way, while tailoring the content for a specific audience. Even more, an architecture repository is a valuable asset that can be reused throughout the enterprise. This greatly benefits communication and cooperation of Enterprise Architects and their stakeholders.

Use a tool!

It is true that "a fool with a tool is still a fool." In our teaching and consulting practice we have found, however, that adoption of a flexible and easy-to-use tool can be a strong driver in pushing the EA initiative forward.

EA brings together valuable information that greatly enhances decision making, whether on a strategic or a more operational level. This knowledge not only needs to be efficiently managed and maintained, it also needs to be communicated to the right stakeholder at the right time and, even more importantly, in the right format. EA has a diverse audience with business and technical backgrounds, and each stakeholder needs to be addressed in a language he or she understands. Therefore, the essential qualifications for EA tools are rigor in the management and maintenance of knowledge, and flexibility in the analysis (ad hoc, what-if, etc.), presentation and communication of the information to diverse audiences.

So what you are looking for is a tool with solid repository capabilities and flexible modeling and analysis functionality.

Conclusion

EA brings value to the organization because it answers more accurately the question: "How should we organize ourselves?" Standards help organizations capitalize on their EA investments more quickly. The TOGAF framework and the ArchiMate modeling language are popular, widespread, open and complete standards for EA, from both a process and a language perspective. EA becomes even more effective if these standards are used in the right way: the EA capability needs to be carefully embedded in the organization. This is usually a process based on a long-term vision, and it has the most potential for success if approached as "think big, start small." Enterprise Architects can benefit from tool support, provided that it supports flexible presentation of content, so that it can be tailored for communication to specific audiences.

More information on this subject can be found on our website: www.bizzdesign.com. Whitepapers are available for download, and our blog section features a number of very interesting posts regarding the subjects covered in this paper.

If you would like to know more or to comment on this blog, please do not hesitate to contact us directly!

Henry Franken

Henry Franken is the managing director of BiZZdesign and is chair of The Open Group ArchiMate Forum. As chair, Henry led the development of the ArchiMate Version 2.0 standard. He is a speaker at many conferences and has co-authored several international publications and Open Group White Papers. Henry is co-founder of the BPM-Forum. At BiZZdesign, Henry is responsible for research and innovation.

 

 

Sven van Dijk, MSc, is a consultant and trainer at BiZZdesign North America. He has worked as an application consultant on large-scale ERP implementations and as a business consultant on information management and IT strategy projects in industries such as finance and construction. He has nearly eight years of experience in applying structured methods and tools for Business Process Management and Enterprise Architecture.

 

Bas van Gils is a consultant, trainer and researcher for BiZZdesign. His primary focus is on the strategic use of enterprise architecture. Bas has worked in several countries, across a wide range of organizations in industry, retail and (semi-)governmental settings. Bas is passionate about his work, has published in various professional and academic journals and writes for several blogs.


Filed under ArchiMate®, Enterprise Architecture, TOGAF®

“New Now” Planning

By Stuart Boardman, KPN

In my last post I introduced the idea of "the new now," which I borrowed from Jack Martin Leith. I suggested that the planning of large transformation projects needs to focus more on the first step than on the end goal, because that first step, once taken, will be the "new now" – the reality with which the organization will have to work. There were some interesting comments that have helped me further develop my ideas. I was also pointed, via Twitter, to this interesting and completely independent piece that comes to very similar conclusions.

I promised to try to explain how this might work in practice, so here goes…

As I see it, we would start our transformation program by looking at both the first step and the long term vision more or less in parallel.

In order to establish what that first step should be, we need to ask what we want the “new now” to look like. If we could have a “new now” – right now – what would that be? In other words, what is it that we can’t do at the moment that we believe we really need to be able to do? This is a question that should be asked as broadly as possible across the organization. There are three reasons for that:

  1. We’ll probably come across a variety of opinions and we’ll need to know why they vary and why people think they are important, if we are to define something feasible and useful. It’s also possible that out of this mixture of views something altogether different may emerge.
  2. Changes in the relatively near future will tend to be changes to operational practices and those are best determined and managed by the part of the organization that performs them (see Stafford Beer’s Viable Systems Model and associated work by Patrick Hoverstadt and others).
  3. Everyone’s going to experience the “new now” (that’s why we call it the “new now”), so it would be good not to just drop it on them as if this were a new form of big bang. By involving them now, they’ll have known what’s coming and be more likely to accept it than if they were just “informed.” And at least we’ll know how people will react if the “new now” doesn’t meet their particular wishes.

This process addresses, I hope, both Ron van den Burg’s comment about different people having different “horizons” and an interesting observation made by Mark Skilton at The Open Group Conference in Newport Beach that at any one time an organization may have a large number of “strategies” in play.

The longer term perspective is about vision and strategy. What is the vision of the enterprise and what does it want to become? What are the strategies to achieve that? That’s something typically determined at the highest levels of an organization, even though one might hope these days that the whole organization would be able to contribute. For the moment, we’ll regard it as a board decision.

Maybe the board is perfectly happy and doesn’t need to change the vision or strategy. In that case we’re not talking about transformation, so let’s assume they do see a need to change something. A strategic change doesn’t necessarily have to affect the entire organization. It may be that the way a particular aspect of the enterprise’s mission is performed needs to be changed. Nonetheless if it’s at a strategic level it’s going to involve a transformation.

Now we can lay the "new now" and the long-term vision next to each other and see how well they fit. Is the first step indeed a step towards the vision? If not, we need to understand why. Traditionally, we would tend to say the first step must then be wrong. That's a possibility, but it's equally possible that the long-term view is simply too long-term and is missing key facts about the organization. The fact alone that the two don't fit may indicate a disconnect within the organization and require a different change altogether. So simply by performing this action, we are addressing one of the risks to a transformation project. If we had simply defined the first step based on the long-term vision, we'd probably have missed it. If, however, the fit is indeed good, then we know we have organizational buy-in for the transformation.

Once we have broad alignment, we need to re-examine the first step for feasibility. It mustn't be more ambitious than we can deliver within a reasonable time and budget. Nothing new there. What is different is that while we require the first step to be aware of the long-term vision, we don't expect it to put a platform in place for everything the future may bring. That's exactly what it shouldn't do, because the only thing we know for certain is that we need to be adaptable to change.

What about the second step? We've delivered the first step. We're at the "new now." How does that feel? Where would we like to be now? This is essentially an iteration of the process we used for the first step. There's a strong chance that we'll get a different result than we would have had if we'd planned this second step back at the beginning. After all, we have a new "now," so our starting state is something that we couldn't experience back then. We also need to revisit the vision/strategy aspect. The world (the Environment in VSM terms) will not have stood still in the meantime. One would hope that our vision wasn't so fragile that it would change drastically, but at the very least we need to re-validate it.

So now we can compare the new next step and the (revised) vision, just as we did with our first step. And then we move on.

So what this process comes down to is essentially a series of movements to a “new now.” After each movement we have a new reality. So yes, we’re still planning. We’re just not making hard plans for fuzzy objectives. Our planning process is as flexible as our results need to be. Of course that doesn’t mean we can’t start thinking about step two before we actually arrive at step one but these plans only become concrete when we know what the “new now” feels like and therefore exactly what the following “new now” should be.

In their comments on the previous blog, both Matt Kern and Peter Bakker made the reasonable point that without a plan you're probably not going to get funding. The other side of the coin is that these days (and actually for a few years now) it's increasingly difficult to get funding for multi-year transformation processes, exactly because the return on investment takes too long – and is too uncertain. That's exactly what I'm trying to address. The fundamental concept of "new now" planning is that something of agreed value is delivered within an acceptable timescale. Isn't that more likely to get funding?

Once again, I’d be delighted to see people’s reaction to these ideas. I’m 100 percent certain they can be improved.

Stuart Boardman is a Senior Business Consultant with KPN where he co-leads the Enterprise Architecture practice as well as the Cloud Computing solutions group. He is co-lead of The Open Group Cloud Computing Work Group’s Security for the Cloud and SOA project and a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by the Information Security Platform (PvIB) in The Netherlands and of his previous employer, CGI. He is a frequent speaker at conferences on the topics of Cloud, SOA, and Identity. 


Filed under Enterprise Architecture

How Should we use Cloud?

By Chris Harding, The Open Group

How should we use Cloud? This is the key question at the start of 2013.

The Open Group® conferences in recent years have thrown light on "What is Cloud?" and "Should we use Cloud?" It is time to move on.

Cloud as a Distributed Processing Platform

The question is an interesting one, because the answer is not necessarily, “Use Cloud resources just as you would use in-house resources.” Of course, you can use Cloud processing and storage to replace or supplement what you have in-house, and many companies are doing just that. You can also use the Cloud as a distributed computing platform, on which a single application instance can use multiple processing and storage resources, perhaps spread across many countries.

It’s a bit like contracting a company to do a job, rather than hiring a set of people. If you hire a set of people, you have to worry about who will do what when. Contract a company, and all that is taken care of. The company assembles the right people, schedules their work, finds replacements in case of sickness, and moves them on to other things when their contribution is complete.

This doesn’t only make things easier, it also enables you to tackle bigger jobs. Big Data is the latest technical phenomenon. Big Data can be processed effectively by parceling the work out to multiple computers. Cloud providers are beginning to make the tools to do this available, using distributed file systems and map-reduce. We do not yet have, “Distributed Processing as a Service” – but that will surely come.

Distributed Computing at the Conference

Big Data is the main theme of the Newport Beach conference. The plenary sessions have keynote presentations on Big Data, including the crucial aspect of security, and there is a Big Data track that explores in depth its use in Enterprise Architecture.

There are also Cloud tracks that explore the business aspects of using Cloud and the use of Cloud in Enterprise Architecture, including a session on its use for Big Data.

Service orientation is generally accepted as a sound underlying principle for systems using both Cloud and in-house resources. The Service Oriented Architecture (SOA) movement focused initially on its application within the enterprise. We are now looking to apply it to distributed systems of all kinds. This may require changes to specific technology and interfaces, but not to the fundamental SOA approach. The Distributed Services Architecture track contains presentations on the theory and practice of SOA.

Distributed Computing Work in The Open Group

Many of the conference presentations are based on work done by Open Group members in the Cloud Computing, SOA and Semantic Interoperability Work Groups, and in the Architecture, Security and Jericho Forums. The Open Group enables people to come together to develop standards and best practices for the benefit of the architecture community. We have active Work Groups and Forums working on artifacts such as a Cloud Computing Reference Architecture, a Cloud Portability and Interoperability Guide, and a Guide to the use of TOGAF® framework in Cloud Ecosystems.

The Open Group Conference in Newport Beach

Our conferences provide an opportunity for members and non-members to discuss ideas together. This happens not only in presentations and workshops, but also in informal discussions during breaks and after the conference sessions. These discussions benefit future work at The Open Group. They also benefit the participants directly, enabling them to bring to their enterprises ideas that they have sounded out with their peers. People from other companies can often bring new perspectives.

Most enterprises now know what Cloud is. Many have identified specific opportunities where they will use it. The challenge now for enterprise architects is determining how best to do this, either by replacing in-house systems, or by using the Cloud’s potential for distributed processing. This is the question for discussion at The Open Group Conference in Newport Beach. I’m looking forward to an interesting conference!

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.


Filed under Cloud, Conference

Successful Enterprise Architecture using the TOGAF® and ArchiMate® Standards

By Henry Franken, BiZZdesign

The discipline of Enterprise Architecture was developed in the 1980s with a strong focus on the information systems landscape of organizations. Since those days, the scope of the discipline has slowly widened to include more and more aspects of the enterprise as a whole. This holistic perspective takes into account the concerns of a wide variety of stakeholders. Architects, especially at the strategic level, attempt to answer the question “How should we organize ourselves in order to be successful?”

An architecture framework is a foundational structure, or set of structures, which can be used for developing a broad range of different architectures, and consists of a process and a modeling component. The TOGAF® framework and the ArchiMate® modeling language – both maintained by The Open Group® – are the two leading standards in this field.


Much has been written on this topic in online forums, whitepapers, and blogs. On the BiZZdesign blog we have published several series on EA in general and these standards in particular, with a strong focus on the question: what should we do to be successful with EA using the TOGAF framework and the ArchiMate modeling language? I would like to summarize some of our findings here:

Tip 1: One of the key success factors for EA is to deliver value early on. We have found that organizations that combine a long-term vision with incremental delivery ("think big, act small") have a larger chance of developing an effective EA capability.
 
Tip 2: Combine process and modeling. The TOGAF framework and the ArchiMate modeling language are a powerful combination. Deliverables in the architecture process are more effective when based on an approach that combines formal models with powerful visualization capabilities. Even more, an architecture repository is a valuable asset that can be reused throughout the enterprise.
 
Tip 3: Use a tool! It is true that "a fool with a tool is still a fool." In our teaching and consulting practice we have found, however, that adoption of a flexible and easy-to-use tool can be a strong driver in pushing the EA initiative forward.

There will be several interesting presentations on this subject at the upcoming Open Group conference (Newport Beach, CA, USA, January 28 – 31: Look here), ranging from theory to case practice, focusing on getting started with EA as well as on advanced topics.

I will also present on this subject and will elaborate on the combined use of The Open Group standards for EA. I gladly invite you to join me at the panel sessions, and I look forward to seeing you there!

Henry Franken is the managing director of BiZZdesign and is chair of The Open Group ArchiMate Forum. As chair, Henry led the development of the ArchiMate Version 2.0 standard. He is a speaker at many conferences and has co-authored several international publications and Open Group White Papers. Henry is co-founder of the BPM-Forum. At BiZZdesign, Henry is responsible for research and innovation.


Filed under ArchiMate®, Enterprise Architecture, TOGAF®

The Death of Planning

By Stuart Boardman, KPN

If I were to announce that planning large scale transformation projects was a waste of time, you’d probably think I’d taken leave of my senses. And yet, somehow this thought has been nagging at me for some time now. Bear with me.

It’s not so long ago that we still had debates about whether complex projects should be delivered as a “big bang” or in phases. These days the big bang has pretty much been forgotten. Why is that? I think the main reason is the level of risk involved with running a long process and dropping it into the operational environment just like that. This applies to any significant change, whether related to a business model and processes or IT architecture or physical building developments. Even if it all works properly, the level of sudden organizational change involved may stop it in its tracks.

So it has become normal to plan the change as a series of phases. We develop a roadmap to get us from here (as-is) to the end goal (to-be). And this is where I begin to identify the problem.

A few months ago I spent an enjoyable and thought provoking day with Jack Martin Leith (@jackmartinleith). Jack is a master in demystifying clichés but when he announced his irritation with “change is a journey,” I could only respond, “but Jack, it is.” What Jack made me see is that, whilst the original usage was a useful insight, it’s become a cliché which is commonly completely misused. It results in some pretty frustrating journeys! To understand that let’s take the analogy literally. Suppose your objective is to travel to San Diego but there are no direct flights from where you live. If the first step on your journey is a 4 hour layover at JFK, that’s at best a waste of your time and energy. There’s no value in this step. A day in Manhattan might be a different story. We can (and do) deal with this kind of thing for journeys of a day or so but imagine a journey that takes three or more years and all you see on the way is the inside of airports.

My experience has been that the same problem too often manifests itself in transformation programs. The first step may be logical from an implementation perspective, but it delivers no discernible value (tangible or intangible). It’s simply a validation that something has been done, as if, in our travel analogy, we were celebrating travelling the first 1000 kilometers, even if that put us somewhere over the middle of Lake Erie.

What would be better? An obvious conclusion that many have drawn is that we need to ensure every step delivers business value but that’s easier said than done.

Why is it so hard? The next thing Jack said helped me understand why. His point is that when you’ve taken the first step on your journey, it’s not just some intermediate station. It’s the “new now.” The new reality. The new as-is. And if the new reality is hanging around in some grotty airport trying to do your job via a Wi-Fi connection of dubious security and spending too much money on coffee and cookies…….you get the picture.

The problem with identifying that business value is that we’re not focusing on the new now but on something much more long-term. We’re trying to interpolate the near term business value out of the long term goal, which wasn’t defined based on near term needs.

What makes this all the more urgent is the increasing rate and unpredictability of change – in all aspects of doing business. This has led us to shorter planning horizons and an increasing tendency to regard that “to be” as nothing more than a general sense of direction. We’re thinking, “If we could deliver the whole thing really, really quickly on the basis of what we know we’d like to be able to do now, if it were possible, then it would look like this” – but knowing all the time that by the time we get anywhere near that end goal, it will have changed. It’s pretty obvious then that a first step, whose justification is entirely based on that imagined end goal, can easily be of extremely limited value.

So why not put more focus on the first step? That’s going to be the “new now.” How about making that our real target? Something that the enterprise sees as real value and that is actually feasible in a reasonable time scale (whatever that is). Instead of scoping that step as an intermediate (and rather immature) layover, why not put all our efforts into making it something really good? And when we get there and people know how the new now looks and feels, we can all think afresh about where to go next. After all, a journey is not simply defined by its destination but by how you get there and what you see and do on the way. If the actual journey itself is valuable, we may not want to get to the end of it.

Now that doesn’t mean we have to forget all about where we might want to be in three or even five years — not at all. The long term view is still important in helping us to make smart decisions about shorter term changes. It helps us allow for future change, even if only because it lets us see how much might change. And that helps us make sound decisions. But we should accept that our three or five year horizon needs to be continually open to revision – not on some artificial yearly cycle but every time there’s a “new now.” And this needs to include the times where the new now is not something we planned but is an emergent development from within or outside of the enterprise or is due to a major regulatory or market change.

So, if the focus is all on the first step and if our innovation cycle is getting steadily shorter, what’s the value of planning anything? Relax, I’m not about to fire the entire planning profession. If you don’t plan how you’re going to do something, what your dependencies are, how to react to the unexpected, etc., you’re unlikely to achieve your goal at all. Arguably that’s just project planning.

What about program planning? Well, if the program is so exposed to change maybe our concept of program planning needs to change. Instead of the plan being a thing fixed in stone that dictates everything, it could become a process in which the whole enterprise participates – itself open to emergence. The more I think about it, the more appealing that idea seems.

In my next post, I’ll go into more detail about how this might work, in particular from the perspective of Enterprise Architecture. I’ll also look more at how “the new planning” relates to innovation, emergence and social business and at the conflicts and synergies between these concerns. In the meantime, feel free to throw stones and see where the story doesn’t hold up.

Stuart Boardman is a Senior Business Consultant with KPN where he co-leads the Enterprise Architecture practice as well as the Cloud Computing solutions group. He is co-lead of The Open Group Cloud Computing Work Group’s Security for the Cloud and SOA project and a founding member of both The Open Group Cloud Computing Work Group and The Open Group SOA Work Group. Stuart is the author of publications by the Information Security Platform (PvIB) in The Netherlands and of his previous employer, CGI. He is a frequent speaker at conferences on the topics of Cloud, SOA, and Identity. 

Filed under Enterprise Architecture, Uncategorized

Flying in the Cloud by the Seat of Our Pants

By Chris Harding, The Open Group

In the early days of aviation, when instruments were unreliable or non-existent, pilots often had to make judgments by instinct. This was known as “flying by the seat of your pants.” It was exciting, but error-prone, and accidents were frequent. Today, enterprises are in that position with Cloud Computing.

Staying On Course

Flight navigation does not end with programming the flight plan. The navigator must check throughout the flight that the plane is on course. Successful use of Cloud requires not only an understanding of what it can do for the business, but also continuous monitoring that it is delivering value as expected. A change of service-level, for example, can have as much effect on a user enterprise as a change of wind speed on an aircraft.

The Open Group conducted a Cloud Return on Investment (ROI) survey in 2011. Then, 55 percent of those surveyed felt that Cloud ROI would be easy to evaluate and justify, although only 35 percent had mechanisms in place to do it. When we repeated the survey in 2012, we found that the proportion that thought it would be easy had gone down to 44 percent, and only 20 percent had mechanisms in place. This shows, arguably, more realism, but it certainly doesn’t show any increased tendency to monitor the value delivered by Cloud. In fact, it shows the reverse. The enterprise pilots are flying by the seats of their pants. (The full survey results are available at http://www.opengroup.org/sites/default/files/contentimages/Documents/cloud_roi_formal_report_12_19_12-1.pdf)

They Have No Instruments

It is hard to blame the pilots for this, because they really do not have the instruments. The Open Group published a book in 2011, Cloud Computing for Business, that explains how to evaluate and monitor Cloud risk and ROI, with spreadsheet examples. The spreadsheet is pretty much the state of the art in Cloud ROI instrumentation. Like a compass, it is robust and functional at a basic level, but it does not have the sophistication and accuracy of a satellite navigation system. If we want better navigation, we must have better systems.
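The core of such a spreadsheet can be sketched in a few lines. The simple NPV-based model below, and all of its figures, are purely illustrative assumptions and are not taken from the book:

```python
# Hedged sketch of a basic Cloud ROI calculation, in the spirit of the
# spreadsheet examples in "Cloud Computing for Business". All figures are
# invented for illustration.

def cloud_roi(annual_benefits, annual_costs, discount_rate):
    """Return net present value and a simple ROI for a stream of cash flows."""
    npv = sum((b - c) / (1 + discount_rate) ** year
              for year, (b, c) in
              enumerate(zip(annual_benefits, annual_costs), start=1))
    total_cost = sum(annual_costs)
    roi = npv / total_cost  # ROI relative to total (undiscounted) cost
    return npv, roi

# Illustrative three-year projection: benefits ramp up as adoption grows.
benefits = [100_000, 180_000, 220_000]   # hypothetical annual benefits
costs    = [150_000,  80_000,  80_000]   # hypothetical annual costs
npv, roi = cloud_roi(benefits, costs, discount_rate=0.10)
print(f"NPV: {npv:,.0f}  ROI: {roi:.1%}")
```

Monitoring then means re-running the model with actual figures as the service operates; a sustained drop in the computed ROI is the analogue of drifting off course.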

There is scope for Enterprise Architecture tool vendors to fill this need. As the inclusion of Cloud in Enterprise Architectures becomes commonplace, and Cloud Computing metrics and their relation to ROI become better understood, it should be possible to develop the financial components of Enterprise Architecture modeling tools so that the business impact of the Cloud systems can be seen more clearly.

The Enterprise Flight Crew

But this is not just down to the architects. The architecture is translated into systems by developers, and the systems are operated by operations staff. All of these people must be involved in the procurement and configuration of Cloud services and their monitoring through the Cloud buyers’ life cycle.

Cloud is already bringing development and operations closer together. The concept of DevOps, a paradigm that stresses communication, collaboration and integration between software developers and IT operations professionals, is increasingly being adopted by enterprises that use Cloud Computing. This communication, collaboration and integration must involve – indeed must start with – enterprise architects, and it must include the establishment and monitoring of Cloud ROI models. All of these professionals must co-operate to ensure that the Cloud-enabled enterprise keeps to its financial course.

The Architect as Pilot

The TOGAF® architecture development method includes a phase (Phase G) in which the architects participate in implementation governance. The following Phase H is currently devoted to architecture change management, with the objectives of ensuring that the architecture lifecycle is maintained, the architecture governance framework is executed, and the Enterprise Architecture capability meets current requirements. Perhaps Cloud architects should also think about ensuring that the system meets its business requirements, and continues to do so throughout its operation. They can then revisit earlier phases of the architecture development cycle (always a possibility in TOGAF) if it does not.

Flying the Cloud

Cloud Computing compresses the development lifecycle, cutting the time to market of new products and the time to operation of new enterprise systems. This is a huge benefit. It implies closer integration of architecture, development and operations. But this must be supported by proper instrumentation of the financial parameters of Cloud services, so that the architecture, development and operations professionals can keep the enterprise on course.

Flying by the seat of the pants must have been a great experience for the magnificent men in the flying machines of days gone by, but no one would think of taking that risk with the lives of 500 passengers on a modern aircraft. The business managers of a modern enterprise should not have to take that risk either. We must develop standard Cloud metrics and ROI models, so that they can have instruments to measure success.

Dr. Chris Harding is Director for Interoperability and SOA at The Open Group. He has been with The Open Group for more than ten years, and is currently responsible for managing and supporting its work on interoperability, including SOA and interoperability aspects of Cloud Computing. He is a member of the BCS, the IEEE and the AEA, and is a certified TOGAF practitioner.

Filed under Cloud/SOA

2013 Open Group Predictions, Vol. 1

By The Open Group

A big thank you to all of our members and staff who have made 2012 another great year for The Open Group. There were many notable achievements this year, including the release of ArchiMate 2.0, the launch of the Future Airborne Capability Environment (FACE™) Technical Standard and the publication of the SOA Reference Architecture (SOA RA) and the Service-Oriented Cloud Computing Infrastructure Framework (SOCCI).

As we wrap up 2012, we couldn’t help but look towards what is to come in 2013 for The Open Group and the industries we’re a part of. Without further ado, here they are:

Big Data
By Dave Lounsbury, Chief Technical Officer

Big Data is on top of everyone’s mind these days. Consumerization, mobile smart devices, and expanding retail and sensor networks are generating massive amounts of data on behavior, environment, location, buying patterns and more – producing what is being called “Big Data”. In addition, as the use of personal devices and social networks continues to gain popularity, so does the expectation of access to such data, and the computational power to use it, anytime, anywhere. Organizations will turn to IT to restructure their services to meet the growing expectation of control of and access to data.

Organizations must embrace Big Data to drive their decision-making and to provide the optimal mix of services to customers. Big Data is becoming so big that the big challenge is how to use it to make timely decisions. IT naturally focuses on collecting data, so Big Data itself is not the issue. To allow humans to keep on top of this flood of data, industry will need to move away from programming computers to store and process data and towards teaching computers how to assess large amounts of uncorrelated data and draw inferences from it on their own. We also need to start thinking about the skills that people need in the IT world, not only to handle Big Data, but to make it actionable. Do we need “Data Architects” and, if so, what would their role be?

In 2013, we will see the beginning of the Intellectual Computing era. IT will play an essential role in this new era and will need to help enterprises look at uncorrelated data to find the answer.

Security

By Jim Hietala, Vice President of Security

As 2012 comes to a close, some of the big developments in security over the past year include:

  • Continuation of hacktivism attacks.
  • Increase of significant and persistent threats targeting government and large enterprises.
  • The notable U.S. National Strategy for Trusted Identities in Cyberspace started to make progress in the second half of the year, in terms of industry and government movement to address fundamental security issues.
  • Security breaches discovered by third parties, where the affected organizations had no idea that they had been breached. Data from the 2012 Verizon report suggests that 92 percent of breached companies were notified by a third party.
  • Acknowledgement from senior U.S. cybersecurity professionals that organizations fall into two groups: those that know they’ve been penetrated, and those that have been penetrated, but don’t yet know it.

In 2013, we’ll no doubt see more of the same on the attack front, plus increased focus on mobile attack vectors. We’ll also see more focus on detective security controls, reflecting greater awareness of the threat and of the reality that many large organizations have already been penetrated; responding appropriately therefore requires far more attention to detection and incident response.

We’ll also likely see the U.S. move forward with cybersecurity guidance from the executive branch, in the form of a Presidential directive. New national cybersecurity legislation seemed to come close to happening in 2012, and when it failed to become a reality, there were many indications that the administration would make something happen by executive order.

Enterprise Architecture

By Leonard Fehskens, Vice President of Skills and Capabilities

Preparatory to my looking back at 2012 and forward to 2013, I reviewed what I wrote last year about 2011 and 2012.

Probably the most significant thing from my perspective is that so little has changed. In fact, I think in many respects the confusion about what Enterprise Architecture (EA) and Business Architecture are about has gotten worse.

The stress within the EA community continues to grow as both the demands being placed on it and the diversity of opinion within it increase. This year, I saw a lot more concern about the value proposition for EA, but not a lot of (read “almost no”) convergence on what that value proposition is.

Last year I wrote “As I expected at this time last year, the conventional wisdom about Enterprise Architecture continues to spin its wheels.”  No need to change a word of that. What little progress at the leading edge was made in 2011 seems to have had no effect in 2012. I think this is largely a consequence of the dust thrown in the eyes of the community by the ascendance of the concept of “Business Architecture,” which is still struggling to define itself.  Business Architecture seems to me to have supplanted last year’s infatuation with “enterprise transformation” as the means of compensating for the EA community’s entrenched IT-centric perspective.

I think this trend and the quest for a value proposition are symptomatic of the same thing — the urgent need for Enterprise Architecture to make its case to its stakeholder community, especially to the people who are paying the bills. Something I saw in 2011 that became almost epidemic in 2012 is conflation — the inclusion under the Enterprise Architecture umbrella of nearly anything with the slightest taste of “business” to it. This has had the unfortunate effect of further obscuring the unique contribution of Enterprise Architecture, which is to bring architectural thinking to bear on the design of human enterprise.

So, while I’m not quite mired in the slough of despond, I am discouraged by the community’s inability to advance the state of the art. In a private communication to some colleagues I wrote, “the conventional wisdom on EA is at about the same state of maturity as 14th century cosmology. It is obvious to even the most casual observer that the earth is both flat and the center of the universe. We debate what happens when you fall off the edge of the Earth, and is the flat earth carried on the back of a turtle or an elephant?  Does the walking of the turtle or elephant rotate the crystalline sphere of the heavens, or does the rotation of the sphere require the turtlephant to walk to keep the earth level?  These are obviously the questions we need to answer.”

Cloud

By Chris Harding, Director of Interoperability

2012 has seen the establishment of Cloud Computing as a mainstream resource for enterprise architects and the emergence of Big Data as the latest hot topic, likely to be mainstream for the future. Meanwhile, Service-Oriented Architecture (SOA) has kept its position as an architectural style of choice for delivering distributed solutions, and the move to ever more powerful mobile devices continues. These trends have been reflected in the activities of our Cloud Computing Work Group and in the continuing support by members of our SOA work.

The use of Cloud, Mobile Computing, and Big Data to deliver on-line systems that are available anywhere at any time is setting a new norm for customer expectations. In 2013, we will see the development of Enterprise Architecture practice to ensure the consistent delivery of these systems by IT professionals, and to support the evolution of creative new computing solutions.

IT systems are there to enable the business to operate more effectively. Customers expect constant on-line access through mobile and other devices. Business organizations work better when they focus on their core capabilities, and let external service providers take care of the rest. On-line data is a huge resource, so far largely untapped. Distributed, Cloud-enabled systems, using Big Data, and architected on service-oriented principles, are the best enablers of effective business operations. There will be a convergence of SOA, Mobility, Cloud Computing, and Big Data as they are seen from the overall perspective of the enterprise architect.

Within The Open Group, the SOA and Cloud Work Groups will continue their individual work, and will collaborate with other forums and work groups, and with outside organizations, to foster the convergence of IT disciplines for distributed computing.

Filed under Business Architecture, Cloud, Cloud/SOA, Cybersecurity, Enterprise Architecture

The Center of Excellence: Relating Everything Back to Business Objectives

By Serge Thorn, Architecting the Enterprise

This is the third and final installment of a series discussing how to implement SOA through TOGAF®. In my first blog post I explained the concept of the Center of Excellence and how to create a vision for your organization; my second blog post suggested how the Center of Excellence would define a Reference Architecture for the organization.

SOA principles should clearly relate back to the business objectives and key architecture drivers. They will be constructed in the same manner as TOGAF 9.1 principles, with a statement, rationale and implications for each. Below are examples of the kinds of principles that may be created:

  • Put the computing near the data
  • Services are technology neutral
  • Services are consumable
  • Services are autonomous
  • Services share a formal contract
  • Services are loosely coupled
  • Services abstract underlying logic
  • Services are reusable
  • Services are composable
  • Services are stateless
  • Services are discoverable
  • Location Transparency

Here is a detailed principle example:

  • Service invocation
    • All service invocations between application silos will be exposed through the Enterprise Service Bus (ESB)
    • The only exception to this principle will be when the service meets all the following criteria:
      • It will be used only within the same application silo
      • There is no potential right now or in the near future for re-use of this service
      • The service has already been right-sized
      • The Review Team has approved the exception
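Several of the principles listed above – a formal contract, statelessness, loose coupling – can be illustrated in miniature. The sketch below is purely hypothetical; the service, message types and prices are invented and are not drawn from TOGAF or any ESB product:

```python
from dataclasses import dataclass

# Illustrative only: a tiny "service" whose formal contract is expressed as
# typed request/response messages. All names and values are invented.

@dataclass(frozen=True)
class QuoteRequest:          # the service's formal contract: input message
    product_id: str
    quantity: int

@dataclass(frozen=True)
class QuoteResponse:         # ...and output message
    product_id: str
    total_price: float

UNIT_PRICES = {"WIDGET-1": 9.50}  # stand-in for a backing data source

def quote_service(request: QuoteRequest) -> QuoteResponse:
    """Stateless: the response depends only on the request and backing data,
    never on earlier invocations, so the service is composable and reusable."""
    price = UNIT_PRICES[request.product_id] * request.quantity
    return QuoteResponse(request.product_id, round(price, 2))

print(quote_service(QuoteRequest("WIDGET-1", 3)))
# QuoteResponse(product_id='WIDGET-1', total_price=28.5)
```

Because consumers depend only on the message contract, the implementation behind `quote_service` can change freely – the essence of loose coupling.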

As previously indicated, the SOA Center of Excellence (CoE) would also have to provide guidelines on SOA processes and related technologies. This may include:

  • Service analysis (Enterprise Architecture, BPM, OO, requirements and models, UDDI Model)
  • Service design (SOAD, specification, Discovery Process, Taxonomy)
  • Service provisioning (SPML, contracts, SLA)
  • Service implementation development (BPEL, SOAIF)
  • Service assembly and integration (JBI, ESB)
  • Service testing
  • Service deployment (the software on the network)
  • Service discovery (UDDI, WSIL, registry)
  • Service publishing (SLA, security, certificates, classification, location, UDDI, etc.)
  • Service consumption (WSDL, BPEL)
  • Service execution  (WSDM)
  • Service versioning (UDDI, WSDL)
  • Service Management and monitoring
  • Service operation
  • Programming, granularity and abstraction
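The registry-centred steps in this lifecycle – publishing, discovery and versioning – can be caricatured with a small in-memory registry. This is a toy stand-in for a real UDDI or similar registry product; every name and endpoint below is invented:

```python
# Toy in-memory service registry illustrating the publish, discover and
# version steps of the lifecycle above. A real deployment would use a UDDI
# or comparable registry; everything here is an invented stand-in.

class ServiceRegistry:
    def __init__(self):
        self._services = {}  # name -> {version: endpoint}

    def publish(self, name, version, endpoint):
        self._services.setdefault(name, {})[version] = endpoint

    def discover(self, name, version=None):
        """Return the endpoint for a version, or the latest if none given."""
        versions = self._services[name]
        if version is None:
            version = max(versions)  # naive "latest" rule for the sketch
        return versions[version]

registry = ServiceRegistry()
registry.publish("CustomerLookup", "1.0", "http://esb.example/customer/v1")
registry.publish("CustomerLookup", "1.1", "http://esb.example/customer/v1.1")
print(registry.discover("CustomerLookup"))         # latest version
print(registry.discover("CustomerLookup", "1.0"))  # pinned version
```

Side-by-side versions let consumers pin to a contract while new ones adopt the latest – the behaviour a versioning policy (UDDI, WSDL) has to govern.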

Other activities may be considered by the SOA CoE, such as providing a collaboration platform, asset management (services are just another type of asset), compliance with standards and best practices, use of guidelines, etc. These activities could also be supported by an Enterprise Architecture team.

As described in the TOGAF® 9.1 Framework, the SOA CoE can act as the governance body for SOA implementation, working with the Enterprise Architecture team, overseeing what goes into a new architecture that the organization is creating, and ensuring that the architecture will meet the current and future needs of the organization.

The Center of Excellence provides expanded opportunities for organizations to leverage and reuse a service-oriented infrastructure and knowledge base to facilitate the implementation of cost-effective and timely SOA-based solutions.

Serge Thorn is CIO of Architecting the Enterprise. He has worked in the IT industry for over 25 years in a variety of roles, including Development and Systems Design, Project Management, Business Analysis, IT Operations, IT Management, IT Strategy, Research and Innovation, IT Governance, Architecture and Service Management (ITIL). He is the Chairman of the itSMF (IT Service Management Forum) Swiss chapter and is based in Geneva, Switzerland.

Filed under Cloud/SOA, Enterprise Architecture, Standards, TOGAF, TOGAF®, Uncategorized

Implementing SOA through TOGAF 9.1: The Center Of Excellence

By Serge Thorn, Architecting the Enterprise

This is the first installment of a three-part series discussing how to be successful in implementing an SOA initiative through TOGAF® 9.1.

Service-oriented architecture (SOA) has at times been challenged, but it is now on the verge of mainstream acceptance, showing maturity, success and even signs of popularity. SOA is an enterprise-scale architecture for linking resources as needed. These resources are represented as business-aligned services, which can participate in, and be composed into, a set of choreographed processes to fulfil business needs.

In 2012, the use of SOA for pivotal emerging technologies, especially for mobile applications and cloud computing, suggests that the future prospect for SOA is favourable. SOA and cloud will begin to fade as differentiating terms because they will simply be “the way we do things”. We are now at the point where everything we deploy is done in a service-oriented way, and cloud is simply accepted as the delivery platform for applications and services. Many Enterprise Architects are also wondering if the mobile business model will drive SOA technologies in a new direction. Meanwhile, a close look at mobile application integration today tells us that pressing mobile trends will prompt IT and business leaders to ensure a mobile-friendly infrastructure.

To be successful in implementing a SOA initiative, it is highly recommended that a company create a SOA Center of Excellence (CoE), and The Open Group clearly explains how this can be achieved through the use of TOGAF® 9.1. This article is based on the TOGAF® 9.1 Framework specification, specifically section 22.7.1.3, Partitions and Centers of Excellence, with some additional thoughts on sections 22.7.1.1, Principle of Service-Orientation, and 22.7.1.2, Governance and Support Strategy.

I have looked at the various attributes and provided further explanations, or referred to previous experiences based on existing CoEs, sometimes called Integration Competency Centers.

The figure below illustrates a SOA CoE as part of the Enterprise Architecture team, with domain and solution architects as well as developers, Quality Assurance (QA) specialists, and Business Architects and Analysts coming from a delivery organization.

Part 1 Image

Establishing a SOA Center of Excellence

The SOA CoE supports methodologies, standards, governance processes and manages a service registry. The main goal of this core group is to establish best practices at design time to maximize reusability of services.

According to the TOGAF 9.1 Framework specification, a successful CoE will have several key attributes, including “a clear definition of the CoE’s mission: why it exists, its scope of responsibility, and what the organization and the architecture practice should expect from the CoE.”

Define a Vision

A SOA CoE must have a purpose. What do we want to achieve? What are the problems we need to solve?

It may sound obvious, but having a blueprint for SOA is critical. It is very easy for companies, especially large enterprises with disparate operations, to buy new technologies or integrate applications without regard to how they fit into the overall plan. The challenge in building a SOA is to keep people, including IT and business-side staff, focused on the Enterprise Architecture goals.

In order to realize the vision of SOA the following topics should be addressed:

  • What to Build: A Reference Architecture
  • How to Build: Service-Oriented Modeling Method
  • Whether to build: Assessments, Roadmaps, and Maturity Evaluations
  • Guidance on Building: Architectural and Design Patterns
  • Oversight: Governance
  • How to Build: Standards and Tools

The SOA CoE would first have a vision which could be something like:

“ABCCompany will effectively utilize SOA in order to achieve organizational flexibility and improve responsiveness to our customers.”

Then a mission statement should be communicated across the organization. Below are a few examples of mission statements:

“To enable dynamic linkage among application capabilities in a manner that facilitates business effectiveness, maintainability, customer satisfaction, rapid deployment, reuse, performance and successful implementation.”

“The mission of the CoE for SOA at ABCCompany is to promote, adopt, support the development and usage of ABCCompany standards, best practices, technologies and knowledge in the field of SOA and have a key role in the business transformation of ABCCompany. The CoE will collaborate with the business to create an agile organization, which in turn will facilitate ABCCompany to accelerate the creation of new products and services for the markets, better serve its customers, and better collaborate with partners and vendors.”

Define a Structure

The SOA CoE also needs to define a structure and the various interactions with the enterprise architecture team, the project management office, the business process/planning and strategy group, the product management group, etc.

The SOA CoE also needs to create a steering committee or board (which could be associated to an architecture board) to provide different types of support:

  • Architecture decision support
    • Maintain standards, templates and policies surrounding Integration and SOA
    • Participate in Integration and SOA design decisions
  • Operational support
    • Responsible for building and maintaining SOA Infrastructure
    • Purchasing registries and products to grow infrastructure
  • Development support
    • Development of administrative packages and services
    • Develop enterprise services based on strategic direction

Define Measurements

According to the TOGAF® 9.1 Framework Specification, “Clear goals for the CoE including measurements and Key Performance Indicators (KPIs). It is important to ensure that the measures and KPIs of the CoE do not drive inappropriate selection of SOA as the architecture style.”

Measurements and metrics will have to be identified. The common ones could be:

  • Service revenue
  • Service vitality
  • Ratio between services used and those created
  • Mean Time To Service Development or Service change
  • Service availability
  • Service reuse
  • Quality assurance
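A couple of these metrics are simple ratios that can be computed directly from registry data. The sketch below assumes an invented registry-extract format for the sake of illustration:

```python
# Hedged sketch: computing two CoE metrics (the used-vs-created ratio and
# average service reuse) from an invented registry extract.

services = [  # hypothetical extract: service -> known consumers
    {"name": "CustomerLookup",  "consumers": ["CRM", "Billing", "Portal"]},
    {"name": "AddressValidate", "consumers": ["CRM"]},
    {"name": "LegacyFax",       "consumers": []},   # created but never used
]

used = [s for s in services if s["consumers"]]
used_vs_created = len(used) / len(services)
avg_reuse = sum(len(s["consumers"]) for s in used) / len(used)

print(f"used/created ratio: {used_vs_created:.2f}")           # 2 of 3 used
print(f"average consumers per used service: {avg_reuse:.1f}")
```

Tracking these figures over time is what turns them into KPIs: a falling used-vs-created ratio, for example, suggests services are being built that nobody consumes.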

Define Testing Activities

As stated in the TOGAF® 9.1 Framework specification, “The CoE will provide the “litmus test” of a good service.”

Clearly, comprehensive testing activities must be described by the SOA CoE. In addition to a set of defined processes for Web Service Definition Language (WSDL) testing, functional unit testing, regression testing, security testing, interoperability testing, vulnerability testing and load and performance testing, an analysis tool suite may be used to address the unique testing and validation needs of service-oriented architectures.

Such a suite helps test the message-layer functionality of services by automating testing, and supports numerous transport protocols. A few examples include HTTP/1.0, HTTP/1.1, JMS, MQ, RMI, SMTP, .NET WCF HTTP, .NET WCF TCP, Electronic Data Interchange, ESBs, etc.

Only by adopting a comprehensive testing stance can enterprises ensure that their SOA is robust, scalable, interoperable and secure.
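In miniature, a message-layer functional test checks that a well-formed request yields the contracted response and that a malformed one yields a fault rather than a crash. The unittest sketch below exercises an invented local handler rather than a real transport:

```python
import unittest

def handle_order_message(message: dict) -> dict:
    """Invented stand-in for a service's message-layer handler."""
    if "order_id" not in message:
        return {"status": "FAULT", "reason": "missing order_id"}
    return {"status": "OK", "order_id": message["order_id"]}

class MessageLayerTest(unittest.TestCase):
    def test_valid_message(self):
        reply = handle_order_message({"order_id": "42"})
        self.assertEqual(reply["status"], "OK")

    def test_malformed_message_yields_fault(self):
        # Negative testing: malformed input must produce a fault, not a crash.
        reply = handle_order_message({})
        self.assertEqual(reply["status"], "FAULT")
```

Run with `python -m unittest` against the file containing the sketch. A real SOA test suite would drive the same kinds of assertions over HTTP, JMS or other transports against deployed services.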

  •  The CoE will disseminate the skills, experience, and capabilities of the SOA center to the rest of the architecture practice.

The Center of Excellence will promote best practices, methodologies, knowledge and pragmatic leading-edge solutions in the area of SOA to the project teams.

  •  Identify how members of the CoE, and other architecture practitioners, will be rewarded for success.

This may sound like a good idea, but I have never seen it applied in practice.

Define a Skill Set

According to the TOGAF® 9.1 Framework specification, “Recognition that, at the start, it is unlikely the organization will have the necessary skills to create a fully functional CoE. The necessary skills and experience must be carefully identified, and where they are not present, acquired. A fundamental skill for leading practitioners within the CoE is the ability to mentor other practitioners transferring knowledge, skills, and experience.”

Competency and skills building is needed for any initiative. SOA is not just about integrating technologies and applications – it is a culture change within the enterprise, which requires IT to move from being a technology provider to a business enabler. There may be a wide range of skills required such as:

  • Enterprise Architecture
  • Value of SOA
  • Governance model for SOA
  • Business Process Management and SOA
  • Design of SOA solutions
  • Modeling
  • Technologies and standards
  • Security
  • Business communication

It has to be said that lack of SOA skills is the number one inhibitor to SOA adoption.

  • Close-out plan for when the CoE has fulfilled its purpose.

Here again, I am not sure that I have observed any SOA CoE being closed…

In the second installment of this three-part series I will discuss how the Center of Excellence defines a Reference Architecture for the organization.

Serge Thorn is CIO of Architecting the Enterprise. He has worked in the IT industry for over 25 years in a variety of roles, including Development and Systems Design, Project Management, Business Analysis, IT Operations, IT Management, IT Strategy, Research and Innovation, IT Governance, Architecture and Service Management (ITIL). He is the Chairman of the itSMF (IT Service Management Forum) Swiss chapter and is based in Geneva, Switzerland.

Filed under Cloud/SOA, Enterprise Architecture, Standards, TOGAF, TOGAF®