Tag Archives: HP

NASCIO Defines State of Enterprise Architecture at The Open Group Conference in Philadelphia

By E.G. Nadhan, HP

I have attended and blogged about many Open Group conferences. The keynotes at these conferences, as at any conference, provide valuable insight into the key messages and the underlying theme of the event, which for The Open Group Conference in Philadelphia is Enterprise Architecture and Enterprise Transformation. Therefore, it is no surprise that Eric Sweden, Program Director, Enterprise Architecture & Governance, NASCIO, will be delivering one of the keynotes, “State of the States: NASCIO on Enterprise Architecture”. Sweden asserts that “Enterprise Architecture” provides an operating discipline for the creation, operation, continual re-evaluation and transformation of an “Enterprise.” Not only do I agree with this assertion, but I would add that the proper creation, operation and continuous evaluation of the “Enterprise” systemically drives its transformation. Let’s see how.

Creation. This phase involves the definition of the Enterprise Architecture (EA) in the first place. Most often, this involves defining an architecture that factors in what is in place today while taking into account the future direction. TOGAF® (The Open Group Architecture Framework) provides a framework for developing this architecture from a business, application, data, infrastructure and technology standpoint, in alignment with the overall Architecture Vision and with associated architectural governance.
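
For readers who like to see structure expressed in code, here is a minimal sketch, assuming nothing beyond the paragraph above, of how the baseline state, target state and governance sign-off of each architecture domain might be tracked during this creation phase. The class names and sample values are hypothetical illustrations, not TOGAF artifacts.

```python
# Illustrative sketch only, not a TOGAF artifact: track each architecture
# domain's baseline ("what is in place today"), target ("the future
# direction") and governance sign-off during the creation phase.
from dataclasses import dataclass, field

@dataclass
class ArchitectureDomain:
    name: str                       # e.g. "Business", "Application", "Data", "Technology"
    baseline: str = "undocumented"  # current state
    target: str = "undefined"       # desired future state
    approved: bool = False          # architectural governance sign-off

@dataclass
class EnterpriseArchitecture:
    vision: str
    domains: list[ArchitectureDomain] = field(default_factory=list)

    def creation_complete(self) -> bool:
        # Creation is complete only when every domain has a defined target
        # and has passed governance review.
        return bool(self.domains) and all(
            d.target != "undefined" and d.approved for d in self.domains
        )

# Hypothetical example values.
ea = EnterpriseArchitecture(vision="Integrated citizen services")
ea.domains.append(ArchitectureDomain("Business", baseline="siloed processes",
                                     target="shared service catalog", approved=True))
print(ea.creation_complete())  # True once every domain is defined and approved
```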

Operation. EA is not a done deal once it has been defined. It is vital that the EA, once defined, is sustained consistently through the advent of new projects, new initiatives, new technologies, and new paradigms. As the abstract states, EA is a comprehensive business discipline that drives business and IT investments. In addition to driving investments, the operation phase also includes making the requisite changes to the EA as a result of these investments.

Continuous Evaluation. We live in a landscape of continuous change with innovative solutions and technologies constantly emerging. Moreover, the business objectives of the enterprise are constantly impacted by market dynamics, mergers and acquisitions. Therefore, the EA defined and in operation must be continuously evaluated against the architectural principles, while exercising architectural governance across the enterprise.

Transformation. EA is an operating discipline for the transformation of an enterprise. Enterprise Transformation is not a destination — it is a journey that needs to be managed — as characterized by Twentieth Century Fox CIO, John Herbert. To Forrester Analyst Phil Murphy, Transformation is like the Little Engine That Could — focusing on the business functions that matter. (Big Data – highlighted in another keynote at this conference by Michael Cavaretta — is a paradigm gaining a lot of ground for enterprises to stay competitive in the future.)

Global organizations are enterprises of enterprises, undergoing transformation while facing the challenges of systemic architectural governance. NASCIO has valuable insight into the challenges faced by the 50 “enterprises” that make up the United States – challenges that balance the need for healthy co-existence of these states against the desire of each to retain a degree of autonomy. Therefore, I look forward to this keynote to see how EA done right can drive the transformation of the Enterprise.

By the way, remember when Enterprise Architecture was done wrong close to the venue of another Open Group conference?

How does Enterprise Architecture drive the transformation of your enterprise? Please let me know.

A version of this blog post originally appeared on the HP Journey through Enterprise IT Services Blog.

HP Distinguished Technologist and Cloud Advisor, E.G.Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for the Open Group Cloud Computing Governance project. 


Filed under Business Architecture, Cloud, Cloud/SOA, Conference, Enterprise Architecture, Enterprise Transformation, TOGAF®

Driving Boundaryless Information Flow in Healthcare

By E.G. Nadhan, HP

I look forward with great interest to the upcoming Open Group conference on EA & Enterprise Transformation in Finance, Government & Healthcare in Philadelphia in July 2013. In particular, I am interested in the sessions planned on topics related to the Healthcare industry. This industry faces several challenges: uncontrolled medical costs, legislative pressures, increased plan participation, and the improved longevity of individuals. Come to think of it, these challenges are not that different from those faced when defining a comprehensive enterprise architecture. Therefore, can the fundamental principles of Enterprise Architecture be applied towards the resolution of these challenges in the Healthcare industry? The Open Group certainly thinks so.

Enterprise Architecture is a discipline, methodology, and practice for translating business vision and strategy into the fundamental structures and dynamics of an enterprise at various levels of abstraction. As defined by TOGAF®, enterprise architecture is developed through multiple phases, including Business, Application, Information, and Technology Architecture, all in alignment with the overall vision. The TOGAF Architecture Development Method enables a systematic approach to addressing these challenges while simplifying the problem domain.

This approach to the development of Enterprise Architecture can be applied to the complex problem domain that manifests itself in Healthcare. Thus, it is no surprise that The Open Group is sponsoring the Population Health Working Group, which has a vision to enable “boundary-less information flow” between the stakeholders that participate in healthcare delivery. Check out the presentation delivered by Larry Schmidt, Chief Technologist, Health and Life Sciences Industries, HP US, at The Open Group conference in Philadelphia.

As a Platinum member of The Open Group, HP has co-chaired the release of multiple standards, including the first technical cloud standard. The Open Group is also leading the definition of the Cloud Governance Framework. Having co-chaired these projects, I look forward to the launch of the Population Health Working Group with great interest.

Given the role of information in today’s landscape, “boundary-less information flow” between the stakeholders that participate in healthcare delivery is vital. At the same time, how about injecting a healthy dose of innovation, given that Enterprise Architects are best positioned for innovation – a point I explored in a post triggered by Forrester Analyst Brian Hopkins’s thoughts on this topic. The Open Group — with its multifaceted representation from a wide array of enterprises — provides incredible opportunities for innovation in the context of the complex landscape of the healthcare industry. Take a look at the steps taken by HP Labs to innovate and improve patient care one day at a time.

I would strongly encourage you to attend Schmidt’s session, as well as the Healthcare Transformation Panel moderated by Open Group CEO, Allen Brown at this conference.

How about you? What are some of the challenges that you are facing within the Healthcare industry today? Have you applied Enterprise Architecture development methods to problem domains in other industries? Please let me know.

Connect with Nadhan on: Twitter, Facebook, Linkedin and Journey Blog.

A version of this blog post originally appeared on the HP Enterprise Services Blog.

HP Distinguished Technologist and Cloud Advisor, E.G.Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for the Open Group Cloud Computing Governance project. 


Filed under Business Architecture, Cloud, Cloud/SOA, Conference, Enterprise Architecture, Healthcare, TOGAF®

The Open Group Cloud Computing Work Group Web Jam on CIO Priorities

By E.G. Nadhan, HP

Recently, I shared my experience leading the first Web Jam within The Open Group Cloud Work Group. We are now gearing up for another one of these sessions – this time around, the topic is CIO priorities as driven by Cloud Computing. Even though the Web Jam is an internal session held within The Open Group Cloud Work Group, we want to factor in other opinions as well – hence this blog post, where I share my perspective on how Cloud Computing is defining the priorities for the CIO. I am basing this perspective on the findings of a survey conducted by IDG Research, published in this white paper on IT priorities, for which I was one of the people interviewed.

I would categorize the CIO priorities across five drivers: customers, business, innovation, finance and governance.

1. Customers. CIOs must listen to their customers (especially shareholders). Cloud Computing is breeding a new generation of customer-focused CIOs.  Shareholders are driving IT to the Cloud. At the same time, enterprises need to be at least as social as their customers so that they can process the brontobytes of data generated through these channels.

2. Business. CIOs must shift their attention from technical matters to business issues. This is not surprising. As I outlined in an earlier blog post, the right way to transform to Cloud Computing has always been driven by the business needs of the enterprise. When addressing technical requests, CIOs need to first determine the underlying, business-driven root cause of the request.

3. Innovation. CIOs must make innovation part of the IT blood stream. CIOs need to take steps today to innovate the planet for 2020.  For example, the Cloud facilitates the storage of brontobytes of data that can be informationalized through data analysis techniques by those who have the sexiest job of the 21st Century – Data Scientist.

4. Finance. CIOs must have the right mechanisms in place to track the ROI of Cloud Computing.  As fellow Open Group blogger Chris Harding states, CIOs must not fly in the Cloud by the seat of their pants.  Note that tracking the ROI is not a one-time activity. CIOs must be ready to answer the ROI question on the Cloud at any time.

5. Governance. CIOs must ensure that there is a robust Cloud governance model across the enterprise. In the past, I’ve explained how we can build upon SOA Governance to realize Cloud governance.  As a co-chair for the Cloud Governance project within The Open Group, I have a lot of interest in this space and would like to hear your thoughts.

So, there you have it. Those are the top 5 priorities for the CIO driven by key Cloud Computing forces. How about you? Are there other CIO priorities that you can share? I would be interested to know and quite happy to engage in a discussion as well.

Once the Web Jam has taken place, I plan to share the highlights in this blog so that we can continue the discussion.

HP Distinguished Technologist, E.G.Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for the Open Group Cloud Computing Governance project. Twitter handle @NadhanAtHP.


Filed under Cloud, Cloud/SOA

Protecting Data is Good. Protecting Information Generated from Big Data is Priceless

By E.G. Nadhan, HP

This was the key message that came out of The Open Group® Big Data Security Tweet Jam on Jan 22 at 9:00 a.m. PT, which addressed several key questions centered on Big Data and security. Here is my summary of the observations made in the context of these questions.

Q1. What is Big Data security? Is it different from data security?

Big Data security is more about information security. Big Data typically lives outside the corporate perimeter, and IT is not prepared today to adequately monitor its sheer volume – the brontobytes of data. Long retention periods could also violate compliance mandates. Note that storing Big Data in the Cloud changes the game, with increased risks of leaks, loss and breaches.

Information resulting from the analysis of the data is even more sensitive and therefore carries higher risk – especially when it is Personally Identifiable Information on the Internet of devices, requiring a balance between utility and privacy.

At the end of the day, it is all about governance or as they say, “It’s the data, stupid! Govern it.”

Q2. Any thoughts about security systems as producers of Big Data, e.g., voluminous systems logs?

Data gathered from information security logs is valuable, but the rules for protecting it are the same. Security logs are also a good source for detecting patterns of customer usage.

Q3. Most Big Data stacks have no built-in security. What does this mean for securing Big Data?

There is an added level of complexity because Big Data spans applications, the network and all endpoints. Having standards to establish identity, metadata and trust would go a long way. The quality of the data can also be a security issue – has it been tampered with, are you being gamed, and so on. Note that enterprises have varying security needs around their business data.

Q4. How is the industry dealing with the social and ethical uses of consumer data gathered via Big Data?

Big Data is still nascent, and ground rules for handling the information are yet to be established. Privacy will be a key issue when companies market to consumers. Organizations are seeking forgiveness rather than permission. Regulatory bodies are getting involved due to consumer pressure. Abuse of power from access to Big Data is likely to trigger more incentives to attack or embarrass. Note that ‘abuse’ to some is just business to others.

Q5. What lessons from basic data security and cloud security can be implemented in Big Data security?

Security testing is even more vital for Big Data. Limit access to specific devices, not just user credentials. Don’t assume security through obscurity for the sensors producing Big Data inputs – they will be targets.

Q6. What are some best practices for securing Big Data? What are orgs doing now and what will organizations be doing 2-3 years from now?

Current best practices include:

  • Treat Big Data as your most valuable asset
  • Encrypt everything by default, proper key management, enforcement of policies, tokenized logs
  • Ask your Cloud and Big Data providers the right questions – ultimately, YOU are responsible for security
  • Assume data needs verification and cleanup before it is used for decisions if you are unable to establish trust with data source

Future best practices:

  • Enterprises treat Information like data today and will respect it as the most valuable asset in the future
  • CIOs will eventually become Chief Officer for Information
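
The “encrypt everything by default, proper key management, enforcement of policies, tokenized logs” item in the current-practices list above is the most concrete of these recommendations, so here is a minimal sketch of what it can look like in code. Treat it as an assumption-laden illustration: it relies on the third-party cryptography package, the log_event helper is made up, and a real deployment would hand key management to a dedicated service rather than generating keys in place.

```python
# Illustrative only: encrypt records before they land in big data storage and
# tokenize identifiers so logs stay useful without exposing who they describe.
import hashlib
import hmac

from cryptography.fernet import Fernet  # third-party package, assumed available

DATA_KEY = Fernet.generate_key()  # in practice, issued and rotated by a key-management service
LOG_TOKEN_KEY = b"separate-secret-used-only-for-log-tokens"  # hard-coded only for illustration

def encrypt_record(plaintext: bytes) -> bytes:
    """Encrypt-by-default: nothing is written to the data store in the clear."""
    return Fernet(DATA_KEY).encrypt(plaintext)

def tokenize(identifier: str) -> str:
    """Replace an identifier with a stable token so usage patterns remain analyzable."""
    return hmac.new(LOG_TOKEN_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def log_event(user_id: str, action: str) -> None:
    # Hypothetical logging helper: only the token ever reaches the log store.
    print(f"user={tokenize(user_id)} action={action}")

log_event("jane.doe@example.com", "queried sales cube")
```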

Q7. We’re nearing the end of today’s tweet jam. Any last thoughts on Big Data security?

Adrian Lane, who participated in the tweet jam, will be keynoting at The Open Group Conference in Newport Beach next week and has written a good best-practices paper on securing Big Data.

I have been part of multiple tweet chats specific to security, as well as one on Information Optimization. Recently, I also conducted the first Open Group Web Jam, internal to the Cloud Work Group.  What I liked about this Big Data Security Tweet Jam is that it brought two key domains together, highlighting the intersection points. There were great contributions from subject matter experts, forcing participants to think about one domain in the context of the other.

In a way, this post is actually synthesizing valuable information from raw data in the tweet messages – and therefore needs to be secured!

What are your thoughts on the observations made in this tweet jam? What measures are you taking to secure Big Data in your enterprise?

I really enjoyed this tweet jam and would strongly encourage you to actively participate in upcoming tweet jams hosted by The Open Group.  You get to interact with a wide spectrum of knowledgeable practitioners listed in this summary post.

HP Distinguished Technologist and Cloud Advisor, E.G.Nadhan has more than 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project, and is also the founding co-chair for the Open Group Cloud Computing Governance project. Connect with Nadhan on: Twitter, Facebook, LinkedIn and Journey Blog.

 


Filed under Tweet Jam

The Open Group Newport Beach Conference – Early Bird Registration Ends January 4

By The Open Group Conference Team

The Open Group is busy gearing up for the Newport Beach Conference. Taking place January 28-31, 2013, the conference theme is “Big Data – The Transformation We Need to Embrace Today” and will bring together leading minds in technology to discuss the challenges and solutions facing Enterprise Architecture around the growth of Big Data. Register today!

Information is power, and we stand at a time when 90% of the data in the world today was generated in the last two years alone.  Despite the sheer enormity of the task, off-the-shelf hardware, open source frameworks, and the processing capacity of the Cloud mean that Big Data processing is within the cost-effective grasp of the average business. Organizations can now initiate Big Data projects without significant investment in IT infrastructure.

In addition to tutorial sessions on TOGAF® and ArchiMate®, the conference offers roughly 60 sessions on a variety of topics, including:

  • The ways that Cloud Computing is transforming the possibilities for collecting, storing, and processing big data.
  • How to contend with Big Data in your Enterprise?
  • How does Big Data enable your Business Architecture?
  • What does the Big Data revolution mean for the Enterprise Architect?
  • Real-time analysis of Big Data in the Cloud.
  • Security challenges in the world of outsourced data.
  • What is an architectural view of Security for the Cloud?

Plenary speakers include:

  • Christian Verstraete, Chief Technologist – Cloud Strategy, HP
  • Mary Ann Mezzapelle, Strategist – Security Services, HP
  • Michael Cavaretta, Ph.D, Technical Leader, Predictive Analytics / Data Mining Research and Advanced Engineering, Ford Motor Company
  • Adrian Lane, Analyst and Chief Technical Officer, Securosis
  • David Potter, Chief Technical Officer, Promise Innovation Oy
  • Ron Schuldt, Senior Partner, UDEF-IT, LLC

A full conference agenda is available here. Tracks include:

  • Architecting Big Data
  • Big Data and Cloud Security
  • Data Architecture and Big Data
  • Business Architecture
  • Distributed Services Architecture
  • EA and Disruptive Technologies
  • Architecting the Cloud
  • Cloud Computing for Business

Early Bird Registration

Early Bird registration for The Open Group Conference in Newport Beach ends January 4. Register now and save! For more information or to register: http://www.opengroup.org/event/open-group-newport-beach-2013/reg

Upcoming Conference Submission Deadlines

In addition to the Early Bird registration deadline to attend the Newport Beach conference, there are upcoming deadlines for speaker proposal submissions to Open Group conferences in Sydney, Philadelphia and London. To submit a proposal to speak, click here.

  • Sydney (April 15-17) – Industry focus: Finance, Defense, Mining – Submission deadline: January 18, 2013
  • Philadelphia (July 15-17) – Industry focus: Healthcare, Finance, Defense – Submission deadline: April 5, 2013
  • London (October 21-23) – Industry focus: Finance, Government, Healthcare – Submission deadline: July 8, 2013

We expect space on the agendas of these events to be at a premium, so it is important for proposals to be submitted as early as possible. Proposals received after the deadline dates will still be considered, if space is available; if not, they may be carried over to a future conference. Priority will be given to proposals received by the deadline dates and to proposals that include an end-user organization, at least as a co-presenter.


Filed under Conference

Call for Submissions

By Patty Donovan, The Open Group

The Open Group Blog is celebrating its second birthday this month! Over the past two years, our blog posts have tended to cover Open Group activities – conferences, announcements, our lovely members, etc. While several members and Open Group staff serve as regular contributors, we’d like to take this opportunity to invite our community members to share their thoughts and expertise as guest contributors on topics related to The Open Group’s areas of focus.

We have received a number of popular guest blog posts from the community over the past year.

Blog posts generally run between 500 and 800 words and address topics relevant to The Open Group workgroups, forums, consortiums and events. Some suggested topics are listed below.

  • ArchiMate®
  • Big Data
  • Business Architecture
  • Cloud Computing
  • Conference recaps
  • DirectNet
  • Enterprise Architecture
  • Enterprise Management
  • Future of Airborne Capability Environment (FACE™)
  • Governing Board Businesses
  • Governing Board Certified Architects
  • Governing Board Certified IT Specialists
  • Identity Management
  • IT Security
  • The Jericho Forum
  • The Open Group Trusted Technology Forum (OTTF)
  • Quantum Lifecycle Management
  • Real-Time Embedded Systems
  • Semantic Interoperability
  • Service-Oriented Architecture
  • TOGAF®

If you have any questions or would like to contribute, please contact opengroup (at) bateman-group.com.

Please note that all content submitted to The Open Group blog is subject to The Open Group approval process. The Open Group reserves the right to deny publication of any contributed works. Anything published shall be copyright of The Open Group.

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.


Filed under Uncategorized

Build Upon SOA Governance to Realize Cloud Governance

By E.G. Nadhan, HP

The Open Group SOA Governance Framework just became an International Standard, available to governments and enterprises worldwide. At the same time, I read an insightful post by ZDNet blogger Joe McKendrick, who states that Cloud and automation are driving new growth in the SOA governance market. I have always maintained that the fundamentals of Cloud Computing are based upon SOA principles. This brings up the next natural question: Where are we with Cloud Governance?

I co-chair the Open Group project for defining the Cloud Governance framework. Fundamentally, the Cloud Governance framework builds upon The Open Group SOA Governance Framework and provides additional context for Cloud Governance in relation to other governance standards in the industry. We are with Cloud Governance today where we were with SOA Governance a few years back when The Open Group started on the SOA Governance framework project.

McKendrick goes on to say that the tools and methodologies built and stabilized over the past few years for SOA projects are seeing renewed life as enterprises move to the Cloud model. In McKendrick’s words, “it is just a matter of getting the word out.” That may be the case for the SOA governance market. But, is that so for Cloud Governance?

When it comes to Cloud Governance, it is more than just getting the word out. We must make progress in the following areas for Cloud Governance to become real:

  • Sustained adoption. Enterprises must continuously adopt cloud-based services, balancing them with outsourcing alternatives. This will give more visibility to the real-life use cases where Cloud Governance can be exercised to validate and refine the enabling set of governance models.
  • Framework Definition. Finally, Cloud Governance needs a standard framework to facilitate its adoption. Just like the SOA Governance Framework, the definition of a standard for the Cloud Governance Framework as well as the supporting reference models will pave the way for the consistent adoption of Cloud Governance.

Once this progress is made, Cloud Governance will be positioned like SOA Governance—and it will then be just a matter of “getting the word out.”

A version of this blog post originally appeared on the Journey through Enterprise IT Services Blog.

HP Distinguished Technologist and Cloud Advisor, E.G.Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for the Open Group Cloud Computing Governance project. Connect with Nadhan on: Twitter, Facebook, Linkedin and Journey Blog.


Filed under Cloud, Cloud/SOA

I Thought I had Said it All – and Then Comes Service Technology

By E.G. Nadhan, HP

This is not the first time that I am blogging about the evolution of fundamental service orientation principles serving as an effective foundation for cloud computing. You may recall my earlier posts on The Open Group blog on the Top 5 tell-tale signs of SOA evolving to the Cloud and The Right Way to Transform to Cloud Computing, followed most recently by my post about taking a lesson from history to integrate to the Cloud. I thought I had said it all and there was nothing more to blog about on this topic other than diving into more details.

Until I saw the post by Forbes blogger Joe McKendrick on Before There Was Cloud Computing, There was SOA. In this post, McKendrick introduces a new term – Service Technology – which resonates with me because it cements the concept of service-oriented thinking that technically enables the realization of SOA within the enterprise, followed by its sustained evolution to cloud computing. In fact, the 5th International SOA, Cloud and Service Technology Symposium is a conference centered on this concept.

Even if this is a natural evolution, we must still exercise caution so that we don’t fall prey to the same integration pitfalls the IT world encountered in the past. I elaborate further on this topic in my post on The Open Group blog: Take a lesson from History to Integrate to the Cloud.

I was intrigued by another comment in McKendrick’s post about “Cloud being inherently service-oriented.” Almost. I would slightly rephrase it to: Cloud done right is inherently service-oriented. So, what do I mean by Cloud done right? Voilà: The Right Way to Transform to Cloud Computing on The Open Group blog.

So, how about you? Where are you with your SOA strategy? Have you been selectively transforming to the Cloud? Do you have “Service Technology” in place within your enterprise?

I would like to know, and something tells me McKendrick will as well.

So, it would be an interesting exercise to see if the first Technical standard for Cloud Computing published by The Open Group should be extended to accommodate the concept of Service Technology. Perhaps, it is already an integral part of this standard in concept. Please let me know if you are interested. As the co-chair for this Open Group project, I am very interested in working with you on taking next steps.

A version of this blog post originally appeared on the Journey through Enterprise IT Services Blog.

HP Distinguished Technologist and Cloud Advisor, E.G.Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for the Open Group Cloud Computing Governance project. Connect with Nadhan on: Twitter, Facebook, Linkedin and Journey Blog.


Filed under Cloud/SOA

It Is a Big World for Big Data After All

By E.G. Nadhan, HP

In the InformationWeek Global CIO blog, Patrick Houston says that big is bad when it comes to data, questioning the appropriateness of the term big data. Houston highlights the risk of the term being taken literally by the not-so-technical folks. Big data will continue to spread with emerging associative terms like big data expert, big data technologies, etc. I also see other reactions to the term, like the one in Allison Watterson’s post, “What do you mean big data, little data is hard enough.” So why has the term gained such broad adoption so fast?

Here are my top 5 reasons why the term big data has stuck, and why it may be appropriate, after all:

Foundational. It all started with data processing decades ago. Over the years, we have seen:

  • Big Computer – monolithic behemoths – or in today’s terms, legacy platforms
  • Big Network – local and wide area networks
  • Big Connector – the Internet that facilitated meaningful access with a purpose to consumers across the globe
  • Big Communicator – social media that has fostered communication beyond our imagination

It is all leading up to the generation and consumption of big data driven by presence. It was all about data to start with, and we have come back full circle to data again.

Pervasive. Big Data will pervasively promote a holistic approach across all architectural elements of cloud computing:

  • Compute – complex data processing algorithms
  • Network – timely transmission of high volumes of data
  • Storage – various media to house <choose your prefix> bytes of data

Familiar. Big is always part of compound associations, whether it be a hamburger (Big Mac), Big Brother or the Big Dipper. It is a big deal, shall we say? Data has always been generated and consumed with the continued emergence of evolutionary technologies. You say big data, and pictures of data rapidly growing like a balloon or spreading like water come to mind. It has something to do with data. There is something big about it.

Synthetic. Thomas C. Redman introduces the term “Informationlization” in the Harvard Business Review blog post titled, “Integrate data into product, or get left behind.”  To me, the term big data is also about synthesis – individual pixels on the display device coming together to present a cohesive, meaningful picture.

Simple. You cannot get simpler than a three-letter word paired up with a four-letter word to mean something by itself. Especially when neither one is a TLA (three-letter acronym) for something very difficult to pronounce! Children in their elementary grades start learning these simple words before moving on to complex spelling bees with an abundance of vowels and y and x and q letters. Big data rolls off the tongue easily with a total of three syllables.

As humans, we tend to gravitate towards simplicity, which is why the whole world chimes in and sways back and forth when Sir Paul McCartney sings Hey Jude! decades after the first performance of this immortal piece. The line that sticks in our mind is the simplest line in the whole song – easy to render – one that we hum along with our hearts. Likewise, big data provides the simplest interpretation possible for a really complex world out there.

I actually like what Houston proposes – gushing data. However, I am not sure if it would enjoy the attention that big data gets. It represents a domain that needs to be addressed globally across all architectural layers by everyone including the consumers, administrators and orchestrators of data.

Therefore, big data is not just good enough – it is apt.

What about you? Do you have other names in mind? What does big data mean to you?

A version of this blog post originally appeared on the HP Enterprise Services Blog.

HP Distinguished Technologist and Cloud Advisor, E.G.Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for the Open Group Cloud Computing Governance project. Twitter handle @NadhanAtHP. www.hp.com/go/journeyblog


Filed under Data management

How the Operating System Got Graphical

By Dave Lounsbury, The Open Group

The Open Group is a strong believer in open standards and our members strive to help businesses achieve objectives through open standards. In 1995, under the auspices of The Open Group, the Common Desktop Environment (CDE) was developed and licensed for use by HP, IBM, Novell and Sunsoft to make open systems desktop computers as easy to use as PCs.

CDE is a single, standard graphical user interface for managing data, files, and applications on an operating system. Both application developers and users embraced the technology because it provided a simple and common approach to accessing data and applications on a network. With a click of a mouse, users could easily navigate through the operating system – similar to how we work on PCs and Macs today.

It was the first successful attempt to standardize on a desktop GUI on multiple, competing platforms. In many ways, CDE is responsible for the look, feel, and functionality of many of the popular operating systems used today, and brings distributed computing capabilities to the end user’s desktop.

The Open Group is now passing the torch to a new CDE community, led by CDE suppliers and users such as Peter Howkins and Jon Trulson.

“I am grateful that The Open Group decided to open source the CDE codebase,” said Jon Trulson. “This technology still has its fans and is very fast and lightweight compared to the prevailing UNIX desktop environments commonly in use today. I look forward to seeing it grow.”

The CDE group is also releasing OpenMotif, which is the industry standard graphical interface that standardizes application presentation on open source operating systems such as Linux. OpenMotif is also the base graphical user interface toolkit for the CDE.

The Open Group thanks these founders of the new CDE community for their dedication and contribution to carrying this technology forward. We are delighted this community is moving forward with this project and look forward to the continued growth in adoption of this important technology.

For those of you who are interested in learning more about the CDE project and would like to get involved, please see http://sourceforge.net/projects/cdesktopenv.

Dave Lounsbury is The Open Group’s Chief Technology Officer, previously VP of Collaboration Services.  Dave holds three U.S. patents and is based in the U.S.


Filed under Standards

#ogChat Summary – The Future of BYOD

By Patty Donovan, The Open Group

With over 400 tweets flying back and forth, last week’s BYOD Tweet Jam (#ogChat) saw a fast-paced, lively discussion on the future of the bring your own device (BYOD) trend and its implications in the enterprise. In case you missed the conversation, here’s a recap of last week’s #ogChat!

There were a total of 29 participants in the discussion.

Here is a high-level snapshot of the #ogChat:

Q1 What are the quantifiable benefits of BYOD? What are the major risks of #BYOD, and do these risks outweigh the benefits? #ogChat

Participants generally agreed that the main risk of BYOD is data security, and that the benefits include cost and convenience.

  • @MobileGalen Data policy is core because that’s where the real value is in business. Affects access and intrusion/hacking of course secondarily #ogChat
  • @technodad Q1 #BYOD transcends time/space boundaries – necessary for a global business. #ogChat
  • @AWildCSO Q1 Risks: Risk to integrity and availability of corporate IT systems – malware into enterprise from employee owned devices #ogChat

Q2 What are the current security issues with #BYOD, and how should organizations go about securing those devices? #ogChat

The most prominent issue discussed was who owns responsibility for security. Participants couldn’t agree on whether that responsibility falls on the user or the organization.

  • @AWildCSO Q2: Main issue is the confidentiality of data. Not a new issue, has been around a while, especially since the advent of networking. #ogChat
  • @cebess .@ MobileGalen Right — it’s about the data not the device. #ogChat
  • @AppsTechNews Q2 Not knowing who’s responsible? Recent ITIC/KnowBe4 survey: 37% say corporation responsible for #BYOD security; 39% say end user #ogChat
  • @802dotchris @MobileGalen there’s definitiely a “golden ratio” of fucntionality to security and controls @IDGTechTalk #ogChat
  • @MobileGalen #ogChat Be careful about looking for mobile mgmt tools as your fix. Most are about disablement not enablement. Start w enable, then protect.

Q3 How can an organization manage corporate data on employee owned devices, while not interfering with data owned by an employee? #ogChat

Most participants agreed that securing corporate data is a priority but were stumped when it came to maintaining personal data privacy. Some suggested that organizations will have no choice but to interfere with personal data, but all agreed that no matter what the policy, it needs to be clearly communicated to employees.

  • @802dotchris @jim_hietala in our research, we’re seeing more companies demand app-by-app wipe or other selective methods as MDM table stakes #ogChat
  • @AppsTechNews Q3 Manage the device, manage & control apps running on it, and manage data within those apps – best #BYOD solutions address all 3 #ogChat
  • @JonMoger @theopengroup #security #ogChat #BYOD is a catalyst for a bigger trend driven by cultural shift that affects HR, legal, finance, LOB.
  • @bobegan I am a big believer in people, and i think most employees feel that they own a piece of corporate policy #ogChat
  • @mobilityofficer @theopengroup Q3: Sometimes you have no choice but to interfere with private data but you must communicate that to employees #ogChat

Q4 How does #BYOD contribute to the creation or use of #BigData in the enterprise? What role does #BYOD play in #BigData strategy? #ogChat

Participants exchanged opinions on the relationship between BYOD and Big Data, leaving much room for future discussion.

  • @technodad Q4 #bigdata created by mobile, geotgged, realtime apps is gold dust for business analytics & marketing. Smart orgs will embrace it. #ogChat
  • @cebess .@ technodad Context is king. The device in the field has quite a bit of contextual info. #ogChat
  • @bobegan @cebess Right, a mobile strategy, including BYOD is really about information supply chain managment. Must include many audiences #ogChat

Q5 What best practices can orgs implement to provide #BYOD flexibility and also maintain control and governance over corporate data? #ogChat

When discussing best practices, it became clear that no matter what, organizations must educate employees and be consistent with business priorities. Furthermore, if data is precious, treat it that way.

  • @AWildCSO Q5: Establish policies and processes for the classification, ownership and custodianship of information assets. #ogChat
  • @MobileGalen #ogChat: The more precious your info, the less avail it should be, BYOD or not. Use containered apps for sensitive, local access for secret
  • @JonMoger @theopengroup #BYOD #ogChat 1. Get the right team to own 2. Educate mgmt on risks & opps 3. Set business priorities 4. Define policies

Q6 How will organizations embrace or reject #BYOD moving forward? Will they have a choice or will employees dictate use? #ogChat

While understanding the security risks, most participants embraced BYOD as a big trend that will eventually become the standard moving forward.

A big thank you to all the participants who made this such a great discussion!

Patricia Donovan is Vice President, Membership & Events, at The Open Group and a member of its executive management team. In this role she is involved in determining the company’s strategic direction and policy as well as the overall management of that business area. Patricia joined The Open Group in 1988 and has played a key role in the organization’s evolution, development and growth since then. She also oversees the company’s marketing, conferences and member meetings. She is based in the U.S.


Filed under Tweet Jam

Secrets Behind the Rapid Growth of SOA

By E.G. Nadhan, HP

Service Oriented Architecture has been around for more than a decade and has steadily matured over the years with increasing levels of adoption. Cloud computing, a paradigm that is founded upon the fundamental service oriented principles, has fueled SOA’s adoption in recent years. ZDNet blogger Joe McKendrick calls out a survey by Companies and Markets in one of his blog posts – SOA market grew faster than expected.

Some of the statistics from this survey as referenced by McKendrick include:

  • SOA represents a total global market value of $5.518 billion, up from $3.987 billion in 2010 – or a 38% growth.
  • The SOA market in North America is set to grow at a compound annual growth rate (CAGR) of 11.5% through 2014.
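
As a quick sanity check on these figures (and on how a compound annual growth rate compounds), here is a tiny snippet. The three-year projection at the end mixes the global figure with the North American CAGR purely for illustration, so treat it as arithmetic rather than a forecast.

```python
# Quick arithmetic check on the figures quoted above (values in $ billions).
value_2010, value_now = 3.987, 5.518
growth = value_now / value_2010 - 1
print(f"growth: {growth:.1%}")  # ~38.4%, matching the ~38% cited

# CAGR compounds annually: value_n = value_0 * (1 + cagr) ** n
def project(value_0: float, cagr: float, years: int) -> float:
    return value_0 * (1 + cagr) ** years

# Illustrative only: applying the 11.5% North American CAGR for three years.
print(f"example projection: {project(value_now, 0.115, 3):.3f}")
```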

So, what are the secrets of the success that SOA seems to be enjoying?  Over the past decade, I can recall a few skeptics who were not so sure about SOA’s adoption and growth.  But I believe there are five “secrets” behind the success story of SOA that should put such skepticism to rest:

  1. Architecture. Service oriented architectures have greatly facilitated a structured approach to enterprise architecture (EA) at large. Despite debates over the scope of EA and SOA, the fact remains that service orientation is an integral part of the foundational factors considered by the enterprise architect. If anything, it has also acted as a catalyst for giving more visibility to the need for well-defined enterprise architecture to be in place for the current and desired states.
  2. Application. Service orientation has promoted standardized interfaces that have enabled the continued existence of multiple applications in an integrated, cohesive manner. Thanks to a SOA-based approach, integration mechanisms are no longer held hostage to proprietary formats and legacy platforms.
  3. Availability. Software vendors have taken the initiative to make their functionality available through services. Think about the number of times you have heard a software vendor suggest Web services as the de facto method for integrating with other systems. Single-click generation of a Web service is a very common feature across most of the software tools used for application development.
  4. Alignment. SOA has greatly facilitated and realized increased alignment from multiple fronts including the following:
    • Business to IT. The definition of application and technology services is really driven by the business need in the form of business services.
    • Application to Infrastructure. SOA strategies for the enterprise have gone beyond the application layer to the infrastructure, resulting in greater alignment between the application being deployed and the supporting infrastructure. Infrastructure services are an integral part of the comprehensive services landscape for an enterprise.
    • Platforms and technology. Interfaces between applications are much less dependent on the underlying technologies or platforms, resulting in increased alignment between various platforms and technologies. Interoperability has been taken to new levels across the extended enterprise.
  5. Adoption. SOA has served as the cornerstone for new paradigms like cloud computing. Increased adoption of SOA has resulted in the evolution of multiple industry standards for SOA, and has also led to standards for infrastructure services to be provisioned in the cloud. Standards do take time to evolve, but when they do, it is a tacit endorsement by the IT industry of the maturity of the underlying phenomenon — in this case, SOA.

Thus, the application of service-oriented principles across the enterprise has increased SOA’s adoption, spurred by the availability of readily exposed services across all architectural layers and resulting in increased alignment between business and IT.

What about you? What factors come to your mind as SOA success secrets? Is your SOA experience in alignment with the statistics from the report McKendrick referenced? I would be interested to know.

Reposted with permission from CIO Magazine.

HP Distinguished Technologist, E.G.Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for the Open Group Cloud Computing Governance project. Twitter handle @NadhanAtHP.


Filed under Cloud/SOA

Top 5 Tell-tale Signs of SOA Evolving to the Cloud

By E.G. Nadhan, HP Enterprise Services

Rewind two decades and visualize what a forward-thinking prediction would have looked like then —  IT is headed towards a technology-agnostic, service-based application and infrastructure environment, consumed when needed, with usage-based chargeback models in place for elastic resources. A forward-thinking tweet would have simply said: IT is headed for the Cloud. These concepts evolved steadily, within applications first, with virtualization expediting their evolution within infrastructure across enterprises. Thus, IT has followed an evolutionary pattern over the years, forcing enterprises to continuously revisit their overall strategy.

What started as SOA has evolved into the Cloud.  Here are five tell-tale signs:

  • As-a-service model:  Exposing application interfaces as services in a standardized fashion was the technical foundation of SOA. This concept was slowly but steadily extended to the infrastructure environment, leading to IaaS and eventually, [pick a letter of your choice]aaS. Infrastructure components, provisioned as services, had to be taken into account as part of the overall SOA strategy. Given the vital role of IaaS within the Cloud, a holistic, enterprise-wide SOA strategy is essential for successful Cloud deployment.
  • Location transparency: Prior to service orientation, applications had to be aware of the logistics of information sources. Service orientation introduced location transparency so that the specifics of the physical location where the services were executed did not matter as much. Extending this paradigm, Cloud leverages the available resources as and when needed for execution of the services provided.
  • Virtualization: Service orientation acted as a catalyst for virtualization of application interfaces wherein the standardization of the interfaces was given more importance than the actual execution of the services. Virtualization was extended to infrastructure components facilitating their rapid provisioning as long as it met the experience expectations of the consumers.
  • Hardware: IaaS provisioning based on virtualization along with the partitioning of existing physical hardware into logically consumable segments resulted in hardware being shared across multiple applications. Cloud extends this notion into a pool of hardware resources being shared across multiple applications.
  • Chargeback: SOA was initially focused on service implementation, after which the focus shifted to SOA Governance and SOA Management, including the tracking of metrics and chargeback mechanisms. Cloud is following a similar model, which is why the challenges of metering and chargeback that IT is dealing with in the Cloud are fundamentally similar to those of monitoring service consumption across the enterprise.

These are my tell-tale signs. I would be very interested to know about practical instances of similar signs on your end.
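
To make the “location transparency” sign above concrete, here is a minimal, purely illustrative sketch of a registry that resolves a logical service name to whichever endpoint currently hosts it. The service name and endpoints are hypothetical, and a real deployment would rely on an actual service registry or broker rather than this toy class.

```python
# Illustrative sketch of location transparency: consumers address a service by
# logical name; a registry decides which physical endpoint actually serves them.
class ServiceRegistry:
    def __init__(self) -> None:
        self._endpoints: dict[str, list[str]] = {}

    def register(self, service: str, endpoint: str) -> None:
        self._endpoints.setdefault(service, []).append(endpoint)

    def resolve(self, service: str) -> str:
        # Selection could be by load, locality or policy; here we simply rotate.
        endpoints = self._endpoints[service]
        endpoints.append(endpoints.pop(0))
        return endpoints[-1]

registry = ServiceRegistry()
# Hypothetical endpoints: one on premises, one provisioned in the cloud.
registry.register("customer-lookup", "https://dc1.example.com/customer")
registry.register("customer-lookup", "https://cloud.example.com/customer")

# The consumer never hard-codes where the service runs.
print(registry.resolve("customer-lookup"))
print(registry.resolve("customer-lookup"))
```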

Figure 1: The Open Group Service Oriented Cloud Computing Infrastructure Technical Standard

It is no surprise that the very first Cloud technical standard published by The Open Group – Service Oriented Cloud Computing Infrastructure – initially started as the Service Oriented Infrastructure (SOI) project within The Open Group SOA Work Group. As its co-chair, I had requested extending SOI into The Open Group Cloud Work Group when it was formed, making it a joint project across both work groups. Today, you will see how the SOCCI technical standard calls out the evolution of SOI into SOCCI for the Cloud.

To find out more about the new SOCCI technical standard, please check out: http://www3.opengroup.org/news/press/open-group-publishes-new-standards-soa-and-cloud

 This blog post was originally posted on HP’s Technical Support Services Blog.

HP Distinguished Technologist, E.G.Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for the Open Group Cloud Computing Governance project. Twitter handle @NadhanAtHP.


Filed under Cloud, Cloud/SOA, Service Oriented Architecture, Standards

Open Group Security Gurus Dissect the Cloud: Higher or Lower Risk?

By Dana Gardner, Interarbor Solutions

For some, any move to the Cloud — at least the public Cloud — means a higher risk for security.

For others, relying more on a public Cloud provider means better security. There’s more of a concentrated and comprehensive focus on security best practices that are perhaps better implemented and monitored centrally in the major public Clouds.

And so which is it? Is Cloud a positive or negative when it comes to cyber security? And what of hybrid models that combine public and private Cloud activities, how is security impacted in those cases?

We posed these and other questions to a panel of security experts at last week’s Open Group Conference in San Francisco to deeply examine how Cloud and security come together — for better or worse.

The panel: Jim Hietala, Vice President of Security for The Open Group; Stuart Boardman, Senior Business Consultant at KPN, where he co-leads the Enterprise Architecture Practice as well as the Cloud Computing Solutions Group; Dave Gilmour, an Associate at Metaplexity Associates and a Director at PreterLex Ltd.; and Mary Ann Mezzapelle, Strategist for Enterprise Services and Chief Technologist for Security Services at HP.

The discussion is moderated by Dana Gardner, Principal Analyst at Interarbor Solutions. The full podcast can be found here.

Here are some excerpts:

Gardner: Is this notion of going outside the firewall fundamentally a good or bad thing when it comes to security?

Hietala: It can be either. Talking to security people in large companies, frequently what I hear is that with adoption of some of those services, their policy is either let’s try and block that until we get a grip on how to do it right, or let’s establish a policy that says we just don’t use certain kinds of Cloud services. Data I see says that that’s really a failed strategy. Adoption is happening whether they embrace it or not.

The real issue is how you do that in a planned, strategic way, as opposed to letting services like Dropbox and other kinds of Cloud Collaboration services just happen. So it’s really about getting some forethought around how do we do this the right way, picking the right services that meet your security objectives, and going from there.

Gardner: Is Cloud Computing good or bad for security purposes?

Boardman: It’s simply a fact, and it’s something that we need to learn to live with.

What I’ve noticed through my own work is a lot of enterprise security policies were written before we had Cloud, but when we had private web applications that you might call Cloud these days, and the policies tend to be directed toward staff’s private use of the Cloud.

Then you run into problems, because you read something in policy — and if you interpret that as meaning Cloud, it means you can’t do it. And if you say it’s not Cloud, then you haven’t got any policy about it at all. Enterprises need to sit down and think, “What would it mean to us to make use of Cloud services and to ask as well, what are we likely to do with Cloud services?”

Gardner: Dave, is there an added impetus for Cloud providers to be somewhat more secure than enterprises?

Gilmour: It depends on the enterprise that they’re actually supplying to. If you’re in a heavily regulated industry, you have a different view of what levels of security you need and want, and therefore what you’re going to impose contractually on your Cloud supplier. That means that the different Cloud suppliers are going to have to attack different industries with different levels of security arrangements.

The problem there is that the penalty regimes are always going to say, “Well, if the security lapses, you’re going to get off with two months of not paying” or something like that. That kind of attitude isn’t going to go in this kind of security.

What I don’t understand is exactly how secure Cloud provision is going to be enabled and governed under tight regimes like that.

An opportunity

Gardner: Jim, we’ve seen in the public sector that governments are recognizing that Cloud models could be a benefit to them. They can reduce redundancy. They can control and standardize. They’re putting in place some definitions, implementation standards, and so forth. Is the vanguard of correct Cloud Computing with security in mind being managed by governments at this point?

Hietala: I’d say that they’re at the forefront. Some of these shared government services, where they stand up Cloud and make it available to lots of different departments in a government, have the ability to do what they want from a security standpoint, not relying on a public provider, and get it right from their perspective and meet their requirements. They then take that consistent service out to lots of departments that may not have had the resources to get IT security right, when they were doing it themselves. So I think you can make a case for that.

Gardner: Stuart, being involved with standards activities yourself, does moving to the Cloud provide a better environment for managing, maintaining, instilling, and improving on standards than enterprise by enterprise by enterprise? As I say, we’re looking at a larger pool and therefore that strikes me as possibly being a better place to invoke and manage standards.

Boardman: Dana, that’s a really good point, and I do agree. Also, in the security field, we have an advantage in the sense that there are quite a lot of standards out there to deal with interoperability, exchange of policy, exchange of credentials, which we can use. If we adopt those, then we’ve got a much better chance of getting those standards used widely in the Cloud world than in an individual enterprise, with an individual supplier, where it’s not negotiation, but “you use my API, and it looks like this.”

Having said that, there are a lot of well-known Cloud providers who do not currently support those standards and they need a strong commercial reason to do it. So it’s going to be a question of the balance. Will we get enough specific weight of people who are using it to force the others to come on board? And I have no idea what the answer to that is.

Gardner: We’ve also seen that cooperation is an important aspect of security, knowing what’s going on on other people’s networks, being able to share information about what the threats are, remediation, working to move quickly and comprehensively when there are security issues across different networks.

Is that a case, Dave, where having a Cloud environment is a benefit? That is to say more sharing about what’s happening across networks for many companies that are clients or customers of a Cloud provider rather than perhaps spotty sharing when it comes to company by company?

Gilmour: There is something to be said for that, Dana. Part of the issue, though, is that companies are individually responsible for their data. They’re individually responsible to a regulator or to their clients for their data. The question then becomes that as soon as you start to share a certain aspect of the security, you’re de facto sharing the weaknesses as well as the strengths.

So it’s a two-edged sword. One of the problems we have is that until we mature a little bit more, we won’t be able to actually see which side is the sharpest.

Gardner: So our premise that Cloud is good and bad for security is holding up, but I’m wondering whether the same things that make you a risk in a private setting — poor adherence to standards, no good governance, too many technologies that are not being measured and controlled, not instilling good behavior in your employees and then enforcing it — wouldn’t be the same either way? Is it really Cloud or not Cloud, or is it good security practices or not good security practices? Mary Ann?

No accountability

Mezzapelle: You’re right. It’s a little bit of “garbage in, garbage out” if you don’t have the basic things in place in your enterprise, which means the policies, the governance cycle, the audit, and the tracking, because none of it matters if you don’t measure it and track it, and if there is no business accountability.

David said it — each individual company is responsible for its own security, but I would say that it’s the business owner that’s responsible for the security, because they’re the ones that ultimately have to answer that question for themselves in their own business environment: “Is it enough for what I have to get done? Is the agility more important than the flexibility in getting to some systems or the accessibility for other people, as it is with some of the ubiquitous computing?”

So you’re right. If it’s an ugly situation within your enterprise, it’s going to get worse when you do outsourcing, out-tasking, or anything else you want to call within the Cloud environment. One of the things that we say is that organizations not only need to know their technology, but they have to get better at relationship management, understanding who their partners are, and being able to negotiate and manage that effectively through a series of relationships, not just transactions.

Gardner: If data and sharing data are so important, it strikes me that the Cloud component is going to be part of that, especially if we’re dealing with business processes across organizations: doing joins, comparing and contrasting data, crunching it and sharing it, making data an actual part of the business, a revenue-generating activity. It all seems prominent and likely.

So to you, Stuart, what is the issue now with data in the Cloud? Is it good, bad, or just the same double-edged sword, and it just depends how you manage and do it?

Boardman: Dana, I don’t know whether we really want to be putting our data in the Cloud so much as putting the access to our data into the Cloud. There are all kinds of issues you’re going to run up against as soon as you start putting your source information out into the Cloud, not least privacy and that kind of thing.

A bunch of APIs

What you can do is simply say, “What information do I have that might be interesting to people?” If it’s a private Cloud in a large organization, how can I make that available to share elsewhere in the organization? Or maybe it’s really going out into public. What a government, for example, can be thinking about is making information services available, not just what you go and get from them that they have already published, but “this is the information,” a bunch of APIs if you like. I prefer to call them data services, and to make those available.

So, if you do it properly, you have a layer of security in front of your data. You’re not letting people come in and do joins across all your tables; you’re providing information. That does require you to engage your users in what it is that they want and what they want to do. Maybe there are people out there who want to take a bit of your information and a bit of somebody else’s and mash it together to provide added value. That’s great. Let’s go for that and not try to answer every possible question in advance.
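To make the “data services” idea concrete, here is a minimal sketch, in Python, of a service that returns only policy-approved fields to a caller instead of exposing the underlying tables. Every name in it (CustomerDataService, ROLE_POLICIES, the roles and fields) is a hypothetical illustration of the pattern Boardman describes, not anything discussed on the panel or an HP product.

```python
# Callers get curated information through a controlled interface,
# never direct access to the underlying tables.
from dataclasses import dataclass

# Which fields each caller role may see -- the "layer of security
# in front of your data".
ROLE_POLICIES = {
    "marketing_partner": {"region", "segment"},
    "internal_analyst": {"region", "segment", "lifetime_value"},
}

@dataclass
class CustomerRecord:
    customer_id: str
    region: str
    segment: str
    lifetime_value: float

class CustomerDataService:
    def __init__(self, records):
        self._records = records  # stands in for the real data store

    def query(self, caller_role: str):
        allowed = ROLE_POLICIES.get(caller_role)
        if allowed is None:
            raise PermissionError(f"unknown role: {caller_role}")
        # Return only permitted fields -- callers never see customer_id
        # or run arbitrary joins against the source data.
        return [
            {field: getattr(r, field) for field in sorted(allowed)}
            for r in self._records
        ]

service = CustomerDataService([
    CustomerRecord("c-001", "EMEA", "enterprise", 125_000.0),
])
print(service.query("marketing_partner"))   # region and segment only
```

The point of the sketch is that consumers negotiate against an information contract rather than against your schema.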

Gardner: Dave, do you agree with that, or do you think that there is a place in the Cloud for some data?

Gilmour: There’s definitely a place in the Cloud for some data. I get the impression that something like the insurance industry is going to emerge out of this, where you’ll have a secondary Cloud. You’ll have secondary providers who will provide to the front-end providers. They might do things like archiving and that sort of thing.

Now, if you have that situation where your contractual relationship is two steps away, then you have to be very confident and certain of your Cloud partner, and that relationship therefore has to encompass a very strong level of governance.

The other issue is that you then have the intersection of your governance requirements with the Cloud provider’s governance requirements. Therefore you have to have a really strongly — and I hate to use the word — architected set of interfaces, so that you can understand how that governance is actually going to operate.

Gardner: Wouldn’t data perhaps be safer in a Cloud than on a poorly managed network?

Mezzapelle: There is data in the Cloud and there will continue to be data in the Cloud, whether you want it there or not. The best organizations are going to start understanding that they can’t control it with the perimeter-like approach that we’ve been talking about getting away from for the last five or seven years.

So what we want to talk about is data-centric security, where you understand, based on role or context, who is going to access the information and for what reason. I think there is a better opportunity for services like storage, whether it’s for archiving or for near term use.

There are also other services that you don’t want to have to pay for 12 months out of the year, but that you might need independently. For instance, when you’re running a marketing campaign, you already share your data with some of your marketing partners. Or if you’re doing your payroll, you’re sharing that data through some of the national providers.

Data in different places

So there already is a lot of data in a lot of different places, whether you want Cloud or not, but the context is, it’s not in your perimeter, under your direct control, all of the time. The better you get at managing it wherever it is specific to the context, the better off you will be.
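As a rough illustration of the data-centric, context-aware access Mezzapelle describes, the following sketch decides access from attributes of the request — who is asking, for what purpose, and when — rather than from network location. The roles, purposes, and rules are invented assumptions, not anything stated by the panel.

```python
# Access decision travels with the data request, not with a perimeter.
from datetime import datetime

def allow_access(classification: str, role: str, purpose: str,
                 request_time: datetime) -> bool:
    """Decide access from who is asking, why, and for which data."""
    if classification == "public":
        return True
    if classification == "confidential":
        # Confidential data: named roles, stated purpose, business hours.
        return (
            role in {"payroll_provider", "data_owner"}
            and purpose in {"payroll_run", "audit"}
            and 6 <= request_time.hour < 20
        )
    # Anything else (e.g. "restricted") is denied by default.
    return False

print(allow_access("confidential", "payroll_provider", "payroll_run",
                   datetime(2012, 2, 1, 10, 30)))   # True
print(allow_access("confidential", "marketing_partner", "campaign",
                   datetime(2012, 2, 1, 10, 30)))   # False
```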

Hietala: It’s a slippery slope [when it comes to customer data]. That’s the most dangerous data to stick out in a Cloud service, if you ask me. If it’s personally identifiable information, then you get the privacy concerns that Stuart talked about. So to the extent you’re looking at putting that kind of data in a Cloud, you should be looking at the Cloud service and trying to determine whether you can apply some encryption and sensible security controls, to ensure that if that data gets loose, you’re not ending up in the headlines of The Wall Street Journal.
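One hedged way to picture that point is the sketch below: the personally identifiable fields are encrypted before the record ever leaves your environment, and the key stays with you. It uses the third-party Python cryptography package (Fernet); the field names and the upload step are assumptions for illustration only.

```python
# Encrypt PII before it reaches the Cloud service; keep the key local.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # lives in your own key management, not in the Cloud
cipher = Fernet(key)

record = {"customer": "Jane Doe", "card_last4": "4242", "segment": "retail"}

# Encrypt the identifying fields; leave non-sensitive fields usable.
protected = {
    "customer": cipher.encrypt(record["customer"].encode()).decode(),
    "card_last4": cipher.encrypt(record["card_last4"].encode()).decode(),
    "segment": record["segment"],
}

payload = json.dumps(protected)   # this is what the Cloud provider stores
# ... hypothetical step: upload payload to the Cloud service ...

# Only holders of the key can recover the original values.
print(cipher.decrypt(protected["customer"].encode()).decode())   # "Jane Doe"
```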

Gardner: Dave, you said there will be different levels on a regulatory basis for security. Wouldn’t that also play with data? Wouldn’t there be different types of data and therefore a spectrum of security and availability to that data?

Gilmour: You’re right. If we come back to Facebook as an example, Facebook is data that, even if it’s data about our known customers, it’s stuff that they have put out there of their own will. The data that they give us, they have given to us for a purpose, and it is not for us then to distribute that data or make it available elsewhere. The fact that it may be the same data is not relevant to the discussion.

Three-dimensional solution

That’s where I think we are going to end up with not just one layer or two layers. We’re going to end up with a sort of three-dimensional solution space. We’re going to work out exactly which chunk we’re going to handle in which way. There will be significant areas where these things cross over.

The other thing we shouldn’t forget is that data includes our software, and that’s something that people forget. Software nowadays is out in the Cloud, under current ways of running things, and you don’t even always know where it’s executing. So if you don’t know where your software is executing, how do you know where your data is?

It’s going to have to be handled one way or another, and I think it’s going to be one of these things where it’s going to be shades of gray, because it cannot be black and white. The question is going to be, what’s the threshold shade of gray that’s acceptable?

Gardner: Mary Ann, to this notion of the different layers of security for different types of data, is there anything happening in the market that you’re aware of that’s already moving in that direction?

Mezzapelle: The experience that I have is mostly in some of the business frameworks for particular industries, like healthcare and what it takes to comply with the HIPAA regulation, or in the financial services industry, or in consumer products where you have to comply with the PCI regulations.

There has continued to be an issue around information lifecycle management, which is categorizing your data. Within a company, you might have had a document that you coded private, confidential, top secret, or whatever. So you might have had three or four levels for a document.

You’ve already talked about how complex it’s going to be as you move into trying to understand, not only that data, but the fact that the name Mary Ann Mezzapelle happens to be in five or six different business systems and in over 100 instances around the world.

That’s the importance of something like an Enterprise Architecture that can help you understand that you’re not just talking about the technology components, but the information, what it means, and how it is prioritized or critical to the business, which sometimes comes up in a business continuity plan from a system point of view. That’s where I’ve advised clients to start looking at how they connect business criticality with a piece of information.

One last thing. Those regulations don’t necessarily mean that you’re secure. They make for good basic health, but that doesn’t mean the data is ultimately protected. You have to do a risk assessment based on your own environment, the bad actors that you expect, and the priorities based on that.
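A toy sketch of the classification idea above might look like the following, where each level carries the handling rules and business criticality that should travel with the information wherever it lives. The levels and rules are illustrative assumptions for one hypothetical organization.

```python
# Classification levels map to handling rules that follow the data.
CLASSIFICATION_POLICY = {
    "public":       {"criticality": "low",    "encrypt_at_rest": False, "cloud_ok": True},
    "confidential": {"criticality": "medium", "encrypt_at_rest": True,  "cloud_ok": True},
    "restricted":   {"criticality": "high",   "encrypt_at_rest": True,  "cloud_ok": False},
}

def handling_rules(classification: str) -> dict:
    """Look up how a piece of information must be handled."""
    try:
        return CLASSIFICATION_POLICY[classification]
    except KeyError:
        # Unclassified data gets the most restrictive treatment by default.
        return CLASSIFICATION_POLICY["restricted"]

print(handling_rules("confidential"))
print(handling_rules("unknown"))   # falls back to the strictest rules
```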

Leaving security to the end

Boardman: I just wanted to pick up here, because Mary Ann spoke about Enterprise Architecture. One of my bugbears — and I call myself an enterprise architect — is that we have a terrible habit of leaving security to the end. We don’t architect security into our Enterprise Architecture. It’s treated as a techie thing that we’ll fix at the back end. There are also people in the security world who are techies, and they think that they will do it that way as well.

There was an activity to look at bringing the SABSA Methodology from the security world together with TOGAF®, and a white paper on it was published a few weeks ago.

The Open Group has been doing some really good work on bringing security right in to the process of EA.

Hietala: In the next version of TOGAF, work on which has already started, there will be a whole emphasis on making sure that security is better represented in the TOGAF guidance. That’s ongoing work here at The Open Group.

Gardner: As I listen, it sounds as if the in-the-Cloud-or-out-of-the-Cloud security continuum is perhaps the wrong way to look at it. If you have a lifecycle approach to services and to data, then you’ll have a way in which you can approach data uses for certain instances and certain requirements, and that would then apply to a variety of different private Cloud, public Cloud, and hybrid Cloud scenarios.

Is that where we need to go, perhaps have more of this lifecycle approach to services and data that would accommodate any number of different scenarios in terms of hosting access and availability? The Cloud seems inevitable. So what we really need to focus on are the services and the data.

Boardman: That’s part of it. That needs to be tied in with the risk-based approach. If we have done that, we can then pick up on that information and look at a concrete situation: what have we got here, and what do we want to do with it? We can then compare that information and assess our risk based on what we have done around the lifecycle. We can understand specifically what we might be thinking about putting where, and come up with a sensible risk approach.

You may come to the conclusion in some cases that the risk is too high and the mitigation too expensive. In others, you may say, no, because we understand our information and we understand the risk situation, we can live with that, it’s fine.

Gardner: It sounds as if we are coming at this as an underwriter for an insurance company. Is that the way to look at it?

Current risk

Gilmour: That’s eminently sensible. You have the mortality tables, you have the current risk, and you just work the two together and work out what the premium is. That’s probably a very good paradigm to give us guidance as to how we should intellectually approach the problem.

Mezzapelle: One of the problems is that we don’t have those actuarial tables yet. That’s a little bit of an issue for a lot of people when they say, “I’ve got $100 to spend on security. Where am I going to spend it this year? Am I going to spend it on firewalls? Am I going to spend it on an information lifecycle management assessment? What am I going to spend it on?” Some of the research that we have been doing at HP is trying to turn that into something that’s more of a statistic.

So, when you have a particular project that does a certain kind of security implementation, you can see what the business return on it is and how it actually lowers risk. We found that it’s better to spend your money on getting a better system to patch your systems than it is to do some other kind of content filtering or something like that.
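For readers who want to see the arithmetic behind that kind of comparison, here is a back-of-the-envelope sketch using the common annualized-loss-expectancy formulation (ALE = expected incidents per year × cost per incident) to rank two candidate investments. The figures are invented for illustration and do not come from the HP research mentioned above.

```python
# Rank security investments by return on security investment (ROSI).
def ale(frequency_per_year: float, cost_per_incident: float) -> float:
    """Annualized loss expectancy."""
    return frequency_per_year * cost_per_incident

def return_on_security_investment(ale_before: float, ale_after: float,
                                  spend: float) -> float:
    """Risk reduction gained, net of cost, per unit of money spent."""
    return (ale_before - ale_after - spend) / spend

options = {
    "better patching":   {"before": ale(4, 50_000),   "after": ale(1, 50_000),   "spend": 40_000},
    "content filtering": {"before": ale(2, 30_000),   "after": ale(1.5, 30_000), "spend": 25_000},
}

for name, o in options.items():
    rosi = return_on_security_investment(o["before"], o["after"], o["spend"])
    print(f"{name}: ROSI = {rosi:.2f}")   # patching 2.75, filtering -0.40
```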

Gardner: Perhaps what we need is the equivalent of an Underwriters Laboratories (UL) for permeable organizational IT assets, where the security stamp of approval comes in high or low. Then you could get your insurance insight. Maybe that’s something for The Open Group to look into. Any thoughts about how standards and a consortium approach would come into that?

Hietala: I don’t know about the UL for all security things. That sounds like a risky proposition.

Gardner: It could be fairly popular and remunerative.

Hietala: It could.

Mezzapelle: An unending job.

Hietala: I will say we have one active project in the Security Forum that is looking at trying to allow organizations to measure and understand risk dependencies that they inherit from other organizations.

So if I’m outsourcing a function to XYZ Corporation, I want to be able to measure what risk I’m inheriting from them by virtue of them doing some IT processing for me. That could be a Cloud provider, or it could be somebody doing a business process for me. So there’s work going on there.

I heard just last week about an NSF-funded project here in the U.S. to do the same sort of thing, to look at trying to measure risk in a predictable way. So there are things going on out there.
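As a purely illustrative sketch of what measuring inherited risk could mean in practice, one might weight each partner’s assessed risk by how much of your processing depends on them. The partner names, scores, and weights below are made up; the real Security Forum and NSF work referenced above is not described here.

```python
# Toy "inherited risk" roll-up: partner risk weighted by dependence.
suppliers = [
    # (name, partner risk score 0-1, share of your processing they handle)
    ("XYZ Cloud provider", 0.30, 0.50),
    ("Payroll processor",  0.10, 0.15),
    ("Marketing platform", 0.45, 0.10),
]

inherited_risk = sum(score * share for _, score, share in suppliers)
print(f"Inherited risk exposure: {inherited_risk:.2f}")   # 0.21 on this toy scale
```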

Gardner: We have to wrap up, I’m afraid, but Stuart, it seems as if currently it’s the larger public Cloud providers, Amazon and Google among others, that might be playing the role of all of these entities we are talking about. They are their own self-insurer. They are their own underwriter. They are their own risk assessor, like a UL. Do you think that’s going to continue to be the case?

Boardman: No. I think that as Cloud adoption increases, you will have a greater weight of consumer organizations who will need to do that themselves. If you look at the question, it’s not just responsibility, but also accountability. At the end of the day, you’re always accountable for the data that you hold. It doesn’t matter where you put it or how many other parties it gets subcontracted out to.

The weight will change

So there’s a need to have that, and as the adoption increases, there’s less fear and more, “Let’s do something about it.” Then, I think the weight will change.

Plus, of course, there are other parties coming into this world, the world that Amazon has created. I’d imagine that HP is probably one of them as well, but all the big names in IT are moving in here, and I suspect that for those companies there’s also a differentiator in knowing how to do this properly, given their history of enterprise involvement.

So yeah, I think it will change. That’s no offense to Amazon, etc. I just think that the balance is going to change.

Gilmour: Yes. I think that’s how it has to go. The question that then arises is, who is going to police the policeman, and how is that going to happen? Every company is going to be using the Cloud. Even the Cloud suppliers are using the Cloud. So how is it going to work? It’s one of these ever-decreasing circles.

Mezzapelle: At this point, I think it’s going to be more evolution than revolution, but I’m also one of the people who’ve been in that part of the business — IT services — for the last 20 years and have seen it morph in a slightly different way.

Stuart is right that there’s going to be a convergence of the consumer-driven, cloud-based model, which Amazon and Google represent, with an enterprise approach that corporations like HP are representing. It’s somewhere in the middle where we can bring the service level commitments, the options for security, the options for other things that make it more reliable and risk-averse for large corporations to take advantage of it.

Dana Gardner is president and principal analyst at Interarbor Solutions, an enterprise IT analysis, market research, and consulting firm. Gardner, a leading identifier of software and Cloud productivity trends and new IT business growth opportunities, honed his skills and refined his insights as an industry analyst, pundit, and news editor covering the emerging software development and enterprise infrastructure arenas for the last 18 years.


Filed under Cloud, Cloud/SOA, Conference, Cybersecurity, Information security, Security Architecture

5 Tips Enterprise Architects Can Learn from the Winchester Mystery House

By E.G.Nadhan, HP Enterprise Services

Not far from where The Open Group Conference was held in San Francisco this week is the Winchester Mystery House, once the personal residence of Sarah Winchester, widow of the gun magnate William Wirt Winchester. It took 38 years to build this house. Extensions and modifications were primarily based on a localized requirement du jour. Today, the house has several functional abnormalities that have no practical explanation.

To build a house right, you need a blueprint that details what is to be built, where, why and how based on the home owner’s requirements (including cost). As the story goes, Sarah Winchester’s priorities were different. However, if we don’t follow this systematic approach as enterprise architects, we are likely to land up with some Winchester IT houses as well.

Or, have we already? Enterprises are always tempted to address the immediate problem at hand with surprisingly short timelines. Frequent implementations of sporadic, tactical additions evolve into a Winchester Architecture. Right or wrong, Sarah Winchester did this by choice. If enterprises of today land up with such architectures, it can only be by chance and not by choice.

So, here are my tips to architect by choice rather than chance:

  • Establish your principles: Fundamental architectural principles must be in place that serve as a rock solid foundation upon which architectures are based. These principles are based on generic, common-sense tenets that are refined to apply specifically to your enterprise.
  • Install solid governance: The appropriate level of architectural governance must be in place with the participation from the stakeholders concerned. This governance must be exercised, keeping these architectural principles in context.
  • Ensure business alignment: After establishing the architectural vision, Enterprise Architecture must lead with a clear definition of the over-arching business architecture, which defines the manner in which the other architectural layers are realized. Aligning business to IT is one of the primary responsibilities of an enterprise architect.
  • Plan for continuous evaluation: Enterprise Architecture is never really done. There are constant triggers (internal and external) for implementing improvements and extensions. Consumer behavior, market trends and technological evolution can trigger aftershocks within the foundational concepts that the architecture is based upon.

Thus, it is interesting that The Open Group conference was miles away from the Winchester House. By choice, I would expect enterprise architects to go to The Open Group Conference. By chance, if you do happen by the Winchester House and are able to relate it to your Enterprise Architecture, please follow the tips above to architect by choice, and not by chance.

If you have instances where you have seen the Winchester pattern, do let me know by commenting here or following me on Twitter @NadhanAtHP.

This blog post was originally posted on HP’s Transforming IT Blog.

HP Distinguished Technologist, E.G.Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for the Open Group Cloud Computing Governance project. Twitter handle @NadhanAtHP.


Filed under Enterprise Architecture, TOGAF®

First Technical Standard for Cloud Computing – SOCCI

By E.G. Nadhan, HP

The Open Group just announced the availability of its first Technical Standard for the Cloud – Service Oriented Cloud Computing Infrastructure Framework (SOCCI), which outlines the concepts and architectural building blocks necessary for infrastructures to support SOA and Cloud initiatives. HP has played a leadership role in the definition and evolution of this standard within The Open Group.


As a platinum member of The Open Group, HP’s involvement started with the leadership of the Service Oriented Infrastructure project that I helped co-chair. As the Cloud Computing Working Group started taking shape, I suggested expanding this project into the working group, which resulted in the formation of the Service Oriented Cloud Computing Infrastructure project. This project was co-chaired by Tina Abdollah of IBM and myself and operated under the auspices of both the SOA and Cloud Computing Working Groups.

Infrastructure has been traditionally provisioned in a physical manner. With the evolution of virtualization technologies and application of service-orientation to infrastructure, it can now be offered as a service. SOCCI is the realization of an enabling framework of service-oriented components for infrastructure to be provided as a service in the cloud.

“Service Oriented Cloud Computing Infrastructure (SOCCI) is a classic intersection of multiple paradigms in the industry – infrastructure virtualization, service-orientation and the cloud – an inevitable convergence,” said Tom Hall, Global Product Marketing Manager, Cloud and SOA Applications, HP Enterprise Services. “HP welcomes the release of the industry’s first cloud computing standard by The Open Group. This standard provides a strong foundation for HP and The Open Group to work together to evolve additional standards in the SOA and Cloud domains.”

This standard can be leveraged in one or more of the following ways:

  • Comprehend service orientation and Cloud synergies
  • Extend adoption of traditional and service-oriented infrastructure in the Cloud
  • Leverage consumer, provider and developer viewpoints
  • Incorporate SOCCI building blocks into Enterprise Architecture
  • Implement Cloud-based solutions using different infrastructure deployment models
  • Realize business solutions referencing the SOCCI Business Scenario
  • Apply Cloud governance considerations and recommendations

The Open Group also announced the availability of the SOA Reference Architecture, a blueprint for creating and evaluating SOA solutions.

Standards go through a series of evolution phases as I outline in my post on Evolution of IaaS standards.  The announcement of the SOCCI Technical Standard will give some impetus to the evolution of IaaS standards in the Cloud somewhere between the experience and consensus phases.

It was a very positive experience co-chairing the evolution of the SOCCI standard within The Open Group, working with member companies from several enterprises with varied perspectives.

Have you taken a look at this standard?  If not, please do so.  And for those who have, where and how do you think this standard could be adopted?  Are there ways that the standard can be improved in future releases to make it better suited for broader adoption?  Please let me know!

This blog post was originally posted on HP’s Enterprise Services Blog.

HP Distinguished Technologist, E.G.Nadhan has over 25 years of experience in the IT industry across the complete spectrum of selling, delivering and managing enterprise level solutions for HP customers. He is the founding co-chair for The Open Group SOCCI project and is also the founding co-chair for the Open Group Cloud Computing Governance project.


Filed under Cloud, Cloud/SOA, Service Oriented Architecture, Standards