Tag Archives: Boundaryless Information Flow

Enterprise Architects “Know Nothing”: A Conversation with Ron Tolido

By The Open Group

It has been well documented that the digital economy is sending many companies—not to mention industries—into a tailspin. Customer expectations, demands for innovation and rapid change are creating an IT landscape that is not only difficult to manage but nearly impossible to predict. And according to Capgemini’s Ron Tolido, Enterprise Architects need to prepare to function in a world where they have no idea what type of solutions and innovations their clients may need, even in the near future—a world where Enterprise Architects “know nothing.”

Tolido, who spoke at The Open Group London 2016 in April, believes organizations must begin to look to “I don’t know” architectures if they are to survive in the digital economy. Traditional IT methods and architectural practices that were established during the 1980s and 1990s are no longer relevant in the digital age.

Because customer and business needs are constantly changing, there really is no way to know what IT landscapes will look like in the future or what type of solutions organizations will need, Tolido says. Therefore, rather than asking clients what they need, IT must instead provide users an architected platform of services that can be mixed and matched to meet a variety of needs, enabling business customers to go in any direction they want.

As such, Tolido says Enterprise Architects in this emerging digital era are comparable to the character Jon Snow from HBO’s Game of Thrones, a character who is often told “You know nothing.” Like Jon Snow, today’s Enterprise Architects effectively know nothing because businesses have no idea what the future will hold, whether two days or ten years from now. With new business scenarios developing in real time, architectures can no longer be painstakingly planned or designed.

So where does that leave Enterprise Architects? What can they offer in a world where they know nothing and are heading blindly into an environment that is constantly in flux?

Tolido says it’s time for Enterprise Architects to stop trying to predict what architectures should look like and instead provide the business a digital platform that will allow for a new style of architecting, one that drives continuous transformation rather than requirements-driven, step-by-step change.

To do this, Tolido says Enterprise Architects must enable “the art of the possible” within organizations, providing their clients with a catalog of possibilities—a listing of potential things they could be doing to help companies continually transform themselves.

This is a huge shift for most IT departments, Tolido says, which are still stuck in the mindset that the business is different from IT and that business requirements must drive IT initiatives, with architecture sitting somewhere between the two. No longer can architects be content to place architectures somewhere in the middle between the business and IT, Tolido says, because in the next generation of IT—the era of the platform—there is no distinction between business and IT. They are one and the same. With the “third platform”—or Open Platform 3.0™—the platform must allow the business to continually adapt to the needs of customers and market forces.

This brave new world will also require Enterprise Architects to become more adaptable themselves and give up control of their architectures, Tolido says. The role of architects is evolving: they are becoming business enablers, or platform “maesters.”

Currently, many established enterprises are having a difficult time adjusting to this new reality—hence all the digital disruption we are seeing across industries, Tolido says. Start-ups and newer technology players have an advantage here because they are already in a state of change and their systems have been designed to deal with it.

One way enterprises can make transformation easier on themselves, Tolido suggests, is to create a “parallel IT universe” alongside their existing systems that explores a more service-oriented model and allows them to transition. Although such a system might cannibalize existing services or products, it may also be the only way to keep up with disruptive market forces. “Better to eat yourself and be your own disruptor than have someone else do it to you,” Tolido says.

As “platform maesters,” Enterprise Architects will also need to become much more proactive in helping company stakeholders understand the necessity of a platform play for continuous business transformation. That means showing that the EA role is now much more about designing a continuously enabling platform than about designing individual solutions. Tolido believes EAs must also become better at telling the digital story and outlining the business possibilities that services can enable. “They need to become real change agents. This will require more imagination from architects as well.”

Enabling unhindered, continuous transformation may actually allow businesses to move closer to The Open Group vision of Boundaryless Information Flow™, Tolido says. Standards will have a significant role to play here because companies designing platforms that allow for constant change will need the help of standards. The work being done in The Open Group Open Platform 3.0 Forum can help organizations better understand what open platforms designed for microservices and ad hoc application composition will look like. For example, Tolido says, the concept of the Open Business Data Lake—an environment that combines services, data retrieval and storage in a fluid way to provide dynamic outlets and uses for the data—is an indicator of how different the landscape will look. “Standards are crucial for helping people understand how that landscape should look and giving guidance as to how organizations can work with microservices and agility,” Tolido says.

Despite all the upheaval going on at companies and in IT today, Tolido believes these are exciting times for IT because the discipline is going through a revolution that will affect everything that businesses do. Although it may take some adjustment for Enterprise Architects, Tolido says the new landscape will provide a lot of compelling challenges for architects who accept that they know “nothing,” go with the flow and adapt to uncertainty.

“It’s a new world. There’s more change than you can absorb right now. Better enjoy the ride.”

@theopengroup

By The Open Group

Ron Tolido is Senior Vice President and Chief Technology Officer of Application Services Continental Europe, Capgemini. He is also a Director on The Open Group Governing Board and blogger for Capgemini’s multiple award-winning CTO blog, as well as the lead author of Capgemini’s TechnoVision and the global Application Landscape Reports. As a noted Digital Transformation ambassador, Tolido speaks and writes about IT strategy, innovation, applications and architecture. Based in the Netherlands, Mr. Tolido currently takes interest in apps rationalization, Cloud, enterprise mobility, the power of open, Slow Tech, process technologies, the Internet of Things, Design Thinking and – above all – radical simplification.

 

7 Comments

Filed under Boundaryless Information Flow™, Business Transformation, Data Lake, digital technologies, Enterprise Architecture, Enterprise Transformation, Internet of Things, IoT, Open Platform 3.0, Ron Tolido, Standards, The Open Group

The Open Group San Francisco 2016 Day One Highlights

By Loren K. Baynes, Director, Global Communications, The Open Group

On Monday, January 25, The Open Group kicked off its first event of 2016, focused on Enabling Boundaryless Information Flow™, at the Marriott Union Square in the city by the bay, San Francisco, California.

President and CEO Steve Nunn gave a warm welcome to over 250 attendees from 18 countries, including Botswana, China and The Netherlands. He introduced the morning’s plenary, which centered on Digital Business and the Customer Experience. This year also marks a major milestone for The Open Group, which is celebrating its 20th anniversary in 2016.

The Open Group Director of Interoperability Dr. Chris Harding kicked off the morning’s event speaking on “Doing Digital Business.”

Digital technology is transforming business today. As such, architecting for and delivering a better customer experience is more critical for businesses than ever before. For thousands of years, most business transactions happened face-to-face, with human interaction at the heart of them. The Internet has changed that, largely taking humans out of the equation in favor of “intelligent” programs that provide customer service. For Enterprise Architects, the challenge now is to create corporate systems and personas that mimic human interaction to provide better service levels. To achieve that, Harding says, companies are currently looking at a number of improved models, including microservices, Cloud architectures and data lakes.

To better enable the transformation toward digital customer experiences, The Open Group Open Platform 3.0™ Forum is currently working on an interoperability standard to support a variety of services that run on digital platforms. In addition, the Digital Business and Customer Experience Work Group—a joint work group of Open Platform 3.0 and the Architecture Forums—is currently working on customer-based architectures, as well as a whitepaper geared toward enabling better customer experiences for digital business.

In the second session of the morning, Mark Skilton of PA Consulting addressed the issue of “The Battle for Owning the Digital Spaces”. Skilton says that in this era of unprecedented digital information, we need to better understand all of that information in order to create business opportunities—however, much of that information is contained in the “gray” spaces in between interactions. Accessing that kind of data provides opportunities for businesses to get a better handle on how to provide digital experiences that will draw customers. It also requires “ecosystem” thinking where what is happening on both the micro and macro levels should be considered.

As such, companies must reconsider what it means to be an enterprise, platform or even a service. This requires a new way of looking at architectures that combines both physical and virtual environments to take advantage of those “gray” spaces in people’s lives. By interconnecting or “flattening out” people’s experiences—such as their work, living, commercial or social spaces—people will be able to take their digital experiences with them throughout their lives. To enable this moving forward, architects will need to change their mindsets and consider experience, not just architecture. The human factors of advanced customer experience—behavior, interactivity, psychology, usability—will need to figure more prominently in the architecture development process to create more connected spaces that meet people’s needs.

Trevor Cheung, Vice President Strategy & Architecture Practice for Huawei Global Services, spoke next on “Architecting for Customer Experience.” Cheung introduced the concept of the ROADS Experience, a principle for designing customer-driven architectures. According to Cheung, ROADS (Real-time, On-demand, All-online, DIY and Social) is critical for companies that want to become digital service providers. As organizations digitalize, they should think more holistically about customer experiences—including both internal (employees) and external audiences (customers, partners, etc.)—moving from an inside-out IT perspective to one that also considers outside-in constituencies.

For example, to provide omni-channel experiences, business architectures must focus on the values of stakeholders across the ecosystem—from buyers and their interests, to partners and suppliers or operations. By applying the ROADS principle, each stakeholder, or persona, can be considered along the way to develop an architecture blueprint that covers all channels and experiences, mapping the needs back to the technologies needed to provide specific capabilities. Currently two whitepapers are being developed in the Digital Business and Customer Experience Work Group that explore these issues, including a new reference model for customer architectures.

In the last morning session Jeff Matthews, Director of Venture Strategy and Research, Space Frontier Foundation, presented “The Journey to Mars is Powered by Data: Enabling Boundaryless Information Flow™ within NASA.” Currently, NASA’s programs, particularly its endeavors to send people to Mars, are being enabled by complex Enterprise Architectures that govern each of the agency’s projects.

According to Matthews, nothing goes through NASA’s planning without touching Enterprise Architecture. Although the agency has a relatively mature architecture, it is continually working to break down silos to make its architectures more boundaryless.

Ideally, NASA believes, removing boundaries will give it better access to the data it needs, allowing the agency to evolve to a more modular architecture. In addition, the agency is looking at a new decision-making operating model to help it grapple with buying technologies and setting up architectures now for programs planned 10 to 30 years in the future. To that end, Matthews encouraged audience members and vendors to reach out to him to talk about architectural strategies.

In addition to the event proceedings, The Open Group also hosted the inaugural meeting of the TOGAF® User Group on Monday. Aimed at bringing together TOGAF users and stakeholders to share information, best practices and learning, the day-long meeting featured topics on how to better use TOGAF in practice. Attendees participated in a number of breakout sessions on the standard, intended to provide opportunities to share experiences, show others how to best use TOGAF and suggest how the standard can be improved in the future.

Allen Brown, current interim CEO of the Association of Enterprise Architects (AEA), and former CEO of The Open Group, also introduced the AEA Open Badges Program for Professional Development. Much like badge programs for the Boy or Girl Scouts, the Open Badge program lets people demonstrate their professional achievements via digital badges that provide credentials for skills or achievements learned. Moving forward, the AEA will be providing digital badges, each of which will include embedded information showing what was learned to earn the badge. Attendees can earn badges for attending this conference. For more information, email OpenBadges@GlobalAEA.org.

Monday’s afternoon sessions were split into two tracks, centered on Open Platform 3.0™ and Risk, Dependability and Trusted Technology. The Open Platform 3.0 track continued in the same vein as the morning’s sessions, looking at how Enterprise Architectures must adapt to the changes brought by digitalization and growing customer expectations. Accenture Enterprise Architect Syed Husain gave an insightful presentation on enabling contextual architectures and increased personalization using artificial intelligence. As both consumers and technology become increasingly sophisticated, demand for experiences tailored to individual preferences is growing. Companies that want to keep up will need to take these demands into account as they evolve their infrastructures. In the Security track, sessions centered on privacy governance, best practices for adopting the Open FAIR Risk Management standard, dealing with cyber security risks and navigating the matrix of data classification to maximize data protection practices.

Concluding the day was an evening reception where event and TOGAF User Group attendees mixed, mingled and networked. The reception featured The Open Group Partner Pavilion, as well as short presentations from The Open Group Architecture, IT4IT™ and Open Platform 3.0 Forums.

@theopengroup #ogSFO

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog, media relations and social media. Loren has over 20 years’ experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.

Comments Off on The Open Group San Francisco 2016 Day One Highlights

Filed under Boundaryless Information Flow™, Cloud, EA, Enterprise Architecture, Internet of Things, Interoperability, IoT, Open Platform 3.0, Standards, Steve Nunn, The Open Group, The Open Group San Francisco 2016, TOGAF®, Uncategorized

The Open Group to Hold Next Event in San Francisco

The Open Group, the vendor-neutral IT consortium, is hosting its next event in San Francisco January 25-28. The Open Group San Francisco 2016 will focus on how Enterprise Architecture is empowering companies to build better systems by architecting for digital business strategies. The event will go into depth on this topic through various individual sessions and keynotes.

Some of the many topics of discussion at the event include Business Architecture; how to architect systems using tools and frameworks such as TOGAF® and ArchiMate® (both Open Group standards); Social, Mobile, Analytics and Cloud (SMAC); Risk Management and Cybersecurity; Business Transformation; Professional Development; and improving the security and dependability of IT, including the global supply chain on which it relies.

Key speakers at the event, taking place at San Francisco’s Marriott Union Square, include:

  • Steve Nunn, President & CEO, The Open Group
  • Trevor Cheung, VP Strategy and Architecture Practice, Huawei Global Services
  • Jeff Matthews, Director of Venture Strategy and Research, Space Frontier Foundation
  • Ajit Gaddam, Chief Security Architect, Visa
  • Eric Cohen, Chief Enterprise Architect, Thales
  • Heather Kreger, Distinguished Engineer, CTO International Standards, IBM

Full details on the range of track speakers at the event can be found here.

There will also be the inaugural TOGAF® User Group meeting, taking place on January 25. Facilitated breakout sessions will bring together key stakeholders and users to share best practices and information and to learn from each other.

Other subject areas at the three-day event will include:

  • Open Platform 3.0™ – The Customer Experience and Digital Business
  • IT4IT – Managing the Business of IT. Case study presentations and a vendor panel to discuss the release of The Open Group IT4IT Reference Architecture Version 2.0 standard
    • Plus deep dive presentations into the four streams of the IT Value Chain along with the latest information on the IT4IT training and certification program.
  • EA & Business Transformation – Understand what role EA, as currently practiced, plays in Business Transformation, especially transformations driven by emerging and disruptive technologies.
  • Risk, Dependability & Trusted Technology – The cybersecurity connection – securing the global supply chain.
  • Enabling Healthcare
  • TOGAF® 9 and ArchiMate® – Case studies and the harmonization of the standards.
  • Understand how to develop better interoperability & communication across organizational boundaries and pursue global standards for Enterprise Architecture that are highly relevant to all industries.

Registration for The Open Group San Francisco is open now, is available to members and non-members, and can be found here.

Join the conversation @theopengroup #ogSFO

Comments Off on The Open Group to Hold Next Event in San Francisco

Filed under Boundaryless Information Flow, The Open Group, The Open Group San Francisco 2016, TOGAF, Uncategorized

The Open Group Edinburgh 2015 Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

On Monday, October 19, Allen Brown, President and CEO of The Open Group, welcomed over 230 attendees from 26 countries to the Edinburgh International Conference Centre, located in the heart of historic Edinburgh, Scotland.

Allen kicked off the morning with an overview of company achievements and third-quarter activities. The Open Group has over 500 member organizations in 42 countries, with the newest members coming from Peru and Zambia. Allen provided highlights of the many activities of our Forums and Work Groups. Too many to list in full, but white papers, guides, snapshots and standards have been published and more are in development. The newest Work Group is Digital Business Strategy and Customer Experience. The UDEF Work Group is now named the O-DEF (Open – Data Element Framework) Work Group. The Real Time and Embedded Systems Forum is becoming more focused on critical systems and high assurance. Our members and staff have been very productive as always!

The morning plenary featured the theme “Architecting Business Transformation” with BAES Submarines. Speakers were Stephen Cole, CIO, BAE Systems Maritime Submarines; John Wilcock, Head of Operations Transformation, BAE Systems Submarine Solutions; Matthew Heard, Senior Operations Engineer, BAE Systems Maritime Submarines; and Paul Homan, Enterprise Architect, IBM. The presentation included a history of BAES Submarines and a case study on using TOGAF® to help define BAE’s strategy for transforming its operations and production functions. The speakers all advocated continuing to drive change and transformation through TOGAF principles. TOGAF has provided a structured, standardized approach to solving functional problems, and it ultimately allows organizations to document and measure their success along the way in meeting business objectives.

Following the keynotes, all presenters joined Allen for a panel consisting of an engaging Q&A with the audience.

Paul Homan, John Wilcock, Matthew Heard, Stephen Cole, Allen Brown

In the afternoon, the agenda offered several tracks on Risk, Dependability and Trusted Technology; EA and Business Transformation and Open Platform 3.0™.

One of the many sessions was “Building the Digital Enterprise – from Digital Disruption to Digital Experience” with Mark Skilton, Digital Expert, and Rob Mettler, Director of Digital Business, both with PA Consulting. The speakers discussed the new Open Group Work Group, Digital Business and Customer Experience, which is in the early stages of researching and developing a framework for the digital boom and the new kind of ecosystem it creates. The group is examining how the channels of 15 years ago compare to today’s multi-device, multi-channel world, which requires new thinking and processes, while “always keeping in mind, customer behavior is key”.

The evening concluded with a networking Partner Pavilion (IT4IT™, Open Platform 3.0™ and Enterprise Architecture) and a whisky tasting by the Scotch Whisky Heritage Centre.

Tuesday, October 20th began with another warm Open Group welcome by Allen Brown.

Allen and Ron Ashkenas, Senior Partner, Schaffer Consulting presented “A 20-year Perspective on the Boundaryless Organization and Boundaryless Information Flow™. The More Things Change, the More They Stay the Same”.

Ron shared the story of how the book “The Boundaryless Organization” came about and was published in 1995. He discussed his experiences working with Jack Welch to move GE (General Electric) forward, pondering questions such as “can staff and teams be more nimble without boundaries and layers?”. After much discussion, the concept of ‘boundaryless’ was born. The book showed companies how to sweep away the artificial obstacles – such as hierarchy, turf, and geography – that get in the way of outstanding business performance. The presentation was a great retrospective of boundaryless and The Open Group, but the pair also explored how boundaryless fits today in light of a changing world. The vision of The Open Group is Boundaryless Information Flow.

Allen emphasized that “then standards were following the industry, now they’re leading the industry”. Boundaryless Information Flow does not mean no boundaries exist. Rather, boundaries are permeable: they enable business rather than prohibit it.

During the next session, Allen announced the launch of the IT4IT™ Reference Architecture v2.0 Standard. Chris Davis, University of South Florida and Chair of The Open Group IT4IT™ Forum, provided a brief overview of IT4IT and the standard. The Open Group IT4IT Reference Architecture is a standard reference architecture and value chain-based operating model for managing the business of IT.

After the announcement, Mary Jarrett, IT4IT Manager, Shell, presented “Rationale for Adopting an Open Standard for Managing IT”. In her opening, she stated her presentation was an accountant’s view of IT4IT and the Shell journey. Mary’s soundbites included: “IT adds value to businesses and increases revenue and profits; ideas of IT are changing and we need to adapt; protect cyber back door as well as physical front door.”

The afternoon tracks consisted of IT4IT™, EA Practice & Professional Development, Open Platform 3.0™, and Architecture Methods and Techniques.

The evening concluded with a fantastic private function at the historic Edinburgh Castle. Bagpipes, local culinary offerings including haggis, and dancing were enjoyed by all!


Edinburgh Castle

On Wednesday and Thursday, work sessions and member meetings were held.

A special ‘thank you’ goes to our sponsors and exhibitors: BiZZdesign, Good e-Learning, HP, Scape, Van Haren Publishing and AEA.

Other content, photos and highlights can be found via #ogEDI on Twitter.  Select videos are on The Open Group YouTube channel. For full agenda and speakers, please visit The Open Group Edinburgh 2015.

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog, media relations and social media. Loren has over 20 years’ experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.

Comments Off on The Open Group Edinburgh 2015 Highlights

Filed under boundaryless information flow, Enterprise Architecture, IT, IT4IT, Open Platform 3.0, The Open Group, The Open Group Edinburgh 2015, TOGAF

The Open Group Edinburgh—The State of Boundaryless Information Flow™ Today

By The Open Group

This year marks the 20th anniversary of the first version of TOGAF®, an Open Group standard, and the publication of “The Boundaryless Organization,” a book that defined how companies should think about creating more open, flexible and engaging organizations. We recently sat down with The Open Group President and CEO Allen Brown and Ron Ashkenas, Senior Partner at Schaffer Consulting and one of the authors of “The Boundaryless Organization,” to get a perspective on Boundaryless Information Flow™ and where the concept stands today. Brown and Ashkenas presented their perspectives on this topic at The Open Group Edinburgh event on Oct. 20.

In the early 1990s, former GE CEO Jack Welch challenged his team to create what he called a “boundaryless organization”—an organization where the barriers between employees, departments, customers and suppliers had been broken down. He also suggested that in the 21st Century, all organizations would need to move in this direction.

Based on the early experience of GE, and a number of other companies, the first book on the subject, “The Boundaryless Organization,” was published in 1995. This was the same year that The Open Group released the first version of the TOGAF® standard, which provided an architectural framework to help companies achieve interoperability by providing a structure for interconnecting legacy IT systems. Seven years later, The Open Group adopted the concept of Boundaryless Information Flow™—achieved through global interoperability in a secure, reliable and timely manner—as the ongoing vision and mission for the organization. According to Allen Brown, President and CEO of The Open Group, that vision has sustained The Open Group over the years and continues to do so as the technology industry faces unprecedented and rapid change.

Brown’s definition of Boundaryless Information Flow™ is rooted in the notion of permeability. Ron Ashkenas, a co-author of “The Boundaryless Organization,” emphasizes that organizations still need boundaries—without some boundaries they would become “dis-organized.” But like the cell walls in the human body, those boundaries need to be sufficiently permeable so that information can easily flow back and forth in real time, without being distorted, fragmented or blocked.

In that context, Brown believes that learning to be boundaryless today is more important than ever for organizations, despite the fact that many of the boundaries that existed in 1995 no longer exist, and the technical ability to share information around the world will continue to evolve. What often holds organizations back however, says Ashkenas, are the social and cultural patterns within organizations, not the technology.

“We have a tendency to protect information in different parts of the organization,” says Ashkenas. “Different functions, locations, and business units want their own version of ‘the truth’ rather than being held accountable to a common standard. This problem becomes even more acute across companies in a globally connected ecosystem and world. So despite our technological capabilities, we still end up with different systems, different information available at different times. We don’t have the common true north. The need to be more boundaryless is still there.  In fact it’s greater now, even though we have the capability to do it.”

Although the technical capabilities for Boundaryless Information Flow are largely here, the larger issue is getting people to agree and collaborate on how to do things. As Ashkenas explains, “It’s not just the technical challenges, it’s also cultural challenges, and the two have to go hand-in-hand.”

What’s more, collaboration is not just an issue of getting individuals to change, but of making changes at much larger levels on a much larger scale. Not only are boundaries now blurring within organizations, they’re blurring between institutions and across global ecosystems, which may include anything from apps, devices and technologies to companies, countries and cultures.

Ashkenas says that’s where standards, such as those being created by The Open Group, can help make a difference.  He says, “Standards used to come after technology. Now they need to come before the changes and weave together some of the ecosystem partners. I think that’s one of the exciting new horizons for The Open Group and its members—they can make a fundamental difference in the next few years.”

Brown agrees. He says that there are two major forces currently shaping how The Open Group will continue to pursue the Boundaryless Information Flow vision. One is the need for standards to address the shift in perspective required of the IT function, from an “inside-out” to an “outside-in” model, fueled by a combination of boundaryless thinking and the convergence of social, mobile, Cloud, the Internet of Things and Big Data. The other is the need to move away from IT strategies that are derived from business strategies and merely react to the business agenda, an approach that leads to redundancy and latency in delivering new solutions. Instead, IT must recognize that technology increasingly drives business opportunity and that IT must be managed as a business in its own right.

For example, twenty years ago a standard might lag behind a technology. Once companies no longer needed to compete on the technology, Brown says, they would standardize. With things like Open Platform 3.0™ and the need to manage the business of IT (IT4IT™) quickly changing the business landscape, now standards need to be at the forefront, along with technology development, so that companies have guidance on how to navigate a more boundaryless world while maintaining security and reliability.

“This is only going to get more and more exciting and more and more interesting,” Brown says.

How boundaryless are we?

Just how boundaryless are we today? Ashkenas says a lot has been accomplished in the past 20 years. Years ago, he says, people in most organizations would have thought that Boundaryless Information Flow was either not achievable or they would have shrugged their shoulders and ignored the concept. Today there's a strong acceptance of the need for it. In fact, a recent survey of The Open Group members found that 65 percent of those surveyed view boundarylessness as a positive thing within their organization. And while most organizations are making strides toward boundarylessness, only a minority—15 percent—of those surveyed felt that Boundaryless Information Flow would be impossible to achieve in a large international organization such as theirs.

According to Brown and Ashkenas, the next horizon for many organizations will be to truly make information flow more accessible in real-time for all stakeholders. Ashkenas says in most organizations the information people need is not always available when people need it, whether this is due to different systems, cultural constraints or even time zones. The next step will be to provide managers real-time, anywhere access to all the information they need. IT can help play a bigger part in providing people a “one truth” view of information, he says.

Another critical—but potentially difficult—next step is to try to get people to agree on how to make boundarylessness work across ecosystems. Achieving this will be a challenge because ways of doing things—even standards development—will need to adapt to different cultures in order for them to ultimately work. What makes sense in the U.S. or Europe from a business standpoint may not make sense in China or India or Brazil, for example.

“What are the fundamentals that have to be accepted by everyone and where is there room for customization to local cultures?” asks Ashkenas. “Figuring out the difference between the two will be critical in the coming years.”

Brown and Ashkenas say that we can expect technical innovations to evolve at greater speed and with greater effectiveness in the coming years. This is another reason why Enterprise Architecture and standards development will be critical for helping organizations transform themselves and adapt as boundaries blur even more.

As Brown notes, the reason that the architecture discipline and standards such as TOGAF arose 20 years ago was exactly because organizations were beginning to move toward boundarylessness and they needed a way to figure out how to frame those environments and make them work together. Before then, when IT departments were implementing different siloed systems for different functions across organizations, they had no inkling that someday people might need to share that information across systems or departments, let alone organizations.

“It never crossed our minds that we’d need to add people or get information from disparate systems and put them together to make some sense. It wasn’t information, it was just data. The only way in any complex organization that you can start weaving this together and see how they join together is to have some sort of architecture, an overarching view of the enterprise. Complex organizations have that view and say ‘this is how that information comes together and this is how it’s going to be designed.’ We couldn’t have gotten there without Enterprise Architecture in large organizations,” he says.

In the end, the limitations to Boundaryless Information Flow will largely be organizational and cultural, not a question of technology. Technologically boundarylessness is largely achievable. The question for organizations, Brown says, is whether they’ll be able to adjust to the changes that technology brings.

“The limitations are in the organization and the culture. Can they make the change? Can they absorb it all? Can they adapt?” he says.

1 Comment

Filed under Uncategorized

The Open Group Baltimore 2015 Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

The Open Group Baltimore 2015, Enabling Boundaryless Information Flow™, July 20-23, was held at the beautiful Hyatt Regency Inner Harbor. Over 300 attendees from 16 countries, including China, Japan, Netherlands and Brazil, attended this agenda-packed event.

The event kicked off on July 20th with a warm Open Group welcome by Allen Brown, President and CEO of The Open Group. The first plenary speaker was Bruce McConnell, Senior VP, East West Institute, whose presentation “Global Cooperation in Cyberspace”, gave a behind-the-scenes look at global cybersecurity issues. Bruce focused on US – China cyber cooperation, major threats and what the US is doing about them.

Allen then welcomed Christopher Davis, Professor of Information Systems, University of South Florida, to The Open Group Governing Board as an Elected Customer Member Representative. Chris also serves as Chair of The Open Group IT4IT™ Forum.

The plenary continued with a joint presentation, "Can Cyber Insurance Be Linked to Assurance," by Larry Clinton, President & CEO, Internet Security Alliance and Dan Reddy, Adjunct Faculty, Quinsigamond Community College MA. The speakers emphasized that cybersecurity is not simply an IT issue. They stated there are currently 15 billion mobile devices and there will be 50 billion within 5 years. Organizations and governments need to prepare for new vulnerabilities and the explosion of the Internet of Things (IoT).

The plenary culminated with a panel “US Government Initiatives for Securing the Global Supply Chain”. Panelists were Donald Davidson, Chief, Lifecycle Risk Management, DoD CIO for Cybersecurity, Angela Smith, Senior Technical Advisor, General Services Administration (GSA) and Matthew Scholl, Deputy Division Chief, NIST. The panel was moderated by Dave Lounsbury, CTO and VP, Services, The Open Group. They discussed the importance and benefits of ensuring product integrity of hardware, software and services being incorporated into government enterprise capabilities and critical infrastructure. Government and industry must look at supply chain, processes, best practices, standards and people.

All sessions concluded with Q&A moderated by Allen Brown and Jim Hietala, VP, Business Development and Security, The Open Group.

Afternoon tracks (11 presentations) consisted of various topics including Information & Data Architecture and EA & Business Transformation. The Risk, Dependability and Trusted Technology theme also continued. Jack Daniel, Strategist, Tenable Network Security shared “The Evolution of Vulnerability Management”. Michele Goetz, Principal Analyst at Forrester Research, presented “Harness the Composable Data Layer to Survive the Digital Tsunami”. This session was aimed at helping data professionals understand how Composable Data Layers set digital and the Internet of Things up for success.

The evening featured a Partner Pavilion and Networking Reception. The Open Group Forums and Partners hosted short presentations and demonstrations while guests also enjoyed the reception. Areas of focus were Enterprise Architecture, Healthcare, Security, Future Airborne Capability Environment (FACE™), IT4IT™ and Open Platform 3.0™.

Exhibitors in attendance were Esterel Technologies, Wind River, RTI and SimVentions.

Partner Pavilion – The Open Group Open Platform 3.0™

On July 21, Allen Brown began the plenary with the great news that Huawei has become a Platinum Member of The Open Group. Huawei joins our other Platinum Members Capgemini, HP, IBM, Philips and Oracle.

Allen Brown, Trevor Cheung, Chris Forde

Trevor Cheung, VP Strategy & Architecture Practice, Huawei Global Services, will be joining The Open Group Governing Board. Trevor posed the question, “what can we do to combine The Open Group and IT aspects to make a customer experience transformation?” His presentation entitled “The Value of Industry Standardization in Promoting ICT Innovation”, addressed the “ROADS Experience”. ROADS is an acronym for Real Time, On-Demand, All Online, DIY, Social, which need to be defined across all industries. Trevor also discussed bridging the gap; the importance of combining Customer Experience (customer needs, strategy, business needs) and Enterprise Architecture (business outcome, strategies, systems, processes innovation). EA plays a key role in the digital transformation.

Allen then presented The Open Group Forum updates. He shared roadmaps which include schedules of snapshots, reviews, standards, and publications/white papers.

Allen also provided a sneak peek of results from our recent survey on TOGAF®, an Open Group standard. TOGAF® 9 is currently available in 15 different languages.

Next speaker was Jason Uppal, Chief Architect and CEO, iCareQuality, on "Enterprise Architecture Practice Beyond Models". Jason emphasized that the goal is "Zero Patient Harm" and stressed the importance of Open CA Certification. He also stated that Enterprise Architects play many roles, and those roles are always changing.

Joanne MacGregor, IT Trainer and Psychologist, Real IRM Solutions, gave a very interesting presentation entitled “You can Lead a Horse to Water… Managing the Human Aspects of Change in EA Implementations”. Joanne discussed managing, implementing, maintaining change and shared an in-depth analysis of the psychology of change.

“Outcome Driven Government and the Movement Towards Agility in Architecture” was presented by David Chesebrough, President, Association for Enterprise Information (AFEI). “IT Transformation reshapes business models, lean startups, web business challenges and even traditional organizations”, stated David.

Questions from attendees were addressed after each session.

In parallel with the plenary was the Healthcare Interoperability Day. Speakers from a wide range of Healthcare industry organizations, such as ONC, AMIA and Healthway shared their views and vision on how IT can improve the quality and efficiency of the Healthcare enterprise.

Before the plenary ended, Allen made another announcement. Allen is stepping down in April 2016 as President and CEO after more than 20 years with The Open Group, including the last 17 as CEO. After conducting a process to choose his successor, The Open Group Governing Board has selected Steve Nunn as his replacement, who will assume the role in November of this year. Steve is the current COO of The Open Group and CEO of the Association of Enterprise Architects. Please see the press release here.

Steve Nunn, Allen Brown

Afternoon track topics were comprised of EA Practice & Professional Development and Open Platform 3.0™.

After a very informative and productive day of sessions, workshops and presentations, event guests were treated to a dinner aboard the USS Constellation, just a few minutes' walk from the hotel. The USS Constellation, constructed in 1854, is a sloop-of-war, the second US Navy ship to carry the name, and is designated a National Historic Landmark.

USS Constellation

On Wednesday, July 22, tracks continued: TOGAF® 9 Case Studies and Standard, EA & Capability Training, Knowledge Architecture and IT4IT™ – Managing the Business of IT.

Thursday consisted of members-only meetings which are closed sessions.

A special “thank you” goes to our sponsors and exhibitors: Avolution, SNA Technologies, BiZZdesign, Van Haren Publishing, AFEI and AEA.

Check out all the Twitter conversation about the event – @theopengroup #ogBWI

Event proceedings for all members and event attendees can be found here.

Hope to see you at The Open Group Edinburgh 2015 October 19-22! Please register here.

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog, media relations and social media. Loren has over 20 years' experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.

Comments Off on The Open Group Baltimore 2015 Highlights

Filed under Accreditations, Boundaryless Information Flow™, Cybersecurity, Enterprise Architecture, Enterprise Transformation, Healthcare, Internet of Things, Interoperability, Open CA, Open Platform 3.0, Security, Security Architecture, The Open Group Baltimore 2015, TOGAF®

Managing Your Vulnerabilities: A Q&A with Jack Daniel

By The Open Group

With hacks and security breaches becoming more prevalent every day, it's incumbent on organizations to determine the areas where their systems may be vulnerable and take actions to better handle those vulnerabilities. Jack Daniel, a strategist with Tenable Network Security who has been active in securing networks and systems for more than 20 years, says that if companies start implementing vulnerability management on an incremental basis and use automation to help them, they can hopefully reach a point where they're not constantly handling vulnerability crises.

Daniel will be speaking at The Open Group Baltimore event on July 20, presenting on “The Evolution of Vulnerability Management.” In advance of that event, we recently spoke to Daniel to get his perspective on hacker motivations, the state of vulnerability management in organizations today, the human problems that underlie security issues and why automation is key to better handling vulnerabilities.

How do you define vulnerability management?

Vulnerability detection is where this started. News would break years ago of some vulnerability, some weakness in a system—a fault in the configuration or software bug that allows bad things to happen. We used to do a really hit-or-miss job of it; it didn't have to be rushed at all. Depending on where you were or what you were doing, you might not be targeted—it would take months after something was released before bad people would start doing things with it. As criminals discovered there was money to be made in exploiting vulnerabilities, attackers became motivated by more than just notoriety. The early hacker scene that was disruptive or did criminal things was largely motivated by notoriety. As people realized they could make money, it became a problem, and that's when we turned to management.

You have to manage finding vulnerabilities, detecting vulnerabilities and resolving them, which usually means patching but not always. There are a lot of ways to resolve or mitigate without actually patching, but the management aspect is discovering all the weaknesses in your environment—and that’s a really broad brush, depending on what you’re worried about. That could be you’re not compliant with PCI if you’re taking credit cards or it could be that bad guys can steal your database full of credit card numbers or intellectual property.

It’s finding all the weaknesses in your environment, the vulnerabilities, tracking them, resolving them and then continuing to track as new ones appear to make sure old ones don’t reappear. Or if they do reappear, what in your corporate process is allowing bad things to happen over and over again? It’s continuously doing this.

The pace of bad things has accelerated, the motivations of the actors have forked in a couple of directions, and to do a good job of vulnerability management really requires gathering data of different qualities and being able to make assessments about it and then applying what you know to what’s the most effective use of your resources—whether it’s time or money or employees to fix what you can.
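The track-resolve-recheck loop Daniel describes—finding vulnerabilities, resolving them, and catching old ones that reappear—can be sketched in a few lines. This is an illustrative sketch only, not any vendor's API; all names and CVE identifiers are invented for the example.

```python
# Minimal sketch of the tracking loop described above: reconcile each
# new scan against known findings so fixed issues are closed and
# previously fixed ones that reappear are flagged as process failures.

def reconcile(known, scan_results):
    """known: dict mapping (host, vuln_id) -> 'open' or 'resolved'.
    scan_results: set of (host, vuln_id) pairs seen in the latest scan.
    Returns the updated dict plus a list of findings that came back."""
    reappeared = []
    for finding in scan_results:
        if known.get(finding) == "resolved":
            reappeared.append(finding)   # it came back: why?
        known[finding] = "open"
    for finding, status in known.items():
        if status == "open" and finding not in scan_results:
            known[finding] = "resolved"  # no longer detected
    return known, reappeared

known = {("web01", "CVE-2015-0001"): "resolved",
         ("db01", "CVE-2015-0002"): "open"}
known, back = reconcile(known, {("web01", "CVE-2015-0001")})
# web01's old vulnerability reappeared; db01's is now resolved
```

Running this reconciliation after every scan, rather than quarterly, is what turns one-off detection into the continuous management the interview describes.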

What are the primary motivations you’re seeing with hacks today?

They fall into a couple big buckets, and there are a whole bunch of them. One common one is financial—these are the people that are stealing credit cards, stealing credentials so they can do bank wire fraud, or some other way to get at money. There are a variety of financial motivators.

There are also some others, depending on who you are. There's the so-called 'Hacktivist,' which used to be a thing in the early days of hacking but has now become more widespread. These are folks like the Syrian Electronic Army, or the various Turkish groups that through the years have done website defacements. These people are not trying to steal money; they're trying to embarrass you, they're trying to promote a message. It may be, as with the Syrian Electronic Army, that they're trying to support the ruler of whatever's left of Syria. So there are political motivations. Anonymous did a lot of destructive things—or people calling themselves 'Anonymous,' which is a whole other conversation—but people acting under the banner of Anonymous as hacktivism have struck out at corporations they thought were unjust or unfair, or done political things.

Intellectual property theft would be the third big one, I think. Generally the finger is pointed at China, but it’s unfair to say they’re the only ones stealing trade secrets. People within your own country or your own market or region are stealing trade secrets continuously, too.

Those are the three big ones—money, hacktivism and intellectual property theft. It trickles down. One of the things that has come up more often over the past few years is people get attacked because of who they’re connected to. It’s a smaller portion of it and one that’s overlooked but is a message that people need to hear. For example, in the Target breach, it is claimed that the initial entry point was through the heating and air conditioning vendors’ computer systems and their access to the HVAC systems inside a Target facility, and, from there, they were able to get through. There are other stories about the companies where organizations have been targeted because of who they do business with. That’s usually a case of trying to attack somebody that’s well-secured and there’s not an easy way in, so you find out who does their heating and air-conditioning or who manages their remote data centers or something and you attack those people and then come in.

How is vulnerability management different from risk management?

It's a subset of risk management. Risk management, when done well, gives you the very large picture and helps you drill down into the details, but it has to factor in things above and beyond the more technical details of what we typically think of as vulnerability management. Certainly they work together—you have to find what's vulnerable and then make assessments as to how you're going to address your vulnerabilities, and that ideally should be done in a risk-based manner. Because as much as reports like the Verizon Data Breach Report say you have to fix everything, the reality is that not only can we not fix everything, we can't fix a lot of it immediately, so you really have to prioritize. You have to have information to prioritize, and that's a challenge for many organizations.
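The risk-based prioritization Daniel mentions is often implemented by weighting a technical severity score by how critical the affected asset is. A toy sketch, with invented asset names, weights and scores:

```python
# Illustrative sketch of risk-based prioritization: since not
# everything can be fixed at once, rank findings by severity
# weighted by asset criticality. All values here are invented.

CRITICALITY = {"payment-gateway": 3.0, "intranet-wiki": 1.0}

def prioritize(findings):
    """findings: list of (asset, vuln_id, cvss_score) tuples.
    Returns them sorted so the highest-risk items come first."""
    def risk(finding):
        asset, _, cvss = finding
        return cvss * CRITICALITY.get(asset, 1.5)  # unknown asset: middle weight
    return sorted(findings, key=risk, reverse=True)

queue = prioritize([
    ("intranet-wiki", "CVE-2015-1111", 9.8),
    ("payment-gateway", "CVE-2015-2222", 6.5),
])
# The medium-severity flaw on the critical system (6.5 * 3.0 = 19.5)
# outranks the critical-severity flaw on the low-value one (9.8).
```

The point is not the particular formula—real programs use richer inputs such as exploit availability—but that severity alone is not a work queue; the business context has to enter the calculation.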

Your session at The Open Group Baltimore event is on the evolution of vulnerability management—where does vulnerability management stand today and where does it need to go?

One of my opening slides sums it up—it used to be easy, and it’s not anymore. It’s like a lot of other things in security, it’s sort of a buzz phrase that’s never really taken off like it needs to at the enterprise level, which is as part of the operationalization of security. Security needs to be a component of running your organization and needs to be factored into a number of things.

The information security industry has a challenge and a history of being a department in the middle and being obstructionist, which I think is well deserved. But the real challenge is to cooperate more. We have to get a lot more information, which means working well with the rest of the organization, particularly networking and systems administrators, and having conversations with them about the data and the environment and sharing what we discover as problems without being the judgmental know-it-all security people. That is our stereotype. The adversaries are often far more cooperative than we are. In a lot of criminal forums, people will be fairly supportive of other people in their community—they'll go up to the trade-secret level and stop—but if somebody's not cutting into their profits, rumor is these people are cooperating and collaborating.

Within an organization, you need to work cross-organizationally. Information sharing is a very real piece of it. That's not necessarily vulnerability management, but when you step into risk analysis and how you manage your environment, knowing what vulnerabilities you have is one thing; knowing what vulnerabilities people are actually going to do bad things to requires information sharing, and that's an industry-wide challenge. It's a challenge within our organizations, and outside it's a real challenge across the enterprise, across industry, across government.

Why has that happened in the Security industry?

One is the stereotype—a lot of teams are very siloed, a lot of teams have their fiefdoms—that’s just human nature.

Another problem that everyone in security and technology faces is that we talk to all sorts of people and have all sorts of great conversations, learn amazing things, see amazing things and a lot of it is under NDA, formal or informal NDAs. And if it weren’t for friend-of-a-friend contacts a lot of information sharing would be dramatically less. A lot of the sanitized information that comes out is too sanitized to be useful. The Verizon Data Breach Report pointed out that there are similarities in attacks but they don’t line up with industry verticals as you might expect them to, so we have that challenge.

Another serious challenge we have in security, especially in the research community, is that there's total distrust of the government. The Snowden revelations have severely damaged the technology and security community's faith in the government and willingness to cooperate with it. Further damaging that are the discussions about criminalizing many security tools—because the people in Congress don't understand these things. We have a president who claims to be technologically savvy, and he is more than any before him, but he still doesn't get it and he's got advisors who don't get it. So we have a great distrust of the government, which has been earned, despite the fact that any one of us in the industry knows folks at various agencies—whether the FBI or intelligence agencies or military—who are fantastic people—brilliant, hardworking, patriotic—but the entities themselves are political entities, and that causes a lot of distrust in information sharing.

And there are just a lot of people who want to keep information proprietary. This is not unique to security. There are a couple of different types of managers. There are people in organizations who strive to make themselves irreplaceable; as a manager, you've got to get those people out of your environment because they're poisonous. There are other people who strive to make it so that they could walk away at any time and it would be a minor inconvenience for someone to pick up the notes and run. Those are the people you should hang onto for dear life, because they share information, they build knowledge, they build relationships. That's just human nature. In security I don't think there are enough people who are about building those bridges, building those communication paths, sharing what they've learned and trying to advance the cause. There are still too many who hoard information as a tool or a weapon.

Security is fundamentally a human problem amplified by technology. If you don’t address the human factors in it, you can have technological controls, but it still has to be managed by people. Human nature is a big part of what we do.

You advocate for automation to help with vulnerability management. Can automation catch the threats when hackers are becoming increasingly sophisticated and use bots themselves? Will this become a war of bot vs. bot?

A couple of points about automation. Our adversaries are using automation against us. We need to use automation to fight them, and we need to use as much automation as we can rely on to improve our situation. But at some point, we need smart people working on hard problems, and that’s not unique to security at all. The more you automate, at some point in time you have to look at whether your automation processes are improving things or not. If you’ve ever seen a big retailer or grocery store that has a person working full-time to manage the self-checkout line, that’s failed automation. That’s just one example of failed automation. Or if there’s a power or network outage at a hospital where everything is regulated and medications are regulated and then nobody can get their medications because the network’s down. Then you have patients suffering until somebody does something. They have manual systems that they have to fall back on and eventually some poor nurse has to spend an entire shift doing data entry because the systems failed so badly.

Automation doesn't solve the problems—you have to automate the right things in the right ways, and the goal is to do the menial tasks in an automated fashion so you spend fewer human cycles. As a system or network administrator, you run into the same repetitive tasks over and over, and you write scripts to do them or buy a tool to automate them. The same applies here: you want to filter through as much of the data as you can, because one of the things that modern vulnerability management requires is a lot of data. It requires a ton of data, and it's very easy to fall into an information-overload situation. Where the tools can help is by filtering it down and reducing the amount of stuff that gets put in front of people to make decisions about, and that's challenging. It's a balance that requires continuous tuning—you don't want it to miss anything, so you want it to tell you everything that's questionable, but it can't throw too many things at you that aren't actually problems, or people give up and ignore the problems. That was allegedly part of a couple of the major breaches last year. Alerts were triggered, but nobody paid attention because they get tens of thousands of alerts a day as opposed to one big alert. One alert is hard to ignore—40,000 alerts and you just turn it off.
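The filtering Daniel describes—turning tens of thousands of raw alerts into something a person will actually read—often starts with deduplication and a tunable severity floor. A toy sketch, with invented sources, rule names and severity values:

```python
# A toy filter along the lines described above: collapse a flood of
# raw alerts into a short digest by deduplicating on (source, rule)
# and dropping events below a tunable severity floor.

from collections import Counter

def digest(alerts, min_severity=5):
    """alerts: iterable of (source, rule, severity) tuples.
    Returns (source, rule, count) for distinct alerts at or above
    the floor, most frequent first."""
    counts = Counter((src, rule) for src, rule, sev in alerts
                     if sev >= min_severity)
    return [(src, rule, n) for (src, rule), n in counts.most_common()]

raw = [("10.0.0.5", "port-scan", 3)] * 40000 + \
      [("10.0.0.9", "malware-beacon", 9)] * 3
summary = digest(raw)
# 40,003 raw events collapse to one digest line for the beacon;
# the low-severity scan noise is suppressed entirely.
```

The `min_severity` knob is exactly the continuous-tuning trade-off from the interview: set it too low and the 40,000-alert wall returns; too high and real problems get suppressed along with the noise.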

What’s the state of automated solutions today?

It's pretty good if you tune it, but it takes maintenance. There isn't an Easy Button, to use the Staples tagline, and anyone promising an Easy Button is probably not being honest with you. But if you understand your environment and tune the vulnerability management and patch management tools (and a lot of them are administrative tools), you can automate a lot of it and reduce the pain dramatically. It does require a couple of very hard first steps. The first step in all of it is knowing what's in your environment and knowing what's crucial in your environment, because if you don't know what you've got, you won't be able to defend it well. It is pretty good, but it does take a fair amount of effort to get to where you can make the best of it. Some organizations are certainly there, and some are not.

What do organizations need to consider when putting together a vulnerability management system?

One word: visibility. They need to understand that they need to be able to see and know what’s in the environment—everything that’s in their environment—and get good information on those systems. There needs to be visibility into a lot of systems that you don’t always have good visibility into. That means your mobile workforce with their laptops, that means mobile devices that are on the network, which are probably somewhere whether they belong there or not, that means understanding what’s on your network that’s not being managed actively, like Windows systems that might not be in active directory or RedHat systems that aren’t being managed by satellite or whatever systems you use to manage it.

Knowing everything that's in the environment and its roles in the system—that's a starting point. Then understanding what's critical in the environment and how to prioritize it. The first step is really understanding your own environment and having visibility into the entire network—and that can extend to Cloud services if you're using a lot of Cloud services. One of the conversations I've been having lately, since the latest Akamai report, was about IPv6. Most Americans are ignoring it, even at the corporate level, and a lot of folks think you can still ignore it because we're still routing most of our traffic over the IPv4 protocol. But IPv6 is active on just about every network out there; it's just a question of whether or not we actively measure and monitor it. The Akamai report said something a lot of folks have been saying for years: this is really a problem. Even though adoption is pretty low, what you see if you start monitoring for it is people communicating over IPv6, whether intentionally or unintentionally—often unintentionally, because everything's enabled—so there's often a whole swath of your network that people are ignoring. And you can't have those huge blind spots in the environment, you just can't. The vulnerability management program has to take into account that sort of overall view of the environment. Then once you're there, you need a lot of help to solve the vulnerabilities, and that's back to the human problem.
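The "visibility" step boils down to a diff between two lists: what discovery actually sees on the wire versus what is actively managed. A minimal sketch, with invented hostnames, of how the blind spots Daniel warns about can be surfaced explicitly:

```python
# Small sketch of the "know what's on your network" step: compare
# discovered hosts against the actively managed inventory, so
# unmanaged machines (the blind spots) surface explicitly.

managed = {"web01", "db01", "hvac-ctrl"}          # e.g. in AD / Satellite
discovered = {"web01", "db01", "laptop-guest42",  # seen on the network,
              "printer-3f", "hvac-ctrl"}          # e.g. via passive scanning

unmanaged = discovered - managed   # on the network, but nobody owns them
missing = managed - discovered     # supposedly managed, but never seen

print(sorted(unmanaged))  # the laptop and printer need an owner
print(sorted(missing))    # empty here; non-empty would also be a finding
```

Both differences matter: `unmanaged` is the classic blind spot, while a non-empty `missing` set suggests the inventory itself is stale—either way, the discrepancy is the finding.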

What should Enterprise Architects look for in an automated solution?

It really depends on the corporate need. They need to figure out whether or not the systems they're looking at are going to find most or all of their network, discover all of the weaknesses, and then help them prioritize those. For example, can your systems do vulnerability analysis on newly discovered systems with little or no input? Can you automate detection? Can you automate confirmation of findings somehow? Can you interact with other systems? There's a piece, too—what does the rest of your environment look like? Are there ways into it? Does your vulnerability management system work with or understand all the things you've got? What if you have some unique network gear that your vulnerability management system isn't going to find vulnerabilities in? There are German companies that like to use operating systems other than Windows and garden-variety Linux distributions. Does it work in your environment, will it give you good coverage in your environment, and can it take a lot of the mundane out of it?

How can companies maintain Boundaryless Information Flow™–particularly in an era of the Internet of Things–but still manage their vulnerabilities?

The challenge is that a lot of people push back against high information flow because they can't make sense of it; they can't ingest the data, they can't do anything with it. It's the challenge of accepting and sharing a lot of information. It doesn't matter whether it's vulnerability management or log analysis or patch management or systems administration or backup or anything—the challenge is that networks have systems that share a lot of data, but until you add context, it's not really information. What we're interested in in vulnerability management is different from what your automated backup is interested in. The challenge is having systems that can share information outbound, share information inbound, and then act rationally on only that which is relevant to them. That's a real challenge, because information overload is a problem that people have been complaining about for years, and it's accelerating at a stunning rate.

You say Internet of Things, and I get a little frustrated when people treat that as a monolith, because at one end an Internet-enabled microwave or stove has one set of challenges: it's built on garbage commodity hardware with no maintainability at all. Other things people consider Internet of Things because they're Internet-enabled are running Windows or a more mature Linux stack with full management, and somebody's managing them. So there's a huge gap between the managed IoT and the unmanaged, and the unmanaged is just adding low-power machines in environments that will amplify things like distributed denial of service (DDoS). As it is, a lot of consumers have home routers that are being used to attack other people and carry out DDoS attacks. A lot of the commercial stuff is being cleaned up, but many of the inexpensive home routers people have are being used, and if those are misused, misconfigured, or attacked by worms that can change their settings, everything on the network can be made to participate in an attack.

The thing with the evolution of vulnerability management is that we're trying to drive people to a continuous monitoring situation. That's where the federal government has gone, that's where a lot of industries are, and it's a challenge to go from infrequent or even frequent big scans to watching things continuously. The key is to take incremental steps. Instead of having a big, massive vulnerability project every quarter or every month, the goal is to get to where it's part of the routine and you're taking small remediation measures on a daily or regular basis. There are still going to be times when Microsoft or Oracle comes out with a big patch that requires a bigger tool-up, but you need to reach the point where you do small pieces of the task continuously rather than one big task. The goal is to get to where you're blowing out birthday candles rather than putting out forest fires.
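One common way to operationalize "small pieces of the task continuously" is to scan a small rotating slice of the estate each day so the whole network is covered on a short cycle instead of in one quarterly event. This is a minimal sketch of that scheduling idea under assumed host names, not a description of any particular scanner's scheduler.

```python
# A minimal sketch of continuous, incremental coverage: each day a small
# rotating batch of hosts is scanned, so the full estate is covered every
# len(hosts) / per_day days. Host names are placeholders.
from itertools import cycle, islice

def daily_slices(hosts, per_day):
    """Yield a small batch of hosts to scan each day, cycling through all of them."""
    pool = cycle(hosts)
    while True:
        yield list(islice(pool, per_day))

hosts = [f"host{i}" for i in range(10)]
batches = daily_slices(hosts, per_day=3)
day1 = next(batches)  # ['host0', 'host1', 'host2']
day2 = next(batches)  # ['host3', 'host4', 'host5']
```

With 10 hosts at 3 per day, everything is touched roughly every 3-4 days: routine birthday candles rather than a quarterly forest fire.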

Jack Daniel, a strategist at Tenable Network Security, has over 20 years' experience in network and system administration and security, and has worked in a variety of practitioner and management positions. A technology community activist, he supports several information security and technology organizations. Jack is a co-founder of Security BSides, serves on the boards of three Security BSides non-profit corporations, and helps organize Security BSides events. Jack is a regular, featured speaker at ShmooCon, SOURCE Boston, DEF CON, RSA and other marquee conferences. Jack is a CISSP, holds the CCSK, and is a Microsoft MVP for Enterprise Security.

Join the conversation – @theopengroup #ogchat #ogBWI


Filed under Boundaryless Information Flow™, Internet of Things, RISK Management, Security, The Open Group, The Open Group Baltimore 2015