Category Archives: Internet of Things

The Open Group San Francisco Day Two Highlights

By The Open Group

Day two of The Open Group San Francisco event was held Tuesday, January 31 on another sunny winter day in San Francisco. Tuesday’s welcome address featured Steve Nunn, President & CEO, and Jim Hietala, VP Business Development and Security, both of The Open Group, greeting attendees for a morning of sessions centered around the theme of Making Standards Work®. Nunn kicked off the morning by reporting that the first day of the conference had been very well received, with copious positive feedback on Monday’s speakers.

It was also announced that the first certification courses for ArchiMate® 3.0, an Open Group standard, kicked off at the conference. In addition, the San Francisco event marked the launch of The Open Group Open Process Automation™ Forum, a Forum of The Open Group, which will address standards development for open, secure, interoperable process control architectures. The Forum will include end users, suppliers, systems integrators, integrated DCS vendors, standards organizations and academics from a variety of industries, including food and beverage, oil and gas, pulp and paper, petrochemical, pharmaceuticals, metals and mining, and utilities.  Hietala joined Nunn on stage to discuss the launch of the Forum, which came out of a vision from ExxonMobil. The Forum has already grown rapidly, with almost 100 members. Forum Members are also attending and holding events at the annual ARC Advisory Group Industry Forum in Orlando.

The morning plenary began with Dennis Stevens from Lockheed Martin discussing “The Influence of Open Architecture Standards on the Emergence of Advance Process Control Systems.” Stevens, who is involved in The Open Group FACE™ Consortium, will also be leading the Open Process Automation Forum. Stevens opened by saying that this is a particularly exciting time in industrial automation due to the intersection of standards, technology and automation. According to Stevens, the work that has been done in the FACE Forum over the past few years has paved the way for what also needs to be done in process automation.

Stevens noted that many of the industrial systems in use today will be facing obsolescence in the next few years for a variety of reasons, including a proliferation of proprietary and closed systems, a lack of sophisticated development tools and the high cost of technology refreshes. Tech trends such as the Internet of Things, cybersecurity, open source and virtualization are also forcing industrial manufacturers to change. In addition, the growth of complexity in software systems and the changeover from hardware-dominant to software-dominant systems are also compelling factors for automation change. However, Stevens says, by reusing existing standards and creating new ones, there are many opportunities for cost savings and reducing complexity.

According to Stevens, the goal is to standardize the interfaces that companies can use so there is interoperability across systems built atop a common framework. By standardizing the interface only, organizations can still differentiate themselves by bringing their own business processes and designs to those systems via hardware or software components. In addition, by bringing elements from the FACE standardization model to Open Process Automation, the new forum can also take advantage of proven processes that already take into account regulations around co-opetition and anti-trust. Stevens believes that Open Process Automation will ultimately enable new markets and suppliers for process automation as well as lower the cost of doing business in industrial automation.
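
To make the interface-only idea concrete, here is a minimal, hypothetical sketch in Python (not drawn from the Forum's actual specifications): application code depends only on a standardized interface, while each supplier differentiates in its own implementation behind it.

```python
from abc import ABC, abstractmethod


class ProcessController(ABC):
    """A hypothetical standardized interface for a process control component."""

    @abstractmethod
    def read_sensor(self, tag: str) -> float:
        """Return the current value of a named process variable."""

    @abstractmethod
    def write_setpoint(self, tag: str, value: float) -> None:
        """Command a new setpoint for a named control loop."""


class VendorAController(ProcessController):
    """One supplier's implementation; the differentiating logic lives here."""

    def read_sensor(self, tag: str) -> float:
        # Proprietary driver code would go here; a constant stands in for it.
        return 42.0

    def write_setpoint(self, tag: str, value: float) -> None:
        print(f"VendorA: setpoint for {tag} set to {value}")


def run_control_cycle(controller: ProcessController) -> None:
    # Application code depends only on the standard interface, so any
    # conforming implementation can be swapped in without changes here.
    level = controller.read_sensor("TANK_101.LEVEL")
    if level > 40.0:
        controller.write_setpoint("VALVE_7.POSITION", 25.0)


run_control_cycle(VendorAController())
```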

Following the morning break, Chair of the Department of Economics at San Jose State University Dr. Lydia Ortega took the stage for the second morning session, entitled “Innovative Communities.”  Ortega took a refreshing look at what The Open Group does and how it works by applying economic theory to illustrate how the organization is an “innovative community.” Ortega began by providing what she called an “economist’s definition” of open standards, which she defined as a collection of dispersed knowledge that is a building block for innovation and is continually evolving. She also described open standards as a “public good,” because they are knowledge-based, non-rivalrous, non-excludable and, once produced, available to others at marginal cost. Teamwork, consensus and community are also characteristic features of what makes the organization work. Ortega plans to continue her research into what makes The Open Group work by examining competing standards bodies and the organization’s origins, among other things.

Prior to introducing the next session, Steve Nunn presented an award to Steve Whitlock, a long-time Open Group member who recently retired from Boeing, for more than 20 years of leadership, contributions and service to The Open Group. Colleagues provided additional praise for Whitlock and his willingness to lead activities on behalf of The Open Group and its members, particularly in the area of security.

The morning’s third session featured Mike Jerbic, Principal Consultant for Trusted System Consulting Group, highlighting how the “Norwegian Regional Healthcare Project & Open FAIR” have been used to analyze the cost benefits of a home treatment program for dialysis patients in Norway. Currently, due to health and privacy regulations and security requirements, patients who receive home dialysis must physically transport data about their treatments to hospitals, which affects patients’ quality of life but protects the state from security issues related to transporting data online. Jerbic and a group of economics students at San Jose State University in California did an economic analysis to examine the costs vs. benefits of the program. Using The Open Group Open FAIR™ body of knowledge to analyze the potential threats to both patient privacy and information security, the group found it would make sense to pose the program risks as an engineering problem to be solved. However, they must do additional research to weigh the potential cost savings to the state against the quality-of-life benefits for patients.
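
As a rough illustration of the kind of Open FAIR-style comparison described here, risk can be expressed as the frequency of loss events times the magnitude of loss per event and weighed against the operating costs of each option. The sketch below uses entirely invented figures, not numbers from the Norwegian study.

```python
# A minimal sketch of an Open FAIR-style cost/risk comparison.
# All figures are hypothetical and chosen only to show the arithmetic.

def annualized_loss(loss_event_frequency: float, loss_magnitude: float) -> float:
    """Open FAIR expresses risk as the frequency of loss events times the
    magnitude of loss per event (simplified here to point estimates)."""
    return loss_event_frequency * loss_magnitude


# Option 1: patients physically transport treatment data (current practice).
transport_cost_per_patient = 1_200.0        # hypothetical yearly travel/handling cost
transport_breach_risk = annualized_loss(0.01, 50_000.0)

# Option 2: transmit treatment data online with added security controls.
online_cost_per_patient = 200.0             # hypothetical yearly infrastructure cost
online_breach_risk = annualized_loss(0.05, 50_000.0)

patients = 1_000
option1 = patients * transport_cost_per_patient + transport_breach_risk
option2 = patients * online_cost_per_patient + online_breach_risk

print(f"Physical transport: {option1:,.0f} per year")
print(f"Online transfer:    {option2:,.0f} per year")
```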

Concluding Tuesday’s plenary sessions was a panel entitled “Open FAIR in Practice,” which extended the conversation regarding the Norwegian healthcare project by taking questions from the audience about the program. Jerbic moderated the panel, which included Ortega; Eva Kuiper, ESS GRC Security Consultant, HPE; John Linford, Lecturer, Department of Economics, San Jose State University; and Sushmitha Kasturi, Undergraduate Researcher, San Jose State University.

Jerbic also announced that a number of students from San Jose State, many of whom were in attendance, have recently either completed or begun their certification in Open FAIR.  He also talked about an Academic Program within The Open Group that is working with students on projects that are mutually beneficial, allowing The Open Group to get help with the work needed to create standards, while providing important practical work experience for students.

San Jose State University Students

Following the plenary, Tuesday’s lunchtime partner presentation featured Sean Cleary, Senior Consultant, Orbus Software, presenting on “Architecture Roadmap Visualization with ArchiMate® 3.0.”

Afternoon sessions were split into two tracks, Cognitive Computing and EA in Practice.

  • EA in Practice – Hosted by Len Fehskens of the Association of Enterprise Architects, two sessions looked at maxims and folktales for architects, presented by Fehskens, and how to enable governance and management with continuous audits, with Robert Weisman, CEO/COO of Build the Vision.
  • Cognitive Computing – Chris Harding from The Open Group served as host for four sessions in the track:
    • Ali Arsanjani, CTO for Analytics and Emerging Technologies, IBM – Arsanjani provided an overview of different ways that data can be structured for cognitive computing applications. According to Arsanjani, cognitive systems are meant to augment, not replace, human systems and to be of service to us. By combining human interaction and curation with automated data analysis and machine learning, companies will be able to gain greater business advantages. However, we must also always be aware of the implications of using artificial systems and the potential consequences of doing so, he said.
    • Jitendra Maan, Enterprise Architect and Center of Excellence Lead, Tata Consultancy Services – Maan says cognitive computing signals a shift in how machines interact with humans, other machines and the environment, with potential for new categories of business outcomes and disruption. The design of automated systems is critical to how cognitive systems are expected to evolve, but unlike traditional computing, cognitive computing will rely on a combination of natural language processing, machine learning and data. Potential business applications already in progress include service support centers, contract management, risk assessment, intelligent chat bots and conversational workflows. Maan predicts bots will actually replace many service functions in the next few years.
    • Swaminathan Chandrasekaran, Industry Apps & Solutions, IBM Watson – Chandrasekaran’s talk took a deeper dive into cognitive computing and the make-up of cognitive systems. Understanding, reasoning, learning and interaction are key to teaching cognitive systems how to work, he said. Cognitive systems are also broadly categorized around language, speech, vision and data & insights, much like the human brain. Patterns can generally be created from cognitive conversations, discovery and application extensions. Chandrasekaran also shared how to model a reference architecture for a cognitive conversation pattern (a simplified sketch of such a pattern appears after this list).
    • The Cognitive Computing panel, moderated by Harding, included afternoon speakers Arsanjani, Maan and Chandrasekaran. The panel discussed how businesses can gain advantage from cognitive computing, learned personalization and contextualization via systems training, the time it takes to train a system (now days or weeks vs. months or years), making systems more intelligent over time, and the need to aggregate and curate domain-relevant data from the very beginning of a project.
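
Below is a generic sketch of a cognitive conversation pattern of the kind discussed in the track: classify the intent, extract entities, then route to a dialog handler. It is illustrative only; simple keyword matching stands in for the trained language models a real system such as IBM Watson would use, and all names are hypothetical.

```python
# A generic sketch of a cognitive conversation pattern:
# intent classification -> entity extraction -> dialog management.

def classify_intent(utterance: str) -> str:
    text = utterance.lower()
    if "reset" in text and "password" in text:
        return "reset_password"
    if "order" in text:
        return "order_status"
    return "fallback"


def extract_entities(utterance: str) -> dict:
    # A real system would use NER; here we only pick out digits as an order id.
    digits = "".join(ch for ch in utterance if ch.isdigit())
    return {"order_id": digits} if digits else {}


def dialog_manager(intent: str, entities: dict) -> str:
    if intent == "reset_password":
        return "I can help you reset your password. What is your username?"
    if intent == "order_status" and "order_id" in entities:
        return f"Let me look up order {entities['order_id']} for you."
    return "Sorry, could you rephrase that?"


def respond(utterance: str) -> str:
    intent = classify_intent(utterance)
    entities = extract_entities(utterance)
    return dialog_manager(intent, entities)


print(respond("Where is order 4521?"))
print(respond("I need to reset my password"))
```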

The day concluded with a social event and dinner for attendees held at the Autodesk Gallery, a San Francisco destination that marries creativity, design and engineering in more than 20 exhibits sponsored by companies such as Lego and Mercedes Benz.

Networking at the Autodesk Gallery

The following day, the event offered track sessions in areas including Internet of Things (IoT) and Architecture.  The Open Group San Francisco drew to a close with Members Only Meetings on February 2.

@theopengroup #ogSFO

We are looking forward to seeing you at The Open Group Berlin April 24-27, 2017! #ogBER

 

To Colonize Mars, Look to Standards Development

By The Open Group

In advance of The Open Group San Francisco 2017, we spoke with Keegan Kirkpatrick, one of the co-founders of RedWorks, a “NewSpace” start-up focused on building 3D printable habitats for use on earth and in space.  Kirkpatrick will be speaking during the Open Platform 3.0™/Internet of Things (IoT) session on February 1.

Keegan Kirkpatrick believes that if we are to someday realize the dream of colonizing Mars, Enterprise Architects will play a critical role in getting us there.

Kirkpatrick defines the contemporary NewSpace industry as a group of companies that are looking to create near-term solutions that can be used on Earth, derived from solutions created for long-term use in space. With more private companies getting into the space game than ever before, Kirkpatrick believes the means to create habitable environments on the moon or on other planets isn’t nearly as far away as we might think.

“The space economy has always been 20 years away from where you’re standing now,” he says.

But with new entrepreneurs and space ventures following the lead of Elon Musk’s SpaceX, the space industry is starting to heat up, branching out beyond traditional aerospace and defense players like NASA, Boeing or Lockheed Martin.

“Now it’s more like five to ten years away,” Kirkpatrick says.

Kirkpatrick, who has a background in aerospace engineering, says RedWorks was born out of NASA’s 3D Printed Habitat Challenge, a “Centennial Challenge” where people from all kinds of backgrounds competed to create 3D printing/construction solutions for building and surviving on Mars.

“I was looking to get involved in the challenge. The idea of 3D printing habitats for Mars was fascinating to me. How do we solve the mass problem? How do we allow people to be self-sufficient on Mars once they get there?” he says.

Kirkpatrick says the company came together when he found a small 3D printing company in Lancaster, Calif., close to where he lives, and went to visit them. “About 20 minutes later, RedWorks was born,” he says. The company currently consists of Kirkpatrick, a 3D printing expert, and a geologist, along with student volunteers and a small team of engineers and technicians.

Like other NewSpace companies, RedWorks is focusing on terrestrial solutions first, both to create immediate value for what they’re doing and to help raise capital. As such, the company is looking to design and build homes by 3D printing low-cost materials for places that need low-cost housing. The company is talking with real estate developers and urban planners and looking at areas where affordable housing could be built entirely on site using their Mars-derived solutions.

“Terrestrial first is where the industry is going,” Kirkpatrick says. “You’ll see more players showing up in the next few years trying to capitalize on Earth-based challenges with space-based solutions.”

RedWorks plans to use parametric architecture models and parametric planning (design processes based on algorithmic thinking in which the relationship between elements is used to inform the design of complex structures) to create software for planning the printable communities and buildings. In the short-term, Kirkpatrick believes 3D printing can be used to create smart-city living solutions. The goal is to be able to combine 3D printing and embedded software so that people can design solutions specific to the environments where they’ll be used. (Hence the need for a geologist on their team.) Then they can build everything they need on site.

“For Mars, to make it a place that you can colonize, not just explore, you need to create the tools that people with not much of an engineering or space architecture background can use to set up a colony wherever they happen to land,” Kirkpatrick says. “The idea is if you have X number of people and you need to make a colony Y big, then the habitat design will scale everything with necessary utilities and living spaces entirely on-site. Then you can make use of the tools that you bring with you to print out a complete structure.”
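
As a toy example of the parametric scaling Kirkpatrick describes, the sketch below derives habitat dimensions and print-material volume from crew size. All factors are invented placeholders, not RedWorks design numbers.

```python
# A toy parametric-planning sketch: given a crew size, derive the habitat
# geometry to print on site. Every constant here is a made-up placeholder.

import math

LIVING_AREA_PER_PERSON_M2 = 25.0   # hypothetical habitable floor area per person
UTILITY_OVERHEAD = 0.30            # hypothetical share added for life support, storage


def habitat_plan(crew_size: int) -> dict:
    floor_area = crew_size * LIVING_AREA_PER_PERSON_M2 * (1 + UTILITY_OVERHEAD)
    # Assume a simple cylindrical shell and solve for its radius.
    height_m = 3.0
    radius_m = math.sqrt(floor_area / math.pi)
    wall_volume_m3 = 2 * math.pi * radius_m * height_m * 0.5  # 0.5 m printed wall
    return {
        "crew_size": crew_size,
        "floor_area_m2": round(floor_area, 1),
        "radius_m": round(radius_m, 1),
        "print_material_m3": round(wall_volume_m3, 1),
    }


print(habitat_plan(crew_size=8))
```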

Kirkpatrick says the objective is to be able to use materials native to each environment to create and print the structures. Because dirt and sand on Earth are fundamentally similar to the silicate materials found on the Moon and Mars, RedWorks is looking to develop a general-purpose silica printer that can be used to build 3D structures. That’s why they’re looking first to develop structures in desert climates, such as southern California, North Africa and the Middle East.

A role for architecture and standards

As the private NewSpace industry begins to take off, Kirkpatrick believes there will be a strong need for standards to guide the nascent industry—and for Enterprise Architects to help navigate the complexities that will come with designing the technology that will enable the industry.

“Standards are necessary for collaborating and managing how fast this will take off,” he says.

Kirkpatrick also believes that developing open standards for the new space industry will help NewSpace companies figure out how they can work together. Although he says many NewSpace start-ups already have an interest in collaborating, with much of their work in the very early stages they do not necessarily have much incentive to work together yet. However, he says, “everyone realizes that collaboration will be critical for the long-term development of the industry.”  Beginning to work toward standards development with an organization such as The Open Group now will help incentivize the NewSpace community to work together—and thus push the industry along even faster, Kirkpatrick says.

“Everyone’s trying to help each other as much as they can right now, but there’s not a lot of mechanisms in place to do so,” he says.

According to Kirkpatrick, it’s important to begin to think about standards for space-related technology solutions before the industry reaches an inflection point and begins to take off quickly. Kirkpatrick expects that inflection point will occur once a launcher like SpaceX is able to do full return landings of its rockets that are then ready for reuse. He expects that launch costs will begin to fall rapidly over the next five to ten years once launch providers can offer reliable reusable launch services, spurring the industry forward.

“Once you see launch costs fall by a factor of 10 or 100, the business side of the industry is going to grow like a weed. We need the infrastructure in place for everyone to work together and enable this incredible opportunity we have in space. There’s a very bright horizon ahead of us that’s just a little hard for everyone to see right now. But it’s coming faster than anyone realizes.”

@theopengroup #ogSFO

Keegan Kirkpatrick is the Team Lead and founder of RedWorks, a NewSpace startup in Lancaster, California. He has an undergraduate degree in Aerospace Engineering from Embry-Riddle Aeronautical University, and before turning entrepreneur worked as an engineer at Masten Space Systems on the Mojave Air and Spaceport.

In 2015, Keegan founded RedWorks with Paul Petros, Susan Jennings, and Lino Stavole to compete in and make it to the finals of the NASA Centennial 3D Printed Habitat Challenge. Keegan’s team is creating ways to 3D-print habitats from on-site materials, laying the groundwork for human settlement of the solar system.

The Role of Enterprise Architecture in Platform 3.0 Transformation

By Stuart Macgregor, CEO, Real IRM and The Open Group South Africa

Our transition to the highly-connected realm of Platform 3.0 will radically disrupt the way that we approach Enterprise Architecture (EA).

The current architectures and methodologies will simply not hold up in the era of Platform 3.0 – characterised by the forces of big data, mobility, the Internet of Things, and social media colliding.

In the Platform 3.0 era, power shifts to the customer – as we choose from a range of services offered conveniently via digital channels. By embracing Platform 3.0, organisations can respond to newly-empowered customers. New entrants can scale at unprecedented rates, and incumbents can pivot business models rapidly, while entering and exiting new markets as opportunities emerge.

EA plays an essential role in making these possibilities a reality. EA infuses IT into the DNA of the business. No longer is it about ‘IT’ and ‘business’. Technology is absolutely integral to the entire business, and business leaders are quickly realising the fundamental truth that ‘if you can’t change the system, you can’t change the business’.

A new and exciting Platform 3.0 architectural reality is emerging. It’s composed of microservices and platforms that are combined in radical new ways to serve point-in-time needs – powering new-found business opportunities and revenue streams, dramatically transforming your organisation.

Managing volatile change

But, while driven by an urgent need to transform, to become faster and more agile, large organisations are often constrained by legacy infrastructure.

With an EA-focused approach, organisations can take a step back, and design a set of architectures to manage the volatile change that’s inherent in today’s quickly-digitising industries. EA allows business systems in different departments to be united, creating what The Open Group (the vendor-neutral global IT standards and certifications consortium) aptly describes as a “boundaryless” flow of information throughout the organisation.

Platform 3.0 refers to radically different ways for the organisation to securely engage with partners, suppliers, and others in your value chain or ecosystem. For a retailer, stock suppliers could access real-time views of your inventory levels and automatically prepare new orders. Or a factory, for example, could allow downstream distributors a view of the production facility, to know when the latest batch run will be ready for collection.
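
A minimal sketch of that retailer scenario might look like the following: a read-only, partner-facing view of inventory lets a supplier's system draft new orders automatically. The data, names and thresholds are hypothetical.

```python
# A minimal sketch of partner-facing inventory sharing. In practice this view
# would sit behind a secured API; plain functions stand in for it here.

INVENTORY = {"sku-001": 40, "sku-002": 310, "sku-003": 12}
REORDER_POINT = 50


def partner_inventory_view() -> list[dict]:
    """Expose only what the supplier needs: stock level and a reorder flag."""
    return [
        {"sku": sku, "on_hand": qty, "reorder": qty < REORDER_POINT}
        for sku, qty in INVENTORY.items()
    ]


def prepare_orders(view: list[dict]) -> list[dict]:
    """Supplier-side logic: automatically draft orders for low-stock items."""
    return [
        {"sku": item["sku"], "quantity": REORDER_POINT * 2 - item["on_hand"]}
        for item in view
        if item["reorder"]
    ]


print(prepare_orders(partner_inventory_view()))
```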

In almost every industry, there are a number of new disruptors offering complementary service offerings to incumbent players (such as Fintech players in the Banking industry). To embrace partnerships, venture-capital opportunities, and acquisitions, organisations need extensible architectural platforms.

More and more transactions are moving between organisations via connected, instantaneous, automated platforms. We’re seeing the fulfilment of The Open Group vision of Boundaryless Information Flow™ between organisations, which fuels greater efficiencies.

Architecting for an uncertain future

We need to architect for an uncertain future, resigning ourselves to not always knowing what will come next, but being prepared with an architectural approach that enables the discovery of next-generation digital business opportunities.

By exploring open standards, this transformation can be accelerated. The concept of ‘openness’ is at the very heart of Platform 3.0-based business transformation. As different business systems fall into and out of favour, you’ll want to benefit from new innovations by quickly unplugging one piece of the infrastructure, and plugging in a new piece.

Open standards allow us to evolve from our tired and traditional applications to dynamic catalogues of microservices and APIs that spark continuous business evolution and renewal. Open standards help us to reach a state of radical simplicity with our architecture.

The old-world view of an application is transformed into new applications – volatile and continually morphing – that combine sets of APIs running microservices to serve a particular business need at a particular point in time. These APIs and microservices will form the basis for whatever applications we’d like to build on top of them.
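
The sketch below illustrates that composition idea with three stand-in service functions behind stable interfaces. None of it represents a real platform; it only shows how a "point-in-time" application is just a combination of API calls that can be re-wired as individual services change.

```python
# Stand-in microservice APIs; in a real platform these would be remote calls.

def customer_profile_service(customer_id: str) -> dict:
    return {"id": customer_id, "segment": "premium"}


def pricing_service(segment: str) -> float:
    return 0.85 if segment == "premium" else 1.0


def notification_service(customer_id: str, message: str) -> None:
    print(f"notify {customer_id}: {message}")


# The "application" is just a composition of API calls; swapping one service
# for a new implementation does not change the others.
def renewal_offer(customer_id: str, list_price: float) -> None:
    profile = customer_profile_service(customer_id)
    multiplier = pricing_service(profile["segment"])
    notification_service(customer_id, f"Your renewal price is {list_price * multiplier:.2f}")


renewal_offer("C-1001", list_price=120.0)
```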

Architects need to prepare themselves and their organisations for an uncertain future, where technology’s evolution and businesses’ changing demands are not clearly known. By starting with a clear understanding of the essential building blocks, and the frameworks to re-assemble these in new ways in the future, one can architect for the uncertain future lying in wait.

Platform 3.0 requires a shift towards “human-centered architectures”: where we start acknowledging that there’s no single version of the truth. Depending on one’s role and skill-set, and the level of detail they require, everyone will perceive the organisation’s structure and processes differently.

But ultimately, it’s not about the user, or the technology, or the architecture itself. The true value resides in the content, and not the applications that house, transmit or present that content. Human-centered architectural principles place the emphasis on the content, and the way in which different individuals (from inside or outside the organisation) need to use that content in their respective roles.

As the EA practice formalises intellectual capital in the form of business models and rules, we create an environment for machine learning and artificial intelligence to play an essential role in the future of the organisation. Many describe this as the future of Platform 3.0, perhaps even the beginning of Platform 4.0?

Where this will eventually lead us is both exciting and terrifying.

@theopengroup

Stuart Macgregor is the CEO of Real IRM Solutions and The Open Group South Africa. Through his personal achievements, he has gained the reputation of an Enterprise Architecture and IT Governance specialist, both in South Africa and internationally.

Macgregor participated in the development of the Microsoft Enterprise Computing Roadmap in Seattle. He was then invited by John Zachman to Scottsdale, Arizona to present a paper on using the Zachman framework to implement ERP systems. In addition, Macgregor was selected as a member of both the SAP AG Global Customer Council for Knowledge Management and the panel that developed the COBIT 3rd Edition Management Guidelines. He has also assisted a global Life Sciences manufacturer to define their IT Governance framework, and a major financial institution to define their global, regional and local IT organizational designs and strategy. He was also selected as a core member of the team that developed the South African Breweries (SABMiller) plc global IT strategy.

As the lead researcher, Stuart assisted the IT Governance Institute in mapping COBIT 4.0 to TOGAF®, an Open Group standard. This mapping document was published by ISACA and The Open Group. He participated in the COBIT 5 development workshop held in London in 2010.

The Enviable Pedigree of UNIX® and POSIX®

By Andrew Josey, VP, Standards and Certification, The Open Group

Technology can be a fickle thing. Spurred by perpetual innovation, the one constant in the tech industry is change. As such, we can expect that whatever is the hottest thing in the industry today—Cloud, Big Data, Mobile, Social, what have you—will be yesterday’s news within a few years’ time. That is how the industry moves and sustains itself, with constant development and creativity—all of which is only getting faster and faster.

But today’s breakthroughs would be nowhere and would not have been possible without what came before them—a fact we sometimes forget. Mainframes led to personal computers, which gave way to laptops, then tablets and smartphones, and now the Internet of Things. Today much of the interoperability we enjoy between our devices and systems—whether at home, the office or across the globe—owes itself to efforts in the 1980s and 1990s to make an interoperable operating system (OS) that could be used across diverse computing environments—the UNIX operating system.

Created at AT&T Bell Laboratories in the early 1970s, the UNIX operating system was developed as a self-contained system that could be easily adapted and run on commodity hardware. By the 1980s, UNIX workstations were widely used in academia and commercially, with a large number of system suppliers, such as HP, IBM, and Sun Microsystems (now Oracle), developing their own flavors of the OS.

At the same time, a number of organizations began standardization efforts around the system. By the late 1980s, three separate organizations were publishing different standards for the UNIX operating system, including IEEE, ISO/IEC JTC1 and X/Open (which eventually became The Open Group).

As part of the standardization efforts undertaken by IEEE, it developed a small set of application programming interfaces (APIs). This effort was known as POSIX, or Portable Operating System Interface. Published in 1988, the POSIX.1 standard was the first attempt outside the work at AT&T and BSD (the UNIX derivative developed at the University of California at Berkeley) to create common APIs for UNIX systems. In parallel, X/Open (an industry consortium consisting at that time of over twenty UNIX suppliers) began developing a set of standards aligned with POSIX that consisted of a superset of the POSIX APIs.  The X/Open standard was known as the X/Open Portability Guide and had an emphasis on usability. ISO also got involved in the efforts by taking the POSIX standard and internationalizing it.

In 1995, the Single UNIX Specification was created to represent the core of the UNIX brand. Born of a superset of POSIX APIs, the specification provided a richer set of requirements than POSIX for functionality, scalability, reliability and portability for multiuser computing systems. At the same time, the UNIX trademark was transferred to X/Open (now The Open Group). Today, The Open Group holds the trademark in trust for the industry, and suppliers that develop UNIX systems undergo certification, which includes over 40,000 tests, to assure their compatibility and conformance to the standard.

These trifurcated efforts by separate standards organizations continued through most of the 1990s, with the people involved in developing the standards constantly bouncing between organizations and separate meetings. In late 1997, a number of vendors, tired of keeping track of three separate parallel efforts, suggested that all three organizations come together to work on one standard.

In 1998, The Open Group, which had formed through the merger of X/Open and the Open Software Foundation, met with ISO/IEC JTC 1 and IEEE technical experts for an inaugural meeting at IBM’s offices in Austin, Texas. At this meeting, it was agreed that they would work together on a single set of standards that each organization could approve and publish. Since then the approach to specification development has been “write once, adopt everywhere,” with the deliverables being a set of specifications that carry the IEEE POSIX designation, The Open Group Technical Standard designation, and the ISO/IEC designation. Known as the Austin Group, the three bodies still work together today to progress the joint standard. The new standard not only streamlined the documentation needed to work with the APIs but also simplified what was available to the market under one common standard.

A constant evolution

As an operating system that forms the foundational underpinnings of many prominent computing systems, the UNIX OS has always had a number of advantages over other operating systems. One of those advantages is that the standardized APIs make it possible to write code that conforms to the standard and can run on multiple systems made by different vendors. If you write your code to the UNIX standard, it will run on systems made by IBM, HP, Oracle and Apple, since they all follow the UNIX standard and have submitted their operating systems for formal certification. Free OSs such as Linux and BSD also support the majority of the UNIX and POSIX APIs, so those systems are also compatible with all the others. That level of portability is key for the industry and users, enabling application portability across a wide range of systems.
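
As a small illustration of that portability, the sketch below uses Python's os module, which wraps many POSIX interfaces on conforming systems; the same calls behave the same way on certified UNIX systems and on POSIX-compatible systems such as Linux and the BSDs.

```python
# Calling POSIX interfaces through Python's thin os-module wrappers.
# This runs unchanged on any POSIX-conforming system.

import os

info = os.uname()                      # POSIX uname(): identify the system
print(f"Running on {info.sysname} {info.release} ({info.machine})")

print("Process id:", os.getpid())      # POSIX getpid()
print("Max open files:", os.sysconf("SC_OPEN_MAX"))  # POSIX sysconf()

# POSIX open()/write()/close() via file descriptors.
fd = os.open("/tmp/posix_demo.txt", os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
os.write(fd, b"written through POSIX file descriptors\n")
os.close(fd)
```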

In addition, UNIX is known for its stability and reliability—even at great scale. Apple claims over 80 million Mac OS X systems in use today – all of them UNIX certified. In addition, the UNIX OS forms the basis for many “big iron” systems. The operating system’s high throughput and processing power have made it an ideal OS for everything from supercomputing to systems used by the government and financial sectors—all of which require high reliability, scale and fast data processing.

The standard has also been developed such that it allows users to “slice and dice” portions of it for use even when they don’t require the full functionality of the system, since one size does not fit all. Known as “profiles,” these subsets of the standard API sets can be used for any number of applications or devices. So although not full UNIX systems, we see a lot of devices out there with the standard APIs inside them, notably set top boxes, home routers, in-flight entertainment systems and many smart phones.

Although the UNIX and POSIX standards tend to be hidden, deeply embedded in the technologies and devices they enable today, they have been responsible for a great many advances across industries from science to entertainment. Consider the following:

  • Apple’s Mac OS X, the second most widely used desktop operating system today, is a certified UNIX system
  • Tim Berners-Lee developed the first Internet server for the World Wide Web on a UNIX system
  • The establishment of the World Wide Web was driven by the availability of connected UNIX systems
  • IBM’s Deep Blue supercomputer, a UNIX system, was the first computer to beat World Chess Champion Garry Kasparov in 1997
  • Both DNA and RNA were sequenced using a UNIX system
  • For eight consecutive years (1995-2002), each film nominated for an Academy Award for Distinguished Achievement in Visual Effects was created on Silicon Graphics computers running the UNIX OS.

Despite what one might think, both the UNIX and POSIX standards are still under continual development today.  The community for each is very active—meeting more than 40 times a year to continue developing the specifications.

Things are always changing, so there are new areas of functionality to standardize. The standard is also large, so there is a lot of maintenance to do and many ways to improve clarity and portability across systems.

Although it might seem that once a technology becomes standardized it becomes static, standardization usually has the opposite effect—once there is a standard, the market tends to grow even more because organizations know that the technology is trusted and stable enough to build upon. Once the platform is there, you can add things to it and run things above it. We have about 2,000 application interfaces in UNIX today.

And as internetworked devices continue to proliferate in today’s connected world, chances are many of the systems that need big processing power, high reliability and huge scale are going to have a piece of the UNIX standard behind them—even if it’s deep beneath the covers.

Andrew Josey is VP, Standards and Certification at The Open Group, overseeing all certification and testing programs. He also manages the standards process for The Open Group.

Since joining the company in 1996, Andrew has been closely involved with the standards development, certification and testing activities of The Open Group. He has led many standards development projects including specification and certification development for the ArchiMate®, TOGAF®, POSIX® and UNIX® programs.

He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects (AEA).  He holds an MSc in Computer Science from University College London.

@theopengroup

Using Apprenticeships to Develop Your IT Workforce: A Conversation with Andy Ruth

By The Open Group

It’s no secret that the IT workforce is suffering from a skills gap. Not only are there not enough workers available to fill tech positions at many companies, but even the workers available may not possess the skills that companies need today to deal with the rapid changes being brought about by digital transformation.

Andy Ruth, Managing Director of Sustainable Evolution, spoke at The Open Group Austin 2016 in July about one way companies can tackle the skills gap—apprenticeship programs. We spoke with Andy about the state of the IT workforce, why apprenticeship works and how it can help bring a new and more diverse population of workers into the IT workforce.

What are some of the things currently stymieing the IT work force?

There are a couple different things that are really a challenge. We have an older workforce that is being replaced in large part by a younger workforce. That younger workforce is smaller and many don’t have fundamental knowledge of what’s going on under the covers because they grew up learning in a world with higher levels of abstraction. For instance, if someone learns Python or Rails, they may not have the deeper understanding and stronger foundations that they might if they were to start with C or C+. I was coaching a kid that’s going to MIT, and he asked ‘What do I need to do while I’m there?’ I suggested he build an operating system for one of the new IoT processors and learn the C language. He countered with ‘Well, C’s not in use anymore and nobody builds operating systems,’ to which I said, ‘Perhaps, but that builds deep understanding and good fundamentals. You’ll know how things work and you can think deeply about it. That’s what you need is that foundation, just like you need to be able to do simple math before algebra, trig and physics.’ So, I think part of it is the shift in what and how the workforce learns.

We also are in a time of such tremendous change in IT. IT is about people, process and technology. In the past we have had big shifts in technology, then we change process and people to match. Right now we have change in all three, each having an impact on the other two. Technology change is the easiest to adopt since we are geeks and naturally track it. Process change is a bit more challenging and not as interesting, so a bit harder. People are the hardest to change because they like working the way they like to work. They don’t like to be told what to do or how to do it, and really don’t feel they need someone to tell them they need to change. Having change in people, process and technology at the same time is disruptive to people.

The change is especially hard for architects since we typically have a number of years in the industry and everything is completely different from what we grew up with. We are responsible for planning the changes needed to people, process and technology, and if we haven’t experienced it we don’t know how to get started. Also, a lot of us want to stick with the old ways or haven’t needed to change yet. We used to ask ourselves if we should still code as an architect, now if we are not coding we are not relevant.

We’ve also changed the way we develop software and the way that IT works altogether. We shifted from waterfall to agile approaches, and now DevOps is the latest approach. With architecture, we no longer have the luxury of doing heavy design and evaluation. Rather, we get started and learn as we go. If we take the wrong path, we start over. I think that it’s a challenge across the board. Worst of all, many of us haven’t worked in modern IT environments so we’re not able to teach the younger folks how to be successful in the new paradigm. Unless people have been in a start-up environment, they probably haven’t worked in the modern IT workspace.

Why is there a disconnect between the skills IT people are learning and what the workforce requires?

Two groups of people need education or reeducation. Let me address the new workforce, or kids going to college, first. It takes about three years to get a curriculum change into the college system, so there is a natural lag. Some colleges work closely with start-up companies or big companies, and those colleges can make the change fairly quickly. The colleges working with some of the older echelon companies that have been playing it safe don’t have the awareness of what’s going on in the industry, so they’re slower to change their curriculum—those are the two key pieces.

In terms of the workforce at large and their reeducation, IT has been run the same way for a long time and business has run so close to the bone. There are a lot of companies that are not operating in SOA environments and are not ready for the digital transformation going on right now. People have not been able to apply modern IT techniques at work, and hands-on is the best way to learn. Since they haven’t changed, a lot of existing staff haven’t learned the new technologies and approaches.

In the early 2000s we shifted from a structured and composed N-tier environment to decomposed integration (SOA) environments. Some companies have adopted that and some haven’t. Now we’re moving from SOA on-premise to leveraging the Cloud. People and organizations who haven’t adopted SOA yet have to take two major leaps with their people, process and technology. A majority of companies are in that boat, where they have to shift to service orientation and then have to figure out how to design for the cloud. That is two gigantic leaps, and people can take one leap at a time—often unwillingly, but they can take it. When they have to jump two levels, it kills them and they’re paralyzed.

Is that part of the reason we’re now seeing companies doing bi-modal IT?

Bi-modal or multi-model are needed to successfully adopt modern concepts and complete digital transformation. In some conversations I’ve had, there’s a difference of opinion in what bi-modal means. One is, you have an IT department that runs at two different speeds. The first speed is for the systems of record, and the second is for systems of integration. Another way to put that is that you have a consistent core and you have agility at the edge. When you move from a large system and start decomposing it, you pick off integration pieces and develop using more agile approaches. For the big back-end chunks, you take more time planning and longer timeline efforts.

Another, much more controversial definition of bi-modal is that you gracefully retire the old guard by bringing in fresh talent while modernizing your IT environment. You have the old guard maintain the current environment and the new people work on the transition to the new environment. Once you have enough talent and technology operating in the new environment you deprecate the old. If you can’t get the experienced people to shift to the new ways, they are part of that deprecation process.

What can companies do to better train and maintain employees? That seems to be a continual problem at most companies.

Invest in people and spotlight the ones that are making the shift to modern IT. That’s my passion area. As I have worked with IT groups I’ve seen the retraining budget go from about $14,000 per year per person down to a few thousand dollars down to almost zero. At the same time, there have been massive layoffs occurring all over the place so there’s no loyalty or reason to learn. Experienced people have little or no loyalty to the companies they work for and new entrants only work for a company for about 18 months, then move. If you’re a millennial in any job for more than three years then other millennials start looking at you funny like you can’t get another job. In that type of environment there’s not a lot of emphasis on the company investing in the employee or in the employee having company loyalty.

The way that I’ve been approaching it, and it’s been very successful, is by setting up apprenticeship programs very much like journeymen do in construction, or in hospitals where doctors go through residency programs for on-the-job training. I break the skills acquisition into two pieces—one is the very specific skills for the organization that can’t be taught but need to be experienced through on-the-job training. For instance, I am talking to one organization that needs 250 people on staff that can do integration. They either can’t find the talent or the talent is out of price range or unwilling to move. So I gave them an approach where they take the concept of apprenticeship and bring in people that have the key entry level skills and the right work ethic, and then pair them with someone that’s experienced with integration in that environment. The person being mentored shadows the mentor to see how it’s done, and then the mentor shadows the person being mentored and provides coaching to accelerate the apprentice’s competence. You can do that for the skills associated with business capability.  

The other thing you do is help the apprentice with the foundational skills that are not specific to the job or to a business capability: the interpersonal skills, time management, or whatever general skills they need to survive and maintain a decent work/life balance. For these types of skills you provide external training and discussion rather than job shadowing. You make the mentor responsible for the care and growth of that individual, and you tie the mentor’s yearly review goals to their success at growing the new talent.

Have you been able to implement that at some specific companies, and has it been successful?

I can’t name the companies but yes, I have been able to do it. I have also been operating my company this way to create and improve the process and build out the tools and training to support apprenticeship. I’ve been successful accelerating new workforce entrants into productive employees, and with moving existing staff into more advanced or different roles. I’ve been able to move people from traditional IT shops to agile and DevOps type environments, from dev leads to architects, and from traditional architects to modern IT architects.

The most recent and most exciting is to take kids that are not going to be able to finish college. They have the skill to get a degree but don’t have the money or interest in completing it. I’ve been taking them from doing minimum wage jobs to shifting them over and getting them into the workforce and making them productive. I’ve been able to move people into IT-related jobs as well as other business-related positions.

I apprentice them by using customer journey mapping. I teach them how it works and then have the apprentices transcribe the interviews I record and when I do a whiteboard workshop, I have them transcribe those notes into an Excel spreadsheet. I could do that electronically or with automation, but by having them do it, they learn the overall rhythm and language of business and they start to understand it. Then by talking with them about the customer journey from discovery through support or separation, they understand what the customer journey looks like. They also understand the underpinning interface with the company and how the business works and how they interact with the customer. That has been wildly successful.

With that basic knowledge they learn new skills very quickly, allowing me to focus more on helping them grow a strong work ethic and better time management. I drive through objectives rather than hours worked. I let them manage themselves so they gain a lot of confidence and they drive forward and push themselves. The other thing I do is, for the life skills they may not have, I teach those. For instance, a lot of them don’t know how to budget. I tell them not to budget using money—budget using hours. Think about a cup of Starbucks coffee as 70 minutes of your time in order to pay for it, think of your apartment rent as two weeks work, think of your car as a week’s pay. I get them thinking that way and money becomes tangible, and they get better at budgeting. 

With these entry level people who are transitioning from minimum wage jobs, are they also being hired by a company or are you teaching them the skills and then they go out and get a job?

It works both ways. I’ve helped companies get apprenticeship programs going and also apprenticed people, then they go get jobs or take jobs with the companies I consult with. Before we start, the customer and I agree I’ll be using some unskilled people to help them grow, and in return the company has the opportunity to hire the person when they are ready. I pay my apprentices a living wage as I grow them and expose them to my customers. I’m very transparent about how much they cost me and how much they have to earn to break even, and I tell them that in every business, that’s what they’re looking at. I teach them that, and then as they are introduced to my customers, my customers are welcome to hire them. Gigantic win for my employees and my customers.

This seems like it could be another avenue to help solve some of the diversity problems that the tech community is facing right now. Have you also been looking at apprenticeships in that manner?

Absolutely I have. This is another thing that is near and dear to my heart. The reason that I’m in IT is because my sister went into IT in the mid-1970s. I watched her live through that horrible time for women in IT. I’ve tried to do my part to help create a more diversified workforce in IT. Now my daughter is in IT and her journey was 10 times better than my sister’s. Not perfect, but better. Since then I have worked to identify what is broken and fix it.

I’ve also worked with a lot of kids who are disadvantaged, and I’ve been able to help them move up and into IT. Once they see a way out of their current environment and have hope, and realize that all it takes is some effort on their part, they are in. They’ve got somebody that believes in them and is willing to invest time in them, and they’re all over it, working harder and better than most of the privileged kids I’ve worked with, or the ones that feel like they’re entitled.

What can employers do to make their employees more loyal these days?

That’s a tough one because when you look at it, millennials are different. The big five leadership indicators manifest differently and they are not driven by the same incentives. There’s a big shift with millennials and there will be for future generations, but there are a lot of things you can do culturally to address that. A lot has to do with the policies that you have. For instance, companies that allow you to bring a dog in or work remotely or wear jeans and a t-shirt, or bow ties; those little things help.

But what I’ve found is the number one thing that has helped is to have millennials form relationships with the people that have a lot of experience and giving them time to grow relationships and skills. Every millennial I’ve reached out to and worked with has been hungry for the relationship and growth. They don’t want platitudes, they want people who really want to interact with them and have a genuine interest in helping them. Once you show that, big win.  

The other thing you have to do is let them experiment and not put them in a box. You have to put a group of them together and let them figure out their own boundaries and just make it objective-based. I think doing that helps an awful lot. So building those relationships, which you can do through an apprenticeship program, and then providing some freedom so they can operate in a different way; those are two of the things you can do. Heavy-handed review cycles and trying to either intimidate or incent millennials with money are not going to work. A lot of them have a high-minded idea of the way the world should work, and they’re going to be more loyal if the company they work for represents that or if the manager they work for represents that.

What are some of those ideals that they’re looking for?

Most of them are worried about the world and want it to be a better place. They see the disparity between the highest paid and lowest paid, and they want fairness, to work as a group, and for the group to be successful. A lot of their idealism is centered on those concepts, as is giving them volunteer time to work with charities and outreach programs.

What role can certification programs such as The Open Group’s play in helping to close the skills gap?

It can play a gigantic role by providing frameworks and methodologies that reflect today’s IT environment. I think we also have to shift the way that we do certification and training and a lot of that is starting to happen. We’re starting to move the bar and have a lot more practical and hands-on certifications and training.

I think we need to shift from taking an online course and then going to a place and taking a test, to working with and interacting with another person. An example of that is the top certifications for architects that The Open Group has; those are based on defending your experience and going through an interview process with peer members of that group, who then say yes, this person is what they say they are. You can’t do that with a test.

This type of approach makes it a lot more personal. What you will see over time is that people say ‘I had so and so on my board’ or ‘I had this person mentor me,’ and they start talking about their lineage based on the people they’ve worked with in the industry. If we shift more toward that type of validation as opposed to using multiple choice tests, we’ll be a lot better off.

I also think you’ll see hybrid industry/customer certifications just like you see industry/customer training. Someone will join a company and get trained and certified, but that certification will be able to follow the person rather than go away when they leave the company. What you’ll see is when an employee decides to leave, they can take part of the external facing portion of a credential with them, and only lose the internal portion. For the piece they lose, they will rely on their resume.

The other big area where you’ll see a shift in certification is, rather than being tied to technology and platforms, certification will be tied to business capabilities and business outcomes. You’ll certify that someone can build a solution toward a specific business outcome or capability that’s trying to be enabled.

@theopengroup #ogAUS

Andy started his career in IT as a technical expert in consulting roles as well as staff roles. In the mid-1990s, he shifted from delivering IT capability to delivering training, speaking at conferences and writing books and training covering the IT space. At the end of the 1990s, Andy joined Microsoft as a subject matter expert working on their public training and certification programs.

He grew to own curriculum development, then certification development, and then creating and delivering new training and certification programs. Additionally, Andy spent time as a role owner, defining job roles, levels, career ladders and compensation models for field-based architects and consultants. Over the last several years, Andy has employed his talents as a consultant, helping with business and IT strategy, and has a passion for workforce development.

The Open Group Austin 2016 Event Highlights

By Loren K. Baynes, Director, Global Marketing Communications, The Open Group

During the week of July 18th, The Open Group hosted over 200 attendees from 12 countries at the Four Seasons hotel on the beautiful banks of Lady Bird Lake in Austin, Texas, USA.

On Monday, July 18, Steve Nunn, President and CEO of The Open Group, welcomed the audience and set the stage for all the great upcoming speakers and content.

Steve’s remarks included the recent release of the Open Business Architecture (O-BA) Preliminary Standard Part I to Support Business Transformation.  This is the first in a series of installments that will help Business Architects get to grips with transformation initiatives and manage the demands of key stakeholders within the organization. Steve also referenced William Ulrich, President, Business Architecture Guild, who consulted on the development of the standard.

The plenary began with Jeff Scott, President, Business Innovation Partners, and his presentation “The Future of Business Architecture, Challenges and Opportunities”.  Jeff shared some interesting observations, including that Architects are among the best and brightest members of our organizations.  He also stated that Business Architects need support from a wide group of senior managers, not just the CEO. The ultimate goal of Business Architecture is not to model the organization but to unlock organizational capacity and move forward.

Jeff Scott (photo by Loren K. Baynes)

The Business Architecture (BA) theme continued with Aaron Rorstrom, Principal Enterprise Architect, Capgemini.  Aaron further elaborated on The Open Business Architecture (O-BA) Standard.  The O-BA Standard provides guidance to companies for establishing a BA practice and addresses three transformation challenges: consistent communication, alignment and governance, and the systemic nature of transformation.

The sessions were followed by Q&A moderated by Steve Nunn.

Up next was “ArchiMate® 3.0 – A New Standard for Architecture” with Marc Lankhorst, Managing Consultant and Service Line Manager, Enterprise Architect, BiZZdesign and Iver Band, Enterprise Architect, Cambia Health Solutions.

Marc and Iver discussed practical experiences and a Healthcare case study, which included a discussion on personal health and wellness websites.

ArchiMate®, an Open Group standard, provides a language of concepts for describing architectures, a framework for organizing those concepts, a graphical notation for them, and a vision on visualizations for different stakeholders. ArchiMate 3.0 was recently released in response to the increasing demand for relating Enterprise Architecture (EA) to business strategy, technology innovations that mix IT and the physical world, usage in new domains (e.g. manufacturing, healthcare, retail), the need for improved consistency and comprehensibility, and improved alignment between Open Group standards, notably TOGAF®.

The final session of Monday’s plenary featured a panel on “Architecture Standards Development” with Marc Lankhorst, Iver Band, Mike Lambert (Fellow of The Open Group) and Harry Hendrickx (Business Architect, Hewlett Packard Enterprise).  Moderated by Chris Forde, GM, Asia Pacific and VP, Enterprise Architecture, The Open Group, the panel represented a diverse group of the population contributing to the development of open standards.

In the afternoon, sessions were divided into tracks – Security, ArchiMate, Open Business Architecture.

Don Bartusiak, Chief Engineer, Process Control, ExxonMobil Research & Engineering presented “Security in Industrial Controls – Bringing Open Standards to Process Control Systems”.  Don went into detail on the Breakthrough R&D project, which is designed to deliver a step-change improvement that reduces replacement cost and increases value generation via the control system.  ExxonMobil is working with The Open Group and others to start up a consortium of end-user companies, system integrators, suppliers, and standards organizations to ensure the sustained success of the architecture.

Also featured was “Applying Open FAIR in Industrial Control System Risk Scenarios” by Jim Hietala, VP, Business Development and Security, The Open Group.  The focus of Industrial Control Systems (ICS) is reliability and safety.  Jim also shared some scenarios drawn from recent real-life cyberattacks.

The evening concluded with guests enjoying a lively networking reception at the Four Seasons.

Day two on Tuesday, July 19 kicked off the Open Source/Open Standards day with a discussion between Steve Nunn and Andras Szakal, VP & CTO, IBM U.S. Federal. Steve and Andras shared their views on Executable Standards: the convergence of open source creation and standards innovation; the difference between Executable Standards and traditional (i.e. paper) standards; the emergence of open source; and ensuring that interoperability and standardization become more effective over time. They further explored open technology as a driver of the software-defined enterprise, spanning SOA, social, Open Cloud architecture, e-Business, mobile, big data & analytics, and dynamic cloud.

A panel session continued the conversation on Open Standards and Open Source.  The panel was moderated by Dave Lounsbury, CTO and VP, Services for The Open Group.  Panelists were Phil Beauvoir, Archi Product Manager and Consultant; John Stough, Senior Software Architect, JHNA, Inc.; and Karl Schopmeyer, Independent Consultant, representing the Executable Standards activity in The Open Group.  Topics included Archi, the Future Airborne Capability Environment (FACE™, a Consortium of The Open Group) and OpenPegasus™, an open-source implementation of the DMTF CIM and WBEM standards.

The Open Group solves business problems through the development and use of open standards.  Interoperability is key.  Generally, no major barriers exist, but there are limitations, and those must be recognized and understood.

Steve presented Karl with a plaque in recognition of his more than 20 years of outstanding leadership of The Open Group Enterprise Management Forum and the OpenPegasus Project.

Rick Solis, IT Business Architect, ExxonMobil Global Services Co. presented “Driving IT Strategic Planning with IT4IT™ at ExxonMobil”.  Business is looking for IT to be more efficient and to add value, and ExxonMobil has been successfully leveraging IT4IT concepts and its value chain. The IT4IT™ vision is a vendor-neutral Reference Architecture for managing the business of IT.  Rick emphasized that people need to think about the value streams in the organization that add up to business value, and that it is key to think seamlessly across the organization.

Joanne Woytek, Program Manager for the NASA SEWP Program, spoke about “Enabling Trust in the Supply Chain”.  SEWP (Solutions for Enterprise-Wide Procurement) is the second-largest IT contract in the US government.  Joanne gave a brief history of the program’s use of standards, its experience identifying risks, and its goal of improving the acquisition process for government and industry.

Andras Szakal again took the stage to discuss mitigating maliciously tainted and counterfeit products with standards and accreditation programs.  The Open Trusted Technology Provider™ Standard (O-TTPS) is an open standard to enhance the security of the global supply chain and the integrity of Commercial Off The Shelf (COTS) Information and Communication Technology (ICT). It has been approved as an ISO/IEC international standard.

Afternoon tracks consisted of Healthcare, IT4IT™, Open Platform 3.0™ and Professional Development.  Speakers came from organizations such as IBM, Salesforce, Huawei, HPE and Conexiam.

The evening culminated with an authentic Texas BBQ and live band at Laguna Gloria, a historic lakefront landmark with strong ties to Texas culture.

The Open Group Austin 2016 at Laguna Gloria (photo by Loren K. Baynes)

Wednesday, July 20 was another very full day.  Tracks featured Academia Partnering, Enterprise Architecture, Open Platform 3.0 (Internet of Things, Cloud, Big Data, Smart Cities) and ArchiMate®.  Other organizations represented included San Jose State University, Quest Diagnostics, Boeing, Nationwide and Asurion.

The presentations are freely available, but only to members of The Open Group and event attendees.  For the full agenda, please click here.

In parallel with the Wednesday tracks, The Open Group hosted the third TOGAF® User Group Meeting.  The meeting is a lively, interactive, engaging discussion about TOGAF, an Open Group standard.  Steve Nunn welcomed the group and announced that almost 58,000 people are now certified in TOGAF.  It is a very large community with global demand and interest.  The key motivation for offering the meeting is to hear from people who aren’t necessarily ‘living and breathing’ TOGAF. The goal is to share what has worked and what hasn’t, and to meet other folks who have learned a lot from TOGAF.

Terry Blevins, Fellow of The Open Group, was the emcee.  The format was an “Oxford Style” debate with Paul Homan, Enterprise Architect, IBM and Chris Armstrong, President, Armstrong Processing Group (APG).  The Proposition Declaration: Business Architecture and Business Architects should be within the business side of an organization. Chris took the ‘pro’ position and Paul was ‘con’.

Chris believes there is no misalignment between Business and IT; the business got exactly what it asked for.  Paul queried where Business Architecture should live within the organization, noting that BA is a business-wide asset and that there is a need to bring all of that together in one place.

Following the debate, the floor was opened to audience questions and challenges, covering strategy in Architecture and the role of the Architect.

The meeting also featured an ‘Ask the Experts’ panel with Chris Forde; Jason Uppal, Chief Architect, QRS; Bill Estrem, TOGAF Trainer, Metaplexity Associates; and Len Fehskens, Chief Editor, Journal of Enterprise Architecture, along with Chris Armstrong and Paul Homan.

Organizations in attendance included BMC Software, City of Austin, Texas Dept. of Transportation, General Motors, Texas Mutual Insurance, HPE and IBM.

A more detailed blog of the TOGAF User Group meeting will be forthcoming.

A special ‘thank you’ to all of our sponsors and exhibitors: avolution, BiZZdesign, Good e-Learning, Hewlett Packard Enterprise, AEA, Orbus Software and Van Haren Publishing.

@theopengroup #ogAUS

Hope to see you at The Open Group Paris 2016! #ogPARIS

By Loren K. Baynes

Loren K. Baynes, Director, Global Marketing Communications, joined The Open Group in 2013 and spearheads corporate marketing initiatives, primarily the website, blog, media relations and social media. Loren has over 20 years of experience in brand marketing and public relations and, prior to The Open Group, was with The Walt Disney Company for over 10 years. Loren holds a Bachelor of Business Administration from Texas A&M University. She is based in the US.

Comments Off on The Open Group Austin 2016 Event Highlights

Filed under Accreditations, ArchiMate®, Association of Enterprise Architects (AEA), Business Architecture, Business Transformation, Certifications, Cloud, COTS, Cybersecurity, digital technologies, Digital Transformation, Enterprise Architecture (EA), Internet of Things, Interoperability, Jeff Kyle, O-TTPS, Open FAIR, Open Platform 3.0, Professional Development, Security, Standards, Steve Nunn, The Open Group Austin 2016, TOGAF®

The Open Group Austin Event to Take Place July 18-21, 2016

The Open Group, the vendor-neutral IT consortium, is hosting its latest event in Austin, TX, USA, July 18-21, 2016. The event, taking place at Austin’s Four Seasons Hotel, will focus on open standards, open source and how to enable Boundaryless Information Flow™.

Industry experts will explain how organizations can use openness as an advantage and how the use of both open standards and open source can help enterprises support their digital business strategies. Sessions will look at the opportunities, advantages, risks and challenges of openness within organizations.

The event features key industry speakers including:

  • Steve Nunn, President & CEO, The Open Group
  • Dr. Ben Calloni, Fellow, Cybersecurity, Lockheed Martin Aeronautics
  • Rick Solis, IT Business Architect, ExxonMobil Global Services Co
  • Zahid Hossain, Director, IT Architecture, Nationwide
  • William Wimsatt, Oracle Business Architect, Oracle

Full details on the agenda and speakers can be found here.

The Open Business Architecture Standard (O-BA) and ArchiMate® 3.0, a new standard for Architecture, will be the focus of Monday’s keynote sessions. There will also be a significant emphasis on IT4IT™, with the Tuesday plenary and tracks looking at using and implementing the IT4IT™ Reference Architecture Version 2.0 standard.

Further topics to be covered at the event include:

  • Open Platform 3.0™ – driving Lean Digital Architecture and large scale enterprise managed cloud integration
  • ArchiMate® – New features and practical use cases

Member meetings will take place throughout the event, and the next TOGAF® User Group meeting will be held on July 20.

Registration for The Open Group Austin event is now open to members and non-members and can be found here.

By The Open Group

@theopengroup #ogAUS

For media queries, please contact:

Holly Hunter
Hotwire PR
+44 207 608 4638
UKOpengroup@hotwirepr.com

Comments Off on The Open Group Austin Event to Take Place July 18-21, 2016

Filed under ArchiMate, Boundaryless Information Flow™, Business Architecture, Certifications, Digital Transformation, Enterprise Architecture (EA), Internet of Things, IT4IT, Steve Nunn, The Open Group, The Open Group Austin 2016, TOGAF®, Uncategorized