The Enviable Pedigree of UNIX® and POSIX®

By Andrew Josey, VP, Standards and Certification, The Open Group

Technology can be a fickle thing. Spurred by perpetual innovation, the tech industry’s one constant is change. As such, we can expect that whatever is the hottest thing in the industry today—Cloud, Big Data, Mobile, Social, what have you—will be yesterday’s news within a few years’ time. That is how the industry moves and sustains itself, with constant development and creativity—all of which is only getting faster and faster.

But today’s breakthroughs would be nowhere and would not have been possible without what came before them—a fact we sometimes forget. Mainframes led to personal computers, which gave way to laptops, then tablets and smartphones, and now the Internet of Things. Today much of the interoperability we enjoy between our devices and systems—whether at home, the office or across the globe—owes itself to efforts in the 1980s and 1990s to make an interoperable operating system (OS) that could be used across diverse computing environments—the UNIX operating system.

Created at AT&T Bell Laboratories in the early 1970s, the UNIX operating system was developed as a self-contained system that could be easily adapted and run on commodity hardware. By the 1980s, UNIX workstations were widely used in academia and commercially, with a large number of system suppliers, such as HP, IBM, and Sun Microsystems (now Oracle), developing their own flavors of the OS.

At the same time, a number of organizations began standardization efforts around the system. By the late 1980s, three separate organizations were publishing different standards for the UNIX operating system, including IEEE, ISO/IEC JTC1 and X/Open (which eventually became The Open Group).

As part of its standardization efforts, IEEE developed a small set of application programming interfaces (APIs). This effort was known as POSIX, or Portable Operating System Interface. Published in 1988, the POSIX.1 standard was the first attempt outside the work at AT&T and BSD (the UNIX derivative developed at the University of California at Berkeley) to create common APIs for UNIX systems. In parallel, X/Open (an industry consortium consisting at that time of over twenty UNIX suppliers) began developing a set of standards aligned with POSIX that consisted of a superset of the POSIX APIs. The X/Open standard was known as the X/Open Portability Guide and had an emphasis on usability. ISO also got involved in the efforts by taking the POSIX standard and internationalizing it.

In 1995, the Single UNIX Specification was created to represent the core of the UNIX brand. Born of a superset of POSIX APIs, the specification provided a richer set of requirements than POSIX for functionality, scalability, reliability and portability for multiuser computing systems. At the same time, the UNIX trademark was transferred to X/Open (now The Open Group). Today, The Open Group holds the trademark in trust for the industry, and suppliers that develop UNIX systems undergo certification, which includes over 40,000 tests, to assure their compatibility and conformance to the standard.

These three parallel efforts by separate standards organizations continued through most of the 1990s, with the people developing the standards constantly bouncing between organizations and separate meetings. In late 1997, a number of vendors, tired of keeping track of three separate parallel efforts, suggested that all three organizations come together to work on one standard.

In 1998, The Open Group, which had formed through the merger of X/Open and the Open Software Foundation, met with the ISO/IEC JTC 1 and IEEE technical experts for an inaugural meeting at IBM’s offices in Austin, Texas. At this meeting, it was agreed that they would work together on a single set of standards that each organization could approve and publish. Since then the approach to specification development has been “write once, adopt everywhere,” with the deliverables being a set of specifications that carry the IEEE POSIX designation, The Open Group Technical Standard designation, and the ISO/IEC designation. Known as the Austin Group, the three bodies still work together today to progress the joint standard, which not only streamlined the documentation needed to work with the APIs but also simplified what was available to the market under one common standard.

A constant evolution

As an operating system that forms the foundational underpinnings of many prominent computing systems, the UNIX OS has always had a number of advantages over other operating systems. One of those advantages is that its standard APIs make it possible to write code that conforms to the standard and runs on multiple systems made by different vendors. If you write your code to the UNIX standard, it will run on systems made by IBM, HP, Oracle and Apple, since they all follow the UNIX standard and have submitted their operating systems for formal certification. Free OSs such as Linux and BSD also support the majority of the UNIX and POSIX APIs, so code written to the standard is largely portable to those systems as well. That level of portability is key for the industry and for users, enabling application portability across a wide range of systems.
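To make the idea of “writing to the standard” concrete, here is a minimal sketch (not taken from the article) of a C program that uses only interfaces defined by POSIX.1 and the Single UNIX Specification; the input file name is just an illustrative placeholder. Because open(), read(), write() and close() are standardized, the same source should compile and behave the same way on any conforming system, whichever vendor supplies it.

    /* portable_cat.c - a sketch of coding to the POSIX/SUS APIs.
     * The file name below is an illustrative placeholder.
     */
    #define _POSIX_C_SOURCE 200809L   /* request the POSIX.1-2008 interfaces */

    #include <fcntl.h>
    #include <stdio.h>
    #include <stdlib.h>
    #include <unistd.h>

    int main(void)
    {
        /* open(), read(), write() and close() are defined by the standard,
         * so nothing here is tied to a single vendor's system. */
        int fd = open("input.txt", O_RDONLY);
        if (fd == -1) {
            perror("open");
            return EXIT_FAILURE;
        }

        char buf[4096];
        ssize_t n;
        while ((n = read(fd, buf, sizeof buf)) > 0) {
            if (write(STDOUT_FILENO, buf, (size_t)n) != n) {
                perror("write");
                close(fd);
                return EXIT_FAILURE;
            }
        }

        close(fd);
        return (n == -1) ? EXIT_FAILURE : EXIT_SUCCESS;
    }

Built with any conforming compiler (for example, the standard c99 command), the program copies the named file to standard output without relying on any vendor-specific extension.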

In addition, UNIX is known for its stability and reliability—even at great scale. Apple claims over 80 million Mac OS X systems in use today, all of them UNIX certified. The UNIX OS also forms the basis for many “big iron” systems. The operating system’s high throughput and processing power have made it an ideal OS for everything from supercomputing to systems used by the government and financial sectors, all of which require high reliability, scale and fast data processing.

The standard has also been developed in such a way that users can “slice and dice” portions of it even when they don’t require the full functionality of the system, since one size does not fit all. Known as “profiles,” these subsets of the standard API sets can be used for any number of applications or devices. So although they are not full UNIX systems, many devices out there carry the standard APIs inside them, notably set-top boxes, home routers, in-flight entertainment systems and many smartphones.
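Relatedly, because not every implementation provides every optional interface, POSIX itself gives programmers a way to discover what a particular system supports: option macros in <unistd.h> and run-time queries through sysconf(). The sketch below illustrates only that detection mechanism, not the profile documents themselves; the options probed are arbitrary examples.

    /* posix_options.c - sketch: probing optional POSIX functionality.
     * The options checked here are arbitrary examples; the standard
     * defines many more.
     */
    #define _POSIX_C_SOURCE 200809L

    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
    /* Compile-time hint: _POSIX_THREADS is defined in <unistd.h>
     * when the threads option is supported. */
    #ifdef _POSIX_THREADS
        printf("Threads option advertised at compile time\n");
    #endif

        /* Run-time queries: sysconf() returns -1 when an option is absent. */
        long mono = sysconf(_SC_MONOTONIC_CLOCK);
        long aio  = sysconf(_SC_ASYNCHRONOUS_IO);

        printf("Monotonic clock option: %s\n", mono > 0 ? "yes" : "no");
        printf("Asynchronous I/O option: %s\n", aio > 0 ? "yes" : "no");

        /* The version of the standard the system claims to support. */
        printf("_POSIX_VERSION reported as %ld\n", sysconf(_SC_VERSION));
        return 0;
    }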

Although the UNIX and POSIX standards tend to be hidden, deeply embedded in the technologies and devices they enable today, they have been responsible for a great many advances across industries from science to entertainment. Consider the following:

  • Apple’s Mac OS X, the second most widely used desktop operating system today, is a certified UNIX system
  • The first Internet server for the World Wide Web, developed by Tim Berners-Lee, ran on a UNIX system
  • The establishment of the World Wide Web was driven by the availability of connected UNIX systems
  • IBM’s Deep Blue supercomputer, a UNIX system, became the first computer to defeat a reigning World Chess Champion, Garry Kasparov, in a match in 1997
  • Both DNA and RNA were sequenced using a UNIX system
  • For eight consecutive years (1995-2002), each film nominated for an Academy Award for Distinguished Achievement in Visual Effects was created on Silicon Graphics computers running the UNIX OS.

Despite what one might think, both the UNIX and POSIX standards are still under active development today. The community for each is very active, meeting more than 40 times a year to continue developing the specifications.

Technology is always changing, so there are new areas of functionality to standardize. The standard is also large, so there is ongoing maintenance work and continual scope to improve clarity and portability across systems.

Although it might seem that once a technology becomes standardized it becomes static, standardization usually has the opposite effect—once there is a standard, the market tends to grow even more because organizations know that the technology is trusted and stable enough to build upon. Once the platform is there, you can add things to it and run things above it. We have about 2,000 application interfaces in UNIX today.

And as internetworked devices continue to proliferate in today’s connected world, chances are that many of the systems that need big processing power, high reliability and huge scale are going to have a piece of the UNIX standard behind them—even if it’s deep beneath the covers.

Andrew Josey is VP, Standards and Certification at The Open Group, overseeing all certification and testing programs. He also manages the standards process for The Open Group.

Since joining the company in 1996, Andrew has been closely involved with the standards development, certification and testing activities of The Open Group. He has led many standards development projects including specification and certification development for the ArchiMate®, TOGAF®, POSIX® and UNIX® programs.

He is a member of the IEEE, USENIX, UKUUG, and the Association of Enterprise Architects (AEA).  He holds an MSc in Computer Science from University College London.

@theopengroup
