The Human-Computer Interface

The user interface is not strictly speaking part of the operating system, but an intermediate layer of software that allows the user to interact effectively with the operating system. In some operating systems, the user interface software is closely integrated with the operating system itself (examples include Microsoft Windows and Mac OS). Other operating systems keep the user interface distinctly separate, allowing users to select from a range of available interfaces (Linux is a good example). Virtually all operating systems, however, provide a command-line interface, usually called a command shell, in which the user types short text commands to execute system functions, run programs, or manage files and directories. Windows 7, for example, provides a command-line facility called cmd.exe.
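At the heart of every command shell is a simple read-evaluate loop: read a line of text, split it into a command and its arguments, and dispatch it to the appropriate handler. The Python sketch below is purely illustrative (the built-in commands and their behaviour are simplified assumptions, not a model of cmd.exe or any real shell), but it shows the general shape of the mechanism.

```python
import os
import shlex

# A toy command interpreter: parse one line of input and dispatch it to a
# built-in handler, much as a real shell dispatches commands to programs.
def shell_eval(line):
    args = shlex.split(line)          # split the line, respecting quotes
    if not args:
        return ""                     # empty input: nothing to do
    cmd, rest = args[0], args[1:]
    if cmd == "echo":                 # echo its arguments back to the user
        return " ".join(rest)
    if cmd == "pwd":                  # report the current working directory
        return os.getcwd()
    return cmd + ": command not found"

if __name__ == "__main__":
    print(shell_eval("echo hello world"))
```

A real shell would, of course, search the file system for executable programs and run them in a child process rather than handling everything with built-ins, but the parse-and-dispatch structure is the same.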

Although most users today prefer to use the more intuitive graphical user interface (GUI), with its application windows, program icons, and mouse-driven menus, a command line environment provides some powerful features, particularly from the point of view of system administrators, who can use it to quickly perform low-level system management and configuration tasks. The GUI, on the other hand, takes advantage of the fact that users can recognise and respond to visual cues, and removes the need to learn obscure commands by allowing a pointing device (usually a mouse) to be used to perform operations such as opening a user application or navigating the file system at the click of a button.

Modern application software is usually written for a particular operating system. This allows the programmer to take advantage of the operating system's application programming interface (API), which provides a standard set of functions for creating the user interface for the application. The result is that all applications written for a particular operating system present a standardised interface to the user. This facilitates ease of use, because a user encountering a new software application can concentrate on learning the salient features of the application immediately, without having to get used to a completely different style of user interface.
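Most GUI-oriented APIs follow an event-driven pattern: the application registers callback functions with the toolkit, and the toolkit invokes them when the user acts on a control. The sketch below uses hypothetical names (no real windowing API is being modelled) simply to illustrate that shape.

```python
# A minimal sketch of the event-driven pattern behind GUI APIs: the
# application registers handlers with a control, and the toolkit later
# dispatches user events to those handlers. All names are illustrative.
class Button:
    def __init__(self, label):
        self.label = label
        self._handlers = []

    def on_click(self, handler):
        # Register a callback to be invoked when the button is clicked.
        self._handlers.append(handler)

    def click(self):
        # In a real toolkit the operating system delivers this event;
        # here we simulate the dispatch directly.
        for handler in self._handlers:
            handler(self)

clicks = []
button = Button("Open")
button.on_click(lambda b: clicks.append(b.label))
button.click()
```

Because every application registers its behaviour through the same small set of calls, the toolkit can draw and manage every button, menu and window in a uniform way, which is precisely what gives applications on the same operating system their standardised look and feel.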

The study of human-computer interaction (HCI) is concerned with the design and implementation of interactive systems, and with the interaction between humans and computers. It is often unclear whether it is a branch of computer science, cognitive psychology, sociology or even industrial design. From the point of view of this discussion, we are mostly interested in the computer science aspects, although other disciplines have a supporting role to play, such as communication theory, graphic design, linguistics and the social sciences. Chiefly, we are concerned with effective cooperation between human beings and computers in jointly carrying out various tasks.

Interface design, specification and implementation must consider the ways in which human beings can effectively communicate with machines, and how they learn to use the interface through which this is achieved. The human-computer interface in a modern operating system may well represent more than half of the operating system's program code. The application of software engineering and design principles is obviously important to the development of the operating system software, but consideration must also be given to the performance of the user. A graphical user interface is not necessarily the most efficient kind of interface, but for most people it is the preferred way of working with the computer. It is usually far easier and quicker to learn how to use a graphical user interface than to become reasonably proficient in a command line environment.

You will notice if you look at your keyboard that the first row of letters spells "QWERTYUIOP". This layout was originally devised for typewriters, and was designed by inventor Christopher Sholes of Milwaukee sometime around 1872. The keys are arranged in this fashion for a reason. In the first typewriters, each key drove one of a set of hammers arranged in a circle, which struck an inked ribbon and forced it onto the paper, leaving a printed character. Because the hammers would often jam if adjacent keys were pressed in quick succession, Sholes arranged the layout so that pairs of letters a typist was likely to hit one after the other were not adjacent to each other. The layout became standard for typewriters, and subsequently migrated to computer keyboards. So far, a more ergonomic keyboard has not found widespread use, although an alternative has been available since the 1930s in the form of the Dvorak keyboard. The main claims made for the Dvorak layout are that it is more comfortable to use, and that it may help to reduce repetitive strain injuries.


An early QWERTY typewriter keyboard


The mouse was invented in 1964, and has been a standard feature of the human-computer interface since 1973, although it was not until the 1980s that it began to be used with IBM-compatible PCs. The graphical user interface (GUI) was first popularised by Apple's Macintosh computer in 1984, and has become a standard feature of all modern desktop operating systems. The GUI uses metaphors such as the desktop, and its characteristic features are windows, icons, menus, and pointers (the elements behind the older acronym WIMP, which the term GUI has largely replaced). Windows are the workspaces inside which each application runs, and can typically be resized, moved around, and hidden as required by the user. Icons are small images used as shortcuts, for example to open applications, files or directories. A menu presents a short list of functions, allowing the user to easily select the task they wish to perform. Pointers are moveable images, controlled by a pointing device such as a mouse, that allow the user to track their position on the screen and to select windows, icons or menu items at the click of a button.
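The pointer's job in a WIMP interface is essentially hit-testing: mapping a screen coordinate to whichever window or icon lies beneath it, with overlapping windows resolved by stacking order. The following simplified sketch (the rectangle representation and names are assumptions for illustration only) captures that idea.

```python
# Simplified pointer hit-testing: given a pointer position, find the
# topmost window whose rectangle contains it. Windows are listed from
# bottom to top, so we search in reverse to respect stacking order.
class Window:
    def __init__(self, title, x, y, width, height):
        self.title = title
        self.x, self.y = x, y
        self.width, self.height = width, height

    def contains(self, px, py):
        return (self.x <= px < self.x + self.width and
                self.y <= py < self.y + self.height)

def window_at(windows, px, py):
    for win in reversed(windows):     # check the topmost window first
        if win.contains(px, py):
            return win
    return None                       # the pointer is over the desktop

desktop = [Window("Editor", 0, 0, 400, 300),
           Window("Browser", 200, 100, 400, 300)]
```

A real window system performs this lookup on every click, then delivers the event to the application that owns the window found, which is how a single pointing device can drive many independent applications.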


The Windows 7 desktop



Interface design principles

A number of design principles are involved when creating a user interface, among them consistency of layout and behaviour, clear and timely feedback on the user's actions, prevention of and recovery from errors, and minimising the load on the user's memory.


Future developments in interface design

Attempts to predict the future of technology have often proved to be spectacularly inaccurate. It is therefore useful to bear in mind that any speculation about the future of the human-computer interface can be based only on what is currently known to be possible. The current economic model is one in which the cost of hardware is decreasing while speed and capacity are increasing, suggesting that computational facilities will become increasingly ubiquitous, and that the degree of human-computer interaction will continue to increase. The miniaturisation of hardware components, together with ever lower power requirements, will make it possible to deploy embedded computer systems in an increasing range of applications, including hand-held mobile devices, vehicles, household appliances and personal accoutrements.

New display technologies have already appeared, making it possible to view multimedia content or take part in a video conference using a mobile phone, for example. TFT monitors, whilst currently still significantly more expensive than CRT monitors, have many advantages, including being lighter, having a smaller footprint, using considerably less power, and emitting less heat and radiation. The assimilation of computation into the environment means that it is already possible to automate the monitoring and control of temperature, humidity and lighting levels in an office building. New developments in I/O techniques may mean that the keyboard and mouse will soon be obsolete, as our computers learn to recognise voice commands, or even learn to converse with us. Alternative I/O techniques are already being developed to improve the availability of computing resources to various disadvantaged groups.