G-EXEC and the G-Experience
In the dim and distant days of 1973, I joined the small team of Keith Jeffery at the Institute of Geological Sciences (now the British Geological Survey) and Elizabeth Gill at the Atlas Computer Laboratory (now part of the Rutherford Appleton Laboratory), who were doing something quite amazing. Though none of us really appreciated it at the time, they had started developing a relational database management system. The Geological Survey was faced with a serious problem: much of its geological applications software was written for, and would only work on, the ICT Atlas computer. Unfortunately this machine was at the end of its operating life and was to be replaced by a shiny new IBM 360/195, totally incompatible with the Atlas. Something had to be done urgently to provide software support for the exploration geochemists and others who (even then) relied on computer applications to manage their large volumes of data, and to generate maps and statistics.
Faced with a clean slate and an urgent requirement to satisfy the needs of several different groups of users at the same time, in November 1972 Keith Jeffery embarked on a completely new concept in software design. The system which was being developed was named G-EXEC, and was based on a set of fundamental design principles.
The system was developed entirely in Fortran, and its design principles dictated the use of a simple standard file structure. In fact, the second major innovation in G-EXEC was the definition of a file structure which could be used to handle any type of geoscience data. The G-EXEC file consisted of a simple table of rows and columns, with a header block containing field names and definitions of field type, length, etc.
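The idea of such a self-describing file can be sketched in a few lines of modern code. The following Python fragment (a minimal illustration, not the historical G-EXEC format; the header layout, type codes, and field names are invented for this example) shows a header block of field names, types, and widths followed by fixed-width data rows:

```python
# A self-describing table file in the spirit of G-EXEC: a header block
# giving each field's name, type code, and width, then fixed-width rows.
# Header layout and type codes ('A' character, 'I' integer, 'F' real)
# are illustrative assumptions, not the historical format.

HEADER = [
    ("SAMPLE", "A", 8),   # character field, width 8
    ("CU_PPM", "I", 6),   # integer field, width 6
    ("AU_PPB", "F", 8),   # real field, width 8
]

def write_table(rows):
    """Serialise rows to a self-describing fixed-width text table."""
    lines = [f"{name:<8}{ftype}{width:>3}" for name, ftype, width in HEADER]
    lines.append("*DATA")  # marker separating header block from data rows
    for row in rows:
        parts = []
        for (_, ftype, width), value in zip(HEADER, row):
            # character fields left-justified, numeric fields right-justified
            parts.append(f"{value:<{width}}" if ftype == "A" else f"{value:>{width}}")
        lines.append("".join(parts))
    return "\n".join(lines)

def read_table(text):
    """Parse the header block, then slice each data row by field widths."""
    lines = text.splitlines()
    split = lines.index("*DATA")
    fields = [(l[:8].strip(), l[8], int(l[9:12])) for l in lines[:split]]
    rows = []
    for line in lines[split + 1:]:
        row, pos = [], 0
        for _, ftype, width in fields:
            raw = line[pos:pos + width].strip()
            pos += width
            row.append(int(raw) if ftype == "I" else
                       float(raw) if ftype == "F" else raw)
        rows.append(tuple(row))
    return fields, rows
```

Because every program reads the header before the data, any utility can process any table without prior knowledge of its contents, which is what made a single file structure serviceable across so many applications.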
This self-describing table file structure lent itself readily to relational database management, and G-EXEC was perhaps the first extensive system to be based consciously on Codd's relational database principles. Database functionality was implemented as explicit procedures rather than in a new language (there was nothing like SQL), but this provided a great element of robustness: user errors could be traced and fixed very easily.
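The "explicit procedures rather than a new language" style can be illustrated as follows. This Python sketch (the procedure names and table representation are assumptions for the example, not G-EXEC's own) shows relational restriction, projection, and join each exposed as a plainly named routine the user invokes directly, with no query language in between:

```python
# Relational operations as explicit named procedures, in the style the
# article describes (no SQL-like language). A table is modelled here as a
# list of dicts; names and signatures are illustrative, not G-EXEC's.

def select(table, predicate):
    """Relational restriction: keep only rows satisfying the predicate."""
    return [row for row in table if predicate(row)]

def project(table, fields):
    """Relational projection: keep only the named columns."""
    return [{f: row[f] for f in fields} for row in table]

def join(left, right, key):
    """Natural join on a single shared key column."""
    index = {}
    for row in right:                       # index the right table by key
        index.setdefault(row[key], []).append(row)
    out = []
    for lrow in left:                       # merge each matching pair
        for rrow in index.get(lrow[key], []):
            merged = dict(lrow)
            merged.update(rrow)
            out.append(merged)
    return out
```

Since each operation is an ordinary procedure call with explicit arguments, a failure points directly at the offending call, which is consistent with the robustness and easy error-tracing the article attributes to this design.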
Because of the restrictions of Fortran 66 and the IBM 360 operating system, it was impossible to allow directly for varying file sizes in the Fortran code: that became possible only much later with full implementations of Fortran 77. However, this was no problem for G-EXEC. All that was needed was an operating system facility for one job to spawn another - and this was available. Thus G-EXEC was implemented as a two-stage operation. In the first stage, the required file sizes and the sequence of processing were specified, and the G-EXEC 'Controller' actually wrote the JCL (job control language) and the Fortran code for a mainline program to be executed in the second stage, defining array sizes as needed and calling the required set of process programs as subroutines. Today this might seem a very clunky solution to the problem, but in the 1970s it was a very elegant way to circumvent the limitations of the mainframe computers of the day. The same G-EXEC system structure was implemented on many different computers including mainframes and minis from IBM, Univac, Honeywell, DEC, Data General, and even a Cray-1.
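The essence of that first-stage code generation can be sketched briefly. The Python function below (a modern illustration; the emitted Fortran, the program and subroutine names, and the argument convention are all assumptions, not the actual G-EXEC output) generates a mainline with array dimensions fixed to the sizes the user declared, calling each requested process program in turn:

```python
# Sketch of the two-stage trick: since Fortran 66 had no dynamic arrays,
# a first-stage "Controller" generated a second-stage mainline whose
# DIMENSION statements were fixed to the user's declared file sizes.
# The emitted Fortran here is illustrative, not actual G-EXEC output.

def generate_mainline(nrows, nfields, processes):
    """Emit a Fortran-66-style mainline with fixed array sizes that
    calls each requested process program as a subroutine."""
    lines = [
        "      PROGRAM GMAIN",
        f"      DIMENSION DATA({nrows},{nfields})",   # sizes baked in
    ]
    for proc in processes:
        lines.append(f"      CALL {proc}(DATA,{nrows},{nfields})")
    lines.append("      STOP")
    lines.append("      END")
    return "\n".join(lines)
```

In the real system the generated source and its JCL were then submitted as a second job, so from the user's point of view the fixed-size restriction of the language effectively disappeared.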
It is interesting that G-EXEC - though providing more of a learning challenge than many of the single-purpose programs then in use - became an instant hit among the BGS user community. The software development and support team grew to four core members (with the inclusion of John Cubitt) with several others also contributing. However, it was a product which took root and grew despite perceived indifference and sometimes direct opposition from the upper echelons of the management hierarchy. The system gained a degree of official acceptance in the later 1970s when Keith Jeffery and Elizabeth Gill were appointed as the founder members of the NERC Central Computing Group (NERC being the UK Natural Environment Research Council, the umbrella organisation of BGS). This move reflected the spread of G-EXEC use to other institutes within the Research Council - by 1979 the system was used not only in geology but also in fields such as oceanographic mapping, terrestrial ecology, and even statistics of bird observations.
One of the unique features of G-EXEC was its breadth of applications. At its core lay a wide range of relational database operators and data manipulation utilities. However, this was supplemented from its earliest days by statistical and graphical applications. Its first and most intensive users were in geochemical exploration, especially the large team led by Dr (now Professor) Jane Plant. This UK-based team had its parallels in the Overseas Division, carrying out similar work around the world, and one of my own roles was in the computing support of these teams - operating in countries as diverse as Indonesia, the Solomon Islands, Botswana, Ecuador, and Costa Rica.
A major event in 1975 was a workshop held at the Atlas Computer Laboratory, at which Dr Peter Sutterlin of the University of Western Ontario, who had spent a year's sabbatical with the G-EXEC team, demonstrated 'Filematch', a data interchange standard he had developed to interface between G-EXEC and counterpart systems from Germany, France, Sweden, Canada, and the USA. It is interesting that part of the preparation for this workshop was carried out by e-mail (though we knew it as asynchronous teleconferencing) over the Arpanet, the predecessor of the Internet. Following this workshop, we put a great deal of effort into the definition of new file structures for data transfer, resulting in the 'G-SEND' structure, which was used for multi-frame graphics files and also to transmit data and text files.
In early work (1978-9) on disposal of high-level radioactive waste, I developed three-dimensional modelling and fluid-flow simulation applications within G-EXEC, at the same time extending the G-EXEC command language to provide a full general simulation framework. This was used to model the Loch Doon granite in southern Scotland, then considered a potential disposal site, but there was insufficient geological data available to validate the model.
The future seemed to be very bright for G-EXEC, which technologically was perhaps at least a decade ahead of its time. However, management opposition crystallised in 1979 with proposals to form a 'NERC Computing Service' to take over responsibility for all computing from the separate institutes. The first director of this service was an external appointee, Mr Brian Rule, who had little understanding of, or sympathy for, G-EXEC and its 'nerd' sub-culture (a popular saying had been coined about G-EXEC: "not so much a system, more a way of life"). Within the space of only two or three more years, the G-EXEC team had dispersed. In a few places, however, the software remained in use until the 1990s - especially for some of the oceanographic applications, to which it was particularly well suited.
After I left BGS and set up Mineral Industries Computing Ltd with Peter Stokes in 1981, the G-EXEC design principles and structure were adapted to form the core of the Datamine mining software system, and even now there remains much code at the heart of Datamine - especially in database management, statistics, and graphics applications - which can be traced directly back to the 1970s.
Even apart from that, the G-EXEC system - or at least its name and spirit - is not necessarily dead. If you look at www.g-exec.com you will see some plans and ideas to develop a new product on the same principles as the original G-EXEC system, though using XML as a basis for its database - but maybe still using some flavour of Fortran as its main development language.

Stephen Henley
Copyright © 2000 Stephen Henley
G-EXEC and the G-Experience: Earth Science Computer Applications, v.15, no.11, p.1-3