STEEL CONSTRUCTION:
COMPUTER AIDED DESIGN & MANUFACTURE
1. INTRODUCTION
The ways in which computers have affected the various activities involved in steel
construction have been led by developments in computing hardware, user environments,
software and systems for data exchange. These developments in themselves have been
interlinked, typically by advances in hardware allowing new possibilities for software
development. However, not all advances for the end-user have followed this sequence. At the
tender stage, for example, the design team must:
- Interact with the client, architect and other specialists, possibly including a fabricator.
- Conceive, agree and rationalise a structural form.
- Perform rapid structural design calculations.
- Produce a limited range of drawings.
- Decide on material requirements and construction processes.
- Use these for estimating a tender price and producing tender documents.
This stage clearly involves a great deal of work which may, after the contract is awarded,
have been fruitless. From this point of view, therefore, there is a need to minimise the
effort expended in a very risky endeavour. On the other hand, in the event of winning the
contract, it is essential to reduce the amount of eventual variation from the tender
specification, so this process must be carried out in a conscientious fashion. There is
obvious scope at this stage for a relatively crude computerised approach to save a larger
amount of employee-time in preliminary sizing of members, in production of tender
drawings and in cost-estimating.
When the contract has been awarded, the successful design team is then faced with the
need to:
2. COMPUTER HARDWARE
Mechanically-based digital 'computers' were first developed by mathematicians in the 19th
Century. Until the mid-20th Century they were developed further only as far as the 'adding
machines' and electro-mechanical calculators (sometimes analogue rather than digital) used
in commercial, industrial and military applications. They performed numerical
computations much faster than could be done manually, but were limited by their large
numbers of precision-made moving parts to fairly simple general arithmetic, or to unique
tasks such as range finding for artillery.
The first electronic computers began to be developed in the mid-20th century, using radio
valves as their basic processing components. These components were accommodated on
racks and the computers thus acquired the title of mainframes. They generated large
amounts of heat, and efficient cooling and air-conditioning systems were always required.
Early computers were unreliable because of the limited life of the thermionic valves and as
the size of installations grew so did the probability of failure. The natural limit to the size
of such computers arrived when a design was considered which employed so many valves
that it was estimated by normal probability theory that it would average 57 minutes of
'down-time' out of every hour. Maintenance and operation of a computer required a large
number of specialised personnel. Compared with the previous generation of mechanical
devices, these computers were extremely powerful. Within industry they tended to be
installed mainly for payroll and financial management, but in the research environment
their development allowed the field of numerical analysis to begin to grow.
The development in the 1950s of transistors and in the 1960s and 1970s of miniaturised
integrated circuits (microchips) led to progressive improvements in the size, energy
consumption, computing power, reliability and cost of computer hardware. This enabled a
5. INTERACTION
Direct interactive use of computers was not possible on the early mainframes, but it has
progressively become the most effective method of use in most cases. Initially, dumb
terminals were used so that users could type and send to the computer directly the kind of
batch programming commands which had previously been read from punched cards.
However, with mainframes two-way communication was slow since a large number of
users might be sharing time on the central processor and data transmission rates were
rather low in any case. It was only when communication and processing speeds had
increased that interactive programs became possible. At this point, an executing program
could be made to pause and request additional data or decisions from the user at the
remote terminal, and to resume execution when this data had been entered. Results could
be shown on the terminal or printed as a hard copy.
The use of dumb terminals has now largely been superseded by distributed computing.
The personal computer itself has enough processing power and memory for most
applications, so that communication with the central processor is not subject to time-sharing and truly interactive software is possible. Where access to software or data needs
to be shared between numbers of users, computers tend to be attached to a network. In a
network a number of computers, each of which uses its own processing power, is linked
together (Figure 2) so that each has access to the others and, more importantly, each has
access to a very large central filestore on which data and software is stored. This filestore
is controlled by a "slave" computer known as the file server which generally runs the
network. When a computer in the ring needs to use a particular program it loads the
program from the filestore and runs it locally. Data produced by one computer can be held
in a common database on the central filestore and accessed by others. Such networks are
often provided with gateways to larger, national or international networks so that
information can be shared by a large group of people. Even with a home computer the use
of a modem allows a user to access the network via an ordinary telephone connection, thus
providing a dial-in facility. This possibility obviously carries the implication that data
needs protection against being corrupted by unauthorised users and, in some cases,
confidentiality must be maintained. Various systems of password protection are used to
attempt to ensure that network users do not have access beyond the areas in which they
have a legitimate interest.
Computers are not the only devices which can be attached to a network. Most of the
common types of peripheral (such as printers, plotters, scanners and other input/output
devices) can also be attached. In the case, say, of a plotter the file server will control
access to the device by queuing the output to it so that control is maintained. This queuing
system can be applied to any peripheral device which can be attached to the network; in
the context of a fabrication plant, it can be applied to a numerically controlled workshop
machine for which a number of jobs may be waiting at any one time.
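The queuing described above can be sketched as a simple first-in, first-out job queue run by the file server; the class and device names below are purely illustrative.

```python
from collections import deque

class DeviceQueue:
    """FIFO queue a file server might keep for one shared peripheral,
    e.g. a plotter or an NC workshop machine (illustrative sketch)."""

    def __init__(self, device_name):
        self.device_name = device_name
        self.jobs = deque()

    def submit(self, job):
        # Jobs arriving from any computer on the network wait in arrival order.
        self.jobs.append(job)

    def process_next(self):
        # The device handles one job at a time; the rest wait their turn.
        if self.jobs:
            return self.jobs.popleft()
        return None

q = DeviceQueue("plotter-1")
q.submit("tender-drawing.plt")
q.submit("gusset-template.plt")
```

Because the server alone dequeues jobs, two users can never drive the plotter at once, which is the control the text refers to.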
The functions of a dumb terminal were limited to two tasks:
- To show on the VDU screen the line of characters being typed at the keyboard and
eventually to send them to the remote computer (typically when the "Enter" key was
pressed).
- To show on the screen any characters sent to the terminal from the computer.
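The limited role of a dumb terminal amounts to a line-buffered echo loop: characters are echoed locally as typed, and the whole line is transmitted when "Enter" is pressed. The function below is an illustrative simulation of that behaviour, not real terminal firmware.

```python
def terminal_session(keystrokes):
    """Simulate a dumb terminal: echo each typed character to the screen,
    and transmit the buffered line to the remote computer on Enter ('\n')."""
    screen = []      # characters shown on the VDU
    sent_lines = []  # lines transmitted to the remote computer
    line = []
    for ch in keystrokes:
        screen.append(ch)        # echo locally as the user types
        if ch == "\n":           # "Enter" pressed: send the whole line
            sent_lines.append("".join(line))
            line = []
        else:
            line.append(ch)
    return "".join(screen), sent_lines
```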
Although window interfaces make computers accessible to a very wide range of potential
users, they present some difficulties for developers of software. The requirement for on-board memory is high, as is that for hard disk storage. Development of original software
for windows environments is usually rather slow and time-consuming and, therefore, the
economics of writing original technical programs for a restricted market is not always
favourable. Conversion of well-established software running in the normal operating
system environment, in such a way that it keeps its full functionality and retains the
working methods which have made it popular whilst taking advantage of the common
user-interface, is an even more difficult task. It is, therefore, often necessary to work
within the normal keyboard-based operating system environment. On PCs this is usually
MS-DOS and on workstations Unix. Using a computer in these environments requires much
more understanding of the functions of the operating system and how data is stored on
disk. Visually the user sees a blank screen, or part of a screen, with a flashing cursor to the
right of a brief prompt. In order to make the computer perform any useful task it is
necessary to type in a command in the operating system's high-level language. This is less
daunting than it sounds - with only a few commands in one's vocabulary and a working
knowledge of the directory structuring of hard disks it is possible to work very effectively
with either a personal computer or a workstation.
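As an illustration, a session in a Unix-style command environment of the kind described needs only a handful of commands; all directory and file names below are hypothetical.

```shell
# A working session needs only a few commands and a working knowledge
# of the directory structure (all names here are hypothetical)
mkdir -p projects/warehouse-frame   # create a project directory
cd projects/warehouse-frame         # move into it
echo "span=30m" > beam_design.dat   # create a small data file
mkdir -p calcs                      # subdirectory for design calculations
cp beam_design.dat calcs/           # copy the data file into it
ls calcs                            # list the subdirectory contents
cd ../..                            # return to the starting directory
```

A vocabulary of little more than `cd`, `ls`, `mkdir` and `cp`, plus the directory structure, covers most day-to-day work, as the text suggests.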
7. PROGRAMMING COMPUTERS
At the level of the processor chip very large numbers of very simple instructions are
executed in order to perform even the simplest of computing tasks. The task of
programming a computer in such terms is a very tedious process and is only attempted
when execution speed is the very highest priority for an item of software. High-level
programming languages provide an alternative means of presenting a sequence of more
Structural design software is a much more recent phenomenon, since it relies very heavily
on interaction with the design engineer and only started to become widespread when
microcomputers began to flourish in the early 1980s. Much structural design involves
relatively simple calculations - standard loading calculations, analysis and element sizing
based on rules embodied in codes of practice. These calculations have traditionally been
performed by hand, but interactive computing now enables designers to take advantage of
the power of the computer without relinquishing control over design decisions. Design
software relieves the designer of the tedium of laborious manual calculations - in many
cases a degree of 'optimisation' is incorporated within the program, but decisions about
selecting the most appropriate individual member sizes remain with the designer. Design
software now reaches into nearly all areas, but is very variable in its nature, style and
quality. The best allows considerable flexibility in use, making revisions to existing
designs easy and allowing data to be exchanged with software for analysis, CAD and
modelling and for estimating quantities.
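The kind of semi-automated member sizing described above can be sketched as a search for the lightest section whose moment resistance satisfies the design moment. The section names and properties below are invented for illustration, not real catalogue values, and only a simple bending check is shown; the final choice still rests with the designer.

```python
# Illustrative section table: (name, mass kg/m, plastic modulus Wpl in cm^3)
SECTIONS = [
    ("UB-A", 25.0, 260.0),
    ("UB-B", 33.0, 480.0),
    ("UB-C", 40.0, 650.0),
    ("UB-D", 52.0, 900.0),
]

def lightest_section(design_moment_kNm, fy_N_mm2=275.0):
    """Suggest the lightest section whose plastic moment resistance
    Mpl = Wpl * fy meets the design moment (simple bending check only).
    Returns None if no listed section is adequate."""
    candidates = []
    for name, mass, wpl_cm3 in SECTIONS:
        # cm^3 -> mm^3 (x1e3); N.mm -> kNm (/1e6)
        mpl_kNm = wpl_cm3 * 1e3 * fy_N_mm2 / 1e6
        if mpl_kNm >= design_moment_kNm:
            candidates.append((mass, name))
    return min(candidates)[1] if candidates else None
```

The program proposes a size; accepting, rejecting or overriding it remains a design decision, which is the division of labour the text describes.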
In the context of steel structure design, the material available starts with "free disks"
provided by manufacturers of cold-formed products such as sheeting, composite decking
and purlins, which effectively provide quick look-up tables for safe working loads and
spans against key dimensions. Element design to various codes includes beams (both steel
and composite), columns and beam-columns, and connections of various kinds. Whilst
element design usually takes the form of free-standing executable programs the power of
present-day spreadsheet software is such that applications for standard spreadsheets can
provide a very flexible way of automating these fairly straightforward design processes,
with good links to other standard software. Plastic design of steel frames, particularly low-rise frames such as portals, is available in different degrees of sophistication in terms of its
convenience in use, links to downstream software and CAD, and in the order of analysis it
offers. Plastic design is one area where different degrees of analytical capability provide
different orders of realism in results; the more non-linear analysis, which allows
development of plastic zones, can produce distinctly lower load resistances than the rigid-plastic and elastic-plastic versions.
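As a small worked illustration of rigid-plastic analysis, the collapse load of a fixed-ended beam carrying a central point load follows from the work equation for its three-hinge mechanism. The function below sketches this single standard case only.

```python
def rigid_plastic_collapse_load(mp_kNm, span_m):
    """Rigid-plastic collapse load (kN) of a fixed-ended beam with a
    central point load. Hinges form at both supports and at midspan;
    equating external and internal work,
        P * (theta * L / 2) = Mp * (theta + 2*theta + theta)
    gives P = 8 * Mp / L. More refined elastic-plastic or non-linear
    analyses can give lower resistances, as noted in the text."""
    return 8.0 * mp_kNm / span_m
```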
Typical facilities of interactive draughting software include:
- Snapping, for instance onto the end or mid-points of lines, grid points, tangents, etc.
- Automatic grid generation.
- Rubber-band shapes, including lines, rectangles, circles and other shapes, allowing
them to be placed, dragged, stretched and distorted.
These facilities are now fairly typical in professional personal computer CAD tools.
Increased intelligence has been introduced into the way elements are represented, for
instance, in according specific relationships between drawn elements. There is, however, a
penalty to be paid for storing data in an intelligent form, since:
The definition of a 3D model in this way contains a complete geometrical and topological
description of the structure, including all vertices, edges and surfaces of each physical
piece of steel. As a result all element dimensions are automatically tested for
compatibility, and clashes which can easily carry through in the traditional processes are
removed. The model allows the efficient generation of conventional drawing information,
including general arrangement drawings (plans, elevations, sections, foundations,
isometric views - Figure 8), full shop fabrication details for all members, assemblies and
fittings (Figures 9a and 9b), and calculation of surface areas and volumes for all steelwork.
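The surface areas and volumes mentioned above follow directly from each member's section properties and length. A minimal sketch, assuming a prismatic member and a typical steel density (all numerical values illustrative):

```python
STEEL_DENSITY = 7850.0  # kg/m^3, a typical value for structural steel

def member_quantities(area_m2, perimeter_m, length_m):
    """Quantities a 3D model can report for one prismatic member:
    volume and mass from the cross-section area, and paint (surface)
    area from the section perimeter. Fittings, end cuts and holes
    are ignored in this sketch."""
    volume = area_m2 * length_m          # m^3
    mass = volume * STEEL_DENSITY        # kg
    paint_area = perimeter_m * length_m  # m^2
    return volume, mass, paint_area
```

Summing these per-member quantities over the whole model gives the steelwork totals used for estimating and material lists.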
Further benefits of such systems are related to the links which can be established with
other parts of the production process. Full size templates can be drawn, e.g. for gusset
plates, and wrap-around templates for tubes. Erection drawings can be output and material
lists (including details of cutting, assembly, parts, bolts, etc.) produced automatically. An
interface to a management information system can also facilitate stock control, estimating,
accounting, etc. Potentially of greatest importance is the possibility of downloading data
directly to Numerically Controlled (NC) fabrication machinery, automating much of the
fabricating work itself. At this level, 3D modelling is the central controlling tool for an
integrated steel fabrication works in which the total design-and-build package is offered.
- Computing facilities continue to improve dramatically and their use is now highly
cost-effective for a wide range of activities within steel construction.
- Interactive graphical user interfaces have become standard, making it easier for
non-specialists to use computers.
- Different facilities are required by different organisations within the design and
construction process.
- The greater the degree of automatic data transfer between different applications,
the more efficient the overall process will be.