
7th IFAC Conference on Manufacturing Modelling, Management, and Control
International Federation of Automatic Control
June 19-21, 2013. Saint Petersburg, Russia

Virtual and Augmented Reality Applications in Manufacturing


A.Y.C. Nee, S.K. Ong

Mechanical Engineering Department, National University of Singapore,
Singapore (Tel: 65-65162892; e-mail: {mpeneeyc/mpeongsk}@nus.edu.sg)

Abstract: Augmented Reality (AR) is a fast-rising technology that has been applied in many fields such as gaming, learning, entertainment, medicine, the military, sports, etc. This paper reviews some of the academic studies of AR applications in manufacturing operations. Comparatively, manufacturing has been less addressed, due to its stringent requirements of high accuracy and fast response, and the desired alignment with industrial standards and practices, so that users do not face a drastic transition when adopting this new technology. This paper looks into common manufacturing activities such as product design, robotics, facilities layout planning, maintenance, CNC machining simulation and assembly planning. Some of the issues and future trends of AR technology are also addressed.
Keywords: Virtual reality; Augmented Reality; Manufacturing

1. INTRODUCTION

Barely a century ago, manufacturing was known as a black art, where most of the tools and technologies were primarily mechanical in nature. Mechanical moving elements were initially powered by steam and later by electric power. Elaborate overhead belt systems were used to provide power supply to each machine, as this was more economical than having machines driven by individual power sources.

In the 1950s, numerical controlled machine tools made a huge leap, and since then manufacturing has entered a new era. In the last several decades, due to the advancement of information technology, digital manufacturing has become a common platform worldwide. Computer-integrated manufacturing systems have eliminated data handling errors. Computer simulation using CAD modelling tools and finite element analysis has helped manufacturing engineers to reach decisions faster and free from errors.

Virtual reality as a simulation tool was first reported in the 1960s. Since then, many different forms have appeared, from 2D monitor-based displays to 3D immersive and sophisticated set-ups such as the CAVE. In just over two decades, augmented reality (AR) technology has matured and proven to be an innovative and effective tool to address some of the critical problems in simulating, guiding and improving manufacturing processes before they are launched. Activities such as design, planning, machining, etc., can now be done right-the-first-time without the need for subsequent re-work and modifications. Much like VR, which is a great simulation tool, AR is a novel human-computer interaction tool that overlays computer-generated information on the real scene. The information display and image overlay are context-sensitive, depending on the observed objects (Azuma et al., 2001). AR can be combined with human abilities to provide efficient and complementary tools to assist manufacturing tasks. Several successful demonstrations have been made in various domains such as gaming, advertising, entertainment, medical, military, and manufacturing (Ong and Nee, 2004, Yuan et al., 2008a).

AR research in manufacturing applications is a strong and growing area. However, it faces more stringent demands on accuracy, response time, and interface design. The challenge is to implement integrated AR-assisted simulation tools that could improve manufacturing operations, as well as product and process development, leading to faster learning and training, hence shorter lead-times and, consequently, reduced cost and improved quality.

2. HARDWARE DEVICES AND SOFTWARE SYSTEMS

2.1 Hardware devices

Head-mounted display (HMD) devices were a popular choice when AR applications were first developed, as the eye-level display facilitates direct perception of the combined AR scene. HMD devices, however, are uncomfortable and may cause headache and dizziness, especially after prolonged usage.

Current research in AR applications is moving towards mobility using handheld devices (HHD), either commercially available or specially designed (Hakkarainen et al., 2008, Stutzman et al., 2009, Xin et al., 2008). The advantages of using HHD are quite obvious, as a high-resolution camera, touch screen, gyroscope, etc., have already been embedded in these mobile devices.

The use of off-the-shelf mobile phones in AR applications is on the rise. However, due to the limited processing and storage capabilities of mobile phones, some researchers use a client-server architecture to improve real-time performance. Hakkarainen et al. reported a study on the use of mobile phones for an AR-assisted assembly guidance system.

978-3-902823-35-9/2013 IFAC 15 10.3182/20130619-3-RU-3018.00637


2013 IFAC MIM
June 19-21, 2013. Saint Petersburg, Russia

A client-server architecture was designed using a mobile phone, with a PC used for computing the CAD models; the statically rendered images are sent to the mobile phone for fast rendering. A similar set-up was reported by Ha et al. using an Android mobile phone.

Several AR applications use force feedback to enhance the immersive sensation of the user. Wearable datagloves, such as in assembly, design, etc., have been studied (Valentini, 2009). Haptic devices for path planning of a virtual robot have been reported in (Chen et al., 2010).

2.2 Software systems

The most important elements in any AR application are tracking and registration. Precise tracking and registration allow the virtual and real objects to be aligned accurately. Current tracking and registration algorithms can be classified into marker-based, natural feature-based and model-based.

Fiducial markers are popular as they have unique geometric patterns that allow easy detection and identification in a video stream. Marker-based tracking is suitable for a prepared environment where markers are placed a priori. ARToolKit (ARToolKit 2.11, 2011, Kato and Billinghurst, 1999) is one of the most well-known tracking libraries.

For an unprepared environment, natural feature tracking is necessary. Current natural feature tracking is based on the robust point matching approach. Various feature descriptors, such as Binary Robust Independent Elementary Features (BRIEF), Speeded Up Robust Features (SURF), ferns features, Scale Invariant Feature Transform (SIFT) features, etc., have been explored for natural feature detection.

For estimating camera pose in an unknown scene, Parallel Tracking and Mapping (PTAM) (Klein and Murray, 2007) is often used. By processing tracking and mapping in parallel threads and using keyframe-based mapping, a map of an unknown environment can be constructed from prominent features, allowing virtual objects to be registered onto the real world.

Model-based tracking can be used to match detected features against a list of pre-created models. A model-based tracking library, the Open Tracking Library (OpenTL) (Panin et al., 2008), provides good APIs and can handle multiple-object tracking. OpenTL uses multi-threading and GPU-based computing to achieve real-time performance. However, it is not developed specifically for AR applications, but for general-purpose model-based object tracking.

3. MAJOR VR AND AR RESEARCH IN MANUFACTURING

Many researchers in the manufacturing industries, academic institutes and universities have started exploring the use of AR technology in addressing some complex problems in manufacturing. Effective simulation before an actual operation will ensure that it can be carried out right-the-first-time, eliminating many trials and re-works and saving materials, energy and labour. VR applications have been well reported in virtual prototyping, web-based virtual machining, assembly, fault diagnosis and learning, and various types of manufacturing operations. Advancement in computer and manufacturing technologies has provided suitable interfaces to allow users to interact directly with the manufacturing information associated with the manufacturing processes. AR can provide users with an intuitive way to interact directly with information in manufacturing processes. It also allows the operators to use their natural spatial processing abilities to obtain a sense of presence in the real world with virtual information.

Research on the manufacturing applications of VR and AR is definitely a strong and growing area; it has progressed much in the last ten years due to advances in both hardware and software. Hardware has become considerably smaller and more powerful, while many efficient and robust algorithms have been developed to allow faster response and improved accuracy in tracking and registration.

3.1 VR and AR research in design

VR has been used in creating product designs as it provides very intuitive interaction with the designers in terms of visualization and interfacing with downstream processes. A hybrid immersive modelling environment merging desktop CAD was created by Stark et al. (2010), Wiese et al. (2009) and Israel et al. (2009). They noted that the current modelling media using paper and CAD systems are complementary but lack interaction. Digital media offers great freedom in exploring different dimensions and features, using stored forms and shapes from a library, and the advantage of integrating a product model with associated physical properties. In addition, some downstream processes, such as process planning, machining, and inspection, can be fully integrated. Fig. 1 shows the transformation of a sketch of a ceiling lamp to a final product using rapid prototyping techniques.

Fig. 1. Example of a ceiling lamp from sketch (left) to the refined model (middle) and the final prototype (right) (Stark et al., 2010).

AR is becoming a major part of the prototyping process in product design in many industries. For example, in the automotive industry, AR has been used for assessing interior design by overlaying different car interior mock-ups, which are usually only available as 3D models in the initial phases of development, on real car bodies (Fründ et al., 2005). However, few systems can support product creation and modification in AR using 2D or 3D interaction tools.

An AR-based mock-up system for design evaluation was presented by Park (2008). In this system, interactive modification of shapes as well as colors, textures, and user interfaces can be carried out. Physical mock-ups of handheld media players were used, and with AR, a user is able to experiment with different features of the product such as color and interfaces (e.g., size of touch screens). A user could also decide on a design model that they want to evaluate and construct a mock-up by assembling specific components. On the other hand, everyday objects were used by Ng et al. (2010) to facilitate interactive design. An AR computer-aided design environment (ARCADE) has been developed to facilitate interactive design by a layman (Ng et al., 2010).
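As a concrete illustration of the marker-based tracking that underlies many of the systems reviewed here (Sec. 2.2), the identification step for a square fiducial marker can be sketched in a few lines. The 3x3 payloads and IDs below are invented for illustration; real libraries such as ARToolKit or ArUco use larger bit grids with error-correcting codes:

```python
# Hypothetical 3x3 marker payloads; real dictionaries also exclude
# rotationally symmetric codes so that orientation is unambiguous.
MARKER_DICT = {
    ((1, 1, 0), (0, 1, 0), (0, 0, 0)): 7,
    ((1, 0, 1), (1, 1, 0), (0, 0, 0)): 12,
}

def rotate(bits):
    """Rotate a bit grid 90 degrees clockwise."""
    return tuple(zip(*bits[::-1]))

def identify(bits):
    """Match a bit grid sampled from a detected quad against the
    dictionary in all four orientations.

    Returns (marker_id, n), where n is the number of further clockwise
    turns that bring the grid to its canonical orientation (fixing the
    marker's frame for pose estimation), or None if nothing matches.
    """
    for n in range(4):
        if bits in MARKER_DICT:
            return MARKER_DICT[bits], n
        bits = rotate(bits)
    return None
```

In a full pipeline, the detector would first binarize the video frame, find candidate quadrilaterals, and perspective-unwarp each one before sampling its bit grid; the matched rotation then disambiguates the marker's orientation for registration.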

3.2 VR and AR in robotics

VR has been proven to be useful in medical robots for surgeries (Burdea, 1996), tele-robotics (Freund and Rossmann, 2005), welding (Liu et al., 2010), modeling of a six-DOF virtual robot arm (Chen et al., 2010), etc. In (Chen et al., 2010), the authors proposed a new Human Computer Interaction (HCI) method for VR-based robot path planning and virtual assembly systems. However, the main constraint in VR-based robot programming is the need to construct the entire Virtual Environment (VE), which requires full a priori knowledge of the workpieces and working area, and thus more computational resources.

Fig. 2. RPAR system architecture (Ong et al., 2010).
AR-based robotic systems offer users graphics, text and animation by augmenting illustrative and informative elements over the real scene via a video stream. An AR
cueing method was reported by Nawab et al. (2007) and
Chintamani et al. (2010) to assist the users in navigating the
end-effector (EE) of a real robot using two joysticks.
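One plausible form of such a navigation cue is sketched below; the function name and the deadband value are illustrative assumptions, not details from the cited systems. Given the end-effector and target positions, it returns the arrow direction to overlay in the AR view and the remaining distance:

```python
import math

def ee_cue(ee_pos, target_pos, deadband=0.005):
    """Direction cue for jogging a robot end-effector (EE) to a target.

    Returns (unit_direction, distance): the direction would be drawn as
    a 3D arrow anchored at the EE in the augmented view, and can be
    decomposed into the axes of the two joysticks. The 5 mm deadband
    (an assumed value) suppresses the cue once the EE is on target.
    """
    err = [t - e for t, e in zip(target_pos, ee_pos)]
    dist = math.sqrt(sum(c * c for c in err))
    if dist < deadband:
        return (0.0, 0.0, 0.0), 0.0
    return tuple(c / dist for c in err), dist

direction, remaining = ee_cue((0.2, 0.0, 0.5), (0.2, 0.3, 0.1))
```

Rendering the cue in the camera frame rather than the robot base frame would additionally require the camera pose from the tracking subsystem.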

Fig. 3. Physical layout of the experimental set-up (Fang et al., 2012).

The use of AR to address human-robot interaction and robot programming issues has been reported in several studies.
Operators can program and guide the virtual model without having to interact physically with the real robot. Zaeh and Vogl (2006) introduced a laser-projection-based approach where the operators can manually edit and modify the planned paths projected over the real workpiece through an interactive stylus. Reinhart et al. (2008) adopted a similar human-robot interface in remote robot laser welding applications. Chong et al. (2009) and Ong et al. (2010) presented a methodology to plan a collision-free path by guiding a virtual robot using a probe attached with a planar marker, and developed the RPAR (Robot Programming using Augmented Reality) system. The methodology is interactive, as the human is involved in obtaining the 3D data points of the desired curve to be followed by performing a number of demonstrations, defining the free space relevant to the task, and planning the orientations of the end-effector along the curve (Fig. 2).

RPAR was further developed and enhanced into the RPAR-II system (Fang et al., 2009, Fang et al., 2012), which is shown in Fig. 3. It includes a SCORBOT-ER VII manipulator, a gripper, a robot controller, a desktop PC, a desktop-based display, a stereo camera, and an interaction device attached with a marker-cube. The augmented environment consists of the physical entities that exist in the robot operation space and a virtual robot model, which includes a virtual end-effector to replicate the real robot.

In RPAR-II (Fig. 4), a collision-free path can be generated through human-virtual robot interaction in a real working environment, as illustrated in Fig. 5. Fig. 5(a) shows the setup for a robotic task, which is to transfer an object from a start point to a goal point. With the start and goal points known a priori, after generating a collision-free volume (CFV) in the workspace (Fig. 5(b)), the user proceeds to create a series of control points within the collision-free volume using the interaction device (Fig. 5(c)). Using these points as inputs, cubic-spline interpolation is applied to generate a smooth path automatically (Fig. 5(d)).

3.3 Factory layout planning (FLP) systems

FLP refers to the design of the layout plans of the machines/equipment on a manufacturing shopfloor. A well-designed manufacturing layout plan can reduce up to 50% of the operating cost (Xie and Sahinidis, 2008). Traditionally, FLP solutions are achieved by building scaled models, as it is easy to visualize the individual components and many planners can study the model at the same time.
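The control-point smoothing step used in RPAR-II can be sketched as follows. For brevity this uses the Catmull-Rom form of an interpolating cubic spline over invented 3D control points; the exact interpolation scheme in RPAR-II is not specified beyond "cubic-spline" in the source:

```python
import numpy as np

def catmull_rom(points, samples_per_seg=20):
    """Interpolate a smooth C1 path through 3D control points.

    `points` is an (n, 3) sequence of user-picked control points inside
    the collision-free volume; the returned array samples each segment
    and always passes through the control points themselves.
    """
    P = np.asarray(points, dtype=float)
    # Central-difference tangents inside, one-sided at the endpoints.
    m = np.empty_like(P)
    m[0] = P[1] - P[0]
    m[-1] = P[-1] - P[-2]
    m[1:-1] = 0.5 * (P[2:] - P[:-2])
    t = np.linspace(0.0, 1.0, samples_per_seg, endpoint=False)
    # Cubic Hermite basis functions.
    h00 = 2 * t**3 - 3 * t**2 + 1
    h10 = t**3 - 2 * t**2 + t
    h01 = -2 * t**3 + 3 * t**2
    h11 = t**3 - t**2
    segs = [np.outer(h00, P[i]) + np.outer(h10, m[i])
            + np.outer(h01, P[i + 1]) + np.outer(h11, m[i + 1])
            for i in range(len(P) - 1)]
    return np.vstack(segs + [P[-1:]])

path = catmull_rom([[0, 0, 0], [1, 2, 0], [3, 2, 1], [4, 0, 1]])
```

Each returned sample could then be checked against the collision-free volume before the smoothed path is sent to the robot controller.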


Fig. 4. Architecture of the RPAR-II system (Fang et al., 2012).

Fig. 5. Geometric path planning in the RPAR-II system (Fang et al., 2012).

Algorithmic tools and mathematical formulations of FLP are often used; examples are the Quadratic Assignment Problem model and the Mixed Integer Problem model, together with the development of efficient algorithms to solve these models, e.g., GA (genetic algorithms), SA (simulated annealing), etc. However, due to the combinatorial complexity of the FLP problem, it is almost impossible to find the best solution.

More recently, VR simulation tools have been applied for generating solutions for FLP tasks. These systems are known as plant simulation or manufacturing software. These VR systems (Calderon and Cavazza, 2003, Iqbal and Hashmi, 2001, Zetu et al., 1998) have quite similar features, as they all provide visual on-line layout planning platforms to the users. The tedious design processes, however, are difficult to improve with these systems. Moreover, as the entire plant environment is simulated virtually, any deviation from reality could reduce the usefulness of the solutions. Other shortcomings include the time-consuming modelling of the entire factory floor space and all the facilities.

Several AR-based FLP systems have been reported. These systems allow the users to place virtual objects in the real environment, allowing immediate visualisation of the layout. This is an attempt to integrate human intuitiveness into the layout design process. However, as AR technology was not mature then, many reported systems were at the conceptual design stage. Recently, AR tools and technologies have advanced considerably (Nee et al., 2012, Reinhart and Patron, 2003, Ong et al., 2007). Jiang and Nee (2013) presented an on-site planning and optimization method based on AR technology, in which information on the existing facilities is obtained in real-time to formulate the layout criteria and physical constraints (Fig. 6). The enhanced sense of reality can facilitate the full utilization of a user's experience, knowledge and intuition for identifying specific issues to be addressed and examined on-site. A system named AFLP (AR-based FLP) has been developed to implement the proposed methodology (Fig. 6 and Fig. 7).

Fig. 6. Using AR to facilitate FLP for existing shopfloors (Jiang and Nee, 2013).

Fig. 7. User interface of the AFLP system (Jiang and Nee, 2013).

An on-site modeling method has been developed to obtain the geometric data of existing facilities. These data, together with the data that represent the facilities to be laid out, are utilized to define the layout criteria and constraints evaluated in real-time for comparison purposes. In the augmented


shopfloor, users can manipulate the layout of new facilities intuitively until a good solution has been achieved. In addition, an optimization scheme is adopted to provide alternative layout plans.

3.4 Maintenance

Maintenance plays an important role in ensuring equipment performance and reducing downtime and disruption to production schedules. However, increasing equipment complexity has posed great challenges to maintenance personnel. Several aspects of maintenance can be supported with advanced information technologies (van Houten et al., 1998, Setchi and White, 2003, van Houten and Kimura, 2000). Augmented reality (AR) can be used to enhance maintenance activities (Feiner et al., 1993). AR provides a better approach for delivering maintenance information compared with paper- and computer-based manuals, and can improve the workflow of maintenance operations.

AR applications in maintenance activities started in the early 1990s. The current research focus has shifted from demonstrating the benefits of applying AR to improving its usefulness in routine and ad hoc maintenance activities. However, no single AR system has yet been proven to be well accepted by the industry (Ong et al., 2008). The usefulness of AR systems in maintenance depends on several factors. Firstly, the maintenance information should provide context-awareness. A system is context-aware if it can collate, reason about, and use context information (Dey, 2001), and adapt its functionality to varying contexts (Byun and Cheverst, 2004). For example, the details of maintenance instructions can vary according to the individual expertise of the technicians. Secondly, the maintenance information provided should be editable, and hence easy to update. This is useful as it allows technicians to document and correct any incorrect maintenance information in the database. Thirdly, suitable collaboration tools should be provided to allow remote experts to create AR-based instructions to assist on-site technicians who may need assistance. Lastly, a bi-directional content creation tool should be provided that allows dynamic AR maintenance content creation both offline and on-site.

Ong and Zhu (2013) developed ARAMS (Fig. 8), which consists of (1) on-site authoring for maintenance technicians to create, edit and update AR contents; (2) offline authoring for maintenance experts to develop context-aware AR maintenance contents, forming the bi-directional tool; (3) online authoring for experts to create AR-based instructions during remote maintenance activities; (4) a database that stores virtual and AR maintenance contents; (5) context management that collects and reasons about maintenance contexts; (6) tracking and registration; and (7) AR-based visualization for rendering the AR contents in the maintenance environments.

A bi-directional maintenance content creation tool has been developed to build context-aware AR contents. The bi-directional process (Fig. 9) consists of two main steps, context modeling and information instance modeling. Intuitive user interfaces are provided: a desktop user interface, consisting of an authoring panel and the augmented virtual scene, is provided in the offline authoring (OFA) mode (Fig. 10(a)), and a mobile user interface, consisting of a physical marker and a virtual panel, is provided in the on-site authoring (OSA) mode. The physical marker is tracked in 3D space and acts as a 2D cursor and/or 3D placement tool. The virtual panel is a virtual display of computer-augmented information, e.g., virtual buttons. The user can place the 2D cursor on a virtual button for a predefined time range to activate it (Fig. 10(b)). The user can use the 3D placement tool to arrange the virtual objects spatially (Fig. 10(c)).

Fig. 8. ARAMS system architecture (Ong and Zhu, 2013).

Fig. 9. Bi-directional authoring (Ong and Zhu, 2013).

3.5 CNC simulation

Several commercial 3D graphics-based CNC machining simulation systems, such as DELMIA Virtual NC (Delmia, 2009), EASY-ROB NC Simulation (Easy-Rob, 2009), hyperMILL by OPEN MIND (hyperMill, 2009), etc., are available. The ability for operators to analyse machining information can be complemented using 3D graphics and instant


access to databases. AR technology enables this by rendering virtual information onto the real machining environment, thus providing a real world supplemented with rich information to the users.

Fig. 10. User interfaces (Ong and Zhu, 2013).

In an AR-based machining simulation environment, the user retains awareness of the real CNC machine, while augmented 2D or 3D information, such as cutting parameters, CNC programs, etc., can enhance the user's visual, aural and proprioceptive senses.

Many studies in applying AR technology to the information-intensive and time-consuming tasks in manufacturing have been conducted. Comparatively, fewer AR applications can be found in CNC machining. This is probably due to the fact that the processing procedures are machine-centric and fewer human factors are involved during this stage. The ASTOR system (Olwal et al., 2008, Olwal et al., 2005) applies a projection-based AR display mechanism to allow users to visualize machining data projected onto a real machining scene. The system first obtains the machining process data and the resultant cutting forces, and displays the information on a holographic optical element window, which is installed on the sliding door of the lathe. Weinert et al. (2008) developed a CNC machining simulation system for 5-axis CNC machines. In the system, ARToolKit-based tracking was applied to track the movement of the cutter with respect to the machine table, and dexel boards were applied to model the workpiece. The simulation module in the system can estimate the cutting forces and predict collisions between the head stock and the workpiece.

An AR-assisted in situ CNC machining simulation system, namely the ARCNC system, has been developed (Zhang et al., 2008, Zhang et al., 2010a, Zhang et al., 2010b) for machining operations on a 3-axis CNC machine. The system setup and architecture are shown in Fig. 11 and Fig. 12, respectively. This AR-assisted in situ CNC simulation system consists of three main units, viz., a 3-axis vertical CNC machine (in this research), a display device, and the AR-assisted human-machine interfaces, which include a Firewire CCD camera and a high-end PC. Either a head-mounted display or a monitor can be used as the display device in the in situ system.

To achieve in situ CNC simulation, the position of the cutter is registered in the machining coordinate system using a hybrid tracking method and constraints extracted from the given NC codes. The system is designed to be used by novice machinists, who can use it to alter NC codes and observe the responses of the CNC machine without risking tool breakage and machine breakdowns. According to the simulation results, alarms can be rendered in the augmented display to notify the users of dangers and errors (Fig. 12).

Fig. 11. Experimental setup of ARCNC (Zhang et al., 2008).

Fig. 12. System architecture of the AR-assisted CNC simulation system (Zhang et al., 2008).

In the ARCNC system, machining simulation is performed between a real cutter and a virtual workpiece and displayed to the users using a video see-through scene rendering mechanism (Fig. 13). Simulations of material removal processes are displayed to the user to assist in the inspection and evaluation of the machining processes before performing real machining, thus reducing material wastage and power consumption. In addition, the user can inspect the physical aspects of the machining processes based on the estimated machining conditions, which are augmented onto the scene, e.g., machining forces, etc. The application of the video see-


through technology in the proposed system allows different users to focus on different information and tasks, which can be useful during training when several trainees are involved.

Fig. 13. An in situ CNC machining simulation system (Zhang et al., 2010b).

During an in situ simulation, a virtual cutter is registered with the real cutter in near real time. A virtual workpiece is either rendered onto a worktable or aligned with a fixture on the worktable. Simulation of the machining process can be achieved according to the movements of the virtual cutter and the workpiece (which moves together with the worktable). Both geometric and physical simulations can be performed and displayed. To the operator, it looks like a real cutter machining a virtual workpiece. The operator can interactively observe the simulation as it proceeds, with NC codes, cutter coordinates, and estimated physical cutting conditions provided on a virtual interaction panel (Yuan et al., 2004). Feedback from the operator to the machine tool can be included in the architecture. When certain values in the physical simulation, e.g., cutting forces, exceed predefined limits, an alarm can be displayed to the operator, who can respond accordingly, such as by pressing an emergency button on the virtual panel to stop the machine tool.

The AR-assisted in situ simulation system may perform better than 3D graphics-based simulation systems in several aspects. First of all, the cutting simulation is presented to the operator with a heightened sense of reality, and the operator can operate the CNC machine and observe the simulation simultaneously. The system can be used with any CNC machine that the operator is familiar with or is trained for. The selection of a machine tool will not affect the simulation procedures as long as initialization is conducted accordingly. For example, the parameters applied in a physical simulation may differ from machine to machine; thus, calibrations should be performed first and stored in the physical model module. Scene rendering time and effort in this AR-assisted system are reduced compared to graphics-based simulation systems, since only a few virtual objects are modelled and updated geometrically and spatially. Furthermore, the movements of the cutter and the worktable are obtained from vision-based tracking and registration. Hence, the simulation can reflect the real dynamic tool movements, rather than an ideal model of the machine as in 3D graphics-based simulation systems.

3.6 Assembly operations

Virtual Reality (VR) technology plays a vital role in simulating advanced 3D human-computer interactions (HCI), especially for mechanical assemblies, by allowing users to be completely immersed in a synthetic environment. Many VR systems have been proposed successfully to assist assembly activities, e.g., CAVE (Cruz-Neira et al., 1992, 1993); IVY (Inventor Virtual Assembly) (Kuehne and Oliver, 1995); Vshop (Pere et al., 1996); VADE (Virtual Assembly Design Environment) (Jayaram et al., 1997, 1999, 2000a, 2000b, Taylor et al., 2000); HIDRA (Haptic Integrated Dis/Re-assembly Analysis) (Coutee et al., 2001, Coutee and Bras, 2002); and SHARP (Seth et al., 2005, 2006). However, there are limitations in VR assembly systems. A major limitation is the need to create a fully immersive VR experience, which may not be highly convincing, as it is not easy to fully and accurately model the actual working environments that are critical to the manufacturing process. Although there are advanced approaches to accelerate the computation process (e.g., GPU acceleration: http://www.nvidia.com), real-time capability is still a challenge as a result of the extensive computation in VR.

AR technology can overcome some of the limitations described, as it does not need the entire real world to be modelled (Ong et al., 2008), thus reducing the high cost of fully immersive VR environments in terms of both preparation and computation time. More importantly, AR enhances the interaction between the systems and the users by allowing them to manipulate the objects naturally. Therefore, AR technology has emerged as one of the most promising approaches to facilitate mechanical assembly processes, whose complexity can be enormous. In the past two decades, AR has proven its ability by integrating various modalities in real time into the real assembly environment. With an AR-based environment, an intuitive way to interact directly with product design and manufacturing information is provided to the users, allowing them to use natural spatial processing abilities to obtain a sense of presence in the real assembly workspace with both real and virtual objects. Researchers in the manufacturing industries, academic institutes and universities all around the world have been exploring the use of AR technology in addressing some complex problems in mechanical assembly.

Ong and Wang (2011) presented an augmented assembly (AA) system which can interpret a user's manual assembly intent, support on-line constraint recognition, and provide a robust 3D bare-hand interaction interface to allow visual feedback during assembly operations (Fig. 14). A 3D natural bare-hand interaction (3DNBHI) method has been developed to implement a dual-hand AA interface for users to manipulate and orientate components, tools and sub-assemblies simultaneously. This allows close replication of real-world interactions in an augmented environment (AE), making the user feel that he is assembling the real product, and hence the AA process becomes more realistic and almost comparable to the real process. A tri-layer assembly data structure (TADS) is used for assembly data management in this system. An interactive constraint-based AA approach using bare-hands has been developed to increase the interaction between the user and the virtual components, to realize the active participation of the users in the AA process. Figure 15 shows

the architecture of a bare-hand interaction augmented assem- motion of the right bracket is constrained in the planar sur-
bly (BHAA) system. face of the base.
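The on-line constraint recognition described above can be sketched in a few lines. The surface representation, class names and tolerance test below are illustrative assumptions for this sketch, not the actual BHAA implementation: the idea is simply that a pair of contacting surfaces of compatible type and size implies an assembly constraint.

```python
from dataclasses import dataclass

@dataclass
class Surface:
    kind: str            # "cylinder" or "plane" (assumed surface taxonomy)
    axis: tuple          # unit direction of the cylinder axis, or the plane normal
    radius: float = 0.0  # only meaningful for cylinders

def recognise_constraint(s1: Surface, s2: Surface, tol: float = 1e-3):
    """Infer an assembly constraint from a pair of contacting surfaces."""
    if s1.kind == "cylinder" and s2.kind == "cylinder":
        # Shaft-in-hole: matching radii imply a cylindrical fit,
        # leaving only rotation about and translation along the common axis.
        if abs(s1.radius - s2.radius) < tol:
            return "cylindrical-fit"
    if s1.kind == "plane" and s2.kind == "plane":
        # Mating faces: parallel (or anti-parallel) normals imply a coplanar fit,
        # restricting motion to translation in the shared plane.
        dot = sum(a * b for a, b in zip(s1.axis, s2.axis))
        if abs(abs(dot) - 1.0) < tol:
            return "coplanar-fit"
    return None

# The pulley/bush contact from the case study: equal radii -> cylindrical fit.
print(recognise_constraint(Surface("cylinder", (0, 0, 1), 5.0),
                           Surface("cylinder", (0, 0, 1), 5.0)))
```

Once such a constraint is recognised, the grasped part's pose can be snapped so that the constraint is satisfied exactly, which is how the system completes an operation after an approximate bare-hand placement.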
This case study, although quite simple, has demonstrated the use of bare hands for assembly simulation. It has good potential for training a novice worker to assemble parts together; after several training sessions, the worker can operate without further AR guidance, and the initial learning curve can be shortened considerably. Work is underway to implement tactile feedback for detecting interference fits and the mismatching of parts. The assembly simulation can also provide feedback to the design-for-assembly approach, and incorporate ergonomic principles for reducing worker fatigue.

Fig. 14. BHAA system setup (Ong and Wang, 2011).

Fig. 15. Architecture of the BHAA system (Ong and Wang, 2011).

Fig. 16. The AA processes of a pulley bracket (Ong and Wang, 2011).
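The planned tactile feedback for interference fits could build on a simple diametral tolerance check. The classification below is a minimal sketch under assumed nominal sizes; a real system would compare full tolerance bands (e.g. per a limits-and-fits standard) rather than single diameters, and the function name is hypothetical:

```python
def fit_type(hole_dia: float, shaft_dia: float) -> str:
    """Classify a shaft/hole pair by diametral clearance (sizes in mm)."""
    clearance = hole_dia - shaft_dia
    if clearance > 0:
        return "clearance fit"       # shaft slides freely: assembly proceeds
    if clearance == 0:
        return "transition fit"      # boundary case in this simplified model
    return "interference fit"        # press fit: trigger a tactile/haptic warning

print(fit_type(10.02, 10.00))
print(fit_type(10.00, 10.03))
```

A mismatch detector of this kind would let the simulation warn the trainee before a press-fit pair is forced together by hand.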

The exploded view of a pulley bracket shown in Fig. 16 is used as a case study. In Fig. 16(a), the user first grasps the pulley with his right hand and the left bush with his left hand and assembles the two parts together. If the parts collide, the system analyses the surface information in the contact list and detects any possible constraints; in this case, a cylindrical-fit constraint is recognised. Next, the position and orientation of the pulley in the user's right hand are adjusted automatically to ensure that the cylindrical-fit constraint is met precisely, and the assembly operation is then completed. In Fig. 16(b), the user next assembles the right bracket with the base. After the two coplanar-fit constraints have been recognised, the right bracket can only be translated on the base. In Fig. 16(c), the user then proceeds to fasten the two bolts using a standard spanner to fix the right bracket onto the base. In the final step, shown in Fig. 16(d), the user assembles the left bracket onto the base. After a coplanar-fit constraint has been recognised, the motion of the bracket is constrained in the planar surface of the base.

4. TECHNICAL ISSUES AND CHALLENGES IN AR

4.1 Tracking accuracy

AR applications in manufacturing, such as robot path planning and CNC machining simulation, require a high level of tracking accuracy. A combination of computer-vision, inertial and hybrid tracking techniques will be required. CV-based tracking alone cannot handle high-frequency motion or rapid camera movements, so hybrid systems using laser, RFID and other types of sensing devices will need to be considered.

4.2 Registration of virtual information

One of the basic issues in AR is the placing of virtual objects with the correct pose in an augmented space. This is also


referred to as registration, which is a difficult and much-researched topic. As different tracking methodologies possess their own inherent deficiencies and error sources, it is necessary to identify the best tracking method for a particular application, which could be subject to poor lighting conditions, moving objects, etc.

The first type of error is static error, which arises from inaccuracies in the sensory devices, misalignments between sensors, and/or incorrect registration algorithms (Dong and Kamat, 2010). These errors can be eliminated quite easily, as higher-accuracy sensors are available and sensor alignments can be set up accurately.

The second type of error is dynamic error, which is less predictable and can be due to latency problems between data streams caused by off-host, synchronization and computational delays (Dong and Kamat, 2010). Researchers have been working on methods to resolve these latency issues; some of the solutions are to adopt multi-threaded programming or to schedule system latency (Jacobs et al., 1997), and to predict the camera motion using a Kalman filter (Lieberknecht et al., 2009).

4.3 Latency issues

AR displays require an extremely low latency to keep the virtual objects in a stable position (Pasman et al., 1999). An important source of alignment errors is the difference in time between the moment an observer moves and the time when the image corresponding to the observer's new position is displayed. This time difference is called the end-to-end latency, and it matters because head rotations can be very fast and cause significant changes to the observed scene. It has been suggested (Padmos and Milders, 1992) that the displacement of objects between two frames should not exceed 0.25 of a degree; in terms of latency, this translates to 5 ms when an observer rotates his head at a speed of 50 degrees per second. Pasman et al (1999) described a method to meet this requirement. Their method used a combination of several levels of position and orientation tracking with varied relative and absolute accuracies, as well as different levels of rendering to reduce the 3D data to relatively simple scenes, such that the 3D data can be rendered in a shorter period of time.

4.4 AR interfacing technology

Four essential elements are needed to set up an AR environment (Kim and Dey, 2010), namely, target places, AR contents, a tracking module and the display system.

Kim and Dey (2010) reported a comprehensive review of AR prototyping trends and methods. They addressed three features for creating an AR environment that are essential for end-user interaction, viz., intuitive observation, informative visualization and immersive interaction, in the development of Interactive Augmented Prototyping (IAP). These three features are further used to integrate AR technology and develop custom-built 3D simulations.

3D interfaces and wearable computing devices are popular areas of AR research on interfacing technologies. Poupyrev et al (2002) divided the AR interface design space along two orthogonal approaches, viz., 3D AR interfaces and tangible interfaces. In 3D AR interfaces, users interact with virtual contents via HMDs and monitor-based displays, which are not the tools with which they interact with the real world. In tangible interfaces, users use traditional tools in the same way as they manipulate physical objects.

5. CONCLUSIONS

Augmented reality is finding new applications almost daily. Its ability to provide high user intuition and its relative ease of implementation have outperformed VR, which made one of the most notable impacts of the late 1990s. A proliferation of AR applications can be found on handheld devices and smart phones. Moving from marker-based to markerless registration and tracking, mobile and outdoor AR is rapidly gaining popularity.

AR application in manufacturing operations is relatively new compared with social and entertainment applications. This is due largely to the more stringent requirements in tracking and registration accuracy, and the need for good alignment with traditional practices. Unlike playing an AR game, where one can quit at any time and restart at will, engineering users are likely to spend a considerable amount of time using the system in their jobs, and this is where ergonomics, human factors and the cognitive strain on the users must be well recognised and taken care of.

This paper presents some of the applications of AR which are relevant to the manufacturing community, although most of them are still in the experimental stage. The paper emphasizes the importance of designing and providing intuitive and effective human interfaces, as well as suitable content development, in order to make AR a powerful tool in the manufacturing engineering field.

REFERENCES

ARTool 2.11 (Last accessed on 29 June 2011).
Azuma, R., Baillot, Y., Behringer, R., Feiner, S., Julier, S., and B. MacIntyre (2001). Recent advances in augmented reality, IEEE Computer Graphics and Applications, vol. 21, no. 6, pp. 34-47.
Burdea, G.C. (1996). Virtual reality and robotics in medicine, Proceedings of the IEEE International Workshop on Robot and Human Communication, Tsukuba, Japan, 11-14 November 1996, pp. 16-25.
Byun, H.E. and K. Cheverst (2004). Utilizing context history to provide dynamic adaptations, Applied Artificial Intelligence, vol. 18, no. 6, pp. 533-548.
Calderon, C., and M. Cavazza (2003). A New Approach to Virtual Design for Spatial Configuration Problems, Proceedings of the Seventh International Conference on Information Visualization (IV 03), pp. 518-523.


Chen, C.J., Ong, S.K., Nee, A.Y.C., and Y.Q. Zhou (2010). Haptic-based Interactive Path Planning for a Virtual Robot Arm, International Journal of Interactive Design and Manufacturing, vol. 4, no. 2, pp. 113-123.
Chintamani, K., Cao, A., Ellis, R.D., and A.K. Pandya (2010). Improved tele-manipulator navigation during display-control misalignments using Augmented Reality cues, IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans, vol. 40, no. 1, pp. 29-39.
Chong, J.W.S., Nee, A.Y.C., Ong, S.K., and K. Youcef-Toumi (2009). Robot Programming using Augmented Reality: an Interactive Method for Planning Collision-Free Paths, Robotics and Computer-Integrated Manufacturing, vol. 25, no. 3, pp. 689-701.
Coutee, A.S. and B. Bras (2002). Collision detection for virtual objects in a haptic assembly and disassembly simulation environment, In: Proceedings of ASME design engineering technical conferences and computers and information in engineering conference, Montreal, Quebec, Canada, 2002, pp. 11-20.
Coutee, A.S., McDermott, S.D., and B. Bras (2001). A haptic assembly and disassembly simulation environment and associated computational load optimization techniques, J. Comput. Inf. Sci. Eng., vol. 1, no. 2, pp. 113-122.
Cruz-Neira, C., Sandin, D.J., DeFanti, T.A., Kenyon, R.V., and J.C. Hart (1992). The CAVE: audio visual experience automatic virtual environment, Communications of the ACM, vol. 35, no. 6, pp. 64-72.
Cruz-Neira, C., Sandin, D.J., and T.A. DeFanti (1993). Surround-screen projection-based virtual reality: the design and implementation of the CAVE, In: Proceedings of the 20th annual conference on Computer graphics and interactive techniques, Anaheim, CA, USA, 1993, pp. 135-142.
Delmia (2009). DELMIA Virtual NC, http://www.delmia.com. (Last accessed 3 June 2009).
Dey, A.K. (2001). Understanding and using context, Personal and Ubiquitous Computing, vol. 5, no. 1, pp. 4-7.
Dong, S., and V.R. Kamat (2010). Robust mobile computing framework for visualization of simulated processes in augmented reality, Proceedings of the 2010 Winter Simulation Conference (WSC 10), pp. 3111-3122.
Easy-rob (2009). EASY-ROB NC Simulation, http://www.easy-rob.com/en/product/apis-additional-options/nc-simulation.html. (Last accessed 3 June 2009).
Fang, H.C., Ong, S.K., and A.Y.C. Nee (2009). Robot Programming using Augmented Reality, International Conference on Cyberworlds, 7-11 September 2009, University of Bradford, UK, pp. 13-20.
Fang, H.C., Ong, S.K., and A.Y.C. Nee (2012). Interactive Robot Trajectory Planning and Simulation using Augmented Reality, Robotics and Computer-Integrated Manufacturing, vol. 28, no. 2, pp. 227-237.
Feiner, S., Macintyre, B., and D. Seligmann (1993). Knowledge-based Augmented Reality, Communications of the ACM, vol. 36, pp. 53-62.
Freund, E., and J. Rossmann (2005). Projective virtual reality as a basis for on-line control of complex systems - not only - over the Internet, Journal of Robotic Systems, vol. 22, no. 3, pp. 147-155.
Fründ, J., Gausemeier, J., Matysczok, C., and R. Radkowski (2005). Using Augmented Reality Technology to Support Automobile Development, Lecture Notes in Computer Science, vol. 3168, pp. 289-298.
Ha, J., Cho, K., Rojas, F.A., and H.S. Yang (2011). Real-Time Scalable Recognition and Tracking based on the Server-Client Model for Mobile Augmented Reality, Proceedings of the IEEE 1st International Symposium on Virtual Reality Innovation 2011 (ISVRI 2011), March 19-20, 2011, Singapore, pp. 267-272.
Hakkarainen, M., Woodward, C., and M. Billinghurst (2008). Augmented Assembly using a Mobile Phone, Proceedings of the 7th IEEE International Symposium on Mixed and Augmented Reality (ISMAR 2008), September 16-18, 2008, Cambridge, pp. 167-168.
HyperMILL by OPEN MIND (2009). http://www.openmind-tech.com/zv. (Last accessed 3 June 2009).
Iqbal, M., and M.S.J. Hashmi (2001). Design and Analysis of a Virtual Factory Layout, Journal of Materials Processing Technology, vol. 118, pp. 403-410.
Israel, J.H., Wiese, E., Mateescu, M., Zöllner, C., and R. Stark (2009). Investigating three-dimensional sketching for early conceptual design - Results from expert discussion and user studies, Computers and Graphics, vol. 33, pp. 462-473.
Jacobs, M.C., Livingston, M.A., and A. State (1997). Managing Latency in Complex Augmented Reality Systems, Proceedings of the 1997 Symposium on Interactive 3D Graphics, pp. 49-54.
Jayaram, S., Connacher, H.I., and K.W. Lyons (1997). Virtual assembly using virtual reality techniques, Computer Aided Design, vol. 29, no. 8, pp. 575-584.
Jayaram, S., Jayaram, U., Wang, Y., Tirumali, H., Lyons, K., and P. Hart (1999). VADE: a virtual assembly design environment, IEEE Computer Graphics and Applications, vol. 19, no. 6, pp. 44-50.
Jayaram, S., Jayaram, U., Wang, Y., and K. Lyons (2000a). CORBA-based Collaboration in a Virtual Assembly Design Environment, In: Proceedings of ASME design engineering technical conferences and computers and information in engineering conference, Baltimore, MD, USA, 2000.
Jayaram, U., Tirumali, H., and S. Jayaram (2000b). A tool/part/human interaction model for assembly in virtual environments, In: Proceedings of ASME design engineering technical conferences, Baltimore, MD, USA, 2000.
Jiang, S. and A.Y.C. Nee (2013). A novel facility layout planning and optimization methodology, CIRP Annals - Manufacturing Technology, vol. 62, no. 1 (to appear).
Kato, H. and M. Billinghurst (1999). Marker tracking and HMD calibration for a video-based augmented reality conferencing system, Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality, CA, USA, pp. 85-94.


Kim, S. and A.K. Dey (2010). AR interfacing with prototype 3D applications based on user-centered interactivity, Computer Aided Design, vol. 42, no. 5, pp. 373-386.
Klein, G. and D. Murray (2007). Parallel Tracking and Mapping for Small AR Workspaces, Proceedings of the 6th International Symposium on Mixed and Augmented Reality (ISMAR 07), Nara, pp. 250-259.
Kuehne, R. and J. Oliver (1995). A virtual environment for interactive assembly planning and evaluation, In: Proceedings of ASME design automation conference, Boston, MA, USA, 1995.
Liang, J., Shaw, C., and M. Green (1991). On temporal-spatial realism in the virtual reality environment, Proceedings of the 1991 Symposium on User Interface Software and Technology, Hilton Head, South Carolina, ACM, New York, pp. 19-25.
Liu, Z., Bu, W., and J. Tan (2010). Motion navigation for arc welding robots based on feature mapping in a simulation environment, Robotics and Computer-Integrated Manufacturing, vol. 26, no. 2, pp. 137-144.
Nee, A.Y.C., Ong, S.K., Chryssolouris, G., and D. Mourtzis (2012). Augmented Reality Applications in Design and Manufacturing, CIRP Annals - Manufacturing Technology, vol. 61, no. 2, pp. 657-679.
Ng, L.X., Ong, S.K., and A.Y.C. Nee (2010). ARCADE: A Simple and Fast Augmented Reality Computer-Aided Design Environment Using Everyday Objects, Proceedings of the IADIS Interfaces and Human Computer Interaction 2010 Conference (IHCI 2010), pp. 227-234.
Olwal, A., Gustafsson, J., and C. Lindfors (2008). Spatial augmented reality on industrial CNC machines, Proceedings of the International Conference on The Engineering Reality of Virtual Reality 2008, vol. 6804, January 27-31, 2008, San Jose, California, 680409:1-9.
Olwal, A., Lindfors, C., Gustafsson, J., Kjellberg, T., and L. Mattson (2005). ASTOR: An autostereoscopic optical see-through augmented reality system, Proceedings of the IEEE and ACM International Symposium on Mixed and Augmented Reality, pp. 24-27.
Ong, S.K., Chong, J.W.S., and A.Y.C. Nee (2010). A Novel AR-Based Robot Programming and Path Planning Methodology, Robotics and Computer-Integrated Manufacturing, vol. 26, no. 3, pp. 240-249.
Ong, S.K. and A.Y.C. Nee (2004). Virtual and Augmented Reality Applications in Manufacturing, Springer, London, ISBN 1-85233-796-6.
Ong, S.K., Pang, Y., and A.Y.C. Nee (2007). Augmented reality aided assembly design and planning, CIRP Annals - Manufacturing Technology, vol. 56, no. 1, pp. 49-52.
Ong, S.K. and Y. Shen (2009). A Mixed Reality Environment for Collaborative Product Design and Development, CIRP Annals - Manufacturing Technology, vol. 58, no. 1, pp. 139-142.
Ong, S.K. and Z.B. Wang (2011). Augmented assembly technologies based on 3D bare-hand interaction, CIRP Annals - Manufacturing Technology, vol. 60, no. 1, pp. 1-4.
Ong, S.K., Yuan, M.L., and A.Y.C. Nee (2008). Augmented reality applications in manufacturing: a survey, International Journal of Production Research, vol. 46, no. 10, pp. 2707-2742.
Ong, S.K. and J. Zhu (2013). A novel maintenance system for equipment serviceability improvement, CIRP Annals - Manufacturing Technology, vol. 62, no. 1 (to appear).
Padmos, P., and M.V. Milders (1992). Quality criteria for simulator images: A literature review, Human Factors, vol. 34, no. 6, pp. 727-748.
Panin, G., Lenz, C., Nair, S., Roth, E., Wojtczyk, M., Friedlhuber, T., and A. Knol (2008). A Unifying Software Architecture for Model-based Visual Tracking, In: Proceedings of the 20th Annual Symposium of Electronic Imaging, San Jose, CA, 6813:03-17.
Park, J. (2008). Augmented Reality Based Re-formable Mock-Up for Design Evaluation, Proceedings of the 2008 International Symposium on Ubiquitous Virtual Reality, IEEE Computer Society, Washington, DC, USA, 2008, pp. 17-20.
Pasman, W., van der Schaaf, A., Lagendijk, R.L., and F.W. Jansen (1999). Accurate overlaying for mobile augmented reality, Computers & Graphics, vol. 23, no. 6, pp. 875-881.
Pere, E., Langrana, N., Gomez, D., and G. Burdea (1996). Virtual mechanical assembly on a PC-based system, In: Proceedings of ASME design engineering technical conferences and computers and information in engineering conference (DETC1996/DFM-1306), Irvine, CA, USA, 1996.
Poupyrev, I., Tan, D.S., Billinghurst, M., Kato, H., Regenbrecht, H., and N. Tetsutani (2002). Developing a generic augmented-reality interface, IEEE Computer, vol. 35, no. 3, pp. 44-50.
Reinhart, G., Munzert, U., and W. Vogl (2008). A programming system for robot-based remote-laser-welding with conventional optics, CIRP Annals - Manufacturing Technology, vol. 57, no. 1, pp. 37-40.
Reinhart, G. and C. Patron (2003). Integrating Augmented Reality in the Assembly Domain - Fundamentals, Benefits and Applications, CIRP Annals - Manufacturing Technology, vol. 52, no. 1, pp. 5-8.
Setchi, R. and D. White (2003). The development of a hypermedia maintenance manual for an advanced manufacturing company, The International Journal of Advanced Manufacturing Technology, vol. 22, no. 5-6, pp. 456-464.
Seth, A., Su, H.J., and J.M. Vance (2005). A desktop networked haptic VR interface for mechanical assembly, In: ASME 2005 International Mechanical Engineering Congress & Exposition, Orlando, Florida, USA, 2005, pp. 173-180.
Seth, A., Su, H.J., and J.M. Vance (2006). SHARP: A System for Haptic Assembly & Realistic Prototyping, In: ASME 2006 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference, Philadelphia, Pennsylvania, USA, 2006, pp. 905-912.
Stark, R., Israel, J.H., and T. Wöhler (2010). Towards hybrid modeling environments - Merging desktop-CAD and virtual reality-technologies, CIRP Annals - Manufacturing Technology, vol. 59, pp. 179-182.
Stutzman, B., Nilsen, D., Broderick, T., and J. Neubert (2009). MARTI: Mobile Augmented Reality Tool for Industry, Proceedings of the 2009 World Congress on Computer Science and Information Engineering (CSIE 2009), March 31 - April 2, 2009, Los Angeles, USA, pp. 425-429.
Taylor, F., Jayaram, S., and U. Jayaram (2000). Functionality to facilitate assembly of heavy machines in a virtual environment, In: Proceedings of ASME design engineering technical conferences, Baltimore, MD, USA, 2000.
Valentini, P.P. (2009). Interactive virtual assembling in augmented reality, International Journal of Interactive Design and Manufacturing, vol. 3, no. 2, pp. 109-119.
van Houten, F.J.A.M., Tomiyama, T., and O.W. Salomons (1998). Product modeling for model-based maintenance, CIRP Annals - Manufacturing Technology, vol. 47, no. 1, pp. 123-128.
van Houten, F.J.A.M. and F. Kimura (2000). The virtual maintenance system: A computer-based support tool for robust design, product monitoring, fault diagnosis and maintenance planning, CIRP Annals - Manufacturing Technology, vol. 49, no. 1, pp. 91-94.
Weinert, K., Zabel, A., Ungemach, E., and S. Odendahl (2008). Improved NC Path Validation and Manipulation with Augmented Reality Methods, Production Engineering, vol. 2, no. 4, pp. 371-376.
Wiese, E., Israel, J.H., Zöllner, C., Pohlmeyer, A.E., and R. Stark (2009). The potential of immersive 3D-sketching environments for design problem-solving, Proceedings of the 13th International Conference on Human-Computer Interaction (HCI 2009), pp. 485-489.
Xie, W. and N.V. Sahinidis (2008). A Branch-and-bound Algorithm for the Continuous Facility Layout Problem, Computers and Chemical Engineering, vol. 32, no. 4, pp. 1016-1028.
Xin, M., Sharlin, E., and M.C. Sousa (2008). Napkin Sketch - Handheld Mixed Reality 3D Sketching, Proceedings of the 15th ACM Symposium on Virtual Reality Software and Technology (VRST 2008), October 27-29, 2008, Bordeaux, France, pp. 223-226.
Yuan, M.L., Ong, S.K., and A.Y.C. Nee (2004). The virtual interaction panel: an easy control tool in augmented reality systems, Computer Animation and Virtual Worlds, vol. 15, no. 3-4, pp. 425-432.
Yuan, M.L., Ong, S.K., and A.Y.C. Nee (2008a). Augmented Reality Applications in Manufacturing: A Survey, International Journal of Production Research, vol. 46, no. 10, pp. 2702-2742.
Yuan, M.L., Ong, S.K., and A.Y.C. Nee (2008b). Augmented reality for assembly guidance using a virtual interactive tool, International Journal of Production Research, vol. 46, no. 7, pp. 1745-1767.
Zaeh, M.F. and W. Vogl (2006). Interactive laser-projection for programming industrial robots, Proceedings of the International Symposium on Mixed and Augmented Reality, Santa Barbara, CA, 22-25 October 2006, pp. 125-128.
Zetu, D., Schneider, P., and P. Banerjee (1998). Data Input Model for Virtual Reality-aided Factory Layout, IIE Transactions, vol. 30, no. 7, pp. 597-620.
Zhang, J., Ong, S.K., and A.Y.C. Nee (2008). AR-Assisted in situ Machining Simulation: Architecture and Implementation, Proceedings of the ACM SIGGRAPH 7th International Conference on Virtual-Reality Continuum & its Applications in Industry, December 8-9, 2008, Singapore.
Zhang, J., Ong, S.K., and A.Y.C. Nee (2010a). Development of an AR system achieving in situ machining simulation on a 3-axis CNC machine, Computer Animation and Virtual Worlds, vol. 21, no. 2, pp. 103-115.
Zhang, J., Ong, S.K., and A.Y.C. Nee (2010b). A Multi-Regional Computation Scheme in an AR-Assisted in situ CNC Simulation Environment, Computer-Aided Design, vol. 42, no. 12, pp. 1167-1177.
