

PhD Thesis

Development of intelligent robot systems based on sensor control

Mikael Fridenfalk
Division of Robotics
Department of Mechanical Engineering
Lund Institute of Technology
Lund University


Organization: Lund University, Department of Mechanical Engineering, P.O. Box 118, SE-221 00 Lund, Sweden
Phone: +46-46-222 45 92, Fax: +46-46-222 45 29
Document name: PhD Thesis
Date of issue: 2003-03-18
CODEN: LUTMDN/(TMMV-1056)/1-217/2003
Author: Mikael Fridenfalk
Title and subtitle: Development of intelligent robot systems based on sensor control

Abstract

Sensor based control systems increase flexibility and autonomy in robotics. One of the most common applications in robotics is welding, which is often limited to objects with fine tolerances. To be able to manage objects with relatively large tolerances, such as ship hulls, sensor guided control systems have to be incorporated that are able to handle large deviations from the drawings. To handle the complexity in the development of such intelligent autonomous systems, a new development environment for robot systems, called FUSE, was designed in this work, which integrates software prototyping with virtual prototyping. By the application of FUSE and the methodology developed in this thesis, a seam tracking system for robotic arc welding based on arc sensing was developed and physically validated on-site at a shipyard in the European ROWER-2 project. The aim of the ROWER-2 project was to develop a robot system for welding of hull sections of large structures such as oil tankers and cruisers. In addition, a true 6D seam tracking system was designed and validated in FUSE that allows seam tracking of seams following 3D spline curves by real-time correction of both the position and the direction of the torch. Previous systems based on arc sensing only allow seam tracking in 2D or 3D. A similar 6D seam tracking system was also developed and validated based on laser scanning. Since kinematics singularities may disturb the sensor guided control process in robotics, a new method was also suggested that, among other things, eliminates position errors near and on inner singularity points, based on knowledge of the kinematics and dynamics constraints of the robot system and its surrounding environment.

Keywords: Robot, sensor control, virtual sensor, simulation, software development, rapid prototyping, virtual prototyping, FUSE, ROWER-2, Asimov, robotic welding, seam tracking, arc sensing, laser scanning, 3D, 6D, Matlab, Igrip, Envision, kinematics, singularity.

Language: English
ISBN: 91-628-5550-6
Number of pages: 217

We, the undersigned, being the copyright owner of the abstract of the above-mentioned thesis, hereby grant to all reference sources permission to publish and disseminate the abstract of the above-mentioned thesis.

Signature                Date


Development of intelligent robot systems based on sensor control

Mikael Fridenfalk

Academic dissertation which, for the award of the degree of Doctor of Philosophy at the Faculty of Engineering at Lund University, will be publicly defended in lecture hall M:E, Lund Institute of Technology, on Friday 11 April 2003 at 10:15. The faculty opponent is Viktor Berbyuk, Chalmers University of Technology.


Development of intelligent robot systems based on sensor control

Mikael Fridenfalk

Division of Robotics
Department of Mechanical Engineering
Lund Institute of Technology
Lund University, P.O. Box 118, SE-221 00 Lund, Sweden


PhD Thesis
CODEN: LUTMDN/(TMMV-1056)/1-217/2003
ISBN 91-628-5550-6

© Mikael Fridenfalk and the Department of Mechanical Engineering, Lund University. All rights reserved.

Printed in Sweden by KFS i Lund AB, Lund


Abstract

Sensor based control systems increase flexibility and autonomy in robotics. One of the most common applications in robotics is welding, which is often limited to objects with fine tolerances. To be able to manage objects with relatively large tolerances, such as ship hulls, sensor guided control systems have to be incorporated that are able to handle large deviations from the drawings. To handle the complexity in the development of such intelligent autonomous systems, a new development environment for robot systems, called FUSE, was designed in this work, which integrates software prototyping with virtual prototyping. By the application of FUSE and the methodology developed in this thesis, a seam tracking system for robotic arc welding based on arc sensing was developed and physically validated on-site at a shipyard in the European ROWER-2 project. The aim of the ROWER-2 project was to develop a robot system for welding of hull sections of large structures such as oil tankers and cruisers. In addition, a true 6D seam tracking system was designed and validated in FUSE that allows seam tracking of seams following 3D spline curves by real-time correction of both the position and the direction of the torch. Previous systems based on arc sensing only allow seam tracking in 2D or 3D. A similar 6D seam tracking system was also developed and validated based on laser scanning. Since kinematics singularities may disturb the sensor guided control process in robotics, a new method was also suggested that, among other things, eliminates position errors near and on inner singularity points, based on knowledge of the kinematics and dynamics constraints of the robot system and its surrounding environment.


Acknowledgements

First of all, I would like to thank Professor Gunnar Bolmsjö for the excellent supervision and for providing vision, knowledge and guidance throughout this work. Professor Bolmsjö has been my mentor since the age of 27 and a source of inspiration already from my time in junior high, when I first encountered his book “Industriell Robotteknik”.

I would further like to thank Dr. Giorgos Nikoleris for valuable feedback on my work, Dr. Magnus Olsson for valuable feedback on my work and technical support, and Dr. Lars Randell for technical support. I wish to thank my colleagues Per Cederberg, Lars Holst, Arne Ingemansson and Ulf Lorentzon, and my former colleagues Jonas Assarsson, Fredrik Ahlquist, Dr. Hamid Nasri and Per Hedenborn, for their great support. Thanks also to one of the members of the technical staff, Mikael Vatau, for inventing the acronym “FUSE”, and to the former technical staff member Mats Rosengren. My colleagues have given me the best of times at the Department of Robotics, and the highlight was of course our philosophical discussions, covering a wide range of subjects.

I would further like to thank Dr. Anders Robertsson, Dr. Klas Nilsson and Professor Rolf Johansson at the Department of Automatic Control for the great encouragement I have received. Thanks also to Professor Gunnar Sparr at the Department of Mathematics for good advice on some mathematical issues, and to Professor Gustaf Olsson at the Department of Industrial Electrical Engineering and Automation for supporting me in one of my private research projects in robotics.

I would also like to thank our partners in the ROWER-2 project, especially those I had the pleasure to cooperate with: Tiberio Grasso, Mascia Lazzarini and Alessio Nista at Tecnomare in Venice, Italy, and Dr. Manuel Armada and Dr. Pablo González de Santos at IAI in Madrid, Spain. The ROWER-2 project was funded by the European Commission under the BRITE/EURAM program.

Finally, I would like to thank my parents, friends and former teachers, who are extremely dear to me.


Contents

Abstract

Acknowledgements

1 Introduction
  1.1 Complex systems
  1.2 Hypothesis and research methodology
  1.3 Objective
  1.4 Ethical principles

2 Contributions

3 Application
  3.1 Intelligence in manufacturing systems
  3.2 The ROWER-2 project

4 Flexible Unified Simulation Environment
  4.1 Virtual prototyping
    4.1.1 Software prototyping of real-time systems
    4.1.2 Virtual mechanical prototyping and CAR systems
    4.1.3 Integration of software and mechanical prototyping
    4.1.4 Virtual sensors
  4.2 FUSE
    4.2.1 FUSE design
    4.2.2 Linking method
    4.2.3 Program structure
    4.2.4 Envision Motion Pipeline
    4.2.5 Flow structure
    4.2.6 FUSE Console
    4.2.7 Run-time interactive programming

5 Development and implementation of seam tracking algorithms
  5.1 Seam tracking at robot welding
    5.1.1 Robot welding
    5.1.2 Seam tracking
    5.1.3 Arc sensing
    5.1.4 Laser scanning
    5.1.5 Comparison between arc sensors and laser scanners
    5.1.6 Power source settings and adaptive control
  5.2 Implementation of seam tracking in the ROWER-2 project
  5.3 6D seam tracking
    5.3.1 Background
    5.3.2 Development and validation of 6D SGRC systems

6 Systems development methodology
  6.1 Development methodologies
    6.1.1 Software development
    6.1.2 Mechanical virtual prototyping
    6.1.3 Systems development methodologies in robotics
  6.2 Formulation of a new methodology
  6.3 Application in the ROWER-2 project
  6.4 Development of the 6D seam tracking systems

7 Discussion

8 Conclusions

9 Future work

Bibliography

CONTRIBUTIONS

Paper I  Simulation based design of a robotic welding system
  1 Introduction
  2 Simulation development process
  3 Simulation environment
  4 Experimentals
  5 Implementation of virtual sensor
  6 Results
  7 Future work

Paper II  Virtual prototyping and experience of modular robotics design based on user involvement
  1 Introduction
  2 Modular design and virtual prototyping
  3 Technical development
  4 Validation of technical specification and user requirements using VR-tools
  5 Results
  6 Discussion

Paper III  The Unified Simulation Environment - Envision Telerobotics and Matlab merged into one application
  1 Introduction
    1.1 Integration of systems
    1.2 Integration in robotics
    1.3 FUSE
    1.4 Objective
  2 Materials and methods
    2.1 Envision TR
    2.2 Matlab
    2.3 FUSE
  3 Results
  4 Discussion
  5 Acknowledgements

Paper IV  Design and validation of a sensor guided robot control system for welding in shipbuilding
  1 Introduction
    1.1 Robot welding
    1.2 Seam tracking
    1.3 Process control
    1.4 Through-arc sensing
    1.5 Control algorithms for seam tracking
    1.6 Simulation using virtual sensors
    1.7 ROWER-2 application
  2 Materials and methods
    2.1 Systems development methodology
    2.2 Joint profiles
    2.3 Experimental setup
  3 Experimental results
    3.1 Simulation experiments
    3.2 Validation by robot welding
  4 Conclusions
  5 Acknowledgements

Paper V  Design and validation of a universal 6D seam tracking system in robotic welding using laser scanning
  1 Introduction
  2 Materials and methods
    2.1 Sensor guided robot control
    2.2 Robot system
    2.3 Simulation system
    2.4 Calculation of position and orientation errors
  3 Modeling
    3.1 Position control
    3.2 Trajectory tangent vector control
    3.3 Pitch control
    3.4 Code and pseudo code samples
    3.5 Computational costs
    3.6 Kinematics singularities
    3.7 Initial and final conditions
  4 Experimental results
    4.1 Workpiece samples
    4.2 Laser scanning
  5 Results and discussion
  6 Future work

Paper VI  Design and validation of a universal 6D seam tracking system in robotic welding using arc sensing
  1 Introduction
  2 Materials and methods
    2.1 Sensor guided robot control
    2.2 Robot system
    2.3 Simulation system
    2.4 Calculation of position and orientation errors
  3 Modeling
    3.1 Position control
    3.2 Trajectory tangent vector control
    3.3 Pitch control
    3.4 Arc sensing weaving interpolation
    3.5 Code and pseudo code samples
    3.6 Kinematics singularities
    3.7 Initial and final conditions
  4 Experimental results
    4.1 Workpiece samples
    4.2 Arc sensing
  5 Results and discussion
  6 Future work

Patent I  Method eliminating position errors caused by kinematics singularities in robotic wrists for use in intelligent control and telerobotics applications
  1 Background of the invention
  2 Problem analysis
  3 New method

APPENDIX

Patent II  Intelligent system eliminating trajectory programming and geometrical databases in robotic welding
  1 Introduction
    1.1 Robot welding
  2 Materials and methods
    2.1 Sensor guided robot control
    2.2 Robot system
    2.3 Simulation system
    2.4 Calculation of position and orientation errors
  3 Modeling
    3.1 Position control
    3.2 Trajectory tangent vector control
    3.3 Pitch control
    3.4 Code samples
    3.5 Computational costs
    3.6 Arc sensing weaving interpolation
    3.7 Kinematics singularities
    3.8 Initial and final conditions
  4 Experimental results
    4.1 Workpiece samples
    4.2 Laser scanning
    4.3 Arc sensing
  5 Results and discussion
  6 Future work


Chapter 1

Introduction

1.1 Complex systems

Human progress and mere survival have always depended on the ability to process large amounts of complex information and to respond to it in a likewise complex manner. The reason why our responses to different situations often come easily and naturally, such as when someone says “hello”, is not that reacting to such an obvious situation is an easy thing in itself; it is experienced as such because the complexity of our reaction patterns is encapsulated, hiding the immensely diversified process that goes on in our minds. Due to the parallel architecture of the human brain, a qualified guess is that a proper reaction to a simple greeting involves more computations than the total amount of computations performed by a modern computer today during its entire lifetime¹.

While complexity in ordered form is often considered a beautiful thing, such as a mathematical formula or a piece of music, complexity may be quite terrifying when it is not ordered, as in software “spaghetti code”.

Humans often decompose complex information into simple elements. This creates a feeling of understanding the contents of the information and helps to place it in the right context along with other information stored in our minds. This concept of systemization, by placing clusters of processes and information in logical and often hierarchical structures, is used among others in the OOP (Object-Oriented Programming) methodology [20, 107, 111]. Encapsulation of complexity is essential here. By hiding complexity, we are less bothered by large amounts of information, and are free to focus our efforts.

¹ The human brain has more than 10 billion neurons with thousands of connections each, operating in the millisecond range [8, 73]. This assumption does not, however, include hardware implementations of neural networks.


As an attempt to increase the development efficiency of robot systems, a methodology was developed and validated in this thesis that integrates software prototyping with mechanical virtual prototyping, from the conceptual level to the design of the physical prototype. The objective of this methodology is to minimize the illusion of complexity in designing mechatronic systems, thereby saving time, effort and costs in practical applications.

Today, software prototyping and mechanical virtual prototyping are often performed in separate environments, independently of each other. This is reminiscent of separating methods from data in OOP. Within the frames of this methodology, a new simulation environment was designed by the integration of Matlab² (including Simulink and Real-Time Workshop) with the robot simulator Envision³, by linking and compiling the kernels of Matlab and Envision into a new software system, called FUSE (Flexible Unified Simulation Environment).

Presently, in software prototyping involving mathematical models, which is usually the case in robotics, applications such as Matlab, Maple [6] and Mathematica [7] accelerate the development significantly compared to programming languages such as C++. The reason is that these applications are interactive, use concise matrix notation and provide large mathematical libraries. The difference between these applications and C++ may in this context be compared to the difference between C++ and machine code. Though low-level programming is essential for optimizing the final code, high-level programming is more suited for software prototyping.

The developed methodology was basically validated using FUSE for the design and validation of a 3D seam tracking system using arc sensing (also called through-arc sensing) in the European ROWER-2 project⁴. FUSE was also used for the design and validation of 6D seam tracking systems based on arc sensing and laser scanning, see papers V-VI and the Appendix. Today, though there exist 6D seam tracking systems based on laser scanning, the performance of these systems is unknown, since no evaluations of such systems have been published. In arc sensing, however, there is no record, as far as the author knows, of any attempt to design a 6D seam tracking system, so this system opens up new possibilities to increase the intelligence in robotic arc welding without using expensive laser scanning systems.
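The core idea of 6D seam tracking, correcting both the position and the direction of the torch from sensor data in every control cycle, can be sketched in a few lines. This is a hedged illustration only, not the control algorithms of papers V-VI (which use position, trajectory tangent vector and pitch control); the function names and the simple proportional blending are assumptions made for this sketch.

```cpp
#include <array>
#include <cmath>

// Minimal sketch of one 6D seam tracking control cycle: the torch
// position is pulled toward the sensed seam point, and the torch
// direction is re-aligned with the sensed seam tangent, each by a
// proportional gain per cycle. Names are illustrative only.
using Vec3 = std::array<double, 3>;

static Vec3 lerp(const Vec3& a, const Vec3& b, double k) {
    return { a[0] + k * (b[0] - a[0]),
             a[1] + k * (b[1] - a[1]),
             a[2] + k * (b[2] - a[2]) };
}

static Vec3 normalize(const Vec3& v) {
    const double n = std::sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    return { v[0] / n, v[1] / n, v[2] / n };
}

// Correct 3 position and 3 orientation components from sensor data.
void seamTrackStep(Vec3& torchPos, Vec3& torchDir,
                   const Vec3& seamPos, const Vec3& seamTangent,
                   double gain) {
    torchPos = lerp(torchPos, seamPos, gain);
    torchDir = normalize(lerp(torchDir, normalize(seamTangent), gain));
}
```

Repeated calls drive the torch onto the seam while turning it along the seam tangent; a real system would carry a full orientation frame rather than a single direction vector.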

It should be noted that this thesis does not deal explicitly with the welding process at arc welding, since the experiments in the ROWER-2 project showed that such considerations were not necessary for the implementation and validation of the seam tracking algorithm, even when using an adaptive power source that itself controlled the arc current, see paper IV. The only measure taken before using the signal in the control system implemented in the ROWER-2 project was pre-filtering of the arc signal by a low-pass filter, which proved to be sufficient for robust control.

² MathWorks, Inc.
³ Robot simulator, Delmia Inc.
⁴ The objective of the ROWER-2 project was to develop a robot system for automatic and adaptive welding of double-hull sections in super-cruisers and oil tankers. The idea was that the workers only had to mount the robot inside the hull-cell and would be able to supervise the welding process from a remote distance. A robot system was developed and physically validated along with a SGRC (Sensor Guided Robot Control) system for seam tracking based on arc sensing. For a more detailed description of this project, see papers I and IV.
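Such pre-filtering can be illustrated by a minimal first-order (exponential) low-pass filter. This is a sketch under assumptions: the class name and the smoothing factor are illustrative and not taken from the ROWER-2 implementation, whose actual filtering is described in paper IV.

```cpp
// Minimal sketch of low-pass pre-filtering of a sampled arc signal
// before it enters the seam tracking controller. The smoothing factor
// alpha (0 < alpha <= 1) trades noise rejection against signal lag.
class LowPassFilter {
public:
    explicit LowPassFilter(double alpha) : alpha_(alpha) {}

    // Feed one raw sample, get back the filtered value.
    double update(double sample) {
        if (first_) {
            state_ = sample;   // initialize on the first sample
            first_ = false;
        } else {
            state_ += alpha_ * (sample - state_);
        }
        return state_;
    }

private:
    double alpha_;
    double state_ = 0.0;
    bool first_ = true;
};
```

With a small alpha, high-frequency noise in the arc current is attenuated while the slower geometric variation used for seam tracking passes through.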

In addition, a new method was developed that deals with problems caused by kinematics singularities in robotic wrists. This was considered important, as the functionality of the 6D seam tracking systems will change the orientation and may thus drive the robot into singularity areas. By the suggested method, the robot may move through singularity areas while maintaining a smooth trajectory. This system is especially important for intelligent and telerobotics systems, but is also applicable in traditional robotic systems, where basically the same problems exist but are less frequent.

1.2 Hypothesis and research methodology

The hypothesis of this thesis is formulated as:

full integration between the software and the mechanical parts of a robot system saves significant development time, effort and costs compared to traditional methods where software prototyping and mechanical virtual prototyping are performed independently.

Integration of software prototyping and mechanical virtual prototyping tools provides a holistic and analytical perspective in micro and macro scale that is essential for the development of complex systems, such as intelligent sensor guided robotic systems. The research methodology is hence based on development of sensor guided control systems with full integration between software and mechanical parts, for all sub-levels of the robot system, ranging from systems analysis and the overall seam tracking process to details in the software and mechanical systems that may play an important role in the development of the robot system.

The research methodology also includes the use of simulation tools for enhanced communication and cooperation between research and development teams spread over a wide geographical area, such as in the ROWER-2 project, which included team members from several European countries.

1.3 Objective

In the manufacturing industry, a high level of intelligence and flexibility saves time and effort and causes fewer interruptions in production, especially in one-off production and the building of large structures, such as ships, where relatively high tolerances are allowed in the sheet material to minimize manufacturing costs. The first objective in this work was therefore to design and validate a 3D seam tracking algorithm in the ROWER-2 project. The second was to design universal 6D seam tracking systems for robotic welding based on arc sensing and laser scanning. 6D seam tracking systems based on arc sensing do not exist presently and would most probably decrease the cost of 6D seam tracking to less than 1/10 of that of existing systems based on laser scanning.

To develop such systems, a new simulation tool was required that integrated software prototyping with virtual mechanical prototyping. Simulation is often used in robotics to save time, effort and costs. Today, in the development of new robotic systems, virtual prototyping is rarely integrated into the software development process. Such integration would, however, make the development of intelligent systems, including SGRC systems, less time-consuming and complex. Though such systems exist for some mechanical simulation applications, by integration with mathematical and software prototyping tools such as Matlab, there exists no system that the author knows of that includes a robotic simulator. The third objective of this work was therefore to design an integrated simulation environment especially for the development of robotic systems.

Another problem that was addressed was kinematics singularities in SGRC, causing disturbances in the robot control system. A higher degree of autonomy in robotic systems increases the chance for the robot to come near such areas, which may, in the case of robotic welding, ruin the weld. The final objective was therefore to suggest a new method by which such problems would be minimized for robots with RPY wrists, which are the predominant category of wrists for industrial robots.
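For an RPY wrist, the inner wrist singularity occurs when the pitch joint angle approaches zero, so that the roll axes of the two adjacent wrist joints align and orientation resolution degenerates. The nature of the problem can be sketched as below; this is only a hypothetical illustration of singularity proximity and a naive mitigation, not the method suggested in Patent I (function names and the fade-out scheme are assumptions).

```cpp
#include <cmath>

// For an RPY wrist, orientation resolution degenerates as the pitch
// joint angle theta5 approaches zero (the roll axes of the adjacent
// joints align). This measure is 0 at the singularity and 1 when the
// axes are orthogonal. Names are illustrative only.
double wristSingularityMeasure(double theta5) {
    return std::fabs(std::sin(theta5));
}

// Scale an orientation correction down smoothly near the singularity,
// so the controller does not command unbounded joint velocities there.
double scaleCorrection(double correction, double theta5, double threshold) {
    const double m = wristSingularityMeasure(theta5);
    if (m >= threshold) return correction;   // far from the singularity
    return correction * (m / threshold);     // fade toward zero near it
}
```

Simply fading out corrections trades tracking accuracy for smoothness near the singularity, which is why a dedicated method, exploiting knowledge of the robot's kinematics and dynamics constraints, is needed.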

1.4 Ethical principles

The ROWER-2 project was examined from an ethical point of view by analysis of its effects on human life. Robot automation yields (1) a solution to the problem of the lack of qualified welders in a relatively rapidly expanding economy, (2) improvement of environmental conditions for workers, (3) increased global wealth due to higher productivity and (4) increased welding quality and safety in ships. There are also negative effects, such as (1) the need to dismiss or transfer some workers to other departments and (2) deterioration in the skills of manual welding. As the positive effects outweigh the negative, however, the ROWER-2 project is viewed to be based on ethical principles.


Chapter 2

Contributions

The attached papers and patents in this thesis are briefly summarized in this section, in order of date of submission, date of publication or category.

Paper I: Simulation based design of a robotic system

Simulation of the ROWER-2 robot system during the design process. The conclusions were that the workspace of the robot and the mobile platform was sufficient for meeting the ROWER-2 seam tracking specifications, but that the welding torch and its mounting had to be modified [50]. Further, the terms breadth-first and depth-first simulation were defined, based on data structure terminology.

Paper II: The Asimov project

Participation in the software development and integration of the real-time system of Asimov, a modular lightweight robot with 7 degrees of freedom, potentially useful in application areas such as assistive technology for disabled people and space [41, 49]. The purpose was to introduce and test novel techniques using virtual prototyping tools within the design process. The Asimov robot was successfully validated by physical experiments.


Paper III: FUSE

An integrated software system was built based on Envision (robot simulator, Delmia Inc.) and Matlab (MathWorks Inc.) [43]. The integration was performed by compiling these applications and their libraries into one single simulation environment. Thereby the processes of mechanical virtual prototyping and software prototyping could be more closely linked, saving time and effort. Today, though such unified environments exist for generic mechanical simulation systems, such as Adams (Mechanical Dynamics Inc.), no solution exists specifically for robotic systems. FUSE was used in the design of the 3D seam tracking system in the ROWER-2 project and the 6D seam tracking systems described in Papers V-VI and the Appendix.

Paper IV: The design and validation of a robotic seam trackingsystem based on arc sensing

The design and validation of a robotic seam tracking system in the ROWER-2 project, based on arc sensing at arc welding for joining large sections, such as double-hull constructions in shipbuilding [44, 45]. The system was developed in FUSE, implemented in the real-time robot control system and successfully validated by physical experiments on-site at a shipyard in Venice, Italy.

Paper V: Design and validation of an intelligent 6D robot con-trol system for seam tracking based on laser scanning

A universal 6D seam tracking system was designed based on laser scanning at robotic welding [48]. The system proved able to seam track essentially any 3D spline curve, eliminating the need for robot programming and geometrical databases in robotic welding. Three things presumably speak in favor of this method compared to previous laser scanning methods: (1) stability and reliability were given the highest priority here; (2) this method has been carefully evaluated by simulation experiments, more or less identical to the real system and based on real experiments, while no evaluations have been found for previous methods; and (3) since 6D seam tracking systems based on laser scanning have not been the commercial success that was anticipated, one may ask what went wrong, given the obvious and considerable need today for increased levels of intelligence in robotic automation.


Paper VI: Design and validation of an intelligent 6D robot con-trol system for seam tracking based on arc sensing

A 6D seam tracking system was designed based on arc sensing for robotic arc welding [47]. The system is able to seam track essentially any 3D spline curve and eliminates the need for robot programming and geometrical databases in robotic welding. Today, there exist no solutions for 6D seam tracking based on arc sensing; current solutions are typically limited to 2D. Since systems based on arc sensing typically cost a tenth of systems based on laser scanning, this system may in the future replace laser scanning systems, unless the price of laser scanners is significantly reduced.

Patent I: Eliminating position errors caused by kinematics sin-gularities in robotic wrists

Patent application. A new method by which kinematics singularity problems may be addressed in robotics, among others by the elimination of position errors near and on inner singularity points, based on knowledge of the kinematics and dynamics constraints of the total robot system, including its surrounding environment [42]. One interpretation of this method is that the stronger motors in the manipulator are allowed to compensate for the errors caused by weaker wrist motors. Singularity problems are especially an issue in intelligent systems and telerobotics applications, since in such systems the actual movement of the robot is often not known in advance. The new method is however also useful for traditional industrial robot applications, since it makes robot control systems less sensitive to kinematics singularities.

Appendix (Patent II)

Patent application, 6D seam tracking [46]. Here, additional code is included compared to Papers V-VI, and the 6D seam tracking systems are treated in parallel.


Chapter 3

Application

3.1 Intelligence in manufacturing systems

Higher customer demands on economy and product diversification require higher flexibility and intelligence in the manufacturing process. Today, the greatest challenge for many manufacturers is to deliver customized products that meet the needs of each client. Customization is however costly, since it often requires reconfiguration of the production line and reprogramming of machines and robots. One specific case is robotic welding, where today in general only large product series are manufactured. Increased autonomy would however make one-off production as economic as production of larger series, and thereby allow higher diversification of the products manufactured by a single company.

High intelligence in manufacturing systems may also introduce automation in new areas. To decrease manufacturing costs, for example, materials with relatively large tolerances are often allowed. In traditional robot welding, large tolerances may cause displacement of the seam, which in turn may ruin the weld. This is especially common in shipbuilding, where large hull-sections are built and where the displacement of the weld may be several centimeters compared to the computer models.

Seam tracking, based on laser scanning or arc sensing, is therefore often used in robotic welding. In laser scanning, a laser beam is used to retrieve the profile of the seam ahead of the welding torch. In arc sensing, the arc current is measured while an oscillating motion is simultaneously added to the tool-tip of the torch. By retrieving the seam profile, the robot is able to correct the position of the torch continuously, thereby ensuring high welding quality.

Seam tracking is presently in general performed only in 2D. Often the position of the torch is corrected in a plane orthogonal to the direction of the seam. By increasing the adaptability of a seam tracking system to include any seam that follows an arbitrary 3D spline, compensation of both the position and orientation of the torch may be performed in full 6D, which increases the flexibility in robotic welding. With such systems, robot trajectory programming and geometrical databases will no longer be necessary, which saves time and effort. This makes one-off production economic, and thereby also the manufacturing of customized products.

3.2 The ROWER-2 project

The application that inspired the development of the 6D seam tracking systems was the European ROWER-2 project. Though the seam tracking algorithm implemented in this project was, according to the project specifications, limited to 2D, the need for such solutions became obvious, since not all seams in a ship are linear.

The objective of the ROWER-2 project was to automate the welding process in shipbuilding, specifically for joining double hull sections in super-cruisers and oil tankers. The double hull construction is imperative for such structures to ensure high safety in the event of incidents. The joining of double hull sections is however associated with large environmental problems, since the welders work in a relatively hot and enclosed environment, filled with poisonous gas. By automating the welding process, allowing the workers to mount a robot system inside the cell and supervise the welding process from a remote distance, these work environment problems may be solved.

According to the specifications in the ROWER-2 project, every cell is equipped with a manhole with the dimensions 600 × 800 mm, through which the robot may be transported, see Fig. 3.1. Since the robot has to be transported manually, the constraint on the robot is that each part may weigh no more than 50 kg. Furthermore, the robot system is designed to operate in standard cells with predefined variations in dimension. To be able to meet the specifications, a robot manipulator was built largely of aluminum alloy. The robot was mounted on a mobile platform with 1 degree of freedom to increase its workspace. The welding method was GMAW, using through-arc sensing for seam tracking [29].

Fig. 3.2 shows the ROWER-2 robot system in the final physical validation phase, and Figs. 3.3-3.5 show ROWER-2 simulations, including 3D as well as one 6D seam tracking experiment. The 6D seam tracking experiments, based on both laser scanning and arc sensing, were designed based on experience and results gained from the development and validation of the 3D seam tracking system in the ROWER-2 project.


Figure 3.1 A double hull-cell in the ROWER-2 project with a manhole through which the robot equipment is manually transported. One of the walls and the floor have been excluded from the picture.

Figure 3.2 The ROWER-2 robot system. The system is mounted on a mockup that is about 3.5 m high, belonging to the hull-cell of a ship.


Figure 3.3 Example of a simulation performed in the process of virtual prototyping of the robot manipulator and the mobile platform. The aim of this specific simulation was to make sure that the suggested robot design was able to fulfill the ROWER-2 specifications, before it was manufactured.

Figure 3.4 Example of a ROWER-2 seam tracking simulation, analyzing the behavior of the SGRC system at very large linear deviations.


Figure 3.5 Example of a 6D seam tracking experiment based on arc sensing, performed by the ROWER-2 robot.


Chapter 4

Flexible Unified Simulation Environment

4.1 Virtual prototyping

4.1.1 Software prototyping of real-time systems

Software prototyping is essential for the design of reliable software and increases productivity for critical and complex systems in robotics, where errors may cause casualties, damage to expensive equipment or costly production delays [21, 36, 56, 133]. Software prototyping also plays an important role in software maintenance and evolution, a process that often requires more effort than the initial design [26, 77, 101, 109]. In some areas it is in addition essential to the research work itself, such as in evolutionary software development, automatic programming or brain building [13, 31, 61]. There exists no specific software prototyping language today for robot applications, but efforts have been made to design generic robot programming languages [81, 99]. Instead, common programming languages are often used together with specific robot application libraries, such as the Matlab Robot Toolbox [30], used in this thesis.

An example of a high-level software prototyping environment is Matlab [3]. Matlab is an environment for numerical calculation and offers more than 20,000 functions in more than 50 application toolboxes, ranging over application areas such as neural networks, signal and image processing, automatic control and database management. The programming language of Matlab is called Matlab script, or simply Matlab, and is interpreted at run-time.

In Matlab there exist environments for the development of RT (real-time) embedded systems [40], such as mechatronic and robotic systems [28, 62, 78], called Simulink and Stateflow¹ [25, 32]. These sub-applications are RT software prototyping environments that, via Real-Time Workshop [86, 87], are able to convert standard or user-written RT simulation blocks to C-code for immediate compilation and download to an embedded computer [54, 102, 116]. By working at a higher level of abstraction, it becomes possible to design, modify, maintain and evolve complex RT systems in a fraction of the time it ordinarily takes using traditional RT programming methods.

4.1.2 Virtual mechanical prototyping and CAR systems

Virtual mechanical prototyping is today essential in mechanical design [39, 75]. Applications for virtual prototyping provide tools for analysis covering areas such as structural mechanics, dynamics and logistics. Similar to software prototyping, there exist in addition research applications where the prototyping tool is in charge of the mechanical design process, using evolutionary prototyping methods [128].

Within robotics research and the manufacturing industry there exist a number of CAR (Computer Aided Robotics) systems, such as Envision (Igrip) [1] and RobCad [5]. These graphical simulation applications are equipped with RT graphical 3D engines and with kinematics and geometrical libraries covering most commercial robots. These applications also provide many specialized functions, such as robot dynamics and collision detection.

In Envision, the user is constrained to a relatively limited set of mathematical functions. These functions may be expanded by auxiliary C-libraries. It is however a cumbersome job to develop mathematical models using C and C-libraries, since C was never primarily designed with matrix manipulation and higher mathematics in mind. What is needed here is a mathematical engine that gives easy access to mathematical methods for modeling and experimentation in robot applications.

4.1.3 Integration of software and mechanical prototyping

There is today a clear trend towards integrating software products, since integration often results in new software systems that are more powerful than the sum of the component applications alone. This phenomenon is due to the non-linear nature of computer programs².

By linking Matlab, Simulink or Stateflow to mechanical simulators, the functionality of these applications is increased. Examples of such applications are Adams [4, 97] and DynaWorld Professional [2, 64]. Though these tools are in general very useful for the design and analysis of dynamic systems, they are of limited use for robot simulation compared to the more specialized robot simulators Envision and RobCad.

¹Simulink is basically continuous, while Stateflow is time-discrete.
²Integration is always per definition non-linear, since in linear systems the basic components are independent and not integrated.

4.1.4 Virtual sensors

The development of sensor guided control systems may be accelerated by using virtual sensors in the simulation control loop. Since a system's state is often easily obtained in simulation, this provides, assuming that the simulation model is sufficiently accurate compared to the real system, a better foundation for the analysis of well defined processes than is generally possible in physical experiments. Furthermore, as stated before, the insertion of a sensor in a control loop may cause damage to expensive equipment, unless the behavior of the entire sensor guided control system is precisely known, which is rarely the case at the development stage.

Virtual sensors are used in many application areas, such as robotics, aerospace and marine technologies [19, 38, 55, 96, 103]. The development of new robot systems, such as for seam tracking, may be accelerated by the application of simulation [49, 50]. In general, the design methodology of virtual sensors may vary due to their specific characteristics. If a sensor model is not analytically definable, artificial neural networks or statistical methods may be used to find appropriate models [11, 98].

4.2 FUSE

4.2.1 FUSE design

FUSE consists of the joint kernels and libraries of Envision and Matlab, linked into one simulation environment using the Matlab Engine [82, 83, 84, 85] and the open architecture of Envision. It is a unified environment in robotics for integrated virtual prototyping and software simulation, see Paper III. FUSE was implemented on SGI IRIX and validated by the development and simulation of the 3D and 6D seam tracking algorithms presented in Papers I, IV-VI and the Appendix. The objective of FUSE is to be used during the whole virtual prototyping process of robot design, both for mechanical virtual prototyping and for software development, all the way from concept to automatic C-code generation for final compilation and download to RT embedded computers, see Chapter 6.

4.2.2 Linking method

Matlab provides direct access to its environment for integration with other applications through an API (Application Programmer's Interface) called the Matlab Engine Library. This library gives Envision full access to Matlab's environment as a virtual user. The integration is performed using the open architecture of Envision, by compiling and linking its application kernel and shared libraries [35] with Matlab, using pipes on Unix and ActiveX on Microsoft Windows. The open architecture of Envision is presented in Fig. 2, Paper III. The communication between the Envision and Matlab processes was experienced by the author as both reliable and fast. On Unix, if the Matlab process is executed on a remote computer, remote shell (rsh) is used instead of pipes. This makes it possible to run distributed Matlab applications from Envision over the network.
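The Matlab Engine API itself is Matlab-specific, but the pipe mechanism described above can be illustrated generically. The sketch below is an illustration only, not FUSE code: it uses a Python child process as a stand-in for the engine, and shows how a host application can delegate a calculation to an interpreter running as a separate process, blocking until the result comes back over the pipe.

```python
import subprocess

def engine_eval(expression: str) -> str:
    """Ask a stand-in "engine" process (a Python interpreter here, where
    FUSE uses Matlab) to evaluate one expression and return the printed
    result over a pipe. The call blocks until the engine answers."""
    proc = subprocess.run(
        ["python3", "-c", "print({})".format(expression)],
        capture_output=True, text=True, check=True)
    return proc.stdout.strip()

# The host delegates a calculation, as Envision delegates to Matlab.
result = engine_eval("2 * 3 + 1")
```

A persistent engine would keep one child process alive and exchange many commands over the same pipe pair; the one-shot call above only illustrates the blocking request/response pattern.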

Matlab functions may also be called from other applications by automatic translation and compilation of Matlab code using the Matlab Compiler Suite. Since not all Matlab functions are supported, and compilation is required each time the Matlab code is changed, this method was however not considered for the design of FUSE, in favor of the Matlab Engine, which provides access to all Matlab commands at run-time.

4.2.3 Program structure

In the same way that Envision has run-time access to the functionality of Matlab, including Simulink, Stateflow and Real-Time Workshop, Matlab may use the main functionality of Envision by sending RT Command Line Interpreter (CLI) commands [33] for execution in Envision. CLI is a command language that interacts with Envision as a virtual user. Since Envision is also a virtual user from the perspective of Matlab, with full access to the Matlab environment, neither of these applications will automatically take charge unless explicitly programmed to. In the implementation of FUSE by which the SGRC algorithms in the ROWER-2 project were simulated and developed, Matlab was in charge via the Envision Motion Pipeline [34], using Envision basically as a clocking device, inverse kinematics calculator and graphical engine. This was sufficient for the development of SGRC algorithms in FUSE, but if FUSE is intended to be used as an on-line RT application, external clocking will be needed for the synchronization of FUSE to the external RT process.

4.2.4 Envision Motion Pipeline

The Envision Motion Pipeline (MP) consists of the sequence of functions that are executed in order to cause motion in a device with multiple DOF (degrees of freedom). These functions include inverse kinematics, motion planning and execution, dynamic analysis and simulation. MP is written in C, and the user may partially or completely replace the source code delivered by Delmia with user-written code. Each robot in Envision may be assigned a unique user-written MP. Each time MP is modified, FUSE has to be rebuilt. FUSE, in the present implementation, uses MP to link Envision with Matlab: when the user loads a workcell into Envision containing a robot that uses a specific MP component connected to Matlab, the Matlab Engine is instantly activated.

4.2.5 Flow structure

Figure 4.1 In the present implementation of FUSE, Matlab is in charge, acting on sensor information received from Envision: feedback data from a virtual sensor passes through the Envision Motion Pipeline to the Matlab Engine, which returns the next pose of the robot manipulator. Envision is used mainly as a graphical engine, inverse kinematics calculator and clocking device. When Envision asks Matlab to perform a calculation, the Envision process is blocked in Unix, and Envision waits until Matlab is finished before it proceeds.

In our applications, FUSE was clocked by MP, and Matlab was addressed every time MP was executed, see Fig. 4.1. The execution frequency of MP depends on the processor power of the computer it runs on. If some part of the process slows down the system, such as a workcell with too many polygons or heavy Matlab calculations, the execution adapts to the delays in the total system. If RT execution is not required, MP is used as the process timer, so the calculations are allowed to take the time needed to finish a job without affecting the outcome of the simulation. When Envision asks for the calculation of the next joint values of the robot, MP is executed, which makes a call to Matlab to take over the process. When Matlab is assigned a task by Envision, the Envision process is blocked in Unix. This allows the Matlab process to finish its task before Envision continues the execution.
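The clocking pattern described above can be sketched in miniature (hypothetical names and a toy control law; the real FUSE loop lives in the C-coded Motion Pipeline): the simulator calls the controller synchronously on every cycle and blocks until the next correction is returned, so simulation time simply stretches when the controller is slow.

```python
def controller(sensor_reading: float) -> float:
    """Stand-in for the Matlab-side control code: compute a pose
    correction from virtual-sensor feedback (toy proportional law)."""
    return 0.5 * sensor_reading  # correct half the measured error

def simulate(cycles: int, initial_error: float) -> float:
    """Stand-in for the Motion Pipeline: each cycle it reads the
    virtual sensor, blocks on the controller, and applies the
    returned correction before drawing the next frame."""
    error = initial_error
    for _ in range(cycles):
        correction = controller(error)  # synchronous, blocking call
        error -= correction             # apply the returned correction
    return error

residual = simulate(cycles=10, initial_error=8.0)
```

Because the call is synchronous, a slow controller delays the whole loop rather than corrupting it, which matches the blocking behavior described for Envision and Matlab.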

Envision has read and write access to all workspace variables in Matlab at run-time. In this context, Envision may also import CLI strings from Matlab for execution. It is hence not obvious which application should be in charge. The robot control system, and essentially the intelligence, is presently implemented in the Matlab environment. As illustrated in Fig. 4.1, all sensor information is provided by the graphical environment of Envision and passed to the Matlab workspace for calculation each time Matlab is called. The inverse kinematics of the ROWER-2 robot is calculated by user-written C-code implemented in MP. Though Envision provides many generic inverse kinematics algorithms, a user-written implementation may in some cases improve the execution speed, or enable the development and validation of new inverse kinematics algorithms.

4.2.6 FUSE Console

The Matlab prompt in FUSE is literally replaced by the MP. The FUSE console in Fig. 4.2 may thereby be used for run-time communication between the user and the Matlab and Envision environments within FUSE.

Figure 4.2 Commands written in the FUSE console are executed in the FUSE environment at run-time, giving the user run-time access to the FUSE Matlab global workspace, the FUSE Matlab command line and the FUSE Envision CLI command line.

4.2.7 Run-time interactive programming

Since Matlab is an interpreted language, programs written in Matlab within FUSE may be modified at run-time as the simulation progresses. The same is not applicable to C-code written in MP, which has to be recompiled each time the code is modified.


Chapter 5

Development and implementation of seam tracking algorithms

5.1 Seam tracking at robot welding

5.1.1 Robot welding

Robot automation increases industrial productivity and frees man from unhealthy and monotonous labor. In the future, robots are also anticipated to contribute to education, entertainment, health care, mining, building, household work and many other areas [88]. One application area today is automatic robot welding, which improves the welding quality and the environmental conditions for welders. This applies especially to automatic welding in shipbuilding, where large structures are welded, including the joining of double-hull sections [65, 67, 104, 105, 110].

5.1.2 Seam tracking

Seam tracking [18, 22, 27, 50, 95] is essential for automation in shipbuilding when manufacturing large passenger and cargo ships, such as super-cruisers and oil tankers, where high tolerances in the sheet material are allowed to minimize manufacturing costs. A study of the patents issued during the last century shows that research within this application area goes back at least 40 years, covering seam tracking systems based on, among others, mechanical, inductive, electrical, infrared, optical and electron beam sensors. Today the predominant sensors are laser scanners [70, 92, 118, 123] and arc sensors [12, 29, 112, 120], both for industrial applications and in research, with few exceptions, such as [106, 66].


5.1.3 Arc sensing

Seam tracking using through-arc sensing, or simply arc sensing, was introduced at the beginning of the 1980s. In this method the welding current is used to estimate the position of the torch. Since the approximate distance from the electrode to the workpiece may be calculated analytically by a simple formula, the geometrical profile of the workpiece may be obtained by weaving the torch across the seam, see Fig. 5.1.

Figure 5.1 Definition of the TCP and the orthonormal coordinate system noa. Weaving is performed in the n direction, o is opposite to the direction of welding and a is the direction of approach.
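As a rough numerical illustration of the principle (a toy model, not the ROWER-2 formula: the linear current-stickout relation, the V-groove geometry and all constants below are assumed for the example), the current sampled at the two weave extremes can be compared to recover how far the weave center has drifted from the seam center:

```python
def arc_current(stickout_mm: float) -> float:
    """Toy model: welding current falls linearly with electrode
    stickout (constants are illustrative only)."""
    return 300.0 - 10.0 * stickout_mm  # amperes

def stickout_in_v_groove(n_mm: float, seam_center_mm: float) -> float:
    """In a V-groove the walls rise away from the seam center, so the
    stickout shrinks with lateral distance from the center."""
    nominal = 15.0  # mm, stickout at the groove apex (assumed)
    return nominal - 1.0 * abs(n_mm - seam_center_mm)

def centering_error(weave_center_mm: float, amplitude_mm: float,
                    seam_center_mm: float) -> float:
    """Compare the current at the two weave extremes; for this
    symmetric model the scaled difference recovers the offset of the
    weave center relative to the seam center."""
    i_left = arc_current(
        stickout_in_v_groove(weave_center_mm - amplitude_mm, seam_center_mm))
    i_right = arc_current(
        stickout_in_v_groove(weave_center_mm + amplitude_mm, seam_center_mm))
    # 10 A per mm of stickout and 1 mm stickout per mm lateral, sampled
    # at both extremes: divide the current difference by 2 * 10 A/mm.
    return (i_right - i_left) / 20.0

offset = centering_error(weave_center_mm=2.0, amplitude_mm=5.0,
                         seam_center_mm=0.0)
```

The control system would then shift the torch in the n direction by the negative of the estimated offset, re-centering the weave over the seam.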

5.1.4 Laser scanning

Vision systems are basically only used to estimate the position of the workpiece prior to seam tracking [71, 112, 45]. During seam tracking, due to the intensity of the arc light, usually only optical sensors such as laser scanners are used. A laser scanner is in general built using a diode laser and a mirror that sweeps the beam across the seam; the beam is reflected back from the workpiece and analyzed by an optical system containing a CCD camera. To minimize interference of the laser light with the welding process, a laser with an appropriate wavelength has to be chosen [10] and a robust method adopted for the extraction of the seam profile [69].

5.1.5 Comparison between arc sensors and laser scanners

Laser scanners are expensive and accurate, and they decrease the workspace of the robot, since they have to be mounted on the torch. Arc sensors are inexpensive and inaccurate. Since welding of large structures requires relatively low accuracy, this may be one explanation why the majority of the patents issued during the last 10 years for seam tracking at arc welding [15, 60, 63, 113, 119] are based on arc sensing. Another explanation is that while the accuracy of laser scanners is sufficient for seam tracking, large improvements are still needed to increase the robustness of systems based on arc sensors. Today, improvements of laser scanners are primarily focused on making them less expensive [57, 58, 72, 122, 124].

5.1.6 Power source settings and adaptive control

Besides the seam geometry, consideration has to be given in seam tracking to process-related welding parameters [9, 127]. The welding process contains many parameters, such as arc voltage, current, wire feed speed and wire consumable. The objective is to determine feasible welding parameters before seam tracking starts. This may be performed experimentally or by the use of knowledge based systems [17, 126]. If it is not possible or desirable to keep these settings constant throughout the seam for maintained high welding quality [53, 59, 76, 119], for instance due to the characteristics of the power source [14, 114, 45], adaptive control may be introduced into the sensor guided control loop. To increase the intelligence or robustness in seam tracking, many seam tracking systems using arc sensing are based on neural networks [74, 93, 94, 121] and fuzzy control [16, 68, 125, 129]. Since the physical experiments in the ROWER-2 project were successful using analytical methods alone, the implementation of these was however not considered.

5.2 Implementation of seam tracking in the ROWER-2 project

In the European ROWER-2 project, also presented in Papers I and IV, the objective was to develop a robot system for automatic and adaptive welding of double-hull sections in super-cruisers and oil tankers. The idea was that the workers would only have to mount the robot inside the hull-cell and would be able to supervise the welding process from a remote distance. A robot system was developed and physically validated, along with an SGRC system for seam tracking, using COM-ARC III [130] as an initial reference for the design.

5.3 6D seam tracking

5.3.1 Background

Seam tracking is in general performed only in 2D, by moving the torch with constant velocity in the x direction while a continuous correction is performed in the y and z directions to keep a constant relative distance from the TCP to the seam walls, see Fig. 5.1. In 3D seam tracking, another DOF is added to the SGRC system, for instance by correction in the negative o direction, or by modification of an orientation axis. The seam tracking system implemented in the ROWER-2 project is essentially a 2D system, plus an additional DOF due to the vector booster method, which also modifies the direction of the nominal trajectory, see Paper IV. Examples of 3D systems where laser scanners and arc sensors are used are [131, 132, 37].
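One control cycle of such a 2D scheme can be sketched as follows (a minimal illustration with hypothetical gains and a fixed seam position, not the ROWER-2 implementation): the torch advances a constant step in x, while proportional corrections in y and z drive the measured distance errors to the seam toward zero.

```python
def track_2d(pose, seam_y, seam_z, steps, dx=1.0, gain=0.5):
    """Advance the TCP at constant velocity in x and correct y and z
    toward the seam center (toy proportional law, illustrative gains)."""
    x, y, z = pose
    for _ in range(steps):
        x += dx                    # constant welding speed in x
        y += gain * (seam_y - y)   # correct toward the seam in y
        z += gain * (seam_z - z)   # and in z
    return x, y, z

# Start 4 mm off in y and 2 mm off in z; the corrections converge
# on the seam while the torch advances along it.
x, y, z = track_2d((0.0, 4.0, -2.0), seam_y=0.0, seam_z=0.0, steps=12)
```

A 6D system generalizes this loop by also re-estimating the seam direction each cycle and correcting the torch orientation, not just its position.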

There is today a great need for 6D seam tracking systems. Such systems are theoretically able to correct the TCP not only in the x, y and z directions, but also in roll, pitch and yaw¹, and are per definition able to follow any continuous 3D seam with moderate curvature.
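With the thesis convention that roll, pitch and yaw denote rotations around z, y and x in sequential order, an orientation correction can be illustrated as composing three elementary rotation matrices (a generic sketch consistent with that convention, not code from the thesis):

```python
import math

def rot_z(a):  # roll, per the thesis convention (rotation around z)
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def rot_y(a):  # pitch (rotation around y)
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_x(a):  # yaw (rotation around x)
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rpy(roll, pitch, yaw):
    """Torch orientation built as R = Rz(roll) Ry(pitch) Rx(yaw)."""
    return matmul(rot_z(roll), matmul(rot_y(pitch), rot_x(yaw)))
```

A 6D correction then updates both the TCP position vector and such an orientation matrix on every control cycle.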

There exists no record today of the design of any 6D seam tracking system based on arc sensors. There exist however suggestions for systems based on laser scanners [63, 90, 91] and force sensors [23, 24]. In addition, there are companies that claim to have commercial 6D seam tracking systems based on laser scanning. Since no evaluations of these systems have been found by the author, and these systems have not become the commercial success one would anticipate judging from the needs of the market, we have to question how robust these systems actually are. A preliminary study of the few publications that exist in this area showed that stability problems have not been prioritized in the solutions. This makes this thesis, to the knowledge of the author, the first publication to present evaluation data for 6D seam tracking systems based on arc sensing and laser scanning. These are, to the knowledge of the author, also the first such systems where stability and reliability issues have been prioritized from the initial development stage.

5.3.2 Development and validation of 6D SGRC systems

The design and development of the 6D seam tracking systems presented in this thesis was performed mainly in FUSE, both for the fundamental operation of systems based on laser scanning and for those based on arc sensing. While a virtual generic arc sensor was used in the arc sensing experiments, a virtual model of the M-SPOT [108] was used for laser scanning.

Regarding the system based on laser scanning, the performed simulation was considered so accurate compared to real physical validation that no physical validation was considered necessary. The performance of the seam tracking system using arc sensors is, however, dependent on the power-source. Since the use of the Migatronic power-source proved successful in the ROWER-2 project despite intrinsic arc control, the implementation of the seam tracking system is estimated to be successful for any automation-friendly power-source.

¹Roll, pitch and yaw are in this thesis defined as rotation around z, y and x, in sequential order.

In the case of the laser scanning system, the computational cost was estimated to be less than 250 + 50N floating point operations per control cycle (without considering memory operations), where N is the number of curve fitting points, which for a typical value of N = 10 gives 750 operations per control cycle. At a rate of at most 30 scans per second, this gives less than 22,500 floating point operations per second.
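The cost estimate above can be reproduced in a few lines (the function name is hypothetical; the 250 + 50N figure is the thesis estimate):

```python
def flops_per_second(n_points, scan_rate_hz):
    """Upper bound on floating point operations per second for the laser
    scanning control loop, using the estimate of 250 + 50*N operations
    per control cycle (memory operations not counted)."""
    ops_per_cycle = 250 + 50 * n_points
    return ops_per_cycle * scan_rate_hz

# N = 10 curve fitting points at the maximum rate of 30 scans per second
# gives 750 operations per cycle, i.e. 22,500 operations per second.
budget = flops_per_second(10, 30)
```

Even this upper bound is negligible for any modern processor, which supports the conclusion that the control law itself is not the computational bottleneck.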


Chapter 6

Systems development methodology

6.1 Development methodologies

6.1.1 Software development

At the end of the last decade, less than half of all software projects were delivered, and the average cost and time overruns were 200% [89, 117]. Further, only a small fraction of delivered software was actually used. This problem may be addressed by the implementation of software methodologies, acting as the infrastructure behind all development processes at an organization, thereby increasing the chances of a good quality product by decreasing the overall complexity of the software engineering effort [80, 117].

A software methodology may be defined as a collection of methods applied across the software development life cycle and unified by some general, philosophical approach, where a method is defined as a disciplined process for generating a set of models that describe various aspects of a software system under development, using some well-defined notation [20].

An example of a software development methodology is described in [20]. The steps in this methodology are (1) conceptualization, or establishing core requirements, (2) analysis, or development of a model of the desired behavior, (3) design, or creation of an architecture, (4) evolution, or evolvement of the implementation, and (5) maintenance, or management of post-delivery evolution.


6.1.2 Mechanical virtual prototyping

Simulation in the mechanical design process saves time, effort and expenses. It reduces the need for physical prototypes in the design process, decreases the risk of design failure and improves communication between the design teams and their members [50, 49, 115]. Currently, a typical mechanical prototyping methodology includes the following major steps: (1) conceptual design, (2) simulation, (3) detailed design, (4) manufacturing of a physical prototype and (5) experimental validation [79].

6.1.3 Systems development methodologies in robotics

A typical systems development methodology in robotics, using the mechanical virtual prototyping scheme described above, is presented in Fig. 6.1. In general, software prototyping is not integrated with mechanical virtual prototyping. In this scheme, the software development methodology is structurally adapted to the suggested mechanical virtual prototyping methodology.

[Figure omitted: development flow from concept via a simulation model (CAR system with software systems 1..N and mechanical systems 1..N) and detailed design to physical prototype and experimental validation.]

Figure 6.1 Presently, the link between software development and mechanical virtual prototyping is practically non-existent in the development methodology of robot systems.

6.2 Formulation of a new methodology

A complex system that works is invariably found to have evolved from a simple system that worked. A complete system designed from scratch never works and cannot be patched up to make it work. You have to start over, beginning with a working simple system [52].

Since in many mechatronic systems, such as advanced airplanes and SGRC systems, software and mechanical design are highly dependent on each other, it is essential to use a methodology that integrates these from the start [100], see Fig. 6.2.

The idea of including software prototyping in robot design is here taken another step further by using Matlab and Simulink as the software prototyping environment, including automatic C-code generation by RTW. This makes software development more high-level oriented, which is essential in the design, maintenance and evolution of robot systems. Robot systems are in general implemented on real-time embedded computers. Since many professionals regard software and hardware integration in embedded systems as the most critical bottleneck in the whole design process of a real-time system [51], and since development of real-time systems is one of the most resource-consuming steps in the design process of a robotic system, inclusion of RTW in the methodology accelerates the development process significantly.

[Figure omitted: development flow from concept via a simulation model (FUSE with software systems 1..N and mechanical systems 1..N) and detailed design to physical prototype and experimental validation.]

Figure 6.2 The suggested methodology includes software prototyping in robotics design along with automatic C-code generation for instant compilation and download to the final target.

The design process of a robot system is here divided into the following five steps:

1. A conceptual model of the robot system is iteratively developed, including both mechanical parts and software.

2. A simulation model is built covering the mechanical and the software system. If the design iterations fail, return is made to the conceptual stage.

3. Blueprints are produced for the mechanical system. In the software system, Matlab code is converted to C/C++ code or Simulink blocks.

4. The physical robot and mechanical accessories are manufactured. The manually translated C/C++ code or the automatically generated C-code from Simulink blocks is compiled and downloaded to the final target.

5. The robot system is tested to meet the specifications formulated in the beginning of the development process. If the specifications are met, the iteration process ends, otherwise return is made to a previous step. In practice, to avoid redesign of the mechanical system, much effort is spent on mechanical virtual prototyping.

6.3 Application in the ROWER-2 project

Although the suggested methodology was not applied to its fullest extent in the ROWER-2 project, it did minimize the development effort in the design of the 3D SGRC system, see Paper IV. The most essential concept of the suggested methodology is the simulation of the robot manipulator and accessories together with the software, which in this application saved both time and effort. Figure 6.3 displays the application of FUSE in the ROWER-2 project.

[Figure omitted: the FUSE simulation model (software system, robot and mobile platform) mapped via detailed design to the embedded system and the physical robot and mobile platform, with experimental validation closing the loop back to the concept.]

Figure 6.3 The methodology was applied in the ROWER-2 project by the integration of a model of the SGRC system with the virtual prototype of the mechanical manipulator and the mobile platform.

The applied methodology contained the following steps:

• Drafts were made of the robot system based on the specifications.

• The basic design of the robot manipulator and the mobile platform was verified as suggested by the designers. Due to initial simulation results, the orientation of the torch was modified and the mobile platform was decided to have only one DOF. At a later stage a model of the SGRC was developed, integrated with a model of the mechanical design of the system.

• Detailed blueprints for the mechanical parts were produced based on the virtual prototypes. The software model of the SGRC system was manually translated from Matlab to C++ to be included in the robot control system.

• The physical robot manipulator and mobile platform were built and mechanically validated. The software was compiled and downloaded to the embedded computer. Since compilation is a relatively fast process compared to the manufacturing of the mechanical parts of the robot, the software system could be iterated several times between simulation and experimental validation.

• The robot system was experimentally validated along with further simulations and software modifications until the specifications were reached.

6.4 Development of the 6D seam tracking systems

The major development work on the 6D seam tracking systems was performed according to the methodology used in the ROWER-2 project, as described above. The arc sensing system was developed and perfected in FUSE prior to the final translation from a combination of Matlab and C-code to pure Matlab code. The arc sensing system implemented in pure Matlab code was then used as a template for the design of the laser scanning system, through some modifications of the template and the addition of a virtual laser scanner.


Chapter 7

Discussion

It is remarkable that a robot development environment such as FUSE is not already available on the market, since similar systems exist for the integration of generic mechanical simulation systems with calculation engines such as Matlab and Simulink. Mechanical simulators, however, lack the many special features provided by robot simulators such as Envision and RobCad. The idea of keeping most of the development work of a new robot system within the same environment, and of being able to generate real-time C-code for immediate compilation and download to the embedded computer of the robot, is appealing for anyone who develops new robotic systems.

It is most probable that the 6D seam tracking systems presented in this thesis would never have been developed if FUSE had not existed. To develop robotic systems in Matlab only is simply too tedious, since the VRML toolbox has to be used and the robot systems modeled from scratch. On the other hand, to perform development of such systems in Envision would have required time-consuming C-programming, without the flexibility and the high-level approach that is provided by Matlab. The time and effort that a traditional approach would have required would probably have been discouraging enough for the author to drop such an enterprise from the start. This actually shows how important tools can be for development work, not only for the performance of the work itself, but also as a source of inspiration, which is captured by the expression "the road is the goal".

From this standpoint it is clear that the road in this work comprised the development methodology described in chapter 6 and the integrated development environment FUSE. On this foundation, an efficient development process could take place as part of a bigger team for robot systems development with sensor guidance control. To meet both time and cost constraints in such development work, it is important to be able to concurrently develop such a system, including the mechanical system, the control system and the parts related to user interfaces and control of the application process, including sensors. To meet the goals defined in the ROWER-2 project, the methodology and the FUSE environment were developed and validated. As a result of this work, generalized sensor guidance algorithms were developed for 6D control, including management of inverse kinematics problems close to singular areas.

The validation experiments on the 6D seam tracking systems showed that by pre-scanning, any radius of curvature could be seam tracked in laser scanning, theoretically including sharp edges. It was further estimated that, to be on the safe side, both systems would be able to manage a radius of curvature below 200 mm in real-time control, using any standard industrial robot manipulator. In reality, however, it is highly probable that much smaller radii of curvature can be managed by these systems in real-time, perhaps even radii below 50 mm for a modern robot manipulator. As a next step, physical validations would be necessary to find the real limitations of these systems for any chosen industrial robot.

Concerning the new method for the elimination of position errors caused by inner kinematics singularities in robotic wrists, preliminary experiments showed that it works according to the formulated theory. If it is fully implemented, including considerations of the whole robot system and its surrounding environment, this method should theoretically be much more efficient than the methods that exist today for dealing with inner singularity problems.


Chapter 8

Conclusions

As pointed out in chapter 7, the integrated approach in the development work has clearly demonstrated that the hypothesis holds. The methodology and FUSE enable simulations of SGRC systems, which require integration of both software and the mechanical models. Such integration allows for deeper analysis of the interaction of the robot control system with its environment, based on virtual sensors with predefined behavior, which in turn allows for the development of systems that would have required much time and effort using traditional systems development methodologies in robotics. The methodology itself was validated by the development and the physical validation of the 3D seam tracking system implemented in the ROWER-2 project and in the development and evaluation of the 6D seam tracking systems based on laser scanning and arc sensing.

Moreover, the following conclusions related to the objectives of the thesis are made based on the research, development and validation work presented in this thesis:

• The design of a unified simulation environment that integrates software prototyping with virtual prototyping of the mechanical parts of the robotic systems. This is especially useful for the development of intelligent robotic systems, interacting with the surrounding environment based on sensor information.

• The design and validation of a seam tracking system in the ROWER-2 project for robotic welding using arc sensing. The specifications of this project were successfully fulfilled based on physical experiments.

• It was shown by systematic simulation experiments, based on physical experiments in the ROWER-2 project, that it is possible to design robust 6D seam tracking systems based on laser scanning or arc sensing that in real-time are able to track a seam following an arbitrary 3D spline curve with a radius of curvature below 200 mm. In the non-destructive "pre-scan" mode, the laser scanning system is expected to manage a very small radius of curvature, theoretically even sharp edges.

Since 6D seam tracking based on arc sensing has not existed before, this makes arc sensors very attractive: in real-time seam tracking, such a system is able to do basically the same job as the ten times more expensive systems based on laser scanning.

• In addition to the above, a novel method was designed that is able to decrease problems caused by kinematics singularities in robotic systems, which is especially important for intelligent and telerobotic systems, where the motion of the robot is not always predefined in advance. According to simulation of the basic functionality of this system within the scope of this thesis, the method worked according to the theory, and in some cases large disruptions caused by inner kinematics singularities were entirely eliminated.


Chapter 9

Future work

The 6D seam tracking systems were developed using simulation tools, based on physical experiments in the ROWER-2 project. According to these simulations, by pre-scanning, basically any radius of curvature may be managed in the case of laser scanning, theoretically including sharp edges. For real-time control, to be on the safe side, it was suggested that the systems are able to manage a radius of curvature below 200 mm using a standard robot manipulator. In reality, however, by the experience gained in the ROWER-2 project, much smaller radii of curvature should be manageable by these systems in real-time, perhaps below 50 mm for a modern robot manipulator. Physical validations should therefore be performed to find the real limitations of these systems.

In addition, the remaining features of the suggested method for the elimination of position errors caused by kinematics singularities should also be implemented and evaluated.


Bibliography

[1] Delmia Inc., www.delmia.com.

[2] DynamiTechs, www.dynamitechs.com.

[3] Mathworks Inc., www.mathworks.com.

[4] Mechanical Dynamics Inc., www.adams.com.

[5] Tecnomatix Inc., www.tecnomatix.com.

[6] Waterloo Maple Inc., www.maplesoft.com.

[7] Wolfram Research, Inc., www.wolfram.com.

[8] Britannica Encyclopedia. www.britannica.com, 2002.

[9] S. Adolfsson. Automatic quality monitoring in GMA welding using signal processing methods. PhD thesis, Department of Production and Materials Engineering, Lund University, 1998.

[10] G. Agapiou, C. Kasiouras, and A. A. Serafetinides. A detailed analysis of the MIG spectrum for the development of laser-based seam tracking sensors. Optics and Laser Technology, 31(2):157–161, 1999.

[11] M. A. A. Arroyo Leon, A. Ruiz Castro, and R. R. Leal Ascencio. An artificial neural network on a field programmable gate array as a virtual sensor. In Design of Mixed-Mode Integrated Circuits and Applications, 1999. Third International Workshop on, pages 114–117, July 1999.

[12] K.-Y. Bae, T.-H. Lee, and K.-C. Ahn. An optical sensing system for seam tracking and weld pool control in gas metal arc welding of steel pipe. Journal of Materials Processing Technology, 120(1-3):458–465, 2002.

[13] J. Baker and W. Graves. Simulation based software development. In Computer Software and Applications Conference, 1994. COMPSAC 94. Proceedings., Eighteenth Annual International, pages 121–125, 1994.


[14] D. Barborak, C. Conrardy, B. Madigan, and T. Paskell. "Through-arc" process monitoring: techniques for control of automated gas metal arc welding. In Robotics and Automation, 1999. Proceedings. 1999 IEEE International Conference on, volume 4, pages 3053–3058, 1999.

[15] P. Berbakov, D. MacLauchlan, and D. Stevens. Edge detection and seam tracking with EMATs. Patent No. US6155117, December 2000.

[16] Z. Bingul, G. E. Cook, and A. M. Strauss. Application of fuzzy logic to spatial thermal control in fusion welding. Industry Applications, IEEE Transactions on, 36(6):1523–1530, 2000.

[17] G. Bolmsjö. Knowledge based systems in robotized arc welding. In International Journal for the Joining of Materials, volume 9 (4), pages 139–151, 1997.

[18] G. Bolmsjö. Sensor systems in arc welding. Technical report, Robotics Group, Department of Production and Materials Engineering, Lund University, Sweden, 1997.

[19] G. Bolmsjö. Programming robot welding systems using advanced simulation tools. In Proceedings of the International Conference on the Joining of Materials, pages 284–291, Helsingør, Denmark, May 16-19 1999.

[20] G. Booch. Object-Oriented Analysis and Design with Applications. The Benjamin/Cummings Publishing Company, Inc., 2nd edition, 1997.

[21] A. Bredenfeld and G. Indiveri. Robot behavior engineering using DD-Designer. In Robotics and Automation, 2001. Proceedings 2001 ICRA. IEEE International Conference on, volume 1, pages 205–210, 2001.

[22] K. Brink, M. Olsson, and G. Bolmsjö. Increased autonomy in industrial robotic systems: A framework. In Journal of Intelligent and Robotic Systems, volume 19, pages 357–373, 1997.

[23] H. Bruyninckx, J. De Schutter, and B. Allotta. Model-based constrained motion: a. modeling, specification and control. In Advanced Robotics, 1991. 'Robots in Unstructured Environments', 91 ICAR., Fifth International Conference on, volume 2, pages 976–981, 1991.

[24] H. Bruyninckx, J. De Schutter, and B. Allotta. Model-based constrained motion: b. introducing on-line identification and observation of the motion constraints. In Advanced Robotics, 1991. 'Robots in Unstructured Environments', 91 ICAR., Fifth International Conference on, volume 2, pages 982–987, 1991.

[25] S. E. Bryant and K. Key. Redefining the process for development of embedded software. In Computer Aided Control System Design, 1999. Proceedings of the 1999 IEEE International Symposium on, pages 261–266, 1999.


[26] L. Burd. Using automated source code analysis for software evolution. In Source Code Analysis and Manipulation, 2001. Proceedings. First IEEE International Workshop on, pages 204–210, 2001.

[27] P. Cederberg, M. Olsson, and G. Bolmsjö. A generic sensor interface in robot simulation and control. In Proceedings of Scandinavian Symposium on Robotics 99, pages 221–230, Oulu, Finland, October 14-15 1999.

[28] M. Colnaric. State of the art review paper: advances in embedded hard real-time systems design. In Industrial Electronics, 1999. ISIE '99. Proceedings of the IEEE International Symposium on, volume 1, pages 37–42, 1999.

[29] G. E. Cook, K. Andersen, K. R. Fernandez, M. E. Shepard, and A. M. Wells. Electric arc sensing for robot positioning control. Robot Welding, pages 181–216. Editor: J. D. Lane. IFS Publications Ltd, UK, Springer-Verlag, 1987.

[30] P. Corke. A robotics toolbox for MATLAB. IEEE Robotics and Automation Magazine, 3(1):24–32, Mar. 1996.

[31] H. de Garis, A. Buller, M. Korkin, F. Gers, N. E. Nawa, and M. Hough. ATR's artificial brain ("CAM-Brain") project: A sample of what individual "CoDi-1 Bit" model evolved neural net modules can do with digital and analog I/O. In Evolvable Hardware, 1999. Proceedings of the First NASA/DoD Workshop on, pages 102–110, 1999.

[32] E. C. de Paiva, S. S. Bueno, S. B. V. Gomes, J. J. G. Ramos, and M. Bergerman. A control system development environment for AURORA's semi-autonomous robotic airship. In Robotics and Automation, 1999. Proceedings. 1999 IEEE International Conference on, volume 3, pages 2328–2335, 1999.

[33] Delmia Inc., www.delmia.com. CLI Reference Manual.

[34] Delmia Inc., www.delmia.com. Motion Pipeline Reference Manual.

[35] Delmia Inc., www.delmia.com. Shared Library Reference Manual.

[36] M. Deppe, M. Robrecht, M. Zanella, and W. Hardt. Rapid prototyping of real-time control laws for complex mechatronic systems. In Rapid System Prototyping, 12th International Workshop on, pages 188–193, 2001.

[37] P. U. Dilthey and J. Gollnick. Through the arc sensing in GMA-welding with high speed rotating torch. In Industrial Electronics Society, 1998. IECON '98. Proceedings of the 24th Annual Conference of the IEEE, volume 4, pages 2374–2377, 1998.


[38] W. Doggett and S. Vazquez. An architecture for real-time interpretation and visualization of structural sensor data in a laboratory environment. In Digital Avionics Systems Conferences, 2000. Proceedings. DASC. The 19th, volume 2, pages 6D2/1–6D2/8, Oct. 2000.

[39] P. Drews and M. Weyrich. A system for digital mock-up's and virtual prototype design in industry: 'the virtual workbench'. In Industrial Electronics, 1997. ISIE '97., Proceedings of the IEEE International Symposium on, volume 3, pages 1292–1296, 1997.

[40] C. D. French and P. P. Acarnley. Simulink real time controller implementation in a DSP based motor drive system. DSP Chips in Real Time Measurement and Control (Digest No: 1997/301), IEE Colloquium on, pages 3/1–3/5, 1997.

[41] M. Fridenfalk. Software development and integration of the Asimov robot. Master's thesis, Department of Production and Materials Engineering, Lund University, 1998.

[42] M. Fridenfalk. Eliminating position errors caused by kinematics singularities in robotic wrists for use in intelligent control and telerobotics applications. Patent application SE0203180-5, October 2002.

[43] M. Fridenfalk and G. Bolmsjö. The Unified Simulation Environment — Envision telerobotics and Matlab merged in one application. In the Proceedings of the Nordic Matlab Conference, pages 177–181, Oslo, Norway, October 2001.

[44] M. Fridenfalk and G. Bolmsjö. Design and validation of a sensor guided control system for robot welding in shipbuilding. In the Proceedings of the 11th International Conference on Computer Applications in Shipbuilding, Malmö, Sweden, September 2002.

[45] M. Fridenfalk and G. Bolmsjö. Design and validation of a sensor guided robot control system for welding in shipbuilding. International Journal for the Joining of Materials, 14(3/4):44–55, December 2002.

[46] M. Fridenfalk and G. Bolmsjö. Intelligent system eliminating trajectory programming and geometrical databases in robotic welding. Patent application SE0203239-9, November 2002.

[47] M. Fridenfalk and G. Bolmsjö. Design and validation of a universal 6D seam tracking robot control system based on arc sensing. Submitted to the Journal of Advanced Robotics, 2003.

[48] M. Fridenfalk and G. Bolmsjö. Design and validation of a universal 6D seam tracking robot control system based on laser scanning. Submitted to the Journal of Industrial Robot, 2003.


[49] M. Fridenfalk, U. Lorentzon, and G. Bolmsjö. Virtual prototyping and experience of modular robotics design based on user involvement. In the Proceedings of the 5th European Conference for the Advancement of Assistive Technology, pages 308–315, Düsseldorf, Germany, November 1999.

[50] M. Fridenfalk, M. Olsson, and G. Bolmsjö. Simulation based design of a robotic welding system. In Proceedings of the Scandinavian Symposium on Robotics 99, pages 271–280, Oulu, Finland, October 14-15 1999.

[51] S. Gal-Oz and M. V. Isaacs. Automate the big bottleneck in embedded system design. IEEE Spectrum, 35(8):62–67, August 1998.

[52] J. Gall. Systemantics: How Systems Really Work and How They Fail. Ann Arbor, MI: The General Systemantics Press, 2nd edition, 1986.

[53] B. Ågren and P. Hedenborg. Integration of sensors for robot and welding power supply control. In Proceedings of Fifth International Welding Computerization Conference, pages 198–204, Golden, USA (AWS), August 1994.

[54] C. Guger, A. Schlogl, C. Neuper, D. Walterspacher, T. Strein, and G. Pfurtscheller. Rapid prototyping of an EEG-based brain-computer interface (BCI). In Neural Systems and Rehabilitation Engineering, IEEE Transactions on, volume 9, pages 49–58, 2001.

[55] G. Hailu and A. Zygmont. Improving tracking behavior with virtual sensors in the loop. In Aerospace Conference, 2001, IEEE Proceedings., volume 4, pages 1729–1737, 2001.

[56] E. Hallberg, I. Kaminer, and A. Pascoal. Development of the rapid flight test prototyping system for unmanned air vehicles. In American Control Conference, 1998. Proceedings of the 1998, volume 2, pages 699–703, 1998.

[57] K. Haug and G. Pritschow. Robust laser-stripe sensor for automated weld-seam-tracking in the shipbuilding industry. In Industrial Electronics Society, 1998. IECON '98. Proceedings of the 24th Annual Conference of the IEEE, volume 2, pages 1236–1241, 1998.

[58] K. Haug and G. Pritschow. Reducing distortions caused by the welding arc in a laser stripe sensor system for automated seam tracking. In Industrial Electronics, 1999. ISIE '99. Proceedings of the IEEE International Symposium on, volume 2, pages 919–924, 1999.

[59] P. Hedenborg and G. Bolmsjö. Task control in robotics arc welding by use of sensors. In Proceedings of the Tampere International Conference on Machine Automation — Mechatronics Spells Profitability, IFAC/IEEE, pages 437–445, Tampere, Finland, February 15-18 1994.


[60] K. Hedengren and K. Haefner. Seam-tracking apparatus for a welding system employing an array of eddy current elements. Patent No. US5463201, October 1995.

[61] F. O. Heimes, W. S. Ford II, and W. Land Jr. Application of evolutionary computation to aircraft control law design. In Evolutionary Computation, 2001. Proceedings of the 2001 Congress on, volume 2, pages 1040–1046, 2001.

[62] F. Hessel, P. Coste, P. LeMarrec, N. Zergainoh, J. M. Daveau, and A. A. Jerraya. Communication interface synthesis for multilanguage specifications. In Rapid System Prototyping, 1999. IEEE International Workshop on, pages 15–20, 1999.

[63] J. P. Huissoon and D. L. Strauss. System and method for tracking a feature on an object using a redundant axis. Patent No. US5465037, November 1995.

[64] M. Innocenti and L. Pollini. A synthetic environment for simulation and visualization of dynamic systems. In Proceedings of the American Control Conference, 1999, volume 3, pages 1769–1773, 2-4 June 1999.

[65] Y. B. Jeon, B. O. Kam, S. S. Park, and Sang Bong Kim. Seam tracking and welding speed control of mobile robot for lattice type welding. In Industrial Electronics, 2001. Proceedings. ISIE 2001. IEEE International Symposium on, volume 2, pages 857–862, 2001.

[66] S. Jorg, J. Langwald, J. Stelter, G. Hirzinger, and C. Natale. Flexible robot-assembly using a multi-sensory approach. In Robotics and Automation, 2000. Proceedings. ICRA '00. IEEE International Conference on, volume 4, pages 3687–3694, 2000.

[67] B.-O. Kam, Y.-B. Jeon, and S.-B. Kim. Motion control of two-wheeled welding mobile robot with seam tracking sensor. In Industrial Electronics, 2001. Proceedings. ISIE 2001. IEEE International Symposium on, volume 2, pages 851–856, 2001.

[68] Y. Kaneko, T. Iisaka, A. Tanaka, M. Peijun, S. Yamane, and K. Ohshima. Sensing of the weld pool depth with neural network and fuzzy control of seam tracking. In Industrial Electronics, Control, and Instrumentation, 1993. Proceedings of the IECON '93., International Conference on, volume 1, pages 424–429, 1993.

[69] J. S. Kim, Y. T. Son, H. S. Cho, and K. I. Koh. A robust method for vision-based seam tracking in robotic arc welding. In Intelligent Control, 1995., Proceedings of the 1995 IEEE International Symposium on, pages 363–368, 1995.

Page 61: Development of intelligent robot systems based on sensor control

Bibliography 45

[70] J. S. Kim, Y. T. Son, H. S. Cho, and K. I. Koh. A robust visual seam trackingsystem for robotic arc welding. Mechatronics, 6(2):141–163, 1996.

[71] M. Y. Kim, H. S. Cho, and J.-H. Kim. Neural network-based recognitionof navigation environment for intelligent shipyard welding robots. InIntelligent Robots and Systems, 2001. Proceedings. 2001 IEEE/RSJ Inter-national Conference on, volume 1, pages 446–451, 2001.

[72] W. Kölbl. Affordable optical seam tracking — meta torch systems breakthe price barrier. Industrial Robot: An International Journal, 22(6):19–21, 1995.

[73] K. Knight. Connectionist ideas and algorithms. Communications of theACM, 33(11):58–74, 1990.

[74] L. Kreft and W. Scheller. Arc welding seam tracking system based on arti-ficial neural networks. In Intelligent Systems Engineering, 1994., SecondInternational Conference on, pages 177–182, 1994.

[75] T. Laliberte, C. M. Gosselin, and G. Cote. Practical prototyping. IEEERobotics & Automation Magazine, 8(3):43–52, 2001.

[76] L.-O. Larsson and G. Bolmsjö. A model of optical observable parametersto control robotic GMA welding in sheet steel. In Proceedings of the FifthInternational Conference on Computer Technology in Welding, pages 2–10, paper 42, Paris, France, June 15-16 1994.

[77] P. Layzell. Addressing the software evolution crisis through a service-oriented view of software: a roadmap for software engineering andmaintenance research. In Software Maintenance, 2001. Proceedings. IEEEInternational Conference on, page 5, 2001.

[78] M. S. Loffler, D. M. Dawson, E. Zergeroglu, and N. P. Costescu. Object-oriented techniques in robot manipulator control software develop-ment. In American Control Conference, 2001. Proceedings of the 2001,volume 6, pages 4520–4525, 2001.

[79] U. Lorentzon and G. Bolmsjö. A methodology using computer tools inthe product development process. In Proceedings of 8th MechatronicsForum International Conference, Mechatronics 2002, Twente, The Neth-erlands, June 2002.

[80] T. Maginnis. Engineers don’t build. IEEE Software, 17(1):34–39, 2000.

[81] M. Makatchev and S. K. Tso. Human-robot interface using agents com-municating in an XML-based markup language. In Robot and HumanInteractive Communication, 2000. RO-MAN 2000. Proceedings. 9th IEEEInternational Workshop on, pages 270–275, 2000.

Page 62: Development of intelligent robot systems based on sensor control

46 Bibliography

[82] MathWorks, www.mathworks.com. Application Program Interface Ref-erence.

[83] MathWorks, www.mathworks.com. Application Program Interface User’sGuide.

[84] MathWorks, www.mathworks.com. MATLAB Application Program Inter-face Guide.

[85] MathWorks, www.mathworks.com. MATLAB Application Program Inter-face Reference Manual.

[86] MathWorks, www.mathworks.com. Real-Time Workshop User’s Guide.

[87] MathWorks, www.mathworks.com. Target Language Compiler.

[88] IFR 2001., editor. World Robotics 2001, statistics, analysis and forecasts.United Nations and International Federation of Robotics.

[89] S. McConnell. Avoiding classic mistakes. IEEE Software, 13(5):111–112,1996.

[90] N. Nayak and A. Ray. An integrated system for intelligent seam trackingin robotic welding. Part I. Conceptual and analytical development. InRobotics and Automation, 1990. Proceedings., 1990 IEEE InternationalConference on, volume 3, pages 1892–1897, 1990.

[91] N. Nayak and A. Ray. An integrated system for intelligent seam trackingin robotic welding. Part II. Design and implementation. In Robotics andAutomation, 1990. Proceedings., 1990 IEEE International Conference on,volume 3, pages 1898–1903, 1990.

[92] Y. Ogawa. Image processing for wet welding in turbid condition. In Un-derwater Technology, 2000. UT 00. Proceedings of the 2000 InternationalSymposium on, pages 457–462, 2000.

[93] K. Ohshima, M. Yabe, K. Akita, K. Kugai, T. Kubota, and S. Yamane.Sensor fusion using neural network in the robotic welding. In IndustryApplications Conference, 1995. Thirtieth IAS Annual Meeting, IAS ’95.,Conference Record of the 1995 IEEE, volume 2, pages 1764–1769, 1995.

[94] K. Ohshima, S. Yamane, M. Yabe, K. Akita, K. Kugai, and T. Kubota. Con-trolling of torch attitude and seam tracking using neuro arc sensor. In In-dustrial Electronics, Control, and Instrumentation, 1995. Proceedings ofthe 1995 IEEE IECON 21st International Conference on, volume 2, pages1185–1189, 1995.

[95] M. Olsson. Simulation and execution of autonomous robot systems. PhDthesis, Division of Robotics, Department of Mechanical Engineering,Lund University, 2002.

Page 63: Development of intelligent robot systems based on sensor control

Bibliography 47

[96] M. Oosterom and R. Babuska. Virtual sensor for fault detection and isol-ation in flight control systems — fuzzy modeling approach. In Decisionand Control, 2000. Proceedings of the 39th IEEE Conference on, volume 3,pages 2645–2650, Dec. 2000.

[97] F. Palcak and A. Cervenan. Optimisation of the robot with use of CAEtool. In Proceedings of the Fourth International Symposium on Meas-urement and Control in Robotics, pages 313–318, Slovak Tech. Univ.,Bratislava, Slovakia, 1995.

[98] H. Pasika, S. Haykin, E. Clothiaux, and R. Stewart. Neural networksfor sensor fusion in remote sensing. In Neural Networks, 1999. IJCNN’99. International Joint Conference on, volume 4, pages 2772–2776, July1999.

[99] J. Peterson, G. D. Hager, and P. Hudak. A language for declarative roboticprogramming. In Robotics and Automation, 1999. Proceedings. 1999 IEEEInternational Conference on, volume 2, pages 1144–1151, 1999.

[100] J. Pollack, H. Lipson, P. Funes, S. Ficici, and G. Hornby. Coevolutionaryrobotics. In Proceedings of the First NASA/DoD Workshop on, pages 208–216, 1999.

[101] V. T. Rajlich. Software evolution: A road map. In Software Maintenance,2001. Proceedings. IEEE International Conference on, page 6, 2001.

[102] O. Ravn. Using the adaptive blockset for simulation and rapid prototyp-ing. In Computer Aided Control System Design, 1999. Proceedings of the1999 IEEE International Symposium on, pages 255–260, 1999.

[103] P. Ridao, J. Battle, J. Amat, and M. Carreras. A distributed environmentfor virtual and/or real experiments for underwater robots. In Roboticsand Automation, 2001. Proceedings 2001 ICRA. IEEE International Con-ference on, volume 4, pages 3250–3255, May 2001.

[104] B. Rooks. Robot welding in shipbuilding. The Industrial Robot,24(6):413–417, 1997.

[105] S. Sagatun and S. Hendseth. Efficient off-line programming of robot pro-duction systems. In Proceedings of the 27th International Symposium onIndustrial Robots. Robotics Towards 2000, pages 787–792, CEU-CentroEsposizioni UCIMU, Cinisello Balsamo, Italy, 1996.

[106] D. J. Schmitt, J. L. Novak, and G. P. Starr. Real-time seam tracking forrocket thrust chamber manufacturing. In Robotics and Automation,1994. Proceedings., 1994 IEEE International Conference on, volume 3,pages 2261–2266, 1994.

Page 64: Development of intelligent robot systems based on sensor control

48 Bibliography

[107] R. R. Seban. An overview of object-oriented design and C++. In AerospaceApplications Conference, 1994. Proceedings., 1994 IEEE, pages 65–86,1994.

[108] Servo-Robot, Inc., www.servo-robot.com. M-SPOT User’s Manual, 1993.

[109] H. Sneed. Software evolution. A road map. In Software Maintenance,2001. Proceedings. IEEE International Conference on, page 7, 2001.

[110] P. Sorenti. Efficient robotic welding for shipyards — virtual reality sim-ulation holds the key. The Industrial Robot, 24(4):278–281, 1997.

[111] P. Stevens and R. Pooley. Using UML — Software Engineering with Objectsand Componenets. Pearson Education Limited, 2000.

[112] Y. Sugitani, Y. Kanjo, and M. Murayama. CAD/CAM welding in steelbridge panel fabrication. Industrial Robot: An International Journal,21(6):22–29, 1994.

[113] Y. Suzuki. Method and apparatus for determining seam tracking controlof arc welding. Patent No. EP1077101, February 2001.

[114] J. Tao and P. Levick. Assessment of feedback variables for through thearc seam tracking in robotic gas metal arc welding. In Robotics andAutomation, 1999. Proceedings. 1999 IEEE International Conference on,volume 4, pages 3050–3052, 1999.

[115] F. E. H. Tay, Y. P. Khanal, K. K. Kwong, and K. C. Tan. Distributed rapidprototyping — a framework for internet prototyping and manufactur-ing. Integrated Manufacturing Systems, 12(6):409–415, 2001.

[116] F. C. Teng. Real-time control using Matlab Simulink. In Systems, Man,and Cybernetics, 2000 IEEE International Conference on, volume 4, pages2697–2702, 2000.

[117] L. Trussell. Essential software development methodology. In PowerEngineering Society 1999 Winter Meeting, IEEE, volume 1, pages 357–361, 1999.

[118] C. Umeagukwu and J. McCormick. Investigation of an array techniquefor robotic seam tracking of weld joints. In Industrial Electronics, IEEETransactions on, volume 38, pages 223–229, 1991.

[119] J. Villafuerte. Arc welding torch. Patent No. US6130407, October 2000.

[120] J. Wang, K. Kusumoto, and K. Nezu. Arc sensing detection approach andits application for cutting torch tracking. International Journal for theJoining of Materials, 12(1), 2000.

Page 65: Development of intelligent robot systems based on sensor control

Bibliography 49

[121] J. Wang, Y. Le, and H. Pan. A welding joint tracking detection algorithmfor arc sensor based on neuro clustering. International Journal for theJoining of Materials, 11(2):42–48, 1999.

[122] M. Wilson. Vision systems in the automotive industry. Industrial Robot:An International Journal, 26(5):354–357, 1999.

[123] M. Wilson. The role of seam tracking in robotic welding and bonding.Industrial Robot: An International Journal, 29(2):132–137, 2002.

[124] J. Wu, J. S. Smith, and J. Lucas. Weld bead placement system for multi-pass welding using transputer-based laser triangulation vision system.In Science, Measurement and Technology, IEE Proceedings, volume 143,pages 85–90, 1996.

[125] G. Xiangdong, M. Yamamoto, and A. Mohri. Application of fuzzy logiccontroller in the seam tracking of arc-welding robot. In Industrial Elec-tronics, Control and Instrumentation, 1997. IECON 97. 23rd InternationalConference on, volume 3, pages 1367–1372, 1997.

[126] M. Xie and G. Bolmsjö. Using expert systems in arc monitoring andcausal diagnosis in robotic arc welding. In International Journal for theJoining of Materials, volume 4 (4), 1992.

[127] M. Xie and G. Bolmsjö. Quality assurance and control for robotic GMAwelding — part 1: QA model and welding procedure specification. InJoining Sciences, volume 1 (4), 1993.

[128] J. H. Yakey, S. M. LaValle, and L. E. Kavraki. Randomized path planningfor linkages with closed kinematic chains. Robotics and Automation,IEEE Transactions on, 17(6):951–958, 2001.

[129] S. Yamane, Y. Kaneko, A. Hirai, and K. Ohshima. Fuzzy control in seamtracking of the welding robots using sensor fusion. In Industry Applica-tions Society Annual Meeting, 1994., Conference Record of the 1994 IEEE,volume 3, pages 1741–1747, 1994.

[130] Yaskawa, www.yaskawa.com. Arc Sensor COM-ARC III Instructions.

[131] J.-Y. Yu and S.-J. Na. A study on vision sensors for seam tracking ofheight-varying weldment. Part 1: Mathematical model. Mechatronics,7(7):599–612, 1997.

[132] J.-Y. Yu and S.-J. Na. A study on vision sensors for seam tracking ofheight-varying weldment. Part 2: Applications. Mechatronics, 8(1):21–36, 1998.

[133] M. V. Zelkowitz and L. Rus. The role of independent verification andvalidation in maintaining a safety critical evolutionary software in a

Page 66: Development of intelligent robot systems based on sensor control

50 Bibliography

complex environment: the NASA Space Shuttle program. In SoftwareMaintenance, 2001. Proceedings. IEEE International Conference on, pages118–126, 2001.

Page 67: Development of intelligent robot systems based on sensor control

CONTRIBUTIONS


Paper I

Simulation based design of a robotic welding system

Mikael Fridenfalk, Magnus Olsson, Gunnar Bolmsjö

Division of Robotics, Department of Mechanical Engineering, Lund University

Box 118, SE-221 00 Lund, SWEDEN
www.robotics.lu.se

In Proceedings of the Scandinavian Symposium on Robotics 1999, October 14-15, Oulu, Finland



Abstract

Simulation and a virtual prototyping method are applied in the design process of a robot system for arc welding of ship hulls. Simulation is used for the identification of design problems at an early stage, by an iterative problem-solving approach, integrating the work of designers and simulation engineers all the way from idea to product. This is of vital importance especially in shipbuilding, where the complicated geometrical parts of the ship hull make it difficult to develop robot systems that are valid for all hull cells. Initially, the robot system has to be adapted to a specific reference cell, representing a general hull cell in a submarine or a supertanker. An embryonic, well-functioning virtual sensor for welding with weaving using through-arc sensing has been created and integrated with the simulation environment.

Keywords: arc welding, robot, simulation, shipbuilding, virtual prototyping, virtual sensor, virtual reality.

1 Introduction

New areas in welding large structures in shipbuilding include joining large sections such as double-hull constructions. Joining these sections creates great problems for a manual welder, since the welding takes place in a closed area with associated work-environment problems. The accessibility to the working area is limited to a manhole, and the use of robots for welding such structures requires new robot designs that are adapted to the task as well as to the additional requirements of one-off production.

The present paper describes work and results related to the design of a robot system for joining ship sections, specifically welds that must be made inside the structure. Within the scope of this work there are some obvious reasons for automation that include not only quality and productivity issues, but also the health of the welders, since the welding is performed in a closed environment.

Robot welding in shipbuilding is a relatively new application area in comparison with robot welding in the automobile industry. Due to the geometrical complexities of ship hulls [1], especially in the fore and aft of the ships, programming is usually an awkward task. Simulation and off-line programming tools are most useful in this respect for virtual prototyping and retrieval of CAD geometries for robot welding [2, 3, 4, 5, 6, 7, 8].

Additional problems include the difficulty of transporting and assembling the robot at the working place, and the design or selection of a robot system with a working space that is large enough for the hull panels. In the present case the main issue is to build a robot that is light enough to be transported and assembled manually, can be fixed in the reference hull cell, and has the reachability and power to perform the required welding operations.

The design of a new robot in the present case is fundamental for meeting the prerequisites of this project, as formulated by the shipyards. The robot that will be designed and built strongly resembles a Motoman K3. The differences are mainly that the new robot will have extended range and will be made of aluminum alloy, to achieve high performance characteristics and low weight.

The physical requirements on the robot system, besides what has been mentioned above, are that it can be taken inside the structure through a manhole of 600 × 800 mm, weighing not more than 50 kg per part, and assembled inside the cell. The structure of the cell is such that the robot must be mounted on a mobile platform, see Fig. 4.

Based on requirements defined by shipyards, the design work includes the use of Envision TR¹ for analysis and evaluation of the robot performance during welding tasks. Of importance is the robustness of the system, which relates to the ability to cope with the range of geometrical dimensions as well as fluctuations in tolerances. This must include the capability to perform the task within the joint limits of the robot system (including the movable robot base) and the avoidance of kinematic singularities within the motions of the weld task that may jeopardize the result.

The paper will focus on the following:

1. Virtual design of a robot system using simulation tools such as Envision TR.

2. A systems description of the robot system and the application area specification.

¹ Deneb Robotics Inc.


3. Methods to simulate the robot system including sensor interaction. In this case specific macros will be included that use search techniques with the torch (mechanical/electrical contact) and weaving (tracking using the through-the-arc sensing principle). Simulation of the system, including the sensor interaction and tolerances, is an important part of validating the robustness of the design.

Work presented within this paper is part of an ongoing BRITE/EURAM project, ROWER-2.

2 Simulation development process

The use of simulation in the design process of complex mechanical systems decreases the risk of design errors and saves time, money, equipment and extensive specialist knowledge [9]. The process of innovation is drastically empowered by the aid of virtual prototyping, since building a physical prototype often slows down the innovation process and is usually quite expensive. Furthermore, a physical model does not allow parameterization, an indispensable feature that allows mapping the areas of robustness in a fast and efficient way, which is fundamental for optimization.

It is a disadvantage to build a physical model in one place and to convey the information back to co-designers in another part of the world. An optimal design process requires a functioning model of the system to be available during the whole design iteration process. The transportation of a physical model from one place to another is unthinkable, and it is not always possible to gather the necessary assembly of experts in one single city or country, still less in one building. In this work the simulation technique is used as a tool to support the design process. During a pre-study, functional requirements and design parameters were identified to be used in the first conceptual design for simulation.

After a study of the nature of virtual prototyping, the conclusion is that there exist, in a sense, two mutually opposite simulation strategies: the “breadth first simulation” strategy [10] and the “depth first simulation” strategy [11, 12]. Breadth first simulation consists of five major steps, see Fig. 1. The building process implies the design of a simple but functional CAD model of the entire mechanical system. The model should include the basic parts of the mechanical system and the kinematic relationships between the parts. The testing of the model ensures that the model is working accordingly. Validation implies that the model is tested against real data. If the model fails to match reality, it has to be refined, and the procedure of validation is repeated until the simulation matches the real data. Parameterization and definition of variables prepare the model for experiments and robustness tests. Optimization implies the identification of the areas of robustness and the determination of the parameters that give the most efficient or robust system configuration.

Figure 1 The breadth first simulation strategy: building, testing, validation and refinement, parameterization, iteration and optimization.

The second strategy, depth first simulation, is the crawl-walk-run method that in its most extreme form implies full simulation of every part of the system before the integration of the entire system is made.

Figure 2 The depth first simulation strategy (sub-model 1, sub-model 2, …, sub-model N): the crawl-walk-run method.

The normal approach to simulation is often an unconscious, in practice random, combination of these two strategies. However, a careful examination should be made after the pre-study to identify the critical parts of the system that need to be analyzed and modeled in more detail. This is important to speed up the iterative process during the design of the system.


3 Simulation environment

Envision TR, running on SGI, was used as the primary tool during this work for all simulation tasks. Pro/ENGINEER² was used for modeling the modified K3 robot. Envision TR is a Deneb product for telerobotics and offers a graphical real-time simulation environment with the most sophisticated features for simulation, control and virtual prototyping of robots.

The internal programming language, called Graphical Simulation Language (GSL) [13], is a script language used to execute tasks within the program. Linking the simulation environment with the real world is possible by the use of the Low Level Telerobotics Interface (LLTI) and Axxess. LLTI [14] provides an environment in which real-world devices may be programmed and controlled directly from the simulated workcell. LLTI automatically maintains the accuracy and integrity of the simulated workcell according to subtle changes in the real-world environment through the use of real-time sensory feedback.

For more advanced telerobotics applications, Envision TR also provides a flexible API (Application Programmer Interface) framework called Axxess [15], in which it is possible to link the kernel of Envision TR with user code written in C. It is also possible to access any functionality within Envision, whether it is the user interface, the geometric database, the graphics display or any other functionality, and to replace the source code of Envision TR with user-written code.

The Axxess library of functions provides a comprehensive set of routines that the developer may use to build his own application, a feature that has been used during this work to build a basic and straightforward sensor model; an embryo of a virtual sensor to be developed in future work. The Axxess open architecture framework is based on Dynamic Shared Objects (DSOs). A DSO is an Executable and Linking Format (ELF) object file, relocatable at run time and very similar to an executable program but with no main routine. It has a shared component, consisting of shared text and read-only data, and a private component, consisting of data and an indirection table also known as the Global Offset Table (GOT). It further has several sections that hold information necessary to load and link the object, and a liblist, the list of other shared objects referenced by this object. Most UNIX (including IRIX used by SGI) software vendors supply their libraries in DSO form. In Windows NT, these libraries are in DLL format.
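The run-time loading mechanism that DSOs provide can be illustrated with a few lines of code. This is a sketch in Python rather than the C used with Axxess, and the C math library merely stands in for a user-written DSO; it is not part of the ROWER-2 code:

```python
import ctypes
import ctypes.util

# Locate and load a shared object (a DSO on IRIX/Linux, a DLL on Windows)
# at run time -- the same mechanism by which a framework such as Axxess
# pulls user code, which has no main routine of its own, into a running kernel.
path = ctypes.util.find_library("m")
libm = ctypes.CDLL(path) if path else ctypes.CDLL(None)

# Symbols are resolved through the object's indirection tables, so the
# caller only needs the function's name and signature.
libm.cos.restype = ctypes.c_double
libm.cos.argtypes = [ctypes.c_double]
print(libm.cos(0.0))  # prints 1.0
```

The same pattern applies to any shared object: the host resolves the entry points by name after loading, instead of linking against them at build time.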

4 Experiments

A model was created of the welding robot, the mobile platform and the double-hull reference cell, based on a conceptual design of the robot system, see Fig. 3.

² Parametric Technology Corp.


Welding paths, specified by the shipyard, were created and attached to the hull cell.

Figure 3 Reference cell used in the design process.

Since the welding robot is similar to the Motoman K3 robot, see Fig. 4, a model of the K3 robot was used as the base structure of the welding robot. The implementation of the inverse kinematics of the welding robot was easily realized by inheritance of the optimized inverse kinematics routines of the Motoman K3 model.

All the parts of the K3 model were redesigned to fit the new specifications, and a welding torch model was designed and attached to the robot. The model of the welding torch in Fig. 5 was designed in accordance with the first suggestion of the co-designers of the robot system. Early simulation work proved, however, that the welding torch was almost aligned with the sixth axis of the robot, creating a singularity. The weakness of the first torch was that the angle between its welding axis and the sixth axis of the robot was too small, or more specifically only 7 degrees. It was suggested that a welding torch with an angle between 25-45 degrees should be used, see Fig. 4.
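The effect of the torch angle can be illustrated with a rough rule-of-thumb calculation: when the torch axis lies at an angle θ to the sixth joint axis, reorienting the torch demands a joint-6 rate that grows roughly as 1/sin θ. This is a first-order illustration only, not the kinematics model used in Envision TR:

```python
import math

def joint6_rate_factor(torch_angle_deg):
    # Rough 1/sin(theta) amplification of the joint-6 speed needed to
    # reorient the torch when its axis is nearly aligned with joint 6;
    # the factor diverges as the angle approaches zero (the singularity).
    return 1.0 / math.sin(math.radians(torch_angle_deg))

print(round(joint6_rate_factor(7.0), 1))   # first torch suggestion
print(round(joint6_rate_factor(35.0), 1))  # within the suggested 25-45 degree range
```

At 7 degrees the amplification is about 8, while at 35 degrees it drops below 2, which is why the larger torch angle moves the wrist safely away from the singular configuration.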

To be able to reach all stiffeners, the welding robot was mounted on a mobile platform (MP), movable in the vertical direction, see Fig. 6. This platform increased the number of degrees of freedom of the robot from 6 to 7, thereby increasing its reachability. Since the MP does not move during welding, however, it takes no active part in the welding process.

Figure 4 Robot design for joining large structures.

Figure 5 The welding torch model in accordance with the first suggestion of the co-designers.

A reference cell with the dimensions 1950 × 2600 × 3500 mm was used in the simulation. In practice every ship has a huge number of different types of cells. The reference cell represents a typical cell in any ship, and the main idea is that once the robot system is fully adapted to the reference cell, it will also be able to operate, with some programming modifications, in the other cells as well.

The recommended design process for this project at this particular phase is to first set up a priority list of the parameters in the robot system that may be modified. In some cases, for instance, the robot is already selected and may not easily be exchanged. In this project the robot is not yet built, but the design process has been concluded. It is in the present case not possible to make any suggestions for improvements unless a serious flaw is found in the robot design. Yet another reason for not changing the robot design is that industrial robot designs in general are highly optimized, and the possibility of improving the design of a K3 robot, for instance, is quite limited in comparison with the labor that such an attempt requires.

In this project, the design of the MP and the selection of the welding torch are, however, currently in progress. The parameters of interest for the MP are, among others, dimensional values and maximum torques in different directions. Some important parameters for the welding torch are the dimensional values and the maximum current capacity.

The application of breadth first simulation in this context might very well be the fact that it was decided before the start of the project, on a draft level, how the robot system should practically be realized. No consideration was given to the very details of the realization of the project. The application of depth first simulation in its pure form might be the development of the robot manipulator in advance, without influence from the design of the rest of the robot system.

An important factor in choosing simulation strategy is to identify interdependent parameters within and between the sub-systems. The principal rule is: wherever there exist independent parameters to be optimized, depth first simulation is applicable. On the other hand, when the parameters to be optimized are dependent on each other, breadth first simulation is the method of choice.

At present the design of the MP is in the iteration and optimization phase, see Fig. 7. Different suggestions have been made for the design of the MP. These include the L-shaped platform, envisaged to allow the robot manipulator to use its full mobility. In order to minimize the weight and maximize the stiffness of the MP, a diagonal MP has also been considered. Note that since two walls have to be welded in each cell, the MP has to be reversed in direction or switched to its mirror counterpart. If the MP is not able to switch direction, a pair of MPs has to be designed, each for one direction.


Figure 6 The mobile platform (MP) in simulation.

Figure 7 A sketch of the MP and the parameters to be optimized (G, H, D, d, α), with robot position and stiffener marked, top view.


At the beginning the suggested values for simulation were 500 mm both for D and H. After selection of the first weld gun, D and H were changed to 718 and 397 mm, respectively. Conflicts of interest between different parameters are very likely to occur in the design of robot systems. In the current case a torch had to be selected and the dimensions of the MP determined to avoid (during welding):

• collision with the hull,

• collision with the MP or itself,

• singularities.

All simulation work was carried out using the collision detection and the singularity notification features of Envision TR. Since the torch geometry was the cause of singularity and collision problems, a new torch was selected and the parameters D and H were iterated once again. The best values found for the new torch at present are D = 617 and H = 450 mm. There are of course some restrictions on the dimensions of the MP. The MP itself consists of an overhead, carrying the L-shaped platform. The overhead, called GLC-10³, is a commercial linear guide system, able to move a specific payload. Though it has a high dynamic capacity and maximum speed, its moment capacity is, however, quite modest. Since large values of D and H might overload the linear guide, not every value of D and H is acceptable. At present the new values achieved from the simulation are examined by the sub-designers of the MP to ensure that the linear guide is not overloaded.
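The iteration over D and H described above amounts to a constrained parameter sweep. The sketch below shows the shape of such a sweep; the three feasibility checks are hypothetical placeholders for the reachability, guide-load and collision tests, and none of the limits are ROWER-2 data:

```python
# Illustrative grid search over the MP dimensions D and H (mm). The checks
# stand in for the collision, singularity and guide-load tests performed
# in Envision TR; all limits below are invented for illustration.

def reaches_all_stiffeners(d, h):
    return d + h > 1000          # placeholder reachability criterion

def within_guide_capacity(d, h):
    return 0.004 * d * h < 1200  # placeholder moment limit of the linear guide

def collision_free(d, h):
    return d > 450 and h > 350   # placeholder clearance to hull and MP

candidates = [(d, h)
              for d in range(400, 801, 50)
              for h in range(300, 601, 50)
              if reaches_all_stiffeners(d, h)
              and within_guide_capacity(d, h)
              and collision_free(d, h)]
print(len(candidates), "feasible (D, H) pairs, e.g.", candidates[0])
```

Parameterizing the model this way is what makes the "mapping of the areas of robustness" mentioned earlier tractable: each candidate pair can be replayed through the simulated weld task automatically.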

5 Implementation of virtual sensor

One of the reasons to develop a virtual sensor model was to attain independence from the welding equipment used in this project. Since the co-designers of the robot system are spread over different countries in Europe, the equipment may not at all be present when needed. Other factors are the possibilities to test and verify the functionality of a sensor that will be used in the system. Thus, controller software as well as sensor functionality can be developed and tested [16, 17].

In addition to all the advantages that have been related to the virtual prototyping method, the sensor model also increases the possibilities of optimizing the design conditions of the robot system in a systematic manner, much due to the parameterization and variable-definition features of virtual prototyping.

³ NIASA, Spain

The foundation of a virtual sensor model was laid during this work. A simple linear sensor model was integrated with the kernel of Envision TR. The virtual sensor measures the current in the welding equipment, a technique called through-the-arc sensing. The principle behind through-the-arc sensing is based on the assumption that the current is inversely proportional to the distance between the welding electrode and the work piece during weaving. In reality, the relation between the current and the distance is non-linear and stochastic (including noise).

It is thus possible to dispense with torch sensors such as laser-beam sensors for distance measuring. Since a vision camera will be placed on the welding torch, there is no room for an additional sensor. A Hall sensor that monitors the welding current does not have to be placed on the robot, which saves crucial space. The weaving is a sine-curve motion perpendicular to the bisector of the planes of the workpiece, see Fig. 8. The weaving motion is executed by the GSL script below:

PROGRAM wmot
VAR
    current : real
    max_current_comp : real
BEGIN MAIN
    $weave = on
    $WEAVE_AMPLITUDE = 10
    $WEAVE_FREQUENCY = 4
    $speed = 10
    $MOTYPE = AUX1
    current = 50
    max_current_comp = 25
    MOVE TO ww2
END wmot

The main programming, however, has been done in C and linked with the kernel of Envision TR. The flow diagram in Fig. 9 gives the primary structure of the code.

6 Results

The first design suggestion of a robot for arc welding in ship hulls was tested and modified by use of simulation and the virtual prototyping method. Important but not insurmountable obstacles to realization of the robot design were found in the first design iteration, and the information was communicated back to the designers of the sub-systems. The problems found relate to the welding torch and the mobile platform. Simulation showed that the welding torch was almost aligned with the sixth joint of the robot, creating a singularity. Changing the torch angle in the simulation environment to 25-45 degrees proved to be a better solution. An embryonic virtual sensor was created, measuring the weld current during weaving. The virtual sensor was integrated with the simulation environment. The concepts of the "breadth first simulation" and "depth first simulation" strategies were introduced.

7 Future work

In future work, the embryonic virtual sensor model is planned to be extended by the introduction of neural networks or another similar method. The model will be based on empirical data from the welding equipment.

Besides the use made of simulation in this work, simulation could also be applied within the following areas in future work:

• Examination of the workcell with updated information about the geometry of the robot and the MP.

• Dynamic analysis of the joint torques and forces on the robot and the MP.

• Introduction of link flexibility in simulation and analysis of eigenfrequencies at weaving.

• Simulation of the welding cables to avoid undesired collisions and twisting accidents.

• Simulation of the vision system.

• Integration of the welding software and the virtual model for software debugging.

Acknowledgment

The ROWER-2 project is funded by the European Community under the BRITE/EURAM program.

References

[1] F. Hollenberg. Off-line programming of arc welding robots in shipbuilding. In Proceedings of the 8th International Conference on Computer Applications in Shipbuilding, volume 8, pages 27-40, 1994.

[2] S. Sagatun and S. Hendseth. Efficient off-line programming of robot production systems. In Proceedings of the 27th International Symposium on Industrial Robots. Robotics Towards 2000, pages 787-792, 1996.

[3] P. Drews, D. Matzner, O. Knudsen and E. Gallier. Smart welder—parallel high performance computing applied in shipbuilding. In Proceedings of 1995 Transputer Applications and Systems, Amsterdam, Netherlands.

[4] B. Rooks. Robot welding in shipbuilding. The Industrial Robot, 24(6):413-417, 1997.

[5] H. C. Larsen. Automation of welding processes in modern shipbuilding. Svetsaren (English Edition), 49(1):21-23, 1995.

[6] F. Palcak and A. Cervenan. Optimisation of the robot with use of CAE tool. In Proceedings of the Fourth International Symposium on Measurement and Control in Robotics, pages 313-318, Slovak Tech. Univ., Bratislava, Slovakia, 1995.

[7] P. Sorenti. Efficient robotic welding for shipyards—virtual reality simulation holds the key. The Industrial Robot, 24(4):278-281, 1997.

[8] M. Fridenfalk, U. Lorentzon and G. Bolmsjö. Virtual prototyping and experience of modular robotics design based on user involvement. In Proceedings of the 5th European Conference for the Advancement of Assistive Technology, Düsseldorf, Germany, November 1999.

[9] D. Ullman. The Mechanical Design Process. McGraw Hill Inc., 1992.

[10] Mechanical Dynamics, Inc. Using ADAMS View, 1997.

[11] Mechanical Dynamics Sweden AB. Adams Basic Training, 1999.

[12] J. Kingston. Algorithms and Data Structures. Addison Wesley, 1997.

[13] Deneb Robotics Inc. GSL Reference Manual, 1998.

[14] Deneb Robotics Inc. LLTI Reference Manual, 1998.

[15] Deneb Robotics Inc. AXXESS Reference Manual, 1998.

[16] P. Cederberg, M. Olsson and G. Bolmsjö. A generic sensor interface in robot simulation and control. In Proceedings of the Scandinavian Symposium on Robotics 99, Oulu, Finland, 1999.

[17] M. Olsson, P. Cederberg and G. Bolmsjö. Integrated system for simulation and real-time execution of industrial robot tasks. In Proceedings of the Scandinavian Symposium on Robotics 99, Oulu, Finland, 1999.


Figure 8 Simulation of robot weaving with virtual sensor.


[Figure 9 flow diagram. C-code linked with the Envision TR motion pipeline, in which a new calculation of the position is made 30 times per second:

• Measure the minimum distance between the tool-tip and the workpiece.

• If the distance < min_distance, use threshold values.

• If the maximum of the weaving cycle has been reached, calculate the average current by integration, then look for the minimum.

• If the minimum of the weaving cycle has been reached, look for the maximum.

• Calculate adjustments of the tool-tip relative to the nominal trajectory.

• Save data in a log file; the file can be opened in Matlab for analysis.

Control then returns to the Envision TR motion pipeline.]

Figure 9 Present C-code linked with the kernel of Envision TR.


Paper II

Virtual prototyping and experience of modular robotics design based on user involvement

Mikael Fridenfalk, Ulf Lorentzon, Gunnar Bolmsjö

Division of Robotics, Department of Mechanical Engineering, Lund University

Box 118, SE-221 00 Lund, SWEDEN
www.robotics.lu.se

In Proceedings of the AAATE 99 Conference, November 1-2, 1999, Düsseldorf, Germany


Virtual prototyping and experience of modular robotics design based on user involvement

Mikael Fridenfalk, Ulf Lorentzon, Gunnar Bolmsjö

Division of Robotics, Department of Mechanical Engineering, Lund University

Box 118, SE-221 00 Lund, SWEDEN
www.robotics.lu.se

Abstract

Robotics in rehabilitation provides considerable opportunities to improve the quality of life for physically disabled people.

However, practical results are limited, mainly due to the lack of development of new robotics concepts where people are working together with robots in highly unstructured environments. A way to meet these conditions is to apply a modularized approach that allows for customization of the robot on an individual basis.

This paper highlights the user requirements and how they affected the technical demands in the robotics design. In the design process the "virtual prototyping" method was used. The robot design is in a technical validation phase and the results related to the user requirements are promising. The presented work shows a technique that makes it possible to customize a robot, called Asimov, for a disabled user depending on the individual need.

1 Introduction

Robotics with a high degree of autonomy is a challenge in robotics research and can advance the technology within the health care area. The present status of robot systems, in general those within industrial automation, is that they are more or less used as programmable automatic machines.

Most robots used in rehabilitation today have similarities with industrial robots, such as the RT-series robots and SCORBOT, which originally were developed for educational purposes and have been used within projects for disabled people [1, 2], often with adaptations for rehabilitation purposes, such as Handy-1 [3], which is used to assist in eating [4], and DeVar, which uses a PUMA robot to assist disabled people at home or at vocational workplaces [5].


An example of a commercial robot designed for mounting on a wheelchair is the relatively advanced Manus robot, which can be considered the most successful robot of its kind for disabled people [6].

New designs are on the way that will introduce the use of compact and flexible arms as well as new drives/actuators. Examples of this include the Tou soft (flexible) assistant arm [7], the pneumatically driven Inventaid arm [8, 9] and the compliant actuator Digit Muscle [10]. The interest in combining a wheelchair and a manipulator is increasing, not only through the manipulator itself, but also through enhancements of the wheelchair control by providing it with sensors and a control system as with any mobile robotic base [11]. There is currently research within the MOBINET project, which will integrate innovative design solutions of a mobile base with a robot arm. Developments described in this paper form a subsystem within MOBINET [12, 13, 14].

This paper will address specific problems associated with the development of the robot system Asimov for disabled people and the technology behind it.

An important issue related to the human machine interface is the relatively complex control pattern needed to define a robot motion, even with a proportional joystick with two degrees of freedom. The different characteristic configurations, including six degrees of freedom in space, require additional user interaction, similar to developments in adapting user interfaces for omni-directional wheelchairs [15, 16].

The main contributions shown in this paper can be summarized as (1) the design solutions resulting in a modular approach to customizing a robot, and (2) the method of applying virtual prototyping techniques, with extensive use of advanced simulation tools, during the design process and for validation of the functionality.

2 Modular design and virtual prototyping

The starting point in creating a robot for disabled people was to identify a set of user requirements that could form a basis for technical specifications. The requirements were formed by rehabilitation centers in cooperation with potential end-users, such as people injured in working life and people with high spinal injuries. Based on the user requirements and the resulting technical specification, solutions were generated and tested against the specified needs, see table 1.

It is clear that any robot to be used by a disabled person will be a challenge in terms of usefulness, related both to the manipulative functionality and to the human machine interface. However, the aim was not to identify a user specification and design one robot that would fit any disabled person, but to develop innovative solutions that make it possible to customize a robot for disabled people on an individual basis. The customization aspect is important


Table 1 Technical specification.

Weight (robot arm)         Low total weight (10-15 kg)
Payload                    1 kg
Working range              More than 1 m
Maximum joint speed        90 degrees/s
Maximum Cartesian speed    1.5 m/s
Power                      24 VDC (wheelchair battery)
Configuration              Redundant with revolute joints

since there are different needs based on both the individual requirements and the disability, which in itself may need specific considerations, including changes over time.

One way to meet these specifications was to design a robot where each joint is an independent module. Each joint module consists of one or several motors and the peripherals needed for driving the joint(s) and for making communication with the rest of the arm possible. The complete robot will be built from a number of standardized joint modules and connecting links. The modular approach makes it possible to adapt the robot to a specific user and situation, since it allows great flexibility in the configuration and size of the robot. For example, a robot mounted on a desk could have a 5-axis configuration with short connecting links, which would give a small working space. A robot mounted on a wheelchair would need a larger working range and more joints to be able to reach a desk surface or a bookshelf. It must also be able to fold up in a position within the chair's outer limits. With the modular approach it is possible to adapt the robot for these applications.
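The modular idea can be sketched in code: a robot is described as a chain of standardized joint modules and connecting links, and its nominal reach follows from the link lengths chosen for a particular user. The link lengths below are illustrative, not measured Asimov values.

```c
#include <assert.h>

/* Sketch of the modular approach: a configuration is just a list of
   connecting-link lengths, and the nominal reach of the arm is their
   sum. A desk-mounted 3-link arm and a wheelchair-mounted 5-link arm
   are built from the same standardized modules. */
#define MAX_LINKS 8

struct modular_arm {
    int n_links;
    double link_mm[MAX_LINKS];
};

static double nominal_reach_mm(const struct modular_arm *arm)
{
    double reach = 0.0;
    for (int i = 0; i < arm->n_links; i++)
        reach += arm->link_mm[i];
    return reach;
}
```

Swapping link lengths (or adding modules) reconfigures the working range without redesigning the joints themselves, which is the point of the modular approach.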

To be able to design and analyze a complex system such as a robot, it is important to use modern techniques where simulations and mechanical design run in parallel with the development of the controller hardware and software.

A design method that has lately been introduced is the virtual prototyping method, where the designers work closely together with simulation engineers and exchange models and experiences during the whole process. Thus, a team of experts is able to cooperate in the design process and continuously assess a large number of features by using virtual prototypes and simulation models [17].

A virtual prototyping system, as described in figure 1, consists of a solid modeler and simulation software. The models created in the CAD system are transferred to the simulation system, and the complete mechanism is assembled and simulated. The simulation software makes it possible to analyze and calculate vital engineering information, such as motor torques and interlink forces. The virtual prototyping method makes it possible to develop and test a large number of virtual prototypes at low cost, thus reducing development time and costs. Within this project, the software used was PRO/Engineer for the solid modeling and Envision TR for the simulations.

[Figure 1 diagram: the designer works in a CAD system and the simulation engineer in a CAE system; the two exchange a product data model and numerical data through the virtual prototyping system, which provides realistic presentation and intuitive manipulation.]

Figure 1 A schematic showing the principles of virtual prototyping. [18]

When using the virtual prototyping method, the designers are able to explore a wider range of possible solutions without making a single physical prototype until the design is finalized. This makes it possible to explore unorthodox design solutions that normally would have been rejected at an early stage of the design process. The visualization of the models in the simulation system makes it possible for people without an engineering background to get a fairly good understanding of the model, which is useful in discussions with future users and customers.

3 Technical development

The Asimov robot, described in figure 2, is a telerobotic arm with 7 joints, primarily designed to be attached to a wheelchair (from the manufacturer Permobil in Sweden) and controlled by a joystick. To fulfill the specifications formulated in table 1, the following measures had to be taken to obtain maximum efficiency, minimum weight and minimum power consumption:

• Components of lightweight material


[Figure 2 diagram: user input and program input feed a motion planner, which communicates via a CAN interface over the CAN bus with the motor-control units that drive the motors of the manipulator.]

Figure 2 Schematic depiction of the robot system. One of the modules is highlighted and extracted from the manipulator.


• Selection of brushless D.C. motors

• Selection of planetary gear boxes

The power source of the robot and the wheelchair is a dual set of standard car batteries (12 VDC), providing 24 VDC. Since the modular design requires that the motors and the electronics be enclosed within the minimal space of the robot arm, special electronic circuits have been designed. The principal scheme of the electronic system of a single module is presented in figure 3.

[Figure 3 diagram: the microcontroller commands the motion-control unit, whose output drives the motor via the PAL commutation logic and the H-bridge; Hall and encoder feedback signals return through a preamplifier.]

Figure 3 Scheme of the information flow in the electronics.

Every module contains a microcontroller from Permobil and is connected by the CAN bus to the main computer. Only four wires run through the robot arm: two for the power supply and two for the CAN-bus communication. The motion control unit contains, besides a motion generator, also PID control, and generates pulse-width modulated (PWM) signals that run the motor according to the commutation order defined by the programmable array logic (PAL) and amplification (H-bridge). There are feedback signals from the motor, where the Hall signal is essential for motor commutation and the encoder signal for motor positioning.
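A minimal sketch of the position loop just described, assuming a textbook PID law whose clamped output is interpreted as a signed PWM duty cycle in [-1, 1]; the gains are illustrative placeholders, not the actual Asimov controller parameters.

```c
#include <assert.h>

/* Textbook PID position loop producing a signed PWM duty cycle.
   The gains and time step below are illustrative assumptions. */
struct pid {
    double kp, ki, kd;
    double integral, prev_err;
};

static double pid_pwm(struct pid *c, double err, double dt)
{
    c->integral += err * dt;
    double deriv = (err - c->prev_err) / dt;
    c->prev_err = err;

    double u = c->kp * err + c->ki * c->integral + c->kd * deriv;

    /* Clamp to the range a PWM stage can realize. */
    if (u > 1.0) u = 1.0;
    if (u < -1.0) u = -1.0;
    return u; /* signed duty cycle in [-1, 1] */
}
```

The duty cycle, together with the Hall-based commutation order, is what the PAL and H-bridge turn into motor phase voltages.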

The information flow is also presented at a higher level in figure 2 and needs no comments besides that the program input refers to Envision TR.

4 Validation of technical specification and user requirements using VR-tools

Virtual prototyping was used as the design process, which helped ensure that no miscommunication concerning the product would occur between the end-users and the development team.

The use of virtual prototyping tools proved to be a successful strategy for visually clarifying the potential of the project (see figure 4). After establishing a list of requirements, based on consultation with potential users, a refined model was produced that should, at least in theory, be able to fulfill the user requirements.

Figure 4 The manipulator mounted on the electric wheelchair. Snapshots from the simulation scenarios using the software package IGRIP (Deneb Robotics Inc.).

The models of Envision TR [19] precisely represent the actual geometry, motion and dynamic characteristics associated with the real-world system. Envision TR delivers (with the dynamic option) dynamic data such as forces and torques for arbitrary revolute and prismatic joints. While kinematics modeling gives a good hint of how the physical body of the robot should be dimensioned, dynamic modeling delivers information essential for motor dimensioning.
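The kind of information dynamic modeling supplies for motor dimensioning can be illustrated with a back-of-envelope worst case: the static torque on a horizontal joint carrying a uniform link with the payload at its tip. The masses and length below are hypothetical, not Asimov data.

```c
#include <assert.h>

/* Worst-case static torque on a horizontal joint: a uniform link of
   mass m_link and length l (center of gravity at l/2) plus the
   payload at the tip. Illustrative numbers only; the simulation
   delivers the full dynamic torques instead of this static bound. */
static double worst_case_torque_Nm(double m_link_kg, double l_m,
                                   double m_payload_kg)
{
    const double g = 9.81; /* m/s^2 */
    return g * (m_link_kg * l_m / 2.0 + m_payload_kg * l_m);
}
```

A motor and reducer pair must cover at least this torque with margin for acceleration, which is where the simulated dynamic data becomes essential.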

This information simplified and accelerated the iterative design process throughout the project and secured that great consideration was taken of the user requirements. Furthermore, the simulator was used to generate and perform, in real time, the motion control of the physical robot during technical validation and testing.

The major problem during the testing proved to be some instability and oscillation. The problem was, however, temporarily solved by replacing the base motor with one of higher torque.

At the start, Asimov was designed to have 8 degrees of freedom, excluding the gripper, for optimum reachability. At a later stage, the removal of one motor was scrupulously examined in Envision TR and a new and better solution was found, a task that would have required much more time and calculation to accomplish without the ability of VR prototyping.

The possibility to develop entire motion procedures and working scenarios using simulations and VR-tools proved to be a useful working method for cross-disciplinary teamwork. Fields of interest for Asimov are such operations as opening/closing doors, putting/taking a book from a shelf, grabbing a glass and drinking from it, etc. A number of successful simulations of this kind of operations have been performed.

5 Results

A modular robot was designed by the virtual prototyping method and a seven-axis prototype has been built. The physical robot showed very good correlation with the virtual model. The initial technical validation of the robot proved to match the basic initial specifications well.

The weight of the robot is approximately 13 kg and the working range about 1.5 m. The maximum joint speed surpasses the specified value. It is possible to increase the ratio of the reducers by a factor of four and still preserve the maximum joint speed of 90 degrees/s. Some instability problems that were identified will most likely be solved in the next generation of Asimov by the implementation of adaptive control and the reduction of the size of the robot arm.
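The claimed margin can be checked arithmetically: the joint speed equals the motor speed divided by the reducer ratio, so a four-fold ratio increase still meets the 90 degrees/s specification only if the motor delivers four times the corresponding speed. The motor speed and ratios below are illustrative assumptions, not the Asimov drive-train values.

```c
#include <assert.h>

/* Joint speed from motor speed and reducer ratio:
   deg/s at the joint = (rpm * 360 / 60) / ratio.
   The 6000 rpm motor and the ratios 100 and 400 are illustrative. */
static double joint_speed_deg_s(double motor_rpm, double ratio)
{
    return motor_rpm * 360.0 / 60.0 / ratio;
}
```

With these placeholder numbers, quadrupling the ratio from 100 to 400 drops the joint speed from 360 to exactly the specified 90 degrees/s, matching the factor-of-four margin described above.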

The power consumption proved to be very low, about 50 W at typical performance, which is quite insignificant in comparison with the power consumption of the rest of the wheelchair.

6 Discussion

Although much was achieved by the virtual prototyping approach, there is still need for improvements in the next generation of Asimov. Below follows a list of suggestions, based on an interview with a potential user of Asimov (during a workshop held recently) and experimental results received from the trial run of the robot:

• Reduction of the length of the robot arm by a factor of 2/3.

• Reduction of the size of the electronics. The present controller card and the motion controller circuits should be replaced by a microcontroller chip. This work is in progress.

• Introduction of current monitoring for the motor phases. This enables torque control and eliminates the motor fuses.

• Study of the possibility to increase the gear box ratio by a factor of 2-3.


• Implementation of full modularity and unlimited rotations by introduction of slip ring connectors.

• Introduction of adaptive control.

• Reconfiguration of the robot arm to resemble a human arm.

Other more innovative solutions are the replacement of the CAN-bus with radio communication and the implementation of a "plug and play" strategy, where the modules could be connected and disconnected by a simple operation. The modules would instantly recognize the new configuration, change the kinematics model and update the adaptive control.

Acknowledgments

This work was funded by AMS (The Swedish National Labor Market Board), RFV (National Social Insurance Board), NUTEK (Swedish National Board for Industrial and Technical Development) and the European Commission within the TMR Programme, project MOBINET.

References

[1] J. L. Dallaway, R. D. Jackson, and P. H. A. Timmers. Rehabilitation Robotics in Europe. IEEE Transactions on Rehabilitation Engineering, 3:35-45, 1995.

[2] G. Bolmsjö, H. Eftring, and H. Neveryd. Robotics in rehabilitation. IEEE Transactions on Rehabilitation Engineering, 3:77-83, 1995.

[3] M. Topping and J. Smith. The development of Handy 1, a rehabilitation robotic system to assist the severely disabled. Industrial Robot, 25(45):316-320, 1998.

[4] J. Hegarty. Rehabilitation robotics: The user's perspective. In Proceedings of the 2nd Cambridge Workshop on Rehabilitation Robotics, Cambridge, UK, April 1991.

[5] M. van der Loos, J. Hammel, and L. Leifer. DeVar transfer from R&D to vocational and educational settings. In Proceedings of the Fourth Int. Conf. on Rehabilitation Robotics, pages 151-155, Wilmington, Delaware, June 14-16, 1994.

[6] T. Øderud, J. E. Bastiansen, and S. Tyvand. Experiences with the MANUS wheelchair mounted manipulator. In Proceedings of ECART 2, page 29.1, Stockholm, May 26-28, 1993.

[7] A. Casals, R. Villà, and D. Casals. A soft assistant arm for tetraplegics. In Proceedings of the 1st TIDE Congress, pages 103-107, Brussels, 6-7 April 1993. Studies in Health Technology and Informatics, Vol. 9, IOS Press.

[8] R. D. Jackson. Robotics and its role in helping disabled people. Engineering Science and Educational Journal, pages 267-272, December 1993.

[9] J. Hennequin and Y. Hennequin. Inventaid, wheelchair mounted manipulator. In Proceedings of the 2nd Cambridge Workshop on Rehabilitation Robotics, Cambridge, UK, April 1991.

[10] S. Greenhill. The Digit Muscle. Industrial Robot, 20(5):29-30, 1993.

[11] S. Levine, D. Bell, and Y. Koren. An example of a shared-control system for assistive technologies. In Proceedings of the 4th Int. Conf. on Computers for Handicapped Persons, Vienna, Austria, September 1994.

[12] H. Heck. User needs for a mobile service robot in health care and rehabilitation. In S. Tzafestas, D. Koutsouris, and N. Katevas, editors, Proceedings of the 1st MobiNet Symposium, pages 35-44, Athens, Greece, May 15-16, 1997.

[13] B. J. F. Driessen, J. A. v. Woerden, M. W. Nelisse, and G. Overboom. Mobile robotics and MANUS: A feasible approach? In S. Tzafestas, D. Koutsouris, and N. Katevas, editors, Proceedings of the 1st MobiNet Symposium, pages 115-120, Athens, Greece, May 15-16, 1997.

[14] O. Buckmann, M. Krömker, G. Bolmsjö, U. Lorentzon, and F. Charrier. Design of health care robotics components by use of the rapid prototyping technologies. In N. Katevas, editor, Proceedings of the 2nd MobiNet Symposium, pages 75-84, Edinburgh, UK, July 23, 1998.

[15] C. Buhler, H. Heck, and W. Humann. User-driven human-machine interface configuration for a wheelchair with complex functionality. In G. Anogianakis, C. Buhler, and M. Soede, editors, Proceedings of the 4th European Conf. for the Advancement of Assistive Technology, pages 375-380. IOS Press, 29 September - 2 October 1997.

[16] H. Kwee. Integrating control of Manus and wheelchair. In Proceedings of the 5th Int. Conf. on Rehabilitation Robotics, ICORR'97, pages 91-94, Bath, UK, April 14-15, 1997.

[17] U. Lorentzon and M. Olsson. Eight axis lightweight service manipulator: Virtual prototyping of mechanical structure and controller software. In J. G. Loughran and S. A. Domanti, editors, Proceedings of the World Congress Manufacturing Technology Towards 2000, Cairns, North Queensland, Australia, September 15-17, 1997.

[18] F. Dai. On the role of simulation and visualization in virtual prototyping. In Proceedings of the Int. Workshop on Virtual Reality and Scientific Visualization, pages 23-26, Hangzhou, China, April 1995.

[19] Deneb Robotics Inc. ENVISION User Manual and Tutorials, 1998.


Paper III

The Unified Simulation Environment - Envision Telerobotics and Matlab merged into one application

Mikael Fridenfalk, Gunnar Bolmsjö

Division of Robotics, Department of Mechanical Engineering, Lund University

Box 118, SE-221 00 Lund, SWEDEN
www.robotics.lu.se

In Proceedings of the Nordic Matlab Conference, October 17-18, 2001, Oslo, Norway


The Unified Simulation Environment - Envision Telerobotics and Matlab merged into one application

Mikael Fridenfalk, Gunnar Bolmsjö

Division of Robotics, Department of Mechanical Engineering, Lund University

Box 118, SE-221 00 Lund, SWEDEN
www.robotics.lu.se

Abstract

The kernels of Matlab and Envision Telerobotics (TR) were merged on the SGI platform, taking advantage of the Matlab Engine and the open architecture of Envision TR, to design a new robot simulation environment called the Flexible Unified Simulation Environment (FUSE). Envision TR is a time-continuous graphical robot simulation program from Delmia Inc.

In FUSE, Matlab is able to take advantage of the geometrical libraries for most commercial robots on the market and the optimized graphical kernel for real-time 3D applications offered by Envision TR. Envision TR gains full access to Matlab's vectorized calculation environment, accelerating either the speed of execution at run-time, in some cases beyond the speed that compiled C code offers, or debugging and software simulation, offering the capability to modify the software at run-time and alter the program flow instantly in the middle of program execution. In addition, the development of new robot systems is optimized by the ability to use specialized Matlab toolboxes in diverse application areas, such as statistics, automatic control and artificial neural networks, minimizing the software development time, effort and risks taken in major robot projects.

FUSE was used in the ROWER-2 project, a European Union project aimed at developing a robot system for automatic arc welding of hulls in tanker ships. The seam-tracking and sensor control software developed in FUSE was implemented in the target QNX embedded system.

Keywords: Robotics, Simulation, IGRIP, Envision TR, Matlab Engine, software development, systems development, virtual sensor.


1 Introduction

1.1 Integration of systems

Integration of systems saves time, effort and money. The requirement of universality makes software and hardware products either improve their compatibility with current standards or other products, or increase their market shares enough to establish their own standards. One example is Matlab (www.mathworks.com), an application for numerical calculations and one of the standard applications used for scientific research and development. Though Matlab is regarded as user-friendly and possesses many optimized mathematical tools, its real strength lies in its open architecture and the provision of an environment where users can easily make additions to the application [5, 6]. Matlab has also established compatibility with many software and hardware standards on the market during its history, aiding the process of data analysis, software prototyping and automatic code generation for, among others, embedded systems [7].

In contrast to the strategy of gaining market shares by offering a high degree of compatibility and an open architecture, some companies have established universal standards by using their market dominance. Such software is usually not provided with an open architecture and does not promote seamless integration with similar applications on the market, nor does it in general offer compatibility with file formats from competing products. As a rule, however, it offers a great amount of integration and automation within its respective product family.

1.2 Integration in robotics

There is presently a great need in robotics for seamless integration of CAD/CAM/CAE and robot simulation applications into one single, working solution. However, no such system exists today. Within the application area of robot simulation, the most common applications are IGRIP (www.delmia.com) and RobCAD (www.tecnomatix.com). These programs offer graphical real-time environments and geometrical libraries for most common commercial robots on the market. They also provide kinematics models for all provided robot models, as well as many accessories, such as welding torches and fixtures.

1.3 FUSE

The design of the Flexible Unified Simulation Environment (FUSE), done by merging the open architecture version of IGRIP, called Envision Telerobotics (TR), with Matlab, is in no way an attempt to create a unified system integrating robot simulation with CAD/CAM/CAE. It is a unified system consisting of the integration of robot simulation with software prototyping. Since the main development time when designing a new robot is, as a rule, spent on systems development, the integration of robot simulation with a software prototyping environment seemed to be the most efficient way to improve the overall productivity in robotics design.

1.4 Objective

We designed FUSE to be used in the development of a number of sensor control systems for seam tracking at arc welding in the European ROWER-2 project. The aim of the ROWER-2 project was to develop a robot system for automatic welding of hulls in ships [4, 8], mainly super-sized cruisers and oil tankers. The sensor control models were originally based on the concept of the Yaskawa COM-ARC III [9].

2 Materials and methods

2.1 Envision TR

Envision TR is a real-time graphical robot simulation program. It contains, apart from a nearly optimal graphical engine, full geometrical and kinematics libraries covering most commercial robots and many robot accessories on the market. Figure 1 displays the Graphical User Interface (GUI) of Envision TR, including the robot Galileo, designed specially for the ROWER-2 project by Tecnomare, Italy.

In contrast to robot simulation programs from Delmia such as IGRIP, Envision TR is shipped with an open architecture and provides the source code for a significant part of the application. Figure 2 shows the open architecture structure of Envision TR. The user may modify or replace many libraries with user-written code and add new features to the application. The application is created by compiling and linking all libraries to Envision TR.

2.2 Matlab

Matlab is an application and a programming language widely used for numerical calculation in education and research, both at universities and in industry. The extensive libraries offered by Matlab make it usable for software development and testing within a wide range of engineering fields. Programs written in Matlab may be converted to C/C++ and compiled for faster execution or for integration with other products. The Application Programmer's Interface (API) of Matlab, also called the Matlab Engine, may be linked to user-written software in C/C++.


Figure 1 Envision TR GUI


[Figure: block diagram of the Envision TR shared libraries — User data, libvmappch.so*, vmap.exe, libaxspre.so, libapp.so, libvmap.so*, Delmia data, libdnbusr.so, libaxs.so*.]

Figure 2 Envision TR Open Architecture Structure. The libraries marked with an asterisk are not modifiable by the user.


2.3 FUSE

We designed the FUSE application kernel by compiling and linking the Matlab Engine with Envision TR and their shared libraries [1, 2, 3], also known as Dynamic Shared Objects (DSOs) or Dynamic Link Libraries (DLLs). The compilation was performed on the SGI IRIX platform, using the o32 compiler architecture, and FUSE was successfully tested on an O2 and an Octane computer, using 32-bit and 64-bit architectures respectively. The same merging procedure was theoretically possible, though not carried out, on PC Windows environments, as well as on all other platforms supporting both Envision TR and the Matlab Engine.

Envision TR was modified and adapted to the new application structure by modifications in the Motion Pipeline [2]. The Motion Pipeline, written in C, is part of the libdnbusr.so DSO and is used for calculation of new positions for the robot joints about 30 times per second. Envision TR and Matlab form a client-server system, where Envision TR plays the role of a passive client and Matlab the role of an active server. During the calculations, the Motion Pipeline is halted and has to wait for Matlab to finish its task before it is allowed to continue the execution. If the calculations in Matlab take more than about 1/30 s, the simulation rate becomes dependent on Matlab instead of Envision TR. By merging these application kernels, Matlab received full access to the graphical engine of Envision TR and the geometrical robot libraries. Vice versa, Envision TR was given access to the Matlab environment, including all available Matlab toolboxes. This also automatically included full integration of Envision TR with Simulink, Stateflow and consequently with Real-Time Workshop (www.mathworks.com).

This section is concluded by an example showing how to transfer a transformation matrix (4x4) from the Envision TR Motion Pipeline (C code) to the Matlab workspace, within the FUSE environment. The reverse process is performed in a similar manner. It should be remarked that though the transformation matrix in the example below, by definition, always had to be transposed in Envision TR with respect to the first and second index parameters, it ended up correctly oriented in the Matlab environment by the transfer itself.

/*--------------Sample code--------------*/
/*--------------Declaration--------------*/
...
static Engine *ep;
...
mxArray *P_init_mxArray = NULL,
...
double P[4][4];
...
/*--------------Assignment---------------*/
...
/* Assigning numerical values to P */
...
/*---Transferring a 4x4 Matrix from Motion Pipeline to Matlab---*/

/* Create and load P_init into Matlab */
P_init_mxArray = mxCreateDoubleMatrix(4, 4, mxREAL);
mxSetName(P_init_mxArray, "P_init");
for ( i = 0; i < 4 ; i++ )
    for ( j = 0; j < 3 ; j++ )
        P[i][j] = initial_location->xform[i][j];

/* Assigning perspective/scale vector
   for explicit order notification */
P[0][3] = 0; P[1][3] = 0;
P[2][3] = 0; P[3][3] = 1;
memcpy((void *)mxGetPr(P_init_mxArray),
       (void *)P, sizeof(P));
engPutArray(ep, P_init_mxArray);
/*---------------------------------------*/

3 Results

We integrated Envision TR with Matlab on the SGI platform, resulting in the new application kernel FUSE. The fusion, made by linking and compiling the source code of Envision TR with the Matlab Engine, was successful. The new application was fully stable; the only time it crashed was occasionally during the startup process, due to some basic bugs in Matlab v6.0 R12 for SGI IRIX. These bugs will most likely be removed in the next version of Matlab.

FUSE can run in two major modes. In the real-time execution mode, Matlab's highly vectorized functions were used to improve execution speed, in some cases even compared to optimized C/C++ code. In the software prototyping mode, modification of the software in the middle of program execution led to instant change of the program flow. In both modes it was of course possible to use the wide range of specialized Matlab toolboxes in diverse application areas, such as statistics, automatic control and artificial neural networks, to minimize the software development time, effort and risks taken in major projects.

In general, compared to traditional robot system development tools, FUSE also led to the need for fewer programmers and administrators, cutting costs, minimizing management and thereby increasing the control over the code, which in turn resulted in safer programs. The improvement in productivity that FUSE provided allowed us to develop, instead of only one sensor algorithm, a great number of sensor control algorithms in the ROWER-2 project, many of them more competent and robust than any similar product on the market. The prototype of one of the sensor control algorithms was finally selected, translated to C++ and implemented on an industrial PC board, a PIA-662 running the QNX operating system.

4 Discussion

On one level, the full integration of Envision TR with Matlab saves much time and effort in development work. In practice, however, the total development time and effort seems not to be reduced but rather increased. The reason is that FUSE encourages the development of complex solutions, based on experiments and analysis that usually are never performed due to lack of time and human resources. In our case, we would most likely have developed only one single algorithm for seam tracking and sensor control, and we would not have taken any risks using more advanced and complex solutions than absolutely necessary.

The strength of FUSE in this context is that it encourages the user to adopt high-tech solutions, since it makes software prototyping powerful and fun. Though the software development rate is in general increased many times over using FUSE, the documentation rate still remains the same. In the development of the sensor control algorithms, for instance, documentation is presently estimated to take more than five times longer than the actual development of the software, which is a bottleneck. The good news is, however, that if future software prototyping environments are like FUSE, perhaps real programmers will no longer dislike writing real programs, but only documenting them.

5 Acknowledgements

The ROWER-2 project was funded by the European Commission under the BRITE/EURAM program.

References

[1] Delmia Inc., www.delmia.com. AXXESS Reference Manual.

[2] Delmia Inc., www.delmia.com. Motion Pipeline Reference Manual.

[3] Delmia Inc., www.delmia.com. Shared Library Reference Manual.


[4] G. E. Cook, K. Andersen, K. R. Fernandez, M. E. Shepard, and A. M. Wells. Electric arc sensing for robot positioning control. Robot Welding, pages 181–216. Editor: J. D. Lane. IFS Publications Ltd, UK, Springer-Verlag, 1987.

[5] MathWorks Inc., www.mathworks.com. MATLAB Application Programmer's Interface User's Guide.

[6] MathWorks Inc., www.mathworks.com. MATLAB Application Programmer's Interface Reference Manual.

[7] MathWorks Inc., www.mathworks. Real-Time Workshop User’s Guide.

[8] M. Fridenfalk, M. Olsson, and G. Bolmsjö. Simulation based design of a robotic welding system. In Proceedings of the Scandinavian Symposium on Robotics 99, pages 271–280, Oulu, Finland, October 14-15, 1999.

[9] Yaskawa Inc. Arc Sensor COM-ARC III Instructions.


Paper IV

Design and validation of a sensor guided robot control system for welding in shipbuilding

Mikael Fridenfalk, Gunnar Bolmsjö

Division of Robotics, Department of Mechanical Engineering
Lund University, Box 118
SE-221 00 Lund, Sweden

www.robotics.lu.se

In International Journal for the Joining of Materials, Volume 14, Number 3/4, December 2002


Design and validation of a sensor guided robot control system for welding in shipbuilding

Mikael Fridenfalk, Gunnar Bolmsjö

Division of Robotics, Department of Mechanical Engineering
Lund University, Box 118
SE-221 00 Lund, Sweden

www.robotics.lu.se

Abstract

New areas in welding large structures in shipbuilding include joining large sections such as double-hull constructions. Joining these sections creates great problems for a manual welder, since welding takes place in a closed area with associated work environment problems. The accessibility of the working area is limited to a manhole, and the use of robots for welding such structures requires new robot designs that are adapted for the task as well as for the additional requirements of one-off production.

This paper describes research work and results within the ROWER-2 project. The aim of the project was to design a robot system for joining ship sections in the final stage, when the sections are assembled together in dry dock. Due to the high degree of manual work involved in the assembly procedure of the ship, the project addressed both productivity and quality issues. An important part of the project was to develop control algorithms for seam tracking during welding based on through-arc sensing. The aim was to be able to cope with tolerances in the joints after manual setup and tack welding of the structure.

A special software system, FUSE, was developed for this purpose that seamlessly integrates commercially available software tools such as Matlab and Envision (robot simulator). Simulation in FUSE showed that the major part of the development of sensor guided robot control algorithms should be performed by simulation, since it cuts time, expenses and effort, especially when software simulation is included in the methodology.

1 Introduction

1.1 Robot welding

Since manual labor is a highly limited resource, especially when it comes to skilled craftsmen, robot automation is essential for future industrial expansion. One present application area is robot welding, by which the welding quality and the environmental conditions for welders are improved and the productivity is increased. This applies especially to robot welding in shipbuilding [23, 24, 25], where large structures are welded, including the joining of large double-hull sections.

1.2 Seam tracking

Seam tracking [5, 7, 8, 13, 19] is essential for automation in shipbuilding for the manufacturing of large passenger and cargo ships, such as super-cruisers and oil tankers, where high tolerances in the sheet material are allowed to minimize manufacturing costs.

A great number of Sensor Guided Robot Control (SGRC) systems for seam tracking at arc welding have been developed. The patents within this application area during the last 40 years indicate a clear tendency that old methods using mechanical, inductive, electrical and infrared sensors are becoming less important, along with the use of electron beams and camera systems. Today, laser scanners and arc sensors mainly replace these systems.

Systems based on laser scanners and arc sensors differ in accuracy, geometry and price. Laser scanners provide a more accurate signal than arc sensors, which contain much noise due to the interference of the welding process. On the other hand, laser scanners have to be mounted on the torch, decreasing the workspace of the robot. Laser scanners are also significantly more expensive than arc sensors, which is perhaps one of the reasons why the majority of the patents issued during the last 10 years for seam tracking at arc welding [3, 16, 17, 26, 27] are based on through-arc sensing, while systems based on laser scanners are hardly even represented.

1.3 Process control

Besides the seam geometry at seam tracking, considerations have to be made of process related welding parameters [1, 29]. The welding process contains many parameters, such as the arc voltage, wire speed and wire material. The aim is to determine feasible parameters for a welding procedure before seam tracking. This may be done experimentally or by the use of knowledge based systems [4, 28]. If it is not possible or desirable to keep these settings constant throughout the seam [14, 15, 18, 27], for instance due to the characteristics of the power-source, adaptive control may be introduced into the seam tracking procedure to maintain the desired welding quality.


Figure 1 Definition of the Tool Center Point (TCP) and the orthonormal coordinate system noa. Weaving is performed in the n direction, o is opposite to the direction of welding and a is the direction of approach.

1.4 Through-arc sensing

The usual methods for automated arc welding are gas metal arc welding (GMAW), flux-cored arc welding (FCAW) and submerged arc welding (SAW). In GMAW, metal parts are joined together by heating them with an arc established between a continuous, consumable filler metal electrode and the workpiece. The filler metal is transferred to the workpiece either in discrete drops, under the influence of electromagnetic forces and gravity, or in the form of molten electrode produced by repetitive short-circuiting.

Through-arc sensing was introduced in the beginning of the 1980s and is described by, among others, G. E. Cook et al. [9]. According to experimental results, the approximate relationship between the arc voltage V, the arc current I and the nearest distance between the electrode and the workpiece l, for an electrode extension ranging between 5-15 mm, is expressed by the equation:

V = β1I + β2 + β3/I + β4l (1)

where the constants β1-β4 are dependent on factors such as the wire, the gas and the power-source. Theoretically, if the power-source is adjusted to keep the current at a constant level, and succeeds in doing so, V will be a linear function of l. In practice, the voltage and current readings of the arc contain much noise, so the signal data has to be filtered by a low-pass filter.

In through-arc sensing the welding is performed parallel to the seam walls, see Fig. 1. By weaving the arc across the weld joint, the geometrical profile of the workpiece is obtained, since the distance from the tooltip perpendicular to the nearest wall is a function of the arc current and the voltage, as approximately expressed in Eq. 1.


1.5 Control algorithms for seam tracking

Template matching

The first method suggested in [9] is template matching. In this method the width and centering corrections are made proportional to ea and en, where t(x) and s(x) are the template signal and the measured arc signal as functions of the displacement x with respect to the center of the weld joint. The template signal is the measured arc current at welding when the position of the workpiece is optimal. In Eqs. 2 and 3, A denotes the weaving amplitude:

ea = ∫[−A, A] |t(x) − s(x)| dx (2)

en = ∫[−A, 0] |t(x) − s(x)| dx − ∫[0, A] |t(x) − s(x)| dx (3)

The template signal may be analytically or empirically determined. Other examples of error calculations use the integrated difference and the integrated squared difference errors. Control in the a and n directions in Fig. 1 is performed by comparing the average value of s(x) at, and near, the weave center position with a reference value.

Differential control

The second method is differential control. It is computationally simpler and has proven to be quite reliable for control in the a and n directions. Sampling is only made at the turning points of the weaving trajectory. Measuring the arc signal, i.e. the current in the case of GMAW, FCAW or SAW, the error ea in the a direction will be proportional to the difference between the average current sampled at the center of the oscillation, i(0), and the reference current value Iref:

ea = Ka[i(0)− Iref ] (4)

In a similar manner, the difference between two samples is proportional to the magnitude of the error en in the n direction:

en = Kn[i+A − i−A] (5)


where i+A and i−A are the average measured currents at a pair of adjacent extreme points and A is the weaving amplitude. The parameters Ka and Kn depend on the weld joint geometry and other process parameters such as the shielding gas and the wire feed rate. Since these parameters are known in advance, Ka and Kn may be defined for any welding application.

1.6 Simulation using virtual sensors

Virtual sensors are presently used in many application areas, such as robotics, aerospace and marine technologies [6, 10, 20, 22]. The development of new robot systems, such as those for seam tracking, may be accelerated by the application of simulation [12, 13]. In general, the design methodology of virtual sensors may vary due to their specific characteristics. If the characteristics are not known well enough for the design of analytical sensor models, artificial neural networks may be used [2, 21].

1.7 ROWER-2 application

The objective of the European ROWER-2 project was to automate the welding process in shipbuilding, specifically for joining double-hull sections in super-cruisers and oil tankers. According to the specifications, the workers have to be able to mount the robot system inside the hull-cell and supervise the welding process from a remote distance. Every hull-cell is equipped with a manhole, through which the robot may be transported, see Fig. 2. Since the robot has to be transported manually, the constraint on the robot is that each part is allowed to weigh no more than 50 kg. Furthermore, the robot system is designed to operate in standard hull-cells with predefined variations in dimension. To be able to meet the specifications, a robot manipulator was built largely from aluminum alloy. The robot was mounted on a mobile platform with one degree of freedom to increase its workspace.

The method chosen for automatic welding in the ROWER-2 project was GMAW using through-arc sensing for seam tracking [9]. A simple seam tracking algorithm was developed at an early stage of the project [13]. By the application of the Flexible Unified Simulation Environment (FUSE) [11], this algorithm was optimized and evolved into many new algorithms. FUSE is an integrated software system based on Envision (robot simulator, Delmia Inc.) and Matlab (MathWorks Inc.). The algorithms were initially designed to contain the basic functionality of the Yaskawa COM-ARC III sensor [30], but were further developed to meet the ROWER-2 specifications along with others added by the authors.

Figure 2 A double hull-cell in ROWER-2 with the manhole through which the robot equipment is manually transported. One of the walls and the floor have been excluded from the picture.

Since one of the simple algorithms was shown by simulation to meet the ROWER-2 specifications, it was chosen for implementation in the project. The implementation was performed in C++, running on a QNX-based1 embedded system. The algorithm was further developed to be able to handle long welds by using linear interpolation. A method was additionally developed to compensate for the power-source controller that interfered with the seam tracking process, by disabling control in the negative a-direction of the TCP. An automatic delay detection for synchronization between the control system and the data from the arc sensor was also designed to secure control in the n-direction of the TCP.

2 Materials and methods

2.1 Systems development methodology

The methodology is based on the assumption that software development is the part that requires the most time, money and effort in the development of robot systems. To optimize the procedure of robot design, from virtual prototyping and systems development to the final integration of the robot in a production line, a new software system was designed that can be used during the whole process. FUSE fills the gap that traditionally exists between CAD/CAM/CAE, robot simulation systems and software simulation environments.

1 QNX is an operating system for embedded controllers.


Figure 3 presents the methodology that was developed and used for the design and validation of a seam tracking algorithm in the ROWER-2 project. In this figure, design of model denotes the design of a model that is focused on the functionality of the algorithm. Simulation and verification of the model denotes the process of estimating the potential and finding the limitations of the algorithm. Software structure simulation is not bound to the model itself but to the implementation of the model at a later stage.

[Flowchart: Design of model based on specifications → Simulation and verification of model → Simulation of software structure → Evaluation of simulation model → Are the specifications met? If no, return to the design step; if yes, proceed to Implementation of software code in C++ → Physical experiments and validation. The simulation steps are performed in FUSE and the final steps on the QNX-based system.]

Figure 3 The methodology developed for the design and physical validation of the seam tracking algorithm in the ROWER-2 project.

It is an awkward task to debug complex algorithms after they have been implemented in a robot system. At an early development stage, however, a detailed simulation of the execution flow, supported by automatic testing programs, will most likely isolate the problems. The test programs may be used to systematically find the limits of the algorithm and to make sure that it behaves as expected. Any significant change in the algorithm structure in the verification phase implies similar changes in the simulation model of the software, to mirror the program flow in the robot system. If the algorithm has been developed with care using software simulation to begin with, such modification will require minimal effort.

By physical validation, deficiencies may be found in the simulation model. If the specifications are not met, the model is modified and new model and program structure simulations are performed. When the specifications are met by physical validation, the process is terminated by a final evaluation of the simulation model using the optimized parameters found during physical validation.

2.2 Joint profiles

The simulation was performed on fillet and V-groove joints, according to the specifications of the ROWER-2 project. Due to these, the algorithm had to be able to compensate for deviations of ±300 mm in the y and z directions (see Fig. 4) during a 15 m long weld. This was redefined as a maximum deviation of ±20 mm per meter of weld, or ±2% expressed as a percentage. Profiles of fillet and V-groove joints are presented in Fig. 4. Since the start point of the seam is always found by image processing before seam tracking is performed, no special consideration is taken at the start.

[Figure: profiles of the fillet joint and the V-groove joint, with joint angles of 70-110° and 20-30°, a gap of 3-6 mm, a plate thickness of 10-15 mm and the y and z coordinate axes indicated.]

Figure 4 Fillet and V-groove joints were used in simulation and physical validation. The lengths of the sample plates in the experiments were 600 mm (orthogonal to the plane of the figure).

2.3 Experimental setup

The functionality of the seam tracking model and the essential features of the program structure were simulated in FUSE on SGI workstations and manually translated to C++ for implementation in the robot system running on the QNX OS, using a real-time industrial PC. As power-source for welding, a Migatronic BDH STB Pulse Sync 400 for MIG/MAG was chosen, operating together with a Planetics Mars-501 push-pull unit. The push-pull unit is suited for welding with cables up to 25 meters between the units and another 16 meters between the pull unit and the welding torch. OK Autrod 12.51 was used as welding wire together with 80Ar/20CO2 shielding gas. Figure 5 displays the ROWER-2 system.

Figure 5 The ROWER-2 robot system. The system is mounted on a mockup, about 3.5 m high, belonging to the hull-cell of a ship. The metallic box near the torch is a camera that is used to identify the start point of the seam.

3 Experimental results

3.1 Simulation experiments

Overview

A number of seam tracking algorithms were developed and implemented in FUSE. The differential algorithm, which was considered the easiest one to implement and required the lowest amount of computational power while still satisfying the ROWER-2 specifications, was chosen for implementation and physical validation in the ROWER-2 project. The simulation and validation work presented in this section is based on the differential algorithm presented in the introduction.

FUSE simulations

The simulations in FUSE were performed using the FUSE Wizard, see Fig. 6. The wizard lists the available algorithms and suggests parameters for the selected one before the start of the simulation.

The first algorithm in the wizard is a simple differential algorithm using distance measurements directly retrieved from the FUSE environment. This was an early prototype of the differential algorithm, but worked in principle in the same way as the calibrated differential algorithm implemented in the robot system. The calibrated differential algorithm uses a virtual arc sensor and is calibrated by the FUSE Wizard. At calibration, the value of the main current is calculated and saved. The calibration value may be changed before the start of any new experiment. In simulation, the same calibration value proved to work properly for both fillet and V-groove welds. In the physical experiments, however, different calibration values for fillet and V-groove welds were used for optimal performance.

The last two methods in the wizard input dialog are two algorithms that use statistical methods to find the 2D profile of the seam, perpendicular to the direction of motion. The last algorithm is in addition able to adapt to the orientation of the joint perpendicular to the direction of motion, by rotation of the TCP around the o-axis.

Initial simulations showed that the differential algorithm was theoretically able to meet the specifications. Through physical experiments, the model was modified due to changes of parameters such as the welding speed, the weaving amplitude and the frequency. These data, along with other data such as the nominal voltage, were optimized by a series of tests performed by an experienced welding specialist tuning the welding system. To evaluate the simulation model against real experiments, a final series of simulations was performed using the power-source parameters that had been shown to give high quality welds.

According to the evaluation, the theoretical limit for the algorithm is a deviation in the interval between -10% and 30%, with Ka = 0.01 and Kn = 0.005. This is better than the ±2% of the specifications. The conclusion is that in the ideal case, when the current contains very little noise and Eq. 1 is valid, the maximum allowed deviation is ±20% (moving the offset of the nominal welding trajectory to the middle of the interval). The asymmetrical performance (-10% to 30%) is most likely related to the present control system.

Figures 7-17 present a few simulations performed for evaluation purposes after the verification of the algorithm by series of real experiments. In all these experiments, Kn was 50% of Ka, which is an empirically derived optimal value verified by robot welding experiments.

Figure 6 The FUSE Wizard. For each algorithm a special set of parameters is suggested that may be modified by user input before simulation. New algorithms may be added to or removed from the list throughout the development process.

The criterion for a good weld is that the final result should, on visual examination, be similar to a straight weld made without seam tracking. To be able to produce such a result, the smallest gains that still made it possible to meet the specifications had to be found, minimizing the instability that occurred during SGRC. At too low gains, the algorithm does not compensate enough. If, on the other hand, the gain is too large, instability will occur, resulting in low welding quality. The trick is thus to find the largest gains for Ka and Kn, for both positive and negative deviations and for both fillet and V-groove welds, at which the seam tracking remains stable and converges smoothly to the seam. At these gains, the maximum possible deviations are found experimentally by increasing the deviation step by step until the algorithm has reached the limits of its performance.

Vector booster method

To increase the performance of the algorithm, a feature called the vector booster was implemented. The vector booster method consists of a pre-calculation of the seam direction at the beginning of seam tracking and a subsequent alteration of the nominal trajectory according to this initial prognosis for the rest of the seam. This method was shown to increase the performance of the algorithm by a factor of up to two. The vector booster is not optimized for dealing with small deviations, but rather with extreme deviations, many times larger than the specifications, see Figs. 7-8. The vector booster does theoretically increase the welding quality at large deviations, due to the need for lower gains in the SGRC system. With larger expected deviations, higher gains have to be set by the operator, and high SGRC gains may in turn decrease the welding quality due to higher instability at seam tracking, by causing small oscillations.

3.2 Validation by robot welding

Overview

Initial simulations showed that the algorithm used less than 1% of the computational power it was assigned, which equals 1/2000 of the total computational power of the real-time embedded system, so the efforts to produce efficient code were successful. The simulation and validation loop described in Fig. 3 was repeated twice and concluded with the evaluation simulations. The first physical experiments consisted of a number of fillet, V-groove and flat surface welds performed by the robot, with and without seam tracking. The primary task was to find the set of parameters for the power-source that resulted in good welding quality.

The second robot welding experiments were performed at the author's request

Page 127: Development of intelligent robot systems based on sensor control


Figure 7 Extreme deviations like this (60%) are not used in practice. According to the specification, the algorithm has to handle a deviation of 2%, which is 30 times smaller than the one in the picture. The picture demonstrates, however, the effect of the vector booster, which works especially well for large deviations. Although the simulation succeeded, a deviation of 30% or smaller is recommended when using the vector booster, for maximum quality of the weld.

Figure 8 The same experiment as in the previous figure. The vector booster showed to be of limited interest at moderate deviations. The method did however inspire the design of the power-source compensation component in the algorithm, described later in this section.


Figure 9 The seam tracking algorithms were first developed using workpieces like this one.

Figure 10 At a later stage, when it turned out that all seams in practice were straight, only such workpieces were used for simulation experiments. A V-groove weld simulation is displayed in this picture.

Figure 11 Seam tracking with deviations of 30% in y and z directions, using the vector booster. Left: at Ka = 0.010 the seam tracking process is very stable, which is a prerequisite for high welding quality. Right: Ka = 0.020 gives high instability.


Figure 12 Seam tracking with deviations of –20% (left) and –10% (right) in y and z directions, with Ka = 0.010, using the vector booster. The process is fully stable, and at –10% the convergence is good.

Figure 13 The same as the previous picture, but without the vector booster. As expected the convergence is even slower at –20% than before (left), but is still very good at –10% (right).

Figure 14 Seam tracking with deviations of 30% (left) and 40% (right) in y and z directions with Ka = 0.010. There is no instability and the convergence is fine, but there is some clipping at 40%. The theoretical positive limit without using the vector booster is thus 30%.


Figure 15 Seam tracking with deviations of 20% in y and z directions, using the differential method with direct measurement in FUSE. Ka = 0.010. The process is stable and the convergence is fine.

Figure 16 Seam tracking with deviations of 20% in y and z directions, Ka = 0.010. The stability is high, but in the left picture the weaving is too near one of the walls. In the right picture convergence is fine, since the vector booster is used. In both experiments current calibration was used, which is the default for the algorithm.


Figure 17 Seam tracking with deviations of 10% (left) and –10% (right) in y and z directions. Ka = 0.010. Good results, but these are the limits in the positive and negative directions.

Figure 18 Example of fillet weld.


Figure 19 Example of V-groove weld. These workpieces are single sided, and the seams have to be covered from the back by ceramic material to prevent welding flux from passing through the seams and thereby causing low welding quality.

and consisted of about 25 fillet and 25 V-groove joints. As a final evaluation of the implemented algorithm, 10 fillet and 10 V-groove welds were also carried out. In addition, an uncounted number of fillet and V-groove welds were performed to find good parameter settings for the power-source. The overhead experiments turned out to be many compared to the pure seam tracking experiments: about 120 fillet and 75 V-groove joint workpieces were estimated to have been used during the two experimental occasions. These overhead experiments proved very important for the development of the algorithm. Without precise parameter settings of the power-source, seam tracking would have worked, but without producing high quality welds.

Anomalies caused by power-source

In theory, seam tracking requires that the relation between arc voltage and current is known, such as by Eq. 1. Usually the voltage is held constant, while the current changes with the distance between wire and workpiece. In synergic welding, however, both current and voltage are modified by the control system of the power-source, causing disruption in the control system of the algorithm. Early experiments with the Migatronic power-source, using the pure synergic mode, showed that this mode disabled algorithm control in the negative approach direction of the TCP.


Initial seam tracking experiments using manual mode showed that the current constantly decreased throughout the weld, making compensation in the negative a direction impossible. A thorough examination of the current sensor showed that the current measurements were both stable and sufficiently accurate for the application, and that the decrease of current throughout the weld was not due to measurement errors. The conclusion was therefore that the Migatronic power-source controller was most likely controlling the current also in manual mode. The reason is assumed to be that modern power-sources include some adaptive control also in manual mode, to assist humans in performing high quality welds.

Compensation for power-source control

Addition of a constantly increasing offset to the nominal trajectory solved the problem caused by the adaptive behavior of the power-source. The solution had similarities with the vector booster method, but the modification of the trajectory increased constantly in the negative direction of the approach axis.

To be able to handle negative deviations, Ka had to be doubled. The method was tested for fillet joints and showed to be a reliable and permanent solution to this problem. Since the same principle is valid for fillet as for V-groove welds, no experimental series were considered necessary to prove the validity of the power-source compensation for V-groove welds.
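As a sketch, assuming a linear ramp (the growth rate and its tuning are not specified in the text and are hypothetical here), the compensation adds per control cycle an offset along the negative approach axis:

```python
def compensated_offset(tick, rate, a_hat):
    """Ramp offset countering the power-source's slow current drift:
    a constantly increasing displacement of the nominal trajectory in
    the negative direction of the approach axis a_hat (unit vector).
    tick is the control cycle index; rate (mm per cycle) is a tuning
    parameter, assumed linear here."""
    return [-rate * tick * c for c in a_hat]
```

Each cycle, the offset would be added to the nominal TCP position before the normal seam tracking corrections are applied.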

Power-source parameter settings

The following data was primarily acquired and logged during the second experimental series, consisting of 80 fillet and V-groove experiments: (1) objective, (2) label, (3) Ka, Kn, (4) deviations in y and z directions, (5) welding speed, (6) weaving frequency and amplitude, (7) weaving shape, see Fig. 20, (8) nominal voltage and current, (9) wire speed, (10) offset magnitude calculated by the vector booster in x, y and z directions, (11) a detailed description of the results. The optimal parameters that were experimentally found, giving high welding quality, are presented in Table 1.

Review of fillet welds

Examples of typical fillet and V-groove samples are displayed in Figs. 18 and 19. Information from these initial experiments was used for the modification of the algorithm, followed by a new series of simulations. Some selected fillet and V-groove experiments are presented by photos in Figs. 21 and 24. These experiments are further commented and reviewed below.

1013. Deviation in z direction by 8%, followed by multipass welding.


(Panels: sine, sinesquare, square, sawtooth, triangle, ellipse.)

Figure 20 The implemented weaving shapes in the algorithm. Sinesquare was chosen for fillet and sawtooth for V-groove welds, for the achievement of maximum welding quality, after a series of experiments and consultation with welding expertise. The sinesquare consists of a sine wave with an amplitude of 2 units, truncated at 1 unit.
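The two selected shapes can be sketched directly from the description above; the unit amplitude and the phase parameterization are the only assumptions:

```python
import math

def sinesquare(phase, amplitude=1.0):
    """Sine wave of twice the nominal amplitude, truncated (clipped) at
    the nominal amplitude, giving flat dwell intervals at the seam walls."""
    raw = 2.0 * amplitude * math.sin(phase)
    return max(-amplitude, min(amplitude, raw))

def sawtooth(phase, amplitude=1.0):
    """Linear ramp from -amplitude to +amplitude over one period (2*pi)."""
    frac = (phase / (2.0 * math.pi)) % 1.0
    return amplitude * (2.0 * frac - 1.0)
```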

Parameter                 Fillet weld    V-groove joint
Nominal voltage (V)       28             28.5
Ka                        0.015          0.012
Kn                        50% of Ka      50% of Ka
Weaving frequency (Hz)    3              2
Weaving shape             Sinesquare     Sawtooth
Welding speed (mm/s)      8              3.5
Wire speed (m/min)        9              7
Weaving amplitude (mm)    3              4

Table 1 Recommended parameters using the Migatronic power-source, derived by experiments.


Figure 21 Pictures of some selected fillet welds.


Since Ka and f were too small, seam tracking failed at a deviation of 2.3%. Ka = 0.05, f = 1.5 Hz.

1026. Deviation in y direction by 4%. Kn was too high and caused oscillations in n direction. Ka = 0.05, f = 3 Hz.

1027. Reference welding without seam tracking, nearly optimal with f = 3 Hz. The distinct edges of the weavings indicate too high nominal voltage, cutting too deep into the workpiece.

1032. Deviations in y and z directions by 8%. High welding quality, but this is the limit for positive deviations. Ka = 0.025, f = 3 Hz.

1057. Deviations in both y and z directions by 2% (specifications). High welding quality in both root and multipass layers. Ka = 0.0125, f = 3 Hz.

1062. Final test. Deviation in y direction by 2% (specifications), using power-source compensation. High welding quality. Some instability occurred at the beginning of the seam and the stick-out was 3 mm too large at the end of the seam. Ka = 0.015, f = 3 Hz.

1063, 1065-1066. Final tests. Deviations of -2% for 1063 and 2% for 1065-1066 in y and z directions (specifications), using power-source compensation. High welding quality. Ka = 0.015, f = 3 Hz. Multipass welding in 1066. The stick-out was 3 mm too large throughout the seam for 1063 and 3 mm too large at the end of 1065-1066. Due to interference with the root layer at multipass welding, the last of the three layers deviated by 2 mm at the end of the seam.

Additional information and a summary of the selected fillet experiments in Fig. 21 follow below:

1. High weaving frequency gives fast control response. When the weaving frequency was increased from 1.5 Hz in 1013 to 3 Hz in 1032, the algorithm performance (compensation ability) was doubled.

2. The experiments proved that Kn should be 50% of Ka or less for stable control, which was previously found by simulation.

3. In the presented fillet experiments, Kn was 50% of Ka in all cases except in 1013, where it was 75%. Due to problems in the newly developed robot system (regarding tuning of the motor control unit) the highest weaving frequency that could be used was 2-3 Hz. At 3 Hz, the weaving amplitude was not able to exceed 3 mm despite the setting of 5 mm.

4. The fluctuation that occurred in the beginning of the seam in 1062 was due to the applied automatic calibration method. It was however found that the same reference current value could be used in the algorithm for all fillet welds, eliminating the fluctuations at the start.

5. The stick-out (of the wire) was in some experiments 3 mm larger than desired, according to the welding specialist. This did not affect welding


quality in the root layer, but affected the multipass layers, since subsequent layers interfered with the root layer. This interference was not larger than about 2 mm, but for high welding quality at multipass welding the algorithm should be fine-tuned to be able to reduce the stick-out some millimeters. One way to achieve this is to slightly increase the reference current value. The power-source compensation should also be amplified to avoid collision of the torch with the workpiece.

6. The most important result from this experimental series was that the power-source compensation proved to work properly.

(Plot: arc current (A) versus time (ticks), fillet weld 1057; the lower panel magnifies a segment of the upper.)

Figure 22 Current samples from the first 6 seconds of a fillet weld experiment, using power-source compensation. The unit of the horizontal axis is ticks, which equals 1/50 seconds. The current values are filtered by an active 4th-order Bessel low-pass filter with f0 = 10 Hz, added to the current sensor device delivered by Migatronic. Though some transient spikes occurred in the initial part of the welding, the remaining data had a relatively low level of noise. The lower figure is a magnification of a selected area of the upper.


(Plots: average arc currents Ia, In1 and In2 (A) versus setpoints, fillet weld 1057.)

Figure 23 Average current samples throughout the weld, using power-source compensation. The unit of the horizontal axis is setpoints, which in short welds is equal to one weaving cycle, but in longer welds is an integer multiple of the weaving cycle. The upper figure displays Ia, which is the average current over each setpoint. The compensation in the TCP approach axis is made by multiplying this factor by the gain in a direction, Ka. In the middle and lower figures the average currents In1 and In2 in the surroundings of the turning points of the weaving are displayed. The compensation in n direction is basically determined by Kn multiplied by the difference of In1 and In2.
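Per setpoint, the two corrections described in the caption can be sketched as follows; the use of a calibrated reference current I_ref for the approach axis, and the signs, are assumptions based on the surrounding text:

```python
def arc_sense_correction(Ia, In1, In2, I_ref, Ka, Kn):
    """Per-setpoint corrections from the averaged arc current:
    delta_a -- along the torch approach axis a, from the deviation of
               the setpoint-average current Ia from the reference;
    delta_n -- across the seam (n direction), from the difference of
               the average currents near the two weaving turning points."""
    delta_a = Ka * (Ia - I_ref)
    delta_n = Kn * (In1 - In2)
    return delta_a, delta_n
```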


Review of V-groove welds

Below follow comments and a review of the selected V-groove welding experiments presented in Fig. 24.

2031. Deviation in y direction by -2%. High welding quality. Ka = 0.015, f = 2 Hz.

2033. Deviations in y and z directions by 2%. Seam tracking failed due to small compensation caused by the average current estimator. A 10 mm too large stick-out caused bubbles in the weld and poor welding quality. Ka = 0.020, f = 2 Hz.

2041. Deviations in y and z directions by 2%. High welding quality. Anti-transient current limiters were added to the algorithm and a 230 A threshold was added to the reference current estimator. Ka = 0.012, f = 2 Hz, U = 29 V.

2043. Final test. Deviations in y and z directions by 2%. High welding quality. Basically the same experiment as above, except for U = 28.5 V, for enhanced welding quality. Ka = 0.012, f = 2 Hz.

2047. Final test. Deviation in y direction by 2% with two multipass layers on top of the root layer. High welding quality. Ka = 0.015, f = 2 Hz.

Figure 24 Pictures of some selected V-groove welds.


4 Conclusions

The ROWER-2 specifications were fulfilled by the development and physical validation of a seam tracking algorithm distinguished by:

• Stable control in a and n directions for fillet and V-groove joints (negative a direction for V-grooves based on fillet weld experiments).

• Ability to perform multipass welding using interpolation.

• Maintaining high welding quality throughout the seam by good power-source parameter settings and the use of power-source compensation.

• Design and use of an auto-phase analyzer, calculating the total delay in the robot system for accurate control in n direction (a new version of the FFT algorithm was designed and implemented; this special version worked faster and was simpler to implement for this specific task than the standard FFT).

• Design and implementation of the vector booster method.

• Automatic reference current calibration.

5 Acknowledgements

The ROWER-2 project was funded by the European Commission under the BRITE/EURAM program. Thanks also to the project teams at Tecnomare (Venice), IAI (Madrid) and Fincantieri (Venice) for support in the development of the ROWER-2 system and the experimental studies.

References

[1] S. Adolfsson. Simulation and execution of autonomous robot systems. PhD thesis, Department of Production and Materials Engineering, Lund University, 1998.

[2] M. Arroyo Leon, A. Ruiz Castro, and R. Leal Ascencio. An artificial neural network on a field programmable gate array as a virtual sensor. In Design of Mixed-Mode Integrated Circuits and Applications, 1999. Third International Workshop on, pages 114–117, July 1999.

[3] P. Berbakov, D. MacLauchlan, and D. Stevens. Edge detection and seam tracking with EMATs. Patent No. US6155117, December 2000.


[4] G. Bolmsjö. Knowledge based systems in robotized arc welding. In International Journal for the Joining of Materials, volume 9 (4), pages 139–151, 1997.

[5] G. Bolmsjö. Sensor systems in arc welding. Technical report, Robotics Group, Department of Production and Materials Engineering, Lund University, Sweden, 1997.

[6] G. Bolmsjö. Programming robot welding systems using advanced simulation tools. In Proceedings of the International Conference on the Joining of Materials, pages 284–291, Helsingør, Denmark, May 16-19 1999.

[7] K. Brink, M. Olsson, and G. Bolmsjö. Increased autonomy in industrial robotic systems: A framework. In Journal of Intelligent and Robotic Systems, volume 19, pages 357–373, 1997.

[8] P. Cederberg, M. Olsson, and G. Bolmsjö. A generic sensor interface in robot simulation and control. In Proceedings of Scandinavian Symposium on Robotics 99, pages 221–230, Oulu, Finland, October 14-15 1999.

[9] G. E. Cook, K. Andersen, K. R. Fernandez, M. E. Shepard, and A. M. Wells. Electric arc sensing for robot positioning control. Robot Welding, pages 181–216. Editor: J. D. Lane. IFS Publications Ltd, UK, Springer-Verlag, 1987.

[10] W. Doggett and S. Vazquez. An architecture for real-time interpretation and visualization of structural sensor data in a laboratory environment. In Digital Avionics Systems Conferences, 2000. Proceedings. DASC. The 19th, volume 2, pages 6D2/1–6D2/8, Oct. 2000.

[11] M. Fridenfalk and G. Bolmsjö. The Unified Simulation Environment - Envision telerobotics and Matlab merged in one application. In the Proceedings of the Nordic Matlab Conference, pages 177–181, Oslo, Norway, October 2001.

[12] M. Fridenfalk, U. Lorentzon, and G. Bolmsjö. Virtual prototyping and experience of modular robotics design based on user involvement. In the Proceedings of the 5th European Conference for the Advancement of Assistive Technology, Düsseldorf, Germany, Nov. 1999.

[13] M. Fridenfalk, M. Olsson, and G. Bolmsjö. Simulation based design of a robotic welding system. In Proceedings of the Scandinavian Symposium on Robotics 99, pages 271–280, Oulu, Finland, October 14-15 1999.

[14] G. Ågren and P. Hedenborg. Integration of sensors for robot and welding power supply control. In Proceedings of Fifth International Welding Computerization Conference, pages 198–204, Golden, USA (AWS), August 1994.


[15] P. Hedenborg and G. Bolmsjö. Task control in robotics arc welding by use of sensors. In Proceedings of the Tampere International Conference on Machine Automation - Mechatronics Spells Profitability, IFAC/IEEE, pages 437–445, Tampere, Finland, February 15-18 1994.

[16] K. Hedengren and K. Haefner. Seam-tracking apparatus for a welding system employing an array of eddy current elements. Patent No. US5463201, October 1995.

[17] J. Huissoon and D. Strauss. System and method for tracking a feature on an object using a redundant axis. Patent No. US5465037, November 1995.

[18] L.-O. Larsson and G. Bolmsjö. A model of optical observable parameters to control robotic GMA welding in sheet steel. In Proceedings of the Fifth International Conference on Computer Technology in Welding, pages 2–10, paper 42, Paris, France, June 15-16 1994.

[19] M. Olsson. Simulation and execution of autonomous robot systems. PhD thesis, Division of Robotics, Department of Mechanical Engineering, Lund University, 2002.

[20] M. Oosterom and R. Babuska. Virtual sensor for fault detection and isolation in flight control systems - fuzzy modeling approach. In Decision and Control, 2000. Proceedings of the 39th IEEE Conference on, volume 3, pages 2645–2650, Dec. 2000.

[21] H. Pasika, S. Haykin, E. Clothiaux, and R. Stewart. Neural networks for sensor fusion in remote sensing. In Neural Networks, 1999. IJCNN ’99. International Joint Conference on, volume 4, pages 2772–2776, July 1999.

[22] P. Ridao, J. Battle, J. Amat, and M. Carreras. A distributed environment for virtual and/or real experiments for underwater robots. In Robotics and Automation, 2001. Proceedings 2001 ICRA. IEEE International Conference on, volume 4, pages 3250–3255, May 2001.

[23] B. Rooks. Robot welding in shipbuilding. The Industrial Robot, 24(6):413–417, 1997.

[24] S. Sagatun and S. Hendseth. Efficient off-line programming of robot production systems. In Proceedings of the 27th International Symposium on Industrial Robots. Robotics Towards 2000, pages 787–792, CEU-Centro Esposizioni UCIMU, Cinisello Balsamo, Italy, 1996.

[25] P. Sorenti. Efficient robotic welding for shipyards—virtual reality simulation holds the key. The Industrial Robot, 24(4):278–281, 1997.

[26] Y. Suzuki. Method and apparatus for determining seam tracking control of arc welding. Patent No. EP1077101, February 2001.


[27] J. Villafuerte. Arc welding torch. Patent No. US6130407, October 2000.

[28] M. Xie and G. Bolmsjö. Using expert systems in arc monitoring and causal diagnosis in robotic arc welding. In International Journal for the Joining of Materials, volume 4 (4), 1992.

[29] M. Xie and G. Bolmsjö. Quality assurance and control for robotic GMA welding - part 1: QA model and welding procedure specification. In Joining Sciences, volume 1 (4), 1993.

[30] Yaskawa. Arc Sensor COM-ARC III Instructions.


Paper V

Design and validation of a universal 6D seam tracking system in robotic welding using laser scanning

Mikael Fridenfalk, Gunnar Bolmsjö

Division of Robotics, Department of Mechanical Engineering, Lund University, Box 118, SE-221 00 Lund, Sweden

www.robotics.lu.se

Submitted to the Journal of Industrial Robot, January 2003



Abstract

This paper presents the design and validation of a universal 6D seam tracking system that reduces the need for accurate robot trajectory programming and geometrical databases in robotic laser scanning. The system is able to follow any 3D spline seam in space with a relatively small radius of curvature by the real-time correction of the position and orientation of the welding torch, using a laser scanner in robotic arc or laser welding. This method is further useful for the seam tracking of any seam with a predefined signature of the seam profile, also in non-welding applications, and is able to seam track objects with a very small radius of curvature at pre-scanning, theoretically including also sharp edges.

The 6D seam tracking system was developed in the FUSE simulation environment, integrating software prototyping with mechanical virtual prototyping, based on physical experiments. The validation experiments showed that this system was both robust and reliable and should be able to manage a radius of curvature less than 200 mm. In the pre-scanning mode, a radius of curvature down to 2 mm was managed for pipe intersections at 3 scans/mm, using a laser scanner with an accuracy of 0.015 mm.

1 Introduction

Robot automation increases productivity and product quality, and frees man from involuntary, unhealthy and dangerous work. While computational power has increased exponentially during the last decades, the limitations in flexibility today constitute a bottleneck in the further evolution of robotic systems.


One application for intelligent systems is seam tracking in robotic welding. Seam tracking is, among others, used for the manufacturing of large ships such as cruisers and oil tankers, where relatively high tolerances in the sheet material are allowed to minimize the manufacturing costs [3, 12]. Seam tracking is today typically only performed in 2D, by a constant motion in x and compensation of the errors in y and z directions, see Fig. 1. There exist solutions for higher autonomy that include seam tracking in 3D, where compensation is also performed in o direction or around an orientation axis. These limitations, however, make seam tracking possible only for workpieces with relatively simple geometrical shapes.


Figure 1 Definition of the Tool Center Point (TCP) and the orthonormal coordinate system n, o, a. Here, o is opposite to the direction of welding and perpendicular to the plane Ω, and a is the direction of approach. x, y, z is a local coordinate system, defined for the workpiece.

The next step in the evolution of sensor systems in robotic welding is the introduction of a full 6D sensor guided control system for seam tracking that is able to correct the TCP in x, y, z and around roll, pitch and yaw. Such an ultimate system is per definition able to follow any continuous 3D seam with moderate curvature. This system has many similarities with an airplane control system, where a change in either position or orientation affects all other degrees of freedom.

Though seam tracking has been performed in 2D for at least 40 years, the hypothetical design of 6D seam tracking systems was probably first addressed in the beginning of the last decade by suggestions of models based on force sensors [5, 6] and laser scanners [17, 18]. It is however difficult to evaluate these systems, since no experimental results are presented, nor any explicit control systems for creating balance between the subsystems.

The contribution of this paper is the invention of a universal 6D seam tracking system for robotic welding, validated by simulation experiments based on


physical experiments, that proved that 6D seam tracking is possible and even very robust for laser scanning [13]. On a more detailed level, the contributions of this paper are considered to be the introduction of: (1) differential control in laser scanning, using the same techniques as used in arc sensing [7], (2) trajectory tangent vector control using differential vector interpolation, and (3) pitch control by vector interpolation. These components constitute the foundation of the 6D seam tracking system.

The authors have found only a few references in the literature that describe similar work, namely the introduction of a trajectory tangent vector by curve fitting [17, 18]. There exist however some essential differences between how such a vector was used and implemented. The differences consist of (1) curve fitting by 2nd instead of 3rd degree polynomials, for faster computation and still high control stability, (2) using an analytic solver for curve fitting of 2nd degree polynomials, developed and implemented for this system, increasing the calculation speed further, and (3) using differential vector interpolation instead of direct use of the trajectory tangent vector, which proved to be essential for maintaining control stability.
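A sketch of the trajectory tangent vector by 2nd-degree curve fitting: each coordinate of a window of recent seam positions is fitted as a quadratic in a scalar parameter and differentiated at the newest point. The parameterization by sample index and the delegation of the least-squares solve to numpy are simplifying assumptions; the paper's analytic solver is not reproduced here.

```python
import numpy as np

def trajectory_tangent(points):
    """Fit p(t) = c0 + c1*t + c2*t^2 per axis over recent seam positions
    and return the unit tangent dp/dt at the newest point."""
    pts = np.asarray(points, dtype=float)
    t = np.arange(len(pts), dtype=float)
    A = np.vander(t, 3, increasing=True)            # columns: 1, t, t^2
    coef, *_ = np.linalg.lstsq(A, pts, rcond=None)  # shape (3, dim)
    tangent = coef[1] + 2.0 * coef[2] * t[-1]       # derivative at newest t
    return tangent / np.linalg.norm(tangent)
```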

A 6D seam tracking system increases the intelligence and flexibility in manufacturing systems based on robotic welding using laser scanning. This reduces the need for accurate robot trajectory programming and geometrical databases. The simulation results (based on physical experiments) showed that the initial objective, to design a 6D seam tracking system that could manage a radius of curvature down to 200 mm, was reached. Further on, if low-speed seam tracking is performed without welding, to find the geometry of the workpiece, there is basically no limit to how small a radius of curvature this system is able to manage.

2 Materials and methods

2.1 Sensor guided robot control

Many systems have been developed during the last decades for seam tracking at arc welding. The patents and scientific publications within this application area show that laser scanners and arc sensors today replace older systems. Laser scanners have in general high accuracy, but are expensive and have to be mounted on the torch, thereby decreasing the workspace of the robot. This problem is partly solved by the introduction of an additional motion that rotates the scanner around the torch to keep its relative orientation constant with respect to the seam profile. In addition, it should be mentioned that there exist solutions for digital image processing of the workpiece before welding. The problem is, however, that small changes in the surrounding light may disturb the analysis. Further on, 3D spline curves require the camera to follow the seam all the way, to be able to capture segments that otherwise would have remained


hidden. This applies also to very long seams. In practical applications today, solutions based on pure image processing by a CCD camera are successfully used to find the initial point where 2D seam tracking, based on laser scanning or arc sensing, should start [12].

Laser scanning

In laser scanning, the seam profile is extracted by periodically scanning the laser beam across a plane perpendicular to the direction of the seam. The laser scanner is mounted on the torch in front of the direction of welding. As a general rule, the control system is based on full compensation of the position errors, which is equal to a constant value of K1 = 1, as defined in Fig. 2.

(Diagram labels: p_{N-1}, a_{N-1}, n_{N-1}; p', a', n'; a'', n''; p_N; K1(p' - p_{N-1}); K3(a' - a_{N-1}); α2; v0, v1, D.)

Figure 2 Cross section, plane Ω. Vector interpolation used for position and pitch control at laser scanning. The input from the laser scanner system is the intersection point of the seam walls v0 and the unit vector v1 on Ω between the seam walls. D denotes the distance between v0 and p'.
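A minimal sketch of the vector interpolation used for position and pitch control. The update rule, stepping from the previous value toward the measured target by a fraction K (so that K1 = 1 gives full compensation of the position errors), is an assumed but natural reading of Fig. 2:

```python
import numpy as np

def vector_interpolate(prev, target, K):
    """Differential vector interpolation: move from the previous value
    toward the measured target by the fraction K; K = 1 compensates
    the error fully in one step."""
    prev = np.asarray(prev, dtype=float)
    target = np.asarray(target, dtype=float)
    return prev + K * (target - prev)

def pitch_update(a_prev, a_meas, K3):
    """Pitch control: interpolate the approach vector a and re-normalize
    so that it stays a unit vector."""
    a_new = vector_interpolate(a_prev, a_meas, K3)
    return a_new / np.linalg.norm(a_new)
```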

Arc sensing

Another method used for seam tracking is “through-arc” sensing, or simply arc sensing [7]. Here, the arc current is sampled while a weaving motion is added to the TCP in n direction, see Fig. 1. Since the distance from the TCP to the nearest seam wall may be approximately calculated as a function of the current, it is possible to compensate for deviations in the seam by analysis of the current measurements. Arc sensors are inexpensive and do not decrease the workspace of the robot, but are, compared with other sensors for seam tracking, relatively inaccurate due to the stochastic nature of arc welding. Since welding of large structures in general requires relatively low accuracy, however, arc sensors are a competitive alternative to laser scanners. Arc sensing may also be used as a complementary sensor to laser scanners in some cases where the laser scanner is unable to access the seam track.


Virtual sensors

The development of sensor guided control systems may be accelerated using virtual sensors in the simulation control loop. Since a system’s state is often easily obtained in simulation, assuming that the simulation model is sufficiently accurate compared to the real system, this provides a better foundation for process analysis than is in general possible by physical experiments. Further on, insertion of a sensor in a control loop may cause damage to expensive equipment, unless the behavior of the entire sensor guided control system is precisely known, which is rarely the case at the development stage.

Virtual sensors are used in many application areas, such as robotics, aerospace and marine technologies [4, 9, 16, 19, 21]. Development of new robot systems, such as for seam tracking, may be accelerated by the application of simulation [14, 15]. In general, the design methodology of virtual sensors may vary due to their specific characteristics. If a sensor model is not analytically definable, artificial neural networks or statistical methods may be used to find appropriate models [2, 20].

2.2 Robot system

The design of the 6D seam tracking system was based on experiments and results from the European ROWER-2 project [1]. The objective of this project was to automate the welding process in shipbuilding by the design of a robotic system, specifically for joining double-hull sections in super-cruisers and oil tankers. Figure 3 shows the robot system developed in the ROWER-2 project and the corresponding simulation model in FUSE (described below). The implementation of the 3D seam tracking system was performed in C/C++, running on a QNX-based embedded controller (QNX is a real-time operating system for embedded controllers). The robot was manufactured partly in aluminum alloy to decrease the weight for easy transportation and was mounted on a mobile platform with 1 degree of freedom for increased workspace. The robot system was integrated and successfully validated on-site at a shipyard.

2.3 Simulation system

The development and implementation of the 6D seam tracking system was initially performed in the Flexible Unified Simulation Environment (FUSE) [11]. FUSE is an integrated software system based on Envision (robot simulator, Delmia Inc.) and Matlab (MathWorks Inc.), including the Matlab Robot Toolbox [8]. The integration was performed by compiling these software applications and their libraries into one single software application, running under IRIX, SGI. The methodology developed for the design of the 6D seam tracking system was based on the integration of mechanical virtual prototyping and software prototyping. The 6D seam tracking simulations in FUSE were performed using the robot developed in the ROWER-2 project, see Fig. 3, and an ABB S4 IRB 2400.16 robot. After the design and successful validation of a 6D seam tracking system for arc sensing, the virtual prototyping and software models, partly written in C code and partly in Matlab code, were translated to pure Matlab code. The Matlab models were modified to work also for laser scanners by the addition of a virtual model of the Servo-Robot M-SPOT laser scanner system. The translation was necessary since it saved development time for the implementation of the evaluation system, used for the accurate calculation of position and orientation errors during simulation.

Figure 3 The design of the robot system for 3D seam tracking based on arc sensing in the ROWER-2 project (left) was based on simulations in FUSE (right). The 6D seam tracking system presented in this paper was developed based on experience and results gained from the ROWER-2 project.

Figure 4 Example of a 6D seam tracking experiment performed in Matlab. Each laser scan is plotted at each control cycle while seam tracking is performed 180° around a pipe intersection. The workpiece object corresponds to the object in the upper experiment in Fig. 7.


2.4 Calculation of position and orientation errors

Error is defined as the difference between desired and current pose P. The position errors were calculated by projection of εa and εn on a plane Ωt perpendicular to the desired trajectory while intersecting the current TCP, Pt. Thereby, per definition, εo = 0. The orientation error (the orientation part of Pδ) is calculated by:

Pδ = Pt^−1 · PΩt    (1)

where Pt is the current TCP and PΩt is the desired TCP for any sample point t. The errors around n, o and a were calculated by the subsequent conversion of the resulting transformation matrix to roll, pitch and yaw, here defined as rotation around, in turn, a, o and n. Rotation in the positive direction is defined as counterclockwise rotation from the perspective of a viewer when the axis points towards the viewer.
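For the rotational part of Eq. (1), the inverse of an orthonormal rotation matrix is its transpose, and for small errors the angle components can be read from the skew-symmetric part of the relative rotation. The sketch below uses plain 3×3 lists and a small-angle approximation rather than the full roll-pitch-yaw conversion; the helpers are illustrative, not the thesis implementation.

```python
import math

def mat_T(R):
    """Transpose of a 3x3 matrix (inverse for orthonormal rotations)."""
    return [[R[j][i] for j in range(3)] for i in range(3)]

def mat_mul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def rotation_error(R_cur, R_des):
    """Relative rotation R_cur^-1 * R_des; for small errors the
    rotation-vector components follow from its skew-symmetric part."""
    R = mat_mul(mat_T(R_cur), R_des)
    return [(R[2][1] - R[1][2]) / 2.0,
            (R[0][2] - R[2][0]) / 2.0,
            (R[1][0] - R[0][1]) / 2.0]

# A 1-degree rotation about z, measured relative to the identity frame:
t = math.radians(1.0)
Rz = [[math.cos(t), -math.sin(t), 0.0],
      [math.sin(t),  math.cos(t), 0.0],
      [0.0, 0.0, 1.0]]
err = rotation_error([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]], Rz)
```

Only the z-component of the error is nonzero here, and for small angles it is close to the angle itself.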

3 Modeling

In this section, the design of the 6D seam tracking system is described. The principal control scheme of this system is presented in Fig. 5. The control system is based on three main components: position, trajectory tangent vector, and pitch control. Pitch denotes rotation around o. The main input of this control system is the 4×4 transformation matrix PN−1, denoting current TCP position and orientation. The output is the next TCP, PN. N is the number of samples used for orientation control. The input parameters are N, the proportional control constants K1, K2 and K3, and the generic sensor input values ξ = [v0 v1] and ζ = v1, where v0 and v1 are defined in Fig. 2. The proportional constants K1, K2 and K3 denote the control response to errors in position, trajectory tangent vector and pitch control. N is also used in trajectory tangent vector control and is a measure of the memory size of the algorithm. The seam profile presented in Fig. 1 is a fillet joint. Since v0 and v1 may be extracted for any joint with a predefined signature, no other joints were required in the experiments.

3.1 Position control

The 6D seam tracking algorithm starts with the calculation of the next position for the TCP. The next position pN is calculated by vector interpolation based on K1 in Fig. 2, between the current position pN−1 and the position p′ extracted from the laser scanner.
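The vector interpolation can be sketched in a few lines; p_prime stands for the seam position extracted from the scanner and is an assumed input:

```python
def interpolate(p_prev, p_prime, k1):
    """Next TCP position: move the fraction K1 of the way from the
    current position toward the sensor-extracted seam position."""
    return [p + k1 * (q - p) for p, q in zip(p_prev, p_prime)]

# Halfway between the current position and the extracted seam position:
p_next = interpolate([0.0, 0.0, 0.0], [1.0, 2.0, 0.0], 0.5)
```

With K1 = 1 the TCP jumps directly to the extracted position; as K1 approaches 0 the sensor input is ignored.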


[Figure 5 block diagram: position control (K1), trajectory tangent vector control (K2) and pitch control (K3), with sensor inputs ξ and ζ, delay elements z⁻¹ over the positions p1...pN, and the frame outputs nN, oN, aN, pN assembled into PN.]

Figure 5 The principal scheme of the 6D seam tracking control system. PN−1 and PN denote current and next TCP. PN is calculated for each control sample designated for the seam tracking motion.

3.2 Trajectory tangent vector control

The positions p1...pN are used for trajectory estimation by curve fitting, see Fig. 6. To decrease calculation time, an analytical estimator was devised for 1st and 2nd degree polynomials that showed to work much faster than traditional matrix multiplication followed by inversion techniques using Gauss elimination. The estimation of the trajectory at N gives o′. Since an attempt to abruptly change the direction of o from oN−1 to o′ would most probably create instability in the system, the motion between oN−1 and o′ is balanced by vector interpolation using K2. The scalar κ2 is not explicitly used in the calculations, but is used for visualization of the calculation of oN in Fig. 6.

3.3 Pitch control

The first step in pitch control is performed by vector interpolation between aN−1 and a′, using K3, see Fig. 2. In a second step, aN is calculated by orthogonally projecting a′′ on a plane perpendicular to oN and normalization of the resulting vector. Finally, nN is calculated by the cross product oN × aN.
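The projection and frame-completion steps can be sketched directly; the list-based vector helpers are illustrative, not the thesis implementation:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cross(u, v):
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def normalize(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def pitch_control(a_pp, o_n):
    """Project a'' onto the plane perpendicular to oN, normalize the
    result to obtain aN, and complete the frame with nN = oN x aN."""
    a_n = normalize([a - dot(o_n, a_pp) * o for a, o in zip(a_pp, o_n)])
    return a_n, cross(o_n, a_n)

# Example with illustrative vectors: a'' slightly tilted, oN along y.
a_n, n_n = pitch_control([0.1, 0.2, -1.0], [0.0, 1.0, 0.0])
```

After the projection, aN is unit length and orthogonal to oN, so the resulting frame stays orthonormal.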


[Figure 6 diagram: past positions pN−4...pN−1, the current and next position, the fitted curve X = c0 + c1 t + c2 t², the tangent F = ∂X/∂t at t = N, and the interpolation K2(o′ − oN−1) giving oN/κ2.]

Figure 6 Trajectory tangent vector control is performed by the least square curve fitting of the last N trajectory points to a 2nd degree polynomial curve for x, y and z, followed by vector interpolation. Here N = 5. X, c0, c1 and c2 denote vector entities. The trajectory tangent vector F is for a 2nd degree polynomial equal to c1 + 2Nc2.

3.4 Code and pseudo code samples

The actual implementation of the 6D seam tracking system showed to be straightforward. Most of the development effort was spent on the design of the surrounding environment of the seam tracking system, including a semi-automatic testing and evaluation system. In simulation, the plane Ω is assumed to intersect the TCP point. In reality however, the laser scanner is mounted on the torch, often several centimeters ahead in the direction of motion. The transformation to such a system at real-time control is performed by storing the trajectory data for use with a time delay. A pseudo code representation of the 6D seam tracking system in Fig. 5 follows below:

1. pN = pN−1 − rw · oN−1 + K1 · (v0 + D · v1 − pN−1), where v0, v1 and D are defined in Fig. 2 and rw is the nominal displacement in the direction of welding during a control sample.

2. Calculation of xN by shifting xN−1 one step to the left with pN: xN−1 = [p0 p1 ... pN−2 pN−1] → xN = [p1 p2 ... pN−1 pN].

3. Estimation of the trajectory tangent vector F by least square curve fitting of the parameterized 3D curve X = C · T with a parameter t to three independent second-degree polynomials (for x, y and z). F = ∂X/∂t|t=N with X = C · T, where C = [c0 c1 c2] = [cx cy cz]T and T = [1 t t2]T. (Note that AT is the transpose of a vector or matrix A and has no correlation with the vector T.) This gives F = C · ∂T/∂t|t=N = C · [0 1 2t]T|t=N = C · [0 1 2N]T. C is calculated by the estimation of the parameters ci for i = x, y, z, where ci = (ATA)−1(ATpi) and pi are the row vectors of xN, which may also be written as xN = [px py pz]T.


The matrix A consists of N rows of [1 t t2] with t = 1...N, where t is the row number. It should be mentioned that C may be directly calculated by the expression C = ((ATA)−1(ATxTN))T, but we have chosen to use the notation above for pedagogical reasons.

4. oN = normalize(oN−1 − K2 · (normalize(F) + oN−1)). Normalization of a vector v denotes v divided by its length.

5. a′′ = aN−1 −K3·(aN−1 + v1).

6. aN = normalize(a′′ − dot(oN, a′′) · oN), where dot(oN, a′′) is the dot product oNT · a′′.

7. PN = [ oN × aN   oN   aN   pN
          0          0    0    1 ]
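The orientation part of the cycle (steps 4-6, completed by nN from step 7) can be condensed into one sketch; the vector helpers and the example inputs, including the sensor value v1, are illustrative assumptions chosen to form a steady-state case:

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def cross(u, v):
    return [u[1] * v[2] - u[2] * v[1],
            u[2] * v[0] - u[0] * v[2],
            u[0] * v[1] - u[1] * v[0]]

def normalize(v):
    n = math.sqrt(dot(v, v))
    return [x / n for x in v]

def orientation_update(o_prev, a_prev, F, v1, k2, k3):
    """Steps 4-6: o is interpolated toward -normalize(F) (o opposes the
    welding direction), a'' toward -v1, and aN is the normalized
    projection of a'' on the plane perpendicular to oN; nN = oN x aN."""
    f = normalize(F)
    o_n = normalize([o - k2 * (fi + o) for o, fi in zip(o_prev, f)])
    a_pp = [a - k3 * (a + v) for a, v in zip(a_prev, v1)]
    a_n = normalize([a - dot(o_n, a_pp) * o for a, o in zip(a_pp, o_n)])
    return cross(o_n, a_n), o_n, a_n

# Welding along +x, torch approaching along -z: the frame is already
# consistent with the sensor data, so the update leaves it unchanged.
frame = orientation_update([-1.0, 0.0, 0.0], [0.0, 0.0, -1.0],
                           [1.0, 0.0, 0.0], [0.0, 0.0, 1.0],
                           k2=0.8, k3=0.8)
```

The example shows the fixed-point property of the interpolation: when the estimated tangent and sensor direction agree with the current frame, the frame is reproduced exactly.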

The initial value of x is determined by linear extrapolation with respect to the estimated direction at the beginning of the seam. The elements of F(i) with i = 1, 2, 3 for x, y, z were here calculated by the Matlab function pfit((1:N)',x(i,:)') (the input parameters x and y in this function are [1 2 ... N]T and pi as defined above). The pfit function was developed in this work using Maple (Waterloo Maple Inc.) and gives an analytical solution to 2nd degree polynomial curve fitting for this application, which works 8 times faster for N = 10 and 4 times faster for N = 100 than the standard Matlab function polyfit. The analytical function pfit was implemented as follows:

function k = pfit(x,y)
n = length(x); sx = sum(x); sy = sum(y); sxx = x'*x; sxy = x'*y;
sx3 = sum(x.^3); sx4 = sum(x.^4); sxxy = sum(x.^2.*y);
t2 = sx*sx; t7 = sx3*sx3; t9 = sx3*sxx; t12 = sxx*sxx;
t15 = 1/(sx4*sxx*n-sx4*t2-t7*n+2.0*t9*sx-t12*sxx);
t21 = (sx3*n-sxx*sx)*t15;
k = 2.0*n*((sxx*n-t2)*t15*sxxy-t21*sxy+(sx3*sx-t12)*t15*sy)- ...
    t21*sxxy+(sx4*n-t12)*t15*sxy+(t9-sx4*sx)*t15*sy;

The calculation speed was optimized by the application of Cramer's rule, using an analytical inversion of ATA in Maple and the elimination of c0. In the function above, x and y are N×1 column vectors, x' equals xT, sum(x) is the sum of the vector elements of x, and a point before an operator, such as the exponentiation operator ^, denotes elementwise operation. This implementation may be further optimized by reusing terms that only depend on x in pfit (which is equal to t outside pfit). Since such optimization is however outside the scope of this paper, we have chosen to keep this presentation as simple as possible.
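For checking, the closed form transcribes directly to Python (plain lists instead of Matlab column vectors); this is a sketch of the same algebra, not the thesis code. On exact parabolic data the least-squares fit recovers the coefficients, so the returned tangent c1 + 2Nc2 is known in advance:

```python
def pfit(x, y):
    """Tangent c1 + 2*n*c2 of the least-squares parabola through
    (x[i], y[i]), via Cramer's rule with c0 eliminated."""
    n = len(x)
    sx, sy = sum(x), sum(y)
    sxx = sum(v * v for v in x)
    sxy = sum(u * v for u, v in zip(x, y))
    sx3 = sum(v ** 3 for v in x)
    sx4 = sum(v ** 4 for v in x)
    sxxy = sum(u * u * v for u, v in zip(x, y))
    t2, t7, t9, t12 = sx * sx, sx3 * sx3, sx3 * sxx, sxx * sxx
    t15 = 1.0 / (sx4 * sxx * n - sx4 * t2 - t7 * n + 2.0 * t9 * sx - t12 * sxx)
    t21 = (sx3 * n - sxx * sx) * t15
    return (2.0 * n * ((sxx * n - t2) * t15 * sxxy - t21 * sxy
                       + (sx3 * sx - t12) * t15 * sy)
            - t21 * sxxy + (sx4 * n - t12) * t15 * sxy
            + (t9 - sx4 * sx) * t15 * sy)

# Exact parabola y = 1 + 2t + 3t^2 with N = 10: the tangent at t = N
# is c1 + 2*N*c2 = 2 + 60 = 62.
xs = list(range(1, 11))
ys = [1 + 2 * t + 3 * t * t for t in xs]
k = pfit(xs, ys)
```

The same check against known coefficients is a convenient unit test for any hand-derived closed-form estimator.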


3.5 Computational costs

The computational costs for the 6D seam tracking system based on laser scanning (without consideration of memory operations or index incrementation) were preliminarily estimated at fewer than 250 + 50N floating-point operations for the implementation presented in this paper, which for N = 10 gives 750 operations per control cycle. At a rate of maximum 30 scans per second, this gives fewer than 22,500 floating-point operations per second.
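The budget is simple to reproduce; the function below merely restates the 250 + 50N estimate from the text:

```python
def flops_per_cycle(n_samples):
    """Upper bound on floating-point operations per control cycle,
    using the 250 + 50*N estimate from the text."""
    return 250 + 50 * n_samples

cycles_per_second = 30  # maximum scan rate of the laser scanner
total = flops_per_cycle(10) * cycles_per_second  # -> 22500 flops/s
```

Even for a generous N this is negligible on any embedded controller, which is the point of the estimate.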

3.6 Kinematics singularities

In sensor guided control, the manipulator may involuntarily move into areas in which no inverse kinematics solutions exist, generally called kinematics singularities. It is however possible to minimize inner singularity problems caused by robotic wrists. A novel method is suggested in [10], which was designed for the 6D seam tracking system but works basically for any industrial robot with 6 degrees of freedom. In this method, the stronger motors, often at the base of the robot, assist the weaker wrist motors to compensate for any position error. This method also allows for smooth transitions between different configurations.

In addition to this, in one-off production or for generation of a reference program, pre-scanning and a trial run may be performed before welding. In the first step, low-speed (or high-speed) seam tracking is performed to calculate the trajectory of the TCP, following the seam. In the second step, a trial run is performed at full speed to check the performance of the robot, and finally the actual welding is performed.

3.7 Initial and final conditions

In the experiments, seam tracking was initiated from a proper position already from the start. The start position for seam tracking is in general found either by image analysis using a camera or by seam tracking itself. Since previous positions are unknown at start, an estimation is made of the previous N−1 positions by extrapolation of the trajectory in the direction of o. This is the reason why, in some experiments, higher orientation errors occurred at the beginning of seam tracking. The condition to stop seam tracking may be based on the position of the TCP. The stop signal may be triggered by the definition of volumes outside which no seam tracking is allowed.
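The start-up extrapolation can be sketched as follows; the function name, the spacing rw per control sample and the example values are illustrative. Previous positions are synthesized behind the start point along o (recall that o points opposite to the welding direction):

```python
def init_history(p_start, o_dir, r_w, n_samples):
    """Synthesize N-1 'previous' positions by stepping from the start
    point along o, spaced r_w apart; oldest first, start point last."""
    hist = [[p + k * r_w * o for p, o in zip(p_start, o_dir)]
            for k in range(n_samples - 1, 0, -1)]
    return hist + [list(p_start)]

# Welding along +x, so o = [-1, 0, 0]; N = 10 points ending at the start.
x0 = init_history([0.0, 0.0, 0.0], [-1.0, 0.0, 0.0], 0.33, 10)
```

This gives the trajectory estimator a straight-line history at t = 0, which explains the larger orientation errors observed at the beginning of some experiments: the synthetic history ignores the true curvature of the seam.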


4 Experimental results

4.1 Workpiece samples

The position and orientation errors at seam tracking were calculated based on simulation experiments of basically 8 different workpieces, see Figs. 7-11. It should be mentioned that the experiments in the bottom of Fig. 8 and Fig. 9 are not actually possible in practice due to collision problems, but are performed for the analysis of the control system. The workpieces in Matlab were approximated by lines orthogonal to the direction of the weld, and the resolution was defined as the nominal distance between these lines. The resolution was here set to 0.02 mm. The reason why no noise was added to the profile obtained from the laser scanner (except for perhaps decreasing the number of measurement points compared to the M-SPOT, which may use up to 512 points per scan) was that a large number of measurements allows for high-precision estimations of v0 and v1, with a constant offset error less than the accuracy of the laser scanner. Since the 6D seam tracking system corrects the position and the orientation of the TCP relative to the extracted profile, such an error will not influence the behavior of the control system more than highly marginally, by basically adding a small offset error to the experimental results in Figs. 7-11.

[Figure 7 plots: pipe intersection with r = 10 mm and R = 2r (top) and workpiece with r = 2 mm (bottom); angular error γ (deg) and position error ε (mm) versus θ (deg) over 0-180°.]

Figure 7 Seam tracking, laser scanning. Pipe intersection (top) and workpiece for isolating control around a (bottom). K1 = 1.0, K2 = K3 = 0.8, N = 10. Error denotes the difference between desired and current pose.

The workpieces are categorized into two groups, continuous and discrete.


[Figure 8 plots: inner curve with r = 5 mm and outer curve with r = 10 mm; angular error γ (deg) and position error ε (mm) versus θ (deg) over 0-180°.]

Figure 8 Seam tracking, laser scanning. Isolating rotation around n, following inner and outer curves. Note that in this figure, the case in the bottom is not very realistic, but only included for the completeness of the experiments. K1 = 1.0, K2 = K3 = 0.8, N = 10.

[Figure 9 plots: r = 50 mm; angular error γ (deg) and position error ε (mm) versus θ (deg) over 0-180°.]

Figure 9 Seam tracking, laser scanning. The same as in the previous figure, except for K2 = 0.1 and r = 50 mm. This shows that K2 = 0.1 works fine as well for continuous seams with moderate radius of curvature. Further, the same experiment was also performed with r = 20 mm, which showed to be basically identical to this figure, but with saturation at 12 degrees instead of 4. Note that while other laser scanning experiments were performed on the verge of instability to find the limits of the seam tracking system, using the M-SPOT laser scanner, the system is highly stable at moderate curvatures using low values for K2.


[Figure 10 plots: steps of ∆a = 5 mm (top) and ∆a = −5 mm (bottom); angular error γ (deg) and position error ε (mm) versus x (mm).]

Figure 10 Seam tracking, laser scanning. The step is introduced at x = 0 mm. K1 = 1.0, K2 = 0.1, K3 = 0.8, N = 10.

[Figure 11 plots: a step of ∆n = 5 mm (top) and a −30° step around o (bottom); angular error γ (deg) and position error ε (mm) versus x (mm).]

Figure 11 Seam tracking, laser scanning. In this figure, the case in the bottom denotes a step of −30° performed around o. K1 = 1.0, K2 = 0.1, K3 = 0.8, N = 10.


In the experiments presented in this paper, the range was 0-180° for continuous, and 12 mm for discrete workpieces. In seam tracking of a continuous workpiece, a sudden change is interpreted as a curve with a small radius of curvature, see Figs. 7-8, while in seam tracking of a discrete workpiece a sudden change is regarded as a disturbance, see Figs. 10-11. The settings of the control parameters alone decide the mode of the control system: continuous or discrete. For seam tracking of objects with a large radius of curvature, the discrete mode works well for both continuous and discrete workpieces, see Fig. 9.

The control parameters K1, K2, K3 and N used in the presented experiments showed to work well for each category, continuous and discrete. These settings were found by more than 50 laser scanning simulation experiments in Matlab, and large deviations from these parameters showed to create instability in the system, making the control components work against each other instead of in synergy, resulting in divergence from the seam and ultimately failure at seam tracking.

The seam tracking system is theoretically able to handle any angle α (in Fig. 2) between 0 and 180°. In practice however, a small angle may cause collision between torch and workpiece. For simplicity, α was set to 90° for all experiments except for pipe intersections.

4.2 Laser scanning

The reference laser scanner, M-SPOT, used in the experiments has a sweeping angle of 28°, an accuracy better than 0.02 mm at a distance of 100 mm and 1.5 mm at a distance of 1 m. It is able to scan at a frequency of up to 40 Hz with a resolution of maximum 512 points per scan. The virtual model of the M-SPOT that was designed was programmed to have an accuracy of 0.02 mm and was mounted on the torch at a distance of approximately 80 mm from the workpiece. The resolution was set to 11 points per scan for a sweeping angle of 5° (the only exception was steps in the n direction, see Fig. 11, where 31 points per scan with a sweeping angle of 10° were used instead) and the scanning frequency to 30 Hz, which showed to be fully sufficient for successful seam tracking.

In fact, the simulation experiments proved that the accuracy of the M-SPOT was higher than needed for seam tracking. The accuracy is so high at close range (100 mm) that only robots with very high performance are able to use the accuracy of this sensor to its full extent when seam tracking is performed in real-time. The experiments presented in Figs. 7-8 and 10-11 show the limits of performance for the 6D seam tracking system, when it is at the verge of instability. For larger radii of curvature and a lower value for K2, i.e. for normal operation, Fig. 9 gives perhaps a better indication of the efficiency of the 6D seam tracking system. Since the performance of a robot is dependent on current joint values, it is further not possible to translate these values directly to joint positions, velocities and accelerations. What these experiments proved, however, was that it was the performance of the robot that constituted the limitations in seam tracking based on laser scanning.

The laser scanning simulations presented in this paper were performed with a constant welding speed of 10 mm/s (which equals 3 scans/mm at a scanning frequency of 30 Hz) and a virtual TCP, 2 mm away from the intersection point of the seam walls. Empirical experiments showed that by moving the TCP from previously D = 15 mm in Fig. 2 to only 2 mm, the control ability of the 6D seam tracking system was highly improved, especially in seam tracking of objects with a small radius of curvature. Though the parameter settings K1 = K2 = K3 = 1 showed to be unstable in laser scanning, K1 = 1, K2 = K3 = 0.8 showed to be relatively stable for continuous welds. For discrete workpieces, K2 was set as low as 0.1 to maximize control stability.

5 Results and discussion

A 6D seam tracking system was designed and validated by simulation that is able to follow any continuous 3D seam with moderate curvature, using a laser scanner, and is able to continuously correct both position and orientation of the tool-tip of the welding torch along the seam. Thereby the need for robot trajectory programming or geometrical databases is practically eliminated, except for start and end positions.

Simulations based on the M-SPOT laser scanner showed that 6D seam tracking was theoretically possible for objects with a radius of curvature down to 2 mm. Seam tracking in real-time requires however a robot with very high performance. It is however possible to perform seam tracking without welding to find the geometry of the workpiece, and perhaps even to check that the trajectory is within the workspace of the robot, before actual welding is performed. Thereby it will be possible to perform welding of objects with a small radius of curvature, theoretically also including sharp edges. This is much better than the initial objective to design a 6D seam tracking system that could manage a radius of curvature down to 200 mm, which is still considered relatively small for robotic welding in, for instance, shipbuilding applications.

6 Future work

Though the 6D seam tracking system showed to be very robust, optimization of the algorithm is still possible. The control system worked well using proportional constants alone, but if needed, for instance PID or adaptive control may be included to enhance the performance. As an example, the control parameter K2 could be defined as a function of the "radius of curvature" (or more specifically as a function of, for example, c2 in Fig. 6) of the 2nd degree polynomial used for calculation of the trajectory tangent vector, thereby enabling the system to automatically change between normal operation, used for maximum control stability, and extraordinary operation with fast response, used for management of small radii of curvature.

References

[1] European Commission. ROWER-2: On-Board Automatic Welding for Ship Erection – Phase 2. Record Control Number 39785. www.cordis.lu, update 2002-11-05.

[2] M. A. A. Arroyo Leon, A. Ruiz Castro, and R. R. Leal Ascencio. An artificial neural network on a field programmable gate array as a virtual sensor. In Design of Mixed-Mode Integrated Circuits and Applications, 1999. Third International Workshop on, pages 114–117, July 1999.

[3] G. Bolmsjö. Sensor systems in arc welding. Technical report, Robotics Group, Department of Production and Materials Engineering, Lund University, Sweden, 1997.

[4] G. Bolmsjö. Programming robot welding systems using advanced simulation tools. In Proceedings of the International Conference on the Joining of Materials, pages 284–291, Helsingør, Denmark, May 16-19 1999.

[5] H. Bruyninckx, J. De Schutter, and B. Allotta. Model-based constrained motion: a. modeling, specification and control. In Advanced Robotics, 1991. 'Robots in Unstructured Environments', 91 ICAR., Fifth International Conference on, volume 2, pages 976–981, 1991.

[6] H. Bruyninckx, J. De Schutter, and B. Allotta. Model-based constrained motion: b. introducing on-line identification and observation of the motion constraints. In Advanced Robotics, 1991. 'Robots in Unstructured Environments', 91 ICAR., Fifth International Conference on, volume 2, pages 982–987, 1991.

[7] G. E. Cook, K. Andersen, K. R. Fernandez, M. E. Shepard, and A. M. Wells. Electric arc sensing for robot positioning control. Robot Welding, pages 181–216. Editor: J. D. Lane. IFS Publications Ltd, UK, Springer-Verlag, 1987.

[8] P. Corke. A robotics toolbox for MATLAB. IEEE Robotics and Automation Magazine, 3(1):24–32, Mar. 1996.

[9] W. Doggett and S. Vazquez. An architecture for real-time interpretation and visualization of structural sensor data in a laboratory environment. In Digital Avionics Systems Conferences, 2000. Proceedings. DASC. The 19th, volume 2, pages 6D2/1–6D2/8, Oct. 2000.

[10] M. Fridenfalk. Eliminating position errors caused by kinematics singularities in robotic wrists for use in intelligent control and telerobotics applications. Patent application SE0203180-5, October 2002.

[11] M. Fridenfalk and G. Bolmsjö. The Unified Simulation Environment — Envision telerobotics and Matlab merged in one application. In the Proceedings of the Nordic Matlab Conference, pages 177–181, Oslo, Norway, October 2001.

[12] M. Fridenfalk and G. Bolmsjö. Design and validation of a sensor guided robot control system for welding in shipbuilding. Accepted for future publication in the International Journal for the Joining of Materials, 2002.

[13] M. Fridenfalk and G. Bolmsjö. Intelligent system eliminating trajectory programming and geometrical databases in robotic welding. Patent application SE0203239-9, November 2002.

[14] M. Fridenfalk, U. Lorentzon, and G. Bolmsjö. Virtual prototyping and experience of modular robotics design based on user involvement. In the Proceedings of the 5th European Conference for the Advancement of Assistive Technology, Düsseldorf, Germany, Nov. 1999.

[15] M. Fridenfalk, M. Olsson, and G. Bolmsjö. Simulation based design of a robotic welding system. In Proceedings of the Scandinavian Symposium on Robotics 99, pages 271–280, Oulu, Finland, October 14-15 1999.

[16] G. Hailu and A. Zygmont. Improving tracking behavior with virtual sensors in the loop. In Aerospace Conference, 2001, IEEE Proceedings., volume 4, pages 1729–1737, 2001.

[17] N. Nayak and A. Ray. An integrated system for intelligent seam tracking in robotic welding. Part I. Conceptual and analytical development. In Robotics and Automation, 1990. Proceedings., 1990 IEEE International Conference on, volume 3, pages 1892–1897, 1990.

[18] N. Nayak and A. Ray. An integrated system for intelligent seam tracking in robotic welding. Part II. Design and implementation. In Robotics and Automation, 1990. Proceedings., 1990 IEEE International Conference on, volume 3, pages 1898–1903, 1990.

[19] M. Oosterom and R. Babuska. Virtual sensor for fault detection and isolation in flight control systems — fuzzy modeling approach. In Decision and Control, 2000. Proceedings of the 39th IEEE Conference on, volume 3, pages 2645–2650, Dec. 2000.

[20] H. Pasika, S. Haykin, E. Clothiaux, and R. Stewart. Neural networks for sensor fusion in remote sensing. In Neural Networks, 1999. IJCNN '99. International Joint Conference on, volume 4, pages 2772–2776, July 1999.

[21] P. Ridao, J. Battle, J. Amat, and M. Carreras. A distributed environment for virtual and/or real experiments for underwater robots. In Robotics and Automation, 2001. Proceedings 2001 ICRA. IEEE International Conference on, volume 4, pages 3250–3255, May 2001.


Paper VI

Design and validation of a universal 6D seam tracking system in robotic welding using arc sensing

Mikael Fridenfalk, Gunnar Bolmsjö

Division of Robotics, Department of Mechanical Engineering, Lund University, Box 118, SE-221 00 Lund, Sweden

www.robotics.lu.se

Submitted to the Journal of Advanced Robotics, January 2003


Design and validation of a universal 6D seam tracking system in robotic welding using arc sensing

Mikael Fridenfalk, Gunnar Bolmsjö

Division of Robotics, Department of Mechanical Engineering, Lund University, Box 118, SE-221 00 Lund, Sweden

www.robotics.lu.se

Abstract

This paper presents the design and validation of a novel and universal 6D seam tracking system that reduces the need for accurate robot trajectory programming and geometrical databases in robotic arc welding. Such sensor driven motion control, together with adaptive control of the welding process, is the foundation for increased flexibility and autonomous behavior of robotic and manufacturing systems. The system is able to follow any 3D-spline seam in space with moderate radius of curvature by the real-time correction of the position and orientation of the welding torch, using the through-arc sensing method.

The 6D seam tracking system was developed in the FUSE simulation environment, integrating software prototyping with mechanical virtual prototyping, based on physical experiments. The validation experiments showed that this system was both robust and reliable and is able to manage a radius of curvature less than 200 mm.

1 Introduction

Robot automation increases productivity and product quality, and frees humans from involuntary, unhealthy and dangerous work. While computational power has increased exponentially during the last decades, many new application areas require operation in relatively unstructured environments, and the limitations in flexibility today constitute a bottleneck in the further evolution of robotic systems.


One application for intelligent systems is seam tracking in robotic welding. Seam tracking is among others used for the manufacturing of large ships such as cruisers and oil tankers, where relatively high tolerances in the sheet material are allowed to minimize the manufacturing costs [3, 12]. Seam tracking is today typically only performed in 2D, by a constant motion in x and compensation of the errors in y and z directions, see Fig. 1. There exist solutions for higher autonomy that include seam tracking in 3D, where compensation is also performed in the o direction or around an orientation axis. These limitations, however, make seam tracking possible only for workpieces with relatively simple geometrical shapes.


Figure 1 Left: Definition of the Tool Center Point (TCP) and the orthonormal coordinate system n, o, a. Weaving, if any, is performed in the n direction in arc sensing, o is opposite to the direction of welding and perpendicular to the plane Ω, and a is the direction of approach. x, y, z is a local coordinate system defined for the workpiece. Right: The optimal position for seam tracking in arc sensing.

The next step in the evolution of sensor systems in robotic welding is the introduction of a full 6D sensor guided control system for seam tracking that is able to correct the TCP in x, y, z and around roll, pitch and yaw. Such a system is by definition able to follow any continuous 3D seam with moderate curvature. This system has many similarities with an airplane control system, where a change in either position or orientation affects all other degrees of freedom.

Though seam tracking has been performed in 2D for at least 40 years, the hypothetical design of 6D seam tracking systems was probably first addressed in the beginning of the last decade, by suggestions of models based on force sensors [5, 6] and laser scanners [17, 18]. It is however difficult to evaluate these systems, since no experimental results are presented, nor any explicit control systems for creating balance between the subsystems.

The major contribution of this paper is the invention of a universal 6D seam tracking system for robotic welding, validated by simulation experiments based on physical experiments, which proved that 6D seam tracking is possible and even very robust for arc sensing [13]. On a more detailed level, the general contributions of this paper are the introduction of: (1) the concept of 6D seam tracking based on arc sensing, (2) the trajectory tangent vector in arc sensing, (3) pitch control in arc sensing, and (4) a new method replacing quaternion interpolation, thereby increasing the computational efficiency significantly.

The authors have found no references in the literature that describe 6D seam tracking based on arc sensing, but there exists one similarity between this work and previous work, which is the introduction of a trajectory tangent vector by curve fitting, as was made in laser scanning [17, 18]. There exist however some differences in how such a vector was used and implemented. The differences consist of: (1) curve fitting by 2nd instead of 3rd degree polynomials, for faster computation and still high control stability, (2) an analytic solver for curve fitting of 2nd degree polynomials, developed and implemented for this system, increasing the calculation speed further, and (3) using differential vector interpolation instead of direct use of the trajectory tangent vector, which showed to be essential for maintaining control stability.

A 6D seam tracking system increases the intelligence and flexibility in manufacturing systems based on robotic welding using arc sensing. This reduces the need for accurate robot trajectory programming and geometrical databases. The simulation results (based on physical experiments) showed that the initial objective, to design a 6D seam tracking system that could manage a radius of curvature down to 200 mm, was reached.

2 Materials and methods

2.1 Sensor guided robot control

Many systems have been developed during the last decades for seam tracking in arc welding. The patents and scientific publications within this application area show that laser scanners and arc sensors today replace older systems. Laser scanners have in general high accuracy but are expensive and have to be mounted on the torch, thereby decreasing the workspace of the robot. This problem is partly solved by the introduction of an additional motion that rotates the scanner around the torch to keep its relative orientation constant with respect to the seam profile. Arc sensors are inexpensive but, compared with other sensors for seam tracking, relatively inaccurate due to the stochastic nature of arc welding. Since welding of large structures in general requires a relatively low accuracy, however, arc sensors are a competitive alternative to laser scanners. In addition, it should be mentioned that there exist solutions for digital image processing of the workpiece before welding. The problem is however that small changes in the surrounding light may disturb the analysis.


Further on, a 3D spline curve requires the camera to follow the seam all the way, to be able to capture segments that would otherwise have remained hidden. This applies also to very long seams. In practical applications today, solutions based on pure image processing by a CCD camera are successfully used to find the initial point where 2D seam tracking, based on laser scanning or arc sensing, should start [12].


Figure 2 Arc sensing. Left: Cross section, perpendicular to the approach vector a. The weaving is added to the interpolated motion between PN−2 and PN−1, including changes in position and orientation. Right: Cross section, plane Ω. By the measurement of the current from start to end, the profile of the workpiece is approximately determined.

Arc sensing

In “through-arc” sensing, or simply arc sensing [7], the arc current is sampled while a weaving motion is added to the TCP in the n direction, see Fig. 1. Weaving is useful in arc welding due to process related factors and was used already before the introduction of arc sensing in the beginning of the 1980s. In arc sensing, by keeping the voltage constant, it is possible to calculate the distance from the TCP to the nearest seam wall at each current sample, and thereby during a weaving cycle obtain the profile of the seam. According to experimental results [7], the approximate relationship between the arc voltage V, the arc current I and the nearest distance l between the electrode and the workpiece, for an electrode extension ranging between 5-15 mm, may be expressed by the relation:

V = β1I + β2 + β3/I + β4l (1)
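Eq. 1 can be inverted to recover the electrode-to-workpiece distance l from a voltage and current sample. The Python sketch below illustrates this under hypothetical β values; the real constants must be calibrated for the actual wire, shielding gas and power-source:

```python
# Illustrative inversion of Eq. 1 with respect to the distance l.
# The beta constants below are hypothetical placeholders, not values
# from the ROWER-2 experiments.

def distance_from_arc(V, I, b1=0.02, b2=15.0, b3=400.0, b4=1.2):
    """Solve V = b1*I + b2 + b3/I + b4*l for l (distance in mm)."""
    return (V - b1 * I - b2 - b3 / I) / b4
```

With constant-voltage control, sampling I during a weaving cycle and applying this inversion yields the distance profile s used for feedback control below.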


where the constants β1 − β4 depend on factors such as wire, shielding gas and power-source. In practice the voltage and current readings of the arc contain much noise, which is why the signal has to be low-pass filtered before it is used for feedback control. An example of such a filtered signal, sampled in the ROWER-2 project, is presented in Fig. 3.


Figure 3 Example of a filtered arc signal (arc current in A versus time in ticks) at seam tracking, sampled in the ROWER-2 project. Though anomalies are common at the ignition of the arc, only minor deviations from this pattern showed to occur during continuous seam tracking. The unit of the horizontal axis is ticks, where one tick equals 20 ms. The current values were filtered by an analogue active 4th order Bessel low-pass filter with f0 = 10 Hz before sampling.

In automatic control, error is defined as the difference between the desired (input) value u and the current (output) value y. The control algorithms in arc sensing are based on methods such as template matching or differential control [7]. In template matching the errors may for instance be calculated by integrating the difference of the template u and the arc signal y, where A denotes the weaving amplitude:

Ea = εa = ∫[−A, A] (u(n) − y(n)) dn (2)

En = εn = ∫[−A, 0] (u(n) − y(n)) dn − ∫[0, A] (u(n) − y(n)) dn (3)

Another example is using integrated squared difference errors. The equations above use n as variable to simplify the notation. More specifically, the integration between −A and A is performed in the negative o direction, using an auxiliary variable ψ denoting the state of weaving in the n direction, integrated over [ψ−A, ψA], where ψ−A and ψA denote the turning points of weaving. In differential control, which has proven to be quite reliable for control in the a and n directions, current sampling is performed at the turning points of the weaving. Here, the errors in the a and n directions are obtained by:
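As a concrete illustration of Eqs. 2-3, the sketch below evaluates the template-matching errors numerically with a trapezoidal rule; the flat template and linearly tilted arc signal are synthetic values, not measured data:

```python
import numpy as np

def trapezoid(y, x):
    # Simple trapezoidal rule, avoiding version-specific numpy names.
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2.0)

def template_errors(n, u, y):
    """Eqs. 2-3: u is the template, y the measured arc signal,
    both sampled over one weaving sweep n in [-A, A]."""
    d = u - y
    ea = trapezoid(d, n)                                        # Eq. 2
    neg, pos = n <= 0.0, n >= 0.0
    en = trapezoid(d[neg], n[neg]) - trapezoid(d[pos], n[pos])  # Eq. 3
    return ea, en

A = 5.0
n = np.linspace(-A, A, 201)
u = np.full_like(n, 250.0)      # flat template
y = 250.0 + 2.0 * n             # asymmetric signal: torch offset in n
ea, en = template_errors(n, u, y)
```

For this symmetric signal the a-error vanishes (ea = 0), while the asymmetry yields en = 50, signaling a correction in the n direction.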


εa = Iref − (i+A + i−A)/2 (4)

Ea = Ka εa (5)

εn = i+A − i−A (6)

En = Kn εn (7)

where Iref is a reference current value and i+A and i−A are the average measured current values at a pair of adjacent extreme points. The parameters Iref, Ka and Kn depend on the weld joint geometry and other process parameters such as shielding gas and wire feed rate. Since these parameters are known in advance, Ka and Kn may be defined for any welding application. An example of how Ea and En are used for position control follows below:

pN = pN−1 + En·n − rw·o + Ea·a (8)

where pN−1 and pN denote the current and next positions and rw is the nominal displacement during a weaving cycle, shown also in Fig. 1, which is equal to the welding speed (without consideration of the weaving motion) divided by the weaving frequency.
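One update step of the differential controller, Eqs. 4-8, can be sketched as below. The gains, reference current and turning-point readings are illustrative placeholders; in practice Iref, Ka and Kn depend on the joint geometry, shielding gas and wire feed rate:

```python
import numpy as np

def next_position(p_prev, n, o, a, i_pos, i_neg,
                  Iref=280.0, Ka=0.1, Kn=0.05, rw=3.0):
    """One differential control step, Eqs. 4-8 (units: mm and A).
    All parameter values are illustrative placeholders."""
    eps_a = Iref - (i_pos + i_neg) / 2.0       # Eq. 4
    Ea = Ka * eps_a                            # Eq. 5
    eps_n = i_pos - i_neg                      # Eq. 6
    En = Kn * eps_n                            # Eq. 7
    return p_prev + En * n - rw * o + Ea * a   # Eq. 8

n, o, a = np.eye(3)                            # orthonormal TCP frame
p_next = next_position(np.zeros(3), n, o, a, i_pos=285.0, i_neg=275.0)
```

Here the average current matches Iref, so the correction acts only in the n direction while the TCP advances rw against o.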

Virtual sensors

The development of sensor guided control systems may be accelerated using virtual sensors in the simulation control loop. Since a system's state is often easily obtained in simulation, assuming that the simulation model is sufficiently accurate compared to the real system, this provides a better foundation for process analysis than is in general possible by physical experiments. Further on, insertion of a sensor in a control loop may cause damage to expensive equipment, unless the behavior of the entire sensor guided control system is precisely known, which is rarely the case at the development stage.

Virtual sensors are used in many application areas, such as robotics, aerospace and marine technologies [4, 9, 16, 19, 21]. Development of new robot systems, such as for seam tracking, may be accelerated by the application of simulation [14, 15]. In general, the design methodology of virtual sensors may vary due to their specific characteristics. If a sensor model is not analytically definable, artificial neural networks or statistical methods may be used to find appropriate models [2, 20].

2.2 Robot system

The design of the 6D seam tracking system was based on experience and results from the European ROWER-2 project [1]. The objective of this project was to automate the welding process in shipbuilding by the design of a robotic system, specifically for joining double-hull sections in super-cruisers and oil tankers. Figure 4 shows the robot system developed in the ROWER-2 project and the corresponding simulation model in FUSE (described below). The implementation of the 3D seam tracking system was performed in C/C++, running on a QNX-based embedded controller (QNX is a real-time operating system for embedded controllers). The robot was manufactured partly in aluminum alloy to decrease the weight for easy transportation and was mounted on a mobile platform with 1 degree of freedom for increased workspace. The robot system was integrated and successfully validated on-site at a shipyard.

Figure 4 The design of the robot system for 3D seam tracking based on arc sensing in the ROWER-2 project (left) was based on simulations in FUSE (right). The 6D seam tracking system presented in this paper was developed based on experience and results gained from the ROWER-2 project.

2.3 Simulation system

The development and implementation of the 6D seam tracking system was initially performed in the Flexible Unified Simulation Environment (FUSE) [11]. FUSE is an integrated software system based on Envision (robot simulator, Delmia Inc.) and Matlab (MathWorks Inc.), including the Matlab Robot Toolbox [8].


The integration was performed by compiling these software applications and their libraries into one single software application, running under IRIX, SGI. The methodology developed for the design of the 6D seam tracking system was based on the integration of mechanical virtual prototyping and software prototyping. The 6D seam tracking simulations in FUSE were performed using the robot developed in the ROWER-2 project and an ABB S4 IRB2400.16 robot, see Figs. 4 and 5. After the design and successful validation of the 6D seam tracking system for arc sensing, the virtual prototyping and software models, partly written in C code and partly in Matlab code, were translated to pure Matlab code. The translation was necessary, since it saved development time for the implementation of the evaluation system, for the accurate calculation of position and orientation errors during simulation.

Figure 5 Examples of 6D seam tracking experiments performed in FUSE, Envision, using a model of the ROWER-2 robot.

2.4 Calculation of position and orientation errors

Error is defined as the difference between the desired and current pose P. The position errors were calculated by projection of εa and εn on a plane Ωt perpendicular to the desired trajectory while intersecting the current TCP, Pt. Thereby, per definition, εo = 0. The orientation error (the orientation part of Pδ) is calculated by:

Pδ = Pt−1 · PΩt (9)

where Pt is the current TCP and PΩt is the desired TCP for any sample point t. The errors around n, o and a are calculated by the subsequent conversion of the resulting transformation matrix to roll, pitch and yaw, here defined as rotation around, in turn, a, o and n. Rotation in the positive direction was defined as counterclockwise rotation from the perspective of a viewer when the axis points towards the viewer.
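A minimal sketch of Eq. 9 in Python follows. It computes the pose error Pδ between homogeneous 4×4 matrices; for brevity, the rotation error is summarized here as the total rotation angle from the matrix trace, whereas the evaluation system described above converts Pδ to roll, pitch and yaw around a, o and n:

```python
import numpy as np

def pose_error(P_t, P_omega_t):
    """Eq. 9: error pose between current TCP P_t and desired P_omega_t,
    both 4x4 homogeneous transformation matrices."""
    P_delta = np.linalg.inv(P_t) @ P_omega_t
    R = P_delta[:3, :3]
    # Total rotation angle of R (a compact summary of the orientation error).
    angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
    return P_delta, angle

# Desired pose: 0.1 rad rotation about z plus a translation (synthetic data).
c, s = np.cos(0.1), np.sin(0.1)
P_goal = np.array([[c, -s, 0.0, 1.0],
                   [s,  c, 0.0, 2.0],
                   [0.0, 0.0, 1.0, 0.0],
                   [0.0, 0.0, 0.0, 1.0]])
P_delta, angle = pose_error(np.eye(4), P_goal)   # angle = 0.1 rad
```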

3 Modeling

In this section the design of the 6D seam tracking system is described. The principal control scheme of this system is presented in Fig. 6. The control system is based on three main components: position, trajectory tangent vector and pitch control. Pitch denotes rotation around o. The main input of this control system is the 4×4 transformation matrix PN−1, denoting the current TCP position and orientation. The output is the next TCP, PN. N is the number of samples used for orientation control. The input parameters are N, the proportional control constants K1, K2 and K3, and the generic sensor input values ξ = [εa εn] in Eqs. 10-11 and ζ = [k1 k2] in Figs. 2-7. The proportional constants K1, K2 and K3 denote the control response to errors in position, trajectory tangent vector and pitch control. N is also used in trajectory tangent vector control and is a measure of the memory size of the algorithm.


Figure 6 The principal scheme of the 6D seam tracking control system. PN−1 and PN denote the current and next TCP. The next TCP is calculated at the beginning of each weaving cycle, and the motion between PN−1 and PN is interpolated.

The seam profile presented in Fig. 1 is a fillet joint. Since all standard seam profiles are basically L-shaped, the generic parameters ξ and ζ may be estimated by the same profile extraction methods used for fillet joints. An alternative to the fillet joint is for instance the V-groove, which mainly contains a gap between the seam walls. In arc sensing, since the electrical current always takes the shortest path between the TCP and the seam walls, see Fig. 2, such a gap has in general small influence on the measurements. In addition, since using other waveforms than sine, such as sawtooth and square waves, does not affect seam tracking as much as it affects the welding process in arc sensing, only sine waves were used in the simulation experiments.

Figure 7 Cross section, plane Ω, corresponding to the upper seam wall in Fig. 1. The measured distance from the TCP to the nearest wall is always perpendicular to the wall. The geometrical relation between the measured distance and the geometrical shape of the workpiece is calculated by the least-square estimate of the coefficients k1 and k2 in Fig. 2. Note that since both the position and the orientation of the TCP are interpolated between PN−2 and PN−1, this figure is only an approximation of the trajectory between Pi∗ and Pi. Note also that γ < 0 for k > 0.

3.1 Position control

The 6D seam tracking algorithm starts by the calculation of the next position for the TCP. The position is calculated by the differential method in Eqs. 10-11, rewritten as a function of distance instead of current (with d defined in Fig. 1), calculated by using Eq. 1:

εa = s(iA) + s(i−A) − 2d (10)

εn = s(i−A) − s(iA) (11)

Since the distance to the workpiece is approximately inversely proportional to the current, this may explain the main differences between Eqs. 4 and 6 and Eqs. 10-11. Also here, Ea = Ka εa, En = Kn εn, and pN is calculated by Eq. 8.


In the ROWER-2 project, simulation and physical experiments showed that Kn should be 50% of Ka for maximum control stability, both for Eqs. 4-6 and Eqs. 10-11. Here, Ka = K1 and Kn = K1/2. Since the position is controlled by keeping the distance from the TCP to the nearest seam wall constant at the turning points of the weaving, the desired distance D from the TCP to the intersection point of the seam walls in Fig. 1 was derived as:

D = (A + d·√(1 + tan²(α/2))) / tan(α/2) (12)

This equation is valid for any angle α that is not an integer multiple of π, where D instead equals d. This relation was primarily used for the calculation of the position errors in the evaluation of the 6D arc sensing system.
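Eq. 12 is easy to check numerically. The sketch below uses the experimental settings from Section 4 (A = 5 mm, d = 1 mm, α = 90°) and also confirms the limit D → d as α approaches π:

```python
from math import tan, sqrt, radians

def desired_distance(A, d, alpha):
    """Eq. 12: desired distance D from the TCP to the seam-wall
    intersection, for weaving amplitude A, wall distance d and
    seam angle alpha (radians, not a multiple of pi)."""
    t = tan(alpha / 2.0)
    return (A + d * sqrt(1.0 + t * t)) / t

D = desired_distance(5.0, 1.0, radians(90.0))        # 5 + sqrt(2) mm
D_flat = desired_distance(5.0, 1.0, radians(179.9))  # approaches d = 1 mm
```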

3.2 Trajectory tangent vector control

The positions p1...pN are used for trajectory estimation by curve fitting, see Fig. 8. To decrease calculation time, an analytical estimator was devised for 1st and 2nd degree polynomials that showed to work much faster than traditional matrix multiplication followed by inversion techniques using Gauss elimination. The estimation of the trajectory at N gives o′. Since an attempt to abruptly change the direction of o from oN−1 to o′ would most probably create instability in the system, the motion between oN−1 and o′ is balanced by vector interpolation using K2. The scalar κ2 is not explicitly used in the calculations, but is used for visualization of the calculation of oN in Fig. 8.


Figure 8 Trajectory tangent vector control is performed by the least square curve fitting of the last N trajectory points to a 2nd degree polynomial curve for x, y and z, followed by vector interpolation. Here N = 5. X, c0, c1 and c2 denote vector entities. The trajectory tangent vector F is, for a 2nd degree polynomial, equal to c1 + 2Nc2.


3.3 Pitch control

Pitch control is performed in two steps. First, aN−1 is orthogonalized with respect to oN. This step could also have been performed after the second step (which then becomes the first step). Empirical experiments showed however that the performance of the algorithm is slightly improved by pre-orthogonalization. In the second step, the new coordinate system is rotated around o by δφ:

δφ = −K3 · (k1 + k2) (13)

Here, δφ is in radians, and k1 and k2 are estimated by the least square method, based on sampling of the arc current during a half weaving cycle, see Figs. 2-7. The index i in these figures denotes arc current samples. Since the current always takes the shortest path between the TCP and the nearest seam wall (a homogeneously conducting surface), and the distance between the TCP and the seam walls may be calculated by Eq. 1, the full extraction of the seam profile is theoretically possible. With sin γ = ∂s/∂n and k = −∂s/∂n in Figs. 2-7, the following relation is deduced:

k = − sinγ (14)

The orientation of the walls projected on plane Ω is calculated from the vector components of si:

si = |si|(cos γ · ai + sin γ · ni) (15)

where |si| is the distance calculated from the current measurement, using Eq. 1. Eq. 14 and sin²γ + cos²γ = 1 give in turn the relation:

cos γ = √(1 − sin²γ) = √(1 − k²) (16)

which finally yields the direction of the current, projected on plane Ω:

si = |si|(√(1 − k²) · ai − k · ni) (17)
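The pitch update of Eq. 13 can be sketched as follows. The wall-distance samples are synthetic straight walls whose asymmetry stands in for a pitch error, K3 = 0.25 is taken from the continuous-mode settings in Section 4, and a generic least-square slope estimate stands in for the analytic linear fitting of Section 3.5:

```python
import numpy as np

def wall_slope(n, s):
    """Least-square slope ds/dn of the measured wall distance s over n."""
    return float(np.polyfit(n, s, 1)[0])

A, K3 = 5.0, 0.25
n1 = np.linspace(-A, 0.0, 50); s1 = 8.0 - 0.3 * n1   # left-wall samples
n2 = np.linspace(0.0, A, 50);  s2 = 8.0 + 0.1 * n2   # right-wall samples
k1, k2 = -wall_slope(n1, s1), -wall_slope(n2, s2)    # k = -ds/dn, Eq. 14
dphi = -K3 * (k1 + k2)       # Eq. 13: rotation around o, in radians
```

A symmetric profile gives k1 + k2 = 0 and no rotation; the asymmetry above yields δφ = −0.05 rad.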

3.4 Arc sensing weaving interpolation

The interpolation between two poses such as PN−1 and PN is usually performed by vector interpolation of the position p and quaternion interpolation of the orientation n, o, a. Since most of the calculation time in arc sensing was spent on quaternion interpolation, a new method was adopted that empirically showed to work much faster in Matlab and still performed as well as quaternion interpolation in 6D seam tracking. The denotations in this subsection, generically labeled as x′ and x′′, are local and not related to denotations elsewhere.

In this method, linear interpolation was first performed between PN−1 and PN, including o, a and p, for any sample i, resulting in o′i, a′′i and pi. o′i was rescaled to the unit vector oi, and a′′i was projected on the plane perpendicular to oi by:

a′i = a′′i − (a′′i · oi)oi (18)

The dot product a′′i · oi gives the length of the vector a′′i orthogonally projected on oi. Finally, ai was calculated by rescaling a′i to the unit vector ai, and ni was calculated by oi × ai. The weaving motion was added as a second layer of motion to Pi in the ni direction. In robot simulation the torch was allowed to freely rotate ±180° around a as a third layer of motion, which showed to be important, but not entirely imperative, for the robot to be able to perform 360° welding of pipe intersection joints, see Fig. 5. The angle around a was determined by the position of the robot with respect to the position of the workpiece.
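The quaternion-free interpolation described above can be rendered as the following Python sketch (a minimal sketch, not the thesis implementation): o and a are linearly interpolated, o is renormalized, a is re-orthogonalized against o by Eq. 18 and renormalized, and n is restored as o × a:

```python
import numpy as np

def interp_frame(o1, a1, o2, a2, t):
    """Interpolate an orthonormal frame between two poses, t in [0, 1]."""
    o = (1.0 - t) * o1 + t * o2
    o = o / np.linalg.norm(o)      # rescale o' to a unit vector
    a = (1.0 - t) * a1 + t * a2
    a = a - (a @ o) * o            # Eq. 18: remove the component along o
    a = a / np.linalg.norm(a)
    return np.cross(o, a), o, a    # n, o, a

# Synthetic endpoint frames; midpoint of the interpolation.
o1, a1 = np.array([0.0, 1.0, 0.0]), np.array([0.0, 0.0, 1.0])
o2 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)
n, o, a = interp_frame(o1, a1, o2, a1, 0.5)
```

By construction the result is orthonormal at every sample, which is what makes the method a workable substitute for quaternion interpolation here.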

3.5 Code and pseudo code samples

The actual implementation of the 6D seam tracking system showed to be straightforward. Most of the development effort was spent on the design of the surrounding environment of the seam tracking system, including a semi-automatic testing and evaluation system. A pseudo code representation of the 6D seam tracking system in Fig. 6 follows below:

1. pN = pN−1 − rw·o + K1·(εn·n/2 + εa·a), based on Eqs. 5, 7-8, 10-11 and the relation K1 = 2Kn = Ka.

2. Calculation of xN by shifting xN−1 one step to the left with pN: xN−1 = [p0 p1 ... pN−2 pN−1] → xN = [p1 p2 ... pN−1 pN].

3. Estimation of the trajectory tangent vector F by least square curve fitting of the parameterized 3D curve X = C · T, with a parameter t, to three independent second-degree polynomials (for x, y and z). F = ∂X/∂t|t=N with X = C · T, where C = [c0 c1 c2] = [cx cy cz]T and T = [1 t t2]T (note that AT is the transpose of a vector or matrix A and has no correlation with the vector T). This gives F = C · ∂T/∂t|t=N = C · [0 1 2t]T|t=N = C · [0 1 2N]T. C is calculated by the estimation of the parameters ci for i = x, y, z, where ci = (HTH)−1(HTpi) and pi are the row vectors of xN, which may also be written as xN = [px py pz]T. The matrix H consists of N rows of [1 t t2] with t = 1...N, where t is the row number. It should be mentioned that C may be directly calculated by the expression C = ((HTH)−1(HTxNT))T, but we have chosen to use the notation above for pedagogical reasons.

4. oN = normalize(oN−1 − K2·(normalize(F) + oN−1)). Normalization of a vector v denotes v divided by its length.

5. a′′ = normalize(aN−1 − dot(oN, aN−1)·oN), where dot(oN, aN−1) is the dot product oNT · aN−1.

6. P′ = [nN−1 oN a′′ pN; 0 0 0 1]

7. For the estimation of k1 and k2, see the previous text and Fig. 2. It should however be mentioned that the x-components in the linear fitting depend on the weaving function between −A and A. For a sinusoidal wave, this gives that the x-components in the negative n direction are A sin(2πi/M), with i running from the position −A to 0 for k1, and from the position 0 to A for k2. M is here the number of samples per weaving cycle. The y-components in the linear fitting are the corresponding measured distances (subsets of s in Fig. 2). Linear fitting may be performed by:

k = (n*sxy-sx*sy)/(n*sxx-sx^2);

where n, sx, sy, sxx and sxy are defined below in the function pfit.

8. P′′ = P′ · roty(−K3(k1 + k2)), where roty(θ), with θ counted in radians, is a 4×4 transformation matrix with elements Amn (row m, column n), such that all elements are zero except for A11 = A33 = cos(θ), A13 = −A31 = sin(θ) and A22 = A44 = 1. This gives:

P′′ = [n′′ oN aN pN; 0 0 0 1]

9. PN = [oN × aN oN aN pN; 0 0 0 1].

The initial value of x is determined by linear extrapolation with respect to the estimated direction at the beginning of the seam. The elements of F(i) with i = 1, 2, 3 for x, y, z were here calculated by the Matlab function pfit((1:N)', x(i,:)') (the input parameters x and y in this function are [1 2 ... N]T and pi as defined above). The pfit function was developed in this work using Maple (Waterloo Maple Inc.) and gives an analytical solution to 2nd degree polynomial curve fitting for this application, which works 8 times faster for N = 10 and 4 times faster for N = 100 than the standard Matlab function polyfit. The analytical function pfit was implemented as follows:


function k = pfit(x,y)
% Analytical least-square fit of y to a 2nd degree polynomial in x,
% returning the tangent c1 + 2*n*c2 of the fitted curve at x = n.
n = length(x); sx = sum(x); sy = sum(y); sxx = x'*x; sxy = x'*y;
sx3 = sum(x.^3); sx4 = sum(x.^4); sxxy = sum(x.^2.*y);
t2 = sx*sx; t7 = sx3*sx3; t9 = sx3*sxx; t12 = sxx*sxx;
t15 = 1/(sx4*sxx*n-sx4*t2-t7*n+2.0*t9*sx-t12*sxx);
t21 = (sx3*n-sxx*sx)*t15;
k = 2.0*n*((sxx*n-t2)*t15*sxxy-t21*sxy+(sx3*sx-t12)*t15*sy)- ...
    t21*sxxy+(sx4*n-t12)*t15*sxy+(t9-sx4*sx)*t15*sy;

The calculation speed was optimized by the application of Cramer's rule, using an analytical inversion of HTH in Maple and the elimination of c0. In the function above, x and y are N×1 column vectors, x' equals xT, sum(x) is the sum of the vector elements of x, and a point before an operator, such as the exponential operator ^, denotes elementwise operation. This implementation may be further optimized by reusing the terms that only depend on x in pfit (which is equal to t outside pfit). Since such optimization is however outside the scope of this paper, we have chosen to keep this presentation as simple as possible.
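For reference, the following Python sketch computes the same quantity as pfit — the tangent c1 + 2Nc2 of a least-square 2nd degree fit evaluated at t = N — via the normal equations rather than the expanded Cramer's-rule terms of the Matlab code:

```python
import numpy as np

def pfit(x, y):
    """Tangent dX/dt at t = x[-1] of a least-square 2nd degree fit
    of y against x (normal-equation form of the analytic solver)."""
    H = np.vander(x, 3, increasing=True)           # rows [1, t, t^2]
    c0, c1, c2 = np.linalg.solve(H.T @ H, H.T @ y)
    return c1 + 2.0 * x[-1] * c2

N = 10
t = np.arange(1, N + 1, dtype=float)
y = 0.5 + 0.2 * t + 0.03 * t * t    # noise-free quadratic test curve
tangent = pfit(t, y)                # exact value: 0.2 + 2*10*0.03 = 0.8
```

A noise-free quadratic is recovered exactly, which makes the expected tangent easy to verify by hand.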

3.6 Kinematics singularities

In sensor guided control, the manipulator may involuntarily move into areas in which no inverse kinematics solutions exist, generally called kinematics singularities. It is however possible to minimize inner singularity problems caused by robotic wrists. A novel method is suggested in [10], which was designed for the 6D seam tracking system but works basically for any industrial robot with 6 degrees of freedom. In this method, the stronger motors, often at the base of the robot, assist the weaker wrist motors in compensating for any position error. This method also allows for smooth transitions between different configurations.

3.7 Initial and final conditions

In the experiments, seam tracking was initiated from a proper position already from the start. The start position for seam tracking is in general found either by image analysis using a camera or by seam tracking itself. Since previous positions are unknown at the start, an estimate is made of the previous N−1 positions by extrapolation of the trajectory in the direction of o. This is the reason why, in some experiments, higher orientation errors occurred at the beginning of seam tracking. The condition to stop seam tracking may be based on the position of the TCP. The stop signal may be triggered by the definition of volumes outside which no seam tracking is allowed.
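A sketch of this start-up estimate follows, under the convention that o points opposite to the welding direction, so earlier positions lie at multiples of rw along o (the frame and the rw value here are illustrative, not the experiment settings):

```python
import numpy as np

def initial_history(p_start, o, rw, N):
    """Back-extrapolate the previous N-1 TCP positions along o;
    the last entry of the returned array is the start position itself."""
    return np.array([p_start + (N - 1 - i) * rw * o for i in range(N)])

# Welding proceeds in -o = +y here; history trails behind in -y.
x = initial_history(np.zeros(3), np.array([0.0, -1.0, 0.0]), 3.0, 5)
```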


4 Experimental results

4.1 Workpiece samples

The position and orientation errors at seam tracking were calculated based on simulation experiments on basically 8 different workpieces, see Figs. 9-12. It should be mentioned that the experiment at the bottom of Fig. 10 is not actually possible in practice due to collision problems, but was performed for the analysis of the control system. The workpieces in Matlab were approximated by lines orthogonal to the direction of the weld, and the resolution was defined as the nominal distance between these lines. The resolution was set to 0.5 mm.


Figure 9 Seam tracking, arc sensing, with addition of random noise to the measured distance. Pipe intersection (top) and workpiece for isolating control around a (bottom). K1 = 1.0, K2 = 0.5, K3 = 0.25, N = 10. Error denotes the difference between the desired and current pose.

The workpieces are categorized into two groups: continuous and discrete. In the experiments presented in this paper, the range was 0−180° for continuous, and 120 mm for discrete workpieces. In seam tracking of a continuous workpiece, a sudden change is interpreted as a curve with a small radius of curvature, see Figs. 9-10, while in seam tracking of a discrete workpiece a sudden change is regarded as a disturbance, see Figs. 11-12. The settings of the control parameters alone decide the mode of the control system: continuous or discrete. For seam tracking of objects with a large radius of curvature, the discrete mode works well for both continuous and discrete workpieces.



Figure 10 Seam tracking, arc sensing, with addition of random noise to the measured distance. Isolating rotation around n, following inner and outer curves. Note that in this figure, the case at the bottom is not very realistic, but is only included for the completeness of the experiments. K1 = 1.0, K2 = 0.5, K3 = 0.25, N = 10.


Figure 11 Seam tracking, arc sensing, with addition of random noise to the measured distance. The step is introduced at x = 0 mm. K1 = 0.5, K2 = 0.2, K3 = 0.1, N = 20.



Figure 12 Seam tracking, arc sensing, with addition of random noise to the measured distance. In this figure, the case at the bottom denotes a step of −30° performed around o. K1 = 0.5, K2 = 0.2, K3 = 0.1, N = 20.

The control parameters K1, K2, K3 and N used in the presented experiments proved to work well for each category, continuous and discrete. These settings were found through more than 100 arc sensing simulation experiments in Matlab. Large deviations from these parameters created instability in the system, making the control components work against each other instead of in synergy, resulting in divergence from the seam and ultimately failure of the seam tracking.

As mentioned previously, the seam tracking system is theoretically able to handle any angle α (in Figs. 1 and 2) between 0 and 180°. In practice, however, a small angle may cause collision between the torch and the workpiece. For simplicity, α was set to 90° for all experiments except for pipe intersections. Further, d in Fig. 1 was set to 1 mm.

4.2 Arc sensing

According to [22], the accuracy of an arc sensor in seam tracking is 0.5 mm. In simulation, the workpiece line resolution was set to 0.5 mm. To increase the errors in the system, a random signal with a range of 1 mm was added to the measured distance. In addition, no integration of the current measurement values was made near the turning points at weaving; instead, one measurement was considered enough for each turning point. In the ROWER-2 project, the signal was prefiltered by an active 4th-order Bessel low-pass filter; in the simulation experiments presented in this paper, no filtering was performed. Thereby the accuracy was decreased to less than half of the accuracy suggested in [22]. In practical applications, however, the arc signal should be filtered by a method that does not add any phase shift to the signal, such as a proper digital filter.
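One family of methods that adds no phase shift is forward-backward filtering: the same causal filter is run over the signal in both directions, so the phase shifts of the two passes cancel. The sketch below illustrates the idea with a simple first-order low-pass stage (a hypothetical smoother for illustration only, not the Bessel filter used in the ROWER-2 project):

```python
import numpy as np

def zero_phase_lowpass(x, alpha=0.2):
    """Run a first-order low-pass filter forward, then backward,
    so the phase shifts of the two passes cancel out."""
    def one_pass(sig):
        y = np.empty(len(sig))
        acc = float(sig[0])
        for i, s in enumerate(sig):
            acc += alpha * (float(s) - acc)  # first-order low-pass step
            y[i] = acc
        return y
    # forward pass, then the same filter on the time-reversed result
    return one_pass(one_pass(x)[::-1])[::-1]
```

A constant signal passes through unchanged, while high-frequency noise is attenuated without delaying the underlying arc signal.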

The arc sensing simulations presented in this paper were performed with a constant welding speed of 10 mm/s, a weaving amplitude and frequency of 5 mm and 3 Hz, and 64 current samplings per weaving cycle (or 192 samplings per second). The algorithm worked well also for lower sampling rates, such as 50 samples per second, which was used in the ROWER-2 project. In general, higher sampling rates improved the control ability of the system for rates above perhaps 10 samples per weaving cycle, though slightly rather than proportionally. The simulation models proved to predict the estimations derived from physical experiments in the ROWER-2 project for 6D motion. According to those experiments, it was possible to perform differential seam tracking in the y and z directions in Fig. 1 up to 20 mm for a seam of length 1 m. At a speed of 9 mm per second and a weaving amplitude and frequency of 3 mm and 3 Hz, this gives a redirection capability of 1.15° per weaving cycle, which in the worst case is equivalent to a minimum radius of 150 mm. Using a power source without any intrinsic current control would give up to 3 times better results according to the experiments in the ROWER-2 project (current control was only applied by the power source to limit the current). Using a high-performing industrial robot and a weaving amplitude of 5 mm instead of 3 mm would finally bring the minimum radius of curvature down to 25-50 mm, which is the range found by simulation.
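The redirection figures above follow from simple geometry: the torch advances rw = (welding speed)/(weaving frequency) per cycle, and a lateral capability of 20 mm per 1000 mm of seam corresponds to a direction change of atan(20/1000) per cycle. A sketch of the arithmetic, with units as in the text:

```python
import math

# Worked check of the capability figures quoted above
speed = 9.0          # welding speed, mm/s
weave_freq = 3.0     # weaving frequency, Hz
capability = 20.0    # lateral correction capability, mm per 1000 mm of seam

rw = speed / weave_freq                     # advance per weaving cycle: 3 mm
gamma_rad = math.atan(capability / 1000.0)  # redirection per weaving cycle
gamma_deg = math.degrees(gamma_rad)         # about 1.15 degrees
min_radius = rw / gamma_rad                 # about 150 mm in the worst case
```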

5 Results and discussion

A 6D seam tracking system was designed and validated by simulation that is able to follow any continuous 3D seam with moderate curvature, using an arc sensor, and is able to continuously correct both the position and orientation of the tool-tip of the welding torch along the seam. Thereby the need for robot trajectory programming or geometrical databases was eliminated for workpieces with a moderate radius of curvature, increasing intelligence in robotic arc welding.

Simulations based on physical experiments in the ROWER-2 project showed, however, that arc sensing in real-time welding was theoretically possible down to a radius of curvature of about 50 mm. This is much less than the initial objective, to design a 6D seam tracking system that could manage a radius of curvature below 200 mm, which is still considered small, for instance in shipbuilding.


6 Future work

Though the 6D seam tracking system proved to be very robust, optimization of the algorithm is still possible. Presently, pitch control in arc sensing is based on data measurements during half a weaving cycle. By extending the measurements to one full cycle, pitch control may be optimized. Further, though the control system worked well using proportional constants alone, PID or adaptive control may, if needed, be included to enhance the performance. One of many examples is to define K2 as a function of the “radius of curvature” (or, more specifically, as a function of for example c2 in Fig. 8) of the 2nd-degree polynomial used for calculation of the trajectory tangent vector, thereby enabling the system to automatically change between normal operation, used for maximum control stability, and extraordinary operation with fast response, used for managing small radii of curvature.
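Such an adaptation could be sketched as a simple gain schedule: interpolate K2 between a normal and a fast-response value based on the magnitude of the curvature coefficient c2 (all function names, default gains and thresholds below are hypothetical illustrations, not values from the text):

```python
def scheduled_k2(c2, k2_normal=0.5, k2_fast=1.0, c2_max=0.01):
    """Blend K2 between normal operation (control stability) and
    fast response (small radii of curvature) based on |c2|."""
    w = min(abs(c2) / c2_max, 1.0)   # 0 on straight seams, 1 on tight curves
    return (1.0 - w) * k2_normal + w * k2_fast
```

On a straight seam the schedule returns the conservative gain; as |c2| grows toward the threshold, the gain ramps up toward the fast-response value.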

References

[1] European Commission. ROWER-2: On-Board Automatic Welding for Ship Erection – Phase 2. Record Control Number 39785. www.cordis.lu, updated 2002-11-05.

[2] M. A. A. Arroyo Leon, A. Ruiz Castro, and R. R. Leal Ascencio. An artificial neural network on a field programmable gate array as a virtual sensor. In Design of Mixed-Mode Integrated Circuits and Applications, 1999. Third International Workshop on, pages 114–117, July 1999.

[3] G. Bolmsjö. Sensor systems in arc welding. Technical report, Robotics Group, Department of Production and Materials Engineering, Lund University, Sweden, 1997.

[4] G. Bolmsjö. Programming robot welding systems using advanced simulation tools. In Proceedings of the International Conference on the Joining of Materials, pages 284–291, Helsingør, Denmark, May 16-19 1999.

[5] H. Bruyninckx, J. De Schutter, and B. Allotta. Model-based constrained motion: a. modeling, specification and control. In Advanced Robotics, 1991. ’Robots in Unstructured Environments’, 91 ICAR., Fifth International Conference on, volume 2, pages 976–981, 1991.

[6] H. Bruyninckx, J. De Schutter, and B. Allotta. Model-based constrained motion: b. introducing on-line identification and observation of the motion constraints. In Advanced Robotics, 1991. ’Robots in Unstructured Environments’, 91 ICAR., Fifth International Conference on, volume 2, pages 982–987, 1991.

[7] G. E. Cook, K. Andersen, K. R. Fernandez, M. E. Shepard, and A. M. Wells. Electric arc sensing for robot positioning control. Robot Welding, pages 181–216. Editor: J. D. Lane. IFS Publications Ltd, UK, Springer-Verlag, 1987.

[8] P. Corke. A robotics toolbox for MATLAB. IEEE Robotics and Automation Magazine, 3(1):24–32, Mar. 1996.

[9] W. Doggett and S. Vazquez. An architecture for real-time interpretation and visualization of structural sensor data in a laboratory environment. In Digital Avionics Systems Conferences, 2000. Proceedings. DASC. The 19th, volume 2, pages 6D2/1–6D2/8, Oct. 2000.

[10] M. Fridenfalk. Eliminating position errors caused by kinematics singularities in robotic wrists for use in intelligent control and telerobotics applications. Patent application SE0203180-5, October 2002.

[11] M. Fridenfalk and G. Bolmsjö. The Unified Simulation Environment — Envision telerobotics and Matlab merged in one application. In Proceedings of the Nordic Matlab Conference, pages 177–181, Oslo, Norway, October 2001.

[12] M. Fridenfalk and G. Bolmsjö. Design and validation of a sensor guided robot control system for welding in shipbuilding. Accepted for future publication in the International Journal for the Joining of Materials, 2002.

[13] M. Fridenfalk and G. Bolmsjö. Intelligent system eliminating trajectory programming and geometrical databases in robotic welding. Patent application SE0203239-9, November 2002.

[14] M. Fridenfalk, U. Lorentzon, and G. Bolmsjö. Virtual prototyping and experience of modular robotics design based on user involvement. In Proceedings of the 5th European Conference for the Advancement of Assistive Technology, Düsseldorf, Germany, Nov. 1999.

[15] M. Fridenfalk, M. Olsson, and G. Bolmsjö. Simulation based design of a robotic welding system. In Proceedings of the Scandinavian Symposium on Robotics 99, pages 271–280, Oulu, Finland, October 14-15 1999.

[16] G. Hailu and A. Zygmont. Improving tracking behavior with virtual sensors in the loop. In Aerospace Conference, 2001, IEEE Proceedings., volume 4, pages 1729–1737, 2001.

[17] N. Nayak and A. Ray. An integrated system for intelligent seam tracking in robotic welding. Part I. Conceptual and analytical development. In Robotics and Automation, 1990. Proceedings., 1990 IEEE International Conference on, volume 3, pages 1892–1897, 1990.

[18] N. Nayak and A. Ray. An integrated system for intelligent seam tracking in robotic welding. Part II. Design and implementation. In Robotics and Automation, 1990. Proceedings., 1990 IEEE International Conference on, volume 3, pages 1898–1903, 1990.

[19] M. Oosterom and R. Babuska. Virtual sensor for fault detection and isolation in flight control systems — fuzzy modeling approach. In Decision and Control, 2000. Proceedings of the 39th IEEE Conference on, volume 3, pages 2645–2650, Dec. 2000.

[20] H. Pasika, S. Haykin, E. Clothiaux, and R. Stewart. Neural networks for sensor fusion in remote sensing. In Neural Networks, 1999. IJCNN ’99. International Joint Conference on, volume 4, pages 2772–2776, July 1999.

[21] P. Ridao, J. Battle, J. Amat, and M. Carreras. A distributed environment for virtual and/or real experiments for underwater robots. In Robotics and Automation, 2001. Proceedings 2001 ICRA. IEEE International Conference on, volume 4, pages 3250–3255, May 2001.

[22] M. Wilson. Vision systems in the automotive industry. Industrial Robot: An International Journal, 26(5):354–357, 1999.


Patent I

Method eliminating position errors caused by kinematics singularities in robotic wrists for use in intelligent control and telerobotics applications

Mikael Fridenfalk

Division of Robotics, Department of Mechanical Engineering, Lund University, Box 118, SE-221 00 Lund, Sweden

www.robotics.lu.se

Application No. SE0203180-5, October 2002



1 Background of the invention

Forward kinematics in robotics denotes the calculation of the Cartesian position and orientation of the last link of the robot manipulator, for a robot with n degrees of freedom, as a function of its joint values, θ1 − θn, see Fig. 1. Inverse kinematics denotes the calculation of joint values as a function of the Cartesian position and orientation of the last link of the robot manipulator, and often has several solutions. These solutions are called kinematics configurations.

In robotics, the manipulator may move into areas in which no inverse kinematics solutions exist, generally called kinematics singularities. Kinematics singularities cause control disruption in robotics and have been the focus of a large number of scientific papers and a few patents (US4716350, EP0672507, US5159249). The method presented here minimizes the problems caused by kinematics singularities by a new approach.

2 Problem analysis

The Jacobian matrix that gives the relation between joint and Cartesian velocities is invalid near and on singularity points. Since joint, velocity, acceleration and torque constraints, to name some, are not included in the Jacobian, the Jacobian may at best be used to measure the amount of manipulability of the robot near singularity areas. The concept of a “Real Jacobian” is here introduced as a Jacobian that also takes the kinematics and dynamics constraints into consideration. Such a Jacobian should for instance be non-singular near and on the singularity point for Roll-Pitch-Yaw (RPY) wrists, i.e. when θ4 and θ6 coincide in Fig. 1.

Motion near the singularity point of, for instance, an RPY wrist may cause great position errors, which may cause damage to the robot equipment or the workpiece, or delays in a production line. As robotic systems become more intelligent through the introduction of sensor guided robot control systems, the requirements on the ability to handle kinematics singularities increase proportionally, since the precise motion of the robot is no longer known in advance. This applies also to telerobotics applications, where the motion of the robot manipulator is most often directly controlled in Cartesian space by a human operator.

3 New method

The new method is based on a different approach to this problem and offers a more general, yet direct solution. According to preliminary experiments based on theoretical simulation models, while position errors of up to several centimeters for the TCP in Fig. 1 were detected in some cases with other methods, the position errors were entirely eliminated by the new method.

By θ4 − θ6 in Fig. 1, the wrist may orient the tool-tip in an arbitrary direction in space. In the example in Fig. 1, since a welding torch is mounted on the last link, a change of orientation also affects the position of the TCP. By modification of the inverse kinematics calculations according to Fig. 2, the position errors are however entirely eliminated. According to the preliminary experiments mentioned above, while position errors of up to several centimeters were eliminated by this method, the orientation errors remained basically unchanged.

The components of the scheme in Fig. 2 are labeled as A, B and C, where

(A) is a limiter that limits the motion of a robot with respect to kinematics and dynamics constraints such as maximum joint, velocity, acceleration and torque limits. The limiter may in addition be implemented to decrease or equalize constraint values in the neighborhood of singularities, for enhanced control stability.

(B) is a configuration manager that automatically changes kinematics configuration if crossover to the new configuration, based on information from A, is unavoidable. The configuration of B is linked to the inverse kinematics box in Fig. 2.

(C) is a compensator that, on the basis of A, eliminates any TCP position errors by modification of the position of the wrist. The resulting transformation matrix Pout is built by concatenation of the position part of Pin and the rotation part of Plim.
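With 4 × 4 homogeneous transformation matrices, the concatenation performed by block C amounts to a few lines. The sketch below is a NumPy illustration (the patent does not prescribe an implementation):

```python
import numpy as np

def compensate(P_in, P_lim):
    """Block C: keep the rotation part of the limited pose P_lim,
    but restore the commanded TCP position from P_in."""
    P_out = np.eye(4)
    P_out[:3, :3] = P_lim[:3, :3]   # rotation part of Plim
    P_out[:3, 3] = P_in[:3, 3]      # position part of Pin
    return P_out
```

The orientation error introduced by the limiter is thus accepted, while the TCP position error is removed entirely.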

One way to interpret the functionality of this new method is that since joints θ1 − θ3 are in general driven by much more powerful motors than the wrist joints, position errors are eliminated by using the more powerful motors of the robot to compensate for the shortcomings of the less powerful ones.

Brief description of the drawings

1. A typical industrial robot, ABB IRB 2400.16, with an RPY wrist and a welding torch. The Tool Center Point (TCP) is defined as the tool-tip of the robot. Joint 3 (θ3) is here symbolically denoted according to this figure. In reality, for this specific robot, this joint is managed by a link near the base of the robot.

2. The flow scheme of the new method. The input to the system is Pin, which is a 4 × 4 transformation matrix often used in robotics, including both the position and orientation of the TCP. In block C, the rotation part of the limited input Plim is concatenated with the position part of Pin, thereby eliminating the position errors of the TCP while keeping the orientation errors basically unmodified. The reason why this is possible is that the motors of the first three joints are in general much more powerful than the wrist motors.


Abstract

The invention consists of a method that minimizes problems caused by kinematics singularities in robotic wrists. By elimination of position errors near and on inner singularity points, based on kinematics and dynamics constraints, fewer failures may occur, especially in intelligent robot control systems or telerobotics applications. One interpretation of the new method is that stronger motors are allowed to compensate for the errors caused by weaker ones in a robotic manipulator.


Claims

1. Method for elimination of position errors near and on inner kinematics singularity points for robotic wrists, by using an inverse kinematics scheme according to Fig. 2, based on AC, where A is an inverse kinematics calculator followed by a limiter that limits the motion of a robot with respect to kinematics and dynamics constraints of the total robot system, including its surrounding environment, incorporating properties such as maximum joint, velocity, acceleration and torque limits. This limiter may additionally decrease or equalize constraint values in the neighborhood of singularities, for enhanced control stability. C is a compensator that on the basis of A modifies the position of the TCP in Fig. 1, building the transformation matrix Pout in Fig. 2 by concatenation of the position part of Pin and the rotation part of Plim.

2. According to above, but based on ABC in Fig. 2, where B is a configuration manager that automatically changes kinematics configuration for A if crossover to the new configuration, based on information from A, is unavoidable. The configuration to be used is selected by B and is communicated to the inverse kinematics box in Fig. 2, so that the same configuration is used throughout the scheme.


[Fig. 1: an industrial robot with joints θ1 − θ6, wrist vectors n and a, and the TCP at the tool-tip.]

[Fig. 2: flow scheme of the method. Pin passes through Inverse Kinematics to block A (or AB, with the Configuration output of B fed back to the inverse kinematics), producing Θlim; Forward Kinematics yields Plim; block C combines the position part Ppos of Pin with the rotation part Prot of Plim into Pout, from which Θout is obtained.]

APPENDIX



Patent II

Intelligent system eliminating trajectory programming and geometrical databases in robotic welding

Mikael Fridenfalk, Gunnar Bolmsjö

Division of Robotics, Department of Mechanical Engineering, Lund University, Box 118, SE-221 00 Lund, Sweden

www.robotics.lu.se

Application No. SE0203239-9, November 2002



1 Introduction

1.1 Robot welding

Robot automation increases productivity and product quality, and frees humans from involuntary, unhealthy and dangerous work. While computational power has increased exponentially during the last decades, many new application areas require operation in relatively unstructured environments, and the limitations in flexibility today constitute a bottleneck in the further evolution of robotic systems.

One application for intelligent systems is seam tracking in robotic welding. Seam tracking is among others used in the manufacturing of large ships such as cruisers and oil tankers, where high tolerances in the sheet material are allowed to minimize manufacturing costs [2, 11]. Seam tracking is today typically only performed in 2D, by a constant motion in x and compensation of the errors in the y and z directions, see Fig. 1. There exist solutions for higher autonomy that include seam tracking in 3D, where compensation is also performed in the o direction or around an orientation axis. These limitations, however, make seam tracking possible only for workpieces with relatively simple geometrical shapes.

The next step in the evolution of intelligent systems in robotic welding is perhaps the introduction of a full 6D sensor guided control system for seam tracking that is able to correct the TCP in x, y, z and around roll, pitch and yaw. Such an ultimate system is per definition able to follow any continuous 3D seam with moderate curvature. This system has many similarities with an airplane control system, where a change in either position or orientation affects all other five degrees of freedom.


Though seam tracking has been performed in 2D for at least 40 years, the hypothetical design of 6D seam tracking systems was probably first addressed in the beginning of the last decade by suggestions of models based on force sensors [4, 5] and laser scanners [15, 16]. It is however difficult to evaluate these systems, since no experimental results are presented, nor any explicit control systems for creating balance between the subsystems.

The major contribution of this patent is the invention of a universal 6D seam tracking system for robotic welding, validated by simulation experiments based on physical experiments, which proved that 6D seam tracking is possible and even very robust for laser scanning and arc sensing. On a more detailed level, the general contributions of this patent are the introduction of: (1) the concept of 6D seam tracking based on arc sensing, (2) differential control, not only in arc sensing as performed in 2D seam tracking [6], but also in laser scanning, (3) a trajectory tangent vector using differential vector interpolation control in laser scanning, (4) the trajectory tangent vector in arc sensing, (5) pitch control for laser scanning by vector interpolation, (6) pitch control in arc sensing, and (7) a new method replacing quaternion interpolation, thereby increasing the computational efficiency significantly in arc sensing.

Though the development of the seam tracking system in this patent was performed entirely without any knowledge of previous work within 6D seam tracking, there is, perhaps by coincidence, one unique similarity to be found between this system and the previous ones in [15, 16], which is the introduction of a trajectory tangent vector by curve fitting. There exist however large differences in how such a vector was used and implemented. The differences consist of (1) curve fitting by 2nd instead of 3rd degree polynomials, for faster computation and, according to preliminary experiments, for higher control stability, (2) using an analytic solver for curve fitting of 2nd degree polynomials, developed and implemented for this system, increasing the calculation speed further, (3) not using the trajectory tangent vector directly, but by differential vector interpolation between what could be called the current and the estimated trajectory tangent vectors, for essential control stability, and (4) the introduction of the trajectory tangent vector also in arc sensing.
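The idea of estimating the trajectory tangent by 2nd-degree curve fitting can be sketched as follows. This is a simplified 1D illustration using a generic least-squares fit, not the analytic solver mentioned above:

```python
import numpy as np

def tangent_slope(xs, ys):
    """Fit y = c2*x**2 + c1*x + c0 to the sampled trajectory points
    and return the slope dy/dx at the most recent sample, an estimate
    of the trajectory tangent direction."""
    c2, c1, c0 = np.polyfit(xs, ys, 2)
    return 2.0 * c2 * xs[-1] + c1
```

In the full system the fitted tangent is not applied directly, but blended with the current tangent by differential vector interpolation, as described above.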

The objective of this work was to increase the intelligence and flexibility in manufacturing systems based on robotic welding by the design and validation of a universal 6D seam tracking system based on laser scanning or arc sensing. This eliminates robot trajectory preprogramming and geometrical databases. The simulation results (based on physical experiments) showed that the initial objective, to design a 6D seam tracking system that could manage a radius of curvature down to 200 mm, was reached. Further, according to the experiments presented in this paper, if low-speed seam tracking is performed without welding to find the geometry of the workpiece, there is basically no limit to how small a radius of curvature this system is able to handle, using laser scanning.


2 Materials and methods

2.1 Sensor guided robot control

Many systems have been developed during the last decades for seam tracking in arc welding. The patents and scientific publications within this application area show that laser scanners and arc sensors today replace older systems. Laser scanners have in general high accuracy, but are expensive and have to be mounted on the torch, thereby decreasing the workspace of the robot. This problem is partly solved by the introduction of an additional motion that rotates the scanner around the torch to keep its relative orientation constant with respect to the seam profile. Arc sensors are inexpensive but inaccurate due to the stochastic nature of arc welding. Since welding of large structures in general requires a relatively low accuracy, however, arc sensors are a competitive alternative to laser scanners. In addition, it should be mentioned that there exist solutions for digital image processing of the workpiece before welding. The problem is however that small changes in the surrounding light may disturb the analysis. Further, 3D spline curve seams require the camera to follow the seam all the way, to be able to capture segments that otherwise would have remained hidden. This applies also to very long seams. In practical applications today, solutions based on pure image processing by a CCD camera are successfully used to find the initial point where 2D seam tracking, based on laser scanning or arc sensing, should start [11].

Laser scanning

In laser scanning, the seam profile is extracted by periodically scanning the laser beam across a plane perpendicular to the direction of the seam. The laser scanner is mounted on the torch, in front of the direction of welding. As a general rule, the control system is based on full compensation of the position errors, which is equal to a constant value of K1 = 1, as defined in Fig. 5.

Arc sensing

In “through-arc” sensing, or simply arc sensing [6], the arc current is sampled while a weaving motion is added to the TCP in the n direction, see Fig. 1. Weaving is useful in arc welding due to process related factors and was used already before the introduction of arc sensing in the beginning of the 80s. In arc sensing, by keeping the voltage constant, it is possible to calculate the distance from the TCP to the nearest seam wall at each current sample, and thereby obtain the profile of the seam during a weaving cycle. According to experimental results [6], the approximate relationship between the arc voltage V, the arc current I and the nearest distance between the electrode and the workpiece l, for an electrode extension ranging between 5-15 mm, may be expressed by the relation:

V = β1I + β2 + β3/I + β4l (1)

where the constants β1 − β4 are dependent on factors such as wire, shielding gas and power source. In practice, the voltage and current readings of the arc contain much noise, which is why the signal has to be low-pass filtered before it is used for feedback control. An example of such a filtered signal, sampled in the ROWER-2 project, is presented in Fig. 3. In automatic control, error is defined as the difference between the desired (input) value u and the current (output) value y. The control algorithms in arc sensing are based on methods such as template matching or differential control [6]. In template matching, the errors may for instance be calculated by integrating the difference of the template u and the arc signal y, where A denotes the weaving amplitude:

Ea = εa = ∫[−A, A] (u(n) − y(n)) dn (2)

En = εn = ∫[−A, 0] (u(n) − y(n)) dn − ∫[0, A] (u(n) − y(n)) dn (3)
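In discrete form, Eqs. (2)-(3) can be sketched as below, assuming uniformly sampled u and y over one weaving sweep (the trapezoidal rule is an implementation choice for illustration, not specified in the text):

```python
import numpy as np

def _trapz(f, x):
    """Trapezoidal rule: sum of (f[i] + f[i+1]) / 2 * dx."""
    return float(np.sum((f[1:] + f[:-1]) * np.diff(x)) / 2.0)

def template_errors(u, y, n):
    """Discretized Eqs. (2)-(3): n runs over [-A, A] during one weave,
    u is the template and y the measured arc signal."""
    d = u - y
    neg = n <= 0.0                      # the [-A, 0] half of the sweep
    E_a = _trapz(d, n)                                       # Eq. (2)
    E_n = _trapz(d[neg], n[neg]) - _trapz(d[~neg], n[~neg])  # Eq. (3)
    return E_a, E_n
```

A constant offset between u and y shows up almost entirely in Ea, while an asymmetry between the two halves of the weave shows up in En.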

Another example includes using integrated squared difference errors. The equations above use n as a variable to simplify the notation. More specifically, the integration between −A and A is performed in the negative o direction, using an auxiliary variable ψ denoting the state of weaving in the n direction, integrated over [ψ−A, ψA], where ψ−A and ψA denote the turning points of the weaving. In differential control, which has proven to be quite reliable for control in the a and n directions, current sampling is performed at the turning points of the weaving. Here, the errors in the a and n directions are obtained by:

εa = Iref − (i+A + i−A)/2 (4)

Ea = Ka εa (5)

εn = i+A − i−A (6)

En = Kn εn (7)

where Iref is a reference current value and i+A and i−A are the average measured current values at a pair of adjacent extreme points. The parameters Ka and Kn are dependent on the weld joint geometry and other process parameters such as shielding gas and wire feed rate. Since these parameters are known in advance, Ka and Kn may be defined for any welding application. An example of how Ea and En are used for position control follows below:

pN = pN−1 + En n − rw o + Ea a (8)

where pN−1 and pN denote the current and next positions, and rw is the nominal displacement during a weaving cycle, shown also in Fig. 1, which is equal to the welding speed (without consideration of the weaving motion) divided by the weaving frequency.
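Eqs. (4)-(8) translate directly into one position update per weaving cycle. A sketch, with the torch frame vectors n, o, a represented as NumPy arrays and Iref, Ka, Kn and rw as described above:

```python
import numpy as np

def differential_step(p, n, o, a, i_pos, i_neg, I_ref, Ka, Kn, rw):
    """One weaving-cycle update from the average currents i_pos, i_neg
    measured at the two weaving turning points (+A and -A)."""
    eps_a = I_ref - (i_pos + i_neg) / 2.0   # Eq. (4)
    E_a = Ka * eps_a                        # Eq. (5)
    eps_n = i_pos - i_neg                   # Eq. (6)
    E_n = Kn * eps_n                        # Eq. (7)
    return p + E_n * n - rw * o + E_a * a   # Eq. (8)
```

When both turning-point currents equal Iref, the torch simply advances by rw in the negative o direction; current asymmetries steer it along n and a.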

Virtual sensors

The development of sensor guided control systems may be accelerated using virtual sensors in the simulation control loop. Since a system’s state is often easily obtained in simulation, assuming that the simulation model is sufficiently accurate compared to the real system, this provides a better foundation for process analysis than is generally possible in physical experiments. Further, insertion of a sensor in a control loop may cause damage to expensive equipment, unless the behavior of the entire sensor guided control system is precisely known, which is rarely the case at the development stage.

Virtual sensors are used in many application areas, such as robotics, aerospace and marine technologies [3, 8, 14, 17, 19]. The development of new robot systems, such as for seam tracking, may be accelerated by the application of simulation [12, 13]. In general, the design methodology of virtual sensors may vary due to their specific characteristics. If a sensor model is not analytically definable, artificial neural networks may be used to find appropriate models [1, 18].

2.2 Robot system

The design of the 6D seam tracking system was a spin-off from the European ROWER-2 project. The objective of this project was to automate the welding process in shipbuilding by the design of a robotic system, specifically for joining double-hull sections in super-cruisers and oil tankers. Figure 2 shows the robot system developed in the ROWER-2 project and the corresponding simulation model in FUSE (described later). The implementation of the 3D seam tracking system was performed in C/C++, running on a QNX-based embedded controller (QNX is an operating system for embedded controllers). The robot was manufactured partly in aluminum alloy to decrease the weight for easy transportation, and mounted on a mobile platform with 1 degree of freedom for increased workspace.

2.3 Simulation system

The development and implementation of the 6D seam tracking system was initially performed in the Flexible Unified Simulation Environment (FUSE) [10]. FUSE is an integrated software system based on Envision (robot simulator, Delmia Inc.) and Matlab (MathWorks Inc.), including the Matlab Robot Toolbox [7]. The integration was performed by compiling these software applications and their libraries into one single software application. The methodology developed for the design of the 6D seam tracking system was based on the integration of mechanical virtual prototyping and software prototyping. The 6D seam tracking simulations in FUSE were performed using either the robot developed in the ROWER-2 project or the ABB S4 IRB2400.16 robot, see Figs. 2 and 9. After the design and successful validation of the 6D seam tracking system for arc sensing, the virtual prototyping and software models were entirely translated to Matlab code. The Matlab models were modified to work also for laser scanners by the addition of a virtual model of the Servo-Robot M-SPOT laser scanner system. The translation was necessary for the accurate calculation of position and orientation errors during simulation.

2.4 Calculation of position and orientation errors

The error is defined as the difference between the desired and current pose P. The position errors were calculated by projection of εa and εn on a plane Ωt, perpendicular to the desired trajectory while intersecting the current TCP, Pt. Thereby, by definition, εo = 0. The orientation error (the orientation part of Pδ) is calculated by:

Pδ = Pt⁻¹ · PΩt    (9)

where Pt is the current TCP and PΩt is the desired TCP for any sample point t. The errors around n, o and a are calculated by the subsequent conversion of the resulting transformation matrix to roll, pitch and yaw, here defined as rotation around, in turn, a, o and n. A rotation in the positive direction is here (and in general) defined as counterclockwise rotation from the perspective of a viewer when the axis points towards the viewer.
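As an illustrative sketch of Eq. 9 and the subsequent roll, pitch and yaw extraction (written in Python/NumPy rather than the Matlab used in this work; function and variable names are hypothetical):

```python
import numpy as np

def orientation_error(P_t, P_omega_t):
    # Eq. 9: P_delta = inv(P_t) * P_Omega_t, with 4x4 homogeneous transforms
    P_delta = np.linalg.inv(P_t) @ P_omega_t
    R = P_delta[:3, :3]
    # Roll-pitch-yaw of the residual rotation: rotations around, in turn,
    # a (z axis), o (y axis) and n (x axis); counterclockwise positive.
    roll = np.arctan2(R[1, 0], R[0, 0])
    pitch = np.arctan2(-R[2, 0], np.sqrt(R[2, 1]**2 + R[2, 2]**2))
    yaw = np.arctan2(R[2, 1], R[2, 2])
    return roll, pitch, yaw
```

For two poses differing only by a small rotation around a, the roll component recovers that angle while pitch and yaw vanish.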


3 Modeling

In this section, the design of the 6D seam tracking system is described for laser scanning and arc sensing. The principal control scheme of this system is presented in Fig. 4. The control system is based on three main components: position, trajectory tangent vector and pitch control. Pitch denotes rotation around o. The main input of this control system is the 4 × 4 transformation matrix PN−1, denoting the current TCP position and orientation. The output is the next TCP, PN. N is the number of samples used for orientation control. The input parameters are N, the proportional control constants K1, K2 and K3, and the generic sensor input values, denoting ξ = ζ = [v0 v1] in Fig. 5 for laser scanning, and ξ = [εa εn] in Eqs. 10-11 and ζ = [k1 k2] in Figs. 6-7 for arc sensing. The proportional constants K1, K2 and K3 denote the control response to errors in position, trajectory tangent vector and pitch control. N is also used in trajectory tangent vector control and is a measure of the memory size of the algorithm.

The seam profile presented in Fig. 1 is called a fillet joint. Since all standard seam profiles are basically similar, the generic parameters ξ and ζ may be estimated by the same profile extraction methods used for fillet joints, in both laser scanning and arc sensing. An example of an alternative to the fillet is the V-groove, which mainly contains a gap between the seam walls. Since the electrical current always takes the shortest path between the TCP and the seam walls, see Fig. 6, such a gap has in general a small influence on the measurements in arc sensing. In addition, since using other weaving shapes than sinusoidal, such as sawtooth and square waves, does not affect seam tracking as much as it affects the welding process in arc sensing, only sine waves were used in the simulation experiments.

3.1 Position control

The 6D seam tracking algorithm starts with the calculation of the next position for the TCP. In laser scanning, the next position pN is calculated by vector interpolation based on K1 in Fig. 5, between the current position pN−1 and the position p′ extracted from the laser scanner.
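A minimal NumPy sketch of this vector interpolation (hypothetical names; the thesis implementation is in Matlab), where the sensed target is p′ = v0 + D·v1 per Fig. 5 and r_w denotes the nominal displacement per control sample:

```python
import numpy as np

def next_position(p_prev, o_prev, v0, v1, D, r_w, K1):
    # Advance r_w opposite to o (o points against the welding direction),
    # then interpolate towards the sensed target p' = v0 + D*v1 with gain K1.
    # v0: seam-wall intersection point from the scanner, v1: unit vector on
    # the cross-section plane between the walls, D: desired TCP offset.
    return p_prev - o_prev * r_w + K1 * (v0 + D * v1 - p_prev)
```

With K1 = 1 the TCP moves directly onto the sensed target (plus the nominal feed); smaller K1 damps the correction.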

In arc sensing, the position is calculated by the differential method in Eqs. 10-11, rewritten as a function of distance instead of current (with d defined in Fig. 1), calculated by using Eq. 1:

εa = s(iA) + s(i−A) − 2d    (10)

εn = s(i−A) − s(iA)    (11)


Since the distance to the workpiece is approximately inversely proportional to the current, this may explain the main differences between Eqs. 4 and 6 and Eqs. 10-11. Also here, Ea = Ka εa, En = Kn εn, and pN is calculated by Eq. 8. In the ROWER-2 project, simulation and physical experiments showed that Kn should be 50% of Ka for maximum control stability, both for Eqs. 4-6 and Eqs. 10-11. Here, Ka = K1 and Kn = K1/2. Since the position is controlled by keeping the distance from the TCP to the nearest seam wall constant at the turning points of the weaving, the desired distance from the TCP to the intersection point of the walls in Fig. 1 was derived as:

D = (A + d·√(1 + tan²(α/2))) / tan(α/2)    (12)

This equation is valid for any angle α that is not an integer multiple of π; for such multiples, D instead equals d. This relation was primarily used for the calculation of the position errors in the evaluation of the 6D arc sensing system.
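As a hedged numeric sketch of Eqs. 10-12 (Python for illustration; the names are not from the original implementation): when the TCP is at the nominal distance d from both walls the errors vanish, and for α = 90° Eq. 12 reduces to D = A + d√2:

```python
import math

def eps_a(s_plus, s_minus, d):
    # Eq. 10: error along a from the wall distances at the weaving turning points
    return s_plus + s_minus - 2 * d

def eps_n(s_plus, s_minus):
    # Eq. 11: error along n
    return s_minus - s_plus

def desired_distance(A, d, alpha):
    # Eq. 12: desired TCP distance to the wall intersection point,
    # valid for alpha not equal to an integer multiple of pi
    t = math.tan(alpha / 2)
    return (A + d * math.sqrt(1 + t * t)) / t
```

For example, with a weaving amplitude A = 5 mm, d = 1 mm and α = 90°, this gives D = 5 + √2 ≈ 6.41 mm.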

3.2 Trajectory tangent vector control

The positions p1 − pN are used for trajectory estimation by curve fitting, see Fig. 8. To decrease the calculation time, an analytical estimator was devised for 1st and 2nd degree polynomials that showed to work much faster than traditional matrix inversion techniques using Gauss elimination. The estimation of the trajectory at N gives o′. Since an attempt to abruptly change the direction of o from oN−1 to o′ would most probably create instability in the system, the motion between oN−1 and o′ is balanced by vector interpolation using K2.
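The balancing step can be sketched as follows (a NumPy illustration under the convention that o points opposite to the direction of motion, so the desired direction is o′ = −F/|F|; the function names are hypothetical):

```python
import numpy as np

def unit(v):
    # Rescale v to unit length
    return v / np.sqrt(v @ v)

def balance_o(o_prev, F, K2):
    # Vector interpolation between o_{N-1} and o' = -unit(F), limited by K2,
    # followed by renormalization
    o_desired = -unit(F)
    return unit(o_prev + K2 * (o_desired - o_prev))
```

With K2 = 1 the new o snaps directly to the estimated tangent direction; smaller K2 smooths the transition over several control cycles.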

3.3 Pitch control

In laser scanning, the first step in pitch control is performed by vector interpolation between aN−1 and a′, using K3, see Fig. 5. In a second step, aN is calculated by orthogonally projecting a′′ on a plane perpendicular to oN. Finally, nN is calculated by the cross product oN × aN.

In arc sensing, pitch control is performed in two steps. First, aN−1 is orthogonalized with respect to oN. This step could also, in similarity to laser scanning, be performed after the second step (which then becomes the first step). Empirical experiments showed however that the performance of the algorithm was slightly improved by pre-orthogonalization. In the second step, the new coordinate system is rotated around o by δφ:

δφ = −K3 · (k1 + k2) (13)


Here, δφ is in radians, and k1 and k2 are estimated by the least squares method, based on the current measurements during a half weaving cycle, see Figs. 6-7. The index i in these figures denotes current samples. Since the current always takes the shortest path between the TCP and the nearest seam wall (a homogeneously conducting surface), and the distance between the TCP and the seam walls may be calculated by Eq. 1, the full extraction of the seam profile is theoretically possible. With sin γ = ∂s/∂n and k = −∂s/∂n in Figs. 6-7, the following relation is deduced:

k = −sin γ    (14)

The orientation of the walls projected on plane Ω is calculated from the vector components of si:

si = |si| (cos γ · ai + sin γ · ni)    (15)

where |si| is the distance calculated from the current measurement, using Eq. 1. A combination of Eq. 14 and sin²γ + cos²γ = 1 gives:

cos γ = √(1 − sin²γ) = √(1 − k²)    (16)

which finally yields the direction of the current, projected on plane Ω:

si = |si| (√(1 − k²) · ai − k · ni)    (17)
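To make the use of k concrete, the following Python sketch (hypothetical names) fits the slope k of one wall from distance samples s over the weaving coordinate n by the linear least-squares estimate, and applies Eq. 13:

```python
import numpy as np

def wall_slope(n_coords, s_dists):
    # Linear least-squares fit s = k*n + m; k corresponds to -sin(gamma), Eq. 14
    n = len(n_coords)
    sx = n_coords.sum(); sy = s_dists.sum()
    sxx = n_coords @ n_coords; sxy = n_coords @ s_dists
    return (n * sxy - sx * sy) / (n * sxx - sx ** 2)

def pitch_correction(k1, k2, K3):
    # Eq. 13: rotation around o (in radians) from the two wall slopes
    return -K3 * (k1 + k2)
```

For a symmetric joint the two slopes cancel (k1 + k2 = 0) and the pitch correction vanishes.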

3.4 Code samples

The actual implementation of the 6D seam tracking system showed to be both simple and concise. In simulation, the plane Ω is assumed to intersect the TCP. In reality however, the laser scanner is placed on the torch, often several centimeters ahead in the direction of the seam. The transformation to such a system is simply performed by storing the trajectory data for use with a time delay. Most of the development effort was spent on the design of the surrounding environment, including a semi-automatic testing and evaluation system. The implementation of the arc sensing system contained much code, which is why it is not presented here. The Matlab implementation of the 6D seam tracking system for laser scanning, also presented in Fig. 4, follows below:

function P = next_P(P,v,w)
p = P(1:3,4) - P(1:3,2)*w.r_w + w.K1*(v(:,1) - P(1:3,4) + v(:,2)*w.D);
global x; x = [x(:,2:end) p];
for i = 1:3, F(i) = pfit(1:w.N, x(i,:)); end
o = unit(P(1:3,2) - w.K2*(unit(F') + P(1:3,2)));
a = unit(P(1:3,3) - w.K3*(P(1:3,3) + v(:,2))); a = unit(a - o'*a*o);
P(1:3,:) = [cross(o,a) o a p];

The input of this function is the current pose, here denoted simply as P, the readings from the laser scanner v according to Fig. 5, and the parameter data structure w. The output is the next pose, also denoted as P for code minimization. The parameter w.D is equal to D in the same figure, F is defined in Fig. 8 and w.r_w is the nominal displacement during a control sample. The matrix x denotes the last N positions p0...pN−1. By the calculation of pN and the subsequent matrix shift, p0 is erased. The initial value of x is determined by linear extrapolation with respect to the estimated direction at the beginning of the seam. The function pfit(1:w.N,x(i,:)) is equal to polyfit(1:w.N,x(i,:),2)*[2*w.N 1 0]'. The pfit function was developed in this work using Maple (Waterloo Maple Inc.) and gives an analytical solution to 2nd degree polynomial fitting for this application, which proved to be more than 8 times faster than polyfit for N = 10 and 4 times faster for N = 100. The normalization function unit(x) from the Matlab Robotics Toolbox [7] is, for an n × 1 vector x, equal to x/√(xᵀx). The analytical function pfit was implemented as below:

function k = pfit(x,y)
n = length(x); sx = sum(x); sy = sum(y); sxx = x*x'; sxy = x*y';
sx3 = sum(x.^3); sx4 = sum(x.^4); sxxy = sum(x.^2.*y);
t2 = sx*sx; t7 = sx3*sx3; t9 = sx3*sxx; t12 = sxx*sxx;
den = (sx4*sxx*n - sx4*t2 - t7*n + 2.0*t9*sx - t12*sxx);
if abs(den) < 1e-5, k = 0; else
    t15 = 1/den; t21 = (sx3*n - sxx*sx)*t15;
    k = 2.0*n*((sxx*n-t2)*t15*sxxy - t21*sxy + (sx3*sx-t12)*t15*sy) - ...
        t21*sxxy + (sx4*n-t12)*t15*sxy + (t9-sx4*sx)*t15*sy;
end

In linear fitting, used for the calculation of k1 and k2, k is calculated by:

k = (n*sxy - sx*sy)/(n*sxx - sx^2);
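The analytical solution can be cross-checked against standard polynomial fitting. Below is a NumPy transcription of pfit (an illustrative port, not the thesis code) together with a comparison against numpy.polyfit, which returns [c2 c1 c0], so the tangent at x = N equals c1 + 2N·c2:

```python
import numpy as np

def pfit_np(x, y):
    # Analytical tangent c1 + 2*N*c2 of the least-squares quadratic at x = N
    n = len(x)
    sx = x.sum(); sy = y.sum(); sxx = x @ x; sxy = x @ y
    sx3 = (x**3).sum(); sx4 = (x**4).sum(); sxxy = (x**2 * y).sum()
    den = sx4*sxx*n - sx4*sx**2 - sx3**2*n + 2.0*sx3*sxx*sx - sxx**3
    if abs(den) < 1e-5:
        return 0.0
    # Cramer's-rule solutions of the 3x3 normal equations for c2 and c1
    c2 = ((sxx*n - sx**2)*sxxy - (sx3*n - sxx*sx)*sxy + (sx3*sx - sxx**2)*sy) / den
    c1 = (-(sx3*n - sxx*sx)*sxxy + (sx4*n - sxx**2)*sxy + (sx3*sxx - sx4*sx)*sy) / den
    return c1 + 2.0 * n * c2

x = np.arange(1.0, 11.0)            # N = 10 samples
y = 0.5 * x**2 - 3.0 * x + 2.0      # one quadratic trajectory component
c = np.polyfit(x, y, 2)             # coefficients [c2, c1, c0]
assert abs(pfit_np(x, y) - (c[1] + 2 * 10 * c[0])) < 1e-8
```

Both routines agree; the analytical form simply avoids solving the normal equations by Gauss elimination at every control cycle.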

3.5 Computational costs

The computational cost of the 6D seam tracking system based on laser scanning (without consideration of memory operations) was preliminarily estimated to less than 250 + 50N floating-point operations, which for N = 10 gives 750 operations per control cycle. At a rate of maximum 30 scans per second, this gives 22,500 floating-point operations per second.

3.6 Arc sensing weaving interpolation

The interpolation between PN−1 and PN is usually performed by vector interpolation of the position p and quaternion interpolation of the orientation n, o, a. Since most of the calculation time in arc sensing was spent on quaternion interpolation, a new method was adopted that empirically showed to work much faster, while still performing as well as quaternion interpolation in 6D seam tracking. The denotations in this subsection, generically labeled x′ and x′′, are local and not related to denotations elsewhere.

In this method, linear interpolation was first performed between PN−1 and PN, including o, a and p for any sample i, which resulted in o′i, a′′i and pi. o′i was rescaled to the unit vector oi, and a′′i was projected on the plane perpendicular to oi by:

a′i = a′′i − (a′′i · oi) oi    (18)

The dot product a′′i · oi gives the length of the vector a′′i orthogonally projected on oi. ai was finally calculated by rescaling a′i to the unit vector ai, and ni was calculated by oi × ai. The weaving motion was added as a second layer of motion to Pi in the ni direction. In robot simulation, the torch was allowed to rotate freely ±180° around a as a third layer of motion, which showed to be important, but not entirely imperative, for the robot to be able to perform 360° welding of pipe intersection joints, see Fig. 9. The angle around a was determined by the position of the robot with respect to the position of the workpiece.
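A compact NumPy sketch of this interpolation scheme (hypothetical names; poses stored as 4 × 4 matrices with columns [n o a p]) may look as follows:

```python
import numpy as np

def unit(v):
    return v / np.sqrt(v @ v)

def interp_pose(P_prev, P_next, t):
    # Linear interpolation of o, a and p for the fraction t in [0, 1],
    # followed by the projection of Eq. 18 and renormalization,
    # replacing quaternion interpolation.
    o = unit((1 - t) * P_prev[:3, 1] + t * P_next[:3, 1])
    a_pp = (1 - t) * P_prev[:3, 2] + t * P_next[:3, 2]
    a = unit(a_pp - (a_pp @ o) * o)   # Eq. 18: remove the component along o
    P = np.eye(4)
    P[:3, 0] = np.cross(o, a)         # n = o x a
    P[:3, 1], P[:3, 2] = o, a
    P[:3, 3] = (1 - t) * P_prev[:3, 3] + t * P_next[:3, 3]
    return P
```

The projection and renormalization guarantee that every interpolated frame remains orthonormal, which is the property the quaternion method would otherwise provide.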

3.7 Kinematics singularities

In sensor guided control, the manipulator may involuntarily move into areas in which no inverse kinematics solutions exist, generally called kinematics singularities. It is however possible to minimize inner singularity problems caused by robotic wrists. A novel method is suggested in [9] which was designed for the 6D seam tracking system, but works basically for any industrial robot with 5-6 degrees of freedom. In this method, the stronger motors, often at the base of the robot, assist the weaker wrist motors to compensate for any position error. This method also allows for smooth transitions between different configurations.


In addition to this method, to ensure good welding quality, pre-scanning and a trial run may be performed before welding in laser scanning. In the first step, low-speed seam tracking is performed to calculate the trajectory of the TCP, following the seam. In the second step, a trial run is performed at full speed to check the performance of the robot, and finally the actual welding is performed.

3.8 Initial and final conditions

In the experiments, seam tracking was initiated from a proper position already from the start. The start position for seam tracking is in general found either by image analysis using a camera or by seam tracking itself. Since previous positions are unknown at the start, an estimation is made of the previous N − 1 positions, as previously mentioned, by extrapolation of the trajectory in the direction of o. This is the reason why, in some experiments, higher orientation errors occurred at the beginning of seam tracking. The condition to stop seam tracking may be based on the position of the TCP. The stop signal may be triggered by the definition of volumes outside which no seam tracking is allowed.

4 Experimental results

4.1 Workpiece samples

The position and orientation errors at seam tracking were calculated based on simulation experiments of basically 8 different workpieces, see Figs. 10-18. The workpieces in Matlab were approximated by lines orthogonal to the direction of the weld, and the resolution was defined as the nominal distance between these lines. These resolutions were set to 0.02 mm in laser scanning and 0.5 mm in arc sensing.

The workpieces are categorized into two groups, continuous and discrete. In the experiments presented in this patent, the range was 0–180° for continuous, and 10-100 mm for discrete workpieces. In seam tracking of a continuous workpiece, a sudden change is interpreted as a curve with a small radius of curvature, see Figs. 10-11 and 15-16, while in seam tracking of a discrete workpiece a sudden change is regarded as a disturbance, see Figs. 13-14 and 17-18. The settings of the control parameters alone decide the mode of the control system: continuous or discrete. For seam tracking of objects with a large radius of curvature, the discrete mode works well for both continuous and discrete workpieces, see Fig. 12.

The control parameters K1, K2, K3 and N used in the presented experiments showed to work well for each category, continuous or discrete. These settings were found by more than 150 simulation experiments in Matlab, and large deviations from these parameters showed to create instability in the system, making the control components work against each other instead of in synergy, resulting in divergence from the seam and ultimate failure of seam tracking.

As mentioned previously, the seam tracking system is theoretically able to handle any angle α (in Figs. 1, 5 and 6) between 0° and 180°. In practice however, a small angle may cause collision between the torch and the workpiece. For simplicity, α was set to 90° for all experiments except for pipe intersections. Further, in arc sensing, d in Fig. 1 was set to 1 mm.

4.2 Laser scanning

The M-SPOT has a sweeping angle of 28°, an accuracy better than 0.02 mm at a distance of 100 mm, and 1.5 mm at a distance of 1 m. It is able to scan at a frequency of up to 40 Hz with a resolution of maximum 512 points per scan. The virtual model of the M-SPOT that was designed was set to have an accuracy of 0.02 mm and was mounted on the torch at a distance of approximately 80 mm from the workpiece. The resolution was set to 11 points per scan for a sweeping angle of 5° (the only exception was for the n direction steps, see Figs. 14 and 18, where 31 points per scan with a sweeping angle of 10° was used instead) and the scanning frequency to 30 Hz, which showed to be fully sufficient for successful seam tracking.

In fact, the simulation experiments proved that the accuracy of the M-SPOT was higher than needed for seam tracking. The accuracy is so high at close range (100 mm) that only robots with very high performance are able to use the accuracy of this sensor to its full extent when seam tracking is performed in real-time. The experiments presented in Figs. 10-14 show the limits of performance for the 6D seam tracking system (based on the M-SPOT) when it is on the verge of instability. For larger radii of curvature and a lower value of K2, i.e. for normal operation, Fig. 12 gives perhaps a better indication of the efficiency of the 6D seam tracking system. Since the performance of a robot depends on the current joint values, it is further not possible to translate these values directly to joint position, velocity and acceleration. What these experiments proved, however, was that the performance of the robot is the only limitation in seam tracking based on laser scanning.

The laser scanning simulations presented in this patent were performed with a constant welding speed of 10 mm/s and a virtual TCP, 2 mm away from the intersection point of the seam walls. Empirical experiments showed that by moving the TCP from previously D = 15 mm in Fig. 5 to only 2 mm, the control ability of the 6D seam tracking system was highly improved, especially in seam tracking of objects with a small radius of curvature. Though the parameter settings K1 = K2 = K3 = 1 showed to be unstable in laser scanning, K1 = 1, K2 = K3 = 0.8 showed to be relatively stable for continuous welds. For discrete workpieces, K2 was set as low as 0.1 to maximize control stability.

4.3 Arc sensing

According to [20], the accuracy of an arc sensor in seam tracking is 0.5 mm. In simulation, the workpiece line resolution was set to 0.5 mm. To increase the errors in the system, a random signal with a range of 1 mm was added to the measured distance. In addition, no integration of the current measurement values was made near the turning points of the weaving. On the contrary, one measurement was considered enough for each turning point. In the ROWER-2 project, the signal was prefiltered by an active 4th-order Bessel low-pass filter. In the simulation experiments presented in this patent, no filtering was performed. Thereby the accuracy was decreased to less than half of that reported in [20]. In practical applications however, the arc signal should be filtered by a method that does not add any phase shift to the signal, such as a proper digital filter.

The arc sensing simulations presented in this patent were performed with a constant welding speed of 10 mm/s, a weaving amplitude and frequency of 5 mm and 3 Hz, and 64 current samplings per weaving cycle (or 192 samplings per second). The algorithm worked well also for lower sampling rates, such as 50 samples per second, which was used in the ROWER-2 project. In general, higher sampling rates showed to improve the control ability of the system for rates above perhaps 10 samples per weaving cycle, though rather slightly than proportionally. The simulation models proved to predict the estimations derived from the physical experiments in the ROWER-2 project for 6D motion. According to those experiments, it was possible to perform differential seam tracking in the y and z directions in Fig. 1, up to 20 mm for a seam of length 1 m. At a speed of 9 mm per second and a weaving amplitude and frequency of 3 mm and 3 Hz, this gives a redirection capability of 1.15° per weaving cycle, which in the worst case is equivalent to a minimum radius of 150 mm. Using a power source without any intrinsic current control would further give up to 3 times better results, according to the experiments in the ROWER-2 project (current control was only applied by the power source to limit the current). Using a high-performance industrial robot and a weaving amplitude of 5 mm instead of 3, this would finally bring the minimum radius of curvature down to 25-50 mm, which was indeed the amount found by simulation.

5 Results and discussion

A 6D seam tracking system was designed and validated by simulation that is able to follow any continuous 3D seam with moderate curvature, using a laser scanner or an arc sensor, and is able to continuously correct both the position and the orientation of the tool-tip of the welding torch along the seam. Thereby the need for robot trajectory programming or geometrical databases was eliminated. This system was designed primarily to increase the intelligence in robotic arc welding, but is also applicable for instance in robotic laser welding.

Simulations based on the M-SPOT laser scanner showed that 6D seam tracking was theoretically possible for objects with a radius of curvature down to 2 mm. Seam tracking in real-time requires however a robot with very high performance. It is however possible to perform low-speed seam tracking without welding to find the geometry of the workpiece, and perhaps even to check that the trajectory is within the workspace of the robot, before the actual welding is performed. Thereby it will be possible to perform welding of very small objects, theoretically also including sharp edges.

Simulations based on the physical experiments in the ROWER-2 project showed that arc sensing was theoretically possible for a radius of curvature below 50 mm. This is however much less than the initial objective, to design a 6D seam tracking system for laser scanners and arc sensors that could manage a radius of curvature down to 200 mm, which is still considered small, for instance in shipbuilding. By performing "non-destructive" and low-speed seam tracking using low voltage and current in arc sensing, it is perhaps possible to seam track also workpieces with a very small radius of curvature, by calculation of the trajectory before the real welding is performed. Thereby, welding using weaving in arc sensing will no longer be imperative, but an option.

6 Future work

Though the 6D seam tracking system showed to be very robust, optimization of the algorithm is still possible. Presently, pitch control in arc sensing is based on data measurements during a half weaving cycle. By extending the measurements to one full cycle, pitch control may be optimized. Further on, though the control system worked well using proportional constants alone, PID or adaptive control may, if needed, be included to enhance the performance. One of many examples is to define K2 as a function of the "radius of curvature" (or more specifically as a function of c2 in Fig. 8) of the 2nd degree polynomial in trajectory tangent vector control, thereby enabling the system to automatically change between normal operation, used for maximum control stability, and extraordinary operation, giving the fast response which is essential for the management of very small radii of curvature.

References

[1] M. A. A. Arroyo Leon, A. Ruiz Castro, and R. R. Leal Ascencio. An artificial neural network on a field programmable gate array as a virtual sensor. In Design of Mixed-Mode Integrated Circuits and Applications, 1999. Third International Workshop on, pages 114–117, July 1999.

[2] G. Bolmsjö. Sensor systems in arc welding. Technical report, Robotics Group, Department of Production and Materials Engineering, Lund University, Sweden, 1997.

[3] G. Bolmsjö. Programming robot welding systems using advanced simulation tools. In Proceedings of the International Conference on the Joining of Materials, pages 284–291, Helsingør, Denmark, May 16-19 1999.

[4] H. Bruyninckx, J. De Schutter, and B. Allotta. Model-based constrained motion: a. modeling, specification and control. In Advanced Robotics, 1991. 'Robots in Unstructured Environments', 91 ICAR., Fifth International Conference on, volume 2, pages 976–981, 1991.

[5] H. Bruyninckx, J. De Schutter, and B. Allotta. Model-based constrained motion: b. introducing on-line identification and observation of the motion constraints. In Advanced Robotics, 1991. 'Robots in Unstructured Environments', 91 ICAR., Fifth International Conference on, volume 2, pages 982–987, 1991.

[6] G. E. Cook, K. Andersen, K. R. Fernandez, M. E. Shepard, and A. M. Wells. Electric arc sensing for robot positioning control. Robot Welding, pages 181–216. Editor: J. D. Lane. IFS Publications Ltd, UK, Springer-Verlag, 1987.

[7] P. Corke. A robotics toolbox for MATLAB. IEEE Robotics and Automation Magazine, 3(1):24–32, Mar. 1996.

[8] W. Doggett and S. Vazquez. An architecture for real-time interpretation and visualization of structural sensor data in a laboratory environment. In Digital Avionics Systems Conferences, 2000. Proceedings. DASC. The 19th, volume 2, pages 6D2/1–6D2/8, Oct. 2000.

[9] M. Fridenfalk. Eliminating position errors caused by kinematics singularities in robotic wrists for use in intelligent control and telerobotics applications. Patent application SE0203180-5, October 2002.

[10] M. Fridenfalk and G. Bolmsjö. The Unified Simulation Environment — Envision telerobotics and Matlab merged in one application. In the Proceedings of the Nordic Matlab Conference, pages 177–181, Oslo, Norway, October 2001.

[11] M. Fridenfalk and G. Bolmsjö. Design and validation of a sensor guided robot control system for welding in shipbuilding. Accepted for future publication in the International Journal for the Joining of Materials, 2002.

[12] M. Fridenfalk, U. Lorentzon, and G. Bolmsjö. Virtual prototyping and experience of modular robotics design based on user involvement. In the Proceedings of the 5th European Conference for the Advancement of Assistive Technology, Düsseldorf, Germany, Nov. 1999.

[13] M. Fridenfalk, M. Olsson, and G. Bolmsjö. Simulation based design of a robotic welding system. In Proceedings of the Scandinavian Symposium on Robotics 99, pages 271–280, Oulu, Finland, October 14-15 1999.

[14] G. Hailu and A. Zygmont. Improving tracking behavior with virtual sensors in the loop. In Aerospace Conference, 2001, IEEE Proceedings., volume 4, pages 1729–1737, 2001.

[15] N. Nayak and A. Ray. An integrated system for intelligent seam tracking in robotic welding. Part I. Conceptual and analytical development. In Robotics and Automation, 1990. Proceedings., 1990 IEEE International Conference on, volume 3, pages 1892–1897, 1990.

[16] N. Nayak and A. Ray. An integrated system for intelligent seam tracking in robotic welding. Part II. Design and implementation. In Robotics and Automation, 1990. Proceedings., 1990 IEEE International Conference on, volume 3, pages 1898–1903, 1990.

[17] M. Oosterom and R. Babuska. Virtual sensor for fault detection and isolation in flight control systems — fuzzy modeling approach. In Decision and Control, 2000. Proceedings of the 39th IEEE Conference on, volume 3, pages 2645–2650, Dec. 2000.

[18] H. Pasika, S. Haykin, E. Clothiaux, and R. Stewart. Neural networks for sensor fusion in remote sensing. In Neural Networks, 1999. IJCNN '99. International Joint Conference on, volume 4, pages 2772–2776, July 1999.

[19] P. Ridao, J. Battle, J. Amat, and M. Carreras. A distributed environment for virtual and/or real experiments for underwater robots. In Robotics and Automation, 2001. Proceedings 2001 ICRA. IEEE International Conference on, volume 4, pages 3250–3255, May 2001.

[20] M. Wilson. Vision systems in the automotive industry. Industrial Robot: An International Journal, 26(5):354–357, 1999.

Brief description of the drawings

1. Left: Definition of the Tool Center Point (TCP) and the orthonormal coordinate system n, o, a. Weaving, if any, is performed in the n direction in arc sensing, o is opposite to the direction of welding and perpendicular to the plane Ω, and a is the direction of approach. x, y, z is a local coordinate system, defined for the workpiece. Right: The optimal position for seam tracking in arc sensing.


2. The design of the robot system for 3D seam tracking based on arc sensing in the ROWER-2 project (left) was based on simulations in FUSE (right). As a spin-off effect after the completion of the ROWER-2 project, the 6D seam tracking system presented in this patent was developed.

3. Example of a filtered arc signal at seam tracking, sampled in the ROWER-2 project. Though anomalies are common at the ignition of the arc, only minor deviations from this pattern showed to occur during continuous seam tracking. The unit of the horizontal axis is ticks, where one tick equals 1/50 second. The current values were filtered by an analogue active 4th-order Bessel low-pass filter with f0 = 10 Hz before sampling.

4. The principal scheme of the 6D seam tracking control system. PN−1 and PN denote the current and next TCP. In laser scanning, PN may be calculated at every control sample. In arc sensing, the next position is calculated at the beginning of each weaving cycle, and the motion between PN−1 and PN is interpolated.

5. Cross section, plane Ω. Vector interpolation used for position and pitch control at laser scanning. The input from the laser scanner system is the intersection point of the seam walls v0 and the unit vector v1 on Ω, between the seam walls. D denotes the distance between v0 and p′.

6. Arc sensing. Left: Cross section, perpendicular to the approach vector a. The weaving is added to the interpolated motion between PN−2 and PN−1, including changes in position and orientation. Right: Cross section, plane Ω. By the measurement of the current from start to end, the profile of the workpiece is approximately determined.

7. Cross section, plane Ω, corresponding to the upper seam wall in Fig. 1. The measured distance from the TCP to the nearest wall is always perpendicular to the wall. The geometrical relation between the measured distance and the geometrical shape of the workpiece is calculated by the least-squares estimate of the coefficients k1 and k2 in Fig. 6. Note that since both the position and the orientation of the TCP are interpolated between PN−2 and PN−1, this figure is only an approximation of the trajectory between Pi∗ and Pi. Note also that γ < 0 for k > 0.

8. Trajectory tangent vector control is performed by the least squares curve fitting of the last N trajectory points to a 2nd degree polynomial curve for x, y and z, followed by vector interpolation. Here N = 5. Note that X, c0, c1 and c2 denote vector entities. The trajectory tangent vector F is equal to c1 + 2Nc2 for a 2nd degree polynomial.

9. Examples of 6D seam tracking experiments performed in Matlab using a laser scanner (left) and in FUSE, Envision, using a model of the ROWER-2 robot (right).


10. Seam tracking, laser scanning. Pipe intersection (top) and workpiece for isolating control around a (bottom). K1 = 1.0, K2 = K3 = 0.8, N = 10. Error denotes the difference between the desired and current pose.

11. Seam tracking, laser scanning. Isolating rotation around n, following inner and outer curves. Note that in this figure, the case at the bottom is not very realistic, but is only included for the completeness of the experiments. K1 = 1.0, K2 = K3 = 0.8, N = 10.

12. Seam tracking, laser scanning. The same as in the previous figure, except for K2 = 0.1 and r = 50 mm. This shows that K2 = 0.1 works fine as well for continuous seams with a moderate radius of curvature. Further, the same experiment was also performed with r = 20 mm, which showed to be basically identical to this figure, but with saturation at 12 degrees instead of 4. Note that while other laser scanning experiments were performed on the verge of instability to find the limits of the seam tracking system using the M-SPOT laser scanner, the system is highly stable at moderate curvatures using low values of K2.

13. Seam tracking, laser scanning. The step is introduced at x = 0 mm. K1 = 1.0, K2 = 0.1, K3 = 0.8, N = 10.

14. Seam tracking, laser scanning. In this figure, the case in the bottom denotes a step of −30° performed around o. K1 = 1.0, K2 = 0.1, K3 = 0.8, N = 10.

15. Seam tracking, arc sensing, with addition of random noise to the measured distance, as defined in the text. K1 = 1.0, K2 = 0.5, K3 = 0.25, N = 10.

16. Seam tracking, arc sensing, with addition of random noise to the measured distance. K1 = 1.0, K2 = 0.5, K3 = 0.25, N = 10.

17. Seam tracking, arc sensing, with addition of random noise to the measured distance. K1 = 0.5, K2 = 0.2, K3 = 0.1, N = 20.

18. Seam tracking, arc sensing, with addition of random noise to the measured distance. K1 = 0.5, K2 = 0.2, K3 = 0.1, N = 20.


Abstract

Robot automation increases productivity and frees man from involuntary and dangerous work. Though computational power has increased exponentially during the last decades, limitations in intelligence and flexibility still constitute a serious bottleneck in the further evolution of robotic systems.

This invention consists of a universal 6D seam tracking system that eliminates the need for robot trajectory programming or geometrical databases in robotic welding, thereby increasing the intelligence and flexibility of manufacturing systems. The system is able to follow any 3D spline seam in space with a relatively small radius of curvature by the real-time correction of the position and orientation of the welding torch, using a laser scanner in robotic welding (including both laser and arc welding) or an arc sensor in robotic arc welding.


Claims

1. Method for 6D seam tracking of seams following 3D spline curves using laser scanning or arc sensing, based on a system of three control components applied in any arbitrary order, consisting of (1) position control, limiting the motion between current and desired position by the control parameter K1, (2) trajectory tangent vector control, by calculation of the desired trajectory tangent vector F in Fig. 8 based on N trajectory points and the limitation of the motion between current and desired trajectory tangent vector by the control parameter K2, and (3) pitch control, limiting the motion between current and desired pitch by the control parameter K3. The parameters K1, K2 and K3 may be functions of any properties.

2. A rapid method replacing quaternion interpolation, distinguished by, in order, the interpolation of any two coordinate axes, the orthogonalization of one of these axes in relation to the other, and the calculation of the third axis, preferably by the cross product.
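One iteration of the three control components in claim 1 might be sketched as follows. This is a simplified linear blend with hypothetical function names; the claim allows the components in any order, and the thesis interpolates both position and orientation:

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def control_step(p, p_des, o, o_des, a, a_des, K1=1.0, K2=0.8, K3=0.8):
    """One sketched control iteration: K1, K2 and K3 each limit the
    motion between the current and the desired value of the position p,
    the trajectory tangent vector o and the pitch a, respectively."""
    p, p_des = np.asarray(p, float), np.asarray(p_des, float)
    o, o_des = np.asarray(o, float), np.asarray(o_des, float)
    a, a_des = np.asarray(a, float), np.asarray(a_des, float)
    p_new = p + K1 * (p_des - p)              # (1) position control
    o_new = normalize(o + K2 * (o_des - o))   # (2) trajectory tangent vector control
    a_new = normalize(a + K3 * (a_des - a))   # (3) pitch control
    # keep the frame orthogonal, then close it with the cross product
    a_new = normalize(a_new - np.dot(a_new, o_new) * o_new)
    n_new = np.cross(o_new, a_new)
    return p_new, n_new, o_new, a_new
```

With K1 = 1.0 the position jumps directly to the desired value; smaller values of K1, K2 and K3 damp the motion, which is the role they play in the experiments of Figs. 10 to 18.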
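Claim 2 can be illustrated by a minimal sketch, assuming linear interpolation of the axes (the function name is hypothetical): interpolate two axes, orthogonalize one against the other, and obtain the third by the cross product.

```python
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def interpolate_frame(o0, a0, o1, a1, t):
    """Rapid orientation interpolation per claim 2: interpolate two
    coordinate axes (here o and a), orthogonalize a with respect to o
    (one Gram-Schmidt step), and compute the third axis n = o x a."""
    o = normalize((1 - t) * np.asarray(o0, float) + t * np.asarray(o1, float))
    a = (1 - t) * np.asarray(a0, float) + t * np.asarray(a1, float)
    a = normalize(a - np.dot(a, o) * o)  # orthogonalization step
    n = np.cross(o, a)                   # third axis by the cross product
    return n, o, a
```

Compared with quaternion interpolation, this needs only vector additions, one dot product, one cross product and two normalizations per step, at the cost of exact constant angular velocity.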


Figures 1 and 2. [Diagrams: TCP with frame axes n, o, a; cross-section plane Ω; distances d, D; half-angle α/2; coordinate axes x, y, z; radius rw.]


Figure 3. [Plot: arc current (A) vs. ticks for a fillet weld sample.]

Figure 4. [Block diagram: position control (K1), trajectory tangent vector control (K2) and pitch control (K3) acting on the trajectory points p1 … pN and frames (nN, oN, aN), with unit delays z⁻¹ and the cross product o × a closing the frame.]

Figure 5. [Vector diagram: position correction K1(p′ − pN−1) and pitch correction K3(a′ − aN−1) between pN−1 and pN; vectors n′, a′, n′′, a′′, v0, v1, D and half-angle α/2.]

Page 226: Development of intelligent robot systems based on sensor control

210

Figure 6. [Geometry for the least-square estimate of k1 and k2: slopes n1, n2, distances s up to smax and n up to nmax, points PN−2, PN−1, Pi, frame axes ai, ni, oi, angle α, interval −A to A.]

Figure 7. [Cross-section geometry: k1 = −∂s/∂n = −sin γ1; components |si| sin γ1 · ni and |si| cos γ1 · ai of si between Pi and Pi∗; axes ni, ai, ni∗, ai∗ and angle γ1.]

Figure 8. [Curve fitting X = c0 + c1t + c2t² over the past positions pN−4 … pN−1, the current position pN and the next position; trajectory tangent vector F = ∂X/∂t at t = N; orientation correction K2(o′ − oN−1).]


Figure 9. [Screenshots: 6D seam tracking in Matlab with a laser scanner and in FUSE, Envision, with a model of the ROWER-2 robot.]

Figure 10. [Plots: angular error γ (deg) and position error ε (mm) vs. θ (deg), for r = 10 mm (top) and r = 2 mm (bottom), with R = 2r and θ = 0 at the start.]


Figure 11. [Plots: angular error γ (deg) and position error ε (mm) vs. θ (deg), for r = 5 mm (top) and r = 10 mm (bottom).]

Figure 12. [Plots: angular error γ (deg) and position error ε (mm) vs. θ (deg), for r = 50 mm.]


Figure 13. [Plots: angular error γ (deg) and position error ε (mm) vs. x (mm), for steps ∆a = 5 mm (top) and ∆a = −5 mm (bottom).]

Figure 14. [Plots: angular error γ (deg) and position error ε (mm) vs. x (mm), for a 30° rotation step (top) and a step ∆n = 5 mm (bottom).]


Figure 15. [Plots: angular error γ (deg) and position error ε (mm) vs. θ (deg), for r = 50 mm (top) and r = 25 mm (bottom), with R = 2r and θ = 0 at the start.]

Figure 16. [Plots: angular error γ (deg) and position error ε (mm) vs. θ (deg), for r = 25 mm (top) and r = 50 mm (bottom).]


Figure 17. [Plots: angular error γ (deg) and position error ε (mm) vs. x (mm), for steps ∆a = 5 mm (top) and ∆a = −5 mm (bottom).]

Figure 18. [Plots: angular error γ (deg) and position error ε (mm) vs. x (mm), for a 30° rotation step (top) and a step ∆n = 5 mm (bottom).]



List of Acronyms

API . . . . . . . . . . . . Application Programmer’s Interface

CAR . . . . . . . . . . . Computer Aided Robotics

DOF . . . . . . . . . . . Degrees of Freedom

FUSE . . . . . . . . . . Flexible Unified Simulation Environment

MP . . . . . . . . . . . . Envision Motion Pipeline

OOP . . . . . . . . . . . Object-Oriented Programming

RT . . . . . . . . . . . . Real-Time

SGRC . . . . . . . . . Sensor Guided Robot Control

TCP . . . . . . . . . . . Tool Center Point