


Contents lists available at ScienceDirect

Robotics and Computer–Integrated Manufacturing

journal homepage: www.elsevier.com/locate/rcim

Machine-vision-based identification of broken inserts in edge profile milling heads

Laura Fernández-Robles a,b,⁎, George Azzopardi b,c, Enrique Alegre a, Nicolai Petkov b

a Industrial and Informatics Engineering School, University of León, Spain
b Johann Bernoulli Institute for Mathematics and Computer Science, University of Groningen, The Netherlands
c Intelligent Computer Systems, University of Malta, Malta

A R T I C L E I N F O

Keywords:
Machine vision
Automatic identification
Edge milling
Tool breakage

A B S T R A C T

This paper presents a reliable machine vision system to automatically detect inserts and determine if they are broken. Unlike the machining operations studied in the literature, we are dealing with edge milling head tools for aggressive machining of thick plates (up to 12 centimetres) in a single pass. The studied cutting head tool is characterised by its relatively high number of inserts (up to 30), which makes the localisation of inserts a key aspect. The identification of broken inserts is critical for a proper tool monitoring system. In the method that we propose, we first localise the screws of the inserts and then we determine the expected position and orientation of the cutting edge by applying some geometrical operations. We compute the deviations from the expected cutting edge to the real edge of the inserts to determine if an insert is broken. We evaluated the proposed method on a new dataset that we acquired and made public. The obtained result (a harmonic mean of precision and recall of 91.43%) shows that the machine vision system that we present is effective and suitable for the identification of broken inserts in machining head tools and ready to be installed in an on-line system.

1. Introduction

Tool wear monitoring (TWM) systems have been widely developed over the last decades for the evaluation of the wear level of inserts, also known as cutting tools. In this paper, we present a method to identify broken inserts in a milling machine. This is an important application in the field of face milling, as broken inserts pose a threat to the stability of milling heads. An unnoticed broken insert may go on working without being detected, and can cause a decay of the quality of the final manufactured product or a breakage of the milling machine itself [1].

Fig. 1 shows the machine used in this study, which manufactures metal poles of wind towers. Milling is performed in a single pass across very thick and long plates (up to 12 cm and 42 m, respectively), which is not common in standard milling machines.

The current state of the art in TWM comprises two approaches known as direct and indirect methods. Indirect techniques can be applied while the machine is in operation. These methods evaluate the state of the inserts through measurable quantities (e.g. cutting forces and vibrations) that are typically affected by noisy signals [2–5]. On the contrary, direct techniques monitor the state of the inserts directly at the cutting edge when the head tool is in a resting position [6]. As to direct methods, image processing and computer vision techniques are

the most popular ways for measuring flank and crater wear [5]. Ongoing progress in the fields of machine vision and computing hardware has permitted the implementation of reliable on-line TWM systems [7,8]. The method that we present falls into the direct approach category. Many machine-vision-based systems have dealt with the detection of parts or imperfections in production environments [9–11].

The problem that we deal with here presents two challenges: (i) the localisation of inserts and their cutting edge and (ii) the identification of broken inserts. Fig. 2 shows a milling head tool that contains indexable inserts. In this case, each insert has four edges, with the cutting edge being the (nearly) vertical one on the left hand side.

1.1. Related work

There is a large body of work in the literature that evaluates the state of given inserts without having to localise them in images [2,12–14]. Other methods capture images directly on the head tool but they are focused on ball-end milling cutters [5] or microdrills [6,15,16], in which only two flutes and therefore two inserts are present. The remaining works deal with face milling heads, where it is easy to set up the acquisition system to capture only one insert per acquired image

http://dx.doi.org/10.1016/j.rcim.2016.10.004
Received 2 November 2015; Received in revised form 3 July 2016; Accepted 19 October 2016

⁎ Corresponding author at: Industrial and Informatics Engineering School, University of León, Spain.
E-mail addresses: [email protected] (L. Fernández-Robles), [email protected] (G. Azzopardi), [email protected] (E. Alegre), [email protected] (N. Petkov).

Robotics and Computer–Integrated Manufacturing 44 (2017) 276–283

0736-5845/ © 2016 Elsevier Ltd. All rights reserved.



[17–19]. In our application, the head tool contains 30 rhombohedral inserts, leading to 8–10 visible inserts per image, which makes the localisation of the inserts a new and challenging task. In our previous work [20], we introduced a method to localise inserts in images of such head tools, and here we improve it and propose a new method that evaluates the status of the inserts.

As to the identification of broken inserts, approaches based on texture analysis have been widely used in the literature for wear monitoring when dealing with machining operations [7]. Tool wear or the degree of tool wear has been determined by using GLCM-based texture analysis in turning and milling in [21–27]. Barreiro et al. [28] estimated three wear levels (low, medium, and high) of the tool insert by means of the Hu and Legendre invariant moments. Datta et al. [29] used two texture features based on Voronoi tessellation to describe the amount of flank wear from machined surface images. The machines that we are dealing with in this study, however, use an aggressive edge milling in a single pass for the machining of thick plates. This may cause the breakage of inserts, such as the examples marked with a blue rectangle in Fig. 2. Part of a cutting edge may be torn without harming the texture of the remaining part of the insert. For this reason we believe that texture features are not suitable for the concerned application.

Other approaches use the contours of the inserts to determine their state. For instance, Atli et al. [30] classified drilling tools as sharp or dull by applying a new measure, namely DEFROL (deviation from linearity), to the Canny-detected edges. Makki et al. [31] captured images of a drilling tool at 100 rpm rotation speed and used edge detection and segmentation methods to describe the tool wear as the deviation of the lip portion. Also, Chetan et al. [32] compared image areas of the tool obtained through a texture-based segmentation method before and after cutting in order to determine the state of a drilling tool. For turning operations, Shahabi and Ratnam [33] applied thresholding and subtraction of worn and unworn tool images to measure the nose worn regions.

Some papers also deal with micro-milling or end milling in this line of work. Otieno et al. [34] compared images captured before and after the usage of two-fluted micro and end mills, thresholded by an XOR operator. Neither edge detection, tool wear quantification nor wear classification was performed. Zhang and Zhang [5] also compared images of ball-end milling cutters before and after the machining process in order to monitor the state of the tool. Liang et al. [35] presented a method based on image registration and mutual information to recognise the change of nose radius of TiN-coated, TiCN-coated and TiAlN-coated carbide milling inserts for progressive milling operations. They performed logical subtraction of two images taken before and after milling. The mentioned works share one common requirement: they all must have an image of the intact tool to evaluate any discrepancies in a new image of the same tool.

In this paper we propose a novel algorithm that evaluates the state of cutting edges without requiring reference images of intact inserts. This avoids calibrating the system each time an insert is replaced and allows memory to be freed after each monitoring. It automatically determines the ideal position and orientation of the cutting edges in a given image and computes the deviation from the real cutting edges. This means that from a single image we can determine the broken inserts.

The paper is organised as follows. First we explain the method in Section 2. In Section 3 we present the publicly available dataset that we created for an edge profile shoulder milling head and describe the experiments that we carried out. In Section 4 we discuss the results and certain aspects of the proposed method, and finally we draw conclusions in Section 5.

2. Method

In the method that we propose, we first localise cutting edges in a given image, and then we classify every cutting edge as broken or unbroken. From the image analysis point of view, an unbroken insert is one which has a straight cutting edge (Fig. 3b), while a broken insert has a curved or uneven cutting edge (Fig. 3c).

Fig. 4 presents a schema of the proposed method. First we localise the inserts and the cutting edges and then we evaluate the inserts using a three-stage method: applying an edge-preserving smoothing filter, computing the gradient of the image and finally using geometrical properties of the edge to assess its state. Below we elaborate on each of these steps.

2.1. Detection of inserts and localisation of ideal cutting edges

We use the algorithm that we introduced in [20] to detect inserts and localise the respective cutting edges. We apply the contrast-limited adaptive histogram equalisation (CLAHE) method [36] in order to improve the contrast quality of the images and facilitate the detection of edges. In Section 3.2.1 we provide the parameter values that we used

Fig. 1. Machine tool for machining of metal poles of wind towers. (a) General view. (b) Detail of the milling head. (c) Close-up of the milling head. The rhomboidal inserts are fastened with screws.

Fig. 2. Head of an edge profile milling machine. The white rectangles mark intact inserts whereas the blue rectangles mark broken ones. Red line segments mark the ideal (intact) cutting edges. All markers have been provided by a human observer. (For interpretation of the references to color in this figure caption, the reader is referred to the web version of this paper.)



in our experiments.

In brief, the algorithm works by first applying a circular Hough transform (CHT) with a two-stage algorithm to detect the circular shapes of the screws. The CHT is a feature extraction technique for localising circles based on the standard Hough transform (SHT). The SHT [37] aims at detecting straight lines. The main idea of the Hough transform is to map a parametric curve (line, ellipse, circle) into a new feature space. For instance, in the SHT, the Cartesian coordinates of edge points are mapped into a 2-dimensional feature space, referred to as the accumulator array. The accumulator array is characterised by the slope of a linear equation in slope-intercept form on one axis and the intercept on the other axis. For each edge point in the given image, the cells of the accumulator array that satisfy the line parameters in the new feature space are incremented by one. In order to avoid numerical errors in the slope of vertical lines, a generalised Hough transform uses a polar coordinate system instead. In the case of the CHT algorithm, the Hough parameter space is defined by three variables, namely the two polar coordinates of the centre of the circle and its radius. After applying the CHT, we crop a rectangular area of 205 × 205 pixels centred around each detected screw on the images of size 1280 × 960 pixels (Fig. 3b and c). The dimensions of the cropped area are determined from the largest insert in the training images. For a particular head tool, all parameters can be estimated from the images of a small training set (in our case we use 10 images) and no further adjustment is required as long as the parameters of the camera are not modified.
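The CHT voting idea described above can be sketched in a few lines. The following is an illustrative pure-Python toy, not the two-stage implementation used in the paper; the function name, the fixed radius and the synthetic data are our own assumptions.

```python
# Toy circular Hough voting: each edge pixel (x, y) votes for candidate
# circle centres (x - r*cos(a), y - r*sin(a)) at a known radius r.
# The accumulator cell with the most votes is the detected centre.
import math
from collections import Counter

def hough_circle_votes(edge_points, radius, n_angles=360):
    acc = Counter()
    for (x, y) in edge_points:
        for k in range(n_angles):
            a = 2 * math.pi * k / n_angles
            cx = round(x - radius * math.cos(a))
            cy = round(y - radius * math.sin(a))
            acc[(cx, cy)] += 1
    return acc

# Synthetic test: edge points on a circle of radius 10 centred at (50, 40).
pts = [(50 + round(10 * math.cos(a)), 40 + round(10 * math.sin(a)))
       for a in [2 * math.pi * k / 36 for k in range(36)]]
acc = hough_circle_votes(pts, radius=10)
centre, votes = acc.most_common(1)[0]
```

In a real system the radius is unknown, so the accumulator gains a third dimension (the radius range), which is exactly the three-variable parameter space described above.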

We use Canny's method [38] to detect edges in the automatically cropped inserts. Then, we detect lines by applying an SHT to the resulting edge map. We use the known geometry of the inserts (widths, heights and angles) to determine which lines define the edges of the inserts. For a more detailed explanation, we refer the reader to [20]. We take the (nearly) vertical edge to the left of the screw to be the ideal cutting edge. Finally, for each localised insert, we consider a set I of Cartesian coordinates that form the ideal cutting edge:

I = {(x_t, y_t) | t = 1…u} (1)

where u is the number of locations of the ideal cutting edge of a localised insert.

We determine a region of interest (ROI) from the ideal cutting edge and the horizontal edges that are detected by the algorithm that we published in [20]. In Fig. 5a we show examples of ROIs in broken and unbroken inserts. An ROI is determined by considering two lines parallel to the ideal cutting edge, one 3 pixels to its left and the other one to its right at a distance of 0.7 times the space between the ideal cutting edge and the centre of the screw. Moreover, we consider a line parallel to the top edge 3 pixels towards the bottom and a line parallel to the bottom edge 3 pixels towards the top. From the resulting quadrilateral, we remove the segment of a circle (with a radius of 45 pixels) around the centre of the screw that coincides with the quadrilateral. Such an ROI is sufficient to evaluate the state of a cutting edge while ignoring possibly worn edges coming from the top or bottom parts of the insert as well as ignoring any texture coming from the screw. In the end, we consider a rectangle around the ROI with a 3-pixel-wide boundary and use it to crop the corresponding part of the image that contains the ROI (Fig. 5b). We also consider a mask defining the ROI in such a rectangular area (Fig. 5c).

2.2. Detection of real cutting edges

The heterogeneous texture and the low contrast of the insert with respect to the head tool make the detection of the real cutting edge an arduous task. If an edge detector were applied directly to the cropped images, many edges apart from the cutting edge would be recognised. In order to enhance the edge contrast, we apply the edge-preserving smoothing filter of Gastal and Oliveira [39]1 to the cropped region. We chose this approach due to its efficiency and its good performance. This filtering method smooths the heterogeneous texture of the insert and of the background but preserves the edges of the insert. The images in Fig. 5d show examples of the output of this algorithm.

After that, we apply Canny's method [38] to find edges by looking for local maxima of the gradient on the filtered region. Other edge detectors, such as the ones based on Sobel, Prewitt, Roberts and LoG, performed worse. Canny's algorithm computes the gradient after

Fig. 3. (a) Edges that define the rhomboid shape of a completely intact insert. (b) The green line segment marks the cutting edge of an intact insert. The white cross marks the centre of the screw. (c) The green line segment marks the “real” cutting edge of a broken insert. The red line segment marks the “ideal” cutting edge of the insert if it were intact. All markers are manually drawn. (For interpretation of the references to color in this figure caption, the reader is referred to the web version of this paper.)

Fig. 4. Outline of the proposed method.

1 Standard deviation of the spatial filter equals 60 and standard deviation of the range filter equals 0.4, as in the default configuration.



applying a Gaussian filter that reduces noise. Non-maximal suppression is applied to thin the edges. This is followed by hysteresis thresholding, which uses a low and a high threshold in order to keep the strong edges (above the high threshold) and only the weak edges (with a value between the low and high thresholds) that are connected to any of the strong ones. We show examples of Canny's gradient magnitude and binary edge maps in Fig. 5e and f. Finally, we only consider the edges within the ROI (Fig. 5g). For each localised insert, we define a set R of 3-tuples that represent the Cartesian coordinates (x_q, y_q) and the corresponding gradient magnitude value g_q of each location in the real cutting edge:

R = {(x_q, y_q, g_q) | q = 1…v} (2)

where v is the number of locations of the real cutting edge of the localised insert.

2.3. Measurement of deviations between real and ideal cutting edges

For a pair of coordinates (x_t, y_t) in the ideal set of edges I, we determine a set P_t of coordinates (x, y) and the corresponding gradient magnitudes g from the set of real edges R such that (x, y) lie on a line that passes through (x_t, y_t). The slope m of this line is the gradient of the top edge:

P_t = {(x, y, g) | y = m(x − x_t) + y_t, (x, y, g) ∈ R, (x_t, y_t) ∈ I} (3)

Examples of such lines are marked in blue in Fig. 6a. Next, we denote by E_t the set of Euclidean distances from (x_t, y_t) to all coordinates in the set P_t:

E_t = {√((x_t − x_p)² + (y_t − y_p)²) | (x_t, y_t) ∈ I, ∀(x_p, y_p) ∈ P_t} (4)

E_t could be an empty set. Let D be the set of minimum distances of E_t for each point t in I. D represents the minimum deviations between the ideal and real edges:

D = {min(E_t) | t = 1…|I|} (5)

Let G be the set of gradient magnitudes of the points in P_t with the minimum distance in E_t for each point in I:

G = {g_i | g_i ∈ P_t, i = argmin(E_t), t = 1…|I|} (6)
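The computation in Eqs. (3)–(6) can be sketched as follows. This is an illustrative reading, not the authors' code: the function name, the list-based data layout and the 0.5-pixel tolerance for deciding that a real-edge point lies on the line are our assumptions.

```python
# For each ideal-edge point (x_t, y_t), gather the real-edge points P_t lying
# on the line of slope m through it, and keep the distance and gradient
# magnitude of the nearest one (Eqs. 5 and 6). Empty P_t sets are skipped.
import math

def deviations(I, R, m, tol=0.5):
    # I: list of (x_t, y_t); R: list of (x_q, y_q, g_q); m: slope of top edge
    D, G = [], []
    for (xt, yt) in I:
        Pt = [(x, y, g) for (x, y, g) in R
              if abs(y - (m * (x - xt) + yt)) < tol]   # Eq. (3)
        if not Pt:                                     # E_t may be empty
            continue
        dists = [math.hypot(xt - x, yt - y) for (x, y, _) in Pt]  # Eq. (4)
        i = min(range(len(dists)), key=dists.__getitem__)
        D.append(dists[i])                             # Eq. (5)
        G.append(Pt[i][2])                             # Eq. (6)
    return D, G
```

For an unbroken insert the distances in D stay near zero; for a broken one they grow where the real edge recedes from the ideal line.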

In Fig. 6 we plot the values of the sets D and G.

We remove abnormal deviations that are usually caused by texture on the surface of the insert rather than by the cutting edge. For example, Fig. 6e presents two such abnormal deviations (spikes) at the

Fig. 5. (a) In yellow, ideal, top and bottom edges and a circle of radius 45 pixels centred at the centre of the detected screw. In red, definition of the ROI. In green, rectangle to crop. (b) Cropped region. (c) Mask defining the ROI in a cropped region. (d) Edge-preserving smoothed region. (e) Gradient magnitude map. (f) Edge map. (g) Result of multiplying the edge map by the mask. It represents the R set defined in Eq. (2). (h) R set in white and I set overlaid in red. The two top inserts are unbroken while the two bottom ones are broken. (For interpretation of the references to color in this figure caption, the reader is referred to the web version of this paper.)



beginning and end of the set D. We denote by f_{A,N}(D, t) a function that evaluates a neighbourhood of width A within the given set D centred at point t:

f_{A,N}(D, t) = { D_t if D_t ≤ N × median(D_{t−A}, …, D_{t+A}); ∅ otherwise } (7)

This function returns the element D_t if D_t does not exceed N times the median within a local window of width A; otherwise it returns ∅. We define a set D′ which is formed by applying the function f two consecutive times in order to remove spikes with a length of at most 3 points:

D′ = {f_{A₂,N₂}(f_{A₁,N₁}(D, t), t) | ∀t ∈ D} (8)
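The spike suppression of Eqs. (7)–(8) can be sketched as below. This is our reading under stated assumptions: A is taken as the half-width of the window, None stands in for ∅, and window positions near the borders are clipped.

```python
# Eq. (7): keep a deviation only if it does not exceed N times the median of
# its local neighbourhood; replace it by None (the ∅ symbol) otherwise.
# Eq. (8): apply the filter twice to remove spikes up to ~3 points long.
from statistics import median

def spike_filter(D, A, N):
    out = []
    for t, d in enumerate(D):
        if d is None:
            out.append(None)
            continue
        win = [D[j] for j in range(max(0, t - A), min(len(D), t + A + 1))
               if D[j] is not None]
        out.append(d if d <= N * median(win) else None)
    return out

D = [1.0, 1.2, 9.0, 1.1, 1.0]                  # one obvious spike at index 2
D_prime = spike_filter(spike_filter(D, A=3, N=1.5), A=3, N=1.5)
```

The isolated spike is removed while the surrounding, mutually consistent deviations survive both passes.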

The first insert in Fig. 6 shows a typical problem in this application. The lower part of its cutting edge has low contrast and as a result the corresponding edge points have very low gradient magnitudes. We are only interested in evaluating the parts along the cutting edge that have high contrast because they are more reliable. Formally, we define a new set D″ whose elements are copied from the set D′ when the corresponding edge points have gradient magnitudes higher than a threshold B, otherwise they are set to ∅:

D″ = {g_t ≥ B → d_t ∧ g_t < B → ∅ | d_t ∈ D′, ∀g_t ∈ G} (9)

In order to ensure that an insert is broken, the deviation should be sufficiently high along a region of the cutting edge and not just in one isolated pixel. We apply a mean filter using a window of width C and subsequently take the maximum deviation d̄ of the cutting edge:

Fig. 6. Two examples of deviation and gradient magnitude computation. (a)–(e) and (f)–(j) correspond to the second and third rows in Fig. 5, respectively. (a) and (f) Edge maps. (b) and (g) Gradient magnitude maps. (c) and (h) In white the edge maps, overlapped in red the detected ideal cutting edge and in blue examples of analysed lines y as in Eq. (3). (d) and (i) In blue deviations along the detected ideal cutting edge D, in green deviations after spike and low-contrast elimination D″ and in magenta deviations after mean filtering. (e) and (j) In dark green gradient magnitudes along the detected ideal cutting edge and in red an example of threshold B=0.2. (For interpretation of the references to color in this figure caption, the reader is referred to the web version of this paper.)



d̄ = max { (1/(2C + 1)) ∑_{j=−C}^{+C} d_{t+j} | ∀d_t ∈ D″ } (10)

Moreover, we also compute the mean gradient magnitude ḡ along the cutting edge:

ḡ = (1/|G|) ∑_{t=1}^{|G|} g_t, ∀g_t ∈ G (11)

As a result, every localised insert is represented by the two parameter values d̄ and ḡ.
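The two descriptors of Eqs. (10)–(11) can be sketched as follows; an illustrative implementation under our assumptions (C as the half-width of the mean-filter window, None for removed elements, border windows clipped), not the authors' Matlab code.

```python
# Mean-filter the surviving deviations D'' with a window of half-width C and
# take the maximum as d_bar (Eq. 10); average the gradient magnitudes of the
# matched real-edge points as g_bar (Eq. 11).
def descriptors(D2, G, C):
    g_bar = sum(G) / len(G)
    smoothed = []
    for t, d in enumerate(D2):
        if d is None:
            continue
        win = [D2[j] for j in range(max(0, t - C), min(len(D2), t + C + 1))
               if D2[j] is not None]
        smoothed.append(sum(win) / len(win))
    d_bar = max(smoothed) if smoothed else 0.0
    return d_bar, g_bar
```

The mean filter makes d̄ respond only to deviations sustained over a stretch of the edge, exactly the "not just one isolated pixel" requirement above.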

2.4. Classification of inserts

The evaluation of inserts is performed during the resting state of the milling head tool between the processing of two metallic plates. Plates are machined in a single pass. We set up a capturing system at this location as shown in Fig. 7a, with a fixed camera, while the head tool makes 24 rotations of 15° each. For every rotation we take a picture of the head tool. The same insert is captured in several images (between 7 and 10) under different poses as the head tool rotates (Fig. 8). In this work, the correspondences of the same insert in multiple images are manually labelled. In Section 4 we provide a suggestion of how the correspondence issue can be handled automatically. For each insert we compute the maximum deviation d̄ and the mean gradient ḡ for every image where it is detected.

We classify an insert as broken if the image with the highest mean gradient magnitude ḡ along the cutting edge has a maximum deviation d̄ higher than a threshold T, or if the maximum deviations of at least two images (irrespective of the mean gradient magnitude) are greater than T. Otherwise we classify the insert as unbroken. Formally, we define the classification function z as:

z(e) = { 1 if (d̄_{h*} > T) ∨ (∑_{h=1}^{r} (d̄_h > T) ≥ 2), with h* = argmax_{h=1…r} {ḡ_h}; 0 otherwise } (12)

where r is the number of images where the same insert e is detected.
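The decision rule of Eq. (12) is a one-liner in practice. A minimal sketch, assuming per-view lists of d̄ and ḡ values for one insert (the function name is ours):

```python
# Eq. (12): an insert is broken (1) if its most trustworthy view (highest mean
# gradient magnitude) deviates by more than T, or if at least two views do so
# regardless of their gradient magnitude; otherwise it is unbroken (0).
def classify_insert(d_bars, g_bars, T):
    h_star = max(range(len(g_bars)), key=g_bars.__getitem__)
    if d_bars[h_star] > T:
        return 1
    if sum(d > T for d in d_bars) >= 2:
        return 1
    return 0
```

Requiring either the best-contrast view or two independent views to agree makes the decision robust to a single noisy acquisition.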

3. Evaluation

3.1. Dataset

To the best of our knowledge, there are no publicly available image datasets of milling cutting heads in the literature. For this reason, we acquired a new dataset with ground truth and published it on-line.2

It is made up of 144 images of an edge profile cutting head used in a computer numerical control (CNC) milling machine. We conducted experiments on a TECOI TRF milling machine, using a bevelling tool by Kennametal.3 The inserts used were of rhomboid type, fastened by screws, without chip breaking and with a rake angle of 0. The head tool, with cylindrical shape, contains 30 inserts in total, of which 7 to 10 inserts are seen in each image of the dataset. The inserts are arranged in 6 groups of 5 inserts diagonally positioned along the axial direction of the tool perimeter, as illustrated in Fig. 7b. The last insert of each group is vertically aligned with the first insert of the following group. This gives a total of 24 different positions along the radial perimeter of the milling head tool in which at least one insert is aligned with the camera, in intervals of 15°.

We created the dataset following an iterative process. We mounted 30 inserts in the head tool and took 24 images of the head tool in different orientations that differ by 15°. We repeated this process 6 times, each time with a different set of inserts, thus collecting (6 × 24 =) 144 images that contain (6 × 30 =) 180 unique inserts, of which 19 are broken and 161 are unbroken. All inserts that we used to acquire this dataset were taken after some milling operations by the same machine.

We used a monochrome Genie M1280 1/3″ camera with a pixel size of 3.75 μm, an active resolution of 1280 × 960 pixels and a fixed C-mount AZURE-2514MM lens with a focal length of 25 mm and 2/3″ format. Two compact bar-shaped structures with high-intensity red LED arrays, BDBL-R(IR)82/16H, were used to enhance the image capturing capability and intensify the lighting on the edges. The milling machine that we used to create our dataset does not use oils, lubricants or other

Fig. 7. (a) Diagram representing a schema of the arrangement of inserts on a cylindrical milling head tool depicted opened up as a rectangle. Squares denote the inserts. The vertical dashed line shows the alignment between adjacent groups of inserts. (b) Front (top) and side (bottom) view of the capturing system. Measurements are in centimetres.

Fig. 8. The numbers indicate the ground truth labels of each cutting edge along three consecutive images of the dataset. Consecutive images are taken by rotating the milling head tool by 15°. A cutting edge present in different images is labelled with the same number.



kind of substances that can cause a filthy tool.

Together with the dataset, we provide the corresponding ground truth masks of all ideal cutting edges along with the labels of the state of the inserts (broken or unbroken). Moreover, we labelled each distinct insert by giving it a unique identification number. In Fig. 8, we show three consecutive images that contain the same inserts (with the same identification numbers) in different locations and poses due to the rotation of the milling head tool in steps of 15°.

3.2. Experiments and results

We used Matlab on a personal computer with a 2 GHz processor and 8 GB RAM. The complete process to identify broken inserts in a head tool with 30 inserts takes less than 3 min. This is sufficient for the application at hand because, according to the consulted experts, the milling head tool stays in a resting position for between 5 and 30 min, during which the milled plate is replaced by a new one.

3.2.1. Localisation of ideal cutting edges

In this work we improve the localisation of ideal cutting edges by better fine-tuning the parameters described in [20]. In particular, in order to determine the cutting edge in the Hough transform map, we consider the lines that deviate by at most 22° from the vertical orientation and choose the one which is at least 47 pixels to the left of the centre. Similarly, for the detection of the top and bottom edges, we consider the lines that deviate by at most 11° from the horizontal orientation and choose the ones that are at least 52 pixels away from the centre. With this new selection of parameter values we improve the localisation accuracy from 98.84% to 99.61%.
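The line-selection step described above can be sketched as follows. This is an illustrative Python sketch (the paper's implementation is in Matlab), and the line representation used here — deviation angle from the vertical, x-intercept at the insert centre row, and Hough vote count — is an assumption made for illustration, not the paper's exact data structure.

```python
def select_cutting_edge(lines, centre_x, max_dev_deg=22.0, min_left_px=47.0):
    """Pick the cutting-edge line among Hough candidates.

    Each line is a tuple (angle_deg, x_at_centre_row, votes), where
    angle_deg is the deviation from the vertical orientation and
    x_at_centre_row is where the line crosses the row through the
    insert centre. Near-vertical lines lying at least min_left_px
    pixels to the left of the centre are kept; the strongest Hough
    peak among them is returned (None if there is no candidate).
    """
    candidates = [ln for ln in lines
                  if abs(ln[0]) <= max_dev_deg
                  and centre_x - ln[1] >= min_left_px]
    return max(candidates, key=lambda ln: ln[2]) if candidates else None
```

The top and bottom edges would be selected analogously, with the 11° deviation from the horizontal and the 52-pixel distance from the centre.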

Since we operate in a controlled environment (fixed camera and fixed resting position of the head tool), this set of parameters needs to be fixed only once for every given milling machine.

3.2.2. Classification of inserts

Our dataset is skewed, with 19 broken inserts and 161 unbroken ones. We refer to the broken inserts as the positive class and the unbroken ones as the negative class. Therefore, a true positive (TP) is a broken insert classified as broken; a false positive (FP) is an unbroken insert classified as broken; and a false negative (FN) is a broken insert classified as unbroken. We compute the precision P = TP/(TP + FP), the recall R = TP/(TP + FN) and their harmonic mean F = 2PR/(P + R) for a set of thresholds T ∈ {5, 5.01, …, 8} used in the classification function, and obtain a P–R curve. We consider the best pair (P, R) to be the one that contributes to the maximum harmonic mean.
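The threshold sweep and the choice of the best operating point can be written compactly. The following Python sketch is illustrative (the paper's code is in Matlab, and the helper names are hypothetical); it assumes each insert has a deviation score that is compared against the threshold T to declare it broken.

```python
def pr_curve(scores, labels, thresholds):
    """For each threshold T, classify an insert as broken (positive)
    when its deviation score exceeds T, and record (T, P, R, F)."""
    curve = []
    for t in thresholds:
        tp = sum(1 for s, y in zip(scores, labels) if s > t and y)
        fp = sum(1 for s, y in zip(scores, labels) if s > t and not y)
        fn = sum(1 for s, y in zip(scores, labels) if s <= t and y)
        p = tp / (tp + fp) if tp + fp else 0.0       # precision
        r = tp / (tp + fn) if tp + fn else 0.0       # recall
        f = 2 * p * r / (p + r) if p + r else 0.0    # harmonic mean
        curve.append((t, p, r, f))
    return curve

# T ranges over {5, 5.01, ..., 8}
thresholds = [5 + 0.01 * i for i in range(301)]

def best_operating_point(curve):
    # the best (P, R) pair is the one maximising the harmonic mean F
    return max(curve, key=lambda entry: entry[3])
```

For example, `best_operating_point(pr_curve(scores, labels, thresholds))` returns the threshold and its (P, R, F) values at the peak of the P–R curve.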

We apply a repeated random sub-sampling validation where in each run we randomly split the dataset (with stratified sampling) into training (70%) and validation (30%) subsets. For each such split, we use the training data to determine the set of parameters (A1, N1, A2, N2, B, C) that achieves the global maximum harmonic mean F. This is obtained by applying a grid search on A1 ∈ {3, 5, 7}, N1 ∈ {1, 1.5, 2}, A2 ∈ {3, 5, 7}, N2 ∈ {1, 1.25, 1.5}, B ∈ {0.18, 0.2, 0.22} and C ∈ {3, 5, 7} and computing the maximum harmonic mean for each combination. If several sets of parameters yield the same harmonic mean, we take a random one. In Fig. 9, we show the P–R curves obtained for two combinations of parameters of the grid search by varying the threshold T.
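The stratified split and the exhaustive grid search can be sketched as below in Python (the paper used Matlab; the function names and the `evaluate` callback are hypothetical stand-ins for the training-set evaluation):

```python
import itertools
import random

# Grid of candidate parameter values from the paper's search space
GRID = {"A1": [3, 5, 7], "N1": [1, 1.5, 2], "A2": [3, 5, 7],
        "N2": [1, 1.25, 1.5], "B": [0.18, 0.2, 0.22], "C": [3, 5, 7]}

def stratified_split(labels, train_frac=0.7, rng=None):
    """Return (train_idx, val_idx) preserving the class proportions."""
    rng = rng or random.Random(0)
    train, val = [], []
    for cls in set(labels):
        idx = [i for i, y in enumerate(labels) if y == cls]
        rng.shuffle(idx)
        cut = round(train_frac * len(idx))
        train += idx[:cut]
        val += idx[cut:]
    return train, val

def grid_search(evaluate, grid=GRID):
    """evaluate(params) returns the maximum harmonic mean F reached
    over the threshold sweep on the training split for that parameter
    combination; the best-scoring combination is returned."""
    best_f, best_params = -1.0, None
    keys = sorted(grid)
    for combo in itertools.product(*(grid[k] for k in keys)):
        params = dict(zip(keys, combo))
        f = evaluate(params)
        if f > best_f:
            best_f, best_params = f, params
    return best_params, best_f
```

With the 19/161 class split of the dataset, a 70/30 stratified split keeps roughly 13 broken inserts for training and 6 for validation in each run.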

The determined set of parameters is then used to evaluate the validation set. We repeat this process 20 times and finally we average the results obtained from the validation sets. We obtain an average harmonic mean F = 0.9143 (± 0.079) with a precision P = 0.9661 (± 0.073) and a recall R = 0.8821 (± 0.134). The most repeated (6 out of 20 runs) set of parameters in the training is (A1 = 5, N1 = 1.5, A2 = 3, N2 = 1, B = 0.2, C = 5). When we evaluate the entire dataset with these parameter values, we achieve a precision P = 1 and a recall R = 0.95 for the maximum harmonic mean F = 0.9744.

4. Discussion

The contribution of this work is threefold. First, we performed an effective classification of the inserts according to the state of their cutting edges as broken or unbroken. Second, we presented a dataset of 144 images of a rotating edge milling cutting head that contains 30 inserts, analysing 180 inserts in total. It contains the ground truth information about the locations of the cutting edges, and broken inserts are labelled by experts. We made our dataset publicly available (see footnote 2). Third, we improved the selection of parameters for the localisation of multiple inserts and cutting edges presented in [20].

The high precision and recall rates that we achieved demonstrate the effectiveness of the proposed approach and suggest that it is ready to be applied in production. The performance can be further improved by using more appropriate illumination conditions and better quality lenses in order to obtain higher contrast between inserts and background.

Typically, an insert appears in 7–10 images in different positions and poses. In this work, the ground truth contains the identification numbers of the inserts in all images. This means that an insert that appears multiple times is manually given the same identification number. Alternatively, the approximate position of inserts in the consecutive images can be inferred from the radius of the head tool cylinder, the distance of the fixed camera from the head tool and the degrees of rotation. In this way, after automatically detecting the positions of inserts, we can automatically determine the correspondences (labelling) according to the expected positions.
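Under a pinhole camera model, the expected image position of an insert can be approximated from the cylinder geometry described above. The sketch below is illustrative only: the pinhole approximation is ours, and all numeric arguments (cylinder radius, camera distance, focal length in pixels) are hypothetical placeholders, not values from the paper.

```python
import math

def projected_x(theta_deg, radius, cam_dist, focal_px):
    """Approximate horizontal image coordinate (pixels, relative to
    the image centre) of an insert sitting at angle theta on the
    head-tool cylinder, for a camera looking at the cylinder axis
    from distance cam_dist (same length unit as radius).
    theta = 0 corresponds to the insert facing the camera."""
    th = math.radians(theta_deg)
    return focal_px * radius * math.sin(th) / (cam_dist - radius * math.cos(th))

def expected_shift(theta_deg, radius, cam_dist, focal_px, step_deg=15.0):
    """Displacement of the same insert between two consecutive images,
    taken before and after rotating the head tool by step_deg."""
    return (projected_x(theta_deg + step_deg, radius, cam_dist, focal_px)
            - projected_x(theta_deg, radius, cam_dist, focal_px))
```

Matching a detected insert to the position predicted by `expected_shift` would then yield the correspondence (labelling) automatically.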

Inserts on the sides of an image appear skewed due to the cylindrical shape of the milling head tool. This, however, has little effect on the classification of an insert because the decision is based on at least 2 (out of the 7–10) images where the insert is visible. In most of such images, the inserts appear roughly perpendicular to the camera, allowing us to take reliable decisions.

In this work we are concerned with detecting broken inserts, as this is the most critical evaluation for the stability of the milling head tool. In the future, we will also evaluate the wear of inserts in order to detect the weak ones as early as possible. Moreover, we would also like to compare the performance of different image acquisition methods.

In addition, the proposed method can be set up for different machining heads that contain polygonal inserts fastened by screws, a typical design in milling machines.

Fig. 9. P–R curves of the results obtained for two combinations of parameters by varying the threshold T. The solid circles indicate the best pair (P, R) that contributes to the maximum harmonic mean in each curve. a: {A1 = 5, N1 = 1, A2 = 7, N2 = 1.5, B = 0.18, C = 3} and b: {A1 = 5, N1 = 1.5, A2 = 3, N2 = 1, B = 0.2, C = 5}.

2 http://pitia.unileon.es/varp/node/395
3 http://www.kennametal.com




5. Conclusions

The approach that we propose for the identification of broken inserts (cutting tools) in milling machines achieves a harmonic mean of precision and recall equal to 0.9143 (± 0.079) and can analyse all inserts in a milling head tool (24 images of size 1280 × 960 pixels) in less than 3 minutes on a 2 GHz processor. It is based on computer vision methods and does not require comparison with reference images of intact inserts. To our knowledge, this is the first automatic solution for the identification of broken inserts in edge profile milling heads. The presented system can be set up on-line and it can be applied while the milling head tool is in a resting position, without delaying any machining operations. This system greatly reduces the risk of head tool collapse, as a collapsed head tool is very expensive and time consuming to replace.

Acknowledgements

This work has been supported by grant DPI2012-36166 from the Spanish Government. We thank the company TECOI for providing us with an edge profile milling head tool and the inserts to create our dataset and evaluate the proposed system.

References

[1] T. Kalvoda, Y.-R. Hwang, A cutter tool monitoring in machining process using Hilbert–Huang transform, Int. J. Mach. Tools Manuf. 50 (5) (2010) 495–501. http://dx.doi.org/10.1016/j.ijmachtools.2010.01.006.

[2] J. Jurkovic, M. Korosec, J. Kopac, New approach in tool wear measuring technique using CCD vision system, Int. J. Mach. Tools Manuf. 45 (9) (2005) 1023–1030. http://dx.doi.org/10.1016/j.ijmachtools.2004.11.030.

[3] S. Kurada, C. Bradley, A review of machine vision sensors for tool condition monitoring, Comput. Ind. 34 (1) (1997) 55–72. http://dx.doi.org/10.1016/S0166-3615(96)00075-9.

[4] W. Wang, Y. Wong, G. Hong, Flank wear measurement by successive image analysis, Comput. Ind. 56 (8–9) (2005) 816–830. http://dx.doi.org/10.1016/j.compind.2005.05.009 (Machine Vision Special Issue).

[5] C. Zhang, J. Zhang, On-line tool wear measurement for ball-end milling cutter based on machine vision, Comput. Ind. 64 (6) (2013) 708–719. http://dx.doi.org/10.1016/j.compind.2013.03.010.

[6] T. Pfeifer, L. Wiegers, Reliable tool wear monitoring by optimized image and illumination control in machine vision, Measurement 28 (3) (2000) 209–218. http://dx.doi.org/10.1016/S0263-2241(00)00014-2.

[7] S. Dutta, S. Pal, S. Mukhopadhyay, R. Sen, Application of digital image processing in tool condition monitoring: a review, CIRP J. Manuf. Sci. Technol. 6 (3) (2013) 212–232. http://dx.doi.org/10.1016/j.cirpj.2013.02.005.

[8] Y. Kwon, T.-L.B. Tseng, Y. Ertekin, Characterization of closed-loop measurement accuracy in precision CNC milling, Robot. Comput.-Integr. Manuf. 22 (4) (2006) 288–296. http://dx.doi.org/10.1016/j.rcim.2005.06.002.

[9] D.-M. Tsai, M.-C. Lin, Machine-vision-based identification for wafer tracking in solar cell manufacturing, Robot. Comput.-Integr. Manuf. 29 (5) (2013) 312–321. http://dx.doi.org/10.1016/j.rcim.2013.01.009.

[10] M. Dinham, G. Fang, Detection of fillet weld joints using an adaptive line growing algorithm for robotic arc welding, Robot. Comput.-Integr. Manuf. 30 (3) (2014) 229–243. http://dx.doi.org/10.1016/j.rcim.2013.10.008.

[11] H. Golnabi, A. Asadpour, Design and application of industrial machine vision systems, Robot. Comput.-Integr. Manuf. 23 (6) (2007) 630–637. http://dx.doi.org/10.1016/j.rcim.2007.02.005 (16th International Conference on Flexible Automation and Intelligent Manufacturing).

[12] M. Castejón, E. Alegre, J. Barreiro, L. Hernández, On-line tool wear monitoring using geometric descriptors from digital images, Int. J. Mach. Tools Manuf. 47 (12–13) (2007) 1847–1853. http://dx.doi.org/10.1016/j.ijmachtools.2007.04.001.

[13] T. Lim, M. Ratnam, Edge detection and measurement of nose radii of cutting tool inserts from scanned 2-D images, Optics Lasers Eng. 50 (11) (2012) 1628–1642. http://dx.doi.org/10.1016/j.optlaseng.2012.05.007.

[14] G. Xiong, J. Liu, A. Avila, Cutting tool wear measurement by using active contour model based image processing, in: 2011 International Conference on Mechatronics and Automation (ICMA), 2011, pp. 670–675. http://dx.doi.org/10.1109/ICMA.2011.5985741.

[15] J. Su, C. Huang, Y. Tarng, An automated flank wear measurement of microdrills using machine vision, J. Mater. Process. Technol. 180 (1–3) (2006) 328–335. http://dx.doi.org/10.1016/j.jmatprotec.2006.07.001.

[16] J.-H. Kim, D.-K. Moon, D.-W. Lee, J.S. Kim, M.C. Kang, K.H. Kim, Tool wear measuring technique on the machine using CCD and exclusive jig, J. Mater. Process. Technol. 130–131 (2002) 668–674. http://dx.doi.org/10.1016/S0924-0136(02)00733-1.

[17] M. Sortino, Application of statistical filtering for optical detection of tool wear, Int. J. Mach. Tools Manuf. 43 (5) (2003) 493–497. http://dx.doi.org/10.1016/S0890-6955(02)00266-3.

[18] W. Wang, G. Hong, Y. Wong, Flank wear measurement by a threshold independent method with sub-pixel accuracy, Int. J. Mach. Tools Manuf. 46 (2) (2006) 199–207. http://dx.doi.org/10.1016/j.ijmachtools.2005.04.006.

[19] W. Weis, Tool wear measurement on basis of optical sensors, vision systems and neuronal networks (application milling), in: WESCON/'93 Conference Record, 1993, pp. 134–138. http://dx.doi.org/10.1109/WESCON.1993.488423.

[20] L. Fernández-Robles, G. Azzopardi, E. Alegre, N. Petkov, Cutting edge localisation in an edge profile milling head, in: Computer Analysis of Images and Patterns, Lecture Notes in Computer Science, vol. 9257, Springer International Publishing, Switzerland, 2015, pp. 336–347.

[21] D. Kerr, J. Pengilley, R. Garwood, Assessment and visualisation of machine tool wear using computer vision, Int. J. Adv. Manuf. Technol. 28 (7–8) (2006) 781–791. http://dx.doi.org/10.1007/s00170-004-2420-0.

[22] S. Dutta, S. Pal, R. Sen, Tool condition monitoring in turning by applying machine vision, ASME J. Manuf. Sci. Eng. 138 (5) (2016) 051008. http://dx.doi.org/10.1115/1.4031770.

[23] S. Dutta, S.K. Pal, R. Sen, On-machine tool prediction of flank wear from machined surface images using texture analyses and support vector regression, Precis. Eng. 43 (2016) 34–42. http://dx.doi.org/10.1016/j.precisioneng.2015.06.007.

[24] N.N. Bhat, S. Dutta, T. Vashisth, S. Pal, S.K. Pal, R. Sen, Tool condition monitoring by SVM classification of machined surface images in turning, Int. J. Adv. Manuf. Technol. 83 (9) (2016) 1487–1502. http://dx.doi.org/10.1007/s00170-015-7441-3.

[25] S. Dutta, A. Kanwat, S. Pal, R. Sen, Correlation study of tool flank wear with machined surface texture in end milling, Measurement 46 (10) (2013) 4249–4260. http://dx.doi.org/10.1016/j.measurement.2013.07.015.

[26] S. Dutta, A. Datta, N.D. Chakladar, S. Pal, S. Mukhopadhyay, R. Sen, Detection of tool condition from the turned surface images using an accurate grey level co-occurrence technique, Precis. Eng. 36 (3) (2012) 458–466. http://dx.doi.org/10.1016/j.precisioneng.2012.02.004.

[27] M. Danesh, K. Khalili, Determination of tool wear in turning process using undecimated wavelet transform and textural features, Procedia Technol. 19 (2015) 98–105. http://dx.doi.org/10.1016/j.protcy.2015.02.015 (8th International Conference Interdisciplinarity in Engineering, INTER-ENG 2014, 9–10 October 2014, Tirgu Mures, Romania).

[28] J. Barreiro, M. Castejón, E. Alegre, L. Hernández, Use of descriptors based on moments from digital images for tool wear monitoring, Int. J. Mach. Tools Manuf. 48 (9) (2008) 1005–1013. http://dx.doi.org/10.1016/j.ijmachtools.2008.01.005.

[29] A. Datta, S. Dutta, S. Pal, R. Sen, Progressive cutting tool wear detection from machined surface images using Voronoi tessellation method, J. Mater. Process. Technol. 213 (12) (2013) 2339–2349. http://dx.doi.org/10.1016/j.jmatprotec.2013.07.008.

[30] A.V. Atli, O. Urhan, S. Ertürk, M. Sönmez, A computer vision-based fast approach to drilling tool condition monitoring, Proc. Inst. Mech. Eng. Part B: J. Eng. Manuf. 220 (9) (2006) 1409–1415. http://dx.doi.org/10.1243/09544054JEM412.

[31] H. Makki, R. Heinemann, S. Hinduja, O. Owodunni, Online determination of tool run-out and wear using machine vision and image processing techniques, in: Innovative Production Machines and Systems, 2009.

[32] Y. Chethan, H. Ravindra, Y.K. Gowda, G.M. Kumar, Parametric optimization in drilling EN-8 tool steel and drill wear monitoring using machine vision applied with Taguchi method, Procedia Mater. Sci. 5 (2014) 1442–1449. http://dx.doi.org/10.1016/j.mspro.2014.07.463 (International Conference on Advances in Manufacturing and Materials Engineering, ICAMME 2014).

[33] H. Shahabi, M. Ratnam, Assessment of flank wear and nose radius wear from workpiece roughness profile in turning operation using machine vision, Int. J. Adv. Manuf. Technol. 43 (1–2) (2009) 11–21. http://dx.doi.org/10.1007/s00170-008-1688-x.

[34] A. Otieno, C. Pedapati, X. Wan, H. Zhang, Imaging and wear analysis of micro-tools using machine vision, in: IJME-INTERTECH International Conference, 2006.

[35] Y.-T. Liang, Y.-C. Chiou, C.-J. Louh, Automatic wear measurement of Ti-based coatings milling via image registration, in: MVA, 2005, pp. 88–91.

[36] K. Zuiderveld, Contrast limited adaptive histogram equalization, in: P.S. Heckbert (Ed.), Graphics Gems IV, Academic Press Professional, Inc., San Diego, CA, USA, 1994, pp. 474–485. URL http://dl.acm.org/citation.cfm?id=180895.180940.

[37] P. Hough, Method and Means for Recognizing Complex Patterns, U.S. Patent 3,069,654, December 1962.

[38] J. Canny, A computational approach to edge detection, IEEE Trans. Pattern Anal. Mach. Intell. 8 (6) (1986) 679–698. http://dx.doi.org/10.1109/TPAMI.1986.4767851.

[39] E.S.L. Gastal, M.M. Oliveira, Domain transform for edge-aware image and video processing, ACM TOG 30 (4) (2011) 69:1–69:12 (Proceedings of SIGGRAPH 2011).

