
PROCEEDINGS

Feature extraction of optical projectiles images

M PIRLOT*, A CHABOTTIER, E CELENS

Royal Military Academy, Avenue de la Renaissance 30, B-1000 Brussels, Belgium

J De KINDER

National Institute of Forensic Science, Vilvoordsesteenweg 98-100, B-1120 Brussels, Belgium

and

P Van HAM

Université Libre de Bruxelles, Avenue F Roosevelt 50, B-1050 Brussels, Belgium

A paper presented at the First Meeting of the European Academy of Forensic Sciences, Lausanne, Switzerland 1997

Science & Justice 1999; 39(1): 53-56

When dealing with large collections of cartridge cases or bullets, a substantial amount of work goes into the comparison of a newly arrived piece of evidence with the collection. Different manufacturers have proposed a solution to this problem using specific hardware and software programs. They record, with a CCD camera, images of the characteristics that a particular firearm leaves on bullets and cartridge cases. The images are compared using algorithms specific to the manufacturer. While the application of these commercial products seems very promising, two major objections can be made: (i) their price is substantial for many laboratories, and (ii) nothing is known about the internal algorithms performing the comparisons. The latter argument removes all possible interpretation (e.g., statistics) by the firearm examiner who is using the program. Our experience in the field of weapon identification led us to believe that a fully automated comparison, without the possibility of intervention by an experienced operator, is too complicated an issue [1]. It is important that the person using a (partially) automated system should be aware of its internal algorithms and protocols, apart from having substantial experience in the field of firearm identification.

These arguments prompted the authors to develop a system that is capable of assisting the firearm investigator in cumbersome and tedious activities. The authors initially started, along with others, by using optical images in the same fashion as the commercially available systems. Concurrently, they investigated the field of alternative 'imaging' systems, such as the laser surface scanner [2], to obtain direct 3D profiles of the striation marks on bullets. This paper is restricted to the study of the comparison of striation patterns on bullets. An obvious extension of the approach is the analysis of the line patterns which are present in the different marks left on cartridge cases.

*Corresponding author e-mail: [email protected]

The steps of the algorithm developed are discussed in the different sections of the paper. After dealing with the experimental details, the paper discusses the selection procedure for the region of interest (ROI). The extraction of the characteristic information in a compact format, resulting in a more efficient comparison, follows. Building on the algorithm for the comparison of two grooves, the penultimate section presents the comparison of two bullets. The combined knowledge of the underlying algorithm and experience in the field of firearms identification allows the authors to present the data together with a suggestion for the signature of the weapon from which the bullets were fired. Finally, the conclusions and future prospects are presented.

Throughout the following sections, the principles are illustrated on one particular example or on a synthetic image selected for educational purposes. Regardless of the division into several steps in this paper, the user of the program only has to deal with the first and final step. It is understood that a pre-selection on class characteristics (calibre of the bullet, the number and the average width of the marks) and on administrative data has been performed before this procedure is initiated.


Experimental
Bullets were mounted onto one stage of a Leica comparison microscope and illuminated using oblique lighting. Images were recorded by a CCD camera with 768 x 576 pixel resolution and transferred to a Pentium Pro personal computer through an IMAQ frame-grabber. The scanning direction was chosen perpendicular to the striation pattern [3]; the stored image is the mean of ten successive snaps, for noise reduction purposes [4]. All images were recorded using the same magnification factor of the comparison microscope. One image contains only one groove of the bullet. LabVIEW software was used for the acquisition as well as for the development of the graphical user interface.
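The averaging step can be illustrated with a short Python/NumPy sketch (the actual acquisition in the system described here is done in LabVIEW through the IMAQ frame-grabber; grab_frame below is a hypothetical stand-in for that interface):

    import numpy as np

    def average_frames(grab_frame, n_frames=10):
        """Average n successive snapshots of the same groove to suppress
        camera noise. grab_frame is a hypothetical callable returning one
        768 x 576 image as a 2D array of grey values."""
        acc = np.zeros((576, 768), dtype=np.float64)   # image rows x columns
        for _ in range(n_frames):
            acc += np.asarray(grab_frame(), dtype=np.float64)
        return acc / n_frames                          # stored mean image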

Brass jacketed bullets of calibre 9 mm PARA were fired using a Browning High Power pistol and recovered in a water tank. These projectiles, carrying six grooves with a right-hand twist, were used to test the system.

Selection of the region of interest
An experienced operator decides on the best area of the striation pattern for selection, which contains characteristic information for the firearm identification. Generally, only a rectangular shape is provided for making a selection of the ROI. In the present application, the area selected can equally well have a polygonal shape, but it can also be drawn in a freehand mode. An example of such a selection is shown in Figure 1, where one striation mark on a bullet is displayed with the signature of the firearm situated near the base of the projectile. As can be seen from Figure 1, the freehand mode can be a more effective means to select only the relevant part of the bullet. Another advantage of the freehand mode is the possibility to exclude particular regions from the selection, e.g., parts of the bullet which are damaged by terminal effects.

A disadvantage of this selection procedure is its subjective character, which has an obvious impact on the result of the objective comparison.

Extraction of the feature vector
From the image of Figure 1, notice that there is almost no change in the grey value within the ROI parallel to the striations. This led the authors to assume that the striations on the bullet are intrinsically of a one-dimensional nature. Hence a one-dimensional array of values, the so-called feature vector, can contain all the data necessary to make a comparison between two images.

The major obstacles encountered when trying to extract such a feature are scale, rotation and translation invariance. These problems are generally present in computerised comparison. The scale factor was dealt with by imposing one definite magnification during the recording of the images. However, the orientation and the exact positioning of the bullet on the screen are difficult to align from one image to another. Rotation invariance is therefore essential to the successful extraction of the feature vector. To eliminate this remaining problem, a computer algorithm was designed, based on a Hough transform [4] of the ROI. This transform maps a straight line in the image space into a point in the Hough space, a so-called accumulator. The co-ordinates in the Hough space are given by the straight line's angle of elevation θ and its distance to the origin r. The transform corresponds to the following mathematical formula (integral representation):

H(θ, r) = ∫∫ I(x, y) δ(r − x cos θ − y sin θ) dx dy
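The accumulation itself can be sketched in a few lines of NumPy (an illustrative fragment, not the authors' implementation; the intensity threshold and angular resolution are arbitrary assumptions). Bright pixels vote for every line (θ, r) passing through them, and the accumulator column containing the strongest cell gives the dominant striation angle θ:

    import numpy as np

    def hough_dominant_angle(roi, n_theta=180, intensity_threshold=128):
        """Discrete Hough transform of a grey-level ROI; returns the dominant
        line angle (degrees) and the accumulator array. Parallel striations
        concentrate their votes in one theta column."""
        rows, cols = np.nonzero(roi > intensity_threshold)       # voting pixels
        thetas = np.deg2rad(np.arange(n_theta))                  # 0 .. 179 degrees
        r_max = int(np.hypot(*roi.shape)) + 1
        accumulator = np.zeros((2 * r_max + 1, n_theta), dtype=np.int64)
        for theta_idx, theta in enumerate(thetas):
            # r = x cos(theta) + y sin(theta), shifted so the index is positive
            r = np.round(cols * np.cos(theta) + rows * np.sin(theta)).astype(int)
            np.add.at(accumulator[:, theta_idx], r + r_max, 1)
        _, best_theta_idx = np.unravel_index(accumulator.argmax(), accumulator.shape)
        return float(np.rad2deg(thetas[best_theta_idx])), accumulator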

Figure 2 shows an example of a Hough transform. The parallel lines in the ROI are mapped into different accumulators that are aligned on one line (constant θ). This θ value is used to rotate the ROI in such a way that the parallel striation lines become horizontal. The next step consists of the projection of the two-dimensional ROI information onto a vertical axis. The projected value is not obtained by averaging the points on a particular horizontal line; instead, the most frequently appearing grey value (the mode) of that line is selected for the 1D vector. The resulting 1D feature vector for the present examples is shown in Figure 3. This information is stored and used in performing the comparisons.
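A corresponding sketch of the projection step is given below (illustrative Python; it assumes the rotation angle found above, 8-bit grey values and a rectangular ROI — a freehand selection would additionally require the masked pixels to be excluded):

    import numpy as np
    from scipy import ndimage

    def feature_vector(roi, angle_deg):
        """Rotate the ROI so that the striations become horizontal, then reduce
        every row to its most frequent grey value (the mode), giving the 1D
        feature vector."""
        rotated = ndimage.rotate(roi, angle_deg, reshape=False, order=1)
        rotated = np.clip(np.round(rotated), 0, 255).astype(np.uint8)
        vector = np.empty(rotated.shape[0], dtype=np.uint8)
        for i, row in enumerate(rotated):
            counts = np.bincount(row, minlength=256)   # grey-value histogram of the row
            vector[i] = counts.argmax()                # modal grey value
        return vector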

FIGURE 1 The selection of a region of interest (ROI) on one groove of a 9 mm PARA bullet.


FIGURE 2 Synthetic image I(x,y) with straight lines (left) and its corresponding Hough transform H(θ,r) (right). The smearing of the accumulators is a typical artefact of the Hough transform.

FIGURE 3 Feature vectors of the real image (left) and synthetic image (right): grey level plotted against position along the vertical axis.

Comparison
For the comparison of two 1D feature vectors, one might be tempted to perform a classical correlation. However, application of this correlation yields a triangular shape, resulting from the increasing and decreasing overlap of the two arrays. The signal from a positive correlation can hardly be discerned against this background. Another disadvantage of this technique is the likely influence of a light gradient present in the image, which is transferred to the intensities in the feature vector. Both problems can be solved by the application of a phase-only correlation. To this purpose, only the phase information of the signal is used for the correlation, which is obtained by applying a Fourier transform to the feature vector, equalising the intensities and transforming the signal back to real space.
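A phase-only correlation of two 1D feature vectors can be sketched as follows (an illustrative NumPy fragment, not the authors' code): the cross-spectrum of the two signals is scaled to unit magnitude, so that only the phase information contributes, and is then transformed back to real space:

    import numpy as np

    def phase_only_correlation(f, g, eps=1e-12):
        """Phase-only correlation of two equal-length 1D feature vectors.
        The result is a real-space correlation function whose dominant peak
        marks the relative (circular) shift between the two signals."""
        cross = np.fft.fft(np.asarray(f, float)) * np.conj(np.fft.fft(np.asarray(g, float)))
        cross /= np.abs(cross) + eps          # keep only the phase information
        return np.real(np.fft.ifft(cross))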


A typical result of the phase-only correlation is given in Figure 4. The cross-correlation shows only one significant peak, which is one order of magnitude larger than the background. The position of the peak can be used to shift the two images for a later side-by-side visualisation. The height of the peak, normalised by the auto-correlations of the two signals, is a measure of the correspondence of the two striation marks. It can be split up into substantial contributions from matching lines present in the two features. These matching lines are called Contributors to Maximum Correlation (CMC) and mostly make up the signature left by the firearm on the bullet. These CMCs are stored in memory and used for later visualisation.
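Reading off the shift and a matching score from this correlation can then look like the sketch below (again illustrative; the exact normalisation used by the authors is not spelled out in the text):

    import numpy as np

    def match_score(f, g):
        """Return (shift, peak height) for two feature vectors, using the
        phase_only_correlation sketch above. Because every spectral component
        is scaled to unit magnitude, the auto-correlation peak of each signal
        is (up to the eps regularisation) equal to one, so the cross-correlation
        peak is already a normalised score between 0 and 1."""
        poc = phase_only_correlation(f, g)
        shift = int(np.argmax(poc))           # circular displacement between the marks
        return shift, float(poc.max())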

FIGURE 4 The result of a phase-only correlation of two feature vectors (vertical axis: amplitude of the correlation function).

FIGURE 5 Correlation matrix of two bullets (striation marks 1 to 6 along each axis), showing the intensity of the relative correlation peak translated into grey values, where black and white correspond to the lowest and highest values, respectively.

Presentation of the Comparison Results

For one bullet, the ROI of all the striae have to be selected and treated as mentioned under Extraction of the Feature Vector. The methodology for comparing the feature vectors of two grooves has been presented above. When the operator applied the comparison algorithm to all possible pairs composed of one of the six striation marks on either bullet, a 6 x 6 matrix resulted. This should, in theory, show a symmetrical correlation pattern.

Assuming that the same firearm fired both bullets, a correspondence between the successive striation marks should result. If striation mark number 1 of bullet A matches striation mark number 3 of bullet B, the other five couples of matching striae are unambiguously defined. With this information, the numbering of the striation marks on one of the bullets can be shifted in such a way that the positive correlations correspond to the diagonal of the matrix. Figure 5 shows the obtained correlation matrix, where the peak intensity of the phase-only correlation is translated into a grey value. Six traces of the matrix are defined as the sums of the six correlation coefficients for each of the six possible combinations of striation marks on the bullet. The highest value of these six numbers, the highest correlation matrix trace (HCMT), is a measure of the matching of the two bullets. If the two bullets were not fired by the same firearm, lower values for the HCMT are found and no diagonal shape of the matrix can be obtained.
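The highest correlation matrix trace can be computed as sketched below (an illustrative NumPy fragment): for each cyclic shift of the striation-mark numbering of one bullet, the corresponding diagonal of the 6 x 6 correlation matrix is summed, and the largest of these six sums is the HCMT:

    import numpy as np

    def highest_correlation_matrix_trace(corr):
        """corr[i, j] is the normalised correlation peak between striation mark i
        of bullet A and striation mark j of bullet B (6 x 6 for six grooves).
        Returns the HCMT and the cyclic re-numbering that achieves it."""
        corr = np.asarray(corr, dtype=np.float64)
        n = corr.shape[0]
        traces = [sum(corr[i, (i + shift) % n] for i in range(n)) for shift in range(n)]
        best_shift = int(np.argmax(traces))
        return float(traces[best_shift]), best_shift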

If one bullet is compared to a database, the computer sets up a list of possible candidates, based on the HCMT value. The different candidates can then be examined using a graphical user interface that simulates the possible manipulations in a comparison microscope. The CMCs are also presented to the user during visualisation and turn out to be very helpful in eliminating false candidates.

Conclusion
This paper presents a system which is based on the selection of a ROI by a firearm examiner. Starting from that information, a usable feature vector is derived for each striation mark. This one-dimensional quantity reduces the amount of data to its essentials and rejects the superfluous second dimension. This feature vector can be used to make comparisons using the phase-only correlation algorithm. The results for the comparison of the different striation marks are combined to yield a numerical value for the matching of two bullets. A scoring list for the different candidates is transferred to the firearms examiner, who can then manipulate the two images on the computer in a way similar to a 'real' comparison microscope.

Obviously, the system is open to improvements and adaptations to special needs.

Future directions
Further work on the system is needed for several reasons. The effectiveness of the algorithm has to be studied using a larger database, and it is planned to extend the present database to 32 weapons with five bullets each. The statistical interpretation of the HCMT value needs to be investigated. Also planned is the investigation of the impact on the final results of using a video camera with a higher resolution. The extension of the system to cartridge cases should be straightforward.

References
1. E Celens and F Demanet. Basic principles of weapon identification. Proceedings of the 11th International Symposium on Ballistics, Vol III, 9-11 May 1989. Neerijse: Drukkerij Stroobants.
2. J De Kinder, P Prevot, M Pirlot and B Nys. Surface topology of bullet striations: an innovating technique. Association of Firearms and Tool Mark Examiners Journal 1998; 30(2): 294-299.

3. T Uchiyama. Automated landmark identification system. Association of Firearms and Tool Mark Examiners Journal 1993; 25(3).

4. VF Leavers. Shape detection in computer vision using the Hough Transform. London: Springer-Verlag London Limited, 1992.
