TRANSCRIPT
1
Modeling Condition And Performance Of Mining Equipment
Tad S. Golosinski and Hui Hu
Mining Engineering
University of Missouri-Rolla
2
Condition and Performance Monitoring Systems

Machine health monitoring
• Allows for quick diagnostics of problems

Payload and productivity
• Provides management with machine and fleet performance data

Warning system
• Alerts operator of problems, reducing the risk of catastrophic failure
3
CAT’s VIMS (Vital Information Management System)
Collects / processes information on major machine components:
• Engine control
• Transmission/chassis control
• Braking control
• Payload measurement system

Installed on:
• Off-highway trucks: 785, 789, 793, 797
• Hydraulic shovels: 5130, 5230
• Wheel loaders: 994, 992G (optional)
4
Other, Similar Systems
• Cummins: CENSE (Engine Module)
• Euclid-Hitachi: Contronics & Haultronics
• Komatsu: VHMS (Vehicle Health Monitoring System)
• LeTourneau: LINCS (LeTourneau Integrated Network Control System)
5
Round Mountain Gold Mine
Truck fleet:
• 17 CAT 785 (150 t)
• 11 CAT 789B (190 t)

PSA (Product Support Agreement): CAT dealer guarantees 88% availability
6
VIMS in RMG Mine
Average availability is 93% over 70,000 operating hours

VIMS used to help with preventive maintenance:
• Diagnostics after engine failure
• Haul road condition assessment
• Other
Holmes Safety Association Bulletin 1998
7
CAT MineStar
CAT MineStar integrates:
• Machine Tracking System (GPS)
• Computer Aided Earthmoving System (CAES)
• Fleet scheduling system (FleetCommander)
• VIMS
8
Cummins Mining Gateway
[Diagram: data flows from the CENSE module on the Cummins engine over an RF link to a base station, then by modem to the CENSE database at MiningGateway.com]
9
VIMS Data & Information Flow
[Diagram: VIMS legacy databases at Mine Site 1, Mine Site 2, and Mine Site 3 feed the VIMS data warehouse through data extract, data cleanup, and data load steps; data mining tools then perform information extraction, and the resulting information is applied back at the mine sites.]
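A minimal sketch of the extract / cleanup / load step in this flow, in Python; the table and column names (vims_snapshots, vims_warehouse, truck_id, and so on) are hypothetical, since the actual VIMS legacy schema is not shown here:

    import sqlite3
    import pandas as pd

    # Extract: pull raw snapshot rows from a VIMS legacy database.
    def extract(legacy: sqlite3.Connection) -> pd.DataFrame:
        return pd.read_sql_query(
            "SELECT truck_id, timestamp, parameter, value FROM vims_snapshots",
            legacy,
        )

    # Cleanup: drop unreadable rows and obvious sensor glitches.
    def cleanup(raw: pd.DataFrame) -> pd.DataFrame:
        clean = raw.dropna(subset=["timestamp", "value"])
        return clean[clean["value"].between(-1e6, 1e6)]  # crude outlier screen

    # Load: append the cleaned rows into the warehouse table.
    def load(clean: pd.DataFrame, warehouse: sqlite3.Connection) -> None:
        clean.to_sql("vims_warehouse", warehouse, if_exists="append", index=False)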
10
Earlier Research: Data Mining of VIMS
Kaan Ataman tried modeling using:
• Major factor analysis
• Linear regression analysis
• All this on datalogger data

Edwin Madiba tried modeling using:
• Data formatting and transferring
• VIMS events association
• All this on datalogger and event data
11
Research Objectives
• Build the VIMS data warehouse to facilitate data mining
• Develop the data mining application for knowledge discovery
• Build predictive models of equipment condition and performance
13
VIMS Features
Sensors & controls monitor and store:
• Event list
• Event recorder
• Data logger
• Trends
• Cumulative data
• Histograms
• Payloads

[Diagram: data reaches maintenance management from the operator's machine over the VIMS wireless link or by manual download]
15
VIMS Statistical Data Warehouse
Statistics computed over 1-3 minute intervals:
• Minimum
• Maximum
• Average
• Data range
• Variance
• Regression intercept
• Regression slope
• Regression SYY
• Standard deviation
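A sketch of how these per-interval statistics could be computed with NumPy; reading "Regression SYY" as the sum of squared deviations of the parameter values is an assumption:

    import numpy as np

    def interval_stats(t: np.ndarray, y: np.ndarray) -> dict:
        """Statistics for one VIMS parameter over a 1-3 minute interval.
        t: sample times within the interval, y: parameter values."""
        slope, intercept = np.polyfit(t, y, 1)  # least-squares line y = slope*t + intercept
        return {
            "minimum": y.min(),
            "maximum": y.max(),
            "average": y.mean(),
            "data_range": y.max() - y.min(),
            "variance": y.var(ddof=1),
            "regression_intercept": intercept,
            "regression_slope": slope,
            "regression_syy": ((y - y.mean()) ** 2).sum(),  # assumed meaning of SYY
            "standard_deviation": y.std(ddof=1),
        }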
16
VIMS Data Description

• Six CAT 789B trucks
• 300 MB of VIMS data
• 79 "High Engine Speed" events
• One-minute data statistics
17
SPRINT: A Decision Tree Algorithm (IBM Almaden Research Center)

• GINI index for the split point
• Strictly binary tree
• Built-in v-fold cross validation

$gini(s) = 1 - \sum_j p_j^2$

$gini_{split}(s) = \frac{n_1}{n} \, gini(s_1) + \frac{n_2}{n} \, gini(s_2)$
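A minimal sketch of these GINI calculations in plain Python over in-memory label lists (not SPRINT's actual attribute-list implementation):

    from collections import Counter

    def gini(labels) -> float:
        """gini(s) = 1 - sum_j p_j^2 over the class proportions p_j."""
        n = len(labels)
        return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

    def gini_split(left, right) -> float:
        """Weighted GINI of a binary split: (n1/n)*gini(s1) + (n2/n)*gini(s2)."""
        n = len(left) + len(right)
        return len(left) / n * gini(left) + len(right) / n * gini(right)

    # The candidate split point with the lowest gini_split is chosen at each node,
    # e.g. gini_split(["Eng1", "Eng1"], ["Other", "Other", "Eng1"])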
19
VIMS Event Prediction

[Diagram: a timeline of one-minute VIMS data snapshots around a "High Engine Speed" event. Snapshots taken during normal engine speed carry the label Other (shown as 0); the six snapshots leading up to the event are labeled Eng_6, Eng_5, ..., Eng_1 (the 6-5-4-3-2-1 countdown in the figure). Each snapshot thus receives a predicted label and an Event_ID.]
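A sketch of this labeling scheme under the stated assumptions (one snapshot per minute; the six snapshots before the event count down from Eng_6 to Eng_1):

    def label_snapshots(n_snapshots: int, event_index: int) -> list[str]:
        """Label each one-minute snapshot relative to a High Engine Speed event.
        The k-th snapshot before the event gets label Eng_k (k = 1..6);
        everything else is labeled Other."""
        labels = ["Other"] * n_snapshots
        for k in range(1, 7):
            i = event_index - k
            if 0 <= i < n_snapshots:
                labels[i] = f"Eng_{k}"
        return labels

    # label_snapshots(17, 11) reproduces the 0...654321...0 pattern in the diagram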
21
Decision Tree: Training on One-Minute Data

Total Errors = 120 (6.734%)

True \ Predicted | Other | Eng1 | Eng3 | Eng2 | Eng4 | Eng6 | Eng5 | Total
-----------------|-------|------|------|------|------|------|------|------
Other            |  1331 |   18 |    9 |    5 |   16 |    6 |    1 |  1386
Eng1             |     0 |   62 |    1 |    3 |    0 |    0 |    0 |    66
Eng3             |     0 |   11 |   51 |    2 |    2 |    1 |    0 |    67
Eng2             |     0 |   12 |    8 |   38 |    7 |    0 |    0 |    65
Eng4             |     0 |    3 |    7 |    2 |   55 |    0 |    1 |    68
Eng6             |     0 |    0 |    0 |    1 |    0 |   61 |    4 |    66
Eng5             |     0 |    0 |    0 |    0 |    0 |    0 |   64 |    64
Column Total     |  1331 |  106 |   76 |   51 |   80 |   68 |   70 |  1782
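The "Total Errors" figure in these tables is the off-diagonal count of the confusion matrix; a minimal sketch of that computation:

    import numpy as np

    def total_errors(cm: np.ndarray) -> tuple[int, float]:
        """(error count, error rate) for a confusion matrix cm,
        where cm[i, j] counts true class i predicted as class j."""
        errors = int(cm.sum() - np.trace(cm))
        return errors, errors / cm.sum()

    # For the training matrix above: 1782 rows, 120 off-diagonal -> 6.734%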
22
Decision Tree: Test #1 on One-Minute Data

Total Errors = 24 (24%)

True \ Predicted | Other | Eng1 | Eng3 | Eng2 | Eng4 | Eng6 | Eng5 | Total
-----------------|-------|------|------|------|------|------|------|------
Other            |    59 |    3 |    0 |    2 |    3 |    0 |    1 |    68
Eng1             |     4 |    1 |    0 |    1 |    0 |    0 |    0 |     6
Eng3             |     0 |    3 |    1 |    0 |    1 |    0 |    0 |     5
Eng2             |     1 |    1 |    1 |    1 |    0 |    0 |    0 |     4
Eng4             |     1 |    1 |    0 |    1 |    1 |    0 |    0 |     4
Eng6             |     0 |    0 |    0 |    0 |    0 |    7 |    0 |     7
Eng5             |     0 |    0 |    0 |    0 |    0 |    0 |    6 |     6
Column Total     |    65 |    9 |    2 |    5 |    5 |    7 |    7 |   100
23
Decision Tree: Test #2 on One-Minute Data

Total Errors = 35 (17.86%)

True \ Predicted | Other | Eng1 | Eng3 | Eng2 | Eng4 | Eng6 | Eng5 | Total
-----------------|-------|------|------|------|------|------|------|------
Other            |   141 |    9 |    2 |    4 |    4 |    0 |    0 |   160
Eng1             |     2 |    2 |    1 |    1 |    0 |    0 |    0 |     6
Eng3             |     2 |    1 |    2 |    0 |    1 |    0 |    0 |     6
Eng2             |     2 |    1 |    2 |    1 |    0 |    0 |    0 |     6
Eng4             |     1 |    0 |    1 |    1 |    3 |    0 |    0 |     6
Eng6             |     0 |    0 |    0 |    0 |    0 |    6 |    0 |     6
Eng5             |     0 |    0 |    0 |    0 |    0 |    0 |    6 |     6
Column Total     |   148 |   13 |    8 |    7 |    8 |    6 |    6 |   196
25
Decision Tree: Training on Two-Minute Data

Total Errors = 51 (5.743%)

True \ Predicted | OTHER | ENG1 | ENG2 | ENG3 | Total
-----------------|-------|------|------|------|------
OTHER            |   657 |    6 |   19 |    3 |   685
ENG1             |     0 |   62 |   10 |    0 |    72
ENG2             |     0 |   13 |   54 |    0 |    67
ENG3             |     0 |    0 |    0 |   64 |    64
Column Total     |   657 |   81 |   83 |   67 |   888
26
Decision Tree: Test #1 on Two-Minute Data

Total Errors = 14 (29.79%)

True \ Predicted | OTHER | ENG1 | ENG2 | ENG3 | Total
-----------------|-------|------|------|------|------
OTHER            |    28 |    5 |    4 |    1 |    38
ENG1             |     1 |    0 |    0 |    0 |     1
ENG2             |     2 |    1 |    1 |    0 |     4
ENG3             |     0 |    0 |    0 |    4 |     4
Column Total     |    31 |    6 |    5 |    5 |    47
27
Decision Tree: Test #2 on Two-Minute Data

Total Errors = 15 (15.31%)

True \ Predicted | OTHER | ENG1 | ENG2 | ENG3 | Total
-----------------|-------|------|------|------|------
OTHER            |    71 |    8 |    1 |    0 |    80
ENG1             |     3 |    3 |    0 |    0 |     6
ENG2             |     0 |    3 |    3 |    0 |     6
ENG3             |     0 |    0 |    0 |    6 |     6
Column Total     |    74 |   14 |    4 |    6 |    98
29
Decision Tree: Training on Three-Minute Data

Total Errors = 28 (4.878%)

True \ Predicted | OTHER | ENG1 | ENG2 | Total
-----------------|-------|------|------|------
OTHER            |   411 |   23 |    4 |   438
ENG1             |     1 |   65 |    0 |    66
ENG2             |     0 |    0 |   70 |    70
Column Total     |   412 |   88 |   74 |   574
30
Decision Tree: Test #1 on Three-Minute Data

Total Errors = 12 (19.05%)

True \ Predicted | OTHER | ENG1 | ENG2 | Total
-----------------|-------|------|------|------
OTHER            |    42 |    9 |    0 |    51
ENG1             |     3 |    5 |    0 |     8
ENG2             |     0 |    0 |    4 |     4
Column Total     |    45 |   14 |    4 |    63
31
Decision Tree: Test #2 on Three-Minute Data

Total Errors = 9 (14.06%)

True \ Predicted | OTHER | ENG1 | ENG2 | Total
-----------------|-------|------|------|------
OTHER            |    47 |    5 |    0 |    52
ENG1             |     4 |    2 |    0 |     6
ENG2             |     0 |    0 |    6 |     6
Column Total     |    51 |    7 |    6 |    64
32
Decision Tree Summary

• "One-minute" model needs a more complex tree structure
• "One-minute" model gives low accuracy of predictions
• "Three-minute" decision tree model gives reasonable accuracy of predictions (based on Test #1):
  • Other: 13% error rate
  • Eng1: 50% error rate
  • Eng2: 0% error rate

Other approach?
33
Backpropagation: A Neural Network Classification Algorithm

[Diagram: feed-forward network with an input layer, hidden layer, and output layer; each output corresponds to a possible classification. Node detail: inputs x1, x2, x3 enter the node with weights w1, w2, w3.]

Node computation: $z = \sum_i w_i x_i$, output $f(z)$

Some choices for f(z):
• Sigmoid: $f(z) = 1 / (1 + e^{-z})$
• Tanh: $f(z) = (1 - e^{-2z}) / (1 + e^{-2z})$
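A minimal sketch of this node computation, a weighted sum passed through one of the listed activations:

    import math

    def node_output(x: list[float], w: list[float], activation: str = "sigmoid") -> float:
        """Single neuron: z = sum_i w_i * x_i, output f(z)."""
        z = sum(wi * xi for wi, xi in zip(w, x))
        if activation == "sigmoid":
            return 1.0 / (1.0 + math.exp(-z))  # f(z) = 1/(1+e^-z)
        return (1 - math.exp(-2 * z)) / (1 + math.exp(-2 * z))  # tanh form

    # e.g. node_output([0.5, -1.0, 2.0], [0.1, 0.4, 0.3])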
34
Minimize the Sum of Squares

SSQ error function:

$E = \frac{1}{2} \sum_{k=1}^{m} (t_k - y_k)^2 \rightarrow \min$

where $y_k$ (output) is a function of the weights $w_{j,k}$ and $t_k$ is the true value.

To minimize $E$, form the gradient $\nabla E = \left[ \frac{\partial E}{\partial W_{j,k}} \right]$ and solve $\frac{\partial E}{\partial W_{j,k}} = 0$ for the $W_{j,k}$.

In the graph:
• $E_p$ is the sum-of-squares error
• $\nabla E_p$ is the gradient (direction of maximum function increase)

Freeman & Skapura, Neural Networks, Addison-Wesley, 1992
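A minimal sketch of one gradient-descent update of the output weights under this SSQ error, assuming sigmoid outputs and a hypothetical learning rate eta:

    import numpy as np

    def backprop_step(W: np.ndarray, x: np.ndarray, t: np.ndarray, eta: float = 0.1) -> np.ndarray:
        """One update of output weights W minimizing E = 1/2 * sum_k (t_k - y_k)^2."""
        y = 1.0 / (1.0 + np.exp(-W @ x))  # sigmoid outputs y_k
        grad_z = -(t - y) * y * (1 - y)   # dE/dz_k by the chain rule
        grad_W = np.outer(grad_z, x)      # dE/dW_{k,j} = dE/dz_k * x_j
        return W - eta * grad_W           # step opposite the gradient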
37
NN Summary
• Insufficient data for one-minute and two-minute prediction models
• Three-minute network shows better performance than the decision tree model:
  • Other: 17% error rate
  • Eng1: 28% error rate
  • Eng2: 20% error rate
38
Conclusions
• A predictive model can be built
• The Neural Network model is more accurate than the Decision Tree one (based on all data)
• Overall accuracy is not sufficient for practical applications
• More data is needed to train and test the models
39
References

• Golosinski, T. & Hu, H., "Failure Pattern Recognition of a Mining Truck with a Decision Tree Algorithm," Mineral Resources Engineering, 2002 (?)
• Golosinski, T. & Hu, H., "Intelligent Miner: Data Mining Application for Modeling VIMS Condition Monitoring Data," ANNIE 2001, St. Louis
• Golosinski, T. & Hu, H., "Data Mining VIMS Data for Information on Truck Condition," APCOM 2001, Beijing, P.R. China