Horticultural Science and Technology 763
Received: April 20, 2020
Revised: June 24, 2020
Accepted: August 20, 2020
OPEN ACCESS
HORTICULTURAL SCIENCE and TECHNOLOGY
38(6):763-775, 2020
URL: http://www.hst-j.org
pISSN : 1226-8763
eISSN : 2465-8588
This is an Open Access article distributed
under the terms of the Creative Commons
Attribution Non-Commercial License which
permits unrestricted non-commercial use,
distribution, and reproduction in any medium,
provided the original work is properly cited.
Copyrightⓒ2020 Korean Society for
Horticultural Science.
This study was supported by Rural Development Administration through Development of Image- based Fire Blight Regional Forecasting Technology Project (PJ01277604).
RESEARCH ARTICLE https://doi.org/10.7235/HORT.20200069
Convolution Neural Network of Deep Learning for Detection of Fire Blight on Pear Tree
Tae Hwan Kang1†
, Hyun-Jung Kim2†
, and Hyun Kwon Noh2*
1Major in Bio-Industry Mechanical Engineering, Kongju National University, Yesan 32439, Korea
2Department of Biosystems Engineering, Chungbuk National University, Cheongju 28644, Korea
*Corresponding author: [email protected]
†These authors contributed equally to this work
Abstract
In this study, an RGB sensor was mounted on a rotary-wing drone to obtain RGB images of a fire blight simulation site and of actual fire blight (caused by Erwinia amylovora) infections on pear (Pyrus communis L.) trees, and a system was developed that can determine the occurrence of fire blight in the field by analyzing the RGB images with a convolutional neural network (CNN) of deep learning. The ability to detect fire blight was assessed by training on the fire blight simulation images and then applying the trained network to images of actual fire blight infections. Learning used the overlap-based Dice function, and training was carried out to an average loss of 22%. As a result, 64 of the 80 images of fire blight-infected fruit trees were recognized as infected, while 16 images were recognized as non-infected, for a prediction rate of 80.0%. In the case of actual fire blight infections, 15 of the 21 images of infected fruit trees were recognized as infected, and 6 images were recognized as non-infected, for a prediction rate of 71.4%. Therefore, our results show that fire blight can be detected by applying images acquired with drones and RGB sensors to a deep-learning CNN. The system developed in this study can be applied in the field to reduce the manpower and time required to forecast actual fire blight, and a rapid response is expected to reduce secondary spread damage.
Additional key words: Dice function, field diagnoses, plant disease, RGB sensor, rotary-wing drones
Introduction
Fruit tree fire blight first occurred in New York in 1780 and has since occurred in many parts of the
world, including Europe and Central Asia (Calzolari et al., 1999; Bahadou et al., 2018; Zhao et al.,
2019). This disease is caused by the bacterial pathogen Erwinia amylovora and is a highly contagious disease that causes black-brown decay of the branches, fruits, and leaves of fruit trees such as
apples and pears (Jeong et al., 2018). There is currently no treatment for fire blight, and the disease is
difficult to control, especially since the pathogen is easily spread to nearby trees by bees and
splashing rain and has a long incubation period. In addition, if the presence of the disease is
confirmed, the damage to the orchard can be very serious because all of the trees
within a radius of 100 meters must be buried in accordance with the 'Guidelines for Preventing Fire Blight' (Preparation and Prevention Act, Article 36). In Korea, fire blight occurred in 43 orchards in Anseong and Cheonan in 2015, 8
orchards in 2016, and 17 orchards in 2017, resulting in the destruction of trees over an area of about 103 ha (Park et al.,
2017). According to the Rural Development Administration, as of June 2020, fire blight incidences were reported in 312
farms and an area of 187 ha. In Korea, copper-based compounds and antibiotics such as kasugamycin and oxyttracycline,
or antagonistic microorganisms are used to prevent fire blight (McGhe and Sundin, 2011; Lee et al., 2018).
A previous study on fire blight detection developed a strip-type diagnostic kit using locally isolated Erwinia amylovora strains, compared the specificity and sensitivity of this kit for domestic fire blight pathogens with those of a kit already commercialized in Europe, and examined its applicability to field diagnosis of orchard fire blight (Heo et al., 2017). There is also an improved loop-mediated isothermal amplification assay for on-site diagnosis of apple and pear fire blight (Shin et al., 2018) and research on in vitro screening of antibacterial agents to suppress fire blight in Korea (Lee et al., 2018). The possibility of future infection in Korea has also been examined using Maryblyt, a Windows application used in the U.S., Israel, Spain, Canada, and Switzerland to study and prevent fire blight. Most of the existing studies address on-site diagnosis or prediction from environmental factors. For on-site diagnosis, all the pear trees in an orchard must be inspected to determine the incidence of fire blight, which requires large inputs of manpower, time, and cost. Precise forecasts for a single orchard are difficult because prediction from environmental factors estimates the likelihood of occurrence over a wide area. Therefore, it is imperative to develop technologies that can reduce the manpower, time, and cost of predicting fire blight by remotely monitoring orchards and performing on-site diagnostics.
Artificial intelligence (AI) is also widely used by many researchers for plant growth and nonlinear data processing
(Nam et al., 2019; Moon et al., 2020). A Convolutional Neural Network (CNN) is an algorithm useful for finding patterns
to analyze images, and CNNs learn directly from data and classify images using patterns. The core concept of a CNN is
to learn by maintaining spatial data in images. CNNs handle images more effectively by applying filtering techniques to
artificial neural networks (Yann et al., 1998). In medical image segmentation, many researchers have used CNNs (Ronneberger et al., 2015; Litjens et al., 2017). They used U-Net, which is similar to SegNet and consists of an encoder and a decoder network. U-Net is very useful for segmenting medical images, and U-Net and U-Net-like models have been used in many biomedical areas, such as neuronal structures (Ronneberger et al., 2015), the liver (Christ et al., 2016), and skin lesions (Lin et al., 2017).
Therefore, in this study, RGB images were acquired using a rotary-wing drone and an RGB sensor, the detection of fire blight was analyzed using a deep-learning CNN, and a system was developed to predict the occurrence of fire blight on site. The performance of the fire blight forecast system was evaluated by applying it to actual fire blight images.
Materials and Methods
Experimental Device
Fig. 1 and Table 1 show the specifications of the rotary-wing drone used for image acquisition in areas where fire blight occurs. As shown in Fig. 1, images of pear fire blight were acquired using a rotary-wing drone (Phantom4 Pro V2.0, DJI) as the data acquisition platform.
Fig. 1. Rotary-wing drone for image acquisition of pear fire blight (DJI Phantom4 Pro V2.0).
Fig. 2. 1" 20 MP (CMOS) camera for RGB image acquisition.
Table 1. Specifications of the DJI Phantom4 Pro V2.0 rotary-wing drone
Model: DJI Phantom4 Pro V2.0
Company: DJI
Weight: 1,388 g
Sensing distance: 30 m
Control distance: 7 km
Flight speed: 50 ‑ 72 km/h
Flight time: 55 min

Table 2. Specifications of the DJI Phantom4 Pro V2.0 camera
Model: DJI Phantom4 Pro V2.0 camera
Company: DJI
Sensor resolution: 20 MP
Image resolution: 5,472 pixels × 3,648 pixels
Bands: Red, Green, Blue
Fig. 2 and Table 2 show the sensor and its specifications for obtaining images of pear fire blight. As shown in Fig. 2, the sensor used to acquire the image data for pear fire blight was a 1" 20 MP RGB sensor (DJI Phantom4 Pro V2.0 camera). In addition, the image acquisition software used was the DJI GO4 application. DJI GO4 sets the flight altitude and the overlap of the shooting area of the rotary-wing drone to create an autonomous flight path, and it can also display the shooting area and control landing.
Fig. 3. Fire blight-infected site and rotary-winged drone flight path (Cheonan, Chungnam).
Fig. 4. Image acquisition flight path of rotary-winged drone for simulation of fire blight and the orchard (Naju, Jeonnam).
Image Acquisition
Actual pear fire blight images (from a fire blight outbreak in a pear orchard in Ipjang-myeon, Cheonan-si, Chungcheongnam-do, Korea, June 2018) were acquired at an altitude of over 6 m, with a spatial resolution of 0.11 cm/pixel. In order to increase the precision and reliability of the fire blight detection machine learning, a fire blight branch model for simulation was also installed in an orchard, following the advice of a disease expert at the Naju Pear Research Center. The simulation images of fire blight were obtained at an altitude of over 12 m, with a spatial resolution of 0.32 cm/pixel.
Fig. 3 shows the location of an orchard in Cheonan, South Chungcheong Province, where the actual pear fire blight
occurred, and the autonomous flight path of the drone. Fig. 4 shows the flight path of a rotary-wing drone and a regular
image of the orchard at the pear lab in Naju, South Jeolla Province.
Pre-processing of RGB Images
Fig. 5 shows the image preprocessing process for deep-learning analysis of the image of pear fire blight.
As shown in Fig. 5, images acquired with the 1" 20 MP RGB camera mounted on the DJI Phantom4 Pro V2.0 drone were obtained at 5,472 pixels × 3,648 pixels. Pre-processing was performed by compressing the acquired images to 2,048 pixels × 4,096 pixels and dividing each compressed image into 16 images of 512 pixels × 1,024 pixels for normalization.

Fig. 5. Image preprocessing process for deep learning analysis of pear fire blight images.

Fig. 6. RGB image analysis algorithm using CNN in deep learning.
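As an illustration only (the authors' code is not published in the paper), the compress-and-tile step described above can be sketched in Python with NumPy; the 2,048 × 4,096 height-by-width layout below is an assumption about how the paper's stated dimensions map onto rows and columns:

```python
import numpy as np

def tile_image(img, tile_h=512, tile_w=1024):
    """Split an (H, W, C) array into equally sized non-overlapping tiles."""
    h, w = img.shape[:2]
    assert h % tile_h == 0 and w % tile_w == 0
    return [img[r:r + tile_h, c:c + tile_w]
            for r in range(0, h, tile_h)
            for c in range(0, w, tile_w)]

# Stand-in for one capture after compression to 2,048 x 4,096 pixels.
compressed = np.zeros((2048, 4096, 3), dtype=np.uint8)
tiles = tile_image(compressed)
print(len(tiles))  # 16 tiles of 512 x 1,024 pixels each
```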
Fire Blight Learning Algorithm
Fig. 6 shows the RGB image segmentation algorithm for fire blight using Deep Learning's CNN.
The fire blight RGB image analysis algorithm and graph creation used TensorFlow (ver. 1.2.1, Google, Menlo Park, USA), a numerical computation library based on the programming language Python.
The CNN architecture consists of an image input layer, convolution layers, batch normalization layers, ReLU layers, max pooling layers, transpose convolution layers, and a sigmoid layer. In particular, batch normalization layers were applied to improve the performance of fire blight image detection (Ji et al., 2018). Batch normalization is a method proposed to address the network's internal covariate shift: the unit outputs of the mini-batch passing through each affine layer are normalized toward a standard normal distribution. In this paper, the mean (μ_B) and variance (σ_B²) of the mini-batch were calculated, the activations were normalized (x̂_i) based on these values, and a scale and shift (γ, β) were applied, as given in equations (1) ‑ (5) (Ioffe and Szegedy, 2015).
B = {x_1, …, x_m}   (mini-batch)   (1)

μ_B ← (1/m) Σ_{i=1}^{m} x_i   (mini-batch mean)   (2)

σ_B² ← (1/m) Σ_{i=1}^{m} (x_i − μ_B)²   (mini-batch variance)   (3)

x̂_i ← (x_i − μ_B) / √(σ_B² + ε)   (normalize)   (4)

y_i ← γ x̂_i + β ≡ BN_{γ,β}(x_i)   (scale and shift)   (5)
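As an illustration of the batch normalization computation above (not the authors' code; in practice TensorFlow's batch normalization layer performs this internally, and also tracks running statistics for inference), a minimal NumPy sketch:

```python
import numpy as np

def batch_norm(x, gamma, beta, eps=1e-5):
    """Batch normalization over a mini-batch: mean, variance,
    normalization, then scale and shift."""
    mu = x.mean(axis=0)                    # mini-batch mean
    var = x.var(axis=0)                    # mini-batch variance
    x_hat = (x - mu) / np.sqrt(var + eps)  # normalize
    return gamma * x_hat + beta            # scale and shift

# Toy mini-batch of three samples with two features.
x = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
y = batch_norm(x, gamma=1.0, beta=0.0)
# Each column of y now has mean ~0 and variance ~1.
```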
Image Learning Similarity Evaluation
In this study, the Dice similarity coefficient was used as an overlap-based method of assessment. The Dice similarity coefficient (DSC) is defined in equation (6) and indicates similarity by directly comparing the result of automatic detection with the gold standard image detected by the naked eye (Kim and Kim, 2017).

DSC = 2|R_g ∩ R_a| / (|R_g| + |R_a|)   (6)

where R_g and R_a represent the segmented regions of interest in the gold standard image and the auto-detection image, respectively.
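For binary masks, equation (6) can be sketched as follows (illustrative only; `gold` and `auto` here stand for the gold standard and auto-detection regions):

```python
import numpy as np

def dice(gold, auto):
    """Dice similarity coefficient: 2|G ∩ A| / (|G| + |A|)
    for binary masks gold (G) and auto (A)."""
    gold = gold.astype(bool)
    auto = auto.astype(bool)
    denom = gold.sum() + auto.sum()
    if denom == 0:
        return 1.0  # both masks empty: treat as perfect agreement
    return 2.0 * np.logical_and(gold, auto).sum() / denom

g = np.array([[1, 1, 0], [0, 1, 0]])
a = np.array([[1, 0, 0], [0, 1, 1]])
print(dice(g, a))  # 2*2 / (3+3) = 0.666...
```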
The components of the error matrix TP, FP, TN, and FN are the True Positive, False Positive, True Negative, and False Negative counts, respectively; these values are defined through equations (7) ‑ (10) using the segmentation assignment functions f_g and f_a.

TP = Σ_x min(f_g¹(x), f_a¹(x))   (7)

FP = Σ_x min(f_g²(x), f_a¹(x))   (8)

TN = Σ_x min(f_g²(x), f_a²(x))   (9)

FN = Σ_x min(f_g¹(x), f_a²(x))   (10)
Fig. 7. Original and gold standard images used for image segmentation.
Table 3. Data set for fire blight analysis
Item                     Total   Training   Evaluation
Fire blight simulation    231      131         100
Fire blight               140      100          40
where f_g^i and f_a^i are the class assignment functions of the gold standard image and the auto-detection image, respectively: f_g^i(x) is defined as 1 if pixel x belongs to class i in the gold standard image and 0 otherwise, and f_a^i(x) is defined in the same way for the auto-detection image. Here, i = 1 denotes the segmented region of interest and i = 2 denotes the background other than the region of interest. Additionally, the learning variables of the graph were optimized according to the Dice loss value using the Adam optimizer.
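A small sketch (illustrative, not the authors' implementation) of the per-pixel TP/FP/TN/FN computation of equations (7) ‑ (10) for binary gold and auto-detection masks:

```python
import numpy as np

def confusion_from_masks(f_g, f_a):
    """Per-pixel TP, FP, TN, FN as sums of min() over the binary class
    assignment functions of the gold (f_g) and auto (f_a) masks.
    Class 1 is the region of interest, class 2 the background."""
    g1, a1 = f_g.astype(int), f_a.astype(int)   # class-1 (foreground) assignment
    g2, a2 = 1 - g1, 1 - a1                     # class-2 (background) assignment
    tp = np.minimum(g1, a1).sum()
    fp = np.minimum(g2, a1).sum()
    tn = np.minimum(g2, a2).sum()
    fn = np.minimum(g1, a2).sum()
    return tp, fp, tn, fn

g = np.array([[1, 1, 0], [0, 1, 0]])
a = np.array([[1, 0, 0], [0, 1, 1]])
print(confusion_from_masks(g, a))  # (2, 1, 2, 1)
```

Note that TP + FP + TN + FN equals the total pixel count, as it must for a partition into foreground and background.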
Fig. 7 shows the original image and gold standard image used in image segmentation.
Fire Blight Analysis Data Set
Table 3 shows the data sets for the analysis of fire blight. 131 images out of a total of 231 simulation images were used
for training and 100 images were used for learning results evaluation.
In the case of actual fire blight images, 100 images of the total of 140 images were used for training and 40 were used
for learning results evaluation.
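To make the optimization step concrete, the following toy sketch (illustrative only; the paper used TensorFlow's Adam optimizer on a CNN graph, and the learning rate and step count here are arbitrary assumptions) implements the Adam update rule and drives a sigmoid output toward a gold mask by minimizing a Dice-style loss:

```python
import numpy as np

def soft_dice_loss(pred, gold, eps=1e-7):
    """1 - soft Dice overlap between a probability map and a binary mask."""
    inter = (pred * gold).sum()
    return 1.0 - (2.0 * inter + eps) / (pred.sum() + gold.sum() + eps)

def adam_step(param, grad, state, lr=0.05, b1=0.9, b2=0.999, eps=1e-8):
    """One Adam update: first/second moment estimates with bias correction."""
    m, v, t = state
    t += 1
    m = b1 * m + (1 - b1) * grad
    v = b2 * v + (1 - b2) * grad ** 2
    m_hat = m / (1 - b1 ** t)
    v_hat = v / (1 - b2 ** t)
    return param - lr * m_hat / (np.sqrt(v_hat) + eps), (m, v, t)

sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
gold = np.array([1.0, 1.0, 0.0, 0.0])   # toy 4-pixel gold standard mask
logits = np.zeros(4)                     # the learning variables
state = (np.zeros(4), np.zeros(4), 0)
h = 1e-5
for _ in range(300):
    base = soft_dice_loss(sigmoid(logits), gold)
    grad = np.zeros(4)
    for i in range(4):                   # numerical gradient of the Dice loss
        bumped = logits.copy()
        bumped[i] += h
        grad[i] = (soft_dice_loss(sigmoid(bumped), gold) - base) / h
    logits, state = adam_step(logits, grad, state)

print(soft_dice_loss(sigmoid(logits), gold))  # close to 0 after training
```

In the actual system the gradient comes from backpropagation through the CNN rather than the finite differences used in this toy example.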
Results and Discussion
Learning Using a Fire Blight Simulation Branch
Fig. 8 illustrates the loss value during learning of the overlap-based fire blight imaging simulations using the Dice function. Training with the simulation images was iterated 3,600 times, and training was terminated at an average loss of about 22% to prevent overfitting to the image data.
Fig. 8. Loss value using Dice similarity coefficient.
Table 4. Results of the detection of a fire blight simulation using the CNN learning method in deep learning
                                    Prediction
Confusion matrix          Infection   No infection   Total   Prediction ratio (%)
Actual   Infection           64            16          80           80.0
         No infection         1            19          20           95.0
Detection of a Fire Blight Simulation Branch
Fig. 9 shows a still image acquired with the 1" 20 MP RGB camera at an altitude of 12 m and the result of detecting fire blight by deep learning.
As shown in Fig. 9, the areas marked by red ellipses are sites of fire blight onset, and the deep learning detection shows that the fire blight areas can be detected accurately. In particular, as shown in the enlarged image, ground shading and a dried branch of the orchard similar in color to a fire blight-infected branch were not mistakenly recognized as fire blight. However, there were also cases in which very small shaded areas of a normal branch were mis-detected, as seen in Fig. 10.
Table 4 shows the learning results for the pear RGB images taken at an altitude of 12 m. For the fire blight simulation branch learning, 131 fire blight simulation images were used.
As shown in Table 4, 100 images were used for the evaluation test, consisting of 80 images of fire blight-infected fruit trees and 20 images of non-infected fruit trees.
As a result, 64 of the 80 images of fruit trees infected with fire blight were recognized as infected while 16 images were recognized as non-infected, resulting in an 80.0% prediction rate. In addition, in the test of the 20 non-infected images, 19 images were recognized as non-infected and 1 image was recognized as infected, a 95.0% prediction rate. Therefore, it is believed that the deep-learning CNN algorithm developed in this study is sufficiently applicable to the detection of pear fire blight.
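For clarity, the prediction ratios in Table 4 follow directly from the reported counts:

```python
# Prediction ratios recomputed from Table 4 (simulation test set).
infected_hits, infected_total = 64, 80
healthy_hits, healthy_total = 19, 20
print(100 * infected_hits / infected_total)  # 80.0
print(100 * healthy_hits / healthy_total)    # 95.0
# Overall accuracy over the 100 test images:
print(100 * (infected_hits + healthy_hits) / (infected_total + healthy_total))  # 83.0
```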
(a) Original zoomed image
(b) Fire blight detection zoomed image
(c) Fire blight original image
(d) Fire blight detection image
Fig. 9. Results of the detection of a fire blight simulation image obtained from the 12-m altitude.
Fig. 10. Detection errors in the fire blight simulation by deep learning.
Detection of Branches of Fire Blight
Fig. 11 shows the detection results by size of each fire blight occurrence after learning from actual fire blight images acquired with the 1" 20 MP RGB camera at an altitude of 6 m, using the same algorithm as for the fire blight simulation analysis.
As shown in Fig. 11, the areas marked by solid red lines show large, medium, and small incidences of fire blight, about 50 cm, 20 cm, and 5 cm in size, respectively. In particular, areas of shading between the ground and tree branches are marked with red dotted lines, and fallen dried branches at the test sites were detected accurately without being recognized as fire blight. However, some of the test sites and very small shaded parts of normal branches were mis-detected, as shown in Fig. 12. To increase the predictive accuracy in the future, more learning with various types of images will be required.
(a) Large size (approximately 50 cm)
(b) Medium size (approximately 20 cm)
(c) Small size (approximately 5 cm)
Fig. 11. Detection results of images of pear fire blight acquired at a height of 6 m.
Fig. 12. Results of misdetection of fire blight by deep learning.
Table 5. Results of detection of actual fire blight by the CNN learning method in deep learning
                                    Prediction
Confusion matrix          Infection   No infection   Total   Prediction ratio (%)
Actual   Infection           15             6          21           71.4
         No infection         5            14          19           73.7
Table 5 shows the results of actual detection of fire blight using learning results from RGB images of pear fire blight
taken at a height of 6 m.
As shown in Table 5, 40 images were used for the evaluation test, consisting of 21 images of fire blight infection and 19 images of non-infected fruit trees.
As a result, 15 of the 21 images of infected fruit trees were recognized as fire blight, while 6 images were recognized as non-infected, resulting in a prediction rate of 71.4%. In addition, the test of the 19 images of the non-infected orchard showed that 14 images were recognized as non-infected and 5 images were recognized as infected, a prediction rate of 73.7%.
Therefore, fire blight can be detected by applying a deep-learning CNN to images obtained with drones in a pear orchard. In particular, if this detection system can find localized fire blight in a pear orchard during actual field forecasting, the spread of the disease and its damage can be reduced, as can the manpower and time required for wide-area forecasting.
Conclusions
In this study, an RGB sensor was mounted on a rotary-wing drone to obtain RGB images of the fire blight simulation site and of actual fire blight infections, and a system was developed that can determine the occurrence of fire blight in the field by analyzing the fire blight detection results with a deep-learning CNN. Images of infected areas were acquired with a spatial resolution of 0.11 cm/pixel at an altitude of 6 m, and simulation images with a spatial resolution of 0.32 cm/pixel at an altitude of 12 m. Of a total of 231 simulation images, 131 were used for training and 100 for evaluation of the learning results. In addition, of a total of 140 images of actual infection, 100 were used for training and 40 for evaluation. The fire blight image learning algorithm used TensorFlow, a library of numerical
calculations based on the programming language Python. The similarity of the image learning was evaluated by directly comparing the gold standard image with the automatic detection result, and training was terminated at an average loss of about 22% to prevent overfitting to the image data.
As a result, 64 of the 80 fire blight-infected fruit tree images were recognized as infected, while 16 were recognized as non-infected, for a prediction rate of 80.0%. In addition, in the test of the 20 non-infected images, 19 were recognized as non-infected and only 1 was recognized as infected, a 95.0% prediction rate. In the case of actual fire blight-infected branches, 15 of the 21 images of infected fruit trees were recognized as infected, while 6 were recognized as non-infected, for a prediction rate of 71.4%. In addition, the test of the 19 images of the non-infected orchard showed that 14 were recognized as non-infected and 5 were recognized as infected, a prediction rate of 73.7%.
Therefore, the system developed in this study can analyze the occurrence of pear fire blight with high accuracy by applying a deep-learning CNN to images acquired with drones. In particular, by applying the developed system to sites where pear fire blight is suspected, the manpower and time required to forecast fire blight areas can be reduced, and the spread of the disease and the need for orchard destruction can also be reduced. More images of the disease will need to be collected and learned in order to increase the accuracy of fire blight prediction. It is also necessary to analyze the possibility of detecting the highly damaging apple fire blight using the same system.
Literature Cited
Bahadou SA, Ouijja A, Karfach A, Tahiri A, Lahlali R (2018) New potential bacterial antagonists for the biocontrol of fire blight disease
(Erwinia amylovora) in Morocco. Microb Pathog 117:7-15. doi:10.1016/j.micpath.2018.02.011
Calzolari A, Finelli F, Mazzoli GL (1999) A severe unforeseen outbreak of fire blight in the Emilia-Romagna Region. Acta Hortic
489:171-176. doi:10.17660/ActaHortic.1999.489.26
Christ PF, Elshaer M, Ettlinger F, Tatavarty S, Bickel M, Bilic P, Rempfler M, Armbruster M, Hofmann F, et al. (2016) Automatic liver and
lesion segmentation in CT using cascaded fully convolutional neural networks and 3D conditional random fields. International
Conference on Medical Image Computing and Computer-assisted Intervention, Springer, pp 415-423. doi:10.1007/978-3-319-46723-
8_48
Heo GI, Shin DS, Son SH, Oh CS, Park DH, Lee YK, Cha JS (2017) On-site diagnosis of fire blight with antibody-based diagnostic strips. Res
Plant Dis 23:306-313
Ioffe S, Szegedy C (2015) Batch normalization: Accelerating deep network training by reducing internal covariate shift. Int Conf Mach Learn 37:448-456
Jeong US, Kim SS, Kim TH, Seo ST (2018) Sharing loss compensation between the owner and the lessee caused by disease control order
for fire blight in fruit. Korean J Agric Manag Policy 45:291-314. doi:10.30805/KJAMP.2018.45.2.291
Ji MG, Chun JC, Kim NG (2018) An improved image classification using batch normalization and CNN. J Internet Comput Serv 19:35-42
Kim JW, Kim JH (2017) Review of evaluation metrics for 3D medical image segmentation. J Korean Soc Imaging Inform Med 23:14-20
Lee MS, Lee IG, Kim SK, Oh CS, Park DH (2018) In vitro screening of antibacterial agents for suppression of fire blight disease in Korea.
Res Plant Dis 24:41-51
Lin BS, Michael K, Kalra S, Tizhoosh HR (2017) Skin lesion segmentation: U-nets versus clustering. 2017 IEEE Symposium Series on Computational Intelligence (SSCI), IEEE, pp 1-7. doi:10.1109/SSCI.2017.8280804
Litjens G, Kooi T, Bejnordi BE, Setio AAA, Ciompi F, Ghafoorian M, van der Laak J, van Ginneken B, Sánchez CI (2017) A survey on deep learning in medical image analysis. Med Image Anal 42:60-88. doi:10.1016/j.media.2017.07.005
McGhee GC, Sundin GW (2011) Evaluation of kasugamycin for fire blight management, effect on nontarget bacteria, and assessment of kasugamycin resistance potential in Erwinia amylovora. Phytopathology 101:192-204. doi:10.1094/PHYTO-04-10-0128
Moon T, Choi HY, Jung DH, Chang SH, Son JE (2020) Prediction of CO2 concentration via long short-term memory using environmental factors in greenhouses. Hortic Sci Technol 38:201-209
Nam DS, Moon T, Lee JW, Son JE (2019) Estimating transpiration rates of hydroponically-grown paprika via an artificial neural network
using aerial and root-zone environments and growth factors in greenhouses. Hortic Environ Biotechnol 60:913-923.
doi:10.1007/s13580-019-00183-z
Park DH, Lee YG, Kim JS, Cha JS, Oh CS (2017) Current status of fire blight caused by Erwinia amylovora and action for its management
in Korea. J Plant Pathol 99:59-63
Ronneberger O, Fischer P, Brox T (2015) U-net: Convolutional networks for biomedical image segmentation. International Conference
on Medical Image Computing and Computer-assisted Intervention, Springer, pp 234-241. doi:10.1007/978-3-319-24574-4_28
Rural Development Administration (RDA) (2020) Occurrence of fire blight and control. RDA, Jeonju, Korea
Shin DS, Heo GI, Son SH, Oh CS, Lee YK, Cha JS (2018) Development of an improved loop-mediated isothermal amplification assay for
on-site diagnosis of fire blight in apple and pear. J Plant Pathol 34:191-198
Yann L, Bottou L, Bengio Y, Haffner P (1998) Gradient-based learning applied to document recognition. Proc IEEE 86:2278-2324.
doi:10.1109/5.726791
Zhao Y, Tian Y, Wang L, Geng G, Zhao W, Hu B, Zhao Y (2019) Fire blight disease, a fast-approaching threat to apple and pear production in China. J Integr Agric 18:815-820. doi:10.1016/S2095-3119(18)62033-7