Research Article

Dynamic Multi-Swarm Differential Learning Quantum Bird Swarm Algorithm and Its Application in Random Forest Classification Model

Jiangnan Zhang, Kewen Xia, Ziping He, and Shurui Fan
School of Electronic and Information Engineering, Hebei University of Technology, Tianjin 300401, China
Correspondence should be addressed to Kewen Xia; [email protected]

Received 14 August 2019; Revised 21 June 2020; Accepted 24 June 2020; Published 7 August 2020
Academic Editor: Raşit Köker

Copyright © 2020 Jiangnan Zhang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The bird swarm algorithm (BSA) is one of the swarm intelligence algorithms proposed recently. However, the original bird swarm algorithm has some drawbacks, such as a tendency to fall into local optima and slow convergence. To overcome these shortcomings, a dynamic multi-swarm differential learning quantum bird swarm algorithm combining three hybrid strategies was established. First, a dynamic multi-swarm bird swarm algorithm was established, and a differential evolution strategy was adopted to enhance the randomness of the foraging behavior's movement, which gives the bird swarm algorithm a stronger global exploration capability. Next, quantum behavior was introduced into the bird swarm algorithm for a more efficient search of the solution space. Then, the improved bird swarm algorithm was used to optimize the number of decision trees and the number of predictor variables of the random forest classification model. In the experiments, 18 benchmark functions, 30 CEC2014 functions, and 8 UCI datasets were tested, showing that the improved algorithm and model are very competitive and outperform the other algorithms and models. Finally, the resulting random forest classification model was applied to actual oil logging prediction.
As the experimental results show, the three strategies significantly boost the performance of the bird swarm algorithm, and the proposed learning scheme yields a more stable random forest classification model with higher accuracy and efficiency than its counterparts.

1. Introduction

The concept of swarm intelligence was first proposed by Hackwood and Beni in 1992 [1]. Swarm intelligence algorithms have been proved able to solve nondifferentiable problems, NP-hard problems, and difficult nonlinear problems that traditional techniques cannot solve. For this reason, swarm intelligence algorithms are a hot research topic in computer science and have been updated from generation to generation. Among the classic swarm intelligence algorithms, particle swarm optimization (PSO) [2] is used to define the basic principle and equations of swarm intelligence algorithms. In recent years, many new swarm intelligence algorithms have been proposed, such as the artificial bee colony (ABC) algorithm [3], which is inspired by the food-foraging behavior of bees. The artificial fish school algorithm (AFSA) [4] and the firefly algorithm (FA) [5] are inspired by the foraging processes of fish and fireflies, and cat swarm optimization (CSO) [6] is developed based on the vigilance and foraging behavior of cats in nature. Based on the foraging behavior, vigilance behavior, and flight behavior of bird swarms in nature, Meng et al. proposed a novel swarm intelligence algorithm called the bird swarm algorithm (BSA) [7]. Meanwhile, owing to the advantages above, swarm intelligence algorithms have been applied to optimization in various fields, such as PSO for mutation testing problems [8], the genetic algorithm (GA) for convolutional neural network parameters [9], FA for convolutional neural network problems [10], and the whale optimization algorithm (WOA) for cloud computing environments [11]. Likewise, BSA, which will be used in this paper, has been widely applied to engineering optimization problems.
Hindawi, Computational Intelligence and Neuroscience, Volume 2020, Article ID 6858541, 24 pages. https://doi.org/10.1155/2020/6858541



However, the original swarm intelligence algorithms have limitations in solving some practical problems. The hybrid strategy, one of the main research directions for improving the performance of swarm intelligence algorithms, has become a research hotspot in machine learning. Tuba and Bacanin [12] modified the exploitation process of the original seeker optimization algorithm (SOA) by hybridizing it with FA, which overcame its shortcomings and outperformed other algorithms. Strumberger et al. [13] also proposed a dynamic search tree growth algorithm (TGA) and hybridized elephant herding optimization (EHO) with ABC, and the simulation results showed that the proposed approach was viable and effective. Yang [14] analyzed swarm intelligence algorithms using differential evolution, dynamic systems, self-organization, and a Markov chain framework; the discussion demonstrates that hybrid algorithms have some advantages over traditional algorithms. Bacanin and Tuba [15] proposed a modified ABC based on GA, and the obtained results show that the hybrid ABC provides competitive results and outperforms other counterparts. Liu et al. [16] presented a multistrategy brain storm optimization (BSO) with dynamic parameter adjustment, which is more competitive than other related algorithms. Peng et al. [17] proposed an FA with a luciferase inhibition mechanism to improve the effectiveness of selection; the simulation results showed that the proposed approach has the best performance on some complex functions. Peng et al. [18] also developed a hybrid ABC that uses a best-neighbor-guided solution search strategy; the experimental results indicate that the proposed ABC is very competitive and outperforms the other algorithms. It can be seen that the hybrid strategy is a successful way to improve swarm intelligence algorithms, so BSA will be improved by a hybrid strategy in this paper.

Similarly, BSA can be applied to multiple fields, especially parameter estimation, and the hybrid strategy is also the main improvement route for BSA. In 2017, Xu et al. [19] proposed an improved boundary BSA (IBBSA) for chaotic system optimization of the Lorenz system and the coupling motor system. However, the improved boundary learning strategy has randomness, which keeps the generalization performance of IBBSA low. Yang and Liu [20] introduced a dynamic weight into the foraging formula of BSA (IBSA), which provides a solution to the anti-same-frequency interference problem of shipborne radar. The results show that the dynamic weight is introduced only into the foraging formula of BSA, and IBSA ignores the impact of population initialization. Wang et al. [21] designed a strategy named "disturbing the local optimum" to help the original BSA converge to the global optimal solution faster and more stably. However, "disturbing the local optimum" also has randomness, which limits the generalization performance of the improved BSA.

Like many swarm intelligence algorithms, BSA also faces the problems of being trapped in local optima and slow convergence. These disadvantages limit the wider application of BSA. In this paper, a dynamic multi-swarm differential learning quantum BSA, called DMSDL-QBSA, is proposed, which introduces three hybrid strategies into the original BSA to improve its effectiveness. Motivated by the insufficient generalization ability reported in the literature [19, 21], we first establish a dynamic multi-swarm bird swarm algorithm (DMS-BSA) and merge the differential evolution operator into each sub-swarm of the DMS-BSA, which improves both the local and the global search capability of foraging behavior. Second, considering the neglected impact of population initialization in the literature [20], where quantum behavior was used to optimize particle swarm optimization in order to obtain a good ability to jump out of local optima, we use the quantum system to initialize the search space of the birds. Consequently, this improves the convergence rate of the whole population and keeps BSA from falling into a local optimum. To validate the effectiveness of the proposed method, we have evaluated the performance of DMSDL-QBSA on classical benchmark functions and CEC2014 functions, including unimodal and multimodal functions, in comparison with state-of-the-art methods and new popular algorithms. The experimental results show that the three improvement strategies significantly boost the performance of BSA.

Based on the DMSDL-QBSA, an effective hybrid random forest (RF) model for actual oil logging prediction, called the DMSDL-QBSA-RF approach, is established. RF is nonlinear and anti-interference [22]. In addition, it can decrease the possibility of overfitting, which often occurs in actual logging. RF has been widely used in various classification problems, but it has not yet been applied to the field of actual logging. Parameter estimation is a prerequisite for the RF classification model. The two key parameters of RF are the number of decision trees and the number of predictor variables; the former is called ntree and the latter is called mtry. Meanwhile, parameter estimation of the model is a complex optimization problem that traditional methods might fail to solve. Many works have proposed using swarm intelligence algorithms to find the best parameters of the RF model. Ma and Fan [23] adopted AFSA and PSO to optimize the parameters of RF. Hou et al. [24] used DE to obtain an optimal set of initial parameters for RF. Liu et al. [25] compared genetic algorithms, simulated annealing, and hill-climbing algorithms for optimizing the parameters of RF. From these papers, we can see that metaheuristic algorithms are suitable for this problem. In this study, the DMSDL-QBSA is used to optimize the two key parameters, which can improve accuracy without overfitting for RF. When investigating the performance of the DMSDL-QBSA-RF classification model compared with 3 swarm intelligence algorithm-based RF methods, 8 two-dimensional UCI datasets are applied. As the experimental results show, the proposed learning scheme guarantees a more stable RF classification model with higher predictive accuracy than its counterparts. The main contributions of this paper are as follows.


(i) In order to achieve a better balance between efficiency and speed for BSA, we have studied the effects of the different hybrid strategies (the dynamic multi-swarm method, differential evolution, and quantum behavior) on the performance of BSA.

(ii) The proposed DMSDL-QBSA has successfully optimized the ntree and mtry setting problem of RF. The resulting hybrid classification model has been rigorously evaluated on oil logging prediction.

(iii) The proposed hybrid classification model delivers better classification performance and offers more accurate and faster results when compared to other swarm intelligence algorithm-based RF models.
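To make the parameter-tuning task above concrete, the objective a swarm optimizer such as DMSDL-QBSA would minimize can be sketched in Python. This is an illustrative sketch only: scikit-learn's RandomForestClassifier stands in for the paper's RF implementation, and the synthetic dataset, fold count, and candidate parameter values are assumptions.

```python
# Sketch of the fitness function a swarm optimizer would call when
# tuning RF's two key parameters (ntree, mtry).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for a UCI / logging dataset (assumption).
X, y = make_classification(n_samples=200, n_features=8, random_state=0)

def rf_fitness(ntree, mtry):
    """Classification error for a candidate (ntree, mtry) pair."""
    model = RandomForestClassifier(n_estimators=int(ntree),
                                   max_features=int(mtry),
                                   random_state=0)
    acc = cross_val_score(model, X, y, cv=3).mean()
    return 1.0 - acc  # the optimizer minimizes the error

err = rf_fitness(ntree=50, mtry=3)
print(0.0 <= err <= 1.0)
```

The optimizer then treats `rf_fitness` as a 2-dimensional black-box objective, which is why Dim = 2 appears in the experiments below.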

2. Bird Swarm Algorithm and Its Improvement

2.1. Bird Swarm Algorithm Principle. BSA, as proposed by Meng et al. in 2015, is a new intelligent bionic algorithm based on multigroup and multisearch methods; it mimics the birds' foraging behavior, vigilance behavior, and flight behavior and employs this swarm intelligence to solve optimization problems. The bird swarm algorithm can be simplified by the following five rules.

Rule 1: Each bird can switch between vigilance behavior and foraging behavior, and whether a bird forages or keeps vigilance is mimicked as a random decision.

Rule 2: When foraging, each bird records and updates its previous best experience and the swarm's previous best experience with food patches. This experience can also be used to search for food. Social information is shared instantly across the whole swarm.

Rule 3: When keeping vigilance, each bird tries to move towards the center of the swarm. This behavior may be influenced by disturbances caused by swarm competition. Birds with higher food reserves are more likely to be near the swarm's center than birds with lower reserves.

Rule 4: Birds fly to another place regularly. When flying to another place, birds often switch between producing and scrounging. The bird with the highest reserves is a producer and the bird with the lowest is a scrounger; the other birds randomly choose to be producers or scroungers.

Rule 5: Producers actively seek food. Scroungers randomly follow a producer to look for food.

According to Rule 1, we define the time interval of each bird's flight behavior FQ, the probability of foraging behavior P (P ∈ (0, 1)), and a uniform random number δ ∈ (0, 1).

(1) Foraging behavior. If the number of the iteration is less than FQ and δ ≤ P, the bird performs foraging behavior. Rule 2 can be written mathematically as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \left(p_{i,j}^{t} - x_{i,j}^{t}\right) \times C \times \mathrm{rand}(0,1) + \left(g_{j}^{t} - x_{i,j}^{t}\right) \times S \times \mathrm{rand}(0,1), \qquad (1)$$

where C and S are two positive numbers; the former is called the cognitive accelerated coefficient and the latter the social accelerated coefficient. Here, $p_{i,j}$ is the i-th bird's best previous position and $g_{j}$ is the swarm's best previous position.
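As an illustrative sketch (not the authors' code), the foraging update of equation (1) can be expressed in Python; the coefficient values C and S and the toy positions are assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def foraging_update(x, p, g, C=1.5, S=1.5):
    """Foraging behavior of equation (1): move toward the bird's own best
    position p and the swarm's best position g with random cognitive and
    social steps."""
    x = np.asarray(x, dtype=float)
    return (x
            + (p - x) * C * rng.random(x.shape)
            + (g - x) * S * rng.random(x.shape))

x_new = foraging_update(x=np.array([1.0, 2.0]),
                        p=np.array([0.5, 1.5]),
                        g=np.array([0.0, 1.0]))
print(x_new.shape)
```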

(2) Vigilance behavior. If the number of the iteration is less than FQ and δ > P, the bird performs vigilance behavior. Rule 3 can be written mathematically as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + A_{1}\left(\mathrm{mean}_{j}^{t} - x_{i,j}^{t}\right) \times \mathrm{rand}(0,1) + A_{2}\left(p_{k,j}^{t} - x_{i,j}^{t}\right) \times \mathrm{rand}(-1,1), \qquad (2)$$

$$A_{1} = a_{1} \times \exp\left(-\frac{\mathrm{pFit}_{i}}{\mathrm{sumFit} + \varepsilon} \times N\right), \qquad (3)$$

$$A_{2} = a_{2} \times \exp\left(\frac{\mathrm{pFit}_{i} - \mathrm{pFit}_{k}}{\left|\mathrm{pFit}_{k} - \mathrm{pFit}_{i}\right| + \varepsilon} \times \frac{N \times \mathrm{pFit}_{k}}{\mathrm{sumFit} + \varepsilon}\right), \qquad (4)$$

where $a_{1}$ and $a_{2}$ are two positive constants in [0, 2], $\mathrm{pFit}_{i}$ is the best fitness value of the i-th bird, and sumFit is the sum of the swarm's best fitness values. Here, ε, which is used to avoid a zero-division error, is the smallest constant in the computer, and $\mathrm{mean}_{j}$ denotes the j-th element of the whole swarm's average position.
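A minimal Python sketch of the vigilance update of equations (2)-(4) follows; the constants a1 and a2 and the toy swarm are assumptions, and pFit is taken here as a vector of positive fitness values:

```python
import numpy as np

rng = np.random.default_rng(1)

def vigilance_update(i, k, X, P, pFit, N, a1=1.0, a2=1.0, eps=1e-12):
    """Vigilance behavior, equations (2)-(4): bird i moves toward the
    swarm's average position, perturbed by a random bird k's best
    position P[k] (k != i)."""
    sumFit = pFit.sum()
    mean = X.mean(axis=0)                                    # swarm average
    A1 = a1 * np.exp(-pFit[i] / (sumFit + eps) * N)          # eq. (3)
    A2 = a2 * np.exp((pFit[i] - pFit[k]) / (abs(pFit[k] - pFit[i]) + eps)
                     * (N * pFit[k] / (sumFit + eps)))       # eq. (4)
    D = X.shape[1]
    return (X[i]
            + A1 * (mean - X[i]) * rng.random(D)
            + A2 * (P[k] - X[i]) * rng.uniform(-1, 1, D))    # eq. (2)

X = rng.random((5, 2))
P = X.copy()                 # personal bests (toy: equal to positions)
pFit = rng.random(5)
x_new = vigilance_update(0, 3, X, P, pFit, N=5)
print(x_new.shape)
```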

(3) Flight behavior. If the number of the iteration equals FQ, the bird performs flight behavior, which can be divided into the behaviors of producers and scroungers according to fitness. Rules 4 and 5 can be written mathematically as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \mathrm{randn}(0,1) \times x_{i,j}^{t}, \qquad (5)$$

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \left(x_{k,j}^{t} - x_{i,j}^{t}\right) \times FL \times \mathrm{rand}(0,1), \qquad (6)$$

where FL (FL ∈ [0, 2]) means that the scrounger follows the producer to search for food.
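The two flight updates of equations (5) and (6) can be sketched as follows (illustrative only; the FL value and toy vectors are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)

def producer_update(x):
    """Producer flight, equation (5): random multiplicative exploration."""
    return x + rng.standard_normal(x.shape) * x

def scrounger_update(x, x_producer, FL=1.0):
    """Scrounger flight, equation (6): follow a producer with step FL."""
    return x + (x_producer - x) * FL * rng.random(x.shape)

x = np.array([1.0, -2.0])
xp = producer_update(x)
xs = scrounger_update(x, np.array([0.0, 0.0]))
print(xp.shape, xs.shape)
```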

2.2. The Bird Swarm Algorithm Based on the Dynamic Multi-Swarm Method, Differential Evolution, and Quantum Behavior

2.2.1. The Foraging Behavior Based on the Dynamic Multi-Swarm Method. The dynamic multi-swarm method has been widely used in real-world applications because it is efficient and easy to implement. It is also common in improvements of swarm intelligence optimizers, such as the coevolutionary algorithm [26], the framework of evolutionary algorithms [27], multiobjective particle swarm optimization [28], hybrid dynamic robust optimization [29], and the PSO algorithm [30, 31]. However, the PSO algorithm easily falls into local optima, and its generalization performance is not high. Consequently, motivated by these studies, we establish a dynamic multi-swarm bird swarm algorithm (DMS-BSA), which improves the local search capability of foraging behavior.

In DMS-PSO, the whole population is divided into many small swarms, which are frequently regrouped using various regrouping schedules to exchange information. The velocity update strategy is

$$v_{i,j}^{t+1} = \omega v_{i,j}^{t} + c_{2} r_{2}\left(l_{i,j}^{t} - x_{i,j}^{t}\right), \qquad (7)$$

where $l_{i,j}$ is the best historical position achieved within the local community (sub-swarm) of the i-th particle.

According to the characteristics of equation (1), we can see that the foraging behavior formula of BSA is similar to the particle velocity update formula of PSO. So, according to the characteristics of equation (7), we can get the improved foraging behavior formula as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \left(GV_{i,j}^{t} - x_{i,j}^{t}\right) \times C \times \mathrm{rand}(0,1), \qquad (8)$$

where GV is called the guiding vector.

The dynamic multi-swarm method is used to improve the local search capability, while the guiding vector GV can enhance the global search capability of foraging behavior. Obviously, we need to build a good guiding vector.
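A minimal sketch of the dynamic multi-swarm bookkeeping described above, assuming minimization, a random partition into equal sub-swarms, and toy data:

```python
import numpy as np

rng = np.random.default_rng(3)

def regroup(N, n_swarms):
    """Randomly partition bird indices into sub-swarms (dynamic
    multi-swarm regrouping)."""
    idx = rng.permutation(N)
    return np.array_split(idx, n_swarms)

def local_bests(P, pFit, swarms):
    """Best previous position l within each sub-swarm, as used by the
    lbest terms of equations (7) and (9)."""
    return {tuple(s): P[s[np.argmin(pFit[s])]] for s in swarms}

P = rng.random((12, 2))      # personal best positions
pFit = rng.random(12)        # personal best fitness (minimization)
swarms = regroup(12, 3)
lbest = local_bests(P, pFit, swarms)
print(len(swarms), len(lbest))
```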

2.2.2. The Guiding Vector Based on Differential Evolution. Differential evolution (DE) is a powerful evolutionary algorithm with three differential evolution operators for solving tough global optimization problems [32]. Besides, DE has attracted increasing attention from scholars in evolutionary computation, with variants such as hybrid multiple crossover operations [33] and the proposed DE/neighbor/1 [34], owing to its excellent global search capability. From these studies, we can see that DE has good global search capability, so we establish the guiding vector GV based on the differential evolution operators to improve the global search capability of foraging behavior. The detailed implementation of GV is presented as follows.

(1) Differential mutation. According to the characteristics of equation (8), the "DE/best/1", "DE/best/2", and "DE/current-to-best/1" mutation strategies are suitable. The experiments in the literature [31] showed that the "DE/best/1" mutation strategy is the most suitable in DMS-PSO, so we choose this mutation strategy in BSA. The "DE/lbest/1" mutation strategy can be written as follows:

$$\text{DE/lbest/1:}\quad v_{i,j}^{t} = l_{i,j}^{t} + F \times \left(p_{r1,j}^{t} - p_{r2,j}^{t}\right). \qquad (9)$$

Note that some components of the mutant vector $v_{i,j}$ may violate predefined boundary constraints. In this case, boundary processing is used, which can be expressed as follows:

$$v_{i,j}^{t} = \begin{cases} x_{lb}, & v_{i,j}^{t} < x_{lb}, \\ x_{ub}, & v_{i,j}^{t} \geq x_{ub}. \end{cases} \qquad (10)$$

(2) Crossover. After differential mutation, a binomial crossover operation exchanges some components of the mutant vector $v_{i,j}$ with the best previous position $p_{i,j}$ to generate the target vector $u_{i,j}$. The process can be expressed as

$$u_{i,j}^{t} = \begin{cases} v_{i,j}^{t}, & \mathrm{rand} \leq CR \ \text{or} \ j = \mathrm{rand}(i), \\ p_{i,j}^{t}, & \text{otherwise}. \end{cases} \qquad (11)$$

(3) Selection. Because the purpose of BSA is to find the best fitness, a selection operation chooses the vector with the better fitness to enter the next generation, which yields the selection operator, namely, the guiding vector GV. The process can be expressed as follows:

$$GV_{i,j}^{t+1} = \begin{cases} u_{i,j}^{t}, & \mathrm{fitness}\left(u_{i,j}^{t}\right) \leq \mathrm{fitness}\left(x_{i,j}^{t}\right), \\ p_{i,j}^{t}, & \text{otherwise}. \end{cases} \qquad (12)$$
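Putting equations (9)-(12) together, the construction of the guiding vector GV for one bird can be sketched as follows; F, CR, the bounds, and the sphere test function are assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

def guiding_vector(lbest, P, x, fitness, i, F=0.5, CR=0.9,
                   xlb=-10.0, xub=10.0):
    """Construct GV for bird i via DE/lbest/1 mutation (9), boundary
    handling (10), binomial crossover (11), and greedy selection (12)."""
    N, D = P.shape
    r1, r2 = rng.choice([r for r in range(N) if r != i], 2, replace=False)
    v = lbest + F * (P[r1] - P[r2])            # eq. (9) mutation
    v = np.clip(v, xlb, xub)                   # eq. (10) boundary handling
    jrand = rng.integers(D)
    mask = rng.random(D) <= CR
    mask[jrand] = True                         # at least one mutant gene
    u = np.where(mask, v, P[i])                # eq. (11) crossover
    return u if fitness(u) <= fitness(x) else P[i]   # eq. (12) selection

sphere = lambda z: float(np.sum(z ** 2))       # toy fitness (assumption)
P = rng.uniform(-5, 5, (6, 3))                 # personal bests
x = rng.uniform(-5, 5, 3)                      # current position of bird 0
gv = guiding_vector(P.mean(axis=0), P, x, sphere, i=0)
print(gv.shape)
```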

2.2.3. The Initialization of the Search Space Based on Quantum Behavior. Quantum behavior is a nonlinear superposition system. With its simple and effective characteristics and its good performance in global optimization, it has been applied to optimize many algorithms, such as particle swarm optimization [35] and the pigeon-inspired optimization algorithm [36]. Consequently, following these studies and its excellent global optimization performance, we use the quantum system to initialize the search space of the birds.

The quantum-behaved particle position can be written mathematically as follows:

$$x^{t+1} = p^{t} \pm \frac{L^{t}}{2} \ln\frac{1}{u}, \qquad (13)$$

$$p^{t} = c_{1} \times \mathrm{pbest}^{t} + \left(1 - c_{1}\right) \times \mathrm{gbest}^{t}, \qquad (14)$$

$$L^{t} = 2\beta\left|p^{t} - x^{t}\right|. \qquad (15)$$

According to the characteristics of equations (13)–(15), we can get the improved search-space initialization formulas as follows:

$$x_{i}^{t} = lb + (ub - lb) \times \mathrm{rand}, \qquad (16)$$

$$x_{i}^{t+1} = x_{i}^{t} \pm \beta\left|\mathrm{mbest} - x_{i}^{t}\right| \times \ln\frac{1}{u}, \qquad (17)$$

where β is a positive number called the contraction-expansion factor. Here, $x_{i}^{t}$ is the position of the particle at the previous moment, and mbest is the average of the best previous positions of all the birds (Algorithm 1).
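Equations (16) and (17) can be sketched as a population initializer; the β value, the random sign, and the clipping of out-of-range positions are implementation assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

def quantum_init(N, D, lb, ub, beta=0.75):
    """Quantum-behaved initialization of equations (16)-(17): uniform
    seeding followed by a quantum jump around the mean position."""
    X = lb + (ub - lb) * rng.random((N, D))                # eq. (16)
    mbest = X.mean(axis=0)                                 # mean position
    u = rng.random((N, D))
    sign = np.where(rng.random((N, D)) < 0.5, 1.0, -1.0)   # the +/- choice
    X = X + sign * beta * np.abs(mbest - X) * np.log(1.0 / u)  # eq. (17)
    return np.clip(X, lb, ub)                              # keep in bounds

Xq = quantum_init(4, 2, -5.0, 5.0)
print(Xq.shape)
```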

2.2.4. Procedures of the DMSDL-QBSA. In Sections 2.2.1–2.2.3, in order to improve the local search capability and the global search capability of BSA, this paper has improved BSA in three parts:

(1) To improve the local search capability of foraging behavior in BSA, we put forward equation (8) based on the dynamic multi-swarm method.

(2) To obtain the guiding vector that improves the global search capability of foraging behavior in BSA, we put forward equations (9), (11), and (12) based on differential evolution.

(3) To expand the initialization search space of the birds and improve the global search capability of BSA, we put forward equations (16) and (17) based on quantum behavior.

Finally, the steps of DMSDL-QBSA are shown in Algorithm 1.

2.3. Simulation Experiment and Analysis. This section presents the evaluation of DMSDL-QBSA using a series of experiments on benchmark functions and CEC2014 test functions. All experiments in this paper are implemented using MATLAB R2014b on Windows 7 (64-bit) with an Intel(R) Core(TM) i5-2450M CPU @ 2.50 GHz and 4.00 GB RAM. To obtain fair results, all the experiments were conducted under the same conditions. The population size is set to 30 in all algorithms, and each algorithm runs 30 times independently for each function.

2.3.1. Benchmark Functions and CEC2014 Test Functions. When investigating the effectiveness and generality of DMSDL-QBSA compared with several hybrid algorithms and popular algorithms, 18 benchmark functions and the CEC2014 test functions are applied. To test the effectiveness of the proposed DMSDL-QBSA, 18 benchmark functions [37] are adopted, all of which have an optimal value of 0. The benchmark functions and their search ranges are shown in Table 1. In this test suite, f1–f9 are unimodal functions. These unimodal functions are usually used to investigate whether the proposed algorithm has good convergence performance. Then, f10–f18 are multimodal functions, which are used to test the global search capability of the proposed algorithm. The smaller the fitness value of a function, the better the algorithm performs. Furthermore, in order to verify the comprehensive performance of DMSDL-QBSA more thoroughly, another 30 complex CEC2014 benchmarks are used. The CEC2014 benchmark functions are briefly described in Table 2.
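For concreteness, two functions of the kind used in such test suites, one unimodal and one multimodal, both with optimum 0 at the origin, can be written as follows (generic stand-ins, not necessarily the paper's exact f1–f18):

```python
import numpy as np

def sphere(x):
    """Typical unimodal benchmark: smooth bowl, optimum 0 at the origin."""
    return float(np.sum(np.asarray(x, dtype=float) ** 2))

def rastrigin(x):
    """Typical multimodal benchmark: many local minima, optimum 0 at the
    origin, used to probe global search capability."""
    x = np.asarray(x, dtype=float)
    return float(10 * x.size + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x)))

print(sphere(np.zeros(10)), rastrigin(np.zeros(10)))  # both 0.0
```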

2.3.2. Parameter Settings. In order to verify the effectiveness and generalization of the proposed DMSDL-QBSA, it is compared with several hybrid algorithms: BSA [7], DE [32], DMSDL-PSO [31], and DMSDL-BSA. Another 5 popular intelligence algorithms, namely, grey wolf optimizer (GWO) [38], whale optimization algorithm (WOA) [39], sine cosine algorithm (SCA) [40], grasshopper optimization algorithm (GOA) [41], and sparrow search algorithm (SSA) [42], are also used for comparison with DMSDL-QBSA. These state-of-the-art algorithms allow the performance of DMSDL-QBSA to be verified more comprehensively. For fair comparison, the population size of all algorithms is set to 30, and the other parameters of all algorithms are set according to their original papers. The parameter settings of the algorithms involved are shown in detail in Table 3.

2.3.3. Comparison on Benchmark Functions with Hybrid Algorithms. According to Section 2.2, three hybrid strategies (the dynamic multi-swarm method, DE, and quantum behavior) have been combined with the basic BSA method. When investigating the effectiveness of DMSDL-QBSA compared with several hybrid algorithms, namely, BSA, DE, DMSDL-PSO, and DMSDL-BSA, the 18 benchmark functions are applied. Compared with DMSDL-QBSA, quantum behavior is not used in the dynamic multi-swarm differential learning bird swarm algorithm (DMSDL-BSA). The number of function evaluations (FEs) is 10000. We selected two different dimension sizes (Dim): Dim = 10 is the typical dimension for the benchmark functions, and Dim = 2 is used because RF has two parameters that need to be optimized, which means that the optimization function is 2-dimensional.

The fitness value curves of a run of several algorithms on eight different functions are shown in Figures 1 and 2, where the horizontal axis represents the number of iterations and the vertical axis represents the fitness value. We can clearly see the convergence speeds of the different algorithms. The maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) obtained by the benchmark algorithms are shown in Tables 4–7, where the best results are marked in bold. Tables 4 and 5 show the performance of the algorithms on unimodal functions when Dim = 10 and 2, and Tables 6 and 7 show the performance of the algorithms on multimodal functions when Dim = 10 and 2.

(1) Unimodal functions. From the numerical testing results on the 8 unimodal functions in Table 4, we can see that DMSDL-QBSA can find the optimal solution for all unimodal functions and reaches the minimum value of 0 on f1, f2, f3, f7, and f8. Both DMSDL-QBSA and DMSDL-BSA can find the minimum value; however, DMSDL-QBSA has the best mean value and variance on each function. The main reason is that DMSDL-QBSA has better population diversity during the initialization period. In summary, DMSDL-QBSA has the best performance on unimodal functions compared to the other algorithms when Dim = 10. Obviously, DMSDL-QBSA has a relatively good convergence speed.

The evolution curves of these algorithms on four unimodal functions, f1, f5, f6, and f9, are drawn in Figure 1. It can be seen from the figure that the curve of DMSDL-QBSA descends fastest, within a number of iterations far less than 10000. For f1, f5, and f9, DMSDL-QBSA has the fastest convergence speed compared with the other algorithms, whereas the original BSA obtains the worst solution because it is trapped in a local optimum prematurely. For function f6, none of these algorithms found the value 0; however, the convergence speed of DMSDL-QBSA is significantly faster than the other algorithms in the early stage, and the solution eventually found is the best. Overall, owing to the enhanced diversity of the population, DMSDL-QBSA has a relatively excellent convergence speed when Dim = 2.

Variable setting: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and fitness of sub-swarm optimal position Plbest.

Input: population size N, dimension's size D, number of function evaluations iter_max, the time interval of each bird flight behavior FQ, the probability of foraging behavior P, constant parameters c1, c2, a1, a2, FL, iter = 0, and contraction-expansion factor β.

Output: global optimal position gbest, fitness of global optimal position Pgbest.

(1) Begin
(2)   Initialize the positions of N birds using equations (16)-(17): Xi (i = 1, 2, ..., N)
(3)   Calculate fitness Fi (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(4)   While iter < iter_max + 1 do
(5)     /* Dynamic sub-swarm */
(6)     Regroup Pi and Xi of the sub-swarms randomly
(7)     Sort the Plbest and refine the first [0.25 * N] best Plbest
(8)     Update the corresponding Pi
(9)     /* Differential learning scheme */
(10)    For i = 1 : N
(11)      Construct lbest using DMS-BSA
(12)      Differential evolution: construct V using equations (9)-(10)
(13)      Crossover: construct U using equation (11)
(14)      Selection: construct GV using equation (12)
(15)    End For
(16)    /* Bird position adjusting */
(17)    If (t mod FQ ≠ 0)
(18)      For i = 1 : N
(19)        If rand < P
(20)          Foraging behavior: update the position Xi of birds using equation (8)
(21)        Else
(22)          Vigilance behavior: update the position Xi of birds using equation (2)
(23)        End If
(24)      End For
(25)    Else
(26)      Flight behavior: the birds are divided into producers and scroungers
(27)      For i = 1 : N
(28)        If Xi is a producer
(29)          Producer: update the position Xi of birds using equation (5)
(30)        Else
(31)          Scrounger: update the position Xi of birds using equation (6)
(32)        End If
(33)      End For
(34)    End If
(35)    Evaluate f(Xi)
(36)    Update gbest and Pgbest
(37)    iter = iter + 1
(38)  End While
(39) End

Algorithm 1: DMSDL-QBSA.
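As an illustration of the control flow in Algorithm 1, the following is a minimal, simplified Python sketch. It is not the authors' implementation: the quantum position update and the detailed foraging/vigilance/flight rules of equations (2)-(17) are replaced by simple stand-ins, and names such as `p_forage` and `f_scale` are illustrative assumptions. Only the overall structure (dynamic regrouping, DE/rand/1 mutation with binomial crossover, greedy selection, and attraction toward personal and global bests) follows the pseudocode.

```python
import random

def dmsdl_qbsa(f, dim, n=20, iter_max=200, fq=10, p_forage=0.8,
               f_scale=0.5, cr=0.1, lo=-100.0, hi=100.0, seed=1):
    """Simplified sketch of Algorithm 1 (illustrative stand-ins, not the full method)."""
    rng = random.Random(seed)
    X = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    fit = [f(x) for x in X]
    P, pf = [x[:] for x in X], fit[:]              # personal bests and their fitness
    g = min(range(n), key=fit.__getitem__)
    gbest, pgbest = X[g][:], fit[g]
    for it in range(1, iter_max + 1):
        # dynamic sub-swarm: regroup the population randomly (simplified)
        idx = list(range(n))
        rng.shuffle(idx)
        X = [X[j] for j in idx]; fit = [fit[j] for j in idx]
        P = [P[j] for j in idx]; pf = [pf[j] for j in idx]
        for i in range(n):
            # differential learning: DE/rand/1 mutation + binomial crossover
            a, b, c = rng.sample(range(n), 3)
            v = [P[a][d] + f_scale * (P[b][d] - P[c][d]) for d in range(dim)]
            u = [v[d] if rng.random() < cr else X[i][d] for d in range(dim)]
            if it % fq != 0 and rng.random() < p_forage:
                # foraging stand-in: drift toward personal and global best
                u = [u[d] + rng.random() * (P[i][d] - u[d])
                          + rng.random() * (gbest[d] - u[d]) for d in range(dim)]
            u = [min(hi, max(lo, ud)) for ud in u]  # keep inside the search range
            fu = f(u)
            if fu < fit[i]:                         # greedy selection
                X[i], fit[i] = u[:], fu
            if fu < pf[i]:
                P[i], pf[i] = u[:], fu
            if fu < pgbest:
                gbest, pgbest = u[:], fu
    return gbest, pgbest
```

Minimizing the 2-D Sphere function with this sketch drives the best fitness toward 0, which matches the qualitative behavior reported for the unimodal benchmarks.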

6 Computational Intelligence and Neuroscience

According to the results of Table 5, DMSDL-QBSA achieves the best performance on these eight unimodal functions when Dim = 2. DMSDL-QBSA finds the minimum value of 0 on f1, f2, f3, f4, f5, f7, f8, and f9, and performs better on f2, f4, and f9 than DMSDL-BSA and DMSDL-PSO. DE does not perform well on these functions, but BSA performs relatively well on f1, f3, f6, f7, f8, and f9; the main reason is that BSA converges well in the early search. Clearly, DMSDL-QBSA can find the best values of the two RF parameters that need to be optimized.

(2) Multimodal functions: From the numerical testing results on 8 multimodal functions in Table 6, we can see that DMSDL-QBSA can find the optimal solution for all multimodal functions and obtains the minimum value of 0 on f11, f13, and f16. DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, and f16; BSA works best on f10; and DMSDL-PSO does not perform very well. Moreover, DMSDL-QBSA has the best mean value and variance on most functions, mainly because it has a stronger global exploration capability based on the dynamic multi-swarm method and differential evolution. In summary, DMSDL-QBSA performs relatively well on these multimodal functions compared with the other algorithms when Dim = 10 and clearly has a relatively strong global search capability.

The evolution curves of these algorithms on four multimodal functions (f12, f13, f17, and f18) when Dim = 2 are depicted in Figure 2. We can see that DMSDL-QBSA can find the optimal solution within the same number of iterations. For the f13 and f17 cases, DMSDL-QBSA continues to decline, whereas the original BSA and DE produce flat curves because of their poor global convergence ability. For functions f12 and f18, although DMSDL-QBSA is also trapped in a local optimum, it finds the minimum value compared with the other algorithms. Clearly, DMSDL-QBSA converges significantly faster than the other algorithms in the early stage, and the solution it eventually finds is the best. In general, owing to the enhanced population diversity, DMSDL-QBSA has a relatively balanced global search capability when Dim = 2.

Furthermore, from the numerical testing results on nine multimodal functions in Table 7, we can see that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f16, f17, and f18. DMSDL-QBSA obtains the minimum value of 0 on f11, f13, f16, and f17; BSA obtains the minimum value of 0 on f11, f13, and f17; and DE does not obtain the minimum value of 0

Table 1: 18 benchmark functions.

Name | Test function | Range
Sphere | f1(x) = sum_{i=1}^{D} x_i^2 | [-100, 100]^D
Schwefel P2.22 | f2(x) = sum_{i=1}^{D} |x_i| + prod_{i=1}^{D} |x_i| | [-10, 10]^D
Schwefel P1.2 | f3(x) = sum_{i=1}^{D} ( sum_{j=1}^{i} x_j )^2 | [-100, 100]^D
Generalized Rosenbrock | f4(x) = sum_{i=1}^{D-1} [ 100 (x_i^2 - x_{i+1})^2 + (x_i - 1)^2 ] | [-100, 100]^D
Step | f5(x) = sum_{i=1}^{D} ( |x_i + 0.5| )^2 | [-100, 100]^D
Noise | f6(x) = sum_{i=1}^{D} i x_i^4 + rand[0, 1) | [-1.28, 1.28]^D
SumSquares | f7(x) = sum_{i=1}^{D} i x_i^2 | [-10, 10]^D
Zakharov | f8(x) = sum_{i=1}^{D} x_i^2 + ( sum_{i=1}^{D} 0.5 i x_i )^2 + ( sum_{i=1}^{D} 0.5 i x_i )^4 | [-10, 10]^D
Schaffer | f9(x, y) = 0.5 + ( sin^2(x^2 - y^2) - 0.5 ) / [ 1 + 0.001 (x^2 - y^2) ]^2 | [-10, 10]^D
Generalized Schwefel 2.26 | f10(x) = 418.9829 D - sum_{i=1}^{D} x_i sin( sqrt(|x_i|) ) | [-500, 500]^D
Generalized Rastrigin | f11(x) = sum_{i=1}^{D} [ x_i^2 - 10 cos(2 pi x_i) + 10 ] | [-5.12, 5.12]^D
Ackley | f12(x) = 20 + e - 20 exp( -0.2 sqrt( (1/D) sum_{i=1}^{D} x_i^2 ) ) - exp( (1/D) sum_{i=1}^{D} cos(2 pi x_i) ) | [-32, 32]^D
Generalized Griewank | f13(x) = sum_{i=1}^{D} x_i^2 / 4000 - prod_{i=1}^{D} cos( x_i / sqrt(i) ) + 1 | [-600, 600]^D
Generalized Penalized 1 | f14(x) = (pi/D) { 10 sin^2(pi y_1) + sum_{i=1}^{D-1} (y_i - 1)^2 [ 1 + 10 sin^2(pi y_{i+1}) ] + (y_D - 1)^2 } + sum_{i=1}^{D} u(x_i, 10, 100, 4), where y_i = 1 + (1/4)(x_i + 1) and u(x_i, a, k, m) = k (x_i - a)^m if x_i > a; 0 if -a <= x_i <= a; k (-x_i - a)^m if x_i < -a | [-50, 50]^D
Generalized Penalized 2 | f15(x) = (1/10) { 10 sin^2(3 pi x_1) + sum_{i=1}^{D-1} (x_i - 1)^2 [ 1 + sin^2(3 pi x_{i+1}) ] + (x_D - 1)^2 [ 1 + sin^2(2 pi x_D) ] } + sum_{i=1}^{D} u(x_i, 5, 100, 4) | [-50, 50]^D
Alpine | f16(x) = sum_{i=1}^{D} | x_i sin(x_i) + 0.1 x_i | | [-10, 10]^D
Booth | f17(x) = (x_1 + 2 x_2 - 7)^2 + (2 x_1 + x_2 - 5)^2 | [-10, 10]^D
Levy | f18(x) = sin^2(pi y_1) + sum_{i=1}^{D-1} (y_i - 1)^2 [ 1 + 10 sin^2(pi y_i + 1) ] + (y_D - 1)^2 [ 1 + 10 sin^2(2 pi y_D) ] | [-10, 10]^D
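Several of these benchmark functions are simple to state in code. The sketch below implements four of them (f1 Sphere, f11 Rastrigin, f12 Ackley, and f17 Booth) directly from the formulas in Table 1; it is useful for sanity-checking an optimizer implementation, since all four have a known global minimum of 0.

```python
import math

def sphere(x):
    # f1: sum of squares, global minimum 0 at the origin
    return sum(xi ** 2 for xi in x)

def rastrigin(x):
    # f11: highly multimodal, global minimum 0 at the origin
    return sum(xi ** 2 - 10 * math.cos(2 * math.pi * xi) + 10 for xi in x)

def ackley(x):
    # f12: global minimum 0 at the origin
    d = len(x)
    s1 = sum(xi ** 2 for xi in x) / d
    s2 = sum(math.cos(2 * math.pi * xi) for xi in x) / d
    return 20 + math.e - 20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2)

def booth(x):
    # f17: two-dimensional, global minimum 0 at (1, 3)
    x1, x2 = x
    return (x1 + 2 * x2 - 7) ** 2 + (2 * x1 + x2 - 5) ** 2
```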


on any function. DMSDL-BSA obtains the minimum value of 0 on f11 and f13. In summary, DMSDL-QBSA has a superior global search capability on most multimodal functions when Dim = 2. Clearly, DMSDL-QBSA can find the best values of the two RF parameters that need to be optimized because of its strong global search capability.

In this section, it can be seen from Figures 1 and 2 and Tables 4-7 that DMSDL-QBSA obtains the best function values in most cases. This indicates that the hybrid strategies of BSA, the dynamic multi-swarm method, DE, and the quantum behavior operators lead the birds to move toward the best solutions, and that DMSDL-QBSA can search for the best two RF parameters with high accuracy and efficiency.

2.3.4. Comparison on Benchmark Functions with Popular Algorithms. To compare the timeliness and applicability of DMSDL-QBSA with several popular algorithms, such as GWO, WOA, SCA, GOA, and SSA, the 18 benchmark functions are applied; GWO, WOA, GOA, and SSA are swarm intelligence algorithms. In this experiment, the dimension's size of these functions is 10, and the number of function evaluations (FEs) is 100,000. The maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) obtained by the different algorithms are shown in Tables 8 and 9, where the best results are marked in bold.

Table 2: Summary of the CEC'14 test functions.

Type | No. | Function | Fi* = Fi(x*)
Unimodal functions | F1 | Rotated High-Conditioned Elliptic Function | 100
 | F2 | Rotated Bent Cigar Function | 200
 | F3 | Rotated Discus Function | 300
Simple multimodal functions | F4 | Shifted and Rotated Rosenbrock's Function | 400
 | F5 | Shifted and Rotated Ackley's Function | 500
 | F6 | Shifted and Rotated Weierstrass Function | 600
 | F7 | Shifted and Rotated Griewank's Function | 700
 | F8 | Shifted Rastrigin's Function | 800
 | F9 | Shifted and Rotated Rastrigin's Function | 900
 | F10 | Shifted Schwefel's Function | 1000
 | F11 | Shifted and Rotated Schwefel's Function | 1100
 | F12 | Shifted and Rotated Katsuura Function | 1200
 | F13 | Shifted and Rotated HappyCat Function | 1300
 | F14 | Shifted and Rotated HGBat Function | 1400
 | F15 | Shifted and Rotated Expanded Griewank's plus Rosenbrock's Function | 1500
 | F16 | Shifted and Rotated Expanded Scaffer's F6 Function | 1600
Hybrid functions | F17 | Hybrid Function 1 (N = 3) | 1700
 | F18 | Hybrid Function 2 (N = 3) | 1800
 | F19 | Hybrid Function 3 (N = 4) | 1900
 | F20 | Hybrid Function 4 (N = 4) | 2000
 | F21 | Hybrid Function 5 (N = 5) | 2100
 | F22 | Hybrid Function 6 (N = 5) | 2200
Composition functions | F23 | Composition Function 1 (N = 5) | 2300
 | F24 | Composition Function 2 (N = 3) | 2400
 | F25 | Composition Function 3 (N = 3) | 2500
 | F26 | Composition Function 4 (N = 5) | 2600
 | F27 | Composition Function 5 (N = 5) | 2700
 | F28 | Composition Function 6 (N = 5) | 2800
 | F29 | Composition Function 7 (N = 3) | 2900
 | F30 | Composition Function 8 (N = 3) | 3000

Search range: [-100, 100]^D.

Table 3: Parameter settings.

Algorithm | Parameter settings
BSA | c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10
DE | F = 0.5, CR = 0.1
DMSDL-PSO | c1 = c2 = 1.49445, F = 0.5, CR = 0.1, sub-swarm = 10
DMSDL-BSA | c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarm = 10
DMSDL-QBSA | c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarm = 10


From the test results in Table 8, we can see that DMSDL-QBSA has the best performance on each unimodal function. GWO finds the value 0 on f1, f2, f3, f7, and f8; WOA obtains 0 on f1, f2, and f7; and SSA works best on f1 and f7. For the multimodal function evaluations, Table 9 shows that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, f16, and f18; SSA has the best performance on f10; GWO attains the minimum on f11; and WOA and SCA obtain the optimal value on f11 and f13. Clearly, compared with these popular algorithms, DMSDL-QBSA is a competitive algorithm for solving these functions, and the swarm intelligence algorithms perform better than the other algorithms. The results of Tables 8 and 9 show that DMSDL-QBSA has the best performance on most of the tested benchmark functions.

2.3.5. Comparison on CEC2014 Test Functions with Hybrid Algorithms. To compare the comprehensive performance of the proposed DMSDL-QBSA with several hybrid algorithms, such as BSA, DE, DMSDL-PSO, and DMSDL-BSA, the 30 CEC2014 test functions are applied. In this experiment, the dimension's size (Dim) is set to 10, and the number of function evaluations (FEs) is 100,000. Experimental comparisons, including the maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var), are given in Tables 10 and 11, where the best results are marked in bold.

Based on the mean value (Mean) on the CEC2014 test functions, DMSDL-QBSA has the best performance on F2, F3, F4, F6, F7, F8, F9, F10, F11, F15, F16, F17, F21, F26, F27, F29, and F30, whereas DMSDL-BSA shows an advantage on F1, F12,

Figure 1: Fitness value curves of 5 hybrid algorithms (BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA) on (a) f1 (Sphere), (b) f5 (Step), (c) f6 (Noise), and (d) f9 (Schaffer) (Dim = 2). [Figure omitted: log-scale fitness value versus iteration (10^0 to 10^4) for each algorithm.]


F13, F14, F18, F19, F20, F24, F25, and F28. According to the results, we can observe that DMSDL-QBSA finds the minimal value on 17 CEC2014 test functions, DMSDL-BSA obtains the minimum value on F1, F12, F13, F14, F18, F19, F24, and F30, and DMSDL-PSO obtains the minimum value on F4, F7, and F23. Owing to its enhanced exploitation capability, DMSDL-QBSA is better than DMSDL-BSA and DMSDL-PSO on most functions. From these test results, it can be seen that DMSDL-QBSA performs better than BSA, DE, DMSDL-PSO, and DMSDL-BSA and obtains the optimal values. It can be concluded that DMSDL-QBSA has better global search ability and better robustness on these test suites.

3. Optimize RF Classification Model Based on Improved BSA Algorithm

3.1. RF Classification Model. RF, as proposed by Breiman et al., is an ensemble learning model based on bagging and random subspace methods. The whole modeling process includes building decision trees and decision processes. The process of constructing decision trees is mainly composed of

Figure 2: Fitness value curves of 5 hybrid algorithms (BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA) on (a) f12 (Ackley), (b) f13 (Griewank), (c) f17 (Booth), and (d) f18 (Levy) (Dim = 2). [Figure omitted: log-scale fitness value versus iteration (10^0 to 10^4) for each algorithm.]


ntree decision trees, each of which consists of nonleaf nodes and leaf nodes. A leaf node is a child node of a node branch. Suppose that the dataset has M attributes. When each leaf node of the decision tree needs to be segmented, mtry attributes are randomly selected from the M attributes as the reselected splitting variables of this node. This process can be defined as follows:

S_j = { P_i | i = 1, 2, ..., mtry },        (18)

where S_j is the splitting variable of the j-th leaf node of the decision tree and P_i is the probability that the mtry reselected attributes are selected as the splitting attribute of the node.

The nonleaf node is a parent node that classifies training data into a left or right child node. The function of the k-th decision tree is as follows:

h_k(c | x) = { 0,  f_l(x_k) <  f_l(x_k) + τ
             { 1,  f_l(x_k) >= f_l(x_k) + τ,     l = 1, 2, ..., ntree,        (19)

where c = 0 or 1, with the symbol 0 indicating that the k-th row of data is classified with a negative label and the symbol 1 indicating that the k-th row of data is classified with a positive label. Here, f_l is the training function of the l-th decision tree based on the splitting variable S, x_k is the k-th row of data in the dataset obtained by random sampling with replacement, and τ is a positive constant used as the threshold value of the training decision.

When the decision processes are trained, each row of data is input into a leaf node of each decision tree, and the average of the ntree decision tree classification results is used as the final classification result. This process can be written mathematically as follows:

c_k = l × (1/ntree) sum_{k=1}^{ntree} h_k(c | x),        (20)

where l is the number of decision trees that judged the k-th row of data as c.
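A compact sketch of the ensemble decision described above: each of the ntree trees casts a 0/1 vote, the votes are averaged, and the mean is thresholded to produce the final binary label. The tree stubs in the test are placeholders standing in for trained decision trees; the threshold value is an illustrative assumption.

```python
def rf_predict(trees, x, threshold=0.5):
    """Combine the 0/1 votes of ntree trees (cf. equation (20)).

    `trees` is a list of callables, each mapping a sample x to 0 or 1;
    the final label is 1 when at least `threshold` of the trees vote 1.
    """
    votes = [h(x) for h in trees]
    mean_vote = sum(votes) / len(trees)   # fraction of trees voting 1
    return 1 if mean_vote >= threshold else 0
```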

From the above principle, we can see that it is mainly necessary to determine the two parameters ntree and mtry in the RF modeling process. In order to verify the influence of these two parameters on the classification accuracy of the RF classification model, the Ionosphere dataset is used to test the influence of the two parameters on the performance of

Table 4: Comparison on nine unimodal functions with 5 hybrid algorithms (Dim = 10).

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
f1 | Max | 1.0317E+04 | 1.2466E+04 | 1.7232E+04 | 1.0874E+04 | 7.4436E+01
   | Min | 0 | 4.0214E+03 | 1.6580E-02 | 0 | 0
   | Mean | 6.6202E+00 | 4.6622E+03 | 6.0643E+01 | 6.3444E+00 | 3.3920E-02
   | Var | 2.0892E+02 | 8.2293E+02 | 7.3868E+02 | 1.9784E+02 | 1.2343E+00
f2 | Max | 3.3278E+02 | 1.8153E+02 | 8.0436E+01 | 4.9554E+02 | 2.9845E+01
   | Min | 4.9286E-182 | 1.5889E+01 | 1.0536E-01 | 3.0074E-182 | 0
   | Mean | 5.7340E-02 | 1.8349E+01 | 2.9779E+00 | 6.9700E-02 | 1.1220E-02
   | Var | 3.5768E+00 | 4.8296E+00 | 2.4966E+00 | 5.1243E+00 | 4.2864E-01
f3 | Max | 1.3078E+04 | 1.3949E+04 | 1.9382E+04 | 1.2899E+04 | 8.4935E+01
   | Min | 3.4873E-250 | 4.0327E+03 | 6.5860E-02 | 1.6352E-249 | 0
   | Mean | 7.6735E+00 | 4.6130E+03 | 8.6149E+01 | 7.5623E+00 | 3.3260E-02
   | Var | 2.4929E+02 | 8.4876E+02 | 8.1698E+02 | 2.4169E+02 | 1.2827E+00
f4 | Max | 2.5311E+09 | 2.3900E+09 | 4.8639E+09 | 3.7041E+09 | 3.5739E+08
   | Min | 5.2310E+00 | 2.7690E+08 | 8.4802E+00 | 5.0021E+00 | 8.9799E+00
   | Mean | 6.9192E+05 | 3.3334E+08 | 1.1841E+07 | 9.9162E+05 | 6.8518E+04
   | Var | 3.6005E+07 | 1.4428E+08 | 1.8149E+08 | 4.5261E+07 | 4.0563E+06
f5 | Max | 1.1619E+04 | 1.3773E+04 | 1.6188E+04 | 1.3194E+04 | 5.1960E+03
   | Min | 5.5043E-15 | 5.6109E+03 | 1.1894E-02 | 4.2090E-15 | 1.5157E+00
   | Mean | 5.9547E+00 | 6.3278E+03 | 5.2064E+01 | 6.5198E+00 | 3.5440E+00
   | Var | 2.0533E+02 | 9.6605E+02 | 6.2095E+02 | 2.2457E+02 | 7.8983E+01
f6 | Max | 3.2456E+00 | 7.3566E+00 | 8.9320E+00 | 2.8822E+00 | 1.4244E+00
   | Min | 1.3994E-04 | 1.2186E+00 | 2.2482E-03 | 8.2911E-05 | 1.0911E-05
   | Mean | 2.1509E-03 | 1.4021E+00 | 1.1982E-01 | 1.9200E-03 | 6.1476E-04
   | Var | 5.3780E-02 | 3.8482E-01 | 3.5554E-01 | 5.0940E-02 | 1.9880E-02
f7 | Max | 4.7215E+02 | 6.7534E+02 | 5.6753E+02 | 5.3090E+02 | 2.3468E+02
   | Min | 0 | 2.2001E+02 | 5.6300E-02 | 0 | 0
   | Mean | 2.4908E-01 | 2.3377E+02 | 9.2909E+00 | 3.0558E-01 | 9.4500E-02
   | Var | 8.5433E+00 | 3.3856E+01 | 2.2424E+01 | 1.0089E+01 | 3.5569E+00
f8 | Max | 3.2500E+02 | 2.4690E+02 | 2.7226E+02 | 2.8001E+02 | 1.7249E+02
   | Min | 1.4678E-239 | 8.3483E+01 | 5.9820E-02 | 8.9624E-239 | 0
   | Mean | 1.9072E-01 | 9.1050E+01 | 7.9923E+00 | 2.3232E-01 | 8.1580E-02
   | Var | 6.3211E+00 | 1.3811E+01 | 1.7349E+01 | 6.4400E+00 | 2.9531E+00


the RF model, as shown in Figure 3, where the horizontal axes represent ntree and mtry, respectively, and the vertical axis represents the accuracy of the RF classification model.

(1) Parameter analysis of ntree: When the number of predictor variables mtry is set to 6, the number of decision trees ntree is cyclically set from 0 to 1000 at intervals of 20, and the evolution of RF classification accuracy with ntree is shown in Figure 3(a). From the curve in Figure 3(a), we can see that the accuracy of RF gradually improves as the number of decision trees increases. However, once ntree exceeds a certain value, the improvement in RF performance becomes gentle with no obvious gain, while the running time becomes longer.

(2) Parameter analysis of mtry: When the number of decision trees ntree is set to 500, the number of predictor variables mtry is cyclically set from 1 to 32; the limit of mtry is set to 32 because the Ionosphere dataset has 32 attributes. The resulting curve of RF classification accuracy against mtry is shown in Figure 3(b). We can see that the classification performance of RF gradually improves as more splitting attributes are selected, but when mtry is greater than 9, RF overfits and its accuracy begins to decrease. The main reason is that selecting too many split attributes causes a large number of decision trees to share the same splitting attributes, which reduces the diversity of the decision trees.

Table 5: Comparison on nine unimodal functions with hybrid algorithms (Dim = 2).

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
f1 | Max | 1.8411E+02 | 3.4713E+02 | 2.9347E+02 | 1.6918E+02 | 1.6789E+00
   | Min | 0 | 5.7879E-01 | 0 | 0 | 3.2988E-238
   | Mean | 5.5890E-02 | 1.1150E+00 | 1.6095E-01 | 3.6580E-02 | 4.2590E-03
   | Var | 2.6628E+00 | 5.7218E+00 | 5.4561E+00 | 2.0867E+00 | 7.5858E-02
f2 | Max | 2.2980E+00 | 2.1935E+00 | 3.2363E+00 | 3.1492E+00 | 1.1020E+00
   | Min | 5.5923E-266 | 8.2690E-02 | 2.9096E-242 | 3.4367E-241 | 0
   | Mean | 9.2769E-04 | 9.3960E-02 | 7.4900E-03 | 1.2045E-03 | 2.1565E-04
   | Var | 3.3310E-02 | 6.9080E-02 | 4.0100E-02 | 4.2190E-02 | 1.3130E-02
f3 | Max | 1.0647E+02 | 1.3245E+02 | 3.6203E+02 | 2.3793E+02 | 1.3089E+02
   | Min | 0 | 5.9950E-01 | 0 | 0 | 0
   | Mean | 2.3040E-02 | 8.5959E-01 | 2.5020E-01 | 6.3560E-02 | 1.7170E-02
   | Var | 1.2892E+00 | 2.8747E+00 | 7.5569E+00 | 2.9203E+00 | 1.3518E+00
f4 | Max | 1.7097E+02 | 6.1375E+01 | 6.8210E+01 | 5.9141E+01 | 1.4726E+01
   | Min | 1.6325E-21 | 4.0940E-02 | 8.2726E-13 | 3.4830E-25 | 0
   | Mean | 2.2480E-02 | 9.3940E-02 | 1.5730E-02 | 1.1020E-02 | 1.4308E-02
   | Var | 1.7987E+00 | 7.4859E-01 | 7.6015E-01 | 6.4984E-01 | 2.7598E-01
f5 | Max | 1.5719E+02 | 2.2513E+02 | 3.3938E+02 | 1.8946E+02 | 8.7078E+01
   | Min | 0 | 7.0367E-01 | 0 | 0 | 0
   | Mean | 3.4380E-02 | 1.8850E+00 | 1.7082E-01 | 5.0090E-02 | 1.1880E-02
   | Var | 1.9018E+00 | 5.6163E+00 | 5.9868E+00 | 2.4994E+00 | 9.1749E-01
f6 | Max | 1.5887E-01 | 1.5649E-01 | 1.5919E-01 | 1.3461E-01 | 1.0139E-01
   | Min | 2.5412E-05 | 4.5060E-04 | 5.9140E-05 | 4.1588E-05 | 7.3524E-06
   | Mean | 2.3437E-04 | 1.3328E-03 | 6.0989E-04 | 2.3462E-04 | 9.2394E-05
   | Var | 2.4301E-03 | 3.6700E-03 | 3.5200E-03 | 1.9117E-03 | 1.4664E-03
f7 | Max | 3.5804E+00 | 2.8236E+00 | 1.7372E+00 | 2.7513E+00 | 1.9411E+00
   | Min | 0 | 7.6633E-03 | 0 | 0 | 0
   | Mean | 8.5474E-04 | 1.6590E-02 | 8.6701E-04 | 7.6781E-04 | 3.1439E-04
   | Var | 4.4630E-02 | 6.0390E-02 | 2.4090E-02 | 3.7520E-02 | 2.2333E-02
f8 | Max | 4.3247E+00 | 2.1924E+00 | 5.3555E+00 | 3.3944E+00 | 5.5079E-01
   | Min | 0 | 8.6132E-03 | 0 | 0 | 0
   | Mean | 1.1649E-03 | 1.9330E-02 | 1.7145E-03 | 7.3418E-04 | 6.9138E-05
   | Var | 5.9280E-02 | 4.7800E-02 | 7.5810E-02 | 4.1414E-02 | 5.7489E-03
f9 | Max | 2.7030E-02 | 3.5200E-02 | 1.7240E-02 | 4.0480E-02 | 2.5230E-02
   | Min | 0 | 5.0732E-03 | 0 | 0 | 0
   | Mean | 6.1701E-05 | 6.2500E-03 | 8.8947E-04 | 8.4870E-05 | 2.7362E-05
   | Var | 7.6990E-04 | 1.3062E-03 | 2.0400E-03 | 9.6160E-04 | 5.5610E-04


In summary, for the RF classification model to obtain the ideal optimal solution, the selection of the number of decision trees ntree and the number of predictor variables mtry is very important, and the classification accuracy of the RF classification model can only be optimized by jointly tuning these two parameters. It is therefore necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we optimize the RF classification model with the improved BSA proposed in Section 2.

3.2. RF Model Based on the Improved Bird Swarm Algorithm. The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) uses the improved bird swarm algorithm to optimize the RF classification model: the training dataset is introduced into the training process of the RF classification model, finally yielding the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters, ntree and mtry, as the optimization target of DMSDL-QBSA, so as to obtain a set of parameters that gives the RF classification model the best classification accuracy. The specific algorithm steps are shown in Algorithm 2.
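The two-dimensional fitness function described here can be sketched as a wrapper that maps a continuous optimizer position to integer (ntree, mtry) values and returns the misclassification rate to be minimized. This is a hedged sketch: `train_and_score` is a hypothetical callback (train an RF with the given parameters, return validation accuracy), and the parameter ranges are illustrative, taken from the sweeps in Section 3.1.

```python
def make_rf_fitness(train_and_score, ntree_range=(10, 1000), mtry_range=(1, 32)):
    """Wrap RF training as a 2-D fitness function for DMSDL-QBSA.

    `train_and_score(ntree, mtry)` is assumed to return validation accuracy;
    the fitness is 1 - accuracy, so minimizing it maximizes accuracy.
    """
    def fitness(pos):
        # clamp the continuous position into range, then round to integers
        ntree = int(round(min(max(pos[0], ntree_range[0]), ntree_range[1])))
        mtry = int(round(min(max(pos[1], mtry_range[0]), mtry_range[1])))
        return 1.0 - train_and_score(ntree, mtry)
    return fitness
```

The resulting `fitness` callable can then be handed directly to the optimizer as the objective over bird positions.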

3.3. Simulation Experiment and Analysis. In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare it with the standard RF model, the BSA-RF model, and the DMSDL-BSA-RF model on 8 two-dimensional UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by the improved BSA without quantum behavior. In our experiment, each dataset is divided into two parts: 70% of the dataset is used as the training set, and the remaining 30% as the test set. The average classification accuracies of 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model obtains the best accuracy on each UCI dataset except the magic dataset, on which the DMSDL-BSA-RF classification model is best. Compared with the standard RF model, the DMSDL-QBSA-RF classification model improves accuracy by about 10%. Finally, the

Table 6: Comparison on nine multimodal functions with hybrid algorithms (Dim = 10).

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
f10 | Max | 2.8498E+03 | 2.8226E+03 | 3.0564E+03 | 2.7739E+03 | 2.8795E+03
    | Min | 1.2553E+02 | 1.8214E+03 | 1.2922E+03 | 1.6446E+02 | 1.1634E+03
    | Mean | 2.5861E+02 | 1.9229E+03 | 1.3185E+03 | 3.1119E+02 | 1.2729E+03
    | Var | 2.4093E+02 | 1.2066E+02 | 1.2663E+02 | 2.5060E+02 | 1.2998E+02
f11 | Max | 1.2550E+02 | 1.0899E+02 | 1.1806E+02 | 1.1243E+02 | 9.1376E+01
    | Min | 0 | 6.3502E+01 | 1.0751E+01 | 0 | 0
    | Mean | 2.0417E-01 | 6.7394E+01 | 3.9864E+01 | 1.3732E-01 | 6.8060E-02
    | Var | 3.5886E+00 | 5.8621E+00 | 1.3570E+01 | 3.0325E+00 | 2.0567E+00
f12 | Max | 2.0021E+01 | 1.9910E+01 | 1.9748E+01 | 1.9254E+01 | 1.8118E+01
    | Min | 8.8818E-16 | 1.6575E+01 | 7.1700E-02 | 8.8818E-16 | 8.8818E-16
    | Mean | 3.0500E-02 | 1.7157E+01 | 3.0367E+00 | 3.8520E-02 | 1.3420E-02
    | Var | 5.8820E-01 | 5.2968E-01 | 1.6585E+00 | 6.4822E-01 | 4.2888E-01
f13 | Max | 1.0431E+02 | 1.3266E+02 | 1.5115E+02 | 1.2017E+02 | 6.1996E+01
    | Min | 0 | 4.5742E+01 | 2.1198E-01 | 0 | 0
    | Mean | 6.1050E-02 | 5.2056E+01 | 3.0613E+00 | 6.9340E-02 | 2.9700E-02
    | Var | 1.8258E+00 | 8.3141E+00 | 1.5058E+01 | 2.2452E+00 | 1.0425E+00
f14 | Max | 8.4576E+06 | 3.0442E+07 | 5.3508E+07 | 6.2509E+07 | 8.5231E+06
    | Min | 1.7658E-13 | 1.9816E+06 | 4.5685E-05 | 1.6961E-13 | 5.1104E-01
    | Mean | 1.3266E+03 | 3.1857E+06 | 6.8165E+04 | 8.8667E+03 | 1.1326E+03
    | Var | 9.7405E+04 | 1.4876E+06 | 1.4622E+06 | 6.4328E+05 | 8.7645E+04
f15 | Max | 1.8310E+08 | 1.4389E+08 | 1.8502E+08 | 1.4578E+08 | 2.5680E+07
    | Min | 1.7942E-11 | 1.0497E+07 | 2.4500E-03 | 1.1248E-11 | 9.9870E-01
    | Mean | 3.7089E+04 | 1.5974E+07 | 2.0226E+05 | 3.8852E+04 | 3.5739E+03
    | Var | 2.0633E+06 | 1.0724E+07 | 4.6539E+06 | 2.1133E+06 | 2.6488E+05
f16 | Max | 1.3876E+01 | 1.4988E+01 | 1.4849E+01 | 1.3506E+01 | 9.3280E+00
    | Min | 4.2410E-174 | 6.8743E+00 | 2.5133E-02 | 7.3524E-176 | 0
    | Mean | 1.3633E-02 | 7.2408E+00 | 2.5045E+00 | 1.3900E-02 | 5.1800E-03
    | Var | 3.3567E-01 | 7.7774E-01 | 1.0219E+00 | 3.4678E-01 | 1.7952E-01
f18 | Max | 3.6704E+01 | 3.6950E+01 | 2.8458E+01 | 2.6869E+01 | 2.4435E+01
    | Min | 2.0914E-11 | 9.7737E+00 | 3.3997E-03 | 5.9165E-12 | 7.5806E-01
    | Mean | 6.5733E-02 | 1.2351E+01 | 6.7478E-01 | 5.6520E-02 | 7.9392E-01
    | Var | 6.7543E-01 | 2.8057E+00 | 1.4666E+00 | 6.4874E-01 | 3.6928E-01


DMSDL-QBSA-RF classification model obtains the best accuracy on the appendicitis dataset, which is up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is valid on most datasets and performs well on them.

4. Oil Layer Classification Application

4.1. Design of Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1: The selected actual logging datasets are intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis and relatively independent. Each dataset is randomly divided into two parts: training and testing samples.

Step 2: In order to better understand the relationship between independent variables and dependent variables and to reduce the sample information attributes, the continuous attributes of the dataset are discretized using a greedy algorithm.
Step 3: In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize the attribute reduction. After attribute reduction, the actual logging datasets are normalized to avoid computational saturation.
Step 4: In the DMSDL-QBSA-RF layer classification model, we input the actual logging dataset after attribute reduction, train with the DMSDL-QBSA-RF layer classification algorithm, and finally obtain the DMSDL-QBSA-RF layer classification model.

Table 7: Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
f10 | Max | 2.0101E+02 | 2.8292E+02 | 2.9899E+02 | 2.4244E+02 | 2.8533E+02
    | Min | 2.5455E-05 | 3.7717E+00 | 2.3690E+01 | 2.5455E-05 | 9.4751E+01
    | Mean | 3.4882E-01 | 8.0980E+00 | 2.5222E+01 | 4.2346E-01 | 9.4816E+01
    | Var | 5.7922E+00 | 1.0853E+01 | 1.5533E+01 | 5.6138E+00 | 2.8160E+00
f11 | Max | 4.7662E+00 | 4.7784E+00 | 1.1067E+01 | 8.7792E+00 | 8.1665E+00
    | Min | 0 | 3.8174E-01 | 0 | 0 | 0
    | Mean | 2.7200E-03 | 6.0050E-01 | 3.1540E-02 | 4.2587E-03 | 3.7800E-03
    | Var | 8.6860E-02 | 3.1980E-01 | 2.4862E-01 | 1.2032E-01 | 1.3420E-01
f12 | Max | 9.6893E+00 | 8.1811E+00 | 1.1635E+01 | 9.1576E+00 | 8.4720E+00
    | Min | 8.8818E-16 | 5.1646E-01 | 8.8818E-16 | 8.8818E-16 | 8.8818E-16
    | Mean | 9.5600E-03 | 6.9734E-01 | 3.4540E-02 | 9.9600E-03 | 2.8548E-03
    | Var | 2.1936E-01 | 6.1050E-01 | 2.5816E-01 | 2.1556E-01 | 1.1804E-01
f13 | Max | 4.4609E+00 | 4.9215E+00 | 4.1160E+00 | 1.9020E+00 | 1.6875E+00
    | Min | 0 | 1.3718E-01 | 0 | 0 | 0
    | Mean | 1.9200E-03 | 1.7032E-01 | 1.8240E-02 | 1.4800E-03 | 5.7618E-04
    | Var | 6.6900E-02 | 1.3032E-01 | 1.8202E-01 | 3.3900E-02 | 2.2360E-02
f14 | Max | 1.0045E+01 | 1.9266E+03 | 1.9212E+01 | 5.7939E+02 | 8.2650E+00
    | Min | 2.3558E-31 | 1.3188E-01 | 2.3558E-31 | 2.3558E-31 | 2.3558E-31
    | Mean | 4.1600E-03 | 3.5402E-01 | 1.0840E-02 | 6.1420E-02 | 1.3924E-03
    | Var | 1.7174E-01 | 1.9427E+01 | 3.9528E-01 | 5.8445E+00 | 8.7160E-02
f15 | Max | 6.5797E+04 | 4.4041E+03 | 1.4412E+05 | 8.6107E+03 | 2.6372E+00
    | Min | 1.3498E-31 | 9.1580E-02 | 1.3498E-31 | 1.3498E-31 | 7.7800E-03
    | Mean | 7.1736E+00 | 8.9370E-01 | 1.7440E+01 | 9.0066E-01 | 8.2551E-03
    | Var | 6.7678E+02 | 5.4800E+01 | 1.4742E+03 | 8.7683E+01 | 2.8820E-02
f16 | Max | 6.2468E-01 | 6.4488E-01 | 5.1564E-01 | 8.4452E-01 | 3.9560E-01
    | Min | 6.9981E-08 | 2.5000E-03 | 1.5518E-240 | 2.7655E-07 | 0
    | Mean | 2.7062E-04 | 6.9400E-03 | 6.8555E-04 | 2.0497E-04 | 6.1996E-05
    | Var | 1.0380E-02 | 1.7520E-02 | 8.4600E-03 | 1.0140E-02 | 4.4000E-03
f17 | Max | 5.1946E+00 | 3.6014E+00 | 2.3463E+00 | 6.9106E+00 | 1.2521E+00
    | Min | 2.6445E-11 | 2.6739E-02 | 0 | 1.0855E-10 | 0
    | Mean | 1.9343E-03 | 5.1800E-02 | 1.2245E-03 | 2.8193E-03 | 1.5138E-04
    | Var | 7.3540E-02 | 1.2590E-01 | 4.1620E-02 | 1.1506E-01 | 1.2699E-02
f18 | Max | 5.0214E-01 | 3.4034E-01 | 4.1400E-01 | 3.7422E-01 | 4.0295E-01
    | Min | 1.4998E-32 | 1.9167E-03 | 1.4998E-32 | 1.4998E-32 | 1.4998E-32
    | Mean | 1.0967E-04 | 4.1000E-03 | 1.8998E-04 | 1.4147E-04 | 6.0718E-05
    | Var | 6.1800E-03 | 1.0500E-02 | 6.5200E-03 | 5.7200E-03 | 4.4014E-03


Step 5: The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and the classification results are output.
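The normalization mentioned in Step 3 is typically a min-max scaling of each logging attribute to [0, 1]; the exact scaling used by the authors is not specified, so the following is an assumed sketch of that preprocessing step.

```python
def min_max_normalize(column):
    """Scale one logging attribute column to [0, 1] (Step 3 preprocessing).

    A constant column is mapped to all zeros to avoid division by zero.
    """
    lo, hi = min(column), max(column)
    if hi == lo:
        return [0.0] * len(column)
    return [(v - lo) / (hi - lo) for v in column]
```

Applied per attribute after attribute reduction, this keeps all inputs on a comparable scale and avoids the computational saturation the text mentions.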

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA was simulated and analyzed on benchmark functions, and in Section 3.3 the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA was tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted, recorded as W1, W2, and W3. W1 is a gas well in Xi'an (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depths and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before the training of the DMSDL-QBSA-RF classification model on the training dataset, as shown in Table 14. These attributes are then normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, the SVM model, the BSA-RF model, and the DMSDL-BSA-RF model. Here, the RF classification model is applied to the field of logging for the first time. In order to evaluate the performance of the recognition models, we select the following performance indicators:

RMSE = sqrt( (1/N) sum_{i=1}^{N} (f_i - y_i)^2 ),
MAE = (1/N) sum_{i=1}^{N} |f_i - y_i|.        (21)

Table 8 Comparison on 8 unimodal functions with popular algorithms (Dim 10 FEs 100000)

Function Term GWO WOA SCA GOA SSA DMSDL-QBSA

f1

Max 13396E+ 04 14767E+ 04 13310E+ 04 20099E+ 01 48745E+ 03 98570Eminus 01Min 0 0 42905Eminus 293 86468Eminus 17 0 0Mean 35990E+ 00 47621E+ 00 14014E+ 02 70100Eminus 02 45864E+ 00 10483Eminus 04Var 17645E+ 02 20419E+ 02 85054E+ 02 44200Eminus 01 14148E+ 02 98725Eminus 03

f2

Max 36021E+ 02 25789E+ 03 65027E+ 01 93479E+ 01 34359E+ 01 32313Eminus 01Min 0 0 98354Eminus 192 28954Eminus 03 24642Eminus 181 0Mean 50667Eminus 02 29480Eminus 01 40760Eminus 01 31406E+ 00 17000Eminus 02 11278Eminus 04Var 37270E+ 00 26091E+ 01 22746E+ 00 39264E+ 00 54370Eminus 01 48000Eminus 03

f3

Max 18041E+ 04 16789E+ 04 24921E+ 04 65697E+ 03 11382E+ 04 40855E+ 01Min 0 10581Eminus 18 76116Eminus 133 28796Eminus 01 32956Eminus 253 15918Eminus 264Mean 74511E+ 00 28838E+ 02 40693E+ 02 48472E+ 02 92062E+ 00 48381Eminus 03Var 26124E+ 02 14642E+ 03 15913E+ 03 71786E+ 02 29107E+ 02 41383Eminus 01

f4

Max 2.1812E+09 5.4706E+09 8.4019E+09 1.1942E+09 4.9386E+08 7.5188E+01
Min 4.9125E+00 3.5695E+00 5.9559E+00 2.2249E+02 4.9806E+00 2.9279E-13
Mean 4.9592E+05 2.4802E+06 4.4489E+07 5.0021E+06 3.3374E+05 2.4033E-02
Var 2.9484E+07 1.1616E+08 4.5682E+08 3.9698E+07 1.1952E+07 9.7253E-01

f5

Max 1.8222E+04 1.5374E+04 1.5874E+04 1.2132E+03 1.6361E+04 1.8007E+00
Min 1.1334E-08 8.3228E-09 2.3971E-01 2.7566E-10 2.6159E-16 1.0272E-33
Mean 5.1332E+00 5.9967E+00 1.2620E+02 5.8321E+01 8.8985E+00 2.3963E-04
Var 2.3617E+02 2.3285E+02 8.8155E+02 1.0872E+02 2.9986E+02 1.8500E-02

f6

Max 7.4088E+00 8.3047E+00 8.8101E+00 6.8900E-01 4.4298E+00 2.5787E-01
Min 1.8112E-05 3.9349E-05 4.8350E-05 8.9528E-02 4.0807E-05 1.0734E-04
Mean 1.8333E-03 3.2667E-03 4.8400E-02 9.3300E-02 2.1000E-03 5.2825E-04
Var 9.4267E-02 1.0077E-01 2.9410E-01 3.5900E-02 6.5500E-02 4.6000E-03

f7

Max 7.3626E+02 6.8488E+02 8.0796E+02 3.9241E+02 8.2036E+02 1.4770E+01
Min 0 0 1.9441E-292 7.9956E-07 0 0
Mean 2.0490E-01 2.8060E-01 4.9889E+00 1.6572E+01 2.7290E-01 1.8081E-03
Var 9.5155E+00 1.0152E+01 3.5531E+01 2.3058E+01 1.0581E+01 1.6033E-01

f8

Max 1.2749E+03 5.9740E+02 3.2527E+02 2.3425E+02 2.0300E+02 2.0423E+01
Min 0 4.3596E-35 1.5241E-160 3.6588E-05 1.0239E-244 0
Mean 3.1317E-01 1.0582E+01 1.0457E+01 1.2497E+01 2.1870E-01 2.6947E-03
Var 1.4416E+01 4.3485E+01 3.5021E+01 2.5766E+01 6.2362E+00 2.1290E-01

Computational Intelligence and Neuroscience 15

where yi and fi are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show the actual forecasting errors. Table 15 records the performance indicator data of each classification model, and the best results are marked in bold. The smaller the RMSE and MAE, the better the classification model performs.
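As a quick check of equation (21), the two metrics can be sketched directly; the function names below are mine, not the paper's.

```python
import math

# RMSE and MAE per equation (21): f are expected outputs, y are model outputs.
def rmse(f, y):
    return math.sqrt(sum((fi - yi) ** 2 for fi, yi in zip(f, y)) / len(y))

def mae(f, y):
    return sum(abs(fi - yi) for fi, yi in zip(f, y)) / len(y)
```

For 0/1 layer labels every error term is 0 or 1, so MAE equals the misclassification rate and RMSE is its square root, which matches the accuracy column in Table 15 (e.g., MAE 0.1106 pairs with accuracy 88.94%).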

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model achieves the best recognition accuracy, with all accuracies above 90%. The recognition accuracy of the proposed classification model for W3 reaches 99.73%, and it also shows superior performance on the other performance indicators and wells. Secondly, DMSDL-QBSA can improve the performance of RF: the parameters found by DMSDL-QBSA and used in the RF classification model improve the classification accuracy while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means the depth has no oil or gas and 1 means the depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results. It can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is thus suitable for petroleum logging applications, which greatly reduces the difficulty of oil exploration and has a good application prospect.

Table 9: Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function Term GWO WOA SCA GOA SSA DMSDL-QBSA

f10

Max 2.9544E+03 2.9903E+03 2.7629E+03 2.9445E+03 3.2180E+03 3.0032E+03
Min 1.3037E+03 8.6892E-04 1.5713E+03 1.3438E+03 1.2839E-04 1.2922E+03
Mean 1.6053E+03 1.4339E+01 1.7860E+03 1.8562E+03 2.5055E+02 1.2960E+03
Var 2.6594E+02 1.2243E+02 1.6564E+02 5.1605E+02 3.3099E+02 5.8691E+01

f11

Max 1.3792E+02 1.2293E+02 1.2313E+02 1.1249E+02 3.2180E+03 1.8455E+01
Min 0 0 0 1.2437E+01 1.2839E-04 0
Mean 4.4220E-01 1.1252E+00 9.9316E+00 2.8378E+01 2.5055E+02 3.2676E-03
Var 4.3784E+00 6.5162E+00 1.9180E+01 1.6240E+01 3.3099E+02 1.9867E-01

f12

Max 2.0257E+01 2.0043E+01 1.9440E+01 1.6623E+01 3.2180E+03 1.9113E+00
Min 4.4409E-15 3.2567E-15 3.2567E-15 2.3168E+00 1.2839E-04 8.8818E-16
Mean 1.7200E-02 4.2200E-02 8.8870E-01 5.5339E+00 2.5055E+02 3.1275E-04
Var 4.7080E-01 6.4937E-01 3.0887E+00 2.8866E+00 3.3099E+02 2.2433E-02

f13

Max 1.5246E+02 1.6106E+02 1.1187E+02 6.1505E+01 3.2180E+03 3.1560E-01
Min 3.3000E-03 0 0 2.4147E-01 1.2839E-04 0
Mean 4.6733E-02 9.3867E-02 1.2094E+00 3.7540E+00 2.5055E+02 3.3660E-05
Var 1.9297E+00 2.6570E+00 6.8476E+00 4.1936E+00 3.3099E+02 3.1721E-03

f14

Max 9.5993E+07 9.9026E+07 5.9355E+07 6.1674E+06 3.2180E+03 6.4903E-01
Min 3.8394E-09 1.1749E-08 9.6787E-03 1.8099E-04 1.2839E-04 4.7116E-32
Mean 1.2033E+04 3.5007E+04 4.8303E+05 1.0465E+04 2.5055E+02 8.9321E-05
Var 9.8272E+05 1.5889E+06 4.0068E+06 1.9887E+05 3.3099E+02 6.8667E-03

f15

Max 2.2691E+08 2.4717E+08 1.1346E+08 2.8101E+07 3.2180E+03 1.6407E-01
Min 3.2467E-02 4.5345E-08 1.1922E-01 3.5465E-05 1.2839E-04 1.3498E-32
Mean 2.9011E+04 4.3873E+04 6.5529E+05 7.2504E+04 2.5055E+02 6.7357E-05
Var 2.3526E+06 2.7453E+06 7.1864E+06 1.2814E+06 3.3099E+02 2.4333E-03

f16

Max 1.7692E+01 1.7142E+01 1.6087E+01 8.7570E+00 3.2180E+03 1.0959E+00
Min 2.6210E-07 0.0000E+00 6.2663E-155 1.0497E-02 1.2839E-04 0
Mean 1.0133E-02 3.9073E-01 5.9003E-01 2.4770E+00 2.5055E+02 1.5200E-04
Var 3.0110E-01 9.6267E-01 1.4701E+00 1.9985E+00 3.3099E+02 1.1633E-02

f18

Max 4.4776E+01 4.3588E+01 3.9095E+01 1.7041E+01 3.2180E+03 6.5613E-01
Min 1.9360E-01 9.4058E-08 2.2666E-01 3.9111E+00 1.2839E-04 1.4998E-32
Mean 2.1563E-01 4.7800E-02 9.8357E-01 5.4021E+00 2.5055E+02 9.4518E-05
Var 5.8130E-01 1.0434E+00 3.0643E+00 1.6674E+00 3.3099E+02 7.0251E-03


Table 10: Comparison of numerical testing results on CEC2014 test sets (F1-F15, Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F1

Max 2.7411E+08 3.6316E+09 6.8993E+08 3.9664E+08 9.6209E+08
Min 1.4794E+07 3.1738E+09 1.3205E+06 1.6929E+05 1.7687E+05
Mean 3.1107E+07 3.2020E+09 6.7081E+06 1.3567E+06 1.9320E+06
Var 1.2900E+07 4.0848E+07 2.6376E+07 1.0236E+07 1.3093E+07

F2

Max 8.7763E+09 1.3597E+10 1.3515E+10 1.6907E+10 1.7326E+10
Min 1.7455E+09 1.1878E+10 3.6982E+04 2.1268E+05 1.4074E+05
Mean 1.9206E+09 1.1951E+10 1.7449E+08 5.9354E+07 4.8412E+07
Var 1.9900E+08 1.5615E+08 9.2365E+08 5.1642E+08 4.2984E+08

F3

Max 4.8974E+05 8.5863E+04 2.6323E+06 2.4742E+06 5.3828E+06
Min 1.1067E+04 1.4616E+04 1.8878E+03 1.2967E+03 6.3986E+02
Mean 1.3286E+04 1.4976E+04 1.0451E+04 3.5184E+03 3.0848E+03
Var 6.5283E+03 1.9988E+03 4.5736E+04 4.1580E+04 8.0572E+04

F4

Max 3.5252E+03 9.7330E+03 4.5679E+03 4.9730E+03 4.3338E+03
Min 5.0446E+02 8.4482E+03 4.0222E+02 4.0843E+02 4.1267E+02
Mean 5.8061E+02 8.5355E+03 4.4199E+02 4.3908E+02 4.3621E+02
Var 1.4590E+02 1.3305E+02 1.6766E+02 1.2212E+02 8.5548E+01

F5

Max 5.2111E+02 5.2106E+02 5.2075E+02 5.2098E+02 5.2110E+02
Min 5.2001E+02 5.2038E+02 5.2027E+02 5.2006E+02 5.2007E+02
Mean 5.2003E+02 5.2041E+02 5.2033E+02 5.2014E+02 5.2014E+02
Var 7.8380E-02 5.7620E-02 5.6433E-02 1.0577E-01 9.9700E-02

F6

Max 6.1243E+02 6.1299E+02 6.1374E+02 6.1424E+02 6.1569E+02
Min 6.0881E+02 6.1157E+02 6.0514E+02 6.0288E+02 6.0257E+02
Mean 6.0904E+02 6.1164E+02 6.0604E+02 6.0401E+02 6.0358E+02
Var 2.5608E-01 1.2632E-01 1.0434E+00 1.1717E+00 1.3117E+00

F7

Max 8.7895E+02 1.0459E+03 9.1355E+02 1.0029E+03 9.3907E+02
Min 7.4203E+02 1.0238E+03 7.0013E+02 7.0081E+02 7.0069E+02
Mean 7.4322E+02 1.0253E+03 7.1184E+02 7.0332E+02 7.0290E+02
Var 4.3160E+00 2.4258E+00 2.8075E+01 1.5135E+01 1.3815E+01

F8

Max 8.9972E+02 9.1783E+02 9.6259E+02 9.2720E+02 9.3391E+02
Min 8.4904E+02 8.8172E+02 8.3615E+02 8.1042E+02 8.0877E+02
Mean 8.5087E+02 8.8406E+02 8.5213E+02 8.1888E+02 8.1676E+02
Var 2.1797E+00 4.4494E+00 8.6249E+00 9.2595E+00 9.7968E+00

F9

Max 1.0082E+03 9.9538E+02 1.0239E+03 1.0146E+03 1.0366E+03
Min 9.3598E+02 9.6805E+02 9.2725E+02 9.2062E+02 9.1537E+02
Mean 9.3902E+02 9.7034E+02 9.4290E+02 9.2754E+02 9.2335E+02
Var 3.4172E+00 3.2502E+00 8.1321E+00 9.0492E+00 9.3860E+00

F10

Max 3.1087E+03 3.5943E+03 3.9105E+03 3.9116E+03 3.4795E+03
Min 2.2958E+03 2.8792E+03 1.5807E+03 1.4802E+03 1.4316E+03
Mean 1.8668E+03 2.9172E+03 1.7627E+03 1.6336E+03 1.6111E+03
Var 3.2703E+01 6.1787E+01 2.2359E+02 1.9535E+02 2.2195E+02

F11

Max 3.6641E+03 3.4628E+03 4.0593E+03 3.5263E+03 3.7357E+03
Min 2.4726E+03 2.8210E+03 1.7855E+03 1.4012E+03 1.2811E+03
Mean 2.5553E+03 2.8614E+03 1.8532E+03 1.5790E+03 1.4869E+03
Var 8.4488E+01 5.6351E+01 1.3201E+02 2.1085E+02 2.6682E+02

F12

Max 1.2044E+03 1.2051E+03 1.2053E+03 1.2055E+03 1.2044E+03
Min 1.2006E+03 1.2014E+03 1.2004E+03 1.2003E+03 1.2017E+03
Mean 1.2007E+03 1.2017E+03 1.2007E+03 1.2003E+03 1.2018E+03
Var 1.4482E-01 3.5583E-01 2.5873E-01 2.4643E-01 2.1603E-01

F13

Max 1.3049E+03 1.3072E+03 1.3073E+03 1.3056E+03 1.3061E+03
Min 1.3005E+03 1.3067E+03 1.3009E+03 1.3003E+03 1.3004E+03
Mean 1.3006E+03 1.3068E+03 1.3011E+03 1.3005E+03 1.3006E+03
Var 3.2018E-01 5.0767E-02 4.2173E-01 3.6570E-01 2.9913E-01

F14

Max 1.4395E+03 1.4565E+03 1.4775E+03 1.4749E+03 1.4493E+03
Min 1.4067E+03 1.4522E+03 1.4009E+03 1.4003E+03 1.4005E+03
Mean 1.4079E+03 1.4529E+03 1.4024E+03 1.4008E+03 1.4009E+03
Var 2.1699E+00 6.3013E-01 5.3198E+00 3.2578E+00 2.8527E+00

F15

Max 5.4068E+04 3.3586E+04 4.8370E+05 4.0007E+05 1.9050E+05
Min 1.9611E+03 2.1347E+04 1.5029E+03 1.5027E+03 1.5026E+03
Mean 1.9933E+03 2.2417E+04 2.7920E+03 1.5860E+03 1.5816E+03
Var 6.1622E+02 1.2832E+03 1.7802E+04 4.4091E+03 2.7233E+03


Table 11: Comparison of numerical testing results on CEC2014 test sets (F16-F30, Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F16

Max 1.6044E+03 1.6044E+03 1.6046E+03 1.6044E+03 1.6046E+03
Min 1.6034E+03 1.6041E+03 1.6028E+03 1.6021E+03 1.6019E+03
Mean 1.6034E+03 1.6041E+03 1.6032E+03 1.6024E+03 1.6024E+03
Var 3.3300E-02 3.5900E-02 2.5510E-01 3.7540E-01 3.6883E-01

F17

Max 1.3711E+07 1.6481E+06 3.3071E+07 4.6525E+07 8.4770E+06
Min 1.2526E+05 4.7216E+05 2.7261E+03 2.0177E+03 2.0097E+03
Mean 1.6342E+05 4.8499E+05 8.7769E+04 3.1084E+04 2.1095E+04
Var 2.0194E+05 2.9949E+04 1.2284E+06 8.7638E+05 2.0329E+05

F18

Max 7.7173E+07 3.5371E+07 6.1684E+08 1.4216E+08 6.6050E+08
Min 4.1934E+03 2.6103E+06 2.2743E+03 1.8192E+03 1.8288E+03
Mean 1.6509E+04 3.4523E+06 1.2227E+06 1.0781E+05 1.9475E+05
Var 7.9108E+05 1.4888E+06 2.1334E+07 3.6818E+06 1.0139E+07

F19

Max 1.9851E+03 2.5657E+03 2.0875E+03 1.9872E+03 1.9555E+03
Min 1.9292E+03 2.4816E+03 1.9027E+03 1.9023E+03 1.9028E+03
Mean 1.9299E+03 2.4834E+03 1.9044E+03 1.9032E+03 1.9036E+03
Var 1.0820E+00 3.8009E+00 1.1111E+01 3.3514E+00 1.4209E+00

F20

Max 8.3021E+06 2.0160E+07 1.0350E+07 6.1162E+07 1.2708E+08
Min 5.6288E+03 1.7838E+06 2.1570E+03 2.0408E+03 2.0241E+03
Mean 1.2260E+04 1.8138E+06 5.6957E+03 1.1337E+04 4.5834E+04
Var 1.0918E+05 5.9134E+05 1.3819E+05 6.4167E+05 2.1988E+06

F21

Max 2.4495E+06 1.7278E+09 2.0322E+06 1.3473E+07 2.6897E+07
Min 4.9016E+03 1.4049E+09 3.3699E+03 2.1842E+03 2.2314E+03
Mean 6.6613E+03 1.4153E+09 9.9472E+03 1.3972E+04 7.4587E+03
Var 4.1702E+04 2.2557E+07 3.3942E+04 2.5098E+05 2.8735E+05

F22

Max 2.8304E+03 4.9894E+03 3.1817E+03 3.1865E+03 3.2211E+03
Min 2.5070E+03 4.1011E+03 2.3492E+03 2.2442E+03 2.2314E+03
Mean 2.5081E+03 4.1175E+03 2.3694E+03 2.2962E+03 2.2687E+03
Var 5.4064E+00 3.8524E+01 4.3029E+01 4.8006E+01 5.1234E+01

F23

Max 2.8890E+03 2.6834E+03 2.8758E+03 3.0065E+03 2.9923E+03
Min 2.5000E+03 2.6031E+03 2.4870E+03 2.5000E+03 2.5000E+03
Mean 2.5004E+03 2.6088E+03 2.5326E+03 2.5010E+03 2.5015E+03
Var 9.9486E+00 8.6432E+00 5.7045E+01 1.4213E+01 1.7156E+01

F24

Max 2.6293E+03 2.6074E+03 2.6565E+03 2.6491E+03 2.6369E+03
Min 2.5816E+03 2.6049E+03 2.5557E+03 2.5246E+03 2.5251E+03
Mean 2.5829E+03 2.6052E+03 2.5671E+03 2.5337E+03 2.5338E+03
Var 1.3640E+00 3.3490E-01 8.9434E+00 1.3050E+01 1.1715E+01

F25

Max 2.7133E+03 2.7014E+03 2.7445E+03 2.7122E+03 2.7327E+03
Min 2.6996E+03 2.7003E+03 2.6784E+03 2.6789E+03 2.6635E+03
Mean 2.6996E+03 2.7004E+03 2.6831E+03 2.6903E+03 2.6894E+03
Var 2.3283E-01 9.9333E-02 4.9609E+00 7.1175E+00 1.2571E+01

F26

Max 2.7056E+03 2.8003E+03 2.7058E+03 2.7447E+03 2.7116E+03
Min 2.7008E+03 2.8000E+03 2.7003E+03 2.7002E+03 2.7002E+03
Mean 2.7010E+03 2.8000E+03 2.7005E+03 2.7003E+03 2.7003E+03
Var 3.1316E-01 1.6500E-02 3.9700E-01 1.1168E+00 3.5003E-01

F27

Max 3.4052E+03 5.7614E+03 3.3229E+03 3.4188E+03 3.4144E+03
Min 2.9000E+03 3.9113E+03 2.9698E+03 2.8347E+03 2.7054E+03
Mean 2.9038E+03 4.0351E+03 2.9816E+03 2.8668E+03 2.7219E+03
Var 3.2449E+01 2.0673E+02 3.4400E+01 6.2696E+01 6.1325E+01

F28

Max 4.4333E+03 5.4138E+03 4.5480E+03 4.3490E+03 4.8154E+03
Min 3.0000E+03 4.0092E+03 3.4908E+03 3.0000E+03 3.0000E+03
Mean 3.0079E+03 4.0606E+03 3.6004E+03 3.0063E+03 3.0065E+03
Var 6.8101E+01 9.2507E+01 8.5705E+01 4.2080E+01 5.3483E+01

F29

Max 2.5038E+07 1.6181E+08 5.9096E+07 7.0928E+07 6.4392E+07
Min 3.2066E+03 1.0014E+08 3.1755E+03 3.1005E+03 3.3287E+03
Mean 5.7057E+04 1.0476E+08 8.5591E+04 6.8388E+04 2.8663E+04
Var 4.9839E+05 6.7995E+06 1.7272E+06 1.5808E+06 8.6740E+05



Figure 3: The effect of the two parameters on the performance of RF models: (a) the effect of ntree; (b) the effect of mtry.
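Figure 3 illustrates that ntree and mtry jointly define the hyperparameter search space that DMSDL-QBSA explores. The sketch below shows the idea with a plain grid search over a stand-in accuracy surface; the `oob_accuracy` function and its peak location are hypothetical, not the paper's measured data.

```python
# Hedged sketch: the (ntree, mtry) search space behind Figure 3.
# The accuracy surface is a made-up smooth stand-in peaking at ntree=500, mtry=7.
def oob_accuracy(ntree, mtry):
    return 0.95 - 1e-7 * (ntree - 500) ** 2 - 1e-3 * (mtry - 7) ** 2

# Exhaustive grid over the ranges shown on the Figure 3 axes.
candidates = [(n, m) for n in range(100, 1001, 100) for m in range(1, 36)]
best = max(candidates, key=lambda nm: oob_accuracy(*nm))
```

DMSDL-QBSA replaces this exhaustive sweep with a guided search, which matters once evaluating one (ntree, mtry) pair means training a full forest.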

Variable setting: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF
(1) Begin
(2) /* Build classification model based on DMSDL-QBSA-RF */
(3) Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4) Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5) While iter < itermax + 1 do
(6)   For i = 1 : ntree
(7)     Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)     Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)     Recursively generate each decision tree without pruning operations
(10)  End For
(11)  Update classification accuracy of RF: evaluate f(Xi)
(12)  Update gbest and Pgbest
(13)  [nbest, mbest] = gbest
(14)  iter = iter + 1
(15) End While
(16) /* Classify using RF model */
(17) For i = 1 : nbest
(18)   Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)   Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)   Recursively generate each decision tree without pruning operations
(21) End
(22) Return DMSDL-QBSA-RF classification model
(23) Classify the test dataset using equation (20)
(24) Calculate OOB_error
(25) End

ALGORITHM 2 DMSDL-QBSA-RF classification model
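The randomized core of steps (7)-(8) of Algorithm 2 can be sketched in a few lines; the helper names, the toy training rows, and the attribute list are illustrative only.

```python
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def bootstrap_sample(train, n):
    # Step (7): draw n rows with replacement from the training set.
    return [random.choice(train) for _ in range(n)]

def candidate_attributes(all_attrs, mtry):
    # Step (8): pick mtry distinct candidate attributes at a node.
    return random.sample(all_attrs, mtry)

train = ["row%d" % i for i in range(10)]          # hypothetical training rows
sample = bootstrap_sample(train, len(train))
attrs = candidate_attributes(
    ["GR", "DT", "SP", "LLD", "LLS", "DEN", "NPHI"], 3)
```

In Algorithm 2, mtry is exactly the quantity DMSDL-QBSA tunes (together with ntree), so each fitness evaluation repeats this sampling for every tree in the forest.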

Table 11 Continued

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F30

Max 5.2623E+05 2.8922E+07 1.1938E+06 1.2245E+06 1.1393E+06
Min 5.9886E+03 1.9017E+07 3.7874E+03 3.6290E+03 3.6416E+03
Mean 7.4148E+03 2.0002E+07 5.5468E+03 4.3605E+03 4.1746E+03
Var 1.1434E+04 1.1968E+06 3.2255E+04 2.2987E+04 1.8202E+04


[Figure 4 blocks: to-be-identified information; data selection and preprocessing (remove redundant attributes and pretreatment); attribute discretization; attribute reduction and data normalization; build model based on DMSDL-QBSA-RF; RF classification; output.]

Figure 4 Block diagram of the oil layer classification system based on DMSDL-QBSA-RF

Table 13: Distribution of the logging data.

W1: training 3027~3058 m, 250 samples (203 oil/gas layers, 47 dry layers); test 3250~3450 m, 1600 samples (237 oil/gas layers, 1363 dry layers)
W2: training 2642~2648 m, 30 samples (10 oil/gas layers, 20 dry layers); test 2940~2978 m, 191 samples (99 oil/gas layers, 92 dry layers)
W3: training 1040~1059 m, 160 samples (47 oil/gas layers, 113 dry layers); test 1120~1290 m, 1114 samples (96 oil/gas layers, 1018 dry layers)

Table 12: Comparison of numerical testing results on eight UCI datasets (accuracy, %).

Dataset: RF, BSA-RF, DMSDL-BSA-RF, DMSDL-QBSA-RF
Blood: 75.40, 80.80, 79.46, 81.25
Heart-statlog: 82.10, 86.42, 86.42, 86.42
Sonar: 78.39, 85.48, 85.48, 88.71
Appendicitis: 83.23, 87.10, 90.32, 93.55
Cleve: 79.55, 87.50, 88.64, 89.77
Magic: 88.95, 89.85, 90.25, 89.32
Mammographic: 74.73, 77.68, 76.79, 79.46
Australian: 85.83, 88.84, 88.84, 90.78

Table 14: Attribute reduction results of the logging dataset.

W1 actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
W1 reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2 actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
W2 reduction attributes: NEUTRO, PERM, POR, RESI, SW
W3 actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
W3 reduction attributes: AC, GR, RI, RXO, SP

[Figure 5 panels: (a) NPHI, SP, DT, and LLD and (b) DEN, GR, and LLS for W1 (3250~3450 m); (c) NEUTRO and PERM and (d) RESI, SW, and POR for W2 (2940~2980 m); (e) GR, SP, and RI and (f) RXO and AC for W3 (1150~1300 m). Vertical axis: normalized value, 0 to 1.]

Figure 5: The normalized curves of attributes: (a) and (b) attribute normalization of W1; (c) and (d) attribute normalization of W2; (e) and (f) attribute normalization of W3.

Table 15: Performance of various well data.

W1:
RF: RMSE 0.3326, MAE 0.1106, accuracy 88.94%, running time 1.5167 s
SVM: RMSE 0.2681, MAE 0.0719, accuracy 92.81%, running time 4.5861 s
BSA-RF: RMSE 0.3269, MAE 0.1069, accuracy 89.31%, running time 1.8064 s
DMSDL-BSA-RF: RMSE 0.2806, MAE 0.0788, accuracy 92.13%, running time 2.3728 s
DMSDL-QBSA-RF: RMSE 0.2449, MAE 0.0600, accuracy 94.00%, running time 1.4663 s

W2:
RF: RMSE 0.4219, MAE 0.1780, accuracy 82.20%, running time 3.1579 s
SVM: RMSE 0.2983, MAE 0.0890, accuracy 91.10%, running time 4.2604 s
BSA-RF: RMSE 0.3963, MAE 0.1571, accuracy 84.29%, running time 1.2979 s
DMSDL-BSA-RF: RMSE 0.2506, MAE 0.062827, accuracy 93.72%, running time 1.6124 s
DMSDL-QBSA-RF: RMSE 0.2399, MAE 0.0576, accuracy 94.24%, running time 1.2287 s

W3:
RF: RMSE 0.4028, MAE 0.1622, accuracy 83.78%, running time 2.4971 s
SVM: RMSE 0.2507, MAE 0.0628, accuracy 93.72%, running time 2.1027 s
BSA-RF: RMSE 0.3631, MAE 0.1318, accuracy 86.81%, running time 1.3791 s
DMSDL-BSA-RF: RMSE 0.2341, MAE 0.0548, accuracy 94.52%, running time 0.3125 s
DMSDL-QBSA-RF: RMSE 0.0519, MAE 0.0027, accuracy 99.73%, running time 0.9513 s


Computational Intelligence and Neuroscience 21

5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of the three strategies on the performance of DMSDL-QBSA revealed that the hybrid method markedly improves the original BSA. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, where it again shows excellent performance. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than the original RF model, while running faster than the other four classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Due to the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA; therefore, improving recognition efficiency is a worthwhile direction. In future research, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request by contacting the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), the Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), the Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), the Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and the Project of Industry-University Cooperative Education of the Ministry of Education of China (no. 201801335014).


Figure 6: Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.

[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.

[4] X. Li, A new intelligent optimization method: artificial fish school algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.

[5] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.

[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.

[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673-687, 2016.

[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University - Computer and Information Sciences, vol. 32, pp. 514-521, 2020.

[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," in Neural Computing and Applications, Springer, New York, NY, USA, 2019.

[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.

[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.

[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197-207, 2014.

[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.

[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17-28, 2014.

[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137-146, 2012.

[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, "Multi-strategy brain storm optimization algorithm with dynamic parameters adjustment," Applied Intelligence, Guildford, UK, 2010.

[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189-120201, 2019.

[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723-8740, 2019.

[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1-15, 2017.

[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.

[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science and Technology, vol. 78, pp. 708-720, 2018.

[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5-32, 2001.

[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1-18, 2017.

[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707-130720, 2019.

[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421-428, 2006.

[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," IEEE Transactions on Evolutionary Computation, 2019.

[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290-304, 2020.

[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877-1890, 2018.

[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156-171, 2019.

[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.

[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209-221, 2018.

[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.

[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276-292, 2019.

[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501-511, 2018.

[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349-393, 2012.

[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China Information Sciences, vol. 62, no. 7, 2019.

[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.

[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.

[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51-67, 2016.

[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120-133, 2016.

[41] S. Saremi, S. Mirjalili, and A. Lewis, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30-47, 2017.

[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22-34, 2020.

[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.


However, the original swarm intelligence algorithms have limitations in solving some practical problems. The hybrid strategy, one of the main research directions for improving the performance of swarm intelligence algorithms, has become a research hotspot in machine learning. Tuba and Bacanin [12] modified the exploitation process of the original seeker optimization algorithm (SOA) by hybridizing it with FA, which overcame its shortcomings and outperformed other algorithms. Strumberger et al. [13] also proposed a dynamic search tree growth algorithm (TGA) and hybridized elephant herding optimization (EHO) with ABC, and the simulation results showed that the proposed approach was viable and effective. Yang [14] analyzed swarm intelligence algorithms using differential evolution, dynamic systems, self-organization, and a Markov chain framework; the discussion demonstrates that hybrid algorithms have some advantages over traditional algorithms. Bacanin and Tuba [15] proposed a modified ABC based on GA, and the obtained results show that the hybrid ABC is able to provide competitive results and outperform other counterparts. Liu et al. [16] presented a multistrategy brain storm optimization (BSO) with dynamic parameter adjustment, which is more competitive than other related algorithms. Peng et al. [17] proposed an FA with a luciferase inhibition mechanism to improve the effectiveness of selection; the simulation results show that the proposed approach has the best performance on some complex functions. Peng et al. [18] also developed a hybrid approach that uses a best-neighbor-guided solution search strategy in the ABC algorithm; the experimental results indicate that the proposed ABC is very competitive and outperforms the other algorithms. It can be seen that the hybrid strategy is a successful way to improve swarm intelligence algorithms, so BSA will be improved by hybrid strategies in this paper.

Similarly, BSA can be applied to multiple fields, especially parameter estimation, and hybrid strategies have also been used to improve BSA. In 2017, Xu et al. [19] proposed an improved boundary BSA (IBBSA) for chaotic system optimization of the Lorenz system and the coupling motor system. However, the improved boundary learning strategy is random, which limits the generalization performance of IBBSA. Yang and Liu [20] introduced a dynamic weight into the foraging formula of BSA (IBSA), which provides a solution to the anti-same-frequency interference problem of shipborne radar. However, the dynamic weight is only introduced into the foraging formula, and IBSA ignores the impact of population initialization. Wang et al. [21] designed a strategy named "disturbing the local optimum" to help the original BSA converge to the global optimal solution faster and more stably. However, "disturbing the local optimum" is also random, so the generalization performance of the improved BSA is not very good.

Like many swarm intelligence algorithms, BSA also faces the problems of being trapped in local optima and slow convergence. These disadvantages limit the wider application of BSA. In this paper, a dynamic multi-swarm differential learning quantum BSA, called DMSDL-QBSA, is proposed, which introduces three hybrid strategies into the original BSA to improve its effectiveness. Motivated by the insufficient generalization ability observed in the literature [19, 21], we first establish a dynamic multi-swarm bird swarm algorithm (DMS-BSA) and merge the differential evolution operator into each sub-swarm of the DMS-BSA, which improves both the local and global search capability of the foraging behavior. Second, noting that the literature [20] neglected the impact of population initialization, and that quantum behavior has been used to give particle swarm optimization a good ability to escape local optima, we use a quantum system to initialize the search space of the birds. Consequently, this improves the convergence rate of the whole population and prevents BSA from falling into a local optimum. To validate the effectiveness of the proposed method, we evaluated the performance of DMSDL-QBSA on classical benchmark functions and CEC2014 functions, including unimodal and multimodal functions, in comparison with state-of-the-art methods and newly popular algorithms. The experimental results show that the three improvement strategies significantly boost the performance of BSA.

Based on the DMSDL-QBSA, an effective hybrid random forest (RF) model for actual oil logging prediction is established, called the DMSDL-QBSA-RF approach. RF is nonlinear and resistant to interference [22]. In addition, it can decrease the possibility of overfitting, which often occurs in actual logging. RF has been widely used in various classification problems, but it has not yet been applied to the field of actual logging. Parameter estimation is a prerequisite for building the RF classification model. The two key parameters of RF are the number of decision trees and the number of predictor variables; the former is called ntree and the latter is called mtry. Meanwhile, parameter estimation of the model is a complex optimization problem that traditional methods may fail to solve. Many works have proposed using swarm intelligence algorithms to find the best parameters of the RF model. Ma and Fan [23] adopted AFSA and PSO to optimize the parameters of RF. Hou et al. [24] used DE to obtain an optimal set of initial parameters for RF. Liu et al. [25] compared genetic algorithms, simulated annealing, and hill-climbing algorithms for optimizing the parameters of RF. From these papers, we can see that metaheuristic algorithms are suitable for this problem. In this study, the DMSDL-QBSA is used to optimize the two key parameters, which can improve the accuracy of RF without overfitting. When investigating the performance of the DMSDL-QBSA-RF classification model compared with 3 swarm intelligence algorithm-based RF methods, 8 two-dimensional UCI datasets are applied. As the experimental results show, the proposed learning scheme produces a more stable RF classification model with higher predictive accuracy than its counterparts. The main contributions of this paper are summarized as follows.
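Since DMSDL-QBSA searches a continuous 2-dimensional space while ntree and mtry are integers, some decoding step is needed between the optimizer and the RF model. A minimal sketch of such a decoder is shown below; the function name and the bounds are illustrative assumptions, not values from the paper.

```python
def decode_position(pos, ntree_bounds=(10, 500), mtry_bounds=(1, 8)):
    """Map a bird's continuous 2-D position onto the two integer
    RF hyperparameters (ntree, mtry), clipping to the search range."""
    def clip_round(v, lo, hi):
        return max(lo, min(hi, int(round(v))))
    ntree = clip_round(pos[0], *ntree_bounds)
    mtry = clip_round(pos[1], *mtry_bounds)
    return ntree, mtry

print(decode_position([137.4, 2.6]))   # (137, 3)
print(decode_position([-20.0, 99.9]))  # (10, 8)
```

The fitness evaluated by the optimizer would then be the classification error of an RF trained with the decoded (ntree, mtry) pair.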

2 Computational Intelligence and Neuroscience

(i) In order to achieve a better balance between efficiency and speed for BSA, we have studied the effects of three different hybrid strategies, the dynamic multi-swarm method, differential evolution, and quantum behavior, on the performance of BSA.

(ii) The proposed DMSDL-QBSA successfully optimizes the ntree and mtry settings of RF. The resulting hybrid classification model has been rigorously evaluated on oil logging prediction.

(iii) The proposed hybrid classification model delivers better classification performance and offers more accurate and faster results than other swarm intelligence algorithm-based RF models.

2. Bird Swarm Algorithm and Its Improvement

2.1. Bird Swarm Algorithm Principle. BSA, proposed by Meng et al. in 2015, is an intelligent bionic algorithm based on multigroup and multisearch methods; it mimics the birds' foraging behavior, vigilance behavior, and flight behavior and employs this swarm intelligence to solve optimization problems. The bird swarm algorithm can be simplified by the following five rules.

Rule 1: each bird can switch between vigilance behavior and foraging behavior, and whether a bird forages or keeps vigilance is modeled as a random decision.
Rule 2: when foraging, each bird records and updates its own previous best experience and the swarm's previous best experience with food patches. This experience can also be used to search for food, and social information is shared instantly across the group.
Rule 3: when keeping vigilance, each bird tries to move towards the center of the swarm. This behavior may be influenced by interference caused by swarm competition. Birds with higher reserves are more likely to lie near the swarm's center than birds with lower reserves.
Rule 4: birds fly to another site regularly. When flying to another site, birds often switch between producing and scrounging. The bird with the highest reserves is a producer and the bird with the lowest is a scrounger; the other birds randomly choose to be producers or scroungers.
Rule 5: producers actively search for food. Scroungers randomly follow a producer to search for food.

According to Rule 1, we define the time interval of each bird's flight behavior FQ, the probability of foraging behavior P (P ∈ (0, 1)), and a uniform random number δ ∈ (0, 1).

(1) Foraging behavior. If the iteration number t satisfies t mod FQ ≠ 0 and δ ≤ P, the bird performs foraging behavior. Rule 2 can be written mathematically as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \left(p_{i,j}^{t} - x_{i,j}^{t}\right) \times C \times \mathrm{rand}(0,1) + \left(g_{j}^{t} - x_{i,j}^{t}\right) \times S \times \mathrm{rand}(0,1), \qquad (1)$$

where C and S are two positive numbers, the former called the cognitive accelerated coefficient and the latter the social accelerated coefficient. Here, p_{i,j} is the i-th bird's best previous position and g_j is the swarm's best previous position.
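As a concrete illustration, the foraging update of equation (1) can be sketched as follows; the function name `foraging_step` and the example values are introduced here, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def foraging_step(x, p, g, C=1.5, S=1.5):
    """One foraging update per equation (1): move bird i towards its own
    best position p and the swarm best g with uniform random step sizes."""
    return (x
            + (p - x) * C * rng.uniform(0.0, 1.0, size=x.shape)
            + (g - x) * S * rng.uniform(0.0, 1.0, size=x.shape))

x = np.array([4.0, -2.0])   # current position of bird i
p = np.array([1.0, 0.0])    # bird i's best previous position
g = np.array([0.0, 0.0])    # swarm's best previous position
x_new = foraging_step(x, p, g)
```

With C = S = 0 the bird stays put, which is a quick sanity check that the two attraction terms are the only sources of movement.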

(2) Vigilance behavior. If t mod FQ ≠ 0 and δ > P, the bird performs vigilance behavior. Rule 3 can be written mathematically as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + A_{1}\left(\mathrm{mean}_{j}^{t} - x_{i,j}^{t}\right) \times \mathrm{rand}(0,1) + A_{2}\left(p_{k,j}^{t} - x_{i,j}^{t}\right) \times \mathrm{rand}(-1,1), \qquad (2)$$

$$A_{1} = a_{1} \times \exp\left(-\frac{\mathrm{pFit}_{i}}{\mathrm{sumFit} + \varepsilon} \times N\right), \qquad (3)$$

$$A_{2} = a_{2} \times \exp\left(\frac{\mathrm{pFit}_{i} - \mathrm{pFit}_{k}}{\left|\mathrm{pFit}_{k} - \mathrm{pFit}_{i}\right| + \varepsilon} \times \frac{N \times \mathrm{pFit}_{k}}{\mathrm{sumFit} + \varepsilon}\right), \qquad (4)$$

where a1 and a2 are two positive constants in [0, 2], pFit_i is the best fitness value of the i-th bird, and sumFit is the sum of the swarm's best fitness values. Here, ε, which is used to avoid division by zero, is the smallest constant in the computer, and mean_j denotes the j-th element of the whole swarm's average position.
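The vigilance update of equations (2)-(4) can be sketched in the same style; the function and argument names are introduced here for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)

def vigilance_step(x_i, mean_pos, p_k, pfit_i, pfit_k, sum_fit,
                   a1=1.0, a2=1.0, N=30, eps=np.finfo(float).eps):
    """Vigilance update per equations (2)-(4): bird i moves towards the
    swarm centre while being attracted or repelled by a random bird k."""
    A1 = a1 * np.exp(-pfit_i / (sum_fit + eps) * N)                      # eq. (3)
    A2 = a2 * np.exp((pfit_i - pfit_k) / (abs(pfit_k - pfit_i) + eps)
                     * (N * pfit_k / (sum_fit + eps)))                   # eq. (4)
    return (x_i
            + A1 * (mean_pos - x_i) * rng.uniform(0, 1, size=x_i.shape)
            + A2 * (p_k - x_i) * rng.uniform(-1, 1, size=x_i.shape))     # eq. (2)

x_new = vigilance_step(np.array([2.0, 2.0]), np.array([0.0, 0.0]),
                       np.array([1.0, -1.0]), pfit_i=0.5, pfit_k=0.8,
                       sum_fit=12.0)
```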

(3) Flight behavior. If t mod FQ = 0, the bird performs flight behavior, which is divided into the behaviors of producers and scroungers according to fitness. Rule 4 and Rule 5 can be written mathematically as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \mathrm{randn}(0,1) \times x_{i,j}^{t}, \qquad (5)$$

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \left(x_{k,j}^{t} - x_{i,j}^{t}\right) \times FL \times \mathrm{rand}(0,1), \qquad (6)$$

where FL (FL ∈ [0, 2]) means that the scrounger follows the producer to search for food.
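A minimal sketch of the two flight updates in equations (5) and (6); the function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def producer_step(x):
    """Equation (5): a producer searches around its own position
    with Gaussian noise scaled by the position itself."""
    return x + rng.standard_normal(x.shape) * x

def scrounger_step(x, x_producer, FL=0.5):
    """Equation (6): a scrounger follows a randomly chosen producer."""
    return x + (x_producer - x) * FL * rng.uniform(0, 1, size=x.shape)

x = np.array([1.0, 2.0])
x_prod = producer_step(x)
x_scr = scrounger_step(x, np.array([5.0, 5.0]))
```

Note that a producer sitting exactly at the origin never moves under equation (5), since the noise is multiplied by the position.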

2.2. The Bird Swarm Algorithm Based on the Dynamic Multi-Swarm Method, Differential Evolution, and Quantum Behavior

2.2.1. The Foraging Behavior Based on the Dynamic Multi-Swarm Method. The dynamic multi-swarm method has been widely used in real-world applications because it is efficient and easy to implement. In addition, it is very common in the improvement of swarm intelligence optimizers, for example, the coevolutionary algorithm [26], the framework of evolutionary algorithms [27], multiobjective particle swarm optimization [28], hybrid dynamic robust optimization [29], and the PSO algorithm [30, 31]. However, the PSO algorithm easily falls into local optima and its generalization performance is not high. Consequently, motivated by these studies, we establish a dynamic multi-swarm bird swarm algorithm (DMS-BSA), which improves the local search capability of the foraging behavior.

In DMS-PSO, the whole population is divided into many small swarms, which are frequently regrouped using various reorganization schemes to exchange information. The velocity update strategy is

$$v_{i,j}^{t+1} = \omega v_{i,j}^{t} + c_{2} r_{2} \left(l_{i,j}^{t} - x_{i,j}^{t}\right), \qquad (7)$$

where l_{i,j} is the best historical position achieved within the local neighborhood of the i-th particle.
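The dynamic regrouping and the sub-swarm local best l used in equation (7) can be sketched as follows; this is a simplified one-shot illustration, whereas the actual scheme regroups periodically during the run.

```python
import numpy as np

rng = np.random.default_rng(3)

def regroup(num_birds, sub_swarm_size):
    """Randomly repartition bird indices into small sub-swarms,
    as in the dynamic multi-swarm scheme."""
    idx = rng.permutation(num_birds)
    return [idx[i:i + sub_swarm_size]
            for i in range(0, num_birds, sub_swarm_size)]

def local_best(groups, fitness, positions):
    """lbest for each bird: the best (lowest-fitness) historical
    position inside its own sub-swarm."""
    lbest = np.empty_like(positions)
    for g in groups:
        best = g[np.argmin(fitness[g])]
        lbest[g] = positions[best]
    return lbest

fit = np.array([3.0, 1.0, 2.0, 0.5, 4.0, 2.5])
pos = np.arange(12.0).reshape(6, 2)
groups = regroup(6, 3)
lb = local_best(groups, fit, pos)
```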

According to the characteristics of equation (1), the foraging behavior formula of BSA is similar to the particle velocity update formula of PSO. So, following equation (7), we can obtain the improved foraging behavior formula:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \left(GV_{i,j}^{t} - x_{i,j}^{t}\right) \times C \times \mathrm{rand}(0,1), \qquad (8)$$

where GV is called the guiding vector. The dynamic multi-swarm method is used to improve the local search capability, while the guiding vector GV can enhance the global search capability of the foraging behavior. Obviously, we need to build a good guiding vector.

2.2.2. The Guiding Vector Based on Differential Evolution. Differential evolution (DE) is a powerful evolutionary algorithm with three differential evolution operators for solving tough global optimization problems [32]. Besides, DE has received more and more attention from scholars in evolutionary computation, for example, hybrid multiple crossover operations [33] and DE/neighbor/1 [34], due to its excellent global search capability. From these studies, we can see that DE has good global search capability, so we establish the guiding vector GV based on the differential evolution operators to improve the global search capability of the foraging behavior. The detailed implementation of GV is presented as follows.

(1) Differential mutation. According to the characteristics of equation (8), the "DE/best/1", "DE/best/2", and "DE/current-to-best/1" mutation strategies are suitable. The experiments in the literature [31] showed that the "DE/lbest/1" mutation strategy is the most suitable in DMS-PSO, so we choose this mutation strategy in BSA. The "DE/lbest/1" mutation strategy can be written as follows:

$$v_{i,j}^{t} = l_{i,j}^{t} + F \times \left(p_{r1,j}^{t} - p_{r2,j}^{t}\right). \qquad (9)$$

Note that some components of the mutant vector v_{i,j} may violate the predefined boundary constraints. In this case, boundary processing is used, which can be expressed as follows:

$$v_{i,j}^{t} = \begin{cases} x_{lb}, & v_{i,j}^{t} < x_{lb}, \\ x_{ub}, & v_{i,j}^{t} \geq x_{ub}. \end{cases} \qquad (10)$$

(2) Crossover. After differential mutation, a binomial crossover operation exchanges some components of the mutant vector v_{i,j} with the best previous position p_{i,j} to generate the target vector u_{i,j}. The process can be expressed as

$$u_{i,j}^{t} = \begin{cases} v_{i,j}^{t}, & \mathrm{rand} \leq CR \ \text{or} \ j = \mathrm{rand}(i), \\ p_{i,j}^{t}, & \text{otherwise}. \end{cases} \qquad (11)$$

(3) Selection. Because the purpose of BSA is to find the best fitness, a selection operation chooses the vector with the better fitness to enter the next generation, yielding the selection operator, namely, the guiding vector GV:

$$GV_{i,j}^{t+1} = \begin{cases} u_{i,j}^{t}, & \mathrm{fitness}\left(u_{i,j}^{t}\right) \leq \mathrm{fitness}\left(x_{i,j}^{t}\right), \\ p_{i,j}^{t}, & \text{otherwise}. \end{cases} \qquad (12)$$
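Putting the three DE operators together, construction of the guiding vector GV for a single bird (equations (9)-(12)) can be sketched as below. Treating `pbest[0]` as this bird's own best position, and the bounds and F/CR values, are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def guiding_vector(lbest, pbest, x, fitness_fn, F=0.5, CR=0.1,
                   xlb=-10.0, xub=10.0):
    """Build the guiding vector GV for one bird: 'DE/lbest/1' mutation,
    boundary repair, binomial crossover, then greedy selection."""
    D = x.size
    r1, r2 = rng.choice(len(pbest), size=2, replace=False)
    v = lbest + F * (pbest[r1] - pbest[r2])        # eq. (9)  mutation
    v = np.clip(v, xlb, xub)                       # eq. (10) boundary repair
    p_i = pbest[0]                                 # this bird's own best (assumed index)
    j_rand = rng.integers(D)                       # guaranteed crossover dimension
    cross = rng.uniform(size=D) <= CR
    cross[j_rand] = True
    u = np.where(cross, v, p_i)                    # eq. (11) crossover
    return u if fitness_fn(u) <= fitness_fn(x) else p_i   # eq. (12) selection

sphere = lambda z: float(np.sum(z * z))
pbest = rng.uniform(-1, 1, size=(5, 2))
gv = guiding_vector(pbest[0], pbest, np.array([3.0, 3.0]), sphere)
# The foraging move then follows equation (8):
# x_new = x + (gv - x) * C * rand(0, 1)
```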

2.2.3. The Initialization of the Search Space Based on Quantum Behavior. Quantum behavior is a nonlinear superposition system. With its simple and effective characteristics and good performance in global optimization, it has been applied to optimize many algorithms, such as particle swarm optimization [35] and the pigeon-inspired optimization algorithm [36]. Consequently, motivated by these studies and its excellent global optimization performance, we use the quantum system to initialize the search space of the birds.

A quantum-behaved particle position can be written mathematically as follows:

$$x^{t+1} = p^{t} \pm \frac{L^{t}}{2} \ln\frac{1}{u}, \qquad (13)$$

$$p^{t} = c_{1} \times \mathrm{pbest}^{t} + \left(1 - c_{1}\right) \times \mathrm{gbest}^{t}, \qquad (14)$$

$$L^{t} = 2\beta \left|p^{t} - x^{t}\right|. \qquad (15)$$

According to the characteristics of equations (13)-(15), we can obtain the improved search space initialization formulas as follows:

$$x_{i}^{t} = lb + (ub - lb) \times \mathrm{rand}, \qquad (16)$$

$$x_{i}^{t+1} = x_{i}^{t} \pm \beta\left|\mathrm{mbest} - x_{i}^{t}\right| \times \ln\frac{1}{u}, \qquad (17)$$

where β is a positive number called the contraction-expansion factor. Here, x_i^t is the position of the particle at the previous moment, and mbest is the average of the best previous positions of all the birds (Algorithm 1).

2.2.4. Procedures of the DMSDL-QBSA. In Sections 2.2.1-2.2.3, in order to improve the local and global search capability of BSA, this paper has improved the BSA in three parts:

(1) To improve the local search capability of the foraging behavior, we put forward equation (8) based on the dynamic multi-swarm method.

(2) To obtain the guiding vector that improves the global search capability of the foraging behavior, we put forward equations (9), (11), and (12) based on differential evolution.

(3) To expand the initialization search space of the birds and improve the global search capability, we put forward equations (16) and (17) based on quantum behavior.

Finally, the steps of DMSDL-QBSA are shown in Algorithm 1.

2.3. Simulation Experiments and Analysis. This section presents the evaluation of DMSDL-QBSA through a series of experiments on benchmark functions and CEC2014 test functions. All experiments in this paper are implemented in MATLAB R2014b on Windows 7 (64-bit) with an Intel(R) Core(TM) i5-2450M CPU at 2.50 GHz and 4.00 GB RAM. To obtain fair results, all the experiments were conducted under the same conditions. The population size is set to 30 in these algorithms, and each algorithm is run 30 times independently for each function.

2.3.1. Benchmark Functions and CEC2014 Test Functions. When investigating the effectiveness and generality of DMSDL-QBSA compared with several hybrid algorithms and popular algorithms, 18 benchmark functions and the CEC2014 test functions are applied. In order to test the effectiveness of the proposed DMSDL-QBSA, 18 benchmark functions [37] are adopted, all of which have an optimal value of 0. The benchmark functions and their search ranges are shown in Table 1. In this test suite, f1-f9 are unimodal functions, usually used to investigate whether the proposed algorithm has good convergence performance, and f10-f18 are multimodal functions, used to test the global search capability of the proposed algorithm. The smaller the fitness value of a function, the better the algorithm performs. Furthermore, in order to verify the comprehensive performance of DMSDL-QBSA more thoroughly, another 30 complex CEC2014 benchmarks are used. The CEC2014 benchmark functions are briefly described in Table 2.

2.3.2. Parameter Settings. In order to verify the effectiveness and generalization of the proposed DMSDL-QBSA, it is compared with several hybrid algorithms: BSA [7], DE [32], DMSDL-PSO [31], and DMSDL-BSA. Another 5 popular intelligence algorithms, the grey wolf optimizer (GWO) [38], whale optimization algorithm (WOA) [39], sine cosine algorithm (SCA) [40], grasshopper optimization algorithm (GOA) [41], and sparrow search algorithm (SSA) [42], are also used for comparison with DMSDL-QBSA. These state-of-the-art algorithms allow the performance of DMSDL-QBSA to be verified more comprehensively. For a fair comparison, the population size of all algorithms is set to 30, and the other parameters of all algorithms are set according to their original papers. The parameter settings of the involved algorithms are shown in detail in Table 3.

2.3.3. Comparison on Benchmark Functions with Hybrid Algorithms. According to Section 2.2, three hybrid strategies (the dynamic multi-swarm method, DE, and quantum behavior) have been combined with the basic BSA. When investigating the effectiveness of DMSDL-QBSA compared with several hybrid algorithms, namely, BSA, DE, DMSDL-PSO, and DMSDL-BSA, the 18 benchmark functions are applied. Compared with DMSDL-QBSA, quantum behavior is not used in the dynamic multi-swarm differential learning bird swarm algorithm (DMSDL-BSA). The number of function evaluations (FEs) is 10000. We selected two different dimension sizes (Dim): Dim = 10 is the typical dimension for the benchmark functions, and Dim = 2 is used because RF has two parameters to be optimized, which means that the optimization function is 2-dimensional.

The fitness value curves of a run of several algorithms on eight different functions are shown in Figures 1 and 2, where the horizontal axis represents the number of iterations and the vertical axis represents the fitness value. We can clearly see the convergence speeds of the different algorithms. The maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) obtained by the benchmark algorithms are shown in Tables 4-7, where the best results are marked in bold. Tables 4 and 5 show the performance of the algorithms on unimodal functions when Dim = 10 and 2, and Tables 6 and 7 show their performance on multimodal functions when Dim = 10 and 2.

(1) Unimodal functions. From the numerical testing results on the unimodal functions in Table 4, we can see that DMSDL-QBSA can find the optimal solution for all unimodal functions and reaches the minimum value of 0 on f1, f2, f3, f7, and f8. Both DMSDL-QBSA and DMSDL-BSA can find the minimum value. However, DMSDL-QBSA has the best mean value and variance on each function. The main reason is that DMSDL-QBSA has better population diversity during the initialization period. In summary, DMSDL-QBSA has the best performance on unimodal functions compared to the other algorithms when Dim = 10, and it clearly has a relatively good convergence speed.

The evolution curves of these algorithms on four unimodal functions, f1, f5, f6, and f9, are drawn in Figure 1. It can be seen from the figure that the curve of DMSDL-QBSA descends fastest, in far fewer than 10000 iterations. For f1, f5, and f9, DMSDL-QBSA has the fastest convergence speed compared with the other algorithms, whereas the original BSA obtains the worst solution because it is trapped in a local optimum prematurely. For function f6, none of these algorithms finds the value 0, but the convergence speed of DMSDL-QBSA is significantly faster than the others in the early stage, and the solution it eventually finds is the best. Overall, owing to the enhanced population diversity, DMSDL-QBSA has an excellent convergence speed when Dim = 2.

Algorithm 1: DMSDL-QBSA.

Variables: number of iterations iter, bird positions Xi, local optima Pi, global optimal position gbest, global optimum Pgbest, and fitness of the sub-swarm optimal positions Plbest.
Input: population size N, dimension size D, maximum number of function evaluations iter_max, time interval of flight behavior FQ, probability of foraging behavior P, constant parameters c1, c2, a1, a2, FL, iter = 0, and contraction-expansion factor β.
Output: global optimal position gbest and fitness of the global optimal position Pgbest.

(1) Begin
(2)   Initialize the positions of the N birds using equations (16)-(17): Xi (i = 1, 2, ..., N)
(3)   Calculate the fitness Fi (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(4)   While iter < iter_max + 1 do
(5)     /* Dynamic sub-swarms */
(6)     Regroup the Pi and Xi of the sub-swarms randomly
(7)     Sort the Plbest and refine the first [0.25 × N] best Plbest
(8)     Update the corresponding Pi
(9)     /* Differential learning scheme */
(10)    For i = 1 to N
(11)      Construct lbest using DMS-BSA
(12)      Differential mutation: construct V using equations (9)-(10)
(13)      Crossover: construct U using equation (11)
(14)      Selection: construct GV using equation (12)
(15)    End For
(16)    /* Bird position adjustment */
(17)    If (t mod FQ ≠ 0)
(18)      For i = 1 to N
(19)        If rand < P
(20)          Foraging behavior: update the position Xi using equation (8)
(21)        Else
(22)          Vigilance behavior: update the position Xi using equation (2)
(23)        End If
(24)      End For
(25)    Else
(26)      Flight behavior: the birds are divided into producers and scroungers
(27)      For i = 1 to N
(28)        If Xi is a producer
(29)          Producer: update the position Xi using equation (5)
(30)        Else
(31)          Scrounger: update the position Xi using equation (6)
(32)        End If
(33)      End For
(34)    End If
(35)    Evaluate f(Xi)
(36)    Update gbest and Pgbest
(37)    iter = iter + 1
(38)  End While
(39) End


According to the results in Table 5, DMSDL-QBSA achieves the best performance on the unimodal functions when Dim = 2, finding the minimum value of 0 on f1, f2, f3, f4, f5, f7, f8, and f9. DMSDL-QBSA performs better on f2, f4, and f9 than DMSDL-BSA and DMSDL-PSO. DE does not perform well on these functions, but BSA performs relatively well on f1, f3, f6, f7, f8, and f9; the main reason is that BSA has better convergence performance in the early search. Obviously, DMSDL-QBSA can find the best values of the two parameters of RF that need to be optimized.

(2) Multimodal functions. From the numerical testing results on the multimodal functions in Table 6, we can see that DMSDL-QBSA can find the optimal solution for all multimodal functions and reaches the minimum value of 0 on f11, f13, and f16. DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, and f16; BSA works best on f10; and DMSDL-PSO does not perform very well. DMSDL-QBSA also has the best mean value and variance on most functions. The main reason is that DMSDL-QBSA has a stronger global exploration capability based on the dynamic multi-swarm method and differential evolution. In summary, DMSDL-QBSA performs relatively well on multimodal functions compared to the other algorithms when Dim = 10; that is, it has relatively good global search capability.

The evolution curves of these algorithms on four multimodal functions, f12, f13, f17, and f18, when Dim = 2 are depicted in Figure 2. We can see that DMSDL-QBSA finds the best solution within the same number of iterations. For f13 and f17, DMSDL-QBSA continues to decline, whereas the original BSA and DE produce flat lines because of their poor global convergence ability. For functions f12 and f18, although DMSDL-QBSA is also trapped in a local optimum, it finds the minimum value compared to the other algorithms. Obviously, the convergence speed of DMSDL-QBSA is significantly faster than the other algorithms in the early stage, and the solution it eventually finds is the best. In general, owing to the enhanced population diversity, DMSDL-QBSA has a relatively balanced global search capability when Dim = 2.

Furthermore, from the numerical testing results on the multimodal functions in Table 7, we can see that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f16, f17, and f18 and reaches the minimum value of 0 on f11, f13, f16, and f17. BSA reaches the minimum value of 0 on f11, f13, and f17, while DE does not reach the minimum value of 0

Table 1: 18 benchmark functions.

Sphere: f1(x) = Σ_{i=1}^{D} x_i², range [-100, 100]^D
Schwefel P2.22: f2(x) = Σ_{i=1}^{D} |x_i| + Π_{i=1}^{D} |x_i|, range [-10, 10]^D
Schwefel P1.2: f3(x) = Σ_{i=1}^{D} (Σ_{j=1}^{i} x_j)², range [-100, 100]^D
Generalized Rosenbrock: f4(x) = Σ_{i=1}^{D-1} [100(x_i² - x_{i+1})² + (x_i - 1)²], range [-100, 100]^D
Step: f5(x) = Σ_{i=1}^{D} (|x_i + 0.5|)², range [-100, 100]^D
Noise: f6(x) = Σ_{i=1}^{D} i·x_i⁴ + rand[0, 1), range [-1.28, 1.28]^D
SumSquares: f7(x) = Σ_{i=1}^{D} i·x_i², range [-10, 10]^D
Zakharov: f8(x) = Σ_{i=1}^{D} x_i² + (Σ_{i=1}^{D} 0.5·i·x_i)² + (Σ_{i=1}^{D} 0.5·i·x_i)⁴, range [-10, 10]^D
Schaffer: f9(x, y) = 0.5 + (sin²(x² - y²) - 0.5)/[1 + 0.001(x² - y²)]², range [-10, 10]^D
Generalized Schwefel 2.26: f10(x) = 418.9829·D - Σ_{i=1}^{D} x_i sin(√|x_i|), range [-500, 500]^D
Generalized Rastrigin: f11(x) = Σ_{i=1}^{D} [x_i² - 10 cos(2πx_i) + 10], range [-5.12, 5.12]^D
Ackley: f12(x) = 20 + e - 20 exp(-0.2·√((1/D)Σ_{i=1}^{D} x_i²)) - exp((1/D)Σ_{i=1}^{D} cos(2πx_i)), range [-32, 32]^D
Generalized Griewank: f13(x) = Σ_{i=1}^{D} x_i²/4000 - Π_{i=1}^{D} cos(x_i/√i) + 1, range [-600, 600]^D
Generalized Penalized 1: f14(x) = (π/D){10 sin²(πy_1) + Σ_{i=1}^{D-1} (y_i - 1)²[1 + 10 sin²(πy_{i+1})] + (y_D - 1)²} + Σ_{i=1}^{D} u(x_i, 10, 100, 4), where y_i = 1 + (x_i + 1)/4 and u(x_i, a, k, m) = k(x_i - a)^m if x_i > a; 0 if -a ≤ x_i ≤ a; k(-x_i - a)^m if x_i < -a, range [-50, 50]^D
Generalized Penalized 2: f15(x) = (1/10){sin²(3πx_1) + Σ_{i=1}^{D-1} (x_i - 1)²[1 + sin²(3πx_{i+1})] + (x_D - 1)²[1 + sin²(2πx_D)]} + Σ_{i=1}^{D} u(x_i, 5, 100, 4), range [-50, 50]^D
Alpine: f16(x) = Σ_{i=1}^{D} |x_i sin x_i + 0.1·x_i|, range [-10, 10]^D
Booth: f17(x) = (x_1 + 2x_2 - 7)² + (2x_1 + x_2 - 5)², range [-10, 10]^D
Levy: f18(x) = sin²(πy_1) + Σ_{i=1}^{D-1} (y_i - 1)²[1 + 10 sin²(πy_i + 1)] + (y_D - 1)²[1 + sin²(2πy_D)], where y_i = 1 + (x_i - 1)/4, range [-10, 10]^D


on any function. DMSDL-BSA reaches the minimum value of 0 on f11 and f13. In summary, DMSDL-QBSA has superior global search capability on most multimodal functions when Dim = 2. Obviously, DMSDL-QBSA can find the best values of the two RF parameters to be optimized because of its excellent global search capability.

In this section, it can be seen from Figures 1 and 2 and Tables 4-7 that DMSDL-QBSA obtains the best function values in most cases. This indicates that the hybrid strategies of BSA (the dynamic multi-swarm method, DE, and quantum behavior operators) lead the birds to move towards the best solutions, and that DMSDL-QBSA is well able to search for the best two RF parameters with high accuracy and efficiency.

2.3.4. Comparison on Benchmark Functions with Popular Algorithms. When comparing the timeliness and applicability of DMSDL-QBSA against several popular algorithms, namely, GWO, WOA, SCA, GOA, and SSA, the 18 benchmark functions are applied; GWO, WOA, GOA, and SSA are swarm intelligence algorithms. In this experiment, the dimension size of these functions is 10, and the number of function evaluations (FEs) is 100000. The maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) obtained by the different algorithms are shown in Tables 8 and 9, where the best results are marked in bold.

Table 2: Summary of the CEC'14 test functions (search range [-100, 100]^D; Fi* = Fi(x*) is the optimal value).

Unimodal functions:
F1: Rotated High-Conditioned Elliptic Function, F1* = 100
F2: Rotated Bent Cigar Function, F2* = 200
F3: Rotated Discus Function, F3* = 300

Simple multimodal functions:
F4: Shifted and Rotated Rosenbrock's Function, F4* = 400
F5: Shifted and Rotated Ackley's Function, F5* = 500
F6: Shifted and Rotated Weierstrass Function, F6* = 600
F7: Shifted and Rotated Griewank's Function, F7* = 700
F8: Shifted Rastrigin's Function, F8* = 800
F9: Shifted and Rotated Rastrigin's Function, F9* = 900
F10: Shifted Schwefel's Function, F10* = 1000
F11: Shifted and Rotated Schwefel's Function, F11* = 1100
F12: Shifted and Rotated Katsuura Function, F12* = 1200
F13: Shifted and Rotated HappyCat Function, F13* = 1300
F14: Shifted and Rotated HGBat Function, F14* = 1400
F15: Shifted and Rotated Expanded Griewank's plus Rosenbrock's Function, F15* = 1500
F16: Shifted and Rotated Expanded Scaffer's F6 Function, F16* = 1600

Hybrid functions:
F17: Hybrid Function 1 (N = 3), F17* = 1700
F18: Hybrid Function 2 (N = 3), F18* = 1800
F19: Hybrid Function 3 (N = 4), F19* = 1900
F20: Hybrid Function 4 (N = 4), F20* = 2000
F21: Hybrid Function 5 (N = 5), F21* = 2100
F22: Hybrid Function 6 (N = 5), F22* = 2200

Composition functions:
F23: Composition Function 1 (N = 5), F23* = 2300
F24: Composition Function 2 (N = 3), F24* = 2400
F25: Composition Function 3 (N = 3), F25* = 2500
F26: Composition Function 4 (N = 5), F26* = 2600
F27: Composition Function 5 (N = 5), F27* = 2700
F28: Composition Function 6 (N = 5), F28* = 2800
F29: Composition Function 7 (N = 3), F29* = 2900
F30: Composition Function 8 (N = 3), F30* = 3000

Table 3: Parameter settings.

BSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10
DE: F = 0.5, CR = 0.1
DMSDL-PSO: c1 = c2 = 1.49445, F = 0.5, CR = 0.1, sub-swarm size = 10
DMSDL-BSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarm size = 10
DMSDL-QBSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarm size = 10


From the test results in Table 8, we can see that DMSDL-QBSA has the best performance on every unimodal function. GWO finds the value 0 on f1, f2, f3, f7, and f8; WOA obtains 0 on f1, f2, and f7; and SSA works best on f1 and f7. For the multimodal function evaluations, Table 9 shows that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, f16, and f18; SSA has the best performance on f10; GWO attains the minimum on f11; and WOA and SCA obtain the optimal value on f11 and f13. Obviously, compared with these popular algorithms, DMSDL-QBSA is a competitive algorithm for solving these functions, and the swarm intelligence algorithms perform better than the other algorithms. The results of Tables 8 and 9 show that DMSDL-QBSA has the best performance on most of the tested benchmark functions.

2.3.5. Comparison on CEC2014 Test Functions with Hybrid Algorithms. When comparing the comprehensive performance of the proposed DMSDL-QBSA with several hybrid algorithms, namely, BSA, DE, DMSDL-PSO, and DMSDL-BSA, the 30 CEC2014 test functions are applied. In this experiment, the dimension size (Dim) is set to 10, and the number of function evaluations (FEs) is 100000. Experimental comparisons including the maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) are given in Tables 10 and 11, where the best results are marked in bold.

Based on the mean values (Mean) on the CEC2014 test functions, DMSDL-QBSA has the best performance on F2, F3, F4, F6, F7, F8, F9, F10, F11, F15, F16, F17, F21, F26, F27, F29, and F30, while DMSDL-BSA shows an advantage on F1, F12,

Figure 1: Fitness value curves of 5 hybrid algorithms on (a) f1, (b) f5, (c) f6, and (d) f9 (Dim = 2).


F13 F14 F18 F19 F20 F24 F25 and F28 According to theresults we can observe that DMSDL-QBSA can find theminimal value on 17 CEC2014 test functions DMSDL-BSAgets the minimum value on F1 F12 F13 F14 F18 F19 F24 andF30 and DMSDL-PSO obtains the minimum value on F4 F7and F23 Owing to enhance the capability of exploitationDMSDL-QBSA is better than DMSDL-BSA and DMSDL-PSO on most functions From the results of tests it can beseen that DMSDL-QBSA performs better than BSA DEDMSDL-PSO and DMSDL-BSA It can be observed thatDMSDL-QBSA obtains optimal value It can be concluded

that DMSDL-QBSA has better global search ability and better robustness on these test suites.

3. Optimize RF Classification Model Based on Improved BSA Algorithm

3.1. RF Classification Model. RF, as proposed by Breiman et al., is an ensemble learning model based on bagging and random subspace methods. The whole modeling process includes building decision trees and decision processes. The process of constructing decision trees is mainly composed of

[Figure 2: Fitness value curves of the 5 hybrid algorithms (BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA) on (a) f12 (Ackley), (b) f13 (Griewank), (c) f17 (Booth), and (d) f18 (Levy) with Dim = 2; each panel plots fitness value against iteration on logarithmic scales.]


ntree decision trees, each of which consists of nonleaf nodes and leaf nodes. The leaf node is a child node of the node branch. It is supposed that the dataset has M attributes. When each leaf node of the decision tree needs to be segmented, mtry attributes are randomly selected from the M attributes as the reselected splitting variables of this node. This process can be defined as follows:

$$S_j = \left\{ P_i \mid i = 1, 2, \ldots, m_{try} \right\} \qquad (18)$$

where S_j is the splitting variable of the j-th leaf node of the decision tree and P_i is the probability that the mtry reselected attributes are selected as the splitting attribute of the node.
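A minimal sketch of this per-node attribute subsampling, assuming uniform random selection without replacement (the function and variable names are illustrative, not from the paper):

```python
import random

def select_split_variables(num_attributes, mtry, rng=None):
    """Randomly choose mtry of the M attribute indices as the candidate
    splitting variables for one leaf node (cf. equation (18))."""
    rng = rng or random.Random()
    # sample() draws without replacement, so the mtry indices are distinct
    return sorted(rng.sample(range(num_attributes), mtry))

rng = random.Random(42)
s_j = select_split_variables(32, 6, rng)  # e.g. M = 32 as for Ionosphere
```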

The nonleaf node is a parent node that classifies training data into a left or right child node. The function of the k-th decision tree is as follows:

$$h_k(c \mid x) = \begin{cases} 0, & f_l\left(x_k\right) < f_l\left(x_k\right) + \tau \\ 1, & f_l\left(x_k\right) \geq f_l\left(x_k\right) + \tau \end{cases}, \quad l = 1, 2, \ldots, n_{tree} \qquad (19)$$

where c = 0 or 1, where the symbol 0 indicates that the k-th row of data is classified as a negative label and the symbol 1

indicates that the k-th row of data is classified as a positive label. Here, f_l is the training function of the l-th decision tree based on the splitting variable S, x_k is the k-th row of data in the dataset obtained by random sampling with replacement, and the symbol τ is a positive constant which is used as the threshold value of the training decision.

When decision processes are trained, each row of data will be input into a leaf node of each decision tree. The average of the ntree decision tree classification results is used as the final classification result. This process can be written mathematically as follows:

$$c_k = l \times \frac{1}{n_{tree}} \sum_{k=1}^{n_{tree}} h_k(c \mid x) \qquad (20)$$

where l is the number of decision trees which judged the k-th row of data as c.
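The averaging of the ntree tree outputs in equation (20) amounts to a mean vote over 0/1 labels; a minimal sketch, with the 0.5 decision threshold taken as an assumption since the text does not state how the averaged value is mapped back to a class:

```python
def forest_vote(tree_outputs):
    """Aggregate the 0/1 outputs h_k of the ntree trees for one row of data:
    the mean vote, thresholded at 0.5 (an assumption), gives the final class."""
    mean_vote = sum(tree_outputs) / len(tree_outputs)
    return (1 if mean_vote >= 0.5 else 0), mean_vote

# Five trees vote 1, 0, 1, 1, 0 for one row: mean vote 0.6, final class 1.
label, score = forest_vote([1, 0, 1, 1, 0])
```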

From the above principle, we can see that it is mainly necessary to determine the two parameters ntree and mtry in the RF modeling process. In order to verify the influence of these two parameters on the classification accuracy of the RF classification model, the Ionosphere dataset is used to test the influence of the two parameters on the performance of

Table 4. Comparison on nine unimodal functions with 5 hybrid algorithms (Dim = 10).

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
f1        Max   1.0317E+04   1.2466E+04   1.7232E+04   1.0874E+04   7.4436E+01
          Min   0            4.0214E+03   1.6580E-02   0            0
          Mean  6.6202E+00   4.6622E+03   6.0643E+01   6.3444E+00   3.3920E-02
          Var   2.0892E+02   8.2293E+02   7.3868E+02   1.9784E+02   1.2343E+00
f2        Max   3.3278E+02   1.8153E+02   8.0436E+01   4.9554E+02   2.9845E+01
          Min   4.9286E-182  1.5889E+01   1.0536E-01   3.0074E-182  0
          Mean  5.7340E-02   1.8349E+01   2.9779E+00   6.9700E-02   1.1220E-02
          Var   3.5768E+00   4.8296E+00   2.4966E+00   5.1243E+00   4.2864E-01
f3        Max   1.3078E+04   1.3949E+04   1.9382E+04   1.2899E+04   8.4935E+01
          Min   3.4873E-250  4.0327E+03   6.5860E-02   1.6352E-249  0
          Mean  7.6735E+00   4.6130E+03   8.6149E+01   7.5623E+00   3.3260E-02
          Var   2.4929E+02   8.4876E+02   8.1698E+02   2.4169E+02   1.2827E+00
f4        Max   2.5311E+09   2.3900E+09   4.8639E+09   3.7041E+09   3.5739E+08
          Min   5.2310E+00   2.7690E+08   8.4802E+00   5.0021E+00   8.9799E+00
          Mean  6.9192E+05   3.3334E+08   1.1841E+07   9.9162E+05   6.8518E+04
          Var   3.6005E+07   1.4428E+08   1.8149E+08   4.5261E+07   4.0563E+06
f5        Max   1.1619E+04   1.3773E+04   1.6188E+04   1.3194E+04   5.1960E+03
          Min   5.5043E-15   5.6109E+03   1.1894E-02   4.2090E-15   1.5157E+00
          Mean  5.9547E+00   6.3278E+03   5.2064E+01   6.5198E+00   3.5440E+00
          Var   2.0533E+02   9.6605E+02   6.2095E+02   2.2457E+02   7.8983E+01
f6        Max   3.2456E+00   7.3566E+00   8.9320E+00   2.8822E+00   1.4244E+00
          Min   1.3994E-04   1.2186E+00   2.2482E-03   8.2911E-05   1.0911E-05
          Mean  2.1509E-03   1.4021E+00   1.1982E-01   1.9200E-03   6.1476E-04
          Var   5.3780E-02   3.8482E-01   3.5554E-01   5.0940E-02   1.9880E-02
f7        Max   4.7215E+02   6.7534E+02   5.6753E+02   5.3090E+02   2.3468E+02
          Min   0            2.2001E+02   5.6300E-02   0            0
          Mean  2.4908E-01   2.3377E+02   9.2909E+00   3.0558E-01   9.4500E-02
          Var   8.5433E+00   3.3856E+01   2.2424E+01   1.0089E+01   3.5569E+00
f8        Max   3.2500E+02   2.4690E+02   2.7226E+02   2.8001E+02   1.7249E+02
          Min   1.4678E-239  8.3483E+01   5.9820E-02   8.9624E-239  0
          Mean  1.9072E-01   9.1050E+01   7.9923E+00   2.3232E-01   8.1580E-02
          Var   6.3211E+00   1.3811E+01   1.7349E+01   6.4400E+00   2.9531E+00


the RF model, as shown in Figure 3, where the horizontal axes represent ntree and mtry, respectively, and the vertical axis represents the accuracy of the RF classification model.

(1) Parameter analysis of ntree

When the number of predictor variables mtry is set to 6, the number of decision trees ntree is cyclically set from 0 to 1000 at intervals of 20. The evolutionary progress of RF classification model accuracy with the change of ntree is shown in Figure 3(a). From the curve in Figure 3(a), we can see that the accuracy of RF is gradually improved with the increase of the number ntree of decision trees. However, when the number of decision trees ntree is greater than a certain value, the improvement of RF performance becomes gentle, without obvious gains, while the running time becomes longer.

(2) Parameter analysis of mtry

When the number of decision trees ntree is set to 500, the number of predictor variables mtry is cyclically set from 1 to 32. The limit of mtry is set to 32 because the number of attributes of the Ionosphere dataset is 32. The obtained curve of RF classification model accuracy with the mtry transformation is shown in Figure 3(b). We can see that, with the increase of the splitting property of the selection, the classification performance of RF is gradually improved; but when the number of predictor variables mtry is greater than 9, the RF generates overfitting and the accuracy of RF begins to decrease. The main reason is that too many split attributes are selected, which results in the same splitting attributes being owned by a large number of decision trees. This reduces the diversity of decision trees.
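The two parameter scans described above can be expressed as a grid search; in this sketch the accuracy function is a synthetic stand-in that merely mimics the reported shape (saturation in ntree, overfitting past mtry = 9), since the real values would require training an RF on the Ionosphere data:

```python
import random

def evaluate_rf(ntree, mtry, rng):
    """Stand-in for training an RF and measuring test accuracy; the shape
    mimics Figure 3 but the numbers are illustrative, not measured."""
    base = 0.85 + 0.05 * (1.0 - 1.0 / ntree)   # gain saturates as ntree grows
    penalty = 0.01 * max(0, mtry - 9)          # overfitting once mtry > 9
    return base - penalty + rng.uniform(-0.005, 0.005)

rng = random.Random(0)
# Grid search over the same ranges the text scans: ntree in 20..1000
# (step 20) and mtry in 1..32.
best_acc, best_ntree, best_mtry = max(
    (evaluate_rf(n, m, rng), n, m)
    for n in range(20, 1001, 20)
    for m in range(1, 33)
)
```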

Table 5. Comparison on nine unimodal functions with hybrid algorithms (Dim = 2).

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
f1        Max   1.8411E+02   3.4713E+02   2.9347E+02   1.6918E+02   1.6789E+00
          Min   0            5.7879E-01   0            0            3.2988E-238
          Mean  5.5890E-02   1.1150E+00   1.6095E-01   3.6580E-02   4.2590E-03
          Var   2.6628E+00   5.7218E+00   5.4561E+00   2.0867E+00   7.5858E-02
f2        Max   2.2980E+00   2.1935E+00   3.2363E+00   3.1492E+00   1.1020E+00
          Min   5.5923E-266  8.2690E-02   2.9096E-242  3.4367E-241  0
          Mean  9.2769E-04   9.3960E-02   7.4900E-03   1.2045E-03   2.1565E-04
          Var   3.3310E-02   6.9080E-02   4.0100E-02   4.2190E-02   1.3130E-02
f3        Max   1.0647E+02   1.3245E+02   3.6203E+02   2.3793E+02   1.3089E+02
          Min   0            5.9950E-01   0            0            0
          Mean  2.3040E-02   8.5959E-01   2.5020E-01   6.3560E-02   1.7170E-02
          Var   1.2892E+00   2.8747E+00   7.5569E+00   2.9203E+00   1.3518E+00
f4        Max   1.7097E+02   6.1375E+01   6.8210E+01   5.9141E+01   1.4726E+01
          Min   1.6325E-21   4.0940E-02   8.2726E-13   3.4830E-25   0
          Mean  2.2480E-02   9.3940E-02   1.5730E-02   1.1020E-02   1.4308E-02
          Var   1.7987E+00   7.4859E-01   7.6015E-01   6.4984E-01   2.7598E-01
f5        Max   1.5719E+02   2.2513E+02   3.3938E+02   1.8946E+02   8.7078E+01
          Min   0            7.0367E-01   0            0            0
          Mean  3.4380E-02   1.8850E+00   1.7082E-01   5.0090E-02   1.1880E-02
          Var   1.9018E+00   5.6163E+00   5.9868E+00   2.4994E+00   9.1749E-01
f6        Max   1.5887E-01   1.5649E-01   1.5919E-01   1.3461E-01   1.0139E-01
          Min   2.5412E-05   4.5060E-04   5.9140E-05   4.1588E-05   7.3524E-06
          Mean  2.3437E-04   1.3328E-03   6.0989E-04   2.3462E-04   9.2394E-05
          Var   2.4301E-03   3.6700E-03   3.5200E-03   1.9117E-03   1.4664E-03
f7        Max   3.5804E+00   2.8236E+00   1.7372E+00   2.7513E+00   1.9411E+00
          Min   0            7.6633E-03   0            0            0
          Mean  8.5474E-04   1.6590E-02   8.6701E-04   7.6781E-04   3.1439E-04
          Var   4.4630E-02   6.0390E-02   2.4090E-02   3.7520E-02   2.2333E-02
f8        Max   4.3247E+00   2.1924E+00   5.3555E+00   3.3944E+00   5.5079E-01
          Min   0            8.6132E-03   0            0            0
          Mean  1.1649E-03   1.9330E-02   1.7145E-03   7.3418E-04   6.9138E-05
          Var   5.9280E-02   4.7800E-02   7.5810E-02   4.1414E-02   5.7489E-03
f9        Max   2.7030E-02   3.5200E-02   1.7240E-02   4.0480E-02   2.5230E-02
          Min   0            5.0732E-03   0            0            0
          Mean  6.1701E-05   6.2500E-03   8.8947E-04   8.4870E-05   2.7362E-05
          Var   7.6990E-04   1.3062E-03   2.0400E-03   9.6160E-04   5.5610E-04


In summary, for the RF classification model to obtain the ideal optimal solution, the selection of the number of decision trees ntree and the number of predictor variables mtry is very important, and the classification accuracy of the RF classification model can only be optimized by the comprehensive optimization of these two parameters. So it is necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we will optimize the RF classification model by the improved BSA proposed in Section 2.

3.2. RF Model Based on an Improved Bird Swarm Algorithm. The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) uses the improved bird swarm algorithm to optimize the RF classification model and introduces the training dataset into the training process of the RF classification model, finally yielding the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters ntree and mtry as the optimization target of DMSDL-QBSA, so as to obtain a set of grouping parameters and make the RF classification model obtain the best

classification accuracy. The specific algorithm steps are shown in Algorithm 2.
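The core of the DMSDL-QBSA-RF coupling is the two-dimensional fitness function over (ntree, mtry); a hedged sketch, in which the decoding of a continuous particle position to integer parameters and the parameter bounds are assumptions rather than details stated in the paper:

```python
def decode_position(position, ntree_max=1000, mtry_max=32):
    """Map a 2-D continuous position in [0, 1]^2 to the integer RF
    parameters (ntree, mtry); the bounds here are illustrative."""
    ntree = max(1, round(position[0] * ntree_max))
    mtry = max(1, round(position[1] * mtry_max))
    return ntree, mtry

def fitness(position, accuracy_fn):
    """Fitness evaluated by the optimiser: the classification error of the
    RF built with the decoded parameters (lower is better)."""
    ntree, mtry = decode_position(position)
    return 1.0 - accuracy_fn(ntree, mtry)

# A constant-accuracy stand-in for the trained RF, just to show the wiring.
err = fitness([0.5, 0.25], lambda ntree, mtry: 0.9)
```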

3.3. Simulation Experiment and Analysis. In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare the improved classification model with the standard RF model, BSA-RF model, and DMSDL-BSA-RF model on 8 two-dimensional UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by BSA without quantum behavior. In our experiment, each of the datasets is divided into two parts: 70% of the dataset is used as the training set and the remaining 30% as the test set. The average classification accuracies of 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model gets the best accuracy on each UCI dataset except the magic dataset, on which the DMSDL-BSA-RF classification model gets the best accuracy. Compared with the standard RF model, the DMSDL-QBSA-RF classification model achieves better accuracy, increased by about 10%. Finally, the

Table 6. Comparison on nine multimodal functions with hybrid algorithms (Dim = 10).

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
f10       Max   2.8498E+03   2.8226E+03   3.0564E+03   2.7739E+03   2.8795E+03
          Min   1.2553E+02   1.8214E+03   1.2922E+03   1.6446E+02   1.1634E+03
          Mean  2.5861E+02   1.9229E+03   1.3185E+03   3.1119E+02   1.2729E+03
          Var   2.4093E+02   1.2066E+02   1.2663E+02   2.5060E+02   1.2998E+02
f11       Max   1.2550E+02   1.0899E+02   1.1806E+02   1.1243E+02   9.1376E+01
          Min   0            6.3502E+01   1.0751E+01   0            0
          Mean  2.0417E-01   6.7394E+01   3.9864E+01   1.3732E-01   6.8060E-02
          Var   3.5886E+00   5.8621E+00   1.3570E+01   3.0325E+00   2.0567E+00
f12       Max   2.0021E+01   1.9910E+01   1.9748E+01   1.9254E+01   1.8118E+01
          Min   8.8818E-16   1.6575E+01   7.1700E-02   8.8818E-16   8.8818E-16
          Mean  3.0500E-02   1.7157E+01   3.0367E+00   3.8520E-02   1.3420E-02
          Var   5.8820E-01   5.2968E-01   1.6585E+00   6.4822E-01   4.2888E-01
f13       Max   1.0431E+02   1.3266E+02   1.5115E+02   1.2017E+02   6.1996E+01
          Min   0            4.5742E+01   2.1198E-01   0            0
          Mean  6.1050E-02   5.2056E+01   3.0613E+00   6.9340E-02   2.9700E-02
          Var   1.8258E+00   8.3141E+00   1.5058E+01   2.2452E+00   1.0425E+00
f14       Max   8.4576E+06   3.0442E+07   5.3508E+07   6.2509E+07   8.5231E+06
          Min   1.7658E-13   1.9816E+06   4.5685E-05   1.6961E-13   5.1104E-01
          Mean  1.3266E+03   3.1857E+06   6.8165E+04   8.8667E+03   1.1326E+03
          Var   9.7405E+04   1.4876E+06   1.4622E+06   6.4328E+05   8.7645E+04
f15       Max   1.8310E+08   1.4389E+08   1.8502E+08   1.4578E+08   2.5680E+07
          Min   1.7942E-11   1.0497E+07   2.4500E-03   1.1248E-11   9.9870E-01
          Mean  3.7089E+04   1.5974E+07   2.0226E+05   3.8852E+04   3.5739E+03
          Var   2.0633E+06   1.0724E+07   4.6539E+06   2.1133E+06   2.6488E+05
f16       Max   1.3876E+01   1.4988E+01   1.4849E+01   1.3506E+01   9.3280E+00
          Min   4.2410E-174  6.8743E+00   2.5133E-02   7.3524E-176  0
          Mean  1.3633E-02   7.2408E+00   2.5045E+00   1.3900E-02   5.1800E-03
          Var   3.3567E-01   7.7774E-01   1.0219E+00   3.4678E-01   1.7952E-01
f18       Max   3.6704E+01   3.6950E+01   2.8458E+01   2.6869E+01   2.4435E+01
          Min   2.0914E-11   9.7737E+00   3.3997E-03   5.9165E-12   7.5806E-01
          Mean  6.5733E-02   1.2351E+01   6.7478E-01   5.6520E-02   7.9392E-01
          Var   6.7543E-01   2.8057E+00   1.4666E+00   6.4874E-01   3.6928E-01


DMSDL-QBSA-RF classification model achieves the best accuracy on the appendicitis dataset, which is up to 93.55%. In summary, the DMSDL-QBSA-RF classification model has validity on most datasets and a good performance on them.
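The evaluation protocol above (10 independent 70/30 splits, accuracies averaged) can be sketched as a small harness; train_model stands in for fitting any of the compared classifiers and is a hypothetical name, not an API from the paper:

```python
import random

def average_accuracy(data, labels, train_model, runs=10, train_frac=0.7, seed=0):
    """Average test accuracy over independent 70/30 splits, as used for
    Table 12; train_model fits on the training part and returns a predictor."""
    rng = random.Random(seed)
    accs = []
    for _ in range(runs):
        idx = list(range(len(data)))
        rng.shuffle(idx)
        cut = int(train_frac * len(idx))
        train, test = idx[:cut], idx[cut:]
        predict = train_model([data[i] for i in train], [labels[i] for i in train])
        correct = sum(predict(data[i]) == labels[i] for i in test)
        accs.append(correct / len(test))
    return sum(accs) / runs

# Trivial majority-class "model" just to exercise the harness.
def train_majority(xs, ys):
    majority = max(set(ys), key=ys.count)
    return lambda x: majority

acc = average_accuracy([[0]] * 10, [1] * 10, train_majority)
```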

4. Oil Layer Classification Application

4.1. Design of Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1. The selection of the actual logging datasets should be intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis, and the dataset should be relatively independent. The dataset is randomly divided into two parts of training and testing samples.

Step 2. In order to better understand the relationship between independent variables and dependent variables and to reduce the sample information attributes, the continuous attributes of the dataset should be discretized by using a greedy algorithm.
Step 3. In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize attribute reduction. After attribute reduction, normalization of the actual logging datasets is carried out to avoid computational saturation.
Step 4. In the DMSDL-QBSA-RF layer classification model, we input the actual logging dataset after attribute reduction, use the DMSDL-QBSA-RF layer classification algorithm to train, and finally get the DMSDL-QBSA-RF layer classification model.

Table 7. Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
f10       Max   2.0101E+02   2.8292E+02   2.9899E+02   2.4244E+02   2.8533E+02
          Min   2.5455E-05   3.7717E+00   2.3690E+01   2.5455E-05   9.4751E+01
          Mean  3.4882E-01   8.0980E+00   2.5222E+01   4.2346E-01   9.4816E+01
          Var   5.7922E+00   1.0853E+01   1.5533E+01   5.6138E+00   2.8160E+00
f11       Max   4.7662E+00   4.7784E+00   1.1067E+01   8.7792E+00   8.1665E+00
          Min   0            3.8174E-01   0            0            0
          Mean  2.7200E-03   6.0050E-01   3.1540E-02   4.2587E-03   3.7800E-03
          Var   8.6860E-02   3.1980E-01   2.4862E-01   1.2032E-01   1.3420E-01
f12       Max   9.6893E+00   8.1811E+00   1.1635E+01   9.1576E+00   8.4720E+00
          Min   8.8818E-16   5.1646E-01   8.8818E-16   8.8818E-16   8.8818E-16
          Mean  9.5600E-03   6.9734E-01   3.4540E-02   9.9600E-03   2.8548E-03
          Var   2.1936E-01   6.1050E-01   2.5816E-01   2.1556E-01   1.1804E-01
f13       Max   4.4609E+00   4.9215E+00   4.1160E+00   1.9020E+00   1.6875E+00
          Min   0            1.3718E-01   0            0            0
          Mean  1.9200E-03   1.7032E-01   1.8240E-02   1.4800E-03   5.7618E-04
          Var   6.6900E-02   1.3032E-01   1.8202E-01   3.3900E-02   2.2360E-02
f14       Max   1.0045E+01   1.9266E+03   1.9212E+01   5.7939E+02   8.2650E+00
          Min   2.3558E-31   1.3188E-01   2.3558E-31   2.3558E-31   2.3558E-31
          Mean  4.1600E-03   3.5402E-01   1.0840E-02   6.1420E-02   1.3924E-03
          Var   1.7174E-01   1.9427E+01   3.9528E-01   5.8445E+00   8.7160E-02
f15       Max   6.5797E+04   4.4041E+03   1.4412E+05   8.6107E+03   2.6372E+00
          Min   1.3498E-31   9.1580E-02   1.3498E-31   1.3498E-31   7.7800E-03
          Mean  7.1736E+00   8.9370E-01   1.7440E+01   9.0066E-01   8.2551E-03
          Var   6.7678E+02   5.4800E+01   1.4742E+03   8.7683E+01   2.8820E-02
f16       Max   6.2468E-01   6.4488E-01   5.1564E-01   8.4452E-01   3.9560E-01
          Min   6.9981E-08   2.5000E-03   1.5518E-240  2.7655E-07   0
          Mean  2.7062E-04   6.9400E-03   6.8555E-04   2.0497E-04   6.1996E-05
          Var   1.0380E-02   1.7520E-02   8.4600E-03   1.0140E-02   4.4000E-03
f17       Max   5.1946E+00   3.6014E+00   2.3463E+00   6.9106E+00   1.2521E+00
          Min   2.6445E-11   2.6739E-02   0            1.0855E-10   0
          Mean  1.9343E-03   5.1800E-02   1.2245E-03   2.8193E-03   1.5138E-04
          Var   7.3540E-02   1.2590E-01   4.1620E-02   1.1506E-01   1.2699E-02
f18       Max   5.0214E-01   3.4034E-01   4.1400E-01   3.7422E-01   4.0295E-01
          Min   1.4998E-32   1.9167E-03   1.4998E-32   1.4998E-32   1.4998E-32
          Mean  1.0967E-04   4.1000E-03   1.8998E-04   1.4147E-04   6.0718E-05
          Var   6.1800E-03   1.0500E-02   6.5200E-03   5.7200E-03   4.4014E-03


Step 5. The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and we output the classification results.
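The normalization in Step 3 is not specified further in the text; a common choice, assumed here, is per-attribute min-max scaling to [0, 1]:

```python
def min_max_normalize(column):
    """Scale one logging attribute to [0, 1] to avoid computational
    saturation (cf. Step 3 of the classification system)."""
    lo, hi = min(column), max(column)
    if hi == lo:
        # Constant attribute: map every value to 0 to avoid division by zero.
        return [0.0 for _ in column]
    return [(v - lo) / (hi - lo) for v in column]

norm = min_max_normalize([10.0, 15.0, 20.0])
```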

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA was simulated and analyzed on benchmark functions, and in Section 3.3, the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA was tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted and recorded as W1, W2, and W3. W1 is a gas well in Xi'an (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depths and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before the training of the DMSDL-QBSA-RF classification model on the training dataset, as shown in Table 14. Then, these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, SVM model, BSA-RF model, and DMSDL-BSA-RF model. Here, the RF classification model is applied to the field of logging for the first time. In order to evaluate the performance of the recognition model, we select the following performance indicators:

$$\mathrm{RMSE} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} \left( f_i - y_i \right)^2}, \qquad \mathrm{MAE} = \frac{1}{N} \sum_{i=1}^{N} \left| f_i - y_i \right| \qquad (21)$$

Table 8. Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function  Term  GWO          WOA          SCA          GOA          SSA          DMSDL-QBSA
f1        Max   1.3396E+04   1.4767E+04   1.3310E+04   2.0099E+01   4.8745E+03   9.8570E-01
          Min   0            0            4.2905E-293  8.6468E-17   0            0
          Mean  3.5990E+00   4.7621E+00   1.4014E+02   7.0100E-02   4.5864E+00   1.0483E-04
          Var   1.7645E+02   2.0419E+02   8.5054E+02   4.4200E-01   1.4148E+02   9.8725E-03
f2        Max   3.6021E+02   2.5789E+03   6.5027E+01   9.3479E+01   3.4359E+01   3.2313E-01
          Min   0            0            9.8354E-192  2.8954E-03   2.4642E-181  0
          Mean  5.0667E-02   2.9480E-01   4.0760E-01   3.1406E+00   1.7000E-02   1.1278E-04
          Var   3.7270E+00   2.6091E+01   2.2746E+00   3.9264E+00   5.4370E-01   4.8000E-03
f3        Max   1.8041E+04   1.6789E+04   2.4921E+04   6.5697E+03   1.1382E+04   4.0855E+01
          Min   0            1.0581E-18   7.6116E-133  2.8796E-01   3.2956E-253  1.5918E-264
          Mean  7.4511E+00   2.8838E+02   4.0693E+02   4.8472E+02   9.2062E+00   4.8381E-03
          Var   2.6124E+02   1.4642E+03   1.5913E+03   7.1786E+02   2.9107E+02   4.1383E-01
f4        Max   2.1812E+09   5.4706E+09   8.4019E+09   1.1942E+09   4.9386E+08   7.5188E+01
          Min   4.9125E+00   3.5695E+00   5.9559E+00   2.2249E+02   4.9806E+00   2.9279E-13
          Mean  4.9592E+05   2.4802E+06   4.4489E+07   5.0021E+06   3.3374E+05   2.4033E-02
          Var   2.9484E+07   1.1616E+08   4.5682E+08   3.9698E+07   1.1952E+07   9.7253E-01
f5        Max   1.8222E+04   1.5374E+04   1.5874E+04   1.2132E+03   1.6361E+04   1.8007E+00
          Min   1.1334E-08   8.3228E-09   2.3971E-01   2.7566E-10   2.6159E-16   1.0272E-33
          Mean  5.1332E+00   5.9967E+00   1.2620E+02   5.8321E+01   8.8985E+00   2.3963E-04
          Var   2.3617E+02   2.3285E+02   8.8155E+02   1.0872E+02   2.9986E+02   1.8500E-02
f6        Max   7.4088E+00   8.3047E+00   8.8101E+00   6.8900E-01   4.4298E+00   2.5787E-01
          Min   1.8112E-05   3.9349E-05   4.8350E-05   8.9528E-02   4.0807E-05   1.0734E-04
          Mean  1.8333E-03   3.2667E-03   4.8400E-02   9.3300E-02   2.1000E-03   5.2825E-04
          Var   9.4267E-02   1.0077E-01   2.9410E-01   3.5900E-02   6.5500E-02   4.6000E-03
f7        Max   7.3626E+02   6.8488E+02   8.0796E+02   3.9241E+02   8.2036E+02   1.4770E+01
          Min   0            0            1.9441E-292  7.9956E-07   0            0
          Mean  2.0490E-01   2.8060E-01   4.9889E+00   1.6572E+01   2.7290E-01   1.8081E-03
          Var   9.5155E+00   1.0152E+01   3.5531E+01   2.3058E+01   1.0581E+01   1.6033E-01
f8        Max   1.2749E+03   5.9740E+02   3.2527E+02   2.3425E+02   2.0300E+02   2.0423E+01
          Min   0            4.3596E-35   1.5241E-160  3.6588E-05   1.0239E-244  0
          Mean  3.1317E-01   1.0582E+01   1.0457E+01   1.2497E+01   2.1870E-01   2.6947E-03
          Var   1.4416E+01   4.3485E+01   3.5021E+01   2.5766E+01   6.2362E+00   2.1290E-01


where y_i and f_i are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show the actual forecasting errors. Table 15 records the performance indicator data of each classification model, and the best results are marked in bold. The smaller the RMSE and MAE, the better the classification model performs.
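Equation (21) translates directly into code; a minimal sketch:

```python
import math

def rmse_mae(expected, predicted):
    """Compute RMSE and MAE between the expected outputs f_i and the
    classification outputs y_i, as in equation (21)."""
    n = len(expected)
    rmse = math.sqrt(sum((f - y) ** 2 for f, y in zip(expected, predicted)) / n)
    mae = sum(abs(f - y) for f, y in zip(expected, predicted)) / n
    return rmse, mae

# One wrong positive and one wrong negative out of four 0/1 labels.
r, m = rmse_mae([1, 0, 1, 1], [1, 1, 1, 0])
```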

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model gets the best recognition accuracy, and all of its accuracies are above 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it has superior performance for oil and gas layer classification in the other performance indicators and in different wells. Secondly, DMSDL-QBSA can improve the performance of RF: the parameters found by DMSDL-QBSA used in the RF classification model can improve the classification accuracy while keeping the running speed

relatively fast at the same time. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than those of the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means this depth has no oil or gas and 1 means this depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model is not much different from the oil test results; it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is suitable for petroleum logging applications, greatly reduces the difficulty of oil exploration, and has a good application foreground.

Table 9. Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function  Term  GWO          WOA          SCA          GOA          SSA          DMSDL-QBSA
f10       Max   2.9544E+03   2.9903E+03   2.7629E+03   2.9445E+03   3.2180E+03   3.0032E+03
          Min   1.3037E+03   8.6892E-04   1.5713E+03   1.3438E+03   1.2839E-04   1.2922E+03
          Mean  1.6053E+03   1.4339E+01   1.7860E+03   1.8562E+03   2.5055E+02   1.2960E+03
          Var   2.6594E+02   1.2243E+02   1.6564E+02   5.1605E+02   3.3099E+02   5.8691E+01
f11       Max   1.3792E+02   1.2293E+02   1.2313E+02   1.1249E+02   3.2180E+03   1.8455E+01
          Min   0            0            0            1.2437E+01   1.2839E-04   0
          Mean  4.4220E-01   1.1252E+00   9.9316E+00   2.8378E+01   2.5055E+02   3.2676E-03
          Var   4.3784E+00   6.5162E+00   1.9180E+01   1.6240E+01   3.3099E+02   1.9867E-01
f12       Max   2.0257E+01   2.0043E+01   1.9440E+01   1.6623E+01   3.2180E+03   1.9113E+00
          Min   4.4409E-15   3.2567E-15   3.2567E-15   2.3168E+00   1.2839E-04   8.8818E-16
          Mean  1.7200E-02   4.2200E-02   8.8870E-01   5.5339E+00   2.5055E+02   3.1275E-04
          Var   4.7080E-01   6.4937E-01   3.0887E+00   2.8866E+00   3.3099E+02   2.2433E-02
f13       Max   1.5246E+02   1.6106E+02   1.1187E+02   6.1505E+01   3.2180E+03   3.1560E-01
          Min   3.3000E-03   0            0            2.4147E-01   1.2839E-04   0
          Mean  4.6733E-02   9.3867E-02   1.2094E+00   3.7540E+00   2.5055E+02   3.3660E-05
          Var   1.9297E+00   2.6570E+00   6.8476E+00   4.1936E+00   3.3099E+02   3.1721E-03
f14       Max   9.5993E+07   9.9026E+07   5.9355E+07   6.1674E+06   3.2180E+03   6.4903E-01
          Min   3.8394E-09   1.1749E-08   9.6787E-03   1.8099E-04   1.2839E-04   4.7116E-32
          Mean  1.2033E+04   3.5007E+04   4.8303E+05   1.0465E+04   2.5055E+02   8.9321E-05
          Var   9.8272E+05   1.5889E+06   4.0068E+06   1.9887E+05   3.3099E+02   6.8667E-03
f15       Max   2.2691E+08   2.4717E+08   1.1346E+08   2.8101E+07   3.2180E+03   1.6407E-01
          Min   3.2467E-02   4.5345E-08   1.1922E-01   3.5465E-05   1.2839E-04   1.3498E-32
          Mean  2.9011E+04   4.3873E+04   6.5529E+05   7.2504E+04   2.5055E+02   6.7357E-05
          Var   2.3526E+06   2.7453E+06   7.1864E+06   1.2814E+06   3.3099E+02   2.4333E-03
f16       Max   1.7692E+01   1.7142E+01   1.6087E+01   8.7570E+00   3.2180E+03   1.0959E+00
          Min   2.6210E-07   0.0000E+00   6.2663E-155  1.0497E-02   1.2839E-04   0
          Mean  1.0133E-02   3.9073E-01   5.9003E-01   2.4770E+00   2.5055E+02   1.5200E-04
          Var   3.0110E-01   9.6267E-01   1.4701E+00   1.9985E+00   3.3099E+02   1.1633E-02
f18       Max   4.4776E+01   4.3588E+01   3.9095E+01   1.7041E+01   3.2180E+03   6.5613E-01
          Min   1.9360E-01   9.4058E-08   2.2666E-01   3.9111E+00   1.2839E-04   1.4998E-32
          Mean  2.1563E-01   4.7800E-02   9.8357E-01   5.4021E+00   2.5055E+02   9.4518E-05
          Var   5.8130E-01   1.0434E+00   3.0643E+00   1.6674E+00   3.3099E+02   7.0251E-03


Table 10. Comparison of numerical testing results on CEC2014 test sets (F1-F15, Dim = 10).

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
F1        Max   2.7411E+08   3.6316E+09   6.8993E+08   3.9664E+08   9.6209E+08
          Min   1.4794E+07   3.1738E+09   1.3205E+06   1.6929E+05   1.7687E+05
          Mean  3.1107E+07   3.2020E+09   6.7081E+06   1.3567E+06   1.9320E+06
          Var   1.2900E+07   4.0848E+07   2.6376E+07   1.0236E+07   1.3093E+07
F2        Max   8.7763E+09   1.3597E+10   1.3515E+10   1.6907E+10   1.7326E+10
          Min   1.7455E+09   1.1878E+10   3.6982E+04   2.1268E+05   1.4074E+05
          Mean  1.9206E+09   1.1951E+10   1.7449E+08   5.9354E+07   4.8412E+07
          Var   1.9900E+08   1.5615E+08   9.2365E+08   5.1642E+08   4.2984E+08
F3        Max   4.8974E+05   8.5863E+04   2.6323E+06   2.4742E+06   5.3828E+06
          Min   1.1067E+04   1.4616E+04   1.8878E+03   1.2967E+03   6.3986E+02
          Mean  1.3286E+04   1.4976E+04   1.0451E+04   3.5184E+03   3.0848E+03
          Var   6.5283E+03   1.9988E+03   4.5736E+04   4.1580E+04   8.0572E+04
F4        Max   3.5252E+03   9.7330E+03   4.5679E+03   4.9730E+03   4.3338E+03
          Min   5.0446E+02   8.4482E+03   4.0222E+02   4.0843E+02   4.1267E+02
          Mean  5.8061E+02   8.5355E+03   4.4199E+02   4.3908E+02   4.3621E+02
          Var   1.4590E+02   1.3305E+02   1.6766E+02   1.2212E+02   8.5548E+01
F5        Max   5.2111E+02   5.2106E+02   5.2075E+02   5.2098E+02   5.2110E+02
          Min   5.2001E+02   5.2038E+02   5.2027E+02   5.2006E+02   5.2007E+02
          Mean  5.2003E+02   5.2041E+02   5.2033E+02   5.2014E+02   5.2014E+02
          Var   7.8380E-02   5.7620E-02   5.6433E-02   1.0577E-01   9.9700E-02
F6        Max   6.1243E+02   6.1299E+02   6.1374E+02   6.1424E+02   6.1569E+02
          Min   6.0881E+02   6.1157E+02   6.0514E+02   6.0288E+02   6.0257E+02
          Mean  6.0904E+02   6.1164E+02   6.0604E+02   6.0401E+02   6.0358E+02
          Var   2.5608E-01   1.2632E-01   1.0434E+00   1.1717E+00   1.3117E+00
F7        Max   8.7895E+02   1.0459E+03   9.1355E+02   1.0029E+03   9.3907E+02
          Min   7.4203E+02   1.0238E+03   7.0013E+02   7.0081E+02   7.0069E+02
          Mean  7.4322E+02   1.0253E+03   7.1184E+02   7.0332E+02   7.0290E+02
          Var   4.3160E+00   2.4258E+00   2.8075E+01   1.5135E+01   1.3815E+01
F8        Max   8.9972E+02   9.1783E+02   9.6259E+02   9.2720E+02   9.3391E+02
          Min   8.4904E+02   8.8172E+02   8.3615E+02   8.1042E+02   8.0877E+02
          Mean  8.5087E+02   8.8406E+02   8.5213E+02   8.1888E+02   8.1676E+02
          Var   2.1797E+00   4.4494E+00   8.6249E+00   9.2595E+00   9.7968E+00
F9        Max   1.0082E+03   9.9538E+02   1.0239E+03   1.0146E+03   1.0366E+03
          Min   9.3598E+02   9.6805E+02   9.2725E+02   9.2062E+02   9.1537E+02
          Mean  9.3902E+02   9.7034E+02   9.4290E+02   9.2754E+02   9.2335E+02
          Var   3.4172E+00   3.2502E+00   8.1321E+00   9.0492E+00   9.3860E+00
F10       Max   3.1087E+03   3.5943E+03   3.9105E+03   3.9116E+03   3.4795E+03
          Min   2.2958E+03   2.8792E+03   1.5807E+03   1.4802E+03   1.4316E+03
          Mean  1.8668E+03   2.9172E+03   1.7627E+03   1.6336E+03   1.6111E+03
          Var   3.2703E+01   6.1787E+01   2.2359E+02   1.9535E+02   2.2195E+02
F11       Max   3.6641E+03   3.4628E+03   4.0593E+03   3.5263E+03   3.7357E+03
          Min   2.4726E+03   2.8210E+03   1.7855E+03   1.4012E+03   1.2811E+03
          Mean  2.5553E+03   2.8614E+03   1.8532E+03   1.5790E+03   1.4869E+03
          Var   8.4488E+01   5.6351E+01   1.3201E+02   2.1085E+02   2.6682E+02
F12       Max   1.2044E+03   1.2051E+03   1.2053E+03   1.2055E+03   1.2044E+03
          Min   1.2006E+03   1.2014E+03   1.2004E+03   1.2003E+03   1.2017E+03
          Mean  1.2007E+03   1.2017E+03   1.2007E+03   1.2003E+03   1.2018E+03
          Var   1.4482E-01   3.5583E-01   2.5873E-01   2.4643E-01   2.1603E-01
F13       Max   1.3049E+03   1.3072E+03   1.3073E+03   1.3056E+03   1.3061E+03
          Min   1.3005E+03   1.3067E+03   1.3009E+03   1.3003E+03   1.3004E+03
          Mean  1.3006E+03   1.3068E+03   1.3011E+03   1.3005E+03   1.3006E+03
          Var   3.2018E-01   5.0767E-02   4.2173E-01   3.6570E-01   2.9913E-01
F14       Max   1.4395E+03   1.4565E+03   1.4775E+03   1.4749E+03   1.4493E+03
          Min   1.4067E+03   1.4522E+03   1.4009E+03   1.4003E+03   1.4005E+03
          Mean  1.4079E+03   1.4529E+03   1.4024E+03   1.4008E+03   1.4009E+03
          Var   2.1699E+00   6.3013E-01   5.3198E+00   3.2578E+00   2.8527E+00
F15       Max   5.4068E+04   3.3586E+04   4.8370E+05   4.0007E+05   1.9050E+05
          Min   1.9611E+03   2.1347E+04   1.5029E+03   1.5027E+03   1.5026E+03
          Mean  1.9933E+03   2.2417E+04   2.7920E+03   1.5860E+03   1.5816E+03
          Var   6.1622E+02   1.2832E+03   1.7802E+04   4.4091E+03   2.7233E+03


Table 11: Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10).

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA

F16
Max  1.6044E+03  1.6044E+03  1.6046E+03  1.6044E+03  1.6046E+03
Min  1.6034E+03  1.6041E+03  1.6028E+03  1.6021E+03  1.6019E+03
Mean 1.6034E+03  1.6041E+03  1.6032E+03  1.6024E+03  1.6024E+03
Var  3.3300E-02  3.5900E-02  2.5510E-01  3.7540E-01  3.6883E-01

F17
Max  1.3711E+07  1.6481E+06  3.3071E+07  4.6525E+07  8.4770E+06
Min  1.2526E+05  4.7216E+05  2.7261E+03  2.0177E+03  2.0097E+03
Mean 1.6342E+05  4.8499E+05  8.7769E+04  3.1084E+04  2.1095E+04
Var  2.0194E+05  2.9949E+04  1.2284E+06  8.7638E+05  2.0329E+05

F18
Max  7.7173E+07  3.5371E+07  6.1684E+08  1.4216E+08  6.6050E+08
Min  4.1934E+03  2.6103E+06  2.2743E+03  1.8192E+03  1.8288E+03
Mean 1.6509E+04  3.4523E+06  1.2227E+06  1.0781E+05  1.9475E+05
Var  7.9108E+05  1.4888E+06  2.1334E+07  3.6818E+06  1.0139E+07

F19
Max  1.9851E+03  2.5657E+03  2.0875E+03  1.9872E+03  1.9555E+03
Min  1.9292E+03  2.4816E+03  1.9027E+03  1.9023E+03  1.9028E+03
Mean 1.9299E+03  2.4834E+03  1.9044E+03  1.9032E+03  1.9036E+03
Var  1.0820E+00  3.8009E+00  1.1111E+01  3.3514E+00  1.4209E+00

F20
Max  8.3021E+06  2.0160E+07  1.0350E+07  6.1162E+07  1.2708E+08
Min  5.6288E+03  1.7838E+06  2.1570E+03  2.0408E+03  2.0241E+03
Mean 1.2260E+04  1.8138E+06  5.6957E+03  1.1337E+04  4.5834E+04
Var  1.0918E+05  5.9134E+05  1.3819E+05  6.4167E+05  2.1988E+06

F21
Max  2.4495E+06  1.7278E+09  2.0322E+06  1.3473E+07  2.6897E+07
Min  4.9016E+03  1.4049E+09  3.3699E+03  2.1842E+03  2.2314E+03
Mean 6.6613E+03  1.4153E+09  9.9472E+03  1.3972E+04  7.4587E+03
Var  4.1702E+04  2.2557E+07  3.3942E+04  2.5098E+05  2.8735E+05

F22
Max  2.8304E+03  4.9894E+03  3.1817E+03  3.1865E+03  3.2211E+03
Min  2.5070E+03  4.1011E+03  2.3492E+03  2.2442E+03  2.2314E+03
Mean 2.5081E+03  4.1175E+03  2.3694E+03  2.2962E+03  2.2687E+03
Var  5.4064E+00  3.8524E+01  4.3029E+01  4.8006E+01  5.1234E+01

F23
Max  2.8890E+03  2.6834E+03  2.8758E+03  3.0065E+03  2.9923E+03
Min  2.5000E+03  2.6031E+03  2.4870E+03  2.5000E+03  2.5000E+03
Mean 2.5004E+03  2.6088E+03  2.5326E+03  2.5010E+03  2.5015E+03
Var  9.9486E+00  8.6432E+00  5.7045E+01  1.4213E+01  1.7156E+01

F24
Max  2.6293E+03  2.6074E+03  2.6565E+03  2.6491E+03  2.6369E+03
Min  2.5816E+03  2.6049E+03  2.5557E+03  2.5246E+03  2.5251E+03
Mean 2.5829E+03  2.6052E+03  2.5671E+03  2.5337E+03  2.5338E+03
Var  1.3640E+00  3.3490E-01  8.9434E+00  1.3050E+01  1.1715E+01

F25
Max  2.7133E+03  2.7014E+03  2.7445E+03  2.7122E+03  2.7327E+03
Min  2.6996E+03  2.7003E+03  2.6784E+03  2.6789E+03  2.6635E+03
Mean 2.6996E+03  2.7004E+03  2.6831E+03  2.6903E+03  2.6894E+03
Var  2.3283E-01  9.9333E-02  4.9609E+00  7.1175E+00  1.2571E+01

F26
Max  2.7056E+03  2.8003E+03  2.7058E+03  2.7447E+03  2.7116E+03
Min  2.7008E+03  2.8000E+03  2.7003E+03  2.7002E+03  2.7002E+03
Mean 2.7010E+03  2.8000E+03  2.7005E+03  2.7003E+03  2.7003E+03
Var  3.1316E-01  1.6500E-02  3.9700E-01  1.1168E+00  3.5003E-01

F27
Max  3.4052E+03  5.7614E+03  3.3229E+03  3.4188E+03  3.4144E+03
Min  2.9000E+03  3.9113E+03  2.9698E+03  2.8347E+03  2.7054E+03
Mean 2.9038E+03  4.0351E+03  2.9816E+03  2.8668E+03  2.7219E+03
Var  3.2449E+01  2.0673E+02  3.4400E+01  6.2696E+01  6.1325E+01

F28
Max  4.4333E+03  5.4138E+03  4.5480E+03  4.3490E+03  4.8154E+03
Min  3.0000E+03  4.0092E+03  3.4908E+03  3.0000E+03  3.0000E+03
Mean 3.0079E+03  4.0606E+03  3.6004E+03  3.0063E+03  3.0065E+03
Var  6.8101E+01  9.2507E+01  8.5705E+01  4.2080E+01  5.3483E+01

F29
Max  2.5038E+07  1.6181E+08  5.9096E+07  7.0928E+07  6.4392E+07
Min  3.2066E+03  1.0014E+08  3.1755E+03  3.1005E+03  3.3287E+03
Mean 5.7057E+04  1.0476E+08  8.5591E+04  6.8388E+04  2.8663E+04
Var  4.9839E+05  6.7995E+06  1.7272E+06  1.5808E+06  8.6740E+05


[Figure 3: The effect of the two parameters on the performance of RF models: (a) the effect of ntree (accuracy, roughly 0.86 to 0.98, vs. ntree from 0 to 1000); (b) the effect of mtry (accuracy, roughly 0.89 to 0.95, vs. mtry from 0 to 35).]

Variable setting: number of iterations iter; bird positions X_i; local optimum P_i; global optimal position gbest; global optimum Pgbest; and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA.
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF.
(1) Begin
(2) /* Build classification model based on DMSDL-QBSA-RF */
(3) Initialize the positions of N birds using equations (16) and (17): X_i (i = 1, 2, ..., N)
(4) Calculate fitness f(X_i) (i = 1, 2, ..., N), set X_i to be P_i, and find Pgbest
(5) While iter < itermax + 1 do
(6)   For i = 1 : ntree
(7)     Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)     Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)     Recursively generate each decision tree without pruning operations
(10)  End For
(11)  Update classification accuracy of RF: evaluate f(X_i)
(12)  Update gbest and Pgbest
(13)  [nbest, mbest] = gbest
(14)  iter = iter + 1
(15) End While
(16) /* Classify using RF model */
(17) For i = 1 : nbest
(18)   Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)   Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)   Recursively generate each decision tree without pruning operations
(21) End
(22) Return DMSDL-QBSA-RF classification model
(23) Classify the test dataset using equation (20)
(24) Calculate OOB_error
(25) End

Algorithm 2: DMSDL-QBSA-RF classification model.
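Steps (7)-(8) and (18)-(19) of Algorithm 2 hinge on two randomized operations: drawing a bootstrap training set of size N (whose left-out samples form the out-of-bag set used for OOB_error in step (24)) and sampling mtry candidate attributes at a split node. A minimal stand-alone sketch of just these two operations; the function names `bootstrap_split` and `choose_attributes` are ours, and a real model would feed their results into a decision-tree learner:

```python
import random

def bootstrap_split(n_samples, rng):
    """Draw a bootstrap sample of size n_samples (with replacement);
    return the in-bag indices and the out-of-bag (OOB) indices."""
    in_bag = [rng.randrange(n_samples) for _ in range(n_samples)]
    oob = sorted(set(range(n_samples)) - set(in_bag))
    return in_bag, oob

def choose_attributes(attributes, mtry, rng):
    """Randomly select mtry candidate attributes at a split node."""
    return rng.sample(attributes, mtry)

rng = random.Random(42)
in_bag, oob = bootstrap_split(250, rng)          # one tree's training set
candidates = choose_attributes(
    ["GR", "DT", "SP", "LLD", "LLS", "DEN", "NPHI"], 3, rng)
```

On average roughly 36.8% of the samples fall out of bag, which is what makes the OOB error a built-in validation estimate and a natural fitness signal for the optimizer.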

Table 11: Continued.

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA

F30
Max  5.2623E+05  2.8922E+07  1.1938E+06  1.2245E+06  1.1393E+06
Min  5.9886E+03  1.9017E+07  3.7874E+03  3.6290E+03  3.6416E+03
Mean 7.4148E+03  2.0002E+07  5.5468E+03  4.3605E+03  4.1746E+03
Var  1.1434E+04  1.1968E+06  3.2255E+04  2.2987E+04  1.8202E+04


[Figure 4: Block diagram of the oil layer classification system based on DMSDL-QBSA-RF. Blocks: to-be-identified information; data selection and preprocessing (remove redundant attributes and pretreatment); attribute discretization; attribute reduction and data normalization; build model based on DMSDL-QBSA-RF; RF classification; output.]

Table 13: Distribution of the logging data.

Well | Training dataset: Depth (m), Data, Oil/gas layers, Dry layer | Test dataset: Depth (m), Data, Oil/gas layers, Dry layer
W1   | 3027–3058, 250, 203, 47  | 3250–3450, 1600, 237, 1363
W2   | 2642–2648, 30, 10, 20    | 2940–2978, 191, 99, 92
W3   | 1040–1059, 160, 47, 113  | 1120–1290, 1114, 96, 1018

Table 12: Comparison of numerical testing results on eight UCI datasets.

Dataset        RF (%)  BSA-RF (%)  DMSDL-BSA-RF (%)  DMSDL-QBSA-RF (%)
Blood          75.40   80.80       79.46             81.25
Heart-statlog  82.10   86.42       86.42             86.42
Sonar          78.39   85.48       85.48             88.71
Appendicitis   83.23   87.10       90.32             93.55
Cleve          79.55   87.50       88.64             89.77
Magic          88.95   89.85       90.25             89.32
Mammographic   74.73   77.68       76.79             79.46
Australian     85.83   88.84       88.84             90.78

Table 14: Attribute reduction results of the logging dataset.

Well | Attributes
W1 | Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
   | Reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2 | Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
   | Reduction attributes: NEUTRO, PERM, POR, RESI, SW
W3 | Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
   | Reduction attributes: AC, GR, RI, RXO, SP

[Figure 5: The normalized curves of attributes. (a) and (b): attribute normalization of W1 ((a) NPHI, SP, DT, LLD; (b) DEN, GR, LLS; depth 3250–3450 m). (c) and (d): attribute normalization of W2 ((c) NEUTRO, PERM; (d) RESI, SW, POR; depth 2940–2980 m). (e) and (f): attribute normalization of W3 ((e) GR, SP, RI; (f) RXO, AC; depth 1150–1300 m).]

Table 15: Performance of various well data.

Well | Classification model | RMSE   | MAE      | Accuracy (%) | Running time (s)
W1   | RF                   | 0.3326 | 0.1106   | 88.94        | 1.5167
W1   | SVM                  | 0.2681 | 0.0719   | 92.81        | 4.5861
W1   | BSA-RF               | 0.3269 | 0.1069   | 89.31        | 1.8064
W1   | DMSDL-BSA-RF         | 0.2806 | 0.0788   | 92.13        | 2.3728
W1   | DMSDL-QBSA-RF        | 0.2449 | 0.0600   | 94.00        | 1.4663
W2   | RF                   | 0.4219 | 0.1780   | 82.20        | 3.1579
W2   | SVM                  | 0.2983 | 0.0890   | 91.10        | 4.2604
W2   | BSA-RF               | 0.3963 | 0.1571   | 84.29        | 1.2979
W2   | DMSDL-BSA-RF         | 0.2506 | 0.062827 | 93.72        | 1.6124
W2   | DMSDL-QBSA-RF        | 0.2399 | 0.0576   | 94.24        | 1.2287
W3   | RF                   | 0.4028 | 0.1622   | 83.78        | 2.4971
W3   | SVM                  | 0.2507 | 0.0628   | 93.72        | 2.1027
W3   | BSA-RF               | 0.3631 | 0.1318   | 86.81        | 1.3791
W3   | DMSDL-BSA-RF         | 0.2341 | 0.0548   | 94.52        | 0.3125
W3   | DMSDL-QBSA-RF        | 0.0519 | 0.0027   | 99.73        | 0.9513
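The RMSE, MAE, and accuracy columns of Table 15 are standard metrics; for 0/1 oil-layer labels they can be computed as follows (a sketch with our own function name):

```python
import math

def evaluate(labels, preds):
    """RMSE, MAE, and accuracy for binary (0/1) layer labels."""
    n = len(labels)
    rmse = math.sqrt(sum((y - p) ** 2 for y, p in zip(labels, preds)) / n)
    mae = sum(abs(y - p) for y, p in zip(labels, preds)) / n
    acc = sum(y == p for y, p in zip(labels, preds)) / n
    return rmse, mae, acc

rmse, mae, acc = evaluate([1, 0, 1, 1], [1, 0, 0, 1])  # one error in four
```

For binary labels, MAE equals the error rate (so accuracy = 1 - MAE) and RMSE is its square root, which matches the pattern visible in the table (e.g., sqrt(0.1106) = 0.3326 for W1/RF).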



5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and the local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of these three strategies on the performance of DMSDL-QBSA revealed that the hybrid method markedly improves on the original BSA and especially on the original DE. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA can provide more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and DMSDL-QBSA shows excellent performance on these functions as well. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem have proved that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than that of the original RF model. At the same time, it runs faster than the other four classification models on most wells.

Although the proposed DMSDL-QBSA has been proven to be effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Due to the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA. Therefore, deploying the proposed algorithm to increase recognition efficiency is a worthwhile direction. In future research work, the method presented in this paper can also be extended to solving discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical science diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and Project of Industry-University Cooperative Education of Ministry of Education of China (no. 201801335014).


Figure 6: Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.
[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.
[4] X. Li, A new intelligent optimization method: artificial fish school algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.
[5] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.
[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.
[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673–687, 2016.
[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University - Computer and Information Sciences, vol. 32, pp. 514–521, 2020.
[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," in Neural Computing and Applications, Springer, New York, NY, USA, 2019.
[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.
[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.
[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197–207, 2014.
[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.
[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17–28, 2014.
[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137–146, 2012.
[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, Multi-Strategy Brainstorm Optimization Algorithm with Dynamic Parameters Adjustment, Applied Intelligence, Guildford, UK, 2010.
[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189–120201, 2019.
[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.
[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.
[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.
[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.
[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.
[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.
[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.
[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.
[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," IEEE Transactions on Evolutionary Computation, Piscataway, NJ, USA, February 2019.
[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.
[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.
[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.
[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.
[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.
[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.
[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.
[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.
[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.
[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China Information Sciences, vol. 62, no. 7, 2019.
[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.
[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.
[41] S. Shahrzad, M. Seyedali, and L. Andrew, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.
[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22–34, 2020.
[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.



(i) In order to achieve a better balance between efficiency and velocity for BSA, we have studied the effects of four different hybrid strategies, built from the dynamic multi-swarm method, differential evolution, and quantum behavior, on the performance of BSA.

(ii) The proposed DMSDL-QBSA has successfully optimized the ntree and mtry setting problem of RF. The resulting hybrid classification model has been rigorously evaluated on oil logging prediction.

(iii) The proposed hybrid classification model delivers better classification performance and offers more accurate and faster results when compared to other swarm intelligence algorithm-based RF models.

2 Bird Swarm Algorithm and Its Improvement

2.1. Bird Swarm Algorithm Principle. BSA, as proposed by Meng et al. in 2015, is a new intelligent bionic algorithm based on multigroup and multisearch methods; it mimics the birds' foraging behavior, vigilance behavior, and flight behavior and employs this swarm intelligence to solve the optimization problem. The bird swarm algorithm can be simplified by five rules:

Rule 1: each bird can switch between vigilant behavior and foraging behavior, and both foraging and vigilance are mimicked as random decisions.
Rule 2: when foraging, each bird records and updates its previous best experience and the swarm's previous best experience with food patches. This experience can also be used to search for food. Instant sharing of social information occurs across the whole group.
Rule 3: when keeping vigilance, each bird tries to move towards the center of the swarm. This behavior may be influenced by disturbances caused by swarm competition. Birds with more food reserves are more likely to be near the swarm's center than birds with fewer reserves.
Rule 4: birds fly to another place regularly. When flying to another location, birds often switch between producing and scrounging. The bird with the most reserves is a producer and the bird with the least is a scrounger; the other birds, between the highest and lowest reserves, are randomly assigned as producers and scroungers.
Rule 5: producers actively seek food; scroungers randomly follow a producer to search for food.

According to Rule 1, we define the time interval of each bird's flight behavior FQ, the probability of foraging behavior P (P ∈ (0, 1)), and a uniform random number δ ∈ (0, 1).

(1) Foraging behavior. If the number of iterations is less than FQ and δ ≤ P, the bird performs the foraging behavior. Rule 2 can be written mathematically as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \left(p_{i,j}^{t} - x_{i,j}^{t}\right)\times C \times \mathrm{rand}(0,1) + \left(g_{j}^{t} - x_{i,j}^{t}\right)\times S \times \mathrm{rand}(0,1), \tag{1}$$

where C and S are two positive numbers; the former is called the cognitive accelerated coefficient and the latter the social accelerated coefficient. Here, p_{i,j} is the i-th bird's best previous position and g_j is the j-th element of the swarm's best previous position.

(2) Vigilance behavior. If the number of iterations is less than FQ and δ > P, the bird performs the vigilance behavior. Rule 3 can be written mathematically as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + A_{1}\left(\mathrm{mean}_{j}^{t} - x_{i,j}^{t}\right)\times \mathrm{rand}(0,1) + A_{2}\left(p_{k,j}^{t} - x_{i,j}^{t}\right)\times \mathrm{rand}(-1,1), \tag{2}$$

$$A_{1} = a_{1}\times \exp\left(-\frac{\mathrm{pFit}_{i}}{\mathrm{sumFit}+\varepsilon}\times N\right), \tag{3}$$

$$A_{2} = a_{2}\times \exp\left(\frac{\mathrm{pFit}_{i}-\mathrm{pFit}_{k}}{\left|\mathrm{pFit}_{k}-\mathrm{pFit}_{i}\right|+\varepsilon}\times \frac{N\times \mathrm{pFit}_{k}}{\mathrm{sumFit}+\varepsilon}\right), \tag{4}$$

where a_1 and a_2 are two positive constants in [0, 2], pFit_i is the best fitness value of the i-th bird, and sumFit is the sum of the swarm's best fitness values. Here, ε, which is used to avoid zero-division error, is the smallest constant in the computer, and mean_j denotes the j-th element of the whole swarm's average position.
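The vigilance coefficients in equations (3) and (4) depend only on fitness statistics, so they can be computed once per bird per iteration. A direct transcription (the function name is ours; `eps` plays the role of ε, with Python's double-precision machine epsilon standing in for "the smallest constant in the computer"):

```python
import math

def vigilance_coefficients(pFit, i, k, a1=1.0, a2=1.0,
                           eps=2.220446049250313e-16):
    """A1 and A2 from eqs. (3) and (4) for bird i and a random bird k."""
    N = len(pFit)
    sumFit = sum(pFit)
    # Eq. (3): A1 shrinks as bird i's best fitness grows relative to the swarm.
    A1 = a1 * math.exp(-pFit[i] / (sumFit + eps) * N)
    # Eq. (4): A2 compares bird i with the randomly chosen bird k (k != i).
    A2 = a2 * math.exp((pFit[i] - pFit[k]) / (abs(pFit[k] - pFit[i]) + eps)
                       * N * pFit[k] / (sumFit + eps))
    return A1, A2

A1, A2 = vigilance_coefficients([0.5, 1.0, 2.0, 4.0], i=0, k=2)
```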

(3) Flight behavior. If the number of iterations equals FQ, the bird performs the flight behavior, which is divided into the behaviors of producers and scroungers according to fitness. Rule 4 and Rule 5 can be written mathematically as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \mathrm{randn}(0,1)\times x_{i,j}^{t}, \tag{5}$$

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \left(x_{k,j}^{t} - x_{i,j}^{t}\right)\times \mathrm{FL}\times \mathrm{rand}(0,1), \tag{6}$$

where FL (FL ∈ [0, 2]) means that the scrounger follows the producer to search for food.
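The three behaviors can be sketched directly from equations (1), (2), (5), and (6). This is an illustrative reading of the update rules, not the authors' code; C, S, A1, A2, and FL are taken as given inputs:

```python
import random

rng = random.Random(0)

def forage(x, p_best, g_best, C=1.5, S=1.5):
    # Eq. (1): a cognitive step toward the personal best and a social
    # step toward the swarm best, each scaled by a fresh rand(0, 1).
    return [xj + (pj - xj) * C * rng.random() + (gj - xj) * S * rng.random()
            for xj, pj, gj in zip(x, p_best, g_best)]

def keep_vigilance(x, swarm_mean, p_k, A1, A2):
    # Eq. (2): move toward the swarm centre, perturbed by the
    # personal best of a randomly chosen bird k.
    return [xj + A1 * (mj - xj) * rng.random()
               + A2 * (pkj - xj) * rng.uniform(-1, 1)
            for xj, mj, pkj in zip(x, swarm_mean, p_k)]

def fly_producer(x):
    # Eq. (5): producers search with a Gaussian random step randn(0, 1).
    return [xj + rng.gauss(0, 1) * xj for xj in x]

def fly_scrounger(x, x_producer, FL=1.0):
    # Eq. (6): scroungers follow a producer with step size FL in [0, 2].
    return [xj + (pk - xj) * FL * rng.random()
            for xj, pk in zip(x, x_producer)]
```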

2.2. The Bird Swarm Algorithm Based on Dynamic Multi-Swarm Method, Differential Evolution, and Quantum Behavior

2.2.1. The Foraging Behavior Based on Dynamic Multi-Swarm Method. The dynamic multi-swarm method has been widely used in real-world applications because it is efficient and easy to implement. In addition, it is very common in the


improvement of swarm intelligence optimization, for example, in the coevolutionary algorithm [26], the framework of evolutionary algorithms [27], multiobjective particle swarm optimization [28], hybrid dynamic robust optimization [29], and the PSO algorithm [30, 31]. However, the PSO algorithm easily falls into local optima, and its generalization performance is not high. Consequently, motivated by these literature studies, we establish a dynamic multi-swarm bird swarm algorithm (DMS-BSA), which improves the local search capability of foraging behavior.

In DMS-PSO, the whole population is divided into many small swarms, which are often regrouped using various reorganization plans to exchange information. The velocity update strategy is:

$$v_{i,j}^{t+1} = \omega v_{i,j}^{t} + c_{2} r_{2}\left(l_{i,j}^{t} - x_{i,j}^{t}\right), \tag{7}$$

where l_{i,j} is the best historical position achieved within the local community (small swarm) of the i-th particle.
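The dynamic multi-swarm idea behind equation (7) is easy to state in code: the population is partitioned into small swarms, and the membership is periodically reshuffled so information migrates between swarms. A minimal sketch (the swarm count and population size below are illustrative choices, not values from the paper):

```python
import random

def regroup(population_size, n_swarms, rng):
    """Shuffle bird indices and split them into n_swarms equal groups;
    each group maintains its own local best l_ij between regroupings."""
    indices = list(range(population_size))
    rng.shuffle(indices)
    size = population_size // n_swarms
    return [indices[k * size:(k + 1) * size] for k in range(n_swarms)]

swarms = regroup(30, 6, random.Random(3))  # six small swarms of five birds
```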

According to the characteristics of equation (1), we can see that the foraging behavior formula of BSA is similar to the particle velocity update formula of PSO. So, following the form of equation (7), we can get the improved foraging behavior formula as follows:

$$x_{i,j}^{t+1} = x_{i,j}^{t} + \left(GV_{i,j}^{t} - x_{i,j}^{t}\right)\times C \times \mathrm{rand}(0,1), \tag{8}$$

where GV is called the guiding vector. The dynamic multi-swarm method is used to improve the local search capability, while the guiding vector GV can enhance the global search capability of foraging behavior. Obviously, we need to build a good guiding vector.

2.2.2. The Guiding Vector Based on Differential Evolution. Differential evolution (DE) is a powerful evolutionary algorithm with three differential evolution operators for solving tough global optimization problems [32]. Besides, owing to its excellent global search capability, DE has received more and more attention from scholars and has been evolved and improved in evolutionary computation, for example, with hybrid multiple crossover operations [33] and the proposed DE/neighbor/1 [34]. From these literature studies, we can see that DE has a good global search capability, so we establish the guiding vector GV based on the differential evolution operators to improve the global search capability of foraging behavior. The detailed implementation of GV is presented as follows.

(1) Differential mutation. According to the characteristics of equation (8), the "DE/best/1," "DE/best/2," and "DE/current-to-best/1" mutation strategies are suitable. The experiments in the literature [31] showed that the "DE/best/1" mutation strategy is the most suitable in DMS-PSO, so we choose this mutation strategy in BSA. The "DE/lbest/1" mutation strategy can be written as follows:

$$\text{DE/lbest/1:}\quad v_{i,j}^{t} = l_{i,j}^{t} + F\times\left(p_{r1,j}^{t} - p_{r2,j}^{t}\right), \tag{9}$$

Note that some components of the mutant vector v_{i,j} may violate the predefined boundary constraints. In this case, boundary processing is used, which can be expressed as follows:

$$v_{i,j}^{t} = \begin{cases} x_{lb}, & v_{i,j}^{t} < x_{lb}, \\ x_{ub}, & v_{i,j}^{t} \geq x_{ub}. \end{cases} \tag{10}$$

(2) Crossover. After differential mutation, a binomial crossover operation exchanges some components of the mutant vector v_{i,j} with the best previous position p_{i,j} to generate the target vector u_{i,j}. The process can be expressed as:

$$u_{i,j}^{t} = \begin{cases} v_{i,j}^{t}, & \mathrm{rand} \leq CR \ \text{or}\ j = \mathrm{rand}(i), \\ p_{i,j}^{t}, & \text{otherwise}. \end{cases} \tag{11}$$

(3) Selection. Because the purpose of BSA is to find the best fitness, a selection operation chooses the vector with the better fitness to enter the next generation, which generates the selection operator, namely, the guiding vector GV. The process can be expressed as follows:

$$GV_{i,j}^{t+1} = \begin{cases} u_{i,j}^{t}, & \mathrm{fitness}\left(u_{i,j}^{t}\right) \leq \mathrm{fitness}\left(x_{i,j}^{t}\right), \\ p_{i,j}^{t}, & \text{otherwise}. \end{cases} \tag{12}$$
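Equations (9)-(12) can be combined into one routine that produces the guiding vector for a single bird. A hedged sketch (the function and variable names are ours; l is the local best of the bird's small swarm, and p_r1/p_r2 are the personal bests of two distinct randomly chosen birds):

```python
import random

def guiding_vector(l, p_r1, p_r2, p_i, x_i, fitness, rng,
                   F=0.5, CR=0.9, xlb=-100.0, xub=100.0):
    dim = len(l)
    # Mutation, eq. (9) ("DE/lbest/1"), with boundary repair, eq. (10).
    v = [min(max(l[j] + F * (p_r1[j] - p_r2[j]), xlb), xub)
         for j in range(dim)]
    # Binomial crossover with the personal best, eq. (11).
    j_rand = rng.randrange(dim)
    u = [v[j] if (rng.random() <= CR or j == j_rand) else p_i[j]
         for j in range(dim)]
    # Greedy selection, eq. (12): keep u only if it is at least as fit
    # as the current position; otherwise fall back to the personal best.
    return u if fitness(u) <= fitness(x_i) else p_i

sphere = lambda x: sum(v * v for v in x)  # toy minimization objective
gv = guiding_vector([0.1, 0.2], [0.0, 0.0], [0.3, 0.1],
                    [0.2, 0.2], [5.0, 5.0], sphere, random.Random(1))
```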

2.2.3. The Initialization of Search Space Based on Quantum Behavior. Quantum behavior is a nonlinear and excellent superposition system. With its simple and effective characteristics and good performance in global optimization, it has been applied to optimize many algorithms, such as particle swarm optimization [35] and the pigeon-inspired optimization algorithm [36]. Consequently, following these literature studies and its excellent global optimization performance, we use the quantum system to initialize the search space of the birds.

The quantum-behaved particle position can be written mathematically as follows:

$$x^{t+1} = p^{t} \pm \frac{L^{t}}{2}\ln\frac{1}{u}, \tag{13}$$

$$p^{t} = c_{1}\times \mathrm{pbest}^{t} + \left(1 - c_{1}\right)\times \mathrm{gbest}^{t}, \tag{14}$$

$$L^{t} = 2\beta\left|p^{t} - x^{t}\right|. \tag{15}$$

According to the characteristics of equations (13)-(15), we can get the improved search space initialization formulas as follows:


$$x_{i}^{t} = lb + (ub - lb)\times \mathrm{rand}, \tag{16}$$

$$x_{i}^{t+1} = x_{i}^{t} \pm \left|\mathrm{mbest} - x_{i}^{t}\right|\times \ln\frac{1}{u}, \tag{17}$$

where β is a positive number called the contraction-expansion factor. Here, x_i^t is the position of the bird at the previous moment, and mbest is the average value of the best previous positions of all the birds (Algorithm 1).

2.2.4. Procedures of the DMSDL-QBSA. In Sections 2.2.1-2.2.3, in order to improve the local and global search capabilities of BSA, this paper has improved BSA in three parts:

(1) In order to improve the local search capability of foraging behavior in BSA, we put forward equation (8) based on the dynamic multi-swarm method.

(2) In order to obtain the guiding vector that improves the global search capability of foraging behavior in BSA, we put forward equations (9), (11), and (12) based on differential evolution.

(3) In order to expand the initialization search space of the birds and improve the global search capability of BSA, we put forward equations (16) and (17) based on quantum behavior.

Finally, the steps of DMSDL-QBSA are shown in Algorithm 1.

2.3. Simulation Experiment and Analysis. This section presents the evaluation of DMSDL-QBSA using a series of experiments on benchmark functions and CEC2014 test functions. All experiments in this paper are implemented in MATLAB R2014b on Windows 7 (64-bit) with an Intel(R) Core(TM) i5-2450M CPU @ 2.50 GHz and 4.00 GB RAM. To obtain fair results, all the experiments were conducted under the same conditions. The population size is set to 30 in these algorithms, and each algorithm runs 30 times independently on each function.

2.3.1. Benchmark Functions and CEC2014 Test Functions. When investigating the effectiveness and universality of DMSDL-QBSA compared with several hybrid algorithms and popular algorithms, 18 benchmark functions and the CEC2014 test functions are applied. In order to test the effectiveness of the proposed DMSDL-QBSA, 18 benchmark functions [37] are adopted, all of which have an optimal value of 0. The benchmark functions and their search ranges are shown in Table 1. In this test suite, f1-f9 are unimodal functions, usually used to investigate whether the proposed algorithm has good convergence performance, and f10-f18 are multimodal functions, used to test the global search capability of the proposed algorithm. The smaller the fitness value of a function, the better the algorithm performs. Furthermore, in order to verify the comprehensive performance of DMSDL-QBSA in a more thorough manner, another 30 complex CEC2014 benchmarks are used. The CEC2014 benchmark functions are briefly described in Table 2.

2.3.2. Parameter Settings. In order to verify the effectiveness and generalization of the proposed DMSDL-QBSA, the improved DMSDL-QBSA is compared with several hybrid algorithms: BSA [7], DE [32], DMSDL-PSO [31], and DMSDL-BSA. Another five popular intelligence algorithms, grey wolf optimizer (GWO) [38], whale optimization algorithm (WOA) [39], sine cosine algorithm (SCA) [40], grasshopper optimization algorithm (GOA) [41], and sparrow search algorithm (SSA) [42], are also used for comparison with DMSDL-QBSA. These state-of-the-art algorithms can be used to verify the performance of DMSDL-QBSA more comprehensively. For fair comparison, the population size of all algorithms is set to 30, and the other parameters of all algorithms are set according to their original papers. The parameter settings of the involved algorithms are shown in Table 3 in detail.

2.3.3. Comparison on Benchmark Functions with Hybrid Algorithms. According to Section 2.2, three hybrid strategies (the dynamic multi-swarm method, DE, and quantum behavior) have been combined with the basic BSA. When investigating the effectiveness of DMSDL-QBSA compared with several hybrid algorithms, such as BSA, DE, DMSDL-PSO, and DMSDL-BSA, 18 benchmark functions are applied. Compared with DMSDL-QBSA, quantum behavior is not used in the dynamic multi-swarm differential learning bird swarm algorithm (DMSDL-BSA). The number of function evaluations (FEs) is 10,000. We selected two different dimension sizes (Dim): Dim = 10 is the typical dimension for the benchmark functions, and Dim = 2 is used because RF has two parameters that need to be optimized, which means that the optimization function is 2-dimensional.

The fitness value curves of a run of the algorithms on eight different functions are shown in Figures 1 and 2, where the horizontal axis represents the number of iterations and the vertical axis represents the fitness value. We can clearly see the convergence speeds of the different algorithms. The maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) obtained by the benchmark algorithms are shown in Tables 4-7, where the best results are marked in bold. Tables 4 and 5 show the performance of the algorithms on unimodal functions when Dim = 10 and 2, and Tables 6 and 7 show their performance on multimodal functions when Dim = 10 and 2.
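The Max, Min, Mean, and Var entries of Tables 4-7 are plain descriptive statistics over the 30 independent runs; a small Python sketch (the run results below are made-up placeholders, not values from the paper, and we assume "Var" denotes the population variance):

```python
import numpy as np

# hypothetical best fitness value from each of 30 independent runs
run_results = np.array([0.0, 3.3e-2, 1.2e-3] * 10)

stats = {
    "Max": run_results.max(),
    "Min": run_results.min(),
    "Mean": run_results.mean(),
    "Var": run_results.var(),   # assuming "Var" is the population variance
}
print(stats)
```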

Computational Intelligence and Neuroscience 5

(1) Unimodal functions. From the numerical testing results on 8 unimodal functions in Table 4, we can see that DMSDL-QBSA can find the optimal solution for all unimodal functions and reaches the minimum value of 0 on f1, f2, f3, f7, and f8. Both DMSDL-QBSA and DMSDL-BSA can find the minimum value. However, DMSDL-QBSA has the best mean value and variance on each function. The main reason is that DMSDL-QBSA has better population diversity during the initialization period. In summary, DMSDL-QBSA has the best performance on unimodal functions compared to the other algorithms when Dim = 10. Obviously, DMSDL-QBSA has a relatively good convergence speed.

The evolution curves of these algorithms on four unimodal functions, f1, f5, f6, and f9, are drawn in Figure 1. It can be seen from the figure that the curve of DMSDL-QBSA descends fastest, within a number of iterations far less than 10,000. For f1, f5, and f9, DMSDL-QBSA has the fastest convergence speed compared with the other algorithms. However, the original BSA obtains the worst solution because it is trapped in a local optimum prematurely. For function f6, these algorithms did not find the value 0; however, the convergence speed of DMSDL-QBSA is significantly faster than that of the other algorithms in the early stage, and the solution eventually found is the best. Overall, owing to the enhanced population diversity, DMSDL-QBSA has a relatively excellent convergence speed when Dim = 2.

Variable setting: number of iterations iter; bird positions Xi; local optimum Pi; global optimal position gbest; global optimum Pgbest; fitness of sub-swarm optimal position Plbest.

Input: population size N; dimension size D; maximum number of function evaluations itermax; time interval of bird flight behavior FQ; probability of foraging behavior P; constant parameters c1, c2, a1, a2, FL; iter = 0; contraction-expansion factor β.

Output: global optimal position gbest; fitness of global optimal position Pgbest.

(1) Begin
(2)   Initialize the positions of the N birds using equations (16)-(17): Xi (i = 1, 2, ..., N)
(3)   Calculate fitness Fi (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(4)   While iter < itermax + 1 do
(5)     /* Dynamic sub-swarms */
(6)     Regroup Pi and Xi of the sub-swarms randomly
(7)     Sort the Plbest and refine the first [0.25 * N] best Plbest
(8)     Update the corresponding Pi
(9)     /* Differential learning scheme */
(10)    For i = 1 : N
(11)      Construct lbest using DMS-BSA
(12)      Differential evolution: construct V using equations (9)-(10)
(13)      Crossover: construct U using equation (11)
(14)      Selection: construct GV using equation (12)
(15)    End For
(16)    /* Bird position adjusting */
(17)    If (t mod FQ ≠ 0)
(18)      For i = 1 : N
(19)        If rand < P
(20)          Foraging behavior: update the position Xi of the birds using equation (8)
(21)        Else
(22)          Vigilance behavior: update the position Xi of the birds using equation (2)
(23)        End If
(24)      End For
(25)    Else
(26)      Flight behavior: the birds are divided into producers and scroungers
(27)      For i = 1 : N
(28)        If Xi is a producer
(29)          Producer: update the position Xi of the birds using equation (5)
(30)        Else
(31)          Scrounger: update the position Xi of the birds using equation (6)
(32)        End If
(33)      End For
(34)    End If
(35)    Evaluate f(Xi)
(36)    Update gbest and Pgbest
(37)    iter = iter + 1
(38)  End While
(39) End

Algorithm 1: DMSDL-QBSA.
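The differential learning scheme in steps (10)-(15) of Algorithm 1 follows the usual DE mutation/crossover/selection pattern. Below is a minimal Python sketch of that inner step under our own simplifications (rand/1 mutation, binomial crossover, and greedy selection, with F = 0.5 and CR = 0.1 as in Table 3); it is an illustration, not the paper's MATLAB code:

```python
import numpy as np

def differential_learning_step(X, f, F=0.5, CR=0.1, rng=None):
    """One DE pass over population X (N x D), minimizing f.
    Sketch of Algorithm 1 steps (10)-(15): mutation (V), crossover (U),
    greedy selection."""
    rng = rng or np.random.default_rng(0)
    N, D = X.shape
    for i in range(N):
        others = [j for j in range(N) if j != i]
        r1, r2, r3 = rng.choice(others, size=3, replace=False)
        v = X[r1] + F * (X[r2] - X[r3])      # mutation vector V
        cross = rng.random(D) < CR
        cross[rng.integers(D)] = True        # guarantee at least one mutated gene
        u = np.where(cross, v, X[i])         # trial vector U
        if f(u) <= f(X[i]):                  # keep the better of U and X_i
            X[i] = u
    return X

# toy usage: the best individual never worsens on the Sphere function
sphere = lambda x: float(np.sum(x ** 2))
X = np.random.default_rng(1).uniform(-100, 100, size=(30, 10))
before = min(sphere(x) for x in X)
X = differential_learning_step(X, sphere)
after = min(sphere(x) for x in X)
assert after <= before
```

Because the selection step is greedy, each individual's fitness is non-increasing, so the population best can only improve or stay put over iterations.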


According to the results in Table 5, DMSDL-QBSA achieves the best performance on these unimodal functions when Dim = 2. DMSDL-QBSA finds the minimum value of 0 on f1, f2, f3, f4, f5, f7, f8, and f9. DMSDL-QBSA performs better on f2, f4, and f9 compared with DMSDL-BSA and DMSDL-PSO. DE does not perform well on these functions, but BSA performs relatively well on f1, f3, f6, f7, f8, and f9. The main reason is that BSA has better convergence performance in the early search. Obviously, DMSDL-QBSA can find the best values of the two parameters of RF that need to be optimized.

(2) Multimodal functions. From the numerical testing results on 8 multimodal functions in Table 6, we can see that DMSDL-QBSA can find the optimal solution for all multimodal functions and reaches the minimum value of 0 on f11, f13, and f16. DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, and f16. BSA works best on f10, and DMSDL-PSO does not perform very well. DMSDL-QBSA has the best mean value and variance on most functions. The main reason is that DMSDL-QBSA has a stronger global exploration capability based on the dynamic multi-swarm method and differential evolution. In summary, DMSDL-QBSA performs relatively well on multimodal functions compared to the other algorithms when the typical Dim = 10. Obviously, DMSDL-QBSA has relatively good global search capability.

The evolution curves of these algorithms on four multimodal functions, f12, f13, f17, and f18, when Dim = 2 are depicted in Figure 2. We can see that DMSDL-QBSA can find the optimal solution within the same number of iterations. For f13 and f17, DMSDL-QBSA continues to decline, whereas the original BSA and DE produce flat curves because of their poor global convergence ability. For functions f12 and f18, although DMSDL-QBSA is also trapped in a local optimum, it finds the minimum value compared to the other algorithms. Obviously, the convergence speed of DMSDL-QBSA is significantly faster than that of the other algorithms in the early stage, and the solution eventually found is the best. In general, owing to the enhanced population diversity, DMSDL-QBSA has a relatively balanced global search capability when Dim = 2.

Furthermore, from the numerical testing results on the nine multimodal functions in Table 7, we can see that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f16, f17, and f18. DMSDL-QBSA reaches the minimum value of 0 on f11, f13, f16, and f17. BSA reaches the minimum value of 0 on f11, f13, and f17. DE does not reach the minimum value of 0

Table 1: 18 benchmark functions.

Sphere: f1(x) = Σ_{i=1}^{D} x_i^2, range [-100, 100]^D
Schwefel P2.22: f2(x) = Σ_{i=1}^{D} |x_i| + Π_{i=1}^{D} |x_i|, range [-10, 10]^D
Schwefel P1.2: f3(x) = Σ_{i=1}^{D} (Σ_{j=1}^{i} x_j)^2, range [-100, 100]^D
Generalized Rosenbrock: f4(x) = Σ_{i=1}^{D-1} [100(x_i^2 - x_{i+1})^2 + (x_i - 1)^2], range [-100, 100]^D
Step: f5(x) = Σ_{i=1}^{D} (|x_i + 0.5|)^2, range [-100, 100]^D
Noise: f6(x) = Σ_{i=1}^{D} i·x_i^4 + rand[0, 1), range [-1.28, 1.28]^D
SumSquares: f7(x) = Σ_{i=1}^{D} i·x_i^2, range [-10, 10]^D
Zakharov: f8(x) = Σ_{i=1}^{D} x_i^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^2 + (Σ_{i=1}^{D} 0.5·i·x_i)^4, range [-10, 10]^D
Schaffer: f9(x, y) = 0.5 + (sin^2(x^2 - y^2) - 0.5)/[1 + 0.001(x^2 + y^2)]^2, range [-10, 10]^D
Generalized Schwefel 2.26: f10(x) = 418.9829·D - Σ_{i=1}^{D} x_i·sin(√|x_i|), range [-500, 500]^D
Generalized Rastrigin: f11(x) = Σ_{i=1}^{D} [x_i^2 - 10 cos(2πx_i) + 10], range [-5.12, 5.12]^D
Ackley: f12(x) = 20 + e - 20 exp(-0.2 √((1/D) Σ_{i=1}^{D} x_i^2)) - exp((1/D) Σ_{i=1}^{D} cos(2πx_i)), range [-32, 32]^D
Generalized Griewank: f13(x) = (1/4000) Σ_{i=1}^{D} x_i^2 - Π_{i=1}^{D} cos(x_i/√i) + 1, range [-600, 600]^D
Generalized Penalized 1: f14(x) = (π/D){10 sin^2(πy_1) + Σ_{i=1}^{D-1} (y_i - 1)^2 [1 + 10 sin^2(πy_{i+1})] + (y_D - 1)^2} + Σ_{i=1}^{D} u(x_i, 10, 100, 4), where y_i = 1 + (x_i + 1)/4 and u(x_i, a, k, m) = k(x_i - a)^m if x_i > a; 0 if -a ≤ x_i ≤ a; k(-x_i - a)^m if x_i < -a, range [-50, 50]^D
Generalized Penalized 2: f15(x) = (1/10){sin^2(3πx_1) + Σ_{i=1}^{D-1} (x_i - 1)^2 [1 + sin^2(3πx_{i+1})] + (x_D - 1)^2 [1 + sin^2(2πx_D)]} + Σ_{i=1}^{D} u(x_i, 5, 100, 4), range [-50, 50]^D
Alpine: f16(x) = Σ_{i=1}^{D} |x_i sin(x_i) + 0.1x_i|, range [-10, 10]^D
Booth: f17(x) = (x_1 + 2x_2 - 7)^2 + (2x_1 + x_2 - 5)^2, range [-10, 10]^D
Levy: f18(x) = sin^2(πy_1) + Σ_{i=1}^{D-1} (y_i - 1)^2 [1 + 10 sin^2(πy_i + 1)] + (y_D - 1)^2 [1 + sin^2(2πy_D)], with y_i = 1 + (x_i - 1)/4, range [-10, 10]^D


on any function. DMSDL-BSA reaches the minimum value of 0 on f11 and f13. In summary, DMSDL-QBSA has superior global search capability on most multimodal functions when Dim = 2. Obviously, DMSDL-QBSA can find the best values of the two parameters of RF that need to be optimized because of its superior global search capability.

In this section, it can be seen from Figures 1 and 2 and Tables 4-7 that DMSDL-QBSA obtains the best function values in most cases. This indicates that the hybrid strategies of BSA, the dynamic multi-swarm method, DE, and the quantum behavior operators lead the birds to move towards the best solutions, and that DMSDL-QBSA is well able to search for the best two parameters of RF with high accuracy and efficiency.

2.3.4. Comparison on Benchmark Functions with Popular Algorithms. When comparing the timeliness and applicability of DMSDL-QBSA with several popular algorithms, such as GWO, WOA, SCA, GOA, and SSA, 18 benchmark functions are applied. GWO, WOA, GOA, and SSA are swarm intelligence algorithms. In this experiment, the dimension size of these functions is 10, and the number of function evaluations (FEs) is 100,000. The maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) obtained by the different algorithms are shown in Tables 8 and 9, where the best results are marked in bold.

Table 2: Summary of the CEC'14 test functions (Fi* = Fi(x*); search range [-100, 100]^D).

Unimodal functions:
F1: Rotated High Conditioned Elliptic Function, Fi* = 100
F2: Rotated Bent Cigar Function, Fi* = 200
F3: Rotated Discus Function, Fi* = 300

Simple multimodal functions:
F4: Shifted and Rotated Rosenbrock's Function, Fi* = 400
F5: Shifted and Rotated Ackley's Function, Fi* = 500
F6: Shifted and Rotated Weierstrass Function, Fi* = 600
F7: Shifted and Rotated Griewank's Function, Fi* = 700
F8: Shifted Rastrigin's Function, Fi* = 800
F9: Shifted and Rotated Rastrigin's Function, Fi* = 900
F10: Shifted Schwefel's Function, Fi* = 1000
F11: Shifted and Rotated Schwefel's Function, Fi* = 1100
F12: Shifted and Rotated Katsuura Function, Fi* = 1200
F13: Shifted and Rotated HappyCat Function, Fi* = 1300
F14: Shifted and Rotated HGBat Function, Fi* = 1400
F15: Shifted and Rotated Expanded Griewank's plus Rosenbrock's Function, Fi* = 1500
F16: Shifted and Rotated Expanded Scaffer's F6 Function, Fi* = 1600

Hybrid functions:
F17: Hybrid Function 1 (N = 3), Fi* = 1700
F18: Hybrid Function 2 (N = 3), Fi* = 1800
F19: Hybrid Function 3 (N = 4), Fi* = 1900
F20: Hybrid Function 4 (N = 4), Fi* = 2000
F21: Hybrid Function 5 (N = 5), Fi* = 2100
F22: Hybrid Function 6 (N = 5), Fi* = 2200

Composition functions:
F23: Composition Function 1 (N = 5), Fi* = 2300
F24: Composition Function 2 (N = 3), Fi* = 2400
F25: Composition Function 3 (N = 3), Fi* = 2500
F26: Composition Function 4 (N = 5), Fi* = 2600
F27: Composition Function 5 (N = 5), Fi* = 2700
F28: Composition Function 6 (N = 5), Fi* = 2800
F29: Composition Function 7 (N = 3), Fi* = 2900
F30: Composition Function 8 (N = 3), Fi* = 3000

Table 3: Parameter settings.

BSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10
DE: F = 0.5, CR = 0.1
DMSDL-PSO: c1 = c2 = 1.49445, F = 0.5, CR = 0.1, sub-swarms = 10
DMSDL-BSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarms = 10
DMSDL-QBSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarms = 10


From the test results in Table 8, we can see that DMSDL-QBSA has the best performance on each unimodal function. GWO finds the value 0 on f1, f2, f3, f7, and f8; WOA obtains 0 on f1, f2, and f7; and SSA works best on f1 and f7. For the multimodal function evaluations, Table 9 shows that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, f16, and f18. SSA has the best performance on f10, GWO reaches the minimum on f11, and WOA and SCA obtain the optimal value on f11 and f13. Obviously, compared with these popular algorithms, DMSDL-QBSA is a competitive algorithm for solving these functions, and the swarm intelligence algorithms perform better than the other algorithms. The results of Tables 8 and 9 show that DMSDL-QBSA has the best performance on most of the test benchmark functions.

2.3.5. Comparison on CEC2014 Test Functions with Hybrid Algorithms. When comparing the comprehensive performance of the proposed DMSDL-QBSA with several hybrid algorithms, such as BSA, DE, DMSDL-PSO, and DMSDL-BSA, 30 CEC2014 test functions are applied. In this experiment, the dimension size (Dim) is set to 10, and the number of function evaluations (FEs) is 100,000. Experimental comparisons including the maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) are given in Tables 10 and 11, where the best results are marked in bold.

Based on the mean value (Mean) on the CEC2014 test functions, DMSDL-QBSA has the best performance on F2, F3, F4, F6, F7, F8, F9, F10, F11, F15, F16, F17, F21, F26, F27, F29, and F30. DMSDL-BSA shows an advantage on F1, F12,

[Figure 1: Fitness value curves of the 5 hybrid algorithms (BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA) on (a) f1 (Sphere), (b) f5 (Step), (c) f6 (Noise), and (d) f9 (Schaffer), Dim = 2; iteration versus fitness value on logarithmic axes.]


F13, F14, F18, F19, F20, F24, F25, and F28. According to the results, we can observe that DMSDL-QBSA finds the minimal value on 17 CEC2014 test functions, DMSDL-BSA reaches the minimum value on F1, F12, F13, F14, F18, F19, F24, and F30, and DMSDL-PSO obtains the minimum value on F4, F7, and F23. Owing to its enhanced exploitation capability, DMSDL-QBSA is better than DMSDL-BSA and DMSDL-PSO on most functions. From the results of these tests, it can be seen that DMSDL-QBSA performs better than BSA, DE, DMSDL-PSO, and DMSDL-BSA and obtains the optimal value. It can be concluded that DMSDL-QBSA has better global search ability and better robustness on these test suites.

3. Optimize RF Classification Model Based on Improved BSA Algorithm

3.1. RF Classification Model. RF, as proposed by Breiman et al., is an ensemble learning model based on bagging and random subspace methods. The whole modeling process includes building decision trees and decision processes. The process of constructing decision trees is mainly composed of

[Figure 2: Fitness value curves of the 5 hybrid algorithms (BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA) on (a) f12 (Ackley), (b) f13 (Griewank), (c) f17 (Booth), and (d) f18 (Levy), Dim = 2; iteration versus fitness value on logarithmic axes.]


ntree decision trees, each of which consists of nonleaf nodes and leaf nodes. The leaf node is a child node of the node branch. Suppose that the dataset has M attributes. When each leaf node of the decision tree needs to be segmented, mtry attributes are randomly selected from the M attributes as the reselected splitting variables of this node. This process can be defined as follows:

Sj = {Pi | i = 1, 2, ..., mtry},  (18)

where Sj is the splitting variable of the j-th leaf node of the decision tree and Pi is the probability that the mtry reselected attributes are selected as the splitting attribute of the node.

The nonleaf node is a parent node that classifies training data into a left or right child node. The function of the k-th decision tree is as follows:

hk(c | x) = { 0, fl(xk) < fl(xk) + τ; 1, fl(xk) ≥ fl(xk) + τ },  l = 1, 2, ..., ntree,  (19)

where c = 0 or 1; the symbol 0 indicates that the k-th row of data is classified with a negative label, and the symbol 1 indicates that the k-th row of data is classified with a positive label. Here, fl is the training function of the l-th decision tree based on the splitting variable S, and xk is the k-th row of data in the dataset obtained by random sampling with replacement. The symbol τ is a positive constant used as the threshold value of the training decision.

When the decision processes are trained, each row of data is input into a leaf node of each decision tree. The average of the ntree decision tree classification results is used as the final classification result. This process can be written mathematically as follows:

ck = l × (1/ntree) Σ_{k=1}^{ntree} hk(c | x),  (20)

where l is the number of decision trees that judged the k-th row of data as class c.
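Equations (18)-(20) amount to each tree casting a 0/1 vote and the forest averaging the votes; a minimal Python sketch of the decision step of equation (20) (the function name is ours, not from the paper):

```python
import numpy as np

def forest_decision(tree_votes):
    """tree_votes: the 0/1 labels h_k produced by the n_tree decision trees
    for one row of data. Averaging the votes as in equation (20) and
    thresholding at 0.5 gives the final class."""
    votes = np.asarray(tree_votes)
    return int(votes.mean() >= 0.5)

print(forest_decision([1, 0, 1, 1, 0]))  # 3 of 5 trees vote positive -> 1
print(forest_decision([0, 0, 1, 0, 0]))  # 1 of 5 trees vote positive -> 0
```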

From the above principle, we can see that it is mainly necessary to determine the two parameters ntree and mtry in the RF modeling process. In order to verify the influence of these two parameters on the classification accuracy of the RF classification model, the Ionosphere dataset is used to test the influence of the two parameters on the performance of

Table 4: Comparison on nine unimodal functions with 5 hybrid algorithms (Dim = 10). Columns: BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA.

f1  Max   1.0317E+04 | 1.2466E+04 | 1.7232E+04 | 1.0874E+04 | 7.4436E+01
    Min   0 | 4.0214E+03 | 1.6580E-02 | 0 | 0
    Mean  6.6202E+00 | 4.6622E+03 | 6.0643E+01 | 6.3444E+00 | 3.3920E-02
    Var   2.0892E+02 | 8.2293E+02 | 7.3868E+02 | 1.9784E+02 | 1.2343E+00

f2  Max   3.3278E+02 | 1.8153E+02 | 8.0436E+01 | 4.9554E+02 | 2.9845E+01
    Min   4.9286E-182 | 1.5889E+01 | 1.0536E-01 | 3.0074E-182 | 0
    Mean  5.7340E-02 | 1.8349E+01 | 2.9779E+00 | 6.9700E-02 | 1.1220E-02
    Var   3.5768E+00 | 4.8296E+00 | 2.4966E+00 | 5.1243E+00 | 4.2864E-01

f3  Max   1.3078E+04 | 1.3949E+04 | 1.9382E+04 | 1.2899E+04 | 8.4935E+01
    Min   3.4873E-250 | 4.0327E+03 | 6.5860E-02 | 1.6352E-249 | 0
    Mean  7.6735E+00 | 4.6130E+03 | 8.6149E+01 | 7.5623E+00 | 3.3260E-02
    Var   2.4929E+02 | 8.4876E+02 | 8.1698E+02 | 2.4169E+02 | 1.2827E+00

f4  Max   2.5311E+09 | 2.3900E+09 | 4.8639E+09 | 3.7041E+09 | 3.5739E+08
    Min   5.2310E+00 | 2.7690E+08 | 8.4802E+00 | 5.0021E+00 | 8.9799E+00
    Mean  6.9192E+05 | 3.3334E+08 | 1.1841E+07 | 9.9162E+05 | 6.8518E+04
    Var   3.6005E+07 | 1.4428E+08 | 1.8149E+08 | 4.5261E+07 | 4.0563E+06

f5  Max   1.1619E+04 | 1.3773E+04 | 1.6188E+04 | 1.3194E+04 | 5.1960E+03
    Min   5.5043E-15 | 5.6109E+03 | 1.1894E-02 | 4.2090E-15 | 1.5157E+00
    Mean  5.9547E+00 | 6.3278E+03 | 5.2064E+01 | 6.5198E+00 | 3.5440E+00
    Var   2.0533E+02 | 9.6605E+02 | 6.2095E+02 | 2.2457E+02 | 7.8983E+01

f6  Max   3.2456E+00 | 7.3566E+00 | 8.9320E+00 | 2.8822E+00 | 1.4244E+00
    Min   1.3994E-04 | 1.2186E+00 | 2.2482E-03 | 8.2911E-05 | 1.0911E-05
    Mean  2.1509E-03 | 1.4021E+00 | 1.1982E-01 | 1.9200E-03 | 6.1476E-04
    Var   5.3780E-02 | 3.8482E-01 | 3.5554E-01 | 5.0940E-02 | 1.9880E-02

f7  Max   4.7215E+02 | 6.7534E+02 | 5.6753E+02 | 5.3090E+02 | 2.3468E+02
    Min   0 | 2.2001E+02 | 5.6300E-02 | 0 | 0
    Mean  2.4908E-01 | 2.3377E+02 | 9.2909E+00 | 3.0558E-01 | 9.4500E-02
    Var   8.5433E+00 | 3.3856E+01 | 2.2424E+01 | 1.0089E+01 | 3.5569E+00

f8  Max   3.2500E+02 | 2.4690E+02 | 2.7226E+02 | 2.8001E+02 | 1.7249E+02
    Min   1.4678E-239 | 8.3483E+01 | 5.9820E-02 | 8.9624E-239 | 0
    Mean  1.9072E-01 | 9.1050E+01 | 7.9923E+00 | 2.3232E-01 | 8.1580E-02
    Var   6.3211E+00 | 1.3811E+01 | 1.7349E+01 | 6.4400E+00 | 2.9531E+00


the RF model, as shown in Figure 3, where the horizontal axes represent ntree and mtry, respectively, and the vertical axis represents the accuracy of the RF classification model.

(1) Parameter analysis of ntree

When the number of predictor variables mtry is set to 6, the number of decision trees ntree is cyclically set from 0 to 1000 at intervals of 20, and the evolution of the RF classification model accuracy with ntree is shown in Figure 3(a). From the curve in Figure 3(a), we can see that the accuracy of RF gradually improves as the number of decision trees increases. However, when the number of decision trees ntree exceeds a certain value, the improvement in RF performance levels off, with no obvious further gain, while the running time becomes longer.

(2) Parameter analysis of mtry

When the number of decision trees ntree is set to 500, the number of predictor variables mtry is cyclically set from 1 to 32. The limit of mtry is set to 32 because the number of attributes of the Ionosphere dataset is 32. The obtained curve of RF classification model accuracy with mtry is shown in Figure 3(b). We can see that, as the number of selected splitting attributes increases, the classification performance of RF gradually improves, but when the number of predictor variables mtry exceeds 9, RF overfits and its accuracy begins to decrease. The main reason is that too many split attributes are selected, so a large number of decision trees share the same splitting attributes, which reduces the diversity of the decision trees.
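The two sweeps above are easy to reproduce with any RF implementation; a sketch using scikit-learn, where n_estimators plays the role of ntree and max_features the role of mtry (the synthetic 32-attribute dataset below is a stand-in, not the actual Ionosphere data):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# stand-in dataset with 32 attributes, mimicking Ionosphere's attribute count
X, y = make_classification(n_samples=300, n_features=32, random_state=0)

for ntree in (20, 100, 500):          # sweep of Figure 3(a)
    for mtry in (1, 6, 9):            # sweep of Figure 3(b)
        rf = RandomForestClassifier(n_estimators=ntree, max_features=mtry,
                                    random_state=0)
        acc = cross_val_score(rf, X, y, cv=3).mean()
        print(f"ntree={ntree:3d}  mtry={mtry}  accuracy={acc:.3f}")
```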

Table 5: Comparison on nine unimodal functions with hybrid algorithms (Dim = 2). Columns: BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA.

f1  Max   1.8411E+02 | 3.4713E+02 | 2.9347E+02 | 1.6918E+02 | 1.6789E+00
    Min   0 | 5.7879E-01 | 0 | 0 | 3.2988E-238
    Mean  5.5890E-02 | 1.1150E+00 | 1.6095E-01 | 3.6580E-02 | 4.2590E-03
    Var   2.6628E+00 | 5.7218E+00 | 5.4561E+00 | 2.0867E+00 | 7.5858E-02

f2  Max   2.2980E+00 | 2.1935E+00 | 3.2363E+00 | 3.1492E+00 | 1.1020E+00
    Min   5.5923E-266 | 8.2690E-02 | 2.9096E-242 | 3.4367E-241 | 0
    Mean  9.2769E-04 | 9.3960E-02 | 7.4900E-03 | 1.2045E-03 | 2.1565E-04
    Var   3.3310E-02 | 6.9080E-02 | 4.0100E-02 | 4.2190E-02 | 1.3130E-02

f3  Max   1.0647E+02 | 1.3245E+02 | 3.6203E+02 | 2.3793E+02 | 1.3089E+02
    Min   0 | 5.9950E-01 | 0 | 0 | 0
    Mean  2.3040E-02 | 8.5959E-01 | 2.5020E-01 | 6.3560E-02 | 1.7170E-02
    Var   1.2892E+00 | 2.8747E+00 | 7.5569E+00 | 2.9203E+00 | 1.3518E+00

f4  Max   1.7097E+02 | 6.1375E+01 | 6.8210E+01 | 5.9141E+01 | 1.4726E+01
    Min   1.6325E-21 | 4.0940E-02 | 8.2726E-13 | 3.4830E-25 | 0
    Mean  2.2480E-02 | 9.3940E-02 | 1.5730E-02 | 1.1020E-02 | 1.4308E-02
    Var   1.7987E+00 | 7.4859E-01 | 7.6015E-01 | 6.4984E-01 | 2.7598E-01

f5  Max   1.5719E+02 | 2.2513E+02 | 3.3938E+02 | 1.8946E+02 | 8.7078E+01
    Min   0 | 7.0367E-01 | 0 | 0 | 0
    Mean  3.4380E-02 | 1.8850E+00 | 1.7082E-01 | 5.0090E-02 | 1.1880E-02
    Var   1.9018E+00 | 5.6163E+00 | 5.9868E+00 | 2.4994E+00 | 9.1749E-01

f6  Max   1.5887E-01 | 1.5649E-01 | 1.5919E-01 | 1.3461E-01 | 1.0139E-01
    Min   2.5412E-05 | 4.5060E-04 | 5.9140E-05 | 4.1588E-05 | 7.3524E-06
    Mean  2.3437E-04 | 1.3328E-03 | 6.0989E-04 | 2.3462E-04 | 9.2394E-05
    Var   2.4301E-03 | 3.6700E-03 | 3.5200E-03 | 1.9117E-03 | 1.4664E-03

f7  Max   3.5804E+00 | 2.8236E+00 | 1.7372E+00 | 2.7513E+00 | 1.9411E+00
    Min   0 | 7.6633E-03 | 0 | 0 | 0
    Mean  8.5474E-04 | 1.6590E-02 | 8.6701E-04 | 7.6781E-04 | 3.1439E-04
    Var   4.4630E-02 | 6.0390E-02 | 2.4090E-02 | 3.7520E-02 | 2.2333E-02

f8  Max   4.3247E+00 | 2.1924E+00 | 5.3555E+00 | 3.3944E+00 | 5.5079E-01
    Min   0 | 8.6132E-03 | 0 | 0 | 0
    Mean  1.1649E-03 | 1.9330E-02 | 1.7145E-03 | 7.3418E-04 | 6.9138E-05
    Var   5.9280E-02 | 4.7800E-02 | 7.5810E-02 | 4.1414E-02 | 5.7489E-03

f9  Max   2.7030E-02 | 3.5200E-02 | 1.7240E-02 | 4.0480E-02 | 2.5230E-02
    Min   0 | 5.0732E-03 | 0 | 0 | 0
    Mean  6.1701E-05 | 6.2500E-03 | 8.8947E-04 | 8.4870E-05 | 2.7362E-05
    Var   7.6990E-04 | 1.3062E-03 | 2.0400E-03 | 9.6160E-04 | 5.5610E-04


In summary, for the RF classification model to obtain an ideal optimal solution, the selection of the number of decision trees ntree and the number of predictor variables mtry is very important, and the classification accuracy of the RF classification model can only be optimized by tuning these two parameters jointly. It is therefore necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we optimize the RF classification model with the improved BSA proposed in Section 2.

3.2. RF Model Based on the Improved Bird Swarm Algorithm. The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) uses the improved bird swarm algorithm to optimize the RF classification model and introduces the training dataset into the training process of the RF classification model, finally yielding the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters, ntree and mtry, as the optimization target of DMSDL-QBSA, so as to obtain a set of parameters that gives the RF classification model the best classification accuracy. The specific algorithm steps are shown in Algorithm 2.
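The coupling boils down to a two-dimensional fitness function that maps a bird's position (ntree, mtry) to a classification error for DMSDL-QBSA to minimize; a minimal sketch of such a wrapper (`rf_fitness` and `train_fn` are hypothetical names of ours, not from the paper):

```python
def rf_fitness(position, train_fn, mtry_max=32):
    """position: a 2-D candidate (ntree, mtry) proposed by the optimizer,
    possibly fractional; round and clip it to valid RF parameters.
    train_fn(ntree, mtry) must return a validation accuracy in [0, 1];
    returning 1 - accuracy turns the search into a minimization problem."""
    ntree = max(1, int(round(position[0])))
    mtry = min(mtry_max, max(1, int(round(position[1]))))
    return 1.0 - train_fn(ntree, mtry)

# toy check with a dummy trainer whose accuracy peaks at mtry = 9
dummy = lambda ntree, mtry: 0.9 - 0.01 * abs(mtry - 9)
print(rf_fitness((500.4, 8.7), dummy))
```

In practice `train_fn` would train and cross-validate an RF with the given parameters; the optimizer only ever sees the scalar fitness.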

3.3. Simulation Experiment and Analysis. In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare the improved classification model with the standard RF model, the BSA-RF model, and the DMSDL-BSA-RF model on 8 two-dimensional UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by the improved BSA without quantum behavior. In our experiment, each dataset is divided into two parts: 70% of the dataset is used as the training set and the remaining 30% as the test set. The average classification accuracies of 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model obtains the best accuracy on each UCI dataset except the magic dataset, on which the DMSDL-BSA-RF classification model performs best. Compared with the standard RF model, the accuracy of the DMSDL-QBSA-RF classification model is higher by about 10%. Finally, the

Table 6: Comparison on nine multimodal functions with hybrid algorithms (Dim = 10). Columns: BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA.

f10 Max   2.8498E+03 | 2.8226E+03 | 3.0564E+03 | 2.7739E+03 | 2.8795E+03
    Min   1.2553E+02 | 1.8214E+03 | 1.2922E+03 | 1.6446E+02 | 1.1634E+03
    Mean  2.5861E+02 | 1.9229E+03 | 1.3185E+03 | 3.1119E+02 | 1.2729E+03
    Var   2.4093E+02 | 1.2066E+02 | 1.2663E+02 | 2.5060E+02 | 1.2998E+02

f11 Max   1.2550E+02 | 1.0899E+02 | 1.1806E+02 | 1.1243E+02 | 9.1376E+01
    Min   0 | 6.3502E+01 | 1.0751E+01 | 0 | 0
    Mean  2.0417E-01 | 6.7394E+01 | 3.9864E+01 | 1.3732E-01 | 6.8060E-02
    Var   3.5886E+00 | 5.8621E+00 | 1.3570E+01 | 3.0325E+00 | 2.0567E+00

f12 Max   2.0021E+01 | 1.9910E+01 | 1.9748E+01 | 1.9254E+01 | 1.8118E+01
    Min   8.8818E-16 | 1.6575E+01 | 7.1700E-02 | 8.8818E-16 | 8.8818E-16
    Mean  3.0500E-02 | 1.7157E+01 | 3.0367E+00 | 3.8520E-02 | 1.3420E-02
    Var   5.8820E-01 | 5.2968E-01 | 1.6585E+00 | 6.4822E-01 | 4.2888E-01

f13 Max   1.0431E+02 | 1.3266E+02 | 1.5115E+02 | 1.2017E+02 | 6.1996E+01
    Min   0 | 4.5742E+01 | 2.1198E-01 | 0 | 0
    Mean  6.1050E-02 | 5.2056E+01 | 3.0613E+00 | 6.9340E-02 | 2.9700E-02
    Var   1.8258E+00 | 8.3141E+00 | 1.5058E+01 | 2.2452E+00 | 1.0425E+00

f14 Max   8.4576E+06 | 3.0442E+07 | 5.3508E+07 | 6.2509E+07 | 8.5231E+06
    Min   1.7658E-13 | 1.9816E+06 | 4.5685E-05 | 1.6961E-13 | 5.1104E-01
    Mean  1.3266E+03 | 3.1857E+06 | 6.8165E+04 | 8.8667E+03 | 1.1326E+03
    Var   9.7405E+04 | 1.4876E+06 | 1.4622E+06 | 6.4328E+05 | 8.7645E+04

f15 Max   1.8310E+08 | 1.4389E+08 | 1.8502E+08 | 1.4578E+08 | 2.5680E+07
    Min   1.7942E-11 | 1.0497E+07 | 2.4500E-03 | 1.1248E-11 | 9.9870E-01
    Mean  3.7089E+04 | 1.5974E+07 | 2.0226E+05 | 3.8852E+04 | 3.5739E+03
    Var   2.0633E+06 | 1.0724E+07 | 4.6539E+06 | 2.1133E+06 | 2.6488E+05

f16 Max   1.3876E+01 | 1.4988E+01 | 1.4849E+01 | 1.3506E+01 | 9.3280E+00
    Min   4.2410E-174 | 6.8743E+00 | 2.5133E-02 | 7.3524E-176 | 0
    Mean  1.3633E-02 | 7.2408E+00 | 2.5045E+00 | 1.3900E-02 | 5.1800E-03
    Var   3.3567E-01 | 7.7774E-01 | 1.0219E+00 | 3.4678E-01 | 1.7952E-01

f18 Max   3.6704E+01 | 3.6950E+01 | 2.8458E+01 | 2.6869E+01 | 2.4435E+01
    Min   2.0914E-11 | 9.7737E+00 | 3.3997E-03 | 5.9165E-12 | 7.5806E-01
    Mean  6.5733E-02 | 1.2351E+01 | 6.7478E-01 | 5.6520E-02 | 7.9392E-01
    Var   6.7543E-01 | 2.8057E+00 | 1.4666E+00 | 6.4874E-01 | 3.6928E-01


DMSDL-QBSA-RF classification model achieves the best accuracy on the appendicitis dataset, which is up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is effective on most datasets and performs well on them.

4. Oil Layer Classification Application

4.1. Design of the Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1. The selected actual logging datasets are intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis and relatively independent. Each dataset is randomly divided into two parts: training and testing samples.

Step 2. In order to better understand the relationship between the independent variables and the dependent variables and to reduce the sample information attributes, the continuous attributes of the dataset are discretized using a greedy algorithm.
Step 3. In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize attribute reduction. After attribute reduction, the actual logging datasets are normalized to avoid computational saturation.
Step 4. In the DMSDL-QBSA-RF layer classification model, we input the actual logging dataset after attribute reduction, train with the DMSDL-QBSA-RF layer classification algorithm, and finally obtain the DMSDL-QBSA-RF layer classification model.

Table 7: Comparison on nine multimodal functions with hybrid algorithms (Dim = 2). Columns: BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA.

f10 Max   2.0101E+02 | 2.8292E+02 | 2.9899E+02 | 2.4244E+02 | 2.8533E+02
    Min   2.5455E-05 | 3.7717E+00 | 2.3690E+01 | 2.5455E-05 | 9.4751E+01
    Mean  3.4882E-01 | 8.0980E+00 | 2.5222E+01 | 4.2346E-01 | 9.4816E+01
    Var   5.7922E+00 | 1.0853E+01 | 1.5533E+01 | 5.6138E+00 | 2.8160E+00

f11 Max   4.7662E+00 | 4.7784E+00 | 1.1067E+01 | 8.7792E+00 | 8.1665E+00
    Min   0 | 3.8174E-01 | 0 | 0 | 0
    Mean  2.7200E-03 | 6.0050E-01 | 3.1540E-02 | 4.2587E-03 | 3.7800E-03
    Var   8.6860E-02 | 3.1980E-01 | 2.4862E-01 | 1.2032E-01 | 1.3420E-01

f12 Max   9.6893E+00 | 8.1811E+00 | 1.1635E+01 | 9.1576E+00 | 8.4720E+00
    Min   8.8818E-16 | 5.1646E-01 | 8.8818E-16 | 8.8818E-16 | 8.8818E-16
    Mean  9.5600E-03 | 6.9734E-01 | 3.4540E-02 | 9.9600E-03 | 2.8548E-03
    Var   2.1936E-01 | 6.1050E-01 | 2.5816E-01 | 2.1556E-01 | 1.1804E-01

f13 Max   4.4609E+00 | 4.9215E+00 | 4.1160E+00 | 1.9020E+00 | 1.6875E+00
    Min   0 | 1.3718E-01 | 0 | 0 | 0
    Mean  1.9200E-03 | 1.7032E-01 | 1.8240E-02 | 1.4800E-03 | 5.7618E-04
    Var   6.6900E-02 | 1.3032E-01 | 1.8202E-01 | 3.3900E-02 | 2.2360E-02

f14 Max   1.0045E+01 | 1.9266E+03 | 1.9212E+01 | 5.7939E+02 | 8.2650E+00
    Min   2.3558E-31 | 1.3188E-01 | 2.3558E-31 | 2.3558E-31 | 2.3558E-31
    Mean  4.1600E-03 | 3.5402E-01 | 1.0840E-02 | 6.1420E-02 | 1.3924E-03
    Var   1.7174E-01 | 1.9427E+01 | 3.9528E-01 | 5.8445E+00 | 8.7160E-02

f15 Max   6.5797E+04 | 4.4041E+03 | 1.4412E+05 | 8.6107E+03 | 2.6372E+00
    Min   1.3498E-31 | 9.1580E-02 | 1.3498E-31 | 1.3498E-31 | 7.7800E-03
    Mean  7.1736E+00 | 8.9370E-01 | 1.7440E+01 | 9.0066E-01 | 8.2551E-03
    Var   6.7678E+02 | 5.4800E+01 | 1.4742E+03 | 8.7683E+01 | 2.8820E-02

f16 Max   6.2468E-01 | 6.4488E-01 | 5.1564E-01 | 8.4452E-01 | 3.9560E-01
    Min   6.9981E-08 | 2.5000E-03 | 1.5518E-240 | 2.7655E-07 | 0
    Mean  2.7062E-04 | 6.9400E-03 | 6.8555E-04 | 2.0497E-04 | 6.1996E-05
    Var   1.0380E-02 | 1.7520E-02 | 8.4600E-03 | 1.0140E-02 | 4.4000E-03

f17 Max   5.1946E+00 | 3.6014E+00 | 2.3463E+00 | 6.9106E+00 | 1.2521E+00
    Min   2.6445E-11 | 2.6739E-02 | 0 | 1.0855E-10 | 0
    Mean  1.9343E-03 | 5.1800E-02 | 1.2245E-03 | 2.8193E-03 | 1.5138E-04
    Var   7.3540E-02 | 1.2590E-01 | 4.1620E-02 | 1.1506E-01 | 1.2699E-02

f18 Max   5.0214E-01 | 3.4034E-01 | 4.1400E-01 | 3.7422E-01 | 4.0295E-01
    Min   1.4998E-32 | 1.9167E-03 | 1.4998E-32 | 1.4998E-32 | 1.4998E-32
    Mean  1.0967E-04 | 4.1000E-03 | 1.8998E-04 | 1.4147E-04 | 6.0718E-05
    Var   6.1800E-03 | 1.0500E-02 | 6.5200E-03 | 5.7200E-03 | 4.4014E-03

14 Computational Intelligence and Neuroscience

Step 5. The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and we output the classification results.

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA was simulated and analyzed on benchmark functions, and in Section 3.3, the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA was tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted and recorded as W1, W2, and W3. W1 is a gas well in Xi'an (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depths and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before the training of the DMSDL-QBSA-RF classification model on the training dataset, as shown in Table 14. Then these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.
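The normalization step can be sketched as column-wise min-max scaling; the two-attribute sample matrix below is hypothetical:

```python
import numpy as np

def min_max_normalize(X):
    """Column-wise min-max scaling to [0, 1], as used for the normalized
    attribute curves of Figure 5, avoiding computational saturation."""
    X = np.asarray(X, dtype=float)
    lo, hi = X.min(axis=0), X.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)   # guard constant columns
    return (X - lo) / span

# Hypothetical readings for two reduced attributes (e.g., GR and DEN).
X = min_max_normalize([[90.1, 2.45], [75.3, 2.31], [110.8, 2.60]])
```

After scaling, every attribute lies in [0, 1], so no single log dominates the distance computations inside the classifier.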

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, the SVM model, the BSA-RF model, and the DMSDL-BSA-RF model. Here, the RF classification model is applied to the field of logging for the first time. In order to evaluate the performance of the recognition model, we select the following performance indicators:

\[
\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(f_i - y_i\right)^2},\qquad
\mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\left|f_i - y_i\right|, \tag{21}
\]

Table 8: Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function  Term   GWO          WOA          SCA          GOA          SSA          DMSDL-QBSA
f1        Max    1.3396E+04   1.4767E+04   1.3310E+04   2.0099E+01   4.8745E+03   9.8570E-01
          Min    0            0            4.2905E-293  8.6468E-17   0            0
          Mean   3.5990E+00   4.7621E+00   1.4014E+02   7.0100E-02   4.5864E+00   1.0483E-04
          Var    1.7645E+02   2.0419E+02   8.5054E+02   4.4200E-01   1.4148E+02   9.8725E-03
f2        Max    3.6021E+02   2.5789E+03   6.5027E+01   9.3479E+01   3.4359E+01   3.2313E-01
          Min    0            0            9.8354E-192  2.8954E-03   2.4642E-181  0
          Mean   5.0667E-02   2.9480E-01   4.0760E-01   3.1406E+00   1.7000E-02   1.1278E-04
          Var    3.7270E+00   2.6091E+01   2.2746E+00   3.9264E+00   5.4370E-01   4.8000E-03
f3        Max    1.8041E+04   1.6789E+04   2.4921E+04   6.5697E+03   1.1382E+04   4.0855E+01
          Min    0            1.0581E-18   7.6116E-133  2.8796E-01   3.2956E-253  1.5918E-264
          Mean   7.4511E+00   2.8838E+02   4.0693E+02   4.8472E+02   9.2062E+00   4.8381E-03
          Var    2.6124E+02   1.4642E+03   1.5913E+03   7.1786E+02   2.9107E+02   4.1383E-01
f4        Max    2.1812E+09   5.4706E+09   8.4019E+09   1.1942E+09   4.9386E+08   7.5188E+01
          Min    4.9125E+00   3.5695E+00   5.9559E+00   2.2249E+02   4.9806E+00   2.9279E-13
          Mean   4.9592E+05   2.4802E+06   4.4489E+07   5.0021E+06   3.3374E+05   2.4033E-02
          Var    2.9484E+07   1.1616E+08   4.5682E+08   3.9698E+07   1.1952E+07   9.7253E-01
f5        Max    1.8222E+04   1.5374E+04   1.5874E+04   1.2132E+03   1.6361E+04   1.8007E+00
          Min    1.1334E-08   8.3228E-09   2.3971E-01   2.7566E-10   2.6159E-16   1.0272E-33
          Mean   5.1332E+00   5.9967E+00   1.2620E+02   5.8321E+01   8.8985E+00   2.3963E-04
          Var    2.3617E+02   2.3285E+02   8.8155E+02   1.0872E+02   2.9986E+02   1.8500E-02
f6        Max    7.4088E+00   8.3047E+00   8.8101E+00   6.8900E-01   4.4298E+00   2.5787E-01
          Min    1.8112E-05   3.9349E-05   4.8350E-05   8.9528E-02   4.0807E-05   1.0734E-04
          Mean   1.8333E-03   3.2667E-03   4.8400E-02   9.3300E-02   2.1000E-03   5.2825E-04
          Var    9.4267E-02   1.0077E-01   2.9410E-01   3.5900E-02   6.5500E-02   4.6000E-03
f7        Max    7.3626E+02   6.8488E+02   8.0796E+02   3.9241E+02   8.2036E+02   1.4770E+01
          Min    0            0            1.9441E-292  7.9956E-07   0            0
          Mean   2.0490E-01   2.8060E-01   4.9889E+00   1.6572E+01   2.7290E-01   1.8081E-03
          Var    9.5155E+00   1.0152E+01   3.5531E+01   2.3058E+01   1.0581E+01   1.6033E-01
f8        Max    1.2749E+03   5.9740E+02   3.2527E+02   2.3425E+02   2.0300E+02   2.0423E+01
          Min    0            4.3596E-35   1.5241E-160  3.6588E-05   1.0239E-244  0
          Mean   3.1317E-01   1.0582E+01   1.0457E+01   1.2497E+01   2.1870E-01   2.6947E-03
          Var    1.4416E+01   4.3485E+01   3.5021E+01   2.5766E+01   6.2362E+00   2.1290E-01


where yi and fi are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show actual forecasting errors. Table 15 records the performance indicator data of each classification model, and the best results are marked in bold. The smaller the RMSE and MAE, the better the classification model performs.
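The two indicators of equation (21) are direct to compute; the prediction and label vectors below are hypothetical:

```python
import numpy as np

def rmse(f, y):
    """Root-mean-square error, equation (21)."""
    f, y = np.asarray(f, float), np.asarray(y, float)
    return float(np.sqrt(np.mean((f - y) ** 2)))

def mae(f, y):
    """Mean absolute error, equation (21)."""
    f, y = np.asarray(f, float), np.asarray(y, float)
    return float(np.mean(np.abs(f - y)))

pred = [1, 0, 1, 1, 0]   # hypothetical classification outputs
true = [1, 0, 0, 1, 0]   # hypothetical expected outputs
```

With one wrong label out of five, MAE is 0.2 and RMSE is sqrt(0.2), illustrating that RMSE penalizes the same errors more heavily.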

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model achieves the best recognition accuracy, and all its accuracies are above 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it also shows superior performance for oil and gas layer classification on the other performance indicators and on different wells. Secondly, DMSDL-QBSA can improve the performance of RF: the parameters found by DMSDL-QBSA and used in the RF classification model improve the classification accuracy while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than those of the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means the depth has no oil or gas, and 1 means the depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results; the model can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is therefore suitable for petroleum logging applications, which greatly reduces the difficulty of oil exploration, and it has a good application prospect.

Table 9: Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function  Term   GWO          WOA          SCA          GOA          SSA          DMSDL-QBSA
f10       Max    2.9544E+03   2.9903E+03   2.7629E+03   2.9445E+03   3.2180E+03   3.0032E+03
          Min    1.3037E+03   8.6892E-04   1.5713E+03   1.3438E+03   1.2839E-04   1.2922E+03
          Mean   1.6053E+03   1.4339E+01   1.7860E+03   1.8562E+03   2.5055E+02   1.2960E+03
          Var    2.6594E+02   1.2243E+02   1.6564E+02   5.1605E+02   3.3099E+02   5.8691E+01
f11       Max    1.3792E+02   1.2293E+02   1.2313E+02   1.1249E+02   3.2180E+03   1.8455E+01
          Min    0            0            0            1.2437E+01   1.2839E-04   0
          Mean   4.4220E-01   1.1252E+00   9.9316E+00   2.8378E+01   2.5055E+02   3.2676E-03
          Var    4.3784E+00   6.5162E+00   1.9180E+01   1.6240E+01   3.3099E+02   1.9867E-01
f12       Max    2.0257E+01   2.0043E+01   1.9440E+01   1.6623E+01   3.2180E+03   1.9113E+00
          Min    4.4409E-15   3.2567E-15   3.2567E-15   2.3168E+00   1.2839E-04   8.8818E-16
          Mean   1.7200E-02   4.2200E-02   8.8870E-01   5.5339E+00   2.5055E+02   3.1275E-04
          Var    4.7080E-01   6.4937E-01   3.0887E+00   2.8866E+00   3.3099E+02   2.2433E-02
f13       Max    1.5246E+02   1.6106E+02   1.1187E+02   6.1505E+01   3.2180E+03   3.1560E-01
          Min    3.3000E-03   0            0            2.4147E-01   1.2839E-04   0
          Mean   4.6733E-02   9.3867E-02   1.2094E+00   3.7540E+00   2.5055E+02   3.3660E-05
          Var    1.9297E+00   2.6570E+00   6.8476E+00   4.1936E+00   3.3099E+02   3.1721E-03
f14       Max    9.5993E+07   9.9026E+07   5.9355E+07   6.1674E+06   3.2180E+03   6.4903E-01
          Min    3.8394E-09   1.1749E-08   9.6787E-03   1.8099E-04   1.2839E-04   4.7116E-32
          Mean   1.2033E+04   3.5007E+04   4.8303E+05   1.0465E+04   2.5055E+02   8.9321E-05
          Var    9.8272E+05   1.5889E+06   4.0068E+06   1.9887E+05   3.3099E+02   6.8667E-03
f15       Max    2.2691E+08   2.4717E+08   1.1346E+08   2.8101E+07   3.2180E+03   1.6407E-01
          Min    3.2467E-02   4.5345E-08   1.1922E-01   3.5465E-05   1.2839E-04   1.3498E-32
          Mean   2.9011E+04   4.3873E+04   6.5529E+05   7.2504E+04   2.5055E+02   6.7357E-05
          Var    2.3526E+06   2.7453E+06   7.1864E+06   1.2814E+06   3.3099E+02   2.4333E-03
f16       Max    1.7692E+01   1.7142E+01   1.6087E+01   8.7570E+00   3.2180E+03   1.0959E+00
          Min    2.6210E-07   0.0000E+00   6.2663E-155  1.0497E-02   1.2839E-04   0
          Mean   1.0133E-02   3.9073E-01   5.9003E-01   2.4770E+00   2.5055E+02   1.5200E-04
          Var    3.0110E-01   9.6267E-01   1.4701E+00   1.9985E+00   3.3099E+02   1.1633E-02
f18       Max    4.4776E+01   4.3588E+01   3.9095E+01   1.7041E+01   3.2180E+03   6.5613E-01
          Min    1.9360E-01   9.4058E-08   2.2666E-01   3.9111E+00   1.2839E-04   1.4998E-32
          Mean   2.1563E-01   4.7800E-02   9.8357E-01   5.4021E+00   2.5055E+02   9.4518E-05
          Var    5.8130E-01   1.0434E+00   3.0643E+00   1.6674E+00   3.3099E+02   7.0251E-03


Table 10: Comparison of numerical testing results on CEC2014 test sets (F1-F15, Dim = 10).

Function  Term   BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
F1        Max    2.7411E+08   3.6316E+09   6.8993E+08   3.9664E+08   9.6209E+08
          Min    1.4794E+07   3.1738E+09   1.3205E+06   1.6929E+05   1.7687E+05
          Mean   3.1107E+07   3.2020E+09   6.7081E+06   1.3567E+06   1.9320E+06
          Var    1.2900E+07   4.0848E+07   2.6376E+07   1.0236E+07   1.3093E+07
F2        Max    8.7763E+09   1.3597E+10   1.3515E+10   1.6907E+10   1.7326E+10
          Min    1.7455E+09   1.1878E+10   3.6982E+04   2.1268E+05   1.4074E+05
          Mean   1.9206E+09   1.1951E+10   1.7449E+08   5.9354E+07   4.8412E+07
          Var    1.9900E+08   1.5615E+08   9.2365E+08   5.1642E+08   4.2984E+08
F3        Max    4.8974E+05   8.5863E+04   2.6323E+06   2.4742E+06   5.3828E+06
          Min    1.1067E+04   1.4616E+04   1.8878E+03   1.2967E+03   6.3986E+02
          Mean   1.3286E+04   1.4976E+04   1.0451E+04   3.5184E+03   3.0848E+03
          Var    6.5283E+03   1.9988E+03   4.5736E+04   4.1580E+04   8.0572E+04
F4        Max    3.5252E+03   9.7330E+03   4.5679E+03   4.9730E+03   4.3338E+03
          Min    5.0446E+02   8.4482E+03   4.0222E+02   4.0843E+02   4.1267E+02
          Mean   5.8061E+02   8.5355E+03   4.4199E+02   4.3908E+02   4.3621E+02
          Var    1.4590E+02   1.3305E+02   1.6766E+02   1.2212E+02   8.5548E+01
F5        Max    5.2111E+02   5.2106E+02   5.2075E+02   5.2098E+02   5.2110E+02
          Min    5.2001E+02   5.2038E+02   5.2027E+02   5.2006E+02   5.2007E+02
          Mean   5.2003E+02   5.2041E+02   5.2033E+02   5.2014E+02   5.2014E+02
          Var    7.8380E-02   5.7620E-02   5.6433E-02   1.0577E-01   9.9700E-02
F6        Max    6.1243E+02   6.1299E+02   6.1374E+02   6.1424E+02   6.1569E+02
          Min    6.0881E+02   6.1157E+02   6.0514E+02   6.0288E+02   6.0257E+02
          Mean   6.0904E+02   6.1164E+02   6.0604E+02   6.0401E+02   6.0358E+02
          Var    2.5608E-01   1.2632E-01   1.0434E+00   1.1717E+00   1.3117E+00
F7        Max    8.7895E+02   1.0459E+03   9.1355E+02   1.0029E+03   9.3907E+02
          Min    7.4203E+02   1.0238E+03   7.0013E+02   7.0081E+02   7.0069E+02
          Mean   7.4322E+02   1.0253E+03   7.1184E+02   7.0332E+02   7.0290E+02
          Var    4.3160E+00   2.4258E+00   2.8075E+01   1.5135E+01   1.3815E+01
F8        Max    8.9972E+02   9.1783E+02   9.6259E+02   9.2720E+02   9.3391E+02
          Min    8.4904E+02   8.8172E+02   8.3615E+02   8.1042E+02   8.0877E+02
          Mean   8.5087E+02   8.8406E+02   8.5213E+02   8.1888E+02   8.1676E+02
          Var    2.1797E+00   4.4494E+00   8.6249E+00   9.2595E+00   9.7968E+00
F9        Max    1.0082E+03   9.9538E+02   1.0239E+03   1.0146E+03   1.0366E+03
          Min    9.3598E+02   9.6805E+02   9.2725E+02   9.2062E+02   9.1537E+02
          Mean   9.3902E+02   9.7034E+02   9.4290E+02   9.2754E+02   9.2335E+02
          Var    3.4172E+00   3.2502E+00   8.1321E+00   9.0492E+00   9.3860E+00
F10       Max    3.1087E+03   3.5943E+03   3.9105E+03   3.9116E+03   3.4795E+03
          Min    2.2958E+03   2.8792E+03   1.5807E+03   1.4802E+03   1.4316E+03
          Mean   1.8668E+03   2.9172E+03   1.7627E+03   1.6336E+03   1.6111E+03
          Var    3.2703E+01   6.1787E+01   2.2359E+02   1.9535E+02   2.2195E+02
F11       Max    3.6641E+03   3.4628E+03   4.0593E+03   3.5263E+03   3.7357E+03
          Min    2.4726E+03   2.8210E+03   1.7855E+03   1.4012E+03   1.2811E+03
          Mean   2.5553E+03   2.8614E+03   1.8532E+03   1.5790E+03   1.4869E+03
          Var    8.4488E+01   5.6351E+01   1.3201E+02   2.1085E+02   2.6682E+02
F12       Max    1.2044E+03   1.2051E+03   1.2053E+03   1.2055E+03   1.2044E+03
          Min    1.2006E+03   1.2014E+03   1.2004E+03   1.2003E+03   1.2017E+03
          Mean   1.2007E+03   1.2017E+03   1.2007E+03   1.2003E+03   1.2018E+03
          Var    1.4482E-01   3.5583E-01   2.5873E-01   2.4643E-01   2.1603E-01
F13       Max    1.3049E+03   1.3072E+03   1.3073E+03   1.3056E+03   1.3061E+03
          Min    1.3005E+03   1.3067E+03   1.3009E+03   1.3003E+03   1.3004E+03
          Mean   1.3006E+03   1.3068E+03   1.3011E+03   1.3005E+03   1.3006E+03
          Var    3.2018E-01   5.0767E-02   4.2173E-01   3.6570E-01   2.9913E-01
F14       Max    1.4395E+03   1.4565E+03   1.4775E+03   1.4749E+03   1.4493E+03
          Min    1.4067E+03   1.4522E+03   1.4009E+03   1.4003E+03   1.4005E+03
          Mean   1.4079E+03   1.4529E+03   1.4024E+03   1.4008E+03   1.4009E+03
          Var    2.1699E+00   6.3013E-01   5.3198E+00   3.2578E+00   2.8527E+00
F15       Max    5.4068E+04   3.3586E+04   4.8370E+05   4.0007E+05   1.9050E+05
          Min    1.9611E+03   2.1347E+04   1.5029E+03   1.5027E+03   1.5026E+03
          Mean   1.9933E+03   2.2417E+04   2.7920E+03   1.5860E+03   1.5816E+03
          Var    6.1622E+02   1.2832E+03   1.7802E+04   4.4091E+03   2.7233E+03


Table 11: Comparison of numerical testing results on CEC2014 test sets (F16-F30, Dim = 10).

Function  Term   BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
F16       Max    1.6044E+03   1.6044E+03   1.6046E+03   1.6044E+03   1.6046E+03
          Min    1.6034E+03   1.6041E+03   1.6028E+03   1.6021E+03   1.6019E+03
          Mean   1.6034E+03   1.6041E+03   1.6032E+03   1.6024E+03   1.6024E+03
          Var    3.3300E-02   3.5900E-02   2.5510E-01   3.7540E-01   3.6883E-01
F17       Max    1.3711E+07   1.6481E+06   3.3071E+07   4.6525E+07   8.4770E+06
          Min    1.2526E+05   4.7216E+05   2.7261E+03   2.0177E+03   2.0097E+03
          Mean   1.6342E+05   4.8499E+05   8.7769E+04   3.1084E+04   2.1095E+04
          Var    2.0194E+05   2.9949E+04   1.2284E+06   8.7638E+05   2.0329E+05
F18       Max    7.7173E+07   3.5371E+07   6.1684E+08   1.4216E+08   6.6050E+08
          Min    4.1934E+03   2.6103E+06   2.2743E+03   1.8192E+03   1.8288E+03
          Mean   1.6509E+04   3.4523E+06   1.2227E+06   1.0781E+05   1.9475E+05
          Var    7.9108E+05   1.4888E+06   2.1334E+07   3.6818E+06   1.0139E+07
F19       Max    1.9851E+03   2.5657E+03   2.0875E+03   1.9872E+03   1.9555E+03
          Min    1.9292E+03   2.4816E+03   1.9027E+03   1.9023E+03   1.9028E+03
          Mean   1.9299E+03   2.4834E+03   1.9044E+03   1.9032E+03   1.9036E+03
          Var    1.0820E+00   3.8009E+00   1.1111E+01   3.3514E+00   1.4209E+00
F20       Max    8.3021E+06   2.0160E+07   1.0350E+07   6.1162E+07   1.2708E+08
          Min    5.6288E+03   1.7838E+06   2.1570E+03   2.0408E+03   2.0241E+03
          Mean   1.2260E+04   1.8138E+06   5.6957E+03   1.1337E+04   4.5834E+04
          Var    1.0918E+05   5.9134E+05   1.3819E+05   6.4167E+05   2.1988E+06
F21       Max    2.4495E+06   1.7278E+09   2.0322E+06   1.3473E+07   2.6897E+07
          Min    4.9016E+03   1.4049E+09   3.3699E+03   2.1842E+03   2.2314E+03
          Mean   6.6613E+03   1.4153E+09   9.9472E+03   1.3972E+04   7.4587E+03
          Var    4.1702E+04   2.2557E+07   3.3942E+04   2.5098E+05   2.8735E+05
F22       Max    2.8304E+03   4.9894E+03   3.1817E+03   3.1865E+03   3.2211E+03
          Min    2.5070E+03   4.1011E+03   2.3492E+03   2.2442E+03   2.2314E+03
          Mean   2.5081E+03   4.1175E+03   2.3694E+03   2.2962E+03   2.2687E+03
          Var    5.4064E+00   3.8524E+01   4.3029E+01   4.8006E+01   5.1234E+01
F23       Max    2.8890E+03   2.6834E+03   2.8758E+03   3.0065E+03   2.9923E+03
          Min    2.5000E+03   2.6031E+03   2.4870E+03   2.5000E+03   2.5000E+03
          Mean   2.5004E+03   2.6088E+03   2.5326E+03   2.5010E+03   2.5015E+03
          Var    9.9486E+00   8.6432E+00   5.7045E+01   1.4213E+01   1.7156E+01
F24       Max    2.6293E+03   2.6074E+03   2.6565E+03   2.6491E+03   2.6369E+03
          Min    2.5816E+03   2.6049E+03   2.5557E+03   2.5246E+03   2.5251E+03
          Mean   2.5829E+03   2.6052E+03   2.5671E+03   2.5337E+03   2.5338E+03
          Var    1.3640E+00   3.3490E-01   8.9434E+00   1.3050E+01   1.1715E+01
F25       Max    2.7133E+03   2.7014E+03   2.7445E+03   2.7122E+03   2.7327E+03
          Min    2.6996E+03   2.7003E+03   2.6784E+03   2.6789E+03   2.6635E+03
          Mean   2.6996E+03   2.7004E+03   2.6831E+03   2.6903E+03   2.6894E+03
          Var    2.3283E-01   9.9333E-02   4.9609E+00   7.1175E+00   1.2571E+01
F26       Max    2.7056E+03   2.8003E+03   2.7058E+03   2.7447E+03   2.7116E+03
          Min    2.7008E+03   2.8000E+03   2.7003E+03   2.7002E+03   2.7002E+03
          Mean   2.7010E+03   2.8000E+03   2.7005E+03   2.7003E+03   2.7003E+03
          Var    3.1316E-01   1.6500E-02   3.9700E-01   1.1168E+00   3.5003E-01
F27       Max    3.4052E+03   5.7614E+03   3.3229E+03   3.4188E+03   3.4144E+03
          Min    2.9000E+03   3.9113E+03   2.9698E+03   2.8347E+03   2.7054E+03
          Mean   2.9038E+03   4.0351E+03   2.9816E+03   2.8668E+03   2.7219E+03
          Var    3.2449E+01   2.0673E+02   3.4400E+01   6.2696E+01   6.1325E+01
F28       Max    4.4333E+03   5.4138E+03   4.5480E+03   4.3490E+03   4.8154E+03
          Min    3.0000E+03   4.0092E+03   3.4908E+03   3.0000E+03   3.0000E+03
          Mean   3.0079E+03   4.0606E+03   3.6004E+03   3.0063E+03   3.0065E+03
          Var    6.8101E+01   9.2507E+01   8.5705E+01   4.2080E+01   5.3483E+01
F29       Max    2.5038E+07   1.6181E+08   5.9096E+07   7.0928E+07   6.4392E+07
          Min    3.2066E+03   1.0014E+08   3.1755E+03   3.1005E+03   3.3287E+03
          Mean   5.7057E+04   1.0476E+08   8.5591E+04   6.8388E+04   2.8663E+04
          Var    4.9839E+05   6.7995E+06   1.7272E+06   1.5808E+06   8.6740E+05


Figure 3: The effect of the two parameters on the performance of RF models: (a) the effect of ntree (accuracy versus number of trees, 0 to 1000); (b) the effect of mtry (accuracy versus number of predictor variables, 0 to 35).
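The parameter study behind Figure 3 can be reproduced in outline with scikit-learn's RandomForestClassifier, sweeping the number of trees (ntree) and the number of attributes tried per split (mtry). Out-of-bag accuracy stands in for the paper's test accuracy, and the dataset here is synthetic rather than a logging dataset:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic stand-in for a logging dataset.
X, y = make_classification(n_samples=300, n_features=8, random_state=0)

scores = {}
for ntree in (50, 100, 200):          # number of decision trees
    for mtry in (2, 3, 4):            # attributes tried at each split
        rf = RandomForestClassifier(n_estimators=ntree, max_features=mtry,
                                    oob_score=True, random_state=0)
        scores[(ntree, mtry)] = rf.fit(X, y).oob_score_

best = max(scores, key=scores.get)    # (ntree, mtry) with best OOB accuracy
```

A full grid like this is what the metaheuristic search is meant to avoid: DMSDL-QBSA explores the same (ntree, mtry) space with far fewer model fits.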

Variable setting: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA.
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF.
(1) Begin
(2) /* Build classification model based on DMSDL-QBSA-RF */
(3) Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4) Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5) While iter < itermax + 1 do
(6)   For i = 1 to ntree
(7)     Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)     Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)     Recursively generate each decision tree without pruning operations
(10)  End For
(11)  Update the classification accuracy of RF; evaluate f(Xi)
(12)  Update gbest and Pgbest
(13)  [nbest, mbest] = gbest
(14)  iter = iter + 1
(15) End While
(16) /* Classify using the RF model */
(17) For i = 1 to nbest
(18)   Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)   Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)   Recursively generate each decision tree without pruning operations
(21) End For
(22) Return the DMSDL-QBSA-RF classification model
(23) Classify the test dataset using equation (20)
(24) Calculate OOB_error

(25) End

ALGORITHM 2: DMSDL-QBSA-RF classification model.
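The outer loop of Algorithm 2 can be sketched as follows. Plain random search stands in here for the DMSDL-QBSA position updates, the fitness is the forest's out-of-bag accuracy, and the dataset is synthetic; only the overall shape (propose candidate (ntree, mtry) positions, score each by fitness, keep the best, rebuild the final forest) mirrors the algorithm:

```python
import random
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

def fitness(ntree, mtry, X, y):
    """OOB accuracy of an RF built from a candidate (ntree, mtry) position."""
    rf = RandomForestClassifier(n_estimators=ntree, max_features=mtry,
                                oob_score=True, random_state=0)
    return rf.fit(X, y).oob_score_

X, y = make_classification(n_samples=200, n_features=10, random_state=1)
random.seed(1)

# Random candidate positions stand in for the swarm's position updates.
swarm = [(random.randint(20, 150), random.randint(1, 10)) for _ in range(5)]
nbest, mbest = max(swarm, key=lambda p: fitness(*p, X, y))

# Lines (17)-(22): rebuild the final forest from the best position found.
model = RandomForestClassifier(n_estimators=nbest, max_features=mbest,
                               oob_score=True, random_state=0).fit(X, y)
```

In the paper, the candidate positions are instead moved by the dynamic multi-swarm, differential-evolution, and quantum-behaved update rules of equations (16) and (17).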

Table 11: Continued.

Function  Term   BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
F30       Max    5.2623E+05   2.8922E+07   1.1938E+06   1.2245E+06   1.1393E+06
          Min    5.9886E+03   1.9017E+07   3.7874E+03   3.6290E+03   3.6416E+03
          Mean   7.4148E+03   2.0002E+07   5.5468E+03   4.3605E+03   4.1746E+03
          Var    1.1434E+04   1.1968E+06   3.2255E+04   2.2987E+04   1.8202E+04


[Block diagram: data selection and preprocessing (remove redundant attributes and pretreatment) -> attribute discretization -> attribute reduction and data normalization -> build model based on DMSDL-QBSA-RF -> RF classification of the information to be identified -> output.]

Figure 4: Block diagram of the oil layer classification system based on DMSDL-QBSA-RF.

Table 13: Distribution of the logging data.

          Training dataset                                  Test dataset
Well      Depth (m)   Data  Oil/gas layers  Dry layers      Depth (m)   Data  Oil/gas layers  Dry layers
W1        3027-3058   250   203             47              3250-3450   1600  237             1363
W2        2642-2648   30    10              20              2940-2978   191   99              92
W3        1040-1059   160   47              113             1120-1290   1114  96              1018

Table 12: Comparison of numerical testing results on eight UCI datasets.

Dataset        RF (%)  BSA-RF (%)  DMSDL-BSA-RF (%)  DMSDL-QBSA-RF (%)
Blood          75.40   80.80       79.46             81.25
Heart-statlog  82.10   86.42       86.42             86.42
Sonar          78.39   85.48       85.48             88.71
Appendicitis   83.23   87.10       90.32             93.55
Cleve          79.55   87.50       88.64             89.77
Magic          88.95   89.85       90.25             89.32
Mammographic   74.73   77.68       76.79             79.46
Australian     85.83   88.84       88.84             90.78

Table 14: Attribute reduction results of the logging dataset.

Well  Attributes
W1    Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
      Reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2    Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
      Reduction attributes: NEUTRO, PERM, POR, RESI, SW
W3    Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
      Reduction attributes: AC, GR, RI, RXO, SP

Figure 5: The normalized curves of attributes: (a) and (b) attribute normalization of W1; (c) and (d) attribute normalization of W2; (e) and (f) attribute normalization of W3.

Table 15: Performance of various well data.

Well  Classification model  RMSE    MAE       Accuracy (%)  Running time (s)
W1    RF                    0.3326  0.1106    88.94         1.5167
      SVM                   0.2681  0.0719    92.81         4.5861
      BSA-RF                0.3269  0.1069    89.31         1.8064
      DMSDL-BSA-RF          0.2806  0.0788    92.13         2.3728
      DMSDL-QBSA-RF         0.2449  0.0600    94.00         1.4663
W2    RF                    0.4219  0.1780    82.20         3.1579
      SVM                   0.2983  0.0890    91.10         4.2604
      BSA-RF                0.3963  0.1571    84.29         1.2979
      DMSDL-BSA-RF          0.2506  0.062827  93.72         1.6124
      DMSDL-QBSA-RF         0.2399  0.0576    94.24         1.2287
W3    RF                    0.4028  0.1622    83.78         2.4971
      SVM                   0.2507  0.0628    93.72         2.1027
      BSA-RF                0.3631  0.1318    86.81         1.3791
      DMSDL-BSA-RF          0.2341  0.0548    94.52         0.3125
      DMSDL-QBSA-RF         0.0519  0.0027    99.73         0.9513



5. Conclusion

This paper presents an improved BSA, called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of these three strategies on the performance of DMSDL-QBSA reveals that the hybrid method markedly improves the original BSA. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and DMSDL-QBSA also shows excellent performance there. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem have proved that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than that of the original RF model. At the same time, its running speed is faster than that of the other four classification models on most wells.

Although the proposed DMSDL-QBSA has been proven to be effective in solving general optimization problems, it has some shortcomings that warrant further investigation. In particular, due to the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA, so improving its recognition efficiency is a worthwhile direction. In future research work, the method presented in this paper can also be extended to solving discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical science diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and Project of Industry-University Cooperative Education of Ministry of Education of China (no. 201801335014).

Figure 6: Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A P Engelbrecht Foundations of Computational SwarmIntelligence Beijing Tsinghua University Press BeijingChina 2019

[2] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquo inProceedings of the IEEE International Conference NeuralNetworks Perth Australia July 1995

[3] D Karaboga ldquoAn idea based on honey bee swarm for nu-merical optimizationrdquo Technical report-TR06 Erciyes Uni-versity Kayseri Turkey 2005

[4] X Li A new intelligent optimization method-artificial fishschool algorithm Doctoral thesis Zhejiang UniversityHangzhou China 2003

[5] X S Yang ldquoFirefly algorithms for multimodal optimizationstochastic algorithms foundations and applicationsrdquo inProceedings of the 5th International Symposium on StochasticAlgorithms SAGA Berlin Germany 2009

[6] Y Sharafi M A Khanesar and M Teshnehlab ldquoDiscretebinary cat swarm optimizationrdquo in Proceedings of the IEEEInternational Conference on Computer Control and Com-munication (IC4) Karachi Pakistan September 2013

[7] X B Meng X Z Gao L Lu Y Liu and H Z Zhang ldquoA newbio-inspired optimisation algorithm bird swarm algorithmrdquoJournal of Experimental amp eoretical Artificial Intelligencevol 28 no 4 pp 673ndash687 2016

[8] N Jatana and B Suri ldquoParticle swarm and genetic algorithmapplied to mutation testing for test data generation a com-parative evaluationrdquo Journal of King Saud University-Com-puter and Information Sciences vol 32 pp 514ndash521 2020

[9] H Chung and K Shin ldquoGenetic algorithm-optimized multi-channel convolutional neural network for stock marketpredictionrdquo in Neural Computing and Applications SpringerNew York NY USA 2019

[10] I Strumberger E Tuba M Zivkovic N Bacanin M Bekoand M Tuba ldquoDesigning convolutional neural network ar-chitecture by the firefly algorithmrdquo in Proceedings of the 2019IEEE International Young Engineers Forum (YEF-ECE) Costada Caparica Portugal May 2019

[11] I Strumberger N Bacanin M Tuba and E Tuba ldquoResourcescheduling in cloud computing based on a hybridized whaleoptimization algorithmrdquo Applied Sciences vol 9 no 22p 4893 2019

[12] M Tuba and N Bacanin ldquoImproved seeker optimizationalgorithm hybridized with firefly algorithm for constrainedoptimization problemsrdquo Neurocomputing vol 143 pp 197ndash207 2014

[13] I Strumberger M Minovic M Tuba and N Bacanin ldquoPer-formance of elephant herding optimization and tree growthalgorithm adapted for node localization in wireless sensornetworksrdquo Sensors vol 19 no 11 p 2515 2019

[14] X S Yang ldquoSwarm intelligence based algorithms a criticalanalysisrdquo Evolutionary Intelligence vol 7 no 1 pp 17ndash282014

[15] N Bacanin and M Tuba ldquoArtificial bee colony (ABC) al-gorithm for constrained optimization improved with geneticoperatorsrdquo Studies in Informatics and Control vol 21 no 2pp 137ndash146 2012

[16] J Liu H Peng Z Wu J Chen and C Deng Multi-StrategyBrainstormOptimization Algorithmwith Dynamic ParametersAdjustment Applied Intelligence Guildford UK 2010

[17] H Peng Y He and C Deng ldquoFirefly algorithm with lucif-erase inhibition mechanismrdquo IEEE Access vol 7pp 120189ndash120201 2019

[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.

[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.

[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.

[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.

[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.

[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.

[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.

[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.

[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," in Proceedings of the IEEE Transactions on Evolutionary Computation, Piscataway, NJ, USA, February 2019.

[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.

[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE-ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.

[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.

[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.

[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.

[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC 96), Nagoya, Japan, May 1996.

[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.

[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.

[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.

Computational Intelligence and Neuroscience 23

[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China–Information Sciences, vol. 62, no. 7, 2019.

[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.

[41] S. Shahrzad, M. Seyedali, and L. Andrew, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.

[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22–34, 2020.

[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.



improvement of swarm intelligence optimization, such as the coevolutionary algorithm [26], the framework of evolutionary algorithms [27], multiobjective particle swarm optimization [28], hybrid dynamic robust optimization [29], and the PSO algorithm [30, 31]. However, the PSO algorithm easily falls into local optima, and its generalization performance is not high. Consequently, motivated by these literature studies, we will establish a dynamic multi-swarm bird swarm algorithm (DMS-BSA) that improves the local search capability of foraging behavior.

In DMS-PSO, the whole population is divided into many small swarms, which are often regrouped by using various reorganization plans to exchange information. The velocity update strategy is

v_{ij}^{t+1} = \omega v_{ij}^{t} + c_2 r_2 (l_{ij}^{t} - x_{ij}^{t}),  (7)

where l_{ij} is the best historical position achieved within the local community of the i-th particle.

According to the characteristic of equation (1), we can see that the foraging behavior formula of BSA is similar to the particle velocity update formula of PSO. So, according to the characteristic of equation (7), we can get the improved foraging behavior formula as follows:

x_{ij}^{t+1} = x_{ij}^{t} + (GV_{ij}^{t} - x_{ij}^{t}) \times C \times rand(0, 1),  (8)

where GV is called the guiding vector. The dynamic multi-swarm method is used to improve the local search capability, while the guiding vector GV can enhance the global search capability of foraging behavior. Obviously, we need to build a good guiding vector.
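As an illustrative sketch (in Python, with hypothetical names; a vectorized reading of equation (8), not the authors' MATLAB code), the improved foraging update pulls each bird toward its guiding vector:

```python
import numpy as np

def foraging_update(x, gv, c=1.5, rng=None):
    """Improved foraging behavior of equation (8):
    x^{t+1} = x^t + (GV^t - x^t) * C * rand(0, 1), elementwise.

    x, gv : (n_birds, dim) arrays of positions and guiding vectors
    c     : the constant C of equation (8)
    """
    if rng is None:
        rng = np.random.default_rng()
    r = rng.random(x.shape)          # fresh rand(0, 1) per component
    return x + (gv - x) * c * r
```

When a bird already sits on its guiding vector, the update leaves it in place, which matches the attraction-only form of equation (8).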

2.2.2. The Guiding Vector Based on Differential Evolution. Differential evolution (DE) is a powerful evolutionary algorithm with three differential evolution operators for solving tough global optimization problems [32]. Besides, owing to its excellent global search capability, DE has received more and more attention from scholars in evolutionary computation, for example, through hybrid multiple crossover operations [33] and the proposed DE/neighbor/1 [34]. From these literature studies, we can see that DE has a good global search capability, so we will establish the guiding vector GV based on the differential evolution operators to improve the global search capability of foraging behavior. The detailed implementation of GV is presented as follows.

(1) Differential mutation. According to the characteristic of equation (8), the "DE/best/1", "DE/best/2", and "DE/current-to-best/1" mutation strategies are suitable. The experiments of the literature [31] showed that the "DE/best/1" mutation strategy is the most suitable in DMS-PSO, so we choose this mutation strategy in BSA. The "DE/lbest/1" mutation strategy can be written as follows:

DE/lbest/1: v_{ij}^{t} = l_{ij}^{t} + F \times (p_{r1,j}^{t} - p_{r2,j}^{t}).  (9)

Note that some components of the mutant vector v_{ij} may violate the predefined boundary constraints. In this case, boundary processing is used, which can be expressed as follows:

v_{ij}^{t} = x_{lb} if v_{ij}^{t} < x_{lb}, or x_{ub} if v_{ij}^{t} \ge x_{ub}.  (10)

(2) Crossover. After differential mutation, a binomial crossover operation exchanges some components of the mutant vector v_{ij} with the best previous position p_{ij} to generate the target vector u_{ij}. The process can be expressed as

u_{ij}^{t} = v_{ij}^{t} if rand \le CR or j = rand(i); otherwise, u_{ij}^{t} = p_{ij}^{t}.  (11)

(3) Selection. Because the purpose of BSA is to find the best fitness, a selection operation chooses the vector with the better fitness to enter the next generation, generating the selection operator, namely, the guiding vector GV. The process can be expressed as follows:

GV_{ij}^{t+1} = u_{ij}^{t} if fitness(u_{ij}^{t}) \le fitness(x_{ij}^{t}); otherwise, GV_{ij}^{t+1} = p_{ij}^{t}.  (12)

2.2.3. The Initialization of the Search Space Based on Quantum Behavior. Quantum behavior is a nonlinear and excellent superposition system. With its simple and effective characteristics and good performance in global optimization, it has been applied to optimize many algorithms, such as particle swarm optimization [35] and the pigeon-inspired optimization algorithm [36]. Consequently, according to these literature studies and its excellent global optimization performance, we use the quantum system to initialize the search space of the birds.

The quantum-behaved particle position can be written mathematically as follows:

x^{t+1} = p^{t} \pm (L^{t}/2) \ln(1/u),  (13)

p^{t} = c_1 \times pbest^{t} + (1 - c_1) \times gbest^{t},  (14)

L^{t} = 2\beta |P^{t} - x^{t}|.  (15)

According to the characteristics of equations (13)-(15), we can get the improved search space initialization formulas as follows:


x_i^{t} = lb + (ub - lb) \times rand,  (16)

x_i^{t+1} = x_i^{t} \pm \beta |mbest - x_i^{t}| \times \ln(1/u),  (17)

where \beta is a positive number called the contraction-expansion factor. Here, x_i^{t} is the position of the particle at the previous moment, and mbest is the average of the best previous positions of all the birds (Algorithm 1).
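A minimal sketch of this quantum-behaved initialization (Python; the exact placement of the contraction-expansion factor β in equation (17) and the clipping back into the search range are assumptions of this sketch):

```python
import numpy as np

def quantum_init(n, lb, ub, beta=0.75, rng=None):
    """Quantum-behaved search-space initialization, equations (16)-(17)."""
    if rng is None:
        rng = np.random.default_rng()
    lb = np.asarray(lb, dtype=float)
    ub = np.asarray(ub, dtype=float)
    x = lb + (ub - lb) * rng.random((n, lb.size))            # (16) uniform positions
    mbest = x.mean(axis=0)                                   # mean of the best positions
    u = rng.random(x.shape)                                  # u ~ U(0, 1)
    sign = np.where(rng.random(x.shape) < 0.5, 1.0, -1.0)    # the +/- branch of (17)
    x = x + sign * beta * np.abs(mbest - x) * np.log(1.0 / u)  # (17) quantum spread
    return np.clip(x, lb, ub)                                # keep birds in range
```

The log(1/u) tail lets some birds jump far from the uniform seed, which is what widens the initial coverage of the search space.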

2.2.4. Procedures of the DMSDL-QBSA. In Sections 2.2.1-2.2.3, in order to improve the local search capability and the global search capability of BSA, this paper has improved the BSA in three parts:

(1) In order to improve the local search capability of foraging behavior in BSA, we put forward equation (8) based on the dynamic multi-swarm method.

(2) In order to get the guiding vector that improves the global search capability of foraging behavior in BSA, we put forward equations (9), (11), and (12) based on differential evolution.

(3) In order to expand the initialization search space of the birds to improve the global search capability of BSA, we put forward equations (16) and (17) based on quantum behavior.

Finally, the steps of DMSDL-QBSA are shown in Algorithm 1.

2.3. Simulation Experiment and Analysis. This section presents the evaluation of DMSDL-QBSA using a series of experiments on benchmark functions and CEC2014 test functions. All experiments in this paper are implemented in MATLAB R2014b on Windows 7 (64-bit) with an Intel(R) Core(TM) i5-2450M CPU at 2.50 GHz and 4.00 GB RAM. To obtain fair results, all experiments were conducted under the same conditions. The population size is set to 30 in these algorithms, and each algorithm runs 30 times independently for each function.

2.3.1. Benchmark Functions and CEC2014 Test Functions. To investigate the effective and universal performance of DMSDL-QBSA compared with several hybrid algorithms and popular algorithms, 18 benchmark functions and the CEC2014 test functions are applied. In order to test the effectiveness of the proposed DMSDL-QBSA, 18 benchmark functions [37] are adopted, all of which have an optimal value of 0. The benchmark functions and their search ranges are shown in Table 1. In this test suite, f1-f9 are unimodal functions, which are usually used to investigate whether the proposed algorithm has good convergence performance. Then, f10-f18 are multimodal functions, which are used to test the global search capability of the proposed algorithm. The smaller the fitness value of a function, the better the algorithm performs. Furthermore, in order to verify

the comprehensive performance of DMSDL-QBSA in a more comprehensive manner, another 30 complex CEC2014 benchmarks are used. The CEC2014 benchmark functions are briefly described in Table 2.

2.3.2. Parameter Settings. In order to verify the effectiveness and generalization of the proposed DMSDL-QBSA, the improved DMSDL-QBSA is compared with several hybrid algorithms: BSA [7], DE [32], DMSDL-PSO [31], and DMSDL-BSA. Another 5 popular intelligence algorithms, namely, grey wolf optimizer (GWO) [38], whale optimization algorithm (WOA) [39], sine cosine algorithm (SCA) [40], grasshopper optimization algorithm (GOA) [41], and sparrow search algorithm (SSA) [42], are also used for comparison with DMSDL-QBSA. These state-of-the-art algorithms can be used to verify the performance of DMSDL-QBSA in a more comprehensive manner. For fair comparison, the population size of all algorithms is set to 30, and the other parameters of all algorithms are set according to their original papers. The parameter settings of the involved algorithms are shown in Table 3 in detail.

2.3.3. Comparison on Benchmark Functions with Hybrid Algorithms. According to Section 2.2, three hybrid strategies (the dynamic multi-swarm method, DE, and quantum behavior) have been combined with the basic BSA method. To investigate the effectiveness of DMSDL-QBSA compared with several hybrid algorithms, such as BSA, DE, DMSDL-PSO, and DMSDL-BSA, 18 benchmark functions are applied. Compared with DMSDL-QBSA, quantum behavior is not used in the dynamic multi-swarm differential learning bird swarm algorithm (DMSDL-BSA). The number of function evaluations (FEs) is 10000. We selected two different dimension sizes (Dim): Dim = 10 is the typical dimension for the benchmark functions, and Dim = 2 is used because RF has two parameters that need to be optimized, which means that the optimization function is 2-dimensional.

The fitness value curves of a run of several algorithms on eight different functions are shown in Figures 1 and 2, where the horizontal axis represents the number of iterations and the vertical axis represents the fitness value. We can clearly see the convergence speeds of the different algorithms. The maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) obtained by the benchmark algorithms are shown in Tables 4-7, where the best results are marked in bold. Tables 4 and 5 show the performance of the algorithms on unimodal functions when Dim = 10 and 2, and Tables 6 and 7 show the performance of the algorithms on multimodal functions when Dim = 10 and 2.

(1) Unimodal functions. From the numerical testing results on 8 unimodal functions in Table 4, we can see that DMSDL-QBSA can find the optimal solution for all unimodal functions and reaches the minimum value of 0 on f1, f2, f3, f7, and f8. Both DMSDL-QBSA and DMSDL-BSA can find the minimum value. However, DMSDL-QBSA has the best mean value and variance on each function. The main reason is that DMSDL-QBSA has better population diversity during the initialization period. In summary, DMSDL-QBSA has the best performance on unimodal functions compared to the other algorithms when Dim = 10. Obviously, DMSDL-QBSA has a relatively good convergence speed.

The evolution curves of these algorithms on four unimodal functions f1, f5, f6, and f9 are drawn in Figure 1. It can be seen from the figure that the curve of DMSDL-QBSA descends fastest, within a number of iterations far less than 10000. For f1, f5, and f9, DMSDL-QBSA has the fastest convergence speed compared with the other algorithms. However, the original BSA obtains the worst solution because it is trapped in a local optimum prematurely. For function f6, these algorithms did not find the value 0; however, the convergence speed of DMSDL-QBSA is significantly faster than that of the other algorithms in the early stage, and the solution eventually found is the best. Overall, owing to the enhanced diversity of the population, DMSDL-QBSA has a relatively excellent convergence speed when Dim = 2.

Algorithm 1: DMSDL-QBSA.

Variable setting: number of iterations iter; bird positions X_i; local optimum P_i; global optimal position gbest; global optimum P_gbest; fitness of the sub-swarm optimal position P_lbest.
Input: population size N; dimension size D; maximum number of function evaluations iter_max; time interval of each bird's flight behavior FQ; probability of foraging behavior P; constant parameters c1, c2, a1, a2, FL; iter = 0; contraction-expansion factor β.
Output: global optimal position gbest; fitness of the global optimal position P_gbest.

(1) Begin
(2)   Initialize the positions of N birds X_i (i = 1, 2, ..., N) using equations (16)-(17)
(3)   Calculate fitness F_i (i = 1, 2, ..., N), set X_i to be P_i, and find P_gbest
(4)   While iter < iter_max + 1 do
(5)     /* Dynamic sub-swarms */
(6)     Regroup P_i and X_i of the sub-swarms randomly
(7)     Sort the P_lbest and refine the first [0.25 * N] best P_lbest
(8)     Update the corresponding P_i
(9)     /* Differential learning scheme */
(10)    For i = 1 to N
(11)      Construct lbest using DMS-BSA
(12)      Differential evolution: construct V using equations (9)-(10)
(13)      Crossover: construct U using equation (11)
(14)      Selection: construct GV using equation (12)
(15)    End For
(16)    /* Bird position adjusting */
(17)    If (t mod FQ ≠ 0)
(18)      For i = 1 to N
(19)        If rand < P
(20)          Foraging behavior: update the position X_i of the bird using equation (8)
(21)        Else
(22)          Vigilance behavior: update the position X_i of the bird using equation (2)
(23)        End If
(24)      End For
(25)    Else
(26)      Flight behavior: divide the birds into producers and scroungers
(27)      For i = 1 to N
(28)        If X_i is a producer
(29)          Producer: update the position X_i of the bird using equation (5)
(30)        Else
(31)          Scrounger: update the position X_i of the bird using equation (6)
(32)        End If
(33)      End For
(34)    End If
(35)    Evaluate f(X_i)
(36)    Update gbest and P_gbest
(37)    iter = iter + 1
(38)  End While
(39) End
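Algorithm 1's overall flow can be sketched as follows. This is a drastically simplified, hypothetical Python sketch, not the authors' MATLAB implementation: only the dynamic regrouping and the DE-guided foraging branch (equations (8)-(12)) are modeled, while the vigilance and flight behaviors (equations (2), (5), (6)) and the quantum initialization are omitted for brevity.

```python
import numpy as np

def dmsdl_qbsa_sketch(fitness, dim, lb, ub, n=12, iters=100,
                      sub_size=4, regroup_every=5, c=1.5, f=0.5, cr=0.1,
                      seed=0):
    """Simplified sketch of the DMSDL-QBSA main loop (Algorithm 1)."""
    rng = np.random.default_rng(seed)
    x = lb + (ub - lb) * rng.random((n, dim))        # plain uniform init
    fit = np.array([fitness(row) for row in x])
    p, pfit = x.copy(), fit.copy()                   # best previous positions
    order = rng.permutation(n)
    for t in range(iters):
        if t % regroup_every == 0:                   # dynamic sub-swarm regrouping
            order = rng.permutation(n)
        for s in range(0, n, sub_size):
            idx = order[s:s + sub_size]
            lbest = p[idx[np.argmin(pfit[idx])]]     # sub-swarm best
            for i in idx:
                # DE/lbest/1 mutation (9) + boundary handling (10)
                r1, r2 = rng.choice(np.delete(np.arange(n), i), 2, replace=False)
                v = np.clip(lbest + f * (p[r1] - p[r2]), lb, ub)
                # binomial crossover (11)
                jr = rng.integers(dim)
                mask = rng.random(dim) <= cr
                mask[jr] = True
                u = np.where(mask, v, p[i])
                # greedy selection (12) yields the guiding vector
                gv = u if fitness(u) <= fit[i] else p[i]
                # improved foraging move (8)
                x[i] = np.clip(x[i] + (gv - x[i]) * c * rng.random(dim), lb, ub)
                fit[i] = fitness(x[i])
                if fit[i] < pfit[i]:
                    p[i], pfit[i] = x[i].copy(), fit[i]
    g = int(np.argmin(pfit))
    return p[g], pfit[g]
```

Even this stripped-down loop converges quickly on a smooth unimodal function such as the Sphere function of Table 1, which illustrates why the guiding vector dominates the exploitation phase.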


According to the results of Table 5, DMSDL-QBSA achieves the best performance on these 8 unimodal functions when Dim = 2. DMSDL-QBSA finds the minimum value of 0 on f1, f2, f3, f4, f5, f7, f8, and f9. DMSDL-QBSA performs better on f2, f4, and f9 compared with DMSDL-BSA and DMSDL-PSO. DE does not perform well on these functions, but BSA performs relatively well on f1, f3, f6, f7, f8, and f9. The main reason is that BSA has better convergence performance in the early search. Obviously, DMSDL-QBSA can find the best two parameters for RF that need to be optimized.

(2) Multimodal functions. From the numerical testing results on 8 multimodal functions in Table 6, we can see that DMSDL-QBSA can find the optimal solution for all multimodal functions and reaches the minimum value of 0 on f11, f13, and f16. DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, and f16. BSA works best on f10, while DMSDL-PSO does not perform very well. DMSDL-QBSA has the best mean value and variance on most functions. The main reason is that DMSDL-QBSA has a stronger global exploration capability based on the dynamic multi-swarm method and differential evolution. In summary, DMSDL-QBSA performs relatively well on multimodal functions compared to the other algorithms when the typical Dim = 10. Obviously, DMSDL-QBSA has relatively good global search capability.

The evolution curves of these algorithms on four multimodal functions f12, f13, f17, and f18 when Dim = 2 are depicted in Figure 2. We can see that DMSDL-QBSA can find the optimal solution within the same number of iterations. For f13 and f17, DMSDL-QBSA continues to decline, whereas the original BSA and DE produce flat lines because of their poor global convergence ability. For functions f12 and f18, although DMSDL-QBSA is also trapped in a local optimum, it finds the minimum value compared to the other algorithms. Obviously, the convergence speed of DMSDL-QBSA is significantly faster than that of the other algorithms in the early stage, and the solution eventually found is the best. In general, owing to the enhanced diversity of the population, DMSDL-QBSA has a relatively balanced global search capability when Dim = 2.

Furthermore, from the numerical testing results on nine multimodal functions in Table 7, we can see that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f16, f17, and f18. DMSDL-QBSA reaches the minimum value of 0 on f11, f13, f16, and f17, and BSA reaches the minimum value of 0 on f11, f13, and f17. DE also does not reach the minimum value of 0

Table 1: 18 benchmark functions.

Name | Test function | Range
Sphere | f_1(x) = \sum_{i=1}^{D} x_i^2 | [-100, 100]^D
Schwefel P2.22 | f_2(x) = \sum_{i=1}^{D} |x_i| + \prod_{i=1}^{D} |x_i| | [-10, 10]^D
Schwefel P1.2 | f_3(x) = \sum_{i=1}^{D} (\sum_{j=1}^{i} x_j)^2 | [-100, 100]^D
Generalized Rosenbrock | f_4(x) = \sum_{i=1}^{D-1} [100 (x_i^2 - x_{i+1})^2 + (x_i - 1)^2] | [-100, 100]^D
Step | f_5(x) = \sum_{i=1}^{D} (|x_i + 0.5|)^2 | [-100, 100]^D
Noise | f_6(x) = \sum_{i=1}^{D} i x_i^4 + rand[0, 1) | [-1.28, 1.28]^D
SumSquares | f_7(x) = \sum_{i=1}^{D} i x_i^2 | [-10, 10]^D
Zakharov | f_8(x) = \sum_{i=1}^{D} x_i^2 + (\sum_{i=1}^{D} 0.5 i x_i)^2 + (\sum_{i=1}^{D} 0.5 i x_i)^4 | [-10, 10]^D
Schaffer | f_9(x, y) = 0.5 + (\sin^2(x^2 - y^2) - 0.5) / [1 + 0.001 (x^2 + y^2)]^2 | [-10, 10]^D
Generalized Schwefel 2.26 | f_10(x) = 418.9829 D - \sum_{i=1}^{D} x_i \sin(\sqrt{|x_i|}) | [-500, 500]^D
Generalized Rastrigin | f_11(x) = \sum_{i=1}^{D} [x_i^2 - 10 \cos(2\pi x_i) + 10] | [-5.12, 5.12]^D
Ackley | f_12(x) = 20 + e - 20 \exp(-0.2 \sqrt{(1/D) \sum_{i=1}^{D} x_i^2}) - \exp((1/D) \sum_{i=1}^{D} \cos(2\pi x_i)) | [-32, 32]^D
Generalized Griewank | f_13(x) = \sum_{i=1}^{D} x_i^2 / 4000 - \prod_{i=1}^{D} \cos(x_i / \sqrt{i}) + 1 | [-600, 600]^D
Generalized Penalized 1 | f_14(x) = (\pi/D) {10 \sin^2(\pi y_1) + \sum_{i=1}^{D-1} (y_i - 1)^2 [1 + 10 \sin^2(\pi y_{i+1})] + (y_D - 1)^2} + \sum_{i=1}^{D} u(x_i, 10, 100, 4), with y_i = 1 + (1/4)(x_i + 1) and u(x_i, a, k, m) = k (x_i - a)^m if x_i > a; 0 if -a \le x_i \le a; k (-x_i - a)^m if x_i < -a | [-50, 50]^D
Generalized Penalized 2 | f_15(x) = 0.1 {\sin^2(3\pi x_1) + \sum_{i=1}^{D-1} (x_i - 1)^2 [1 + \sin^2(3\pi x_{i+1})] + (x_D - 1)^2 [1 + \sin^2(2\pi x_D)]} + \sum_{i=1}^{D} u(x_i, 5, 100, 4) | [-50, 50]^D
Alpine | f_16(x) = \sum_{i=1}^{D} |x_i \sin x_i + 0.1 x_i| | [-10, 10]^D
Booth | f_17(x) = (x_1 + 2 x_2 - 7)^2 + (2 x_1 + x_2 - 5)^2 | [-10, 10]^D
Levy | f_18(x) = \sin^2(\pi y_1) + \sum_{i=1}^{D-1} (y_i - 1)^2 [1 + 10 \sin^2(\pi y_i + 1)] + (y_D - 1)^2 [1 + \sin^2(2\pi y_D)], with y_i = 1 + (x_i - 1)/4 | [-10, 10]^D


on any function. DMSDL-BSA reaches the minimum value of 0 on f11 and f13. In summary, DMSDL-QBSA has superior global search capability on most multimodal functions when Dim = 2. Obviously, DMSDL-QBSA can find the best two parameters for RF that need to be optimized because of its excellent global search capability.

In this section, it can be seen from Figures 1 and 2 and Tables 4-7 that DMSDL-QBSA can obtain the best function values in most cases. This indicates that the hybrid strategies of BSA, the dynamic multi-swarm method, DE, and the quantum behavior operators lead the birds to move towards the best solutions, and DMSDL-QBSA has a good ability to search for the best two parameters for RF with higher accuracy and efficiency.

2.3.4. Comparison on Benchmark Functions with Popular Algorithms. To compare the timeliness and applicability of DMSDL-QBSA with several popular algorithms, such as GWO, WOA, SCA, GOA, and SSA, 18 benchmark functions are applied; GWO, WOA, GOA, and SSA are swarm intelligence algorithms. In this experiment, the dimension size of these functions is 10, and the number of function evaluations (FEs) is 100000. The maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) obtained by the different algorithms are shown in Tables 8 and 9, where the best results are marked in bold.

Table 2: Summary of the CEC'14 test functions. The search range is [-100, 100]^D, and F_i* = F_i(x*) is the optimal value.

Unimodal functions:
F1: Rotated High Conditioned Elliptic Function, F1* = 100
F2: Rotated Bent Cigar Function, F2* = 200
F3: Rotated Discus Function, F3* = 300

Simple multimodal functions:
F4: Shifted and Rotated Rosenbrock's Function, F4* = 400
F5: Shifted and Rotated Ackley's Function, F5* = 500
F6: Shifted and Rotated Weierstrass Function, F6* = 600
F7: Shifted and Rotated Griewank's Function, F7* = 700
F8: Shifted Rastrigin's Function, F8* = 800
F9: Shifted and Rotated Rastrigin's Function, F9* = 900
F10: Shifted Schwefel's Function, F10* = 1000
F11: Shifted and Rotated Schwefel's Function, F11* = 1100
F12: Shifted and Rotated Katsuura Function, F12* = 1200
F13: Shifted and Rotated HappyCat Function, F13* = 1300
F14: Shifted and Rotated HGBat Function, F14* = 1400
F15: Shifted and Rotated Expanded Griewank's plus Rosenbrock's Function, F15* = 1500
F16: Shifted and Rotated Expanded Scaffer's F6 Function, F16* = 1600

Hybrid functions:
F17: Hybrid Function 1 (N = 3), F17* = 1700
F18: Hybrid Function 2 (N = 3), F18* = 1800
F19: Hybrid Function 3 (N = 4), F19* = 1900
F20: Hybrid Function 4 (N = 4), F20* = 2000
F21: Hybrid Function 5 (N = 5), F21* = 2100
F22: Hybrid Function 6 (N = 5), F22* = 2200

Composition functions:
F23: Composition Function 1 (N = 5), F23* = 2300
F24: Composition Function 2 (N = 3), F24* = 2400
F25: Composition Function 3 (N = 3), F25* = 2500
F26: Composition Function 4 (N = 5), F26* = 2600
F27: Composition Function 5 (N = 5), F27* = 2700
F28: Composition Function 6 (N = 5), F28* = 2800
F29: Composition Function 7 (N = 3), F29* = 2900
F30: Composition Function 8 (N = 3), F30* = 3000

Table 3: Parameter settings.

Algorithm | Parameter settings
BSA | c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10
DE | F = 0.5, CR = 0.1
DMSDL-PSO | c1 = c2 = 1.49445, F = 0.5, CR = 0.1, sub-swarms = 10
DMSDL-BSA | c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarms = 10
DMSDL-QBSA | c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarms = 10


From the test results in Table 8, we can see that DMSDL-QBSA has the best performance on each unimodal function. GWO finds the value 0 on f1, f2, f3, f7, and f8; WOA obtains 0 on f1, f2, and f7; and SSA works best on f1 and f7. For the multimodal function evaluations, Table 9 shows that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, f16, and f18. SSA has the best performance on f10, GWO reaches the minimum on f11, and WOA and SCA obtain the optimal value on f11 and f13. Obviously, compared with these popular algorithms, DMSDL-QBSA is a competitive algorithm for solving these functions, and the swarm intelligence algorithms perform better than the other algorithms. The results of Tables 8 and 9 show that DMSDL-QBSA has the best performance on most of the tested benchmark functions.

2.3.5. Comparison on CEC2014 Test Functions with Hybrid Algorithms. To compare the comprehensive performance of the proposed DMSDL-QBSA with several hybrid algorithms, such as BSA, DE, DMSDL-PSO, and DMSDL-BSA, 30 CEC2014 test functions are applied. In this experiment, the dimension size (Dim) is set to 10, and the number of function evaluations (FEs) is 100000. Experimental comparisons including the maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) are given in Tables 10 and 11, where the best results are marked in bold.

Based on the mean values (Mean) on the CEC2014 test functions, DMSDL-QBSA has the best performance on F2, F3, F4, F6, F7, F8, F9, F10, F11, F15, F16, F17, F21, F26, F27, F29, and F30. DMSDL-BSA shows an advantage on F1, F12,

Figure 1: Fitness value curves of 5 hybrid algorithms (BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA) on (a) f1 (Sphere), (b) f5 (Step), (c) f6 (Noise), and (d) f9 (Schaffer) (Dim = 2); fitness value (log scale) versus iteration.


F13, F14, F18, F19, F20, F24, F25, and F28. According to the results, we can observe that DMSDL-QBSA finds the minimal value on 17 CEC2014 test functions, DMSDL-BSA obtains the minimum value on F1, F12, F13, F14, F18, F19, F24, and F30, and DMSDL-PSO obtains the minimum value on F4, F7, and F23. Owing to its enhanced exploitation capability, DMSDL-QBSA is better than DMSDL-BSA and DMSDL-PSO on most functions. From the results of the tests, it can be seen that DMSDL-QBSA performs better than BSA, DE, DMSDL-PSO, and DMSDL-BSA and obtains the optimal values. It can be concluded

that DMSDL-QBSA has better global search ability and better robustness on these test suites.

3. Optimize RF Classification Model Based on Improved BSA Algorithm

3.1. RF Classification Model. RF, as proposed by Breiman et al., is an ensemble learning model based on bagging and random subspace methods. The whole modeling process includes building decision trees and decision processes. The process of constructing decision trees is mainly composed of

Figure 2: Fitness value curves of 5 hybrid algorithms (BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA) on (a) f12 (Ackley), (b) f13 (Griewank), (c) f17 (Booth), and (d) f18 (Levy) (Dim = 2); fitness value (log scale) versus iteration.


ntree decision trees, each of which consists of nonleaf nodes and leaf nodes. A leaf node is a child node of a node branch. Suppose that the dataset has M attributes. When each leaf node of the decision tree needs to be segmented, mtry attributes are randomly selected from the M attributes as the reselected splitting variables of this node. This process can be defined as follows:

S_j = {P_i | i = 1, 2, ..., m_try},  (18)

where S_j is the splitting variable of the j-th leaf node of the decision tree and P_i is the probability that the mtry reselected attributes are selected as the splitting attribute of the node.

A nonleaf node is a parent node that classifies training data into its left or right child node. The function of the k-th decision tree is as follows:

h_k(c | x) = 0 if f_l(x_k) < f_l(x_k) + \tau, or 1 if f_l(x_k) \ge f_l(x_k) + \tau, for l = 1, 2, ..., n_tree,  (19)

where c = 0 or 1: the symbol 0 indicates that the k-th row of data is classified with a negative label, and the symbol 1 indicates that the k-th row of data is classified with a positive label. Here, f_l is the training function of the l-th decision tree based on the splitting variable S, and X_k is the k-th row of data in the dataset obtained by random sampling with replacement. The symbol τ is a positive constant used as the threshold of the training decision.

When the decision processes are trained, each row of data is input into a leaf node of each decision tree, and the average of the ntree decision tree classification results is used as the final classification result. This process can be written mathematically as follows:

c_k = l \times (1/n_tree) \sum_{k=1}^{n_tree} h_k(c | x),  (20)

where l is the number of decision trees that judged the k-th row of data as c.
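The averaging decision of equation (20) can be sketched as follows (Python, with hypothetical names; the 0.5 decision threshold is an assumption of this sketch):

```python
import numpy as np

def forest_vote(tree_preds, threshold=0.5):
    """Ensemble decision in the spirit of equation (20): average the 0/1
    outputs h_k(c|x) of the ntree trees for each sample and compare the
    average against a decision threshold.

    tree_preds : (ntree, n_samples) array of per-tree 0/1 decisions
    """
    avg = np.asarray(tree_preds, dtype=float).mean(axis=0)   # (1/ntree) * sum_k h_k
    labels = (avg >= threshold).astype(int)                  # final class per sample
    return labels, avg
```

For example, three trees voting [1, 1, 1], [0, 0, 1], and [1, 0, 0] on three samples yield averages 1, 1/3, and 1/3, so only the first sample receives the positive label.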

From the above principle, we can see that it is mainly necessary to determine the two parameters ntree and mtry in the RF modeling process. In order to verify the influence of these two parameters on the classification accuracy of the RF classification model, the Ionosphere dataset is used to test the influence of the two parameters on the performance of

Table 4: Comparison on nine unimodal functions with 5 hybrid algorithms (Dim = 10).

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
f1 | Max | 1.0317E+04 | 1.2466E+04 | 1.7232E+04 | 1.0874E+04 | 7.4436E+01
f1 | Min | 0 | 4.0214E+03 | 1.6580E-02 | 0 | 0
f1 | Mean | 6.6202E+00 | 4.6622E+03 | 6.0643E+01 | 6.3444E+00 | 3.3920E-02
f1 | Var | 2.0892E+02 | 8.2293E+02 | 7.3868E+02 | 1.9784E+02 | 1.2343E+00
f2 | Max | 3.3278E+02 | 1.8153E+02 | 8.0436E+01 | 4.9554E+02 | 2.9845E+01
f2 | Min | 4.9286E-182 | 1.5889E+01 | 1.0536E-01 | 3.0074E-182 | 0
f2 | Mean | 5.7340E-02 | 1.8349E+01 | 2.9779E+00 | 6.9700E-02 | 1.1220E-02
f2 | Var | 3.5768E+00 | 4.8296E+00 | 2.4966E+00 | 5.1243E+00 | 4.2864E-01
f3 | Max | 1.3078E+04 | 1.3949E+04 | 1.9382E+04 | 1.2899E+04 | 8.4935E+01
f3 | Min | 3.4873E-250 | 4.0327E+03 | 6.5860E-02 | 1.6352E-249 | 0
f3 | Mean | 7.6735E+00 | 4.6130E+03 | 8.6149E+01 | 7.5623E+00 | 3.3260E-02
f3 | Var | 2.4929E+02 | 8.4876E+02 | 8.1698E+02 | 2.4169E+02 | 1.2827E+00
f4 | Max | 2.5311E+09 | 2.3900E+09 | 4.8639E+09 | 3.7041E+09 | 3.5739E+08
f4 | Min | 5.2310E+00 | 2.7690E+08 | 8.4802E+00 | 5.0021E+00 | 8.9799E+00
f4 | Mean | 6.9192E+05 | 3.3334E+08 | 1.1841E+07 | 9.9162E+05 | 6.8518E+04
f4 | Var | 3.6005E+07 | 1.4428E+08 | 1.8149E+08 | 4.5261E+07 | 4.0563E+06
f5 | Max | 1.1619E+04 | 1.3773E+04 | 1.6188E+04 | 1.3194E+04 | 5.1960E+03
f5 | Min | 5.5043E-15 | 5.6109E+03 | 1.1894E-02 | 4.2090E-15 | 1.5157E+00
f5 | Mean | 5.9547E+00 | 6.3278E+03 | 5.2064E+01 | 6.5198E+00 | 3.5440E+00
f5 | Var | 2.0533E+02 | 9.6605E+02 | 6.2095E+02 | 2.2457E+02 | 7.8983E+01
f6 | Max | 3.2456E+00 | 7.3566E+00 | 8.9320E+00 | 2.8822E+00 | 1.4244E+00
f6 | Min | 1.3994E-04 | 1.2186E+00 | 2.2482E-03 | 8.2911E-05 | 1.0911E-05
f6 | Mean | 2.1509E-03 | 1.4021E+00 | 1.1982E-01 | 1.9200E-03 | 6.1476E-04
f6 | Var | 5.3780E-02 | 3.8482E-01 | 3.5554E-01 | 5.0940E-02 | 1.9880E-02
f7 | Max | 4.7215E+02 | 6.7534E+02 | 5.6753E+02 | 5.3090E+02 | 2.3468E+02
f7 | Min | 0 | 2.2001E+02 | 5.6300E-02 | 0 | 0
f7 | Mean | 2.4908E-01 | 2.3377E+02 | 9.2909E+00 | 3.0558E-01 | 9.4500E-02
f7 | Var | 8.5433E+00 | 3.3856E+01 | 2.2424E+01 | 1.0089E+01 | 3.5569E+00
f8 | Max | 3.2500E+02 | 2.4690E+02 | 2.7226E+02 | 2.8001E+02 | 1.7249E+02
f8 | Min | 1.4678E-239 | 8.3483E+01 | 5.9820E-02 | 8.9624E-239 | 0
f8 | Mean | 1.9072E-01 | 9.1050E+01 | 7.9923E+00 | 2.3232E-01 | 8.1580E-02
f8 | Var | 6.3211E+00 | 1.3811E+01 | 1.7349E+01 | 6.4400E+00 | 2.9531E+00


the RF model, as shown in Figure 3, where the horizontal axes represent ntree and mtry, respectively, and the vertical axis represents the accuracy of the RF classification model.

(1) Parameter analysis of ntree. When the number of predictor variables mtry is set to 6, the number of decision trees ntree is cyclically set from 0 to 1000 at intervals of 20, and the evolution of the RF classification model accuracy with ntree is shown in Figure 3(a). From the curve in Figure 3(a), we can see that the accuracy of RF gradually improves as the number of decision trees increases. However, when the number of decision trees ntree is greater than a certain value, the improvement of RF performance becomes gentle, without obvious gains, while the running time becomes longer.

(2) Parameter analysis of mtry

When the number of decision trees ntree is set to 500, the number of predictor variables mtry is cyclically set from 1 to 32. The limit of mtry is set to 32 because the Ionosphere dataset has 32 attributes. The resulting curve of RF classification accuracy versus mtry is shown in Figure 3(b). We can see that the classification performance of RF gradually improves as more splitting attributes are selected, but when mtry is greater than 9, the RF overfits and its accuracy begins to decrease. The main reason is that selecting too many split attributes causes a large number of decision trees to share the same splitting attributes, which reduces the diversity of the decision trees.
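The two parameter sweeps above can be sketched with scikit-learn, where n_estimators plays the role of ntree and max_features the role of mtry. This is an illustrative sketch, not the authors' implementation: a synthetic 32-attribute dataset stands in for Ionosphere, and smaller ranges are used for speed.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for the 32-attribute Ionosphere dataset.
X, y = make_classification(n_samples=300, n_features=32, random_state=0)

# Sweep ntree (n_estimators) with mtry (max_features) fixed at 6.
for ntree in range(20, 101, 20):
    rf = RandomForestClassifier(n_estimators=ntree, max_features=6, random_state=0)
    print(f"ntree={ntree:3d}  accuracy={cross_val_score(rf, X, y, cv=3).mean():.3f}")

# Sweep mtry (max_features) with ntree fixed.
for mtry in range(1, 33, 8):
    rf = RandomForestClassifier(n_estimators=100, max_features=mtry, random_state=0)
    print(f"mtry={mtry:2d}  accuracy={cross_val_score(rf, X, y, cv=3).mean():.3f}")
```

Accuracy typically plateaus as n_estimators grows, mirroring Figure 3(a), while pushing max_features toward the full attribute count reduces tree diversity, mirroring Figure 3(b).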

Table 5. Comparison on nine unimodal functions with hybrid algorithms (Dim = 2).

Function / Term   BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA

f1
  Max   1.8411E+02   3.4713E+02   2.9347E+02   1.6918E+02   1.6789E+00
  Min   0            5.7879E-01   0            0            3.2988E-238
  Mean  5.5890E-02   1.1150E+00   1.6095E-01   3.6580E-02   4.2590E-03
  Var   2.6628E+00   5.7218E+00   5.4561E+00   2.0867E+00   7.5858E-02

f2
  Max   2.2980E+00   2.1935E+00   3.2363E+00   3.1492E+00   1.1020E+00
  Min   5.5923E-266  8.2690E-02   2.9096E-242  3.4367E-241  0
  Mean  9.2769E-04   9.3960E-02   7.4900E-03   1.2045E-03   2.1565E-04
  Var   3.3310E-02   6.9080E-02   4.0100E-02   4.2190E-02   1.3130E-02

f3
  Max   1.0647E+02   1.3245E+02   3.6203E+02   2.3793E+02   1.3089E+02
  Min   0            5.9950E-01   0            0            0
  Mean  2.3040E-02   8.5959E-01   2.5020E-01   6.3560E-02   1.7170E-02
  Var   1.2892E+00   2.8747E+00   7.5569E+00   2.9203E+00   1.3518E+00

f4
  Max   1.7097E+02   6.1375E+01   6.8210E+01   5.9141E+01   1.4726E+01
  Min   1.6325E-21   4.0940E-02   8.2726E-13   3.4830E-25   0
  Mean  2.2480E-02   9.3940E-02   1.5730E-02   1.1020E-02   1.4308E-02
  Var   1.7987E+00   7.4859E-01   7.6015E-01   6.4984E-01   2.7598E-01

f5
  Max   1.5719E+02   2.2513E+02   3.3938E+02   1.8946E+02   8.7078E+01
  Min   0            7.0367E-01   0            0            0
  Mean  3.4380E-02   1.8850E+00   1.7082E-01   5.0090E-02   1.1880E-02
  Var   1.9018E+00   5.6163E+00   5.9868E+00   2.4994E+00   9.1749E-01

f6
  Max   1.5887E-01   1.5649E-01   1.5919E-01   1.3461E-01   1.0139E-01
  Min   2.5412E-05   4.5060E-04   5.9140E-05   4.1588E-05   7.3524E-06
  Mean  2.3437E-04   1.3328E-03   6.0989E-04   2.3462E-04   9.2394E-05
  Var   2.4301E-03   3.6700E-03   3.5200E-03   1.9117E-03   1.4664E-03

f7
  Max   3.5804E+00   2.8236E+00   1.7372E+00   2.7513E+00   1.9411E+00
  Min   0            7.6633E-03   0            0            0
  Mean  8.5474E-04   1.6590E-02   8.6701E-04   7.6781E-04   3.1439E-04
  Var   4.4630E-02   6.0390E-02   2.4090E-02   3.7520E-02   2.2333E-02

f8
  Max   4.3247E+00   2.1924E+00   5.3555E+00   3.3944E+00   5.5079E-01
  Min   0            8.6132E-03   0            0            0
  Mean  1.1649E-03   1.9330E-02   1.7145E-03   7.3418E-04   6.9138E-05
  Var   5.9280E-02   4.7800E-02   7.5810E-02   4.1414E-02   5.7489E-03

f9
  Max   2.7030E-02   3.5200E-02   1.7240E-02   4.0480E-02   2.5230E-02
  Min   0            5.0732E-03   0            0            0
  Mean  6.1701E-05   6.2500E-03   8.8947E-04   8.4870E-05   2.7362E-05
  Var   7.6990E-04   1.3062E-03   2.0400E-03   9.6160E-04   5.5610E-04


In summary, for the RF classification model to obtain the ideal optimal solution, the selection of the number of decision trees ntree and the number of predictor variables mtry is very important, and the classification accuracy of the RF model can only be maximized by jointly optimizing these two parameters. It is therefore necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we optimize the RF classification model with the improved BSA proposed in Section 2.

3.2. RF Model Based on an Improved Bird Swarm Algorithm. The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) uses the improved bird swarm algorithm to optimize the RF classification model; the training dataset is introduced into the training process of the RF model, finally yielding the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters, ntree and mtry, as the optimization target of DMSDL-QBSA, so as to obtain the parameter pair that gives the RF classification model the best classification accuracy. The specific algorithm steps are shown in Algorithm 2.
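The two-dimensional fitness function described here can be sketched as follows, with each bird position decoded into (ntree, mtry) and scored by the out-of-bag accuracy of the resulting forest. This is a minimal sketch assuming scikit-learn and a synthetic dataset; the DMSDL-QBSA search loop itself is omitted.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic training data standing in for a UCI dataset.
X, y = make_classification(n_samples=200, n_features=16, random_state=1)

def rf_fitness(position):
    """Fitness of one bird position = (ntree, mtry): OOB accuracy of the RF."""
    ntree = max(10, int(round(position[0])))                 # number of trees
    mtry = min(X.shape[1], max(1, int(round(position[1]))))  # attributes per split
    rf = RandomForestClassifier(n_estimators=ntree, max_features=mtry,
                                bootstrap=True, oob_score=True, random_state=1)
    rf.fit(X, y)
    return rf.oob_score_  # the optimizer maximizes this value

print(rf_fitness((100, 4)))
```

The optimizer's best position [nbest, mbest] is then used to train the final DMSDL-QBSA-RF model.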

3.3. Simulation Experiment and Analysis. In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare it with the standard RF model, the BSA-RF model, and the DMSDL-BSA-RF model on 8 two-dimensional UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by the improved BSA without quantum behavior. In our experiment, each dataset is divided into two parts: 70% of the dataset is used as the training set and the remaining 30% as the test set. The average classification accuracies of 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model achieves the best accuracy on every UCI dataset except the Magic dataset, on which the DMSDL-BSA-RF classification model performs best. Compared with the standard RF model, the accuracy of the DMSDL-QBSA-RF classification model is higher by about 10%. Finally, the

Table 6. Comparison on nine multimodal functions with hybrid algorithms (Dim = 10).

Function / Term   BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA

f10
  Max   2.8498E+03   2.8226E+03   3.0564E+03   2.7739E+03   2.8795E+03
  Min   1.2553E+02   1.8214E+03   1.2922E+03   1.6446E+02   1.1634E+03
  Mean  2.5861E+02   1.9229E+03   1.3185E+03   3.1119E+02   1.2729E+03
  Var   2.4093E+02   1.2066E+02   1.2663E+02   2.5060E+02   1.2998E+02

f11
  Max   1.2550E+02   1.0899E+02   1.1806E+02   1.1243E+02   9.1376E+01
  Min   0            6.3502E+01   1.0751E+01   0            0
  Mean  2.0417E-01   6.7394E+01   3.9864E+01   1.3732E-01   6.8060E-02
  Var   3.5886E+00   5.8621E+00   1.3570E+01   3.0325E+00   2.0567E+00

f12
  Max   2.0021E+01   1.9910E+01   1.9748E+01   1.9254E+01   1.8118E+01
  Min   8.8818E-16   1.6575E+01   7.1700E-02   8.8818E-16   8.8818E-16
  Mean  3.0500E-02   1.7157E+01   3.0367E+00   3.8520E-02   1.3420E-02
  Var   5.8820E-01   5.2968E-01   1.6585E+00   6.4822E-01   4.2888E-01

f13
  Max   1.0431E+02   1.3266E+02   1.5115E+02   1.2017E+02   6.1996E+01
  Min   0            4.5742E+01   2.1198E-01   0            0
  Mean  6.1050E-02   5.2056E+01   3.0613E+00   6.9340E-02   2.9700E-02
  Var   1.8258E+00   8.3141E+00   1.5058E+01   2.2452E+00   1.0425E+00

f14
  Max   8.4576E+06   3.0442E+07   5.3508E+07   6.2509E+07   8.5231E+06
  Min   1.7658E-13   1.9816E+06   4.5685E-05   1.6961E-13   5.1104E-01
  Mean  1.3266E+03   3.1857E+06   6.8165E+04   8.8667E+03   1.1326E+03
  Var   9.7405E+04   1.4876E+06   1.4622E+06   6.4328E+05   8.7645E+04

f15
  Max   1.8310E+08   1.4389E+08   1.8502E+08   1.4578E+08   2.5680E+07
  Min   1.7942E-11   1.0497E+07   2.4500E-03   1.1248E-11   9.9870E-01
  Mean  3.7089E+04   1.5974E+07   2.0226E+05   3.8852E+04   3.5739E+03
  Var   2.0633E+06   1.0724E+07   4.6539E+06   2.1133E+06   2.6488E+05

f16
  Max   1.3876E+01   1.4988E+01   1.4849E+01   1.3506E+01   9.3280E+00
  Min   4.2410E-174  6.8743E+00   2.5133E-02   7.3524E-176  0
  Mean  1.3633E-02   7.2408E+00   2.5045E+00   1.3900E-02   5.1800E-03
  Var   3.3567E-01   7.7774E-01   1.0219E+00   3.4678E-01   1.7952E-01

f18
  Max   3.6704E+01   3.6950E+01   2.8458E+01   2.6869E+01   2.4435E+01
  Min   2.0914E-11   9.7737E+00   3.3997E-03   5.9165E-12   7.5806E-01
  Mean  6.5733E-02   1.2351E+01   6.7478E-01   5.6520E-02   7.9392E-01
  Var   6.7543E-01   2.8057E+00   1.4666E+00   6.4874E-01   3.6928E-01


DMSDL-QBSA-RF classification model achieves its highest accuracy on the Appendicitis dataset, up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is effective on most datasets and performs well on them.

4. Oil Layer Classification Application

4.1. Design of Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1. The selected actual logging datasets are intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis and relatively independent. Each dataset is randomly divided into two parts: training samples and testing samples.

Step 2. In order to better understand the relationship between independent and dependent variables and to reduce the sample information attributes, the continuous attributes of the dataset are discretized using a greedy algorithm.

Step 3. In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize attribute reduction. After attribute reduction, the actual logging datasets are normalized to avoid computational saturation.

Step 4. The actual logging dataset after attribute reduction is input into the DMSDL-QBSA-RF layer classification algorithm for training, finally yielding the DMSDL-QBSA-RF layer classification model.
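The normalization in Step 3 can be sketched as a column-wise min-max scaling of the logging attributes to [0, 1]. This is a common choice and an assumption here; the paper does not state its exact normalization formula, and the GR/DEN readings below are hypothetical.

```python
import numpy as np

def min_max_normalize(data):
    """Scale each attribute (column) to [0, 1] to avoid computational saturation."""
    data = np.asarray(data, dtype=float)
    lo, hi = data.min(axis=0), data.max(axis=0)
    span = np.where(hi > lo, hi - lo, 1.0)  # guard against constant columns
    return (data - lo) / span

# Hypothetical GR and DEN readings at three depths.
logs = np.array([[120.0, 2.30],
                 [ 80.0, 2.60],
                 [100.0, 2.45]])
print(min_max_normalize(logs))
```

Each column is mapped so that its minimum becomes 0 and its maximum becomes 1; here the middle depth maps to 0.5 in both columns.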

Table 7. Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function / Term   BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA

f10
  Max   2.0101E+02   2.8292E+02   2.9899E+02   2.4244E+02   2.8533E+02
  Min   2.5455E-05   3.7717E+00   2.3690E+01   2.5455E-05   9.4751E+01
  Mean  3.4882E-01   8.0980E+00   2.5222E+01   4.2346E-01   9.4816E+01
  Var   5.7922E+00   1.0853E+01   1.5533E+01   5.6138E+00   2.8160E+00

f11
  Max   4.7662E+00   4.7784E+00   1.1067E+01   8.7792E+00   8.1665E+00
  Min   0            3.8174E-01   0            0            0
  Mean  2.7200E-03   6.0050E-01   3.1540E-02   4.2587E-03   3.7800E-03
  Var   8.6860E-02   3.1980E-01   2.4862E-01   1.2032E-01   1.3420E-01

f12
  Max   9.6893E+00   8.1811E+00   1.1635E+01   9.1576E+00   8.4720E+00
  Min   8.8818E-16   5.1646E-01   8.8818E-16   8.8818E-16   8.8818E-16
  Mean  9.5600E-03   6.9734E-01   3.4540E-02   9.9600E-03   2.8548E-03
  Var   2.1936E-01   6.1050E-01   2.5816E-01   2.1556E-01   1.1804E-01

f13
  Max   4.4609E+00   4.9215E+00   4.1160E+00   1.9020E+00   1.6875E+00
  Min   0            1.3718E-01   0            0            0
  Mean  1.9200E-03   1.7032E-01   1.8240E-02   1.4800E-03   5.7618E-04
  Var   6.6900E-02   1.3032E-01   1.8202E-01   3.3900E-02   2.2360E-02

f14
  Max   1.0045E+01   1.9266E+03   1.9212E+01   5.7939E+02   8.2650E+00
  Min   2.3558E-31   1.3188E-01   2.3558E-31   2.3558E-31   2.3558E-31
  Mean  4.1600E-03   3.5402E-01   1.0840E-02   6.1420E-02   1.3924E-03
  Var   1.7174E-01   1.9427E+01   3.9528E-01   5.8445E+00   8.7160E-02

f15
  Max   6.5797E+04   4.4041E+03   1.4412E+05   8.6107E+03   2.6372E+00
  Min   1.3498E-31   9.1580E-02   1.3498E-31   1.3498E-31   7.7800E-03
  Mean  7.1736E+00   8.9370E-01   1.7440E+01   9.0066E-01   8.2551E-03
  Var   6.7678E+02   5.4800E+01   1.4742E+03   8.7683E+01   2.8820E-02

f16
  Max   6.2468E-01   6.4488E-01   5.1564E-01   8.4452E-01   3.9560E-01
  Min   6.9981E-08   2.5000E-03   1.5518E-240  2.7655E-07   0
  Mean  2.7062E-04   6.9400E-03   6.8555E-04   2.0497E-04   6.1996E-05
  Var   1.0380E-02   1.7520E-02   8.4600E-03   1.0140E-02   4.4000E-03

f17
  Max   5.1946E+00   3.6014E+00   2.3463E+00   6.9106E+00   1.2521E+00
  Min   2.6445E-11   2.6739E-02   0            1.0855E-10   0
  Mean  1.9343E-03   5.1800E-02   1.2245E-03   2.8193E-03   1.5138E-04
  Var   7.3540E-02   1.2590E-01   4.1620E-02   1.1506E-01   1.2699E-02

f18
  Max   5.0214E-01   3.4034E-01   4.1400E-01   3.7422E-01   4.0295E-01
  Min   1.4998E-32   1.9167E-03   1.4998E-32   1.4998E-32   1.4998E-32
  Mean  1.0967E-04   4.1000E-03   1.8998E-04   1.4147E-04   6.0718E-05
  Var   6.1800E-03   1.0500E-02   6.5200E-03   5.7200E-03   4.4014E-03


Step 5. The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and the classification results are output.

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select actual logging datasets from three oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA was simulated and analyzed on benchmark functions, and in Section 3.3, the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA was tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted, recorded as W1, W2, and W3. W1 is a gas well in Xi'an (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depth ranges and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction is performed on the actual logging datasets before the DMSDL-QBSA-RF classification model is trained on the training dataset, as shown in Table 14. These attributes are then normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, the SVM model, the BSA-RF model, and the DMSDL-BSA-RF model. Here, the RF classification model is applied to the field of logging for the first time. In order to evaluate the performance of the recognition models, we select the following performance indicators:

\[
\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(f_i - y_i\right)^2},\qquad
\mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\left|f_i - y_i\right|, \tag{21}
\]

Table 8. Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function / Term   GWO          WOA          SCA          GOA          SSA          DMSDL-QBSA

f1
  Max   1.3396E+04   1.4767E+04   1.3310E+04   2.0099E+01   4.8745E+03   9.8570E-01
  Min   0            0            4.2905E-293  8.6468E-17   0            0
  Mean  3.5990E+00   4.7621E+00   1.4014E+02   7.0100E-02   4.5864E+00   1.0483E-04
  Var   1.7645E+02   2.0419E+02   8.5054E+02   4.4200E-01   1.4148E+02   9.8725E-03

f2
  Max   3.6021E+02   2.5789E+03   6.5027E+01   9.3479E+01   3.4359E+01   3.2313E-01
  Min   0            0            9.8354E-192  2.8954E-03   2.4642E-181  0
  Mean  5.0667E-02   2.9480E-01   4.0760E-01   3.1406E+00   1.7000E-02   1.1278E-04
  Var   3.7270E+00   2.6091E+01   2.2746E+00   3.9264E+00   5.4370E-01   4.8000E-03

f3
  Max   1.8041E+04   1.6789E+04   2.4921E+04   6.5697E+03   1.1382E+04   4.0855E+01
  Min   0            1.0581E-18   7.6116E-133  2.8796E-01   3.2956E-253  1.5918E-264
  Mean  7.4511E+00   2.8838E+02   4.0693E+02   4.8472E+02   9.2062E+00   4.8381E-03
  Var   2.6124E+02   1.4642E+03   1.5913E+03   7.1786E+02   2.9107E+02   4.1383E-01

f4
  Max   2.1812E+09   5.4706E+09   8.4019E+09   1.1942E+09   4.9386E+08   7.5188E+01
  Min   4.9125E+00   3.5695E+00   5.9559E+00   2.2249E+02   4.9806E+00   2.9279E-13
  Mean  4.9592E+05   2.4802E+06   4.4489E+07   5.0021E+06   3.3374E+05   2.4033E-02
  Var   2.9484E+07   1.1616E+08   4.5682E+08   3.9698E+07   1.1952E+07   9.7253E-01

f5
  Max   1.8222E+04   1.5374E+04   1.5874E+04   1.2132E+03   1.6361E+04   1.8007E+00
  Min   1.1334E-08   8.3228E-09   2.3971E-01   2.7566E-10   2.6159E-16   1.0272E-33
  Mean  5.1332E+00   5.9967E+00   1.2620E+02   5.8321E+01   8.8985E+00   2.3963E-04
  Var   2.3617E+02   2.3285E+02   8.8155E+02   1.0872E+02   2.9986E+02   1.8500E-02

f6
  Max   7.4088E+00   8.3047E+00   8.8101E+00   6.8900E-01   4.4298E+00   2.5787E-01
  Min   1.8112E-05   3.9349E-05   4.8350E-05   8.9528E-02   4.0807E-05   1.0734E-04
  Mean  1.8333E-03   3.2667E-03   4.8400E-02   9.3300E-02   2.1000E-03   5.2825E-04
  Var   9.4267E-02   1.0077E-01   2.9410E-01   3.5900E-02   6.5500E-02   4.6000E-03

f7
  Max   7.3626E+02   6.8488E+02   8.0796E+02   3.9241E+02   8.2036E+02   1.4770E+01
  Min   0            0            1.9441E-292  7.9956E-07   0            0
  Mean  2.0490E-01   2.8060E-01   4.9889E+00   1.6572E+01   2.7290E-01   1.8081E-03
  Var   9.5155E+00   1.0152E+01   3.5531E+01   2.3058E+01   1.0581E+01   1.6033E-01

f8
  Max   1.2749E+03   5.9740E+02   3.2527E+02   2.3425E+02   2.0300E+02   2.0423E+01
  Min   0            4.3596E-35   1.5241E-160  3.6588E-05   1.0239E-244  0
  Mean  3.1317E-01   1.0582E+01   1.0457E+01   1.2497E+01   2.1870E-01   2.6947E-03
  Var   1.4416E+01   4.3485E+01   3.5021E+01   2.5766E+01   6.2362E+00   2.1290E-01


where yi and fi are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show the actual forecasting errors. Table 15 records the performance indicator data of each classification model, with the best results marked in bold. The smaller the RMSE and MAE, the better the classification model performs.
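The two indicators in equation (21) can be computed directly; a small sketch with hypothetical 0/1 layer labels:

```python
import math

def rmse(pred, expected):
    """Root-mean-square error between model output f_i and expected y_i."""
    return math.sqrt(sum((f - y) ** 2 for f, y in zip(pred, expected)) / len(pred))

def mae(pred, expected):
    """Mean absolute error between model output f_i and expected y_i."""
    return sum(abs(f - y) for f, y in zip(pred, expected)) / len(pred)

y_true = [0, 1, 1, 0, 1]  # expected layer labels
y_hat  = [0, 1, 0, 0, 1]  # model output (one miss out of five)
print(rmse(y_hat, y_true))  # sqrt(1/5), approximately 0.4472
print(mae(y_hat, y_true))   # 1/5 = 0.2
```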

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model obtains the best recognition accuracy, with all accuracies above 90%. Its recognition accuracy for W3 reaches 99.73%, and it also performs best for oil and gas layer classification on the other performance indicators and wells. Secondly, DMSDL-QBSA can improve the performance of RF: the parameters found by DMSDL-QBSA improve the classification accuracy of the RF model while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds shorter than those of the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means the depth has no oil or gas, and 1 means the depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results; it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is therefore suitable for petroleum logging applications, greatly reduces the difficulty of oil exploration, and has good application prospects.

Table 9. Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function / Term   GWO          WOA          SCA          GOA          SSA          DMSDL-QBSA

f10
  Max   2.9544E+03   2.9903E+03   2.7629E+03   2.9445E+03   3.2180E+03   3.0032E+03
  Min   1.3037E+03   8.6892E-04   1.5713E+03   1.3438E+03   1.2839E-04   1.2922E+03
  Mean  1.6053E+03   1.4339E+01   1.7860E+03   1.8562E+03   2.5055E+02   1.2960E+03
  Var   2.6594E+02   1.2243E+02   1.6564E+02   5.1605E+02   3.3099E+02   5.8691E+01

f11
  Max   1.3792E+02   1.2293E+02   1.2313E+02   1.1249E+02   3.2180E+03   1.8455E+01
  Min   0            0            0            1.2437E+01   1.2839E-04   0
  Mean  4.4220E-01   1.1252E+00   9.9316E+00   2.8378E+01   2.5055E+02   3.2676E-03
  Var   4.3784E+00   6.5162E+00   1.9180E+01   1.6240E+01   3.3099E+02   1.9867E-01

f12
  Max   2.0257E+01   2.0043E+01   1.9440E+01   1.6623E+01   3.2180E+03   1.9113E+00
  Min   4.4409E-15   3.2567E-15   3.2567E-15   2.3168E+00   1.2839E-04   8.8818E-16
  Mean  1.7200E-02   4.2200E-02   8.8870E-01   5.5339E+00   2.5055E+02   3.1275E-04
  Var   4.7080E-01   6.4937E-01   3.0887E+00   2.8866E+00   3.3099E+02   2.2433E-02

f13
  Max   1.5246E+02   1.6106E+02   1.1187E+02   6.1505E+01   3.2180E+03   3.1560E-01
  Min   3.3000E-03   0            0            2.4147E-01   1.2839E-04   0
  Mean  4.6733E-02   9.3867E-02   1.2094E+00   3.7540E+00   2.5055E+02   3.3660E-05
  Var   1.9297E+00   2.6570E+00   6.8476E+00   4.1936E+00   3.3099E+02   3.1721E-03

f14
  Max   9.5993E+07   9.9026E+07   5.9355E+07   6.1674E+06   3.2180E+03   6.4903E-01
  Min   3.8394E-09   1.1749E-08   9.6787E-03   1.8099E-04   1.2839E-04   4.7116E-32
  Mean  1.2033E+04   3.5007E+04   4.8303E+05   1.0465E+04   2.5055E+02   8.9321E-05
  Var   9.8272E+05   1.5889E+06   4.0068E+06   1.9887E+05   3.3099E+02   6.8667E-03

f15
  Max   2.2691E+08   2.4717E+08   1.1346E+08   2.8101E+07   3.2180E+03   1.6407E-01
  Min   3.2467E-02   4.5345E-08   1.1922E-01   3.5465E-05   1.2839E-04   1.3498E-32
  Mean  2.9011E+04   4.3873E+04   6.5529E+05   7.2504E+04   2.5055E+02   6.7357E-05
  Var   2.3526E+06   2.7453E+06   7.1864E+06   1.2814E+06   3.3099E+02   2.4333E-03

f16
  Max   1.7692E+01   1.7142E+01   1.6087E+01   8.7570E+00   3.2180E+03   1.0959E+00
  Min   2.6210E-07   0.0000E+00   6.2663E-155  1.0497E-02   1.2839E-04   0
  Mean  1.0133E-02   3.9073E-01   5.9003E-01   2.4770E+00   2.5055E+02   1.5200E-04
  Var   3.0110E-01   9.6267E-01   1.4701E+00   1.9985E+00   3.3099E+02   1.1633E-02

f18
  Max   4.4776E+01   4.3588E+01   3.9095E+01   1.7041E+01   3.2180E+03   6.5613E-01
  Min   1.9360E-01   9.4058E-08   2.2666E-01   3.9111E+00   1.2839E-04   1.4998E-32
  Mean  2.1563E-01   4.7800E-02   9.8357E-01   5.4021E+00   2.5055E+02   9.4518E-05
  Var   5.8130E-01   1.0434E+00   3.0643E+00   1.6674E+00   3.3099E+02   7.0251E-03


Table 10. Comparison of numerical testing results on CEC2014 test sets (F1-F15, Dim = 10).

Function / Term   BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA

F1
  Max   2.7411E+08   3.6316E+09   6.8993E+08   3.9664E+08   9.6209E+08
  Min   1.4794E+07   3.1738E+09   1.3205E+06   1.6929E+05   1.7687E+05
  Mean  3.1107E+07   3.2020E+09   6.7081E+06   1.3567E+06   1.9320E+06
  Var   1.2900E+07   4.0848E+07   2.6376E+07   1.0236E+07   1.3093E+07

F2
  Max   8.7763E+09   1.3597E+10   1.3515E+10   1.6907E+10   1.7326E+10
  Min   1.7455E+09   1.1878E+10   3.6982E+04   2.1268E+05   1.4074E+05
  Mean  1.9206E+09   1.1951E+10   1.7449E+08   5.9354E+07   4.8412E+07
  Var   1.9900E+08   1.5615E+08   9.2365E+08   5.1642E+08   4.2984E+08

F3
  Max   4.8974E+05   8.5863E+04   2.6323E+06   2.4742E+06   5.3828E+06
  Min   1.1067E+04   1.4616E+04   1.8878E+03   1.2967E+03   6.3986E+02
  Mean  1.3286E+04   1.4976E+04   1.0451E+04   3.5184E+03   3.0848E+03
  Var   6.5283E+03   1.9988E+03   4.5736E+04   4.1580E+04   8.0572E+04

F4
  Max   3.5252E+03   9.7330E+03   4.5679E+03   4.9730E+03   4.3338E+03
  Min   5.0446E+02   8.4482E+03   4.0222E+02   4.0843E+02   4.1267E+02
  Mean  5.8061E+02   8.5355E+03   4.4199E+02   4.3908E+02   4.3621E+02
  Var   1.4590E+02   1.3305E+02   1.6766E+02   1.2212E+02   8.5548E+01

F5
  Max   5.2111E+02   5.2106E+02   5.2075E+02   5.2098E+02   5.2110E+02
  Min   5.2001E+02   5.2038E+02   5.2027E+02   5.2006E+02   5.2007E+02
  Mean  5.2003E+02   5.2041E+02   5.2033E+02   5.2014E+02   5.2014E+02
  Var   7.8380E-02   5.7620E-02   5.6433E-02   1.0577E-01   9.9700E-02

F6
  Max   6.1243E+02   6.1299E+02   6.1374E+02   6.1424E+02   6.1569E+02
  Min   6.0881E+02   6.1157E+02   6.0514E+02   6.0288E+02   6.0257E+02
  Mean  6.0904E+02   6.1164E+02   6.0604E+02   6.0401E+02   6.0358E+02
  Var   2.5608E-01   1.2632E-01   1.0434E+00   1.1717E+00   1.3117E+00

F7
  Max   8.7895E+02   1.0459E+03   9.1355E+02   1.0029E+03   9.3907E+02
  Min   7.4203E+02   1.0238E+03   7.0013E+02   7.0081E+02   7.0069E+02
  Mean  7.4322E+02   1.0253E+03   7.1184E+02   7.0332E+02   7.0290E+02
  Var   4.3160E+00   2.4258E+00   2.8075E+01   1.5135E+01   1.3815E+01

F8
  Max   8.9972E+02   9.1783E+02   9.6259E+02   9.2720E+02   9.3391E+02
  Min   8.4904E+02   8.8172E+02   8.3615E+02   8.1042E+02   8.0877E+02
  Mean  8.5087E+02   8.8406E+02   8.5213E+02   8.1888E+02   8.1676E+02
  Var   2.1797E+00   4.4494E+00   8.6249E+00   9.2595E+00   9.7968E+00

F9
  Max   1.0082E+03   9.9538E+02   1.0239E+03   1.0146E+03   1.0366E+03
  Min   9.3598E+02   9.6805E+02   9.2725E+02   9.2062E+02   9.1537E+02
  Mean  9.3902E+02   9.7034E+02   9.4290E+02   9.2754E+02   9.2335E+02
  Var   3.4172E+00   3.2502E+00   8.1321E+00   9.0492E+00   9.3860E+00

F10
  Max   3.1087E+03   3.5943E+03   3.9105E+03   3.9116E+03   3.4795E+03
  Min   2.2958E+03   2.8792E+03   1.5807E+03   1.4802E+03   1.4316E+03
  Mean  1.8668E+03   2.9172E+03   1.7627E+03   1.6336E+03   1.6111E+03
  Var   3.2703E+01   6.1787E+01   2.2359E+02   1.9535E+02   2.2195E+02

F11
  Max   3.6641E+03   3.4628E+03   4.0593E+03   3.5263E+03   3.7357E+03
  Min   2.4726E+03   2.8210E+03   1.7855E+03   1.4012E+03   1.2811E+03
  Mean  2.5553E+03   2.8614E+03   1.8532E+03   1.5790E+03   1.4869E+03
  Var   8.4488E+01   5.6351E+01   1.3201E+02   2.1085E+02   2.6682E+02

F12
  Max   1.2044E+03   1.2051E+03   1.2053E+03   1.2055E+03   1.2044E+03
  Min   1.2006E+03   1.2014E+03   1.2004E+03   1.2003E+03   1.2017E+03
  Mean  1.2007E+03   1.2017E+03   1.2007E+03   1.2003E+03   1.2018E+03
  Var   1.4482E-01   3.5583E-01   2.5873E-01   2.4643E-01   2.1603E-01

F13
  Max   1.3049E+03   1.3072E+03   1.3073E+03   1.3056E+03   1.3061E+03
  Min   1.3005E+03   1.3067E+03   1.3009E+03   1.3003E+03   1.3004E+03
  Mean  1.3006E+03   1.3068E+03   1.3011E+03   1.3005E+03   1.3006E+03
  Var   3.2018E-01   5.0767E-02   4.2173E-01   3.6570E-01   2.9913E-01

F14
  Max   1.4395E+03   1.4565E+03   1.4775E+03   1.4749E+03   1.4493E+03
  Min   1.4067E+03   1.4522E+03   1.4009E+03   1.4003E+03   1.4005E+03
  Mean  1.4079E+03   1.4529E+03   1.4024E+03   1.4008E+03   1.4009E+03
  Var   2.1699E+00   6.3013E-01   5.3198E+00   3.2578E+00   2.8527E+00

F15
  Max   5.4068E+04   3.3586E+04   4.8370E+05   4.0007E+05   1.9050E+05
  Min   1.9611E+03   2.1347E+04   1.5029E+03   1.5027E+03   1.5026E+03
  Mean  1.9933E+03   2.2417E+04   2.7920E+03   1.5860E+03   1.5816E+03
  Var   6.1622E+02   1.2832E+03   1.7802E+04   4.4091E+03   2.7233E+03


Table 11. Comparison of numerical testing results on CEC2014 test sets (F16-F30, Dim = 10).

Function / Term   BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA

F16
  Max   1.6044E+03   1.6044E+03   1.6046E+03   1.6044E+03   1.6046E+03
  Min   1.6034E+03   1.6041E+03   1.6028E+03   1.6021E+03   1.6019E+03
  Mean  1.6034E+03   1.6041E+03   1.6032E+03   1.6024E+03   1.6024E+03
  Var   3.3300E-02   3.5900E-02   2.5510E-01   3.7540E-01   3.6883E-01

F17
  Max   1.3711E+07   1.6481E+06   3.3071E+07   4.6525E+07   8.4770E+06
  Min   1.2526E+05   4.7216E+05   2.7261E+03   2.0177E+03   2.0097E+03
  Mean  1.6342E+05   4.8499E+05   8.7769E+04   3.1084E+04   2.1095E+04
  Var   2.0194E+05   2.9949E+04   1.2284E+06   8.7638E+05   2.0329E+05

F18
  Max   7.7173E+07   3.5371E+07   6.1684E+08   1.4216E+08   6.6050E+08
  Min   4.1934E+03   2.6103E+06   2.2743E+03   1.8192E+03   1.8288E+03
  Mean  1.6509E+04   3.4523E+06   1.2227E+06   1.0781E+05   1.9475E+05
  Var   7.9108E+05   1.4888E+06   2.1334E+07   3.6818E+06   1.0139E+07

F19
  Max   1.9851E+03   2.5657E+03   2.0875E+03   1.9872E+03   1.9555E+03
  Min   1.9292E+03   2.4816E+03   1.9027E+03   1.9023E+03   1.9028E+03
  Mean  1.9299E+03   2.4834E+03   1.9044E+03   1.9032E+03   1.9036E+03
  Var   1.0820E+00   3.8009E+00   1.1111E+01   3.3514E+00   1.4209E+00

F20
  Max   8.3021E+06   2.0160E+07   1.0350E+07   6.1162E+07   1.2708E+08
  Min   5.6288E+03   1.7838E+06   2.1570E+03   2.0408E+03   2.0241E+03
  Mean  1.2260E+04   1.8138E+06   5.6957E+03   1.1337E+04   4.5834E+04
  Var   1.0918E+05   5.9134E+05   1.3819E+05   6.4167E+05   2.1988E+06

F21
  Max   2.4495E+06   1.7278E+09   2.0322E+06   1.3473E+07   2.6897E+07
  Min   4.9016E+03   1.4049E+09   3.3699E+03   2.1842E+03   2.2314E+03
  Mean  6.6613E+03   1.4153E+09   9.9472E+03   1.3972E+04   7.4587E+03
  Var   4.1702E+04   2.2557E+07   3.3942E+04   2.5098E+05   2.8735E+05

F22
  Max   2.8304E+03   4.9894E+03   3.1817E+03   3.1865E+03   3.2211E+03
  Min   2.5070E+03   4.1011E+03   2.3492E+03   2.2442E+03   2.2314E+03
  Mean  2.5081E+03   4.1175E+03   2.3694E+03   2.2962E+03   2.2687E+03
  Var   5.4064E+00   3.8524E+01   4.3029E+01   4.8006E+01   5.1234E+01

F23
  Max   2.8890E+03   2.6834E+03   2.8758E+03   3.0065E+03   2.9923E+03
  Min   2.5000E+03   2.6031E+03   2.4870E+03   2.5000E+03   2.5000E+03
  Mean  2.5004E+03   2.6088E+03   2.5326E+03   2.5010E+03   2.5015E+03
  Var   9.9486E+00   8.6432E+00   5.7045E+01   1.4213E+01   1.7156E+01

F24
  Max   2.6293E+03   2.6074E+03   2.6565E+03   2.6491E+03   2.6369E+03
  Min   2.5816E+03   2.6049E+03   2.5557E+03   2.5246E+03   2.5251E+03
  Mean  2.5829E+03   2.6052E+03   2.5671E+03   2.5337E+03   2.5338E+03
  Var   1.3640E+00   3.3490E-01   8.9434E+00   1.3050E+01   1.1715E+01

F25
  Max   2.7133E+03   2.7014E+03   2.7445E+03   2.7122E+03   2.7327E+03
  Min   2.6996E+03   2.7003E+03   2.6784E+03   2.6789E+03   2.6635E+03
  Mean  2.6996E+03   2.7004E+03   2.6831E+03   2.6903E+03   2.6894E+03
  Var   2.3283E-01   9.9333E-02   4.9609E+00   7.1175E+00   1.2571E+01

F26
  Max   2.7056E+03   2.8003E+03   2.7058E+03   2.7447E+03   2.7116E+03
  Min   2.7008E+03   2.8000E+03   2.7003E+03   2.7002E+03   2.7002E+03
  Mean  2.7010E+03   2.8000E+03   2.7005E+03   2.7003E+03   2.7003E+03
  Var   3.1316E-01   1.6500E-02   3.9700E-01   1.1168E+00   3.5003E-01

F27
  Max   3.4052E+03   5.7614E+03   3.3229E+03   3.4188E+03   3.4144E+03
  Min   2.9000E+03   3.9113E+03   2.9698E+03   2.8347E+03   2.7054E+03
  Mean  2.9038E+03   4.0351E+03   2.9816E+03   2.8668E+03   2.7219E+03
  Var   3.2449E+01   2.0673E+02   3.4400E+01   6.2696E+01   6.1325E+01

F28
  Max   4.4333E+03   5.4138E+03   4.5480E+03   4.3490E+03   4.8154E+03
  Min   3.0000E+03   4.0092E+03   3.4908E+03   3.0000E+03   3.0000E+03
  Mean  3.0079E+03   4.0606E+03   3.6004E+03   3.0063E+03   3.0065E+03
  Var   6.8101E+01   9.2507E+01   8.5705E+01   4.2080E+01   5.3483E+01

F29
  Max   2.5038E+07   1.6181E+08   5.9096E+07   7.0928E+07   6.4392E+07
  Min   3.2066E+03   1.0014E+08   3.1755E+03   3.1005E+03   3.3287E+03
  Mean  5.7057E+04   1.0476E+08   8.5591E+04   6.8388E+04   2.8663E+04
  Var   4.9839E+05   6.7995E+06   1.7272E+06   1.5808E+06   8.6740E+05


Figure 3. The effect of the two parameters on the performance of RF models: (a) the effect of ntree (accuracy versus ntree, 0 to 1000); (b) the effect of mtry (accuracy versus mtry, 0 to 35).

Variable setting: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF
(1)  Begin
(2)  /* Build the classification model based on DMSDL-QBSA-RF */
(3)  Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4)  Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5)  While iter < itermax + 1 do
(6)    For i = 1 to ntree
(7)      Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)      Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)      Recursively generate each decision tree without pruning operations
(10)   End For
(11)   Update the classification accuracy of RF: evaluate f(Xi)
(12)   Update gbest and Pgbest
(13)   [nbest, mbest] = gbest
(14)   iter = iter + 1
(15)  End While
(16)  /* Classify using the RF model */
(17)  For i = 1 to nbest
(18)    Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)    Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)    Recursively generate each decision tree without pruning operations
(21)  End
(22)  Return the DMSDL-QBSA-RF classification model
(23)  Classify the test dataset using equation (20)
(24)  Calculate OOB_error
(25)  End

Algorithm 2: DMSDL-QBSA-RF classification model.

Table 11. Continued.

Function / Term   BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA

F30
  Max   5.2623E+05   2.8922E+07   1.1938E+06   1.2245E+06   1.1393E+06
  Min   5.9886E+03   1.9017E+07   3.7874E+03   3.6290E+03   3.6416E+03
  Mean  7.4148E+03   2.0002E+07   5.5468E+03   4.3605E+03   4.1746E+03
  Var   1.1434E+04   1.1968E+06   3.2255E+04   2.2987E+04   1.8202E+04


Figure 4. Block diagram of the oil layer classification system based on DMSDL-QBSA-RF. (Stages: data selection and preprocessing of the information to be identified; removal of redundant attributes and pretreatment; attribute discretization; attribute reduction and data normalization; model building based on DMSDL-QBSA-RF; RF classification; output.)

Table 13. Distribution of the logging data.

                 Training dataset                          Test dataset
Well   Depth (m)   Data  Oil/gas layers  Dry layer   Depth (m)   Data  Oil/gas layers  Dry layer
W1     3027-3058   250   203             47          3250-3450   1600  237             1363
W2     2642-2648   30    10              20          2940-2978   191   99              92
W3     1040-1059   160   47              113         1120-1290   1114  96              1018

Table 12. Comparison of numerical testing results on eight UCI datasets.

Dataset         RF (%)   BSA-RF (%)   DMSDL-BSA-RF (%)   DMSDL-QBSA-RF (%)
Blood           75.40    80.80        79.46              81.25
Heart-statlog   82.10    86.42        86.42              86.42
Sonar           78.39    85.48        85.48              88.71
Appendicitis    83.23    87.10        90.32              93.55
Cleve           79.55    87.50        88.64              89.77
Magic           88.95    89.85        90.25              89.32
Mammographic    74.73    77.68        76.79              79.46
Australian      85.83    88.84        88.84              90.78

Table 14: Attribute reduction results of the logging dataset.

W1 actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
W1 reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2 actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
W2 reduction attributes: NEUTRO, PERM, POR, RESI, SW
W3 actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
W3 reduction attributes: AC, GR, RI, RXO, SP

[Figure 5(a) and 5(b): normalized curves of the W1 attributes over depth 3250–3450 m; (a) NPHI, SP, DT, LLD; (b) DEN, GR, LLS.]


[Figure 5(c)–5(f): normalized curves of the W2 attributes over depth 2940–2980 m, (c) NEUTRO, PERM and (d) RESI, SW, POR, and of the W3 attributes over depth 1150–1300 m, (e) GR, SP, RI and (f) RXO, AC.]

Figure 5: The normalized curves of attributes: (a) and (b) attribute normalization of W1; (c) and (d) attribute normalization of W2; (e) and (f) attribute normalization of W3.

Table 15: Performance of various well data.

Well | Classification model | RMSE | MAE | Accuracy (%) | Running time (s)
W1 | RF | 0.3326 | 0.1106 | 88.94 | 1.5167
W1 | SVM | 0.2681 | 0.0719 | 92.81 | 4.5861
W1 | BSA-RF | 0.3269 | 0.1069 | 89.31 | 1.8064
W1 | DMSDL-BSA-RF | 0.2806 | 0.0788 | 92.13 | 2.3728
W1 | DMSDL-QBSA-RF | 0.2449 | 0.0600 | 94.00 | 1.4663
W2 | RF | 0.4219 | 0.1780 | 82.20 | 3.1579
W2 | SVM | 0.2983 | 0.0890 | 91.10 | 4.2604
W2 | BSA-RF | 0.3963 | 0.1571 | 84.29 | 1.2979
W2 | DMSDL-BSA-RF | 0.2506 | 0.062827 | 93.72 | 1.6124
W2 | DMSDL-QBSA-RF | 0.2399 | 0.0576 | 94.24 | 1.2287
W3 | RF | 0.4028 | 0.1622 | 83.78 | 2.4971
W3 | SVM | 0.2507 | 0.0628 | 93.72 | 2.1027
W3 | BSA-RF | 0.3631 | 0.1318 | 86.81 | 1.3791
W3 | DMSDL-BSA-RF | 0.2341 | 0.0548 | 94.52 | 0.3125
W3 | DMSDL-QBSA-RF | 0.0519 | 0.0027 | 99.73 | 0.9513
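The RMSE and MAE columns of Table 15 are the standard error metrics computed over the predicted versus actual layer labels. A stdlib sketch with illustrative toy 0/1 labels (not the paper's data):

```python
import math

def rmse(y_true, y_pred):
    # Root mean squared error over paired labels.
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    # Mean absolute error over paired labels.
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

# Toy oil-layer labels (1 = oil/gas layer, 0 = dry layer), for illustration only.
y_true = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
y_pred = [1, 0, 1, 0, 0, 1, 0, 1, 1, 1]
```

For 0/1 labels MAE equals the misclassification rate, so accuracy = 1 − MAE, which is consistent with the relationship between the MAE and Accuracy columns in Table 15.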

[Figure 6(a) and 6(b): oil layer labels (0/1) of W1 versus depth 3200–3450 m.]


5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions were used to verify the effectiveness of the improved method. The experimental study of the effects of the three strategies on the performance of DMSDL-QBSA revealed that the hybrid method brings a clear improvement over the original BSA and especially over the original DE. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions were used to verify the performance of DMSDL-QBSA more comprehensively, and it showed excellent performance on these functions as well. Finally, the improved DMSDL-QBSA was used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem proved that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than the original RF model. At the same time, it ran faster than the other four advanced classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Because of the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA; therefore, improving its recognition efficiency is a worthwhile direction. In future research, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request by contacting the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and Project of Industry-University Cooperative Education of Ministry of Education of China (no. 201801335014).

[Figure 6(c)–6(f): oil layer labels (0/1) of W2 versus depth 2940–2980 m and of W3 versus depth 1140–1300 m.]

Figure 6: Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.

[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical report TR06, Erciyes University, Kayseri, Turkey, 2005.

[4] X. Li, A new intelligent optimization method: artificial fish school algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.

[5] X. S. Yang, "Firefly algorithms for multimodal optimization: stochastic algorithms, foundations and applications," in Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.

[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.

[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673–687, 2016.

[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University-Computer and Information Sciences, vol. 32, pp. 514–521, 2020.

[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," in Neural Computing and Applications, Springer, New York, NY, USA, 2019.

[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.

[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.

[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197–207, 2014.

[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.

[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17–28, 2014.

[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137–146, 2012.

[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, Multi-Strategy Brainstorm Optimization Algorithm with Dynamic Parameters Adjustment, Applied Intelligence, Guildford, UK, 2010.

[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189–120201, 2019.

[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.

[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.

[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.

[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.

[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.

[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.

[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.

[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.

[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," IEEE Transactions on Evolutionary Computation, 2019.

[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.

[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.

[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.

[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.

[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.

[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC 96), Nagoya, Japan, May 1996.

[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.

[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.

[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.


[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China-Information Sciences, vol. 62, no. 7, 2019.

[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.

[41] S. Shahrzad, M. Seyedali, and L. Andrew, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.

[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22–34, 2020.

[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.



x_i^t = lb + (ub − lb) × rand,  (16)

x_i^{t+1} = x_i^t ± β |mbest − x_i^t| × ln(1/u),  (17)

where β is a positive number called the contraction-expansion factor, x_i^t is the position of the particle at the previous moment, u is a uniform random number in (0, 1), and mbest is the average value of the best previous positions of all the birds (Algorithm 1).
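Equations (16) and (17) translate directly into code. A minimal Python sketch of the quantum-behaved initialization and update (the coin-flip choice of the ± sign follows the usual quantum-behaved update convention, and the guard against u = 0 is an implementation detail, not from the paper):

```python
import math
import random

def quantum_init(dim, lb, ub):
    # Eq. (16): uniform random initialization inside [lb, ub].
    return [lb + (ub - lb) * random.random() for _ in range(dim)]

def quantum_update(x, mbest, beta=0.75):
    # Eq. (17): x' = x +/- beta * |mbest_j - x_j| * ln(1/u), u ~ U(0, 1).
    out = []
    for xj, mj in zip(x, mbest):
        u = random.random() or 1e-12          # guard: avoid ln of zero
        step = beta * abs(mj - xj) * math.log(1.0 / u)
        out.append(xj + step if random.random() < 0.5 else xj - step)
    return out

random.seed(0)
x0 = quantum_init(5, -10.0, 10.0)
x1 = quantum_update(x0, [0.0] * 5)
```

Because ln(1/u) is unbounded for small u, the update occasionally takes large jumps, which is exactly what widens the effective search space of the initialized birds.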

2.2.4. Procedures of the DMSDL-QBSA. As described in Sections 2.2.1–2.2.3, in order to improve the local and global search capabilities of BSA, this paper improves the BSA in three parts:

(1) To improve the local search capability of the foraging behavior in BSA, equation (8) is put forward based on the dynamic multi-swarm method.

(2) To obtain the guiding vector that improves the global search capability of the foraging behavior in BSA, equations (9), (11), and (12) are put forward based on differential evolution.

(3) To expand the initialization search space of the birds and thereby improve the global search capability of BSA, equations (16) and (17) are put forward based on quantum behavior.

Finally, the steps of DMSDL-QBSA are shown in Algorithm 1.

2.3. Simulation Experiment and Analysis. This section presents the evaluation of DMSDL-QBSA using a series of experiments on benchmark functions and CEC2014 test functions. All experiments in this paper are implemented in MATLAB R2014b on Win 7 (64-bit) with an Intel(R) Core(TM) i5-2450M CPU @ 2.50 GHz and 4.00 GB RAM. To obtain fair results, all experiments were conducted under the same conditions. The population size is set to 30 in all algorithms, and each algorithm runs 30 times independently on each function.

2.3.1. Benchmark Functions and CEC2014 Test Functions. To investigate the effectiveness and generality of DMSDL-QBSA compared with several hybrid and popular algorithms, 18 benchmark functions and the CEC2014 test functions are applied. To test the effectiveness of the proposed DMSDL-QBSA, 18 benchmark functions [37] are adopted, all of which have an optimal value of 0. The benchmark functions and their search ranges are shown in Table 1. In this test suite, f1–f9 are unimodal functions, which are usually used to investigate whether the proposed algorithm has good convergence performance, and f10–f18 are multimodal functions, which are used to test the global search capability of the proposed algorithm. The smaller the fitness value of a function, the better the algorithm performs. Furthermore, to verify the comprehensive performance of DMSDL-QBSA more thoroughly, another 30 complex CEC2014 benchmarks are used. The CEC2014 benchmark functions are briefly described in Table 2.

2.3.2. Parameter Settings. To verify the effectiveness and generalization of the proposed DMSDL-QBSA, it is compared with several hybrid algorithms: BSA [7], DE [32], DMSDL-PSO [31], and DMSDL-BSA. Another five popular intelligence algorithms, grey wolf optimizer (GWO) [38], whale optimization algorithm (WOA) [39], sine cosine algorithm (SCA) [40], grasshopper optimization algorithm (GOA) [41], and sparrow search algorithm (SSA) [42], are also used for comparison. These state-of-the-art algorithms allow the performance of DMSDL-QBSA to be verified more comprehensively. For fair comparison, the population size of all algorithms is set to 30, and the other parameters are set according to the original papers. The parameter settings of the involved algorithms are shown in detail in Table 3.

2.3.3. Comparison on Benchmark Functions with Hybrid Algorithms. According to Section 2.2, three hybrid strategies (the dynamic multi-swarm method, DE, and quantum behavior) have been combined with the basic BSA. To investigate the effectiveness of DMSDL-QBSA compared with several hybrid algorithms, namely, BSA, DE, DMSDL-PSO, and DMSDL-BSA, the 18 benchmark functions are applied. In contrast to DMSDL-QBSA, quantum behavior is not used in the dynamic multi-swarm differential learning bird swarm algorithm (DMSDL-BSA). The number of function evaluations (FEs) is 10000. We selected two different dimension sizes (Dim): Dim = 10 is the typical dimension for the benchmark functions, and Dim = 2 is chosen because RF has two parameters that need to be optimized, which means that the optimization function is 2-dimensional.

The fitness value curves of a run of the algorithms on eight different functions are shown in Figures 1 and 2, where the horizontal axis represents the number of iterations and the vertical axis represents the fitness value; the convergence speeds of the different algorithms can be compared directly. The maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) obtained by the benchmark algorithms are shown in Tables 4–7, where the best results are marked in bold. Tables 4 and 5 show the performance of the algorithms on the unimodal functions when Dim = 10 and 2, and Tables 6 and 7 show their performance on the multimodal functions when Dim = 10 and 2.

(1) Unimodal functions. From the numerical testing results on the unimodal functions in Table 4, we can see that DMSDL-QBSA can find the optimal solution for all unimodal functions and reaches the minimum value of 0 on f1, f2, f3, f7, and f8. Both DMSDL-QBSA and DMSDL-BSA can find the minimum value; however, DMSDL-QBSA has the best mean value and variance on each function. The main reason is that DMSDL-QBSA has better population diversity during the initialization period. In summary, DMSDL-QBSA has the best performance on the unimodal functions compared to the other algorithms when Dim = 10, with a relatively fast convergence speed.

The evolution curves of these algorithms on four unimodal functions, f1, f5, f6, and f9, are drawn in Figure 1. It can be seen that the curve of DMSDL-QBSA descends fastest, within a number of iterations far less than 10000. For f1, f5, and f9, DMSDL-QBSA has the fastest convergence speed compared with the other algorithms, whereas the original BSA obtains the worst solution because it is trapped in a local optimum prematurely. For function f6, none of these algorithms finds the value 0; however, the convergence speed of DMSDL-QBSA is significantly faster than the other algorithms in the early stage, and the solution eventually found is the best. Overall, owing to the enhanced diversity of the population, DMSDL-QBSA has a relatively excellent convergence speed when Dim = 2.

Variable setting: number of iterations iter; bird positions Xi; local optimum Pi; global optimal position gbest; global optimum Pgbest; fitness of the sub-swarm optimal position Plbest
Input: population size N; dimension size D; maximum number of function evaluations itermax; time interval of the birds' flight behavior FQ; probability of foraging behavior P; constant parameters c1, c2, a1, a2, FL; iter = 0; contraction-expansion factor β
Output: global optimal position gbest; fitness of the global optimal position Pgbest
(1) Begin
(2) Initialize the positions of the N birds using equations (16) and (17): Xi (i = 1, 2, …, N)
(3) Calculate the fitness Fi (i = 1, 2, …, N), set Xi to be Pi, and find Pgbest
(4) While iter < itermax + 1 do
(5)   /* Dynamic sub-swarm */
(6)   Regroup the Pi and Xi of the sub-swarms randomly
(7)   Sort the Plbest and refine the first [0.25·N] best Plbest
(8)   Update the corresponding Pi
(9)   /* Differential learning scheme */
(10)  For i = 1 : N
(11)    Construct lbest using DMS-BSA
(12)    Differential evolution: construct V using equations (9) and (10)
(13)    Crossover: construct U using equation (11)
(14)    Selection: construct GV using equation (12)
(15)  End For
(16)  /* Birds' position adjusting */
(17)  If (t mod FQ ≠ 0)
(18)    For i = 1 : N
(19)      If rand < P
(20)        Foraging behavior: update the position Xi using equation (8)
(21)      Else
(22)        Vigilance behavior: update the position Xi using equation (2)
(23)      End If
(24)    End For
(25)  Else
(26)    Flight behavior: divide the birds into producers and scroungers
(27)    For i = 1 : N
(28)      If Xi is a producer
(29)        Producer: update the position Xi using equation (5)
(30)      Else
(31)        Scrounger: update the position Xi using equation (6)
(32)      End If
(33)    End For
(34)  End If
(35)  Evaluate f(Xi)
(36)  Update gbest and Pgbest
(37)  iter = iter + 1
(38) End While
(39) End

ALGORITHM 1 DMSDL-QBSA
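The differential learning scheme in lines (12)–(14) of Algorithm 1 uses the standard DE operators. A sketch assuming the common DE/rand/1 mutation with binomial crossover and greedy selection (the paper's exact equations (9)–(12) may differ in detail, so this is an illustration of the operator family, not a verbatim implementation):

```python
import random

def de_mutate(pop, i, F=0.5):
    # DE/rand/1 mutation: v = x_r1 + F * (x_r2 - x_r3), r1, r2, r3 distinct from i.
    r1, r2, r3 = random.sample([j for j in range(len(pop)) if j != i], 3)
    return [a + F * (b - c) for a, b, c in zip(pop[r1], pop[r2], pop[r3])]

def de_crossover(x, v, CR=0.1):
    # Binomial crossover; index jrand guarantees at least one component from v.
    jrand = random.randrange(len(x))
    return [v[j] if (random.random() < CR or j == jrand) else x[j]
            for j in range(len(x))]

def de_select(x, u, fitness):
    # Greedy selection: keep the trial vector only if it does not worsen fitness.
    return u if fitness(u) <= fitness(x) else x

random.seed(1)
pop = [[float(i), float(i), float(i)] for i in range(5)]
v = de_mutate(pop, 0)
u = de_crossover(pop[0], v)
```

The F and CR values shown match the settings of Table 3 (F = 0.5, CR = 0.1).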

6 Computational Intelligence and Neuroscience

According to the results of Table 5, DMSDL-QBSA has the best performance on the unimodal functions when Dim = 2. It finds the minimum value of 0 on f1, f2, f3, f4, f5, f7, f8, and f9, and it performs better than DMSDL-BSA and DMSDL-PSO on f2, f4, and f9. DE does not perform well on these functions, but BSA performs relatively well on f1, f3, f6, f7, f8, and f9; the main reason is that BSA has better convergence performance in the early search. Obviously, DMSDL-QBSA can find the best values of the two parameters of RF that need to be optimized.

(2) Multimodal functions. From the numerical testing results on the multimodal functions in Table 6, we can see that DMSDL-QBSA can find the optimal solution for all multimodal functions and reaches the minimum value of 0 on f11, f13, and f16. DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, and f16; BSA works best on f10; DMSDL-PSO does not perform very well. Moreover, DMSDL-QBSA has the best mean value and variance on most functions. The main reason is that DMSDL-QBSA has a stronger global exploration capability based on the dynamic multi-swarm method and differential evolution. In summary, DMSDL-QBSA performs relatively well on the multimodal functions compared to the other algorithms when the typical Dim = 10; obviously, it has a relatively good global search capability.

The evolution curves of these algorithms on four multimodal functions, f12, f13, f17, and f18, when Dim = 2 are depicted in Figure 2. We can see that DMSDL-QBSA can find the optimal solution within the same number of iterations. For f13 and f17, DMSDL-QBSA continues to decline, whereas the original BSA and DE produce flat curves because of their poor global convergence ability. For functions f12 and f18, although DMSDL-QBSA is also trapped in a local optimum, it finds the minimum value compared to the other algorithms. Obviously, the convergence speed of DMSDL-QBSA is significantly faster than the other algorithms in the early stage, and the solution eventually found is the best. In general, owing to the enhanced diversity of the population, DMSDL-QBSA has a relatively balanced global search capability when Dim = 2.

Furthermore, from the numerical testing results on the multimodal functions in Table 7, we can see that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f16, f17, and f18, and it reaches the minimum value of 0 on f11, f13, f16, and f17. BSA reaches the minimum value of 0 on f11, f13, and f17, while DE does not reach the minimum value of 0

Table 1: 18 benchmark functions.

Sphere: f1(x) = Σ_{i=1}^{D} x_i², range [−100, 100]^D
Schwefel P2.22: f2(x) = Σ_{i=1}^{D} |x_i| + Π_{i=1}^{D} |x_i|, range [−10, 10]^D
Schwefel P1.2: f3(x) = Σ_{i=1}^{D} (Σ_{j=1}^{i} x_j)², range [−100, 100]^D
Generalized Rosenbrock: f4(x) = Σ_{i=1}^{D−1} [100(x_i² − x_{i+1})² + (x_i − 1)²], range [−100, 100]^D
Step: f5(x) = Σ_{i=1}^{D} (|x_i + 0.5|)², range [−100, 100]^D
Noise: f6(x) = Σ_{i=1}^{D} i·x_i⁴ + rand[0, 1), range [−1.28, 1.28]^D
SumSquares: f7(x) = Σ_{i=1}^{D} i·x_i², range [−10, 10]^D
Zakharov: f8(x) = Σ_{i=1}^{D} x_i² + (Σ_{i=1}^{D} 0.5·i·x_i)² + (Σ_{i=1}^{D} 0.5·i·x_i)⁴, range [−10, 10]^D
Schaffer: f9(x, y) = 0.5 + (sin²(x² − y²) − 0.5) / [1 + 0.001(x² + y²)]², range [−10, 10]^D
Generalized Schwefel 2.26: f10(x) = 418.9829·D − Σ_{i=1}^{D} x_i·sin(√|x_i|), range [−500, 500]^D
Generalized Rastrigin: f11(x) = Σ_{i=1}^{D} [x_i² − 10·cos(2πx_i) + 10], range [−5.12, 5.12]^D
Ackley: f12(x) = 20 + e − 20·exp(−0.2·√((1/D)·Σ_{i=1}^{D} x_i²)) − exp((1/D)·Σ_{i=1}^{D} cos(2πx_i)), range [−32, 32]^D
Generalized Griewank: f13(x) = (1/4000)·Σ_{i=1}^{D} x_i² − Π_{i=1}^{D} cos(x_i/√i) + 1, range [−600, 600]^D
Generalized Penalized 1: f14(x) = (π/D)·{10·sin²(πy_1) + Σ_{i=1}^{D−1} (y_i − 1)²·[1 + 10·sin²(πy_{i+1})] + (y_D − 1)²} + Σ_{i=1}^{D} u(x_i, 10, 100, 4), where y_i = 1 + (1/4)·x_i and u(x_i, a, k, m) = k·(x_i − a)^m if x_i > a; 0 if −a ≤ x_i ≤ a; k·(−x_i − a)^m if x_i < −a, range [−50, 50]^D
Generalized Penalized 2: f15(x) = (1/10)·{sin²(3πx_1) + Σ_{i=1}^{D−1} (x_i − 1)²·[1 + sin²(3πx_{i+1})] + (x_D − 1)²·[1 + sin²(2πx_D)]} + Σ_{i=1}^{D} u(x_i, 5, 100, 4), range [−50, 50]^D
Alpine: f16(x) = Σ_{i=1}^{D} |x_i·sin(x_i) + 0.1·x_i|, range [−10, 10]^D
Booth: f17(x) = (x_1 + 2x_2 − 7)² + (2x_1 + x_2 − 5)², range [−10, 10]^D
Levy: f18(x) = sin²(πy_1) + Σ_{i=1}^{D−1} (y_i − 1)²·[1 + 10·sin²(πy_i + 1)] + (y_D − 1)²·[1 + sin²(2πy_D)], range [−10, 10]^D
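A few of the entries in Table 1 translate directly into code; a Python sketch of f1 (Sphere), f11 (Rastrigin), and f17 (Booth), each with a known optimum of 0:

```python
import math

def sphere(x):
    # f1: sum of squares; global minimum 0 at the origin.
    return sum(v * v for v in x)

def rastrigin(x):
    # f11: highly multimodal; global minimum 0 at the origin.
    return sum(v * v - 10.0 * math.cos(2.0 * math.pi * v) + 10.0 for v in x)

def booth(x):
    # f17: 2-D only; global minimum 0 at (1, 3).
    x1, x2 = x
    return (x1 + 2.0 * x2 - 7.0) ** 2 + (2.0 * x1 + x2 - 5.0) ** 2
```

Implementations like these are what each algorithm's fitness evaluation calls 10000 times per run in the experiments above.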


on any function, and DMSDL-BSA reaches the minimum value of 0 on f11 and f13. In summary, DMSDL-QBSA has a superior global search capability on most multimodal functions when Dim = 2. Obviously, DMSDL-QBSA can find the best values of the two parameters of RF that need to be optimized because of its excellent global search capability.

In this section, it can be seen from Figures 1 and 2 and Tables 4–7 that DMSDL-QBSA obtains the best function values in most cases. This indicates that the hybrid strategies of BSA (the dynamic multi-swarm method, DE, and quantum behavior operators) lead the birds to move towards the best solutions, and that DMSDL-QBSA is able to search for the best two parameters of RF with high accuracy and efficiency.

2.3.4. Comparison on Benchmark Functions with Popular Algorithms. To compare the timeliness and applicability of DMSDL-QBSA against several popular algorithms, namely, GWO, WOA, SCA, GOA, and SSA (of which GWO, WOA, GOA, and SSA are swarm intelligence algorithms), the 18 benchmark functions are applied. In this experiment, the dimension size of these functions is 10, and the number of function evaluations (FEs) is 100000. The maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) obtained by the different algorithms are shown in Tables 8 and 9, where the best results are marked in bold.

Table 2: Summary of the CEC'14 test functions (Fi* = Fi(x*)).

Unimodal functions: F1 Rotated High-Conditioned Elliptic Function (100); F2 Rotated Bent Cigar Function (200); F3 Rotated Discus Function (300).
Simple multimodal functions: F4 Shifted and Rotated Rosenbrock's Function (400); F5 Shifted and Rotated Ackley's Function (500); F6 Shifted and Rotated Weierstrass Function (600); F7 Shifted and Rotated Griewank's Function (700); F8 Shifted Rastrigin's Function (800); F9 Shifted and Rotated Rastrigin's Function (900); F10 Shifted Schwefel's Function (1000); F11 Shifted and Rotated Schwefel's Function (1100); F12 Shifted and Rotated Katsuura Function (1200); F13 Shifted and Rotated HappyCat Function (1300); F14 Shifted and Rotated HGBat Function (1400); F15 Shifted and Rotated Expanded Griewank's Plus Rosenbrock's Function (1500); F16 Shifted and Rotated Expanded Scaffer's F6 Function (1600).
Hybrid functions: F17 Hybrid Function 1 (N = 3) (1700); F18 Hybrid Function 2 (N = 3) (1800); F19 Hybrid Function 3 (N = 4) (1900); F20 Hybrid Function 4 (N = 4) (2000); F21 Hybrid Function 5 (N = 5) (2100); F22 Hybrid Function 6 (N = 5) (2200).
Composition functions: F23 Composition Function 1 (N = 5) (2300); F24 Composition Function 2 (N = 3) (2400); F25 Composition Function 3 (N = 3) (2500); F26 Composition Function 4 (N = 5) (2600); F27 Composition Function 5 (N = 5) (2700); F28 Composition Function 6 (N = 5) (2800); F29 Composition Function 7 (N = 3) (2900); F30 Composition Function 8 (N = 3) (3000).
Search range: [−100, 100]^D.

Table 3: Parameter settings.

BSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10
DE: F = 0.5, CR = 0.1
DMSDL-PSO: c1 = c2 = 1.49445, F = 0.5, CR = 0.1, sub-swarms = 10
DMSDL-BSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarms = 10
DMSDL-QBSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarms = 10


From the test results in Table 8, we can see that DMSDL-QBSA has the best performance on each unimodal function; GWO finds the value 0 on f1, f2, f3, f7, and f8; WOA obtains 0 on f1, f2, and f7; SSA works best on f1 and f7. For the multimodal functions, Table 9 shows that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, f16, and f18; SSA has the best performance on f10; GWO reaches the minimum on f11; WOA and SCA obtain the optimal value on f11 and f13. Obviously, compared with these popular algorithms, DMSDL-QBSA is a competitive algorithm for solving most functions, and the swarm intelligence algorithms perform better than the other algorithms. The results of Tables 8 and 9 show that DMSDL-QBSA has the best performance on most of the tested benchmark functions.

2.3.5. Comparison on CEC2014 Test Functions with Hybrid Algorithms. To compare the comprehensive performance of the proposed DMSDL-QBSA against several hybrid algorithms, namely, BSA, DE, DMSDL-PSO, and DMSDL-BSA, the 30 CEC2014 test functions are applied. In this experiment, the dimension size (Dim) is set to 10, and the number of function evaluations (FEs) is 100000. Experimental comparisons, including the maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var), are given in Tables 10 and 11, where the best results are marked in bold.

Based on the mean value (Mean) on the CEC2014 test functions, DMSDL-QBSA has the best performance on F2, F3, F4, F6, F7, F8, F9, F10, F11, F15, F16, F17, F21, F26, F27, F29, and F30, while DMSDL-BSA shows an advantage on F1, F12,

[Figure 1 panels: (a) Sphere, (b) Step, (c) Noise, (d) Schaffer; fitness value (log scale, 10^0 to 10^4 iterations) for BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA.]

Figure 1: Fitness value curves of 5 hybrid algorithms on (a) f1, (b) f5, (c) f6, and (d) f9 (Dim = 2).


F13, F14, F18, F19, F20, F24, F25, and F28. According to the results, DMSDL-QBSA finds the minimal value on 17 CEC2014 test functions, DMSDL-BSA obtains the minimum value on F1, F12, F13, F14, F18, F19, F24, and F30, and DMSDL-PSO obtains the minimum value on F4, F7, and F23. Owing to its enhanced exploitation capability, DMSDL-QBSA is better than DMSDL-BSA and DMSDL-PSO on most functions. From these test results, it can be seen that DMSDL-QBSA performs better than BSA, DE, DMSDL-PSO, and DMSDL-BSA, and it can be concluded that DMSDL-QBSA has better global search ability and better robustness on these test suites.

3. Optimize RF Classification Model Based on Improved BSA Algorithm

3.1. RF Classification Model. RF, as proposed by Breiman et al., is an ensemble learning model based on bagging and random subspace methods. The whole modeling process includes building decision trees and decision processes. The process of constructing decision trees is mainly composed of

Figure 2: Fitness value curves of 5 hybrid algorithms (BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA) on (a) f12 (Ackley), (b) f13 (Griewank), (c) f17 (Booth), and (d) f18 (Levy) (Dim = 2).


ntree decision trees, each of which consists of nonleaf nodes and leaf nodes. The leaf node is a child node of the node branch. It is supposed that the dataset has M attributes. When each leaf node of the decision tree needs to be segmented, mtry attributes are randomly selected from the M attributes as the reselected splitting variables of this node. This process can be defined as follows:

S_j = \{ P_i \mid i = 1, 2, \ldots, m_{try} \}, \qquad (18)

where S_j is the splitting variable of the j-th leaf node of the decision tree and P_i is the probability that the m_try reselected attributes are selected as the splitting attribute of the node.
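The per-node attribute subsampling described by equation (18) can be sketched in a few lines of Python; this is a minimal illustration, not the authors' code, and the values of M and mtry are examples only.

```python
import random

def select_split_attributes(M, mtry, rng=random):
    """Randomly choose mtry of the M attributes as the candidate
    splitting variables of one leaf node (cf. equation (18))."""
    return rng.sample(range(M), mtry)

# Example: a dataset with M = 34 attributes, mtry = 6 candidates per node
candidates = select_split_attributes(34, 6)
```

Because the sample is drawn without replacement, every node sees mtry distinct attributes, which is what preserves the diversity of the trees.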

The nonleaf node is a parent node that classifies training data into a left or right child node. The function of the k-th decision tree is as follows:

h_k(c \mid x) = \begin{cases} 0, & f_l(x_k) < f_l(x_k) + \tau \\ 1, & f_l(x_k) \ge f_l(x_k) + \tau \end{cases}, \quad l = 1, 2, \ldots, n_{tree}, \qquad (19)

where c = 0 or 1, where the symbol 0 indicates that the k-th row of data is classified as a negative label and the symbol 1 indicates that the k-th row of data is classified as a positive label. Here, f_l is the training function of the l-th decision tree based on the splitting variable S, and x_k is the k-th row of data in the dataset obtained by random sampling with replacement. The symbol τ is a positive constant, which is used as the threshold value of the training decision.

When the decision processes are trained, each row of data is input into a leaf node of each decision tree. The average of the ntree decision tree classification results is used as the final classification result. This process can be written mathematically as follows:

c_k = l \times \frac{1}{n_{tree}} \sum_{k=1}^{n_{tree}} h_k(c \mid x), \qquad (20)

where l is the number of decision trees which judged the k-th row of data as class c.
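In practice, the aggregation in equation (20) amounts to averaging the ntree per-tree votes and taking the majority class; the sketch below is an illustrative reduction of that idea, not the authors' implementation.

```python
def aggregate_votes(tree_outputs):
    """Final label for one row of data: average the 0/1 outputs of the
    ntree trees (cf. equation (20)) and keep the majority class."""
    mean_vote = sum(tree_outputs) / len(tree_outputs)
    return 1 if mean_vote >= 0.5 else 0

# Three trees vote 1 and one votes 0, so the row is labeled 1
label = aggregate_votes([1, 1, 1, 0])
```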

From the above principle, we can see that it is mainly necessary to determine the two parameters ntree and mtry in the RF modeling process. In order to verify the influence of these two parameters on the classification accuracy of the RF classification model, the Ionosphere dataset is used to test the influence of the two parameters on the performance of

Table 4: Comparison on nine unimodal functions with 5 hybrid algorithms (Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

f1

Max 1.0317E+04 1.2466E+04 1.7232E+04 1.0874E+04 7.4436E+01
Min 0 4.0214E+03 1.6580E-02 0 0
Mean 6.6202E+00 4.6622E+03 6.0643E+01 6.3444E+00 3.3920E-02
Var 2.0892E+02 8.2293E+02 7.3868E+02 1.9784E+02 1.2343E+00

f2

Max 3.3278E+02 1.8153E+02 8.0436E+01 4.9554E+02 2.9845E+01
Min 4.9286E-182 1.5889E+01 1.0536E-01 3.0074E-182 0
Mean 5.7340E-02 1.8349E+01 2.9779E+00 6.9700E-02 1.1220E-02
Var 3.5768E+00 4.8296E+00 2.4966E+00 5.1243E+00 4.2864E-01

f3

Max 1.3078E+04 1.3949E+04 1.9382E+04 1.2899E+04 8.4935E+01
Min 3.4873E-250 4.0327E+03 6.5860E-02 1.6352E-249 0
Mean 7.6735E+00 4.6130E+03 8.6149E+01 7.5623E+00 3.3260E-02
Var 2.4929E+02 8.4876E+02 8.1698E+02 2.4169E+02 1.2827E+00

f4

Max 2.5311E+09 2.3900E+09 4.8639E+09 3.7041E+09 3.5739E+08
Min 5.2310E+00 2.7690E+08 8.4802E+00 5.0021E+00 8.9799E+00
Mean 6.9192E+05 3.3334E+08 1.1841E+07 9.9162E+05 6.8518E+04
Var 3.6005E+07 1.4428E+08 1.8149E+08 4.5261E+07 4.0563E+06

f5

Max 1.1619E+04 1.3773E+04 1.6188E+04 1.3194E+04 5.1960E+03
Min 5.5043E-15 5.6109E+03 1.1894E-02 4.2090E-15 1.5157E+00
Mean 5.9547E+00 6.3278E+03 5.2064E+01 6.5198E+00 3.5440E+00
Var 2.0533E+02 9.6605E+02 6.2095E+02 2.2457E+02 7.8983E+01

f6

Max 3.2456E+00 7.3566E+00 8.9320E+00 2.8822E+00 1.4244E+00
Min 1.3994E-04 1.2186E+00 2.2482E-03 8.2911E-05 1.0911E-05
Mean 2.1509E-03 1.4021E+00 1.1982E-01 1.9200E-03 6.1476E-04
Var 5.3780E-02 3.8482E-01 3.5554E-01 5.0940E-02 1.9880E-02

f7

Max 4.7215E+02 6.7534E+02 5.6753E+02 5.3090E+02 2.3468E+02
Min 0 2.2001E+02 5.6300E-02 0 0
Mean 2.4908E-01 2.3377E+02 9.2909E+00 3.0558E-01 9.4500E-02
Var 8.5433E+00 3.3856E+01 2.2424E+01 1.0089E+01 3.5569E+00

f8

Max 3.2500E+02 2.4690E+02 2.7226E+02 2.8001E+02 1.7249E+02
Min 1.4678E-239 8.3483E+01 5.9820E-02 8.9624E-239 0
Mean 1.9072E-01 9.1050E+01 7.9923E+00 2.3232E-01 8.1580E-02
Var 6.3211E+00 1.3811E+01 1.7349E+01 6.4400E+00 2.9531E+00


the RF model, as shown in Figure 3, where the horizontal axes represent ntree and mtry, respectively, and the vertical axis represents the accuracy of the RF classification model.

(1) Parameter analysis of ntree

When the number of predictor variables mtry is set to 6, the number of decision trees ntree is cyclically set from 0 to 1000 at intervals of 20, and the evolution of RF classification model accuracy with the change of ntree is shown in Figure 3(a). From the curve in Figure 3(a), we can see that the accuracy of RF gradually improves as the number of decision trees increases. However, when the number of decision trees ntree is greater than a certain value, the improvement of RF performance becomes gentle, with no obvious gain, while the running time becomes longer.

(2) Parameter analysis of mtry

When the number of decision trees ntree is set to 500, the number of predictor variables mtry is cyclically set from 1 to 32. The limit of mtry is set to 32 because the number of attributes of the Ionosphere dataset is 32. The obtained curve of RF classification model accuracy with the change of mtry is shown in Figure 3(b). We can see that, as more splitting attributes are selected, the classification performance of RF gradually improves, but when the number of predictor variables mtry is greater than 9, the RF overfits and its accuracy begins to decrease. The main reason is that too many split attributes are selected, so a large number of decision trees share the same splitting attributes, which reduces the diversity of the decision trees.
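The two parameter sweeps above can be reproduced as a simple grid search over (ntree, mtry); in the sketch below, `evaluate_rf` is a hypothetical stand-in for training an RF with the given parameters and returning its validation accuracy.

```python
def grid_search_rf(evaluate_rf, ntree_values, mtry_values):
    """Return the (ntree, mtry) pair with the highest accuracy,
    trying every combination of the two RF parameters."""
    best_params, best_acc = None, -1.0
    for ntree in ntree_values:
        for mtry in mtry_values:
            acc = evaluate_rf(ntree, mtry)
            if acc > best_acc:
                best_params, best_acc = (ntree, mtry), acc
    return best_params, best_acc

# Toy scorer mimicking Figure 3: accuracy saturates in ntree and
# peaks near mtry = 9, then drops (overfitting).
toy = lambda n, m: min(n, 500) / 500 - 0.01 * abs(m - 9)
params, acc = grid_search_rf(toy, range(20, 1001, 20), range(1, 33))
```

Exhaustive search is what DMSDL-QBSA is meant to avoid: the swarm explores this same two-dimensional space with far fewer evaluations.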

Table 5: Comparison on nine unimodal functions with hybrid algorithms (Dim = 2).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

f1

Max 1.8411E+02 3.4713E+02 2.9347E+02 1.6918E+02 1.6789E+00
Min 0 5.7879E-01 0 0 3.2988E-238
Mean 5.5890E-02 1.1150E+00 1.6095E-01 3.6580E-02 4.2590E-03
Var 2.6628E+00 5.7218E+00 5.4561E+00 2.0867E+00 7.5858E-02

f2

Max 2.2980E+00 2.1935E+00 3.2363E+00 3.1492E+00 1.1020E+00
Min 5.5923E-266 8.2690E-02 2.9096E-242 3.4367E-241 0
Mean 9.2769E-04 9.3960E-02 7.4900E-03 1.2045E-03 2.1565E-04
Var 3.3310E-02 6.9080E-02 4.0100E-02 4.2190E-02 1.3130E-02

f3

Max 1.0647E+02 1.3245E+02 3.6203E+02 2.3793E+02 1.3089E+02
Min 0 5.9950E-01 0 0 0
Mean 2.3040E-02 8.5959E-01 2.5020E-01 6.3560E-02 1.7170E-02
Var 1.2892E+00 2.8747E+00 7.5569E+00 2.9203E+00 1.3518E+00

f4

Max 1.7097E+02 6.1375E+01 6.8210E+01 5.9141E+01 1.4726E+01
Min 1.6325E-21 4.0940E-02 8.2726E-13 3.4830E-25 0
Mean 2.2480E-02 9.3940E-02 1.5730E-02 1.1020E-02 1.4308E-02
Var 1.7987E+00 7.4859E-01 7.6015E-01 6.4984E-01 2.7598E-01

f5

Max 1.5719E+02 2.2513E+02 3.3938E+02 1.8946E+02 8.7078E+01
Min 0 7.0367E-01 0 0 0
Mean 3.4380E-02 1.8850E+00 1.7082E-01 5.0090E-02 1.1880E-02
Var 1.9018E+00 5.6163E+00 5.9868E+00 2.4994E+00 9.1749E-01

f6

Max 1.5887E-01 1.5649E-01 1.5919E-01 1.3461E-01 1.0139E-01
Min 2.5412E-05 4.5060E-04 5.9140E-05 4.1588E-05 7.3524E-06
Mean 2.3437E-04 1.3328E-03 6.0989E-04 2.3462E-04 9.2394E-05
Var 2.4301E-03 3.6700E-03 3.5200E-03 1.9117E-03 1.4664E-03

f7

Max 3.5804E+00 2.8236E+00 1.7372E+00 2.7513E+00 1.9411E+00
Min 0 7.6633E-03 0 0 0
Mean 8.5474E-04 1.6590E-02 8.6701E-04 7.6781E-04 3.1439E-04
Var 4.4630E-02 6.0390E-02 2.4090E-02 3.7520E-02 2.2333E-02

f8

Max 4.3247E+00 2.1924E+00 5.3555E+00 3.3944E+00 5.5079E-01
Min 0 8.6132E-03 0 0 0
Mean 1.1649E-03 1.9330E-02 1.7145E-03 7.3418E-04 6.9138E-05
Var 5.9280E-02 4.7800E-02 7.5810E-02 4.1414E-02 5.7489E-03

f9

Max 2.7030E-02 3.5200E-02 1.7240E-02 4.0480E-02 2.5230E-02
Min 0 5.0732E-03 0 0 0
Mean 6.1701E-05 6.2500E-03 8.8947E-04 8.4870E-05 2.7362E-05
Var 7.6990E-04 1.3062E-03 2.0400E-03 9.6160E-04 5.5610E-04


In summary, for the RF classification model to obtain the ideal optimal solution, the selection of the number of decision trees ntree and the number of predictor variables mtry is very important, and the classification accuracy of the RF classification model can only be optimized by the comprehensive optimization of these two parameters. So, it is necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we will optimize the RF classification model by the improved BSA proposed in Section 2.

3.2. RF Model Based on an Improved Bird Swarm Algorithm. The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) uses the improved bird swarm algorithm to optimize the RF classification model and introduces the training dataset into the training process of the RF classification model, finally yielding the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters ntree and mtry as the optimization target of DMSDL-QBSA, so as to obtain a set of parameters that makes the RF classification model obtain the best classification accuracy. The specific algorithm steps are shown in Algorithm 2.
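The two-dimensional fitness function at the heart of this scheme can be sketched as follows; `train_and_score` and the parameter ranges are illustrative assumptions, not the authors' code. Each bird's position is decoded into integer (ntree, mtry) values, and the classification error is returned for the swarm to minimize.

```python
def make_fitness(train_and_score, ntree_range=(1, 1000), mtry_range=(1, 32)):
    """Build the 2-D fitness function optimized by DMSDL-QBSA: map a
    bird's position to integer (ntree, mtry), clip to valid ranges,
    and return the classification error (1 - accuracy)."""
    def fitness(position):
        ntree = int(round(min(max(position[0], ntree_range[0]), ntree_range[1])))
        mtry = int(round(min(max(position[1], mtry_range[0]), mtry_range[1])))
        return 1.0 - train_and_score(ntree, mtry)
    return fitness

# Toy accuracy function; out-of-range position components are clipped
fit = make_fitness(lambda n, m: 0.9 if (n, m) == (500, 9) else 0.8)
err = fit([500.2, 8.7])
```

Minimizing this error is equivalent to maximizing the model's classification accuracy, so the swarm's global best directly yields the RF parameters.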

3.3. Simulation Experiment and Analysis. In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare it with the standard RF model, BSA-RF model, and DMSDL-BSA-RF model on 8 two-dimensional UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by BSA without quantum behavior. In our experiment, each dataset is divided into two parts: 70% of the dataset is used as the training set, and the remaining 30% is used as the test set. The average classification accuracies of 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model gets the best accuracy on each UCI dataset except the magic dataset, on which the DMSDL-BSA-RF classification model gets the best accuracy. Compared with the standard RF model, the DMSDL-QBSA-RF classification model achieves better accuracy, with an improvement of about 10%. Finally, the

Table 6: Comparison on nine multimodal functions with hybrid algorithms (Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

f10

Max 2.8498E+03 2.8226E+03 3.0564E+03 2.7739E+03 2.8795E+03
Min 1.2553E+02 1.8214E+03 1.2922E+03 1.6446E+02 1.1634E+03
Mean 2.5861E+02 1.9229E+03 1.3185E+03 3.1119E+02 1.2729E+03
Var 2.4093E+02 1.2066E+02 1.2663E+02 2.5060E+02 1.2998E+02

f11

Max 1.2550E+02 1.0899E+02 1.1806E+02 1.1243E+02 9.1376E+01
Min 0 6.3502E+01 1.0751E+01 0 0
Mean 2.0417E-01 6.7394E+01 3.9864E+01 1.3732E-01 6.8060E-02
Var 3.5886E+00 5.8621E+00 1.3570E+01 3.0325E+00 2.0567E+00

f12

Max 2.0021E+01 1.9910E+01 1.9748E+01 1.9254E+01 1.8118E+01
Min 8.8818E-16 1.6575E+01 7.1700E-02 8.8818E-16 8.8818E-16
Mean 3.0500E-02 1.7157E+01 3.0367E+00 3.8520E-02 1.3420E-02
Var 5.8820E-01 5.2968E-01 1.6585E+00 6.4822E-01 4.2888E-01

f13

Max 1.0431E+02 1.3266E+02 1.5115E+02 1.2017E+02 6.1996E+01
Min 0 4.5742E+01 2.1198E-01 0 0
Mean 6.1050E-02 5.2056E+01 3.0613E+00 6.9340E-02 2.9700E-02
Var 1.8258E+00 8.3141E+00 1.5058E+01 2.2452E+00 1.0425E+00

f14

Max 8.4576E+06 3.0442E+07 5.3508E+07 6.2509E+07 8.5231E+06
Min 1.7658E-13 1.9816E+06 4.5685E-05 1.6961E-13 5.1104E-01
Mean 1.3266E+03 3.1857E+06 6.8165E+04 8.8667E+03 1.1326E+03
Var 9.7405E+04 1.4876E+06 1.4622E+06 6.4328E+05 8.7645E+04

f15

Max 1.8310E+08 1.4389E+08 1.8502E+08 1.4578E+08 2.5680E+07
Min 1.7942E-11 1.0497E+07 2.4500E-03 1.1248E-11 9.9870E-01
Mean 3.7089E+04 1.5974E+07 2.0226E+05 3.8852E+04 3.5739E+03
Var 2.0633E+06 1.0724E+07 4.6539E+06 2.1133E+06 2.6488E+05

f16

Max 1.3876E+01 1.4988E+01 1.4849E+01 1.3506E+01 9.3280E+00
Min 4.2410E-174 6.8743E+00 2.5133E-02 7.3524E-176 0
Mean 1.3633E-02 7.2408E+00 2.5045E+00 1.3900E-02 5.1800E-03
Var 3.3567E-01 7.7774E-01 1.0219E+00 3.4678E-01 1.7952E-01

f18

Max 3.6704E+01 3.6950E+01 2.8458E+01 2.6869E+01 2.4435E+01
Min 2.0914E-11 9.7737E+00 3.3997E-03 5.9165E-12 7.5806E-01
Mean 6.5733E-02 1.2351E+01 6.7478E-01 5.6520E-02 7.9392E-01
Var 6.7543E-01 2.8057E+00 1.4666E+00 6.4874E-01 3.6928E-01


DMSDL-QBSA-RF classification model gets the best accuracy on the appendicitis dataset, which is up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is valid on most datasets and shows a good performance on them.

4. Oil Layer Classification Application

4.1. Design of Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1. The selected actual logging datasets are intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis, and the dataset should be relatively independent. The dataset is randomly divided into two parts: training and testing samples.

Step 2. In order to better understand the relationship between independent variables and dependent variables and reduce the sample information attributes, the continuous attributes of the dataset are discretized by using a greedy algorithm.

Step 3. In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize attribute reduction. After attribute reduction, the actual logging datasets are normalized to avoid computational saturation.

Step 4. In the DMSDL-QBSA-RF layer classification model, we input the actual logging dataset after attribute reduction and use the DMSDL-QBSA-RF layer classification algorithm to train, finally obtaining the DMSDL-QBSA-RF layer classification model.

Table 7: Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

f10

Max 2.0101E+02 2.8292E+02 2.9899E+02 2.4244E+02 2.8533E+02
Min 2.5455E-05 3.7717E+00 2.3690E+01 2.5455E-05 9.4751E+01
Mean 3.4882E-01 8.0980E+00 2.5222E+01 4.2346E-01 9.4816E+01
Var 5.7922E+00 1.0853E+01 1.5533E+01 5.6138E+00 2.8160E+00

f11

Max 4.7662E+00 4.7784E+00 1.1067E+01 8.7792E+00 8.1665E+00
Min 0 3.8174E-01 0 0 0
Mean 2.7200E-03 6.0050E-01 3.1540E-02 4.2587E-03 3.7800E-03
Var 8.6860E-02 3.1980E-01 2.4862E-01 1.2032E-01 1.3420E-01

f12

Max 9.6893E+00 8.1811E+00 1.1635E+01 9.1576E+00 8.4720E+00
Min 8.8818E-16 5.1646E-01 8.8818E-16 8.8818E-16 8.8818E-16
Mean 9.5600E-03 6.9734E-01 3.4540E-02 9.9600E-03 2.8548E-03
Var 2.1936E-01 6.1050E-01 2.5816E-01 2.1556E-01 1.1804E-01

f13

Max 4.4609E+00 4.9215E+00 4.1160E+00 1.9020E+00 1.6875E+00
Min 0 1.3718E-01 0 0 0
Mean 1.9200E-03 1.7032E-01 1.8240E-02 1.4800E-03 5.7618E-04
Var 6.6900E-02 1.3032E-01 1.8202E-01 3.3900E-02 2.2360E-02

f14

Max 1.0045E+01 1.9266E+03 1.9212E+01 5.7939E+02 8.2650E+00
Min 2.3558E-31 1.3188E-01 2.3558E-31 2.3558E-31 2.3558E-31
Mean 4.1600E-03 3.5402E-01 1.0840E-02 6.1420E-02 1.3924E-03
Var 1.7174E-01 1.9427E+01 3.9528E-01 5.8445E+00 8.7160E-02

f15

Max 6.5797E+04 4.4041E+03 1.4412E+05 8.6107E+03 2.6372E+00
Min 1.3498E-31 9.1580E-02 1.3498E-31 1.3498E-31 7.7800E-03
Mean 7.1736E+00 8.9370E-01 1.7440E+01 9.0066E-01 8.2551E-03
Var 6.7678E+02 5.4800E+01 1.4742E+03 8.7683E+01 2.8820E-02

f16

Max 6.2468E-01 6.4488E-01 5.1564E-01 8.4452E-01 3.9560E-01
Min 6.9981E-08 2.5000E-03 1.5518E-240 2.7655E-07 0
Mean 2.7062E-04 6.9400E-03 6.8555E-04 2.0497E-04 6.1996E-05
Var 1.0380E-02 1.7520E-02 8.4600E-03 1.0140E-02 4.4000E-03

f17

Max 5.1946E+00 3.6014E+00 2.3463E+00 6.9106E+00 1.2521E+00
Min 2.6445E-11 2.6739E-02 0 1.0855E-10 0
Mean 1.9343E-03 5.1800E-02 1.2245E-03 2.8193E-03 1.5138E-04
Var 7.3540E-02 1.2590E-01 4.1620E-02 1.1506E-01 1.2699E-02

f18

Max 5.0214E-01 3.4034E-01 4.1400E-01 3.7422E-01 4.0295E-01
Min 1.4998E-32 1.9167E-03 1.4998E-32 1.4998E-32 1.4998E-32
Mean 1.0967E-04 4.1000E-03 1.8998E-04 1.4147E-04 6.0718E-05
Var 6.1800E-03 1.0500E-02 6.5200E-03 5.7200E-03 4.4014E-03


Step 5. The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and we output the classification results.

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA is simulated and analyzed on benchmark functions, and in Section 3.3, the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA is tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted, recorded as W1, W2, and W3. W1 is a gas well in Xian (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depth and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before the training of the DMSDL-QBSA-RF classification model on the training dataset, as shown in Table 14. Then, these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, SVM model, BSA-RF model, and DMSDL-BSA-RF model. Here, the RF classification model is applied to the field of logging for the first time. In order to evaluate the performance of the recognition model, we select the following performance indicators:

\mathrm{RMSE} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} (f_i - y_i)^2}, \qquad \mathrm{MAE} = \frac{1}{N} \sum_{i=1}^{N} |f_i - y_i|, \qquad (21)

Table 8: Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function Term GWO WOA SCA GOA SSA DMSDL-QBSA

f1

Max 1.3396E+04 1.4767E+04 1.3310E+04 2.0099E+01 4.8745E+03 9.8570E-01
Min 0 0 4.2905E-293 8.6468E-17 0 0
Mean 3.5990E+00 4.7621E+00 1.4014E+02 7.0100E-02 4.5864E+00 1.0483E-04
Var 1.7645E+02 2.0419E+02 8.5054E+02 4.4200E-01 1.4148E+02 9.8725E-03

f2

Max 3.6021E+02 2.5789E+03 6.5027E+01 9.3479E+01 3.4359E+01 3.2313E-01
Min 0 0 9.8354E-192 2.8954E-03 2.4642E-181 0
Mean 5.0667E-02 2.9480E-01 4.0760E-01 3.1406E+00 1.7000E-02 1.1278E-04
Var 3.7270E+00 2.6091E+01 2.2746E+00 3.9264E+00 5.4370E-01 4.8000E-03

f3

Max 1.8041E+04 1.6789E+04 2.4921E+04 6.5697E+03 1.1382E+04 4.0855E+01
Min 0 1.0581E-18 7.6116E-133 2.8796E-01 3.2956E-253 1.5918E-264
Mean 7.4511E+00 2.8838E+02 4.0693E+02 4.8472E+02 9.2062E+00 4.8381E-03
Var 2.6124E+02 1.4642E+03 1.5913E+03 7.1786E+02 2.9107E+02 4.1383E-01

f4

Max 2.1812E+09 5.4706E+09 8.4019E+09 1.1942E+09 4.9386E+08 7.5188E+01
Min 4.9125E+00 3.5695E+00 5.9559E+00 2.2249E+02 4.9806E+00 2.9279E-13
Mean 4.9592E+05 2.4802E+06 4.4489E+07 5.0021E+06 3.3374E+05 2.4033E-02
Var 2.9484E+07 1.1616E+08 4.5682E+08 3.9698E+07 1.1952E+07 9.7253E-01

f5

Max 1.8222E+04 1.5374E+04 1.5874E+04 1.2132E+03 1.6361E+04 1.8007E+00
Min 1.1334E-08 8.3228E-09 2.3971E-01 2.7566E-10 2.6159E-16 1.0272E-33
Mean 5.1332E+00 5.9967E+00 1.2620E+02 5.8321E+01 8.8985E+00 2.3963E-04
Var 2.3617E+02 2.3285E+02 8.8155E+02 1.0872E+02 2.9986E+02 1.8500E-02

f6

Max 7.4088E+00 8.3047E+00 8.8101E+00 6.8900E-01 4.4298E+00 2.5787E-01
Min 1.8112E-05 3.9349E-05 4.8350E-05 8.9528E-02 4.0807E-05 1.0734E-04
Mean 1.8333E-03 3.2667E-03 4.8400E-02 9.3300E-02 2.1000E-03 5.2825E-04
Var 9.4267E-02 1.0077E-01 2.9410E-01 3.5900E-02 6.5500E-02 4.6000E-03

f7

Max 7.3626E+02 6.8488E+02 8.0796E+02 3.9241E+02 8.2036E+02 1.4770E+01
Min 0 0 1.9441E-292 7.9956E-07 0 0
Mean 2.0490E-01 2.8060E-01 4.9889E+00 1.6572E+01 2.7290E-01 1.8081E-03
Var 9.5155E+00 1.0152E+01 3.5531E+01 2.3058E+01 1.0581E+01 1.6033E-01

f8

Max 1.2749E+03 5.9740E+02 3.2527E+02 2.3425E+02 2.0300E+02 2.0423E+01
Min 0 4.3596E-35 1.5241E-160 3.6588E-05 1.0239E-244 0
Mean 3.1317E-01 1.0582E+01 1.0457E+01 1.2497E+01 2.1870E-01 2.6947E-03
Var 1.4416E+01 4.3485E+01 3.5021E+01 2.5766E+01 6.2362E+00 2.1290E-01


where y_i and f_i are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show actual forecasting errors. Table 15 records the performance indicator data of each classification model, where the best results are marked in bold. The smaller the RMSE and MAE, the better the classification model performs.
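Both indicators in equation (21) are straightforward to compute; the helper below is an assumed implementation consistent with the formulas above.

```python
import math

def rmse(f, y):
    """Root-mean-square error between expected values f and outputs y."""
    return math.sqrt(sum((fi - yi) ** 2 for fi, yi in zip(f, y)) / len(f))

def mae(f, y):
    """Mean absolute error between expected values f and outputs y."""
    return sum(abs(fi - yi) for fi, yi in zip(f, y)) / len(f)

# Binary oil/no-oil labels: expected f vs. model output y
f, y = [1, 0, 1, 1], [1, 0, 0, 1]
```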

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model gets the best recognition accuracy, and all the accuracies are above 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it has superior performance for oil and gas layer classification on the other performance indicators and the different wells. Secondly, DMSDL-QBSA can improve the performance of RF, and the parameters found by DMSDL-QBSA used in the RF classification model can improve the classification accuracy while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution, and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means this depth has no oil or gas, and 1 means this depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results; it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is suitable for petroleum logging applications, which greatly reduces the difficulty of oil exploration, and has a good application prospect.

Table 9: Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function Term GWO WOA SCA GOA SSA DMSDL-QBSA

f10

Max 2.9544E+03 2.9903E+03 2.7629E+03 2.9445E+03 3.2180E+03 3.0032E+03
Min 1.3037E+03 8.6892E-04 1.5713E+03 1.3438E+03 1.2839E-04 1.2922E+03
Mean 1.6053E+03 1.4339E+01 1.7860E+03 1.8562E+03 2.5055E+02 1.2960E+03
Var 2.6594E+02 1.2243E+02 1.6564E+02 5.1605E+02 3.3099E+02 5.8691E+01

f11

Max 1.3792E+02 1.2293E+02 1.2313E+02 1.1249E+02 3.2180E+03 1.8455E+01
Min 0 0 0 1.2437E+01 1.2839E-04 0
Mean 4.4220E-01 1.1252E+00 9.9316E+00 2.8378E+01 2.5055E+02 3.2676E-03
Var 4.3784E+00 6.5162E+00 1.9180E+01 1.6240E+01 3.3099E+02 1.9867E-01

f12

Max 2.0257E+01 2.0043E+01 1.9440E+01 1.6623E+01 3.2180E+03 1.9113E+00
Min 4.4409E-15 3.2567E-15 3.2567E-15 2.3168E+00 1.2839E-04 8.8818E-16
Mean 1.7200E-02 4.2200E-02 8.8870E-01 5.5339E+00 2.5055E+02 3.1275E-04
Var 4.7080E-01 6.4937E-01 3.0887E+00 2.8866E+00 3.3099E+02 2.2433E-02

f13

Max 1.5246E+02 1.6106E+02 1.1187E+02 6.1505E+01 3.2180E+03 3.1560E-01
Min 3.3000E-03 0 0 2.4147E-01 1.2839E-04 0
Mean 4.6733E-02 9.3867E-02 1.2094E+00 3.7540E+00 2.5055E+02 3.3660E-05
Var 1.9297E+00 2.6570E+00 6.8476E+00 4.1936E+00 3.3099E+02 3.1721E-03

f14

Max 9.5993E+07 9.9026E+07 5.9355E+07 6.1674E+06 3.2180E+03 6.4903E-01
Min 3.8394E-09 1.1749E-08 9.6787E-03 1.8099E-04 1.2839E-04 4.7116E-32
Mean 1.2033E+04 3.5007E+04 4.8303E+05 1.0465E+04 2.5055E+02 8.9321E-05
Var 9.8272E+05 1.5889E+06 4.0068E+06 1.9887E+05 3.3099E+02 6.8667E-03

f15

Max 2.2691E+08 2.4717E+08 1.1346E+08 2.8101E+07 3.2180E+03 1.6407E-01
Min 3.2467E-02 4.5345E-08 1.1922E-01 3.5465E-05 1.2839E-04 1.3498E-32
Mean 2.9011E+04 4.3873E+04 6.5529E+05 7.2504E+04 2.5055E+02 6.7357E-05
Var 2.3526E+06 2.7453E+06 7.1864E+06 1.2814E+06 3.3099E+02 2.4333E-03

f16

Max 1.7692E+01 1.7142E+01 1.6087E+01 8.7570E+00 3.2180E+03 1.0959E+00
Min 2.6210E-07 0.0000E+00 6.2663E-155 1.0497E-02 1.2839E-04 0
Mean 1.0133E-02 3.9073E-01 5.9003E-01 2.4770E+00 2.5055E+02 1.5200E-04
Var 3.0110E-01 9.6267E-01 1.4701E+00 1.9985E+00 3.3099E+02 1.1633E-02

f18

Max 4.4776E+01 4.3588E+01 3.9095E+01 1.7041E+01 3.2180E+03 6.5613E-01
Min 1.9360E-01 9.4058E-08 2.2666E-01 3.9111E+00 1.2839E-04 1.4998E-32
Mean 2.1563E-01 4.7800E-02 9.8357E-01 5.4021E+00 2.5055E+02 9.4518E-05
Var 5.8130E-01 1.0434E+00 3.0643E+00 1.6674E+00 3.3099E+02 7.0251E-03


Table 10: Comparison of numerical testing results on CEC2014 test sets (F1–F15, Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F1

Max 2.7411E+08 3.6316E+09 6.8993E+08 3.9664E+08 9.6209E+08
Min 1.4794E+07 3.1738E+09 1.3205E+06 1.6929E+05 1.7687E+05
Mean 3.1107E+07 3.2020E+09 6.7081E+06 1.3567E+06 1.9320E+06
Var 1.2900E+07 4.0848E+07 2.6376E+07 1.0236E+07 1.3093E+07

F2

Max 8.7763E+09 1.3597E+10 1.3515E+10 1.6907E+10 1.7326E+10
Min 1.7455E+09 1.1878E+10 3.6982E+04 2.1268E+05 1.4074E+05
Mean 1.9206E+09 1.1951E+10 1.7449E+08 5.9354E+07 4.8412E+07
Var 1.9900E+08 1.5615E+08 9.2365E+08 5.1642E+08 4.2984E+08

F3

Max 4.8974E+05 8.5863E+04 2.6323E+06 2.4742E+06 5.3828E+06
Min 1.1067E+04 1.4616E+04 1.8878E+03 1.2967E+03 6.3986E+02
Mean 1.3286E+04 1.4976E+04 1.0451E+04 3.5184E+03 3.0848E+03
Var 6.5283E+03 1.9988E+03 4.5736E+04 4.1580E+04 8.0572E+04

F4

Max 3.5252E+03 9.7330E+03 4.5679E+03 4.9730E+03 4.3338E+03
Min 5.0446E+02 8.4482E+03 4.0222E+02 4.0843E+02 4.1267E+02
Mean 5.8061E+02 8.5355E+03 4.4199E+02 4.3908E+02 4.3621E+02
Var 1.4590E+02 1.3305E+02 1.6766E+02 1.2212E+02 8.5548E+01

F5

Max 5.2111E+02 5.2106E+02 5.2075E+02 5.2098E+02 5.2110E+02
Min 5.2001E+02 5.2038E+02 5.2027E+02 5.2006E+02 5.2007E+02
Mean 5.2003E+02 5.2041E+02 5.2033E+02 5.2014E+02 5.2014E+02
Var 7.8380E-02 5.7620E-02 5.6433E-02 1.0577E-01 9.9700E-02

F6

Max 6.1243E+02 6.1299E+02 6.1374E+02 6.1424E+02 6.1569E+02
Min 6.0881E+02 6.1157E+02 6.0514E+02 6.0288E+02 6.0257E+02
Mean 6.0904E+02 6.1164E+02 6.0604E+02 6.0401E+02 6.0358E+02
Var 2.5608E-01 1.2632E-01 1.0434E+00 1.1717E+00 1.3117E+00

F7

Max 8.7895E+02 1.0459E+03 9.1355E+02 1.0029E+03 9.3907E+02
Min 7.4203E+02 1.0238E+03 7.0013E+02 7.0081E+02 7.0069E+02
Mean 7.4322E+02 1.0253E+03 7.1184E+02 7.0332E+02 7.0290E+02
Var 4.3160E+00 2.4258E+00 2.8075E+01 1.5135E+01 1.3815E+01

F8

Max 8.9972E+02 9.1783E+02 9.6259E+02 9.2720E+02 9.3391E+02
Min 8.4904E+02 8.8172E+02 8.3615E+02 8.1042E+02 8.0877E+02
Mean 8.5087E+02 8.8406E+02 8.5213E+02 8.1888E+02 8.1676E+02
Var 2.1797E+00 4.4494E+00 8.6249E+00 9.2595E+00 9.7968E+00

F9

Max 1.0082E+03 9.9538E+02 1.0239E+03 1.0146E+03 1.0366E+03
Min 9.3598E+02 9.6805E+02 9.2725E+02 9.2062E+02 9.1537E+02
Mean 9.3902E+02 9.7034E+02 9.4290E+02 9.2754E+02 9.2335E+02
Var 3.4172E+00 3.2502E+00 8.1321E+00 9.0492E+00 9.3860E+00

F10

Max 3.1087E+03 3.5943E+03 3.9105E+03 3.9116E+03 3.4795E+03
Min 2.2958E+03 2.8792E+03 1.5807E+03 1.4802E+03 1.4316E+03
Mean 1.8668E+03 2.9172E+03 1.7627E+03 1.6336E+03 1.6111E+03
Var 3.2703E+01 6.1787E+01 2.2359E+02 1.9535E+02 2.2195E+02

F11

Max 3.6641E+03 3.4628E+03 4.0593E+03 3.5263E+03 3.7357E+03
Min 2.4726E+03 2.8210E+03 1.7855E+03 1.4012E+03 1.2811E+03
Mean 2.5553E+03 2.8614E+03 1.8532E+03 1.5790E+03 1.4869E+03
Var 8.4488E+01 5.6351E+01 1.3201E+02 2.1085E+02 2.6682E+02

F12

Max 1.2044E+03 1.2051E+03 1.2053E+03 1.2055E+03 1.2044E+03
Min 1.2006E+03 1.2014E+03 1.2004E+03 1.2003E+03 1.2017E+03
Mean 1.2007E+03 1.2017E+03 1.2007E+03 1.2003E+03 1.2018E+03
Var 1.4482E-01 3.5583E-01 2.5873E-01 2.4643E-01 2.1603E-01

F13

Max 1.3049E+03 1.3072E+03 1.3073E+03 1.3056E+03 1.3061E+03
Min 1.3005E+03 1.3067E+03 1.3009E+03 1.3003E+03 1.3004E+03
Mean 1.3006E+03 1.3068E+03 1.3011E+03 1.3005E+03 1.3006E+03
Var 3.2018E-01 5.0767E-02 4.2173E-01 3.6570E-01 2.9913E-01

F14

Max 1.4395E+03 1.4565E+03 1.4775E+03 1.4749E+03 1.4493E+03
Min 1.4067E+03 1.4522E+03 1.4009E+03 1.4003E+03 1.4005E+03
Mean 1.4079E+03 1.4529E+03 1.4024E+03 1.4008E+03 1.4009E+03
Var 2.1699E+00 6.3013E-01 5.3198E+00 3.2578E+00 2.8527E+00

F15

Max 5.4068E+04 3.3586E+04 4.8370E+05 4.0007E+05 1.9050E+05
Min 1.9611E+03 2.1347E+04 1.5029E+03 1.5027E+03 1.5026E+03
Mean 1.9933E+03 2.2417E+04 2.7920E+03 1.5860E+03 1.5816E+03
Var 6.1622E+02 1.2832E+03 1.7802E+04 4.4091E+03 2.7233E+03


Table 11: Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F16

Max 1.6044E+03 1.6044E+03 1.6046E+03 1.6044E+03 1.6046E+03
Min 1.6034E+03 1.6041E+03 1.6028E+03 1.6021E+03 1.6019E+03
Mean 1.6034E+03 1.6041E+03 1.6032E+03 1.6024E+03 1.6024E+03
Var 3.3300E-02 3.5900E-02 2.5510E-01 3.7540E-01 3.6883E-01

F17

Max 1.3711E+07 1.6481E+06 3.3071E+07 4.6525E+07 8.4770E+06
Min 1.2526E+05 4.7216E+05 2.7261E+03 2.0177E+03 2.0097E+03
Mean 1.6342E+05 4.8499E+05 8.7769E+04 3.1084E+04 2.1095E+04
Var 2.0194E+05 2.9949E+04 1.2284E+06 8.7638E+05 2.0329E+05

F18

Max 7.7173E+07 3.5371E+07 6.1684E+08 1.4216E+08 6.6050E+08
Min 4.1934E+03 2.6103E+06 2.2743E+03 1.8192E+03 1.8288E+03
Mean 1.6509E+04 3.4523E+06 1.2227E+06 1.0781E+05 1.9475E+05
Var 7.9108E+05 1.4888E+06 2.1334E+07 3.6818E+06 1.0139E+07

F19

Max 1.9851E+03 2.5657E+03 2.0875E+03 1.9872E+03 1.9555E+03
Min 1.9292E+03 2.4816E+03 1.9027E+03 1.9023E+03 1.9028E+03
Mean 1.9299E+03 2.4834E+03 1.9044E+03 1.9032E+03 1.9036E+03
Var 1.0820E+00 3.8009E+00 1.1111E+01 3.3514E+00 1.4209E+00

F20

Max 8.3021E+06 2.0160E+07 1.0350E+07 6.1162E+07 1.2708E+08
Min 5.6288E+03 1.7838E+06 2.1570E+03 2.0408E+03 2.0241E+03
Mean 1.2260E+04 1.8138E+06 5.6957E+03 1.1337E+04 4.5834E+04
Var 1.0918E+05 5.9134E+05 1.3819E+05 6.4167E+05 2.1988E+06

F21

Max 2.4495E+06 1.7278E+09 2.0322E+06 1.3473E+07 2.6897E+07
Min 4.9016E+03 1.4049E+09 3.3699E+03 2.1842E+03 2.2314E+03
Mean 6.6613E+03 1.4153E+09 9.9472E+03 1.3972E+04 7.4587E+03
Var 4.1702E+04 2.2557E+07 3.3942E+04 2.5098E+05 2.8735E+05

F22

Max 2.8304E+03 4.9894E+03 3.1817E+03 3.1865E+03 3.2211E+03
Min 2.5070E+03 4.1011E+03 2.3492E+03 2.2442E+03 2.2314E+03
Mean 2.5081E+03 4.1175E+03 2.3694E+03 2.2962E+03 2.2687E+03
Var 5.4064E+00 3.8524E+01 4.3029E+01 4.8006E+01 5.1234E+01

F23

Max 2.8890E+03 2.6834E+03 2.8758E+03 3.0065E+03 2.9923E+03
Min 2.5000E+03 2.6031E+03 2.4870E+03 2.5000E+03 2.5000E+03
Mean 2.5004E+03 2.6088E+03 2.5326E+03 2.5010E+03 2.5015E+03
Var 9.9486E+00 8.6432E+00 5.7045E+01 1.4213E+01 1.7156E+01

F24

Max 2.6293E+03 2.6074E+03 2.6565E+03 2.6491E+03 2.6369E+03
Min 2.5816E+03 2.6049E+03 2.5557E+03 2.5246E+03 2.5251E+03
Mean 2.5829E+03 2.6052E+03 2.5671E+03 2.5337E+03 2.5338E+03
Var 1.3640E+00 3.3490E-01 8.9434E+00 1.3050E+01 1.1715E+01

F25

Max 2.7133E+03 2.7014E+03 2.7445E+03 2.7122E+03 2.7327E+03
Min 2.6996E+03 2.7003E+03 2.6784E+03 2.6789E+03 2.6635E+03
Mean 2.6996E+03 2.7004E+03 2.6831E+03 2.6903E+03 2.6894E+03
Var 2.3283E-01 9.9333E-02 4.9609E+00 7.1175E+00 1.2571E+01

F26

Max 2.7056E+03 2.8003E+03 2.7058E+03 2.7447E+03 2.7116E+03
Min 2.7008E+03 2.8000E+03 2.7003E+03 2.7002E+03 2.7002E+03
Mean 2.7010E+03 2.8000E+03 2.7005E+03 2.7003E+03 2.7003E+03
Var 3.1316E-01 1.6500E-02 3.9700E-01 1.1168E+00 3.5003E-01

F27

Max 3.4052E+03 5.7614E+03 3.3229E+03 3.4188E+03 3.4144E+03
Min 2.9000E+03 3.9113E+03 2.9698E+03 2.8347E+03 2.7054E+03
Mean 2.9038E+03 4.0351E+03 2.9816E+03 2.8668E+03 2.7219E+03
Var 3.2449E+01 2.0673E+02 3.4400E+01 6.2696E+01 6.1325E+01

F28

Max 4.4333E+03 5.4138E+03 4.5480E+03 4.3490E+03 4.8154E+03
Min 3.0000E+03 4.0092E+03 3.4908E+03 3.0000E+03 3.0000E+03
Mean 3.0079E+03 4.0606E+03 3.6004E+03 3.0063E+03 3.0065E+03
Var 6.8101E+01 9.2507E+01 8.5705E+01 4.2080E+01 5.3483E+01

F29

Max 2.5038E+07 1.6181E+08 5.9096E+07 7.0928E+07 6.4392E+07
Min 3.2066E+03 1.0014E+08 3.1755E+03 3.1005E+03 3.3287E+03
Mean 5.7057E+04 1.0476E+08 8.5591E+04 6.8388E+04 2.8663E+04
Var 4.9839E+05 6.7995E+06 1.7272E+06 1.5808E+06 8.6740E+05


Figure 3: The effect of the two parameters on the performance of RF models: (a) the effect of ntree (accuracy over ntree = 0–1000); (b) the effect of mtry (accuracy over mtry = 0–35).

Variable setting: number of iterations iter; bird positions Xi; local optimum Pi; global optimal position gbest; global optimum Pgbest; out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF
(1) Begin
(2) /* Build the classification model based on DMSDL-QBSA-RF */
(3) Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4) Calculate the fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5) While iter < itermax + 1 do
(6)   For i = 1 : ntree
(7)     Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)     Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)     Recursively generate each decision tree without pruning operations
(10)  End For
(11)  Update the classification accuracy of RF: evaluate f(Xi)
(12)  Update gbest and Pgbest
(13)  [nbest, mbest] = gbest
(14)  iter = iter + 1
(15) End While
(16) /* Classify using the RF model */
(17) For i = 1 : nbest
(18)  Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)  Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)  Recursively generate each decision tree without pruning operations
(21) End
(22) Return the DMSDL-QBSA-RF classification model
(23) Classify the test dataset using equation (20)
(24) Calculate OOB_error
(25) End

Algorithm 2: DMSDL-QBSA-RF classification model.
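The training phase of Algorithm 2 (bootstrap sampling with replacement, random selection of mtry attributes per split, and out-of-bag error estimation) can be illustrated with a minimal sketch. This is not the paper's implementation: to stay short it uses one-level decision stumps instead of full unpruned trees, handles binary labels only, and the names `train_forest` and `predict` are illustrative.

```python
import random
from collections import Counter

def train_forest(X, y, ntree=25, mtry=2, seed=0):
    """Bagging + random-subspace sketch: each 'tree' is a one-level decision
    stump fit on a bootstrap sample; OOB error comes from samples a tree
    never saw during training."""
    rng = random.Random(seed)
    n, D = len(X), len(X[0])
    forest = []
    oob_votes = [Counter() for _ in range(n)]
    for _ in range(ntree):
        idx = [rng.randrange(n) for _ in range(n)]   # bootstrap sample (step 7)
        feats = rng.sample(range(D), mtry)           # mtry random attributes (step 8)
        best = None
        for f in feats:
            for t in sorted({row[f] for row in X}):
                hi = [i for i in idx if X[i][f] > t]
                lo = [i for i in idx if X[i][f] <= t]
                # orient each side of the split by its majority class
                label_hi = int(sum(y[i] for i in hi) * 2 >= len(hi)) if hi else 0
                label_lo = int(sum(y[i] for i in lo) * 2 >= len(lo)) if lo else 0
                acc = (sum(y[i] == label_hi for i in hi)
                       + sum(y[i] == label_lo for i in lo)) / n
                if best is None or acc > best[0]:
                    best = (acc, f, t, label_hi, label_lo)
        _, f, t, label_hi, label_lo = best
        forest.append((f, t, label_hi, label_lo))
        for i in set(range(n)) - set(idx):           # out-of-bag samples
            oob_votes[i][label_hi if X[i][f] > t else label_lo] += 1
    scored = [i for i in range(n) if oob_votes[i]]
    oob_error = sum(oob_votes[i].most_common(1)[0][0] != y[i]
                    for i in scored) / max(len(scored), 1)
    return forest, oob_error

def predict(forest, x):
    """Majority vote over all stumps (the RF decision process)."""
    votes = Counter(lh if x[f] > t else ll for f, t, lh, ll in forest)
    return votes.most_common(1)[0][0]
```

With the (nbest, mbest) pair found by the optimizer, the final model of step 17 would correspond to retraining via `train_forest(X, y, ntree=nbest, mtry=mbest)`.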

Table 11 Continued

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F30

Max 5.2623E+05 2.8922E+07 1.1938E+06 1.2245E+06 1.1393E+06
Min 5.9886E+03 1.9017E+07 3.7874E+03 3.6290E+03 3.6416E+03
Mean 7.4148E+03 2.0002E+07 5.5468E+03 4.3605E+03 4.1746E+03
Var 1.1434E+04 1.1968E+06 3.2255E+04 2.2987E+04 1.8202E+04


Figure 4: Block diagram of the oil layer classification system based on DMSDL-QBSA-RF. The information to be identified passes through data selection and preprocessing (removing redundant attributes and pretreatment), attribute discretization, attribute reduction and data normalization, and the model built on DMSDL-QBSA-RF, whose classification result is the output.

Table 13: Distribution of the logging data.

Well | Training dataset: Depth (m), Data, Oil/gas layers, Dry layers | Test dataset: Depth (m), Data, Oil/gas layers, Dry layers
W1 | 3027~3058, 250, 203, 47 | 3250~3450, 1600, 237, 1363
W2 | 2642~2648, 30, 10, 20 | 2940~2978, 191, 99, 92
W3 | 1040~1059, 160, 47, 113 | 1120~1290, 1114, 96, 1018

Table 12: Comparison of numerical testing results on eight UCI datasets.

Dataset        RF (%)  BSA-RF (%)  DMSDL-BSA-RF (%)  DMSDL-QBSA-RF (%)
Blood          75.40   80.80       79.46             81.25
Heart-statlog  82.10   86.42       86.42             86.42
Sonar          78.39   85.48       85.48             88.71
Appendicitis   83.23   87.10       90.32             93.55
Cleve          79.55   87.50       88.64             89.77
Magic          88.95   89.85       90.25             89.32
Mammographic   74.73   77.68       76.79             79.46
Australian     85.83   88.84       88.84             90.78

Table 14: Attribute reduction results of the logging dataset.

W1: Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI. Reduced attributes: GR, DT, SP, LLD, LLS, DEN, NPHI.
W2: Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW. Reduced attributes: NEUTRO, PERM, POR, RESI, SW.
W3: Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1. Reduced attributes: AC, GR, RI, RXO, SP.
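The curves in Figure 5 show each retained attribute scaled into [0, 1]. The paper does not restate its normalization formula at this point; a common choice consistent with the figure is min-max scaling, sketched here (the function name is illustrative):

```python
def min_max_normalize(column):
    """Scale one log attribute (a list of raw values) into [0, 1]."""
    lo, hi = min(column), max(column)
    if hi == lo:                      # constant attribute: map everything to 0
        return [0.0 for _ in column]
    return [(v - lo) / (hi - lo) for v in column]
```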

Figure 5: The normalized curves of attributes: (a, b) attribute normalization of W1; (c, d) attribute normalization of W2; (e, f) attribute normalization of W3.

Table 15: Performance of various well data.

Well  Classification model  RMSE    MAE       Accuracy (%)  Running time (s)
W1    RF                    0.3326  0.1106    88.94         1.5167
W1    SVM                   0.2681  0.0719    92.81         4.5861
W1    BSA-RF                0.3269  0.1069    89.31         1.8064
W1    DMSDL-BSA-RF          0.2806  0.0788    92.13         2.3728
W1    DMSDL-QBSA-RF         0.2449  0.0600    94.00         1.4663
W2    RF                    0.4219  0.1780    82.20         3.1579
W2    SVM                   0.2983  0.0890    91.10         4.2604
W2    BSA-RF                0.3963  0.1571    84.29         1.2979
W2    DMSDL-BSA-RF          0.2506  0.062827  93.72         1.6124
W2    DMSDL-QBSA-RF         0.2399  0.0576    94.24         1.2287
W3    RF                    0.4028  0.1622    83.78         2.4971
W3    SVM                   0.2507  0.0628    93.72         2.1027
W3    BSA-RF                0.3631  0.1318    86.81         1.3791
W3    DMSDL-BSA-RF          0.2341  0.0548    94.52         0.3125
W3    DMSDL-QBSA-RF         0.0519  0.0027    99.73         0.9513
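The RMSE and MAE columns of Table 15 are the standard error metrics over the predicted and actual layer labels; for reference, they can be computed as:

```python
import math

def rmse(y_true, y_pred):
    """Root-mean-square error."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(a - b) for a, b in zip(y_true, y_pred)) / len(y_true)
```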



5. Conclusion

This paper presents an improved BSA, called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions were used to verify the effectiveness of the improved method. The experimental study of the effects of these three strategies on the performance of DMSDL-QBSA revealed that the hybrid method markedly improves on the original BSA and, especially, the original DE. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions were used to verify the performance of DMSDL-QBSA more comprehensively, and DMSDL-QBSA again showed excellent performance. Finally, the improved DMSDL-QBSA was used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem proved that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than the original RF model. At the same time, it ran faster than the other four advanced classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Because it hybridizes three strategies, DMSDL-QBSA needs more time than the classical BSA; therefore, improving its recognition efficiency is a worthwhile direction. In future research, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and Project of Industry-University Cooperative Education of Ministry of Education of China (no. 201801335014).

Figure 6: Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) the DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) the DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) the DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.

[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical report TR06, Erciyes University, Kayseri, Turkey, 2005.

[4] X. Li, A new intelligent optimization method: artificial fish school algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.

[5] X. S. Yang, "Firefly algorithms for multimodal optimization: stochastic algorithms, foundations and applications," in Proceedings of the 5th International Symposium on Stochastic Algorithms, SAGA, Berlin, Germany, 2009.

[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.

[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673–687, 2016.

[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University - Computer and Information Sciences, vol. 32, pp. 514–521, 2020.

[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," in Neural Computing and Applications, Springer, New York, NY, USA, 2019.

[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.

[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.

[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197–207, 2014.

[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.

[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17–28, 2014.

[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137–146, 2012.

[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, Multi-Strategy Brain Storm Optimization Algorithm with Dynamic Parameters Adjustment, Applied Intelligence, Guildford, UK, 2010.

[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189–120201, 2019.

[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.

[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.

[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.

[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.

[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.

[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.

[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.

[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.

[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," IEEE Transactions on Evolutionary Computation, 2019.

[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.

[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.

[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.

[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.

[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.

[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC 96), Nagoya, Japan, May 1996.

[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.

[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.

[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.

[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China Information Sciences, vol. 62, no. 7, 2019.

[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.

[41] S. Saremi, S. Mirjalili, and A. Lewis, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.

[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22–34, 2020.

[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.


BSA can find the minimum value. However, DMSDL-QBSA has the best mean value and variance on each function; the main reason is that DMSDL-QBSA has better population diversity during the initialization period. In summary, DMSDL-QBSA has the best performance on unimodal functions compared to the other algorithms when Dim = 10, and it clearly has a relatively good convergence speed.

The evolution curves of these algorithms on four unimodal functions, f1, f5, f6, and f9, are drawn in Figure 1. It can be seen from the figure that the curve of DMSDL-QBSA descends fastest, within a number of iterations far less than 10000. For f1, f5, and f9, DMSDL-QBSA has the fastest convergence speed compared with the other algorithms, while the original BSA obtained the worst solution because it is trapped in a local optimum prematurely. For function f6, none of these algorithms found the value 0; however, the convergence speed of DMSDL-QBSA is significantly faster than the other algorithms in the early stage, and the solution it eventually finds is the best. Overall, owing to the enhanced diversity of the population, DMSDL-QBSA has a relatively excellent convergence speed when Dim = 2.

Variable setting: number of iterations iter; bird positions Xi; local optimum Pi; global optimal position gbest; global optimum Pgbest; fitness of the sub-swarm optimal position Plbest

Input: population size N; dimension size D; number of function evaluations itermax; time interval of each bird flight behavior FQ; probability of foraging behavior P; constant parameters c1, c2, a1, a2, FL; iter = 0; contraction-expansion factor β

Output: global optimal position gbest; fitness of the global optimal position Pgbest
(1) Begin
(2) Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(3) Calculate the fitness Fi (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(4) While iter < itermax + 1 do
(5)   /* Dynamic sub-swarm */
(6)   Regroup Pi and Xi of the sub-swarms randomly
(7)   Sort the Plbest and refine the first [0.25 * N] best Plbest
(8)   Update the corresponding Pi
(9)   /* Differential learning scheme */
(10)  For i = 1 : N
(11)    Construct lbest using DMS-BSA
(12)    Differential evolution: construct V using equations (9) and (10)
(13)    Crossover: construct U using equation (11)
(14)    Selection: construct GV using equation (12)
(15)  End For
(16)  /* Birds' position adjusting */
(17)  If (t mod FQ ≠ 0)
(18)    For i = 1 : N
(19)      If rand < P
(20)        Foraging behavior: update the position Xi of the birds using equation (8)
(21)      Else
(22)        Vigilance behavior: update the position Xi of the birds using equation (2)
(23)      End If
(24)    End For
(25)  Else
(26)    Flight behavior: the birds are divided into producers and scroungers
(27)    For i = 1 : N
(28)      If Xi is a producer
(29)        Producers: update the position Xi of the birds using equation (5)
(30)      Else
(31)        Scroungers: update the position Xi of the birds using equation (6)
(32)      End If
(33)    End For
(34)  End If
(35)  Evaluate f(Xi)
(36)  Update gbest and Pgbest
(37)  iter = iter + 1
(38) End While
(39) End

Algorithm 1: DMSDL-QBSA.
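Lines 12–14 of Algorithm 1 are the standard DE operators (mutation, binomial crossover, greedy selection), and the quantum behavior relies on a contraction-expansion factor β. The sketch below is an illustration under stated assumptions, not the paper's exact equations (9)–(12) and (16)–(17): `de_step` uses the classic DE/rand/1/bin form with F and CR as in Table 3, and `quantum_move` follows the standard QPSO position update.

```python
import math
import random

def de_step(swarm, fitness, i, F=0.5, CR=0.1, rng=random):
    """DE/rand/1/bin update of swarm[i]: mutation, crossover, greedy selection."""
    N, D = len(swarm), len(swarm[0])
    r1, r2, r3 = rng.sample([k for k in range(N) if k != i], 3)
    # mutation: V = X_r1 + F * (X_r2 - X_r3)
    V = [swarm[r1][d] + F * (swarm[r2][d] - swarm[r3][d]) for d in range(D)]
    # binomial crossover: U inherits V's component with probability CR
    jrand = rng.randrange(D)
    U = [V[d] if (rng.random() < CR or d == jrand) else swarm[i][d]
         for d in range(D)]
    # selection: keep the trial vector only if it does not worsen the fitness
    return U if fitness(U) <= fitness(swarm[i]) else swarm[i]

def quantum_move(x, p_i, gbest, mbest, beta=0.75, rng=random):
    """QPSO-style jump around a random attractor between P_i and gbest,
    with step length scaled by beta * |mbest - x| (standard QPSO form)."""
    out = []
    for d in range(len(x)):
        phi = rng.random()
        attractor = phi * p_i[d] + (1 - phi) * gbest[d]
        L = beta * abs(mbest[d] - x[d]) * math.log(1.0 / max(rng.random(), 1e-12))
        out.append(attractor + L if rng.random() < 0.5 else attractor - L)
    return out
```

Here `mbest` would be the mean of the personal best positions, and β is typically decreased over the iterations.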


According to the results of Table 5, DMSDL-QBSA achieves the best performance on these 8 unimodal functions when Dim = 2. DMSDL-QBSA finds the minimum value of 0 on f1, f2, f3, f4, f5, f7, f8, and f9, and it performs better on f2, f4, and f9 compared with DMSDL-BSA and DMSDL-PSO. DE does not perform well on these functions, but BSA performs relatively well on f1, f3, f6, f7, f8, and f9; the main reason is that BSA has better convergence performance in the early search. Obviously, DMSDL-QBSA can find the best values of the two parameters of RF that need to be optimized.

(2) Multimodal functions
From the numerical testing results on 8 multimodal functions in Table 6, we can see that DMSDL-QBSA can find the optimal solution for all multimodal functions and obtains the minimum value of 0 on f11, f13, and f16. DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, and f16; BSA works best on f10; DMSDL-PSO does not perform very well. Moreover, DMSDL-QBSA has the best mean value and variance on most functions. The main reason is that DMSDL-QBSA has a stronger global exploration capability based on the dynamic multi-swarm method and differential evolution. In summary, DMSDL-QBSA performs relatively well on multimodal functions compared to the other algorithms at the typical Dim = 10, and it clearly has a relatively good global search capability.

The evolution curves of these algorithms on four multimodal functions, f12, f13, f17, and f18, when Dim = 2 are depicted in Figure 2. We can see that DMSDL-QBSA can find the optimal solution within the same number of iterations. For f13 and f17, the curve of DMSDL-QBSA continues to decline, whereas the original BSA and DE yield flat lines because of their poor global convergence ability. For functions f12 and f18, although DMSDL-QBSA is also trapped in a local optimum, it finds the minimum value compared to the other algorithms. Obviously, the convergence speed of DMSDL-QBSA is significantly faster than the other algorithms in the early stage, and the solution it eventually finds is the best. In general, owing to the enhanced diversity of the population, DMSDL-QBSA has a relatively balanced global search capability when Dim = 2.

Furthermore, from the numerical testing results on nine multimodal functions in Table 7, we can see that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f16, f17, and f18. DMSDL-QBSA obtains the minimum value of 0 on f11, f13, f16, and f17; BSA obtains the minimum value of 0 on f11, f13, and f17; and DE has not obtained the minimum value of 0

Table 1: 18 benchmark functions.

Sphere: f_1(x) = \sum_{i=1}^{D} x_i^2, range [-100, 100]^D
Schwefel P2.22: f_2(x) = \sum_{i=1}^{D} |x_i| + \prod_{i=1}^{D} |x_i|, range [-10, 10]^D
Schwefel P1.2: f_3(x) = \sum_{i=1}^{D} (\sum_{j=1}^{i} x_j)^2, range [-100, 100]^D
Generalized Rosenbrock: f_4(x) = \sum_{i=1}^{D-1} [100 (x_i^2 - x_{i+1})^2 + (x_i - 1)^2], range [-100, 100]^D
Step: f_5(x) = \sum_{i=1}^{D} (|x_i + 0.5|)^2, range [-100, 100]^D
Noise: f_6(x) = \sum_{i=1}^{D} i x_i^4 + rand[0, 1), range [-1.28, 1.28]^D
SumSquares: f_7(x) = \sum_{i=1}^{D} i x_i^2, range [-10, 10]^D
Zakharov: f_8(x) = \sum_{i=1}^{D} x_i^2 + (\sum_{i=1}^{D} 0.5 i x_i)^2 + (\sum_{i=1}^{D} 0.5 i x_i)^4, range [-10, 10]^D
Schaffer: f_9(x, y) = 0.5 + (\sin^2(x^2 - y^2) - 0.5) / [1 + 0.001 (x^2 + y^2)]^2, range [-10, 10]^D
Generalized Schwefel 2.26: f_10(x) = 418.9829 D - \sum_{i=1}^{D} x_i \sin(\sqrt{|x_i|}), range [-500, 500]^D
Generalized Rastrigin: f_11(x) = \sum_{i=1}^{D} [x_i^2 - 10 \cos(2 \pi x_i) + 10], range [-5.12, 5.12]^D
Ackley: f_12(x) = 20 + e - 20 \exp(-0.2 \sqrt{(1/D) \sum_{i=1}^{D} x_i^2}) - \exp((1/D) \sum_{i=1}^{D} \cos(2 \pi x_i)), range [-32, 32]^D
Generalized Griewank: f_13(x) = \sum_{i=1}^{D} (x_i^2 / 4000) - \prod_{i=1}^{D} \cos(x_i / \sqrt{i}) + 1, range [-600, 600]^D
Generalized Penalized 1: f_14(x) = (\pi/D) \{10 \sin^2(\pi y_1) + \sum_{i=1}^{D-1} (y_i - 1)^2 [1 + 10 \sin^2(\pi y_{i+1})] + (y_D - 1)^2\} + \sum_{i=1}^{D} u(x_i, 10, 100, 4), where y_i = 1 + (x_i + 1)/4 and u(x_i, a, k, m) = k (x_i - a)^m if x_i > a; 0 if -a <= x_i <= a; k (-x_i - a)^m if x_i < -a, range [-50, 50]^D
Generalized Penalized 2: f_15(x) = 0.1 \{\sin^2(3 \pi x_1) + \sum_{i=1}^{D-1} (x_i - 1)^2 [1 + \sin^2(3 \pi x_{i+1})] + (x_D - 1)^2 [1 + \sin^2(2 \pi x_D)]\} + \sum_{i=1}^{D} u(x_i, 5, 100, 4), range [-50, 50]^D
Alpine: f_16(x) = \sum_{i=1}^{D} |x_i \sin x_i + 0.1 x_i|, range [-10, 10]^D
Booth: f_17(x) = (x_1 + 2 x_2 - 7)^2 + (2 x_1 + x_2 - 5)^2, range [-10, 10]^D
Levy: f_18(x) = \sin^2(\pi y_1) + \sum_{i=1}^{D-1} (y_i - 1)^2 [1 + 10 \sin^2(\pi y_i + 1)] + (y_D - 1)^2 [1 + \sin^2(2 \pi y_D)], where y_i = 1 + (x_i - 1)/4, range [-10, 10]^D
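A few of the benchmarks in Table 1 translate directly into code; each of the functions below attains its minimum value of 0 at the origin.

```python
import math

def sphere(x):
    """f1: Sphere function."""
    return sum(v * v for v in x)

def ackley(x):
    """f12: Ackley function."""
    d = len(x)
    return (20 + math.e
            - 20 * math.exp(-0.2 * math.sqrt(sum(v * v for v in x) / d))
            - math.exp(sum(math.cos(2 * math.pi * v) for v in x) / d))

def griewank(x):
    """f13: Generalized Griewank function."""
    s = sum(v * v for v in x) / 4000
    p = 1.0
    for i, v in enumerate(x, start=1):
        p *= math.cos(v / math.sqrt(i))
    return s - p + 1
```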


on any function. DMSDL-BSA obtains the minimum value of 0 on f11 and f13. In summary, DMSDL-QBSA has a superior global search capability on most multimodal functions when Dim = 2. Obviously, DMSDL-QBSA can find the best values of the two parameters of RF that need to be optimized because of its excellent global search capability.

In this section, it can be seen from Figures 1 and 2 and Tables 4-7 that DMSDL-QBSA obtains the best function values in most cases. This indicates that the hybrid strategies of BSA, the dynamic multi-swarm method, DE, and the quantum behavior operators lead the birds to move towards the best solutions, and that DMSDL-QBSA is well able to search for the best two parameters of RF with high accuracy and efficiency.

2.3.4. Comparison on Benchmark Functions with Popular Algorithms. To compare the timeliness and applicability of DMSDL-QBSA with several popular algorithms, such as GWO, WOA, SCA, GOA, and SSA, the 18 benchmark functions are applied; GWO, WOA, GOA, and SSA are swarm intelligence algorithms. In this experiment, the dimension size of these functions is 10, and the number of function evaluations (FEs) is 100000. The maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) obtained by the different algorithms are shown in Tables 8 and 9, where the best results are marked in bold.

Table 2: Summary of the CEC'14 test functions (F_i* = F_i(x*) is the minimum).

Unimodal functions:
F1: Rotated High-Conditioned Elliptic Function, 100
F2: Rotated Bent Cigar Function, 200
F3: Rotated Discus Function, 300

Simple multimodal functions:
F4: Shifted and Rotated Rosenbrock's Function, 400
F5: Shifted and Rotated Ackley's Function, 500
F6: Shifted and Rotated Weierstrass Function, 600
F7: Shifted and Rotated Griewank's Function, 700
F8: Shifted Rastrigin's Function, 800
F9: Shifted and Rotated Rastrigin's Function, 900
F10: Shifted Schwefel's Function, 1000
F11: Shifted and Rotated Schwefel's Function, 1100
F12: Shifted and Rotated Katsuura Function, 1200
F13: Shifted and Rotated HappyCat Function, 1300
F14: Shifted and Rotated HGBat Function, 1400
F15: Shifted and Rotated Expanded Griewank's plus Rosenbrock's Function, 1500
F16: Shifted and Rotated Expanded Scaffer's F6 Function, 1600

Hybrid functions:
F17: Hybrid Function 1 (N = 3), 1700
F18: Hybrid Function 2 (N = 3), 1800
F19: Hybrid Function 3 (N = 4), 1900
F20: Hybrid Function 4 (N = 4), 2000
F21: Hybrid Function 5 (N = 5), 2100
F22: Hybrid Function 6 (N = 5), 2200

Composition functions:
F23: Composition Function 1 (N = 5), 2300
F24: Composition Function 2 (N = 3), 2400
F25: Composition Function 3 (N = 3), 2500
F26: Composition Function 4 (N = 5), 2600
F27: Composition Function 5 (N = 5), 2700
F28: Composition Function 6 (N = 5), 2800
F29: Composition Function 7 (N = 3), 2900
F30: Composition Function 8 (N = 3), 3000

Search range: [-100, 100]^D

Table 3: Parameter settings.

BSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10
DE: F = 0.5, CR = 0.1
DMSDL-PSO: c1 = c2 = 1.49445, F = 0.5, CR = 0.1, sub-swarms = 10
DMSDL-BSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarms = 10
DMSDL-QBSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarms = 10


From the test results in Table 8, we can see that DMSDL-QBSA has the best performance on each unimodal function. GWO finds the value 0 on f1, f2, f3, f7, and f8; WOA obtains 0 on f1, f2, and f7; SSA works best on f1 and f7. For the multimodal function evaluations, Table 9 shows that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, f16, and f18; SSA has the best performance on f10; GWO attains the minimum on f11; WOA and SCA obtain the optimal value on f11 and f13. Obviously, compared with these popular algorithms, DMSDL-QBSA is a competitive algorithm for solving most functions, and the swarm intelligence algorithms perform better than the other algorithms. The results of Tables 8 and 9 show that DMSDL-QBSA has the best performance on most of the tested benchmark functions.

2.3.5. Comparison on CEC2014 Test Functions with Hybrid Algorithms. To compare the comprehensive performance of the proposed DMSDL-QBSA with several hybrid algorithms, such as BSA, DE, DMSDL-PSO, and DMSDL-BSA, the 30 CEC2014 test functions are applied. In this experiment, the dimension size (Dim) is set to 10, and the number of function evaluations (FEs) is 100000. Experimental comparisons including the maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var) are given in Tables 10 and 11, where the best results are marked in bold.

Based on the mean value (Mean) on the CEC2014 test functions, DMSDL-QBSA has the best performance on F2, F3, F4, F6, F7, F8, F9, F10, F11, F15, F16, F17, F21, F26, F27, F29, and F30, while DMSDL-BSA shows an advantage on F1, F12,

[Figure 1 shows log-scale fitness value versus iteration curves for BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA; panel titles: (a) Sphere, (b) Step, (c) Noise, (d) Schaffer.]

Figure 1: Fitness value curves of 5 hybrid algorithms on (a) f1, (b) f5, (c) f6, and (d) f9 (Dim = 2).


F13, F14, F18, F19, F20, F24, F25, and F28. According to the results, we can observe that DMSDL-QBSA finds the minimal value on 17 CEC2014 test functions, DMSDL-BSA gets the minimum value on F1, F12, F13, F14, F18, F19, F24, and F30, and DMSDL-PSO obtains the minimum value on F4, F7, and F23. Owing to its enhanced exploitation capability, DMSDL-QBSA is better than DMSDL-BSA and DMSDL-PSO on most functions. From these test results, it can be seen that DMSDL-QBSA performs better than BSA, DE, DMSDL-PSO, and DMSDL-BSA and obtains the optimal values. It can be concluded that DMSDL-QBSA has better global search ability and better robustness on these test suites.

3. Optimizing the RF Classification Model Based on the Improved BSA

3.1. RF Classification Model. RF, as proposed by Breiman et al., is an ensemble learning model based on bagging and random subspace methods. The whole modeling process includes building decision trees and decision processes. The process of constructing decision trees is mainly composed of

[Figure 2 shows log-scale fitness value versus iteration curves for BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA; panel titles: (a) Ackley, (b) Griewank, (c) Booth, (d) Levy.]

Figure 2: Fitness value curves of 5 hybrid algorithms on (a) f12, (b) f13, (c) f17, and (d) f18 (Dim = 2).


ntree decision trees, each of which consists of nonleaf nodes and leaf nodes. A leaf node is a child node of the node branch. Suppose the dataset has M attributes. When each leaf node of the decision tree needs to be segmented, mtry attributes are randomly selected from the M attributes as the reselected splitting variables of this node. This process can be defined as follows:

\[ S_j = \left\{ P_i \mid i = 1, 2, \ldots, m_{try} \right\} \tag{18} \]

where Sj is the splitting variable of the j-th leaf node of the decision tree and Pi is the probability that the mtry reselected attributes are selected as the splitting attribute of the node.
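As an illustration of the per-node attribute reselection in equation (18), the following Python sketch draws mtry candidate splitting attributes out of M; the function name and arguments are ours, not the authors' implementation.

```python
import random

def candidate_split_attributes(M, mtry, rng=random):
    """Randomly reselect mtry of the M attributes as the candidate
    splitting variables of one leaf node (cf. equation (18))."""
    if not 1 <= mtry <= M:
        raise ValueError("mtry must lie in [1, M]")
    # sample without replacement: each attribute appears at most once
    return sorted(rng.sample(range(M), mtry))

# Each call draws an independent candidate set for a new node.
S_j = candidate_split_attributes(M=34, mtry=6)
```

Each node of each tree repeats this draw, which is what keeps the trees of the forest diverse.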

The nonleaf node is a parent node that classifies training data into a left or right child node. The function of the k-th decision tree is as follows:

\[ h_k(c \mid x) = \begin{cases} 0, & f_l(x_k) < f_l(x_k) + \tau \\ 1, & f_l(x_k) \geq f_l(x_k) + \tau \end{cases} \quad l = 1, 2, \ldots, n_{tree} \tag{19} \]

where c = 0 or 1: the symbol 0 indicates that the k-th row of data is classified with a negative label, and the symbol 1
indicates that the k-th row of data is classified with a positive label. Here, fl is the training function of the l-th decision tree based on the splitting variable S, xk is the k-th row of data in the dataset obtained by random sampling with replacement, and τ is a positive constant used as the threshold value of the training decision.

When decision processes are trained, each row of data is input into a leaf node of each decision tree. The average of the ntree decision tree classification results is used as the final classification result. This process can be written mathematically as follows:

\[ c_k = l \times \frac{1}{n_{tree}} \sum_{k=1}^{n_{tree}} h_k(c \mid x) \tag{20} \]

where l is the number of decision trees that judged the k-th row of data as class c.
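The averaging-and-decision step around equation (20) can be sketched as follows; thresholding the average vote at 0.5 (majority vote) is our reading of the aggregation, not code from the paper.

```python
def forest_classify(tree_outputs):
    """Aggregate the 0/1 outputs h_k(c|x) of the n_tree decision trees:
    average the votes (cf. equation (20)) and take the majority class."""
    avg = sum(tree_outputs) / len(tree_outputs)
    return 1 if avg >= 0.5 else 0

label = forest_classify([1, 0, 1, 1, 0])  # 3 of 5 trees vote positive
```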

From the above principle, we can see that it is mainly necessary to determine the two parameters ntree and mtry in the RF modeling process. In order to verify the influence of these two parameters on the classification accuracy of the RF classification model, the Ionosphere dataset is used to test the influence of the two parameters on the performance of

Table 4: Comparison on nine unimodal functions with 5 hybrid algorithms (Dim = 10).

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
f1 | Max | 1.0317E+04 | 1.2466E+04 | 1.7232E+04 | 1.0874E+04 | 7.4436E+01
f1 | Min | 0 | 4.0214E+03 | 1.6580E-02 | 0 | 0
f1 | Mean | 6.6202E+00 | 4.6622E+03 | 6.0643E+01 | 6.3444E+00 | 3.3920E-02
f1 | Var | 2.0892E+02 | 8.2293E+02 | 7.3868E+02 | 1.9784E+02 | 1.2343E+00
f2 | Max | 3.3278E+02 | 1.8153E+02 | 8.0436E+01 | 4.9554E+02 | 2.9845E+01
f2 | Min | 4.9286E-182 | 1.5889E+01 | 1.0536E-01 | 3.0074E-182 | 0
f2 | Mean | 5.7340E-02 | 1.8349E+01 | 2.9779E+00 | 6.9700E-02 | 1.1220E-02
f2 | Var | 3.5768E+00 | 4.8296E+00 | 2.4966E+00 | 5.1243E+00 | 4.2864E-01
f3 | Max | 1.3078E+04 | 1.3949E+04 | 1.9382E+04 | 1.2899E+04 | 8.4935E+01
f3 | Min | 3.4873E-250 | 4.0327E+03 | 6.5860E-02 | 1.6352E-249 | 0
f3 | Mean | 7.6735E+00 | 4.6130E+03 | 8.6149E+01 | 7.5623E+00 | 3.3260E-02
f3 | Var | 2.4929E+02 | 8.4876E+02 | 8.1698E+02 | 2.4169E+02 | 1.2827E+00
f4 | Max | 2.5311E+09 | 2.3900E+09 | 4.8639E+09 | 3.7041E+09 | 3.5739E+08
f4 | Min | 5.2310E+00 | 2.7690E+08 | 8.4802E+00 | 5.0021E+00 | 8.9799E+00
f4 | Mean | 6.9192E+05 | 3.3334E+08 | 1.1841E+07 | 9.9162E+05 | 6.8518E+04
f4 | Var | 3.6005E+07 | 1.4428E+08 | 1.8149E+08 | 4.5261E+07 | 4.0563E+06
f5 | Max | 1.1619E+04 | 1.3773E+04 | 1.6188E+04 | 1.3194E+04 | 5.1960E+03
f5 | Min | 5.5043E-15 | 5.6109E+03 | 1.1894E-02 | 4.2090E-15 | 1.5157E+00
f5 | Mean | 5.9547E+00 | 6.3278E+03 | 5.2064E+01 | 6.5198E+00 | 3.5440E+00
f5 | Var | 2.0533E+02 | 9.6605E+02 | 6.2095E+02 | 2.2457E+02 | 7.8983E+01
f6 | Max | 3.2456E+00 | 7.3566E+00 | 8.9320E+00 | 2.8822E+00 | 1.4244E+00
f6 | Min | 1.3994E-04 | 1.2186E+00 | 2.2482E-03 | 8.2911E-05 | 1.0911E-05
f6 | Mean | 2.1509E-03 | 1.4021E+00 | 1.1982E-01 | 1.9200E-03 | 6.1476E-04
f6 | Var | 5.3780E-02 | 3.8482E-01 | 3.5554E-01 | 5.0940E-02 | 1.9880E-02
f7 | Max | 4.7215E+02 | 6.7534E+02 | 5.6753E+02 | 5.3090E+02 | 2.3468E+02
f7 | Min | 0 | 2.2001E+02 | 5.6300E-02 | 0 | 0
f7 | Mean | 2.4908E-01 | 2.3377E+02 | 9.2909E+00 | 3.0558E-01 | 9.4500E-02
f7 | Var | 8.5433E+00 | 3.3856E+01 | 2.2424E+01 | 1.0089E+01 | 3.5569E+00
f8 | Max | 3.2500E+02 | 2.4690E+02 | 2.7226E+02 | 2.8001E+02 | 1.7249E+02
f8 | Min | 1.4678E-239 | 8.3483E+01 | 5.9820E-02 | 8.9624E-239 | 0
f8 | Mean | 1.9072E-01 | 9.1050E+01 | 7.9923E+00 | 2.3232E-01 | 8.1580E-02
f8 | Var | 6.3211E+00 | 1.3811E+01 | 1.7349E+01 | 6.4400E+00 | 2.9531E+00


the RF model, as shown in Figure 3, where the horizontal axes represent ntree and mtry, respectively, and the vertical axis represents the accuracy of the RF classification model.

(1) Parameter analysis of ntree

When the number of predictor variables mtry is set to 6, the number of decision trees ntree is cyclically set from 0 to 1000 at intervals of 20, and the evolution of the RF classification model accuracy with ntree is shown in Figure 3(a). From the curve in Figure 3(a), we can see that the accuracy of RF gradually improves as the number of decision trees ntree increases. However, once ntree exceeds a certain value, the improvement in RF performance levels off with no obvious gain, while the running time becomes longer.

(2) Parameter analysis of mtry

When the number of decision trees ntree is set to 500, the number of predictor variables mtry is cyclically set from 1 to 32. The limit of mtry is set to 32 because the Ionosphere dataset has 32 attributes. The obtained curve of RF classification model accuracy against mtry is shown in Figure 3(b). We can see that, as more splitting attributes are selected, the classification performance of RF gradually improves; but when mtry is greater than 9, RF overfits and its accuracy begins to decrease. The main reason is that selecting too many split attributes causes a large number of decision trees to share the same splitting attributes, which reduces the diversity of the decision trees.
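The two sweeps above amount to a grid search over (ntree, mtry). A minimal, library-free sketch follows; the accuracy surface is a toy stand-in mimicking the shape of Figure 3 (saturation in ntree, a peak near mtry = 9), not the Ionosphere results.

```python
def best_rf_params(score, ntree_grid, mtry_grid):
    """Exhaustive sweep over (ntree, mtry) pairs, keeping the pair with
    the highest accuracy returned by `score`; in a real experiment
    `score` would wrap RF training and validation."""
    best = max((score(n, m), n, m) for n in ntree_grid for m in mtry_grid)
    return best[1], best[2], best[0]

# Toy accuracy surface (illustrative only): saturates once ntree
# reaches ~500 and peaks at mtry = 9.
toy_acc = lambda n, m: min(n, 500) / 500 - abs(m - 9) / 100
n_best, m_best, acc_best = best_rf_params(
    toy_acc, range(20, 1001, 20), range(1, 33))
```

The exhaustive sweep is what the proposed DMSDL-QBSA is meant to replace with a far cheaper guided search.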

Table 5: Comparison on nine unimodal functions with hybrid algorithms (Dim = 2).

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
f1 | Max | 1.8411E+02 | 3.4713E+02 | 2.9347E+02 | 1.6918E+02 | 1.6789E+00
f1 | Min | 0 | 5.7879E-01 | 0 | 0 | 3.2988E-238
f1 | Mean | 5.5890E-02 | 1.1150E+00 | 1.6095E-01 | 3.6580E-02 | 4.2590E-03
f1 | Var | 2.6628E+00 | 5.7218E+00 | 5.4561E+00 | 2.0867E+00 | 7.5858E-02
f2 | Max | 2.2980E+00 | 2.1935E+00 | 3.2363E+00 | 3.1492E+00 | 1.1020E+00
f2 | Min | 5.5923E-266 | 8.2690E-02 | 2.9096E-242 | 3.4367E-241 | 0
f2 | Mean | 9.2769E-04 | 9.3960E-02 | 7.4900E-03 | 1.2045E-03 | 2.1565E-04
f2 | Var | 3.3310E-02 | 6.9080E-02 | 4.0100E-02 | 4.2190E-02 | 1.3130E-02
f3 | Max | 1.0647E+02 | 1.3245E+02 | 3.6203E+02 | 2.3793E+02 | 1.3089E+02
f3 | Min | 0 | 5.9950E-01 | 0 | 0 | 0
f3 | Mean | 2.3040E-02 | 8.5959E-01 | 2.5020E-01 | 6.3560E-02 | 1.7170E-02
f3 | Var | 1.2892E+00 | 2.8747E+00 | 7.5569E+00 | 2.9203E+00 | 1.3518E+00
f4 | Max | 1.7097E+02 | 6.1375E+01 | 6.8210E+01 | 5.9141E+01 | 1.4726E+01
f4 | Min | 1.6325E-21 | 4.0940E-02 | 8.2726E-13 | 3.4830E-25 | 0
f4 | Mean | 2.2480E-02 | 9.3940E-02 | 1.5730E-02 | 1.1020E-02 | 1.4308E-02
f4 | Var | 1.7987E+00 | 7.4859E-01 | 7.6015E-01 | 6.4984E-01 | 2.7598E-01
f5 | Max | 1.5719E+02 | 2.2513E+02 | 3.3938E+02 | 1.8946E+02 | 8.7078E+01
f5 | Min | 0 | 7.0367E-01 | 0 | 0 | 0
f5 | Mean | 3.4380E-02 | 1.8850E+00 | 1.7082E-01 | 5.0090E-02 | 1.1880E-02
f5 | Var | 1.9018E+00 | 5.6163E+00 | 5.9868E+00 | 2.4994E+00 | 9.1749E-01
f6 | Max | 1.5887E-01 | 1.5649E-01 | 1.5919E-01 | 1.3461E-01 | 1.0139E-01
f6 | Min | 2.5412E-05 | 4.5060E-04 | 5.9140E-05 | 4.1588E-05 | 7.3524E-06
f6 | Mean | 2.3437E-04 | 1.3328E-03 | 6.0989E-04 | 2.3462E-04 | 9.2394E-05
f6 | Var | 2.4301E-03 | 3.6700E-03 | 3.5200E-03 | 1.9117E-03 | 1.4664E-03
f7 | Max | 3.5804E+00 | 2.8236E+00 | 1.7372E+00 | 2.7513E+00 | 1.9411E+00
f7 | Min | 0 | 7.6633E-03 | 0 | 0 | 0
f7 | Mean | 8.5474E-04 | 1.6590E-02 | 8.6701E-04 | 7.6781E-04 | 3.1439E-04
f7 | Var | 4.4630E-02 | 6.0390E-02 | 2.4090E-02 | 3.7520E-02 | 2.2333E-02
f8 | Max | 4.3247E+00 | 2.1924E+00 | 5.3555E+00 | 3.3944E+00 | 5.5079E-01
f8 | Min | 0 | 8.6132E-03 | 0 | 0 | 0
f8 | Mean | 1.1649E-03 | 1.9330E-02 | 1.7145E-03 | 7.3418E-04 | 6.9138E-05
f8 | Var | 5.9280E-02 | 4.7800E-02 | 7.5810E-02 | 4.1414E-02 | 5.7489E-03
f9 | Max | 2.7030E-02 | 3.5200E-02 | 1.7240E-02 | 4.0480E-02 | 2.5230E-02
f9 | Min | 0 | 5.0732E-03 | 0 | 0 | 0
f9 | Mean | 6.1701E-05 | 6.2500E-03 | 8.8947E-04 | 8.4870E-05 | 2.7362E-05
f9 | Var | 7.6990E-04 | 1.3062E-03 | 2.0400E-03 | 9.6160E-04 | 5.5610E-04


In summary, for the RF classification model to obtain the ideal optimal solution, the selection of the number of decision trees ntree and the number of predictor variables mtry is very important, and the classification accuracy of the RF classification model can only be optimized by tuning these two parameters jointly. So, it is necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we optimize the RF classification model by the improved BSA proposed in Section 2.

3.2. RF Model Based on an Improved Bird Swarm Algorithm. The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) optimizes the RF classification model with the improved bird swarm algorithm and introduces the training dataset into the training process of the RF classification model, finally yielding the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters ntree and mtry as the optimization target of DMSDL-QBSA, so as to obtain a set of parameters that makes the RF classification model achieve the best classification accuracy. The specific algorithm steps are shown in Algorithm 2.
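A sketch of the two-dimensional fitness evaluation described above: a candidate position is decoded into integer (ntree, mtry) values and scored through a hypothetical `train_and_score` callback; minimizing this fitness maximizes RF accuracy. This is our illustration, not Algorithm 2 itself.

```python
def rf_fitness(position, train_and_score):
    """Decode a 2-D candidate position into (ntree, mtry) and return the
    classification error 1 - accuracy, which DMSDL-QBSA minimizes.
    `train_and_score` is a hypothetical callback that trains an RF with
    these parameters and returns validation accuracy in [0, 1]."""
    n_tree = max(1, int(round(position[0])))
    m_try = max(1, int(round(position[1])))
    return 1.0 - train_and_score(n_tree, m_try)
```

Any continuous optimizer can then move bird positions in this 2-D space, with rounding handling the integer-valued parameters.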

3.3. Simulation Experiment and Analysis. In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare the improved classification model with the standard RF model, the BSA-RF model, and the DMSDL-BSA-RF model on 8 two-dimensional UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by the BSA without quantum behavior. In our experiment, each dataset is divided into two parts: 70% of the dataset is used as the training set and the remaining 30% as the test set. The average classification accuracies of 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model gets the best accuracy on each UCI dataset except the magic dataset, on which the DMSDL-BSA-RF classification model gets the best accuracy. Compared with the standard RF model, the accuracy of the DMSDL-QBSA-RF classification model is better, increased by about 10%. Finally, the

Table 6: Comparison on nine multimodal functions with hybrid algorithms (Dim = 10).

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
f10 | Max | 2.8498E+03 | 2.8226E+03 | 3.0564E+03 | 2.7739E+03 | 2.8795E+03
f10 | Min | 1.2553E+02 | 1.8214E+03 | 1.2922E+03 | 1.6446E+02 | 1.1634E+03
f10 | Mean | 2.5861E+02 | 1.9229E+03 | 1.3185E+03 | 3.1119E+02 | 1.2729E+03
f10 | Var | 2.4093E+02 | 1.2066E+02 | 1.2663E+02 | 2.5060E+02 | 1.2998E+02
f11 | Max | 1.2550E+02 | 1.0899E+02 | 1.1806E+02 | 1.1243E+02 | 9.1376E+01
f11 | Min | 0 | 6.3502E+01 | 1.0751E+01 | 0 | 0
f11 | Mean | 2.0417E-01 | 6.7394E+01 | 3.9864E+01 | 1.3732E-01 | 6.8060E-02
f11 | Var | 3.5886E+00 | 5.8621E+00 | 1.3570E+01 | 3.0325E+00 | 2.0567E+00
f12 | Max | 2.0021E+01 | 1.9910E+01 | 1.9748E+01 | 1.9254E+01 | 1.8118E+01
f12 | Min | 8.8818E-16 | 1.6575E+01 | 7.1700E-02 | 8.8818E-16 | 8.8818E-16
f12 | Mean | 3.0500E-02 | 1.7157E+01 | 3.0367E+00 | 3.8520E-02 | 1.3420E-02
f12 | Var | 5.8820E-01 | 5.2968E-01 | 1.6585E+00 | 6.4822E-01 | 4.2888E-01
f13 | Max | 1.0431E+02 | 1.3266E+02 | 1.5115E+02 | 1.2017E+02 | 6.1996E+01
f13 | Min | 0 | 4.5742E+01 | 2.1198E-01 | 0 | 0
f13 | Mean | 6.1050E-02 | 5.2056E+01 | 3.0613E+00 | 6.9340E-02 | 2.9700E-02
f13 | Var | 1.8258E+00 | 8.3141E+00 | 1.5058E+01 | 2.2452E+00 | 1.0425E+00
f14 | Max | 8.4576E+06 | 3.0442E+07 | 5.3508E+07 | 6.2509E+07 | 8.5231E+06
f14 | Min | 1.7658E-13 | 1.9816E+06 | 4.5685E-05 | 1.6961E-13 | 5.1104E-01
f14 | Mean | 1.3266E+03 | 3.1857E+06 | 6.8165E+04 | 8.8667E+03 | 1.1326E+03
f14 | Var | 9.7405E+04 | 1.4876E+06 | 1.4622E+06 | 6.4328E+05 | 8.7645E+04
f15 | Max | 1.8310E+08 | 1.4389E+08 | 1.8502E+08 | 1.4578E+08 | 2.5680E+07
f15 | Min | 1.7942E-11 | 1.0497E+07 | 2.4500E-03 | 1.1248E-11 | 9.9870E-01
f15 | Mean | 3.7089E+04 | 1.5974E+07 | 2.0226E+05 | 3.8852E+04 | 3.5739E+03
f15 | Var | 2.0633E+06 | 1.0724E+07 | 4.6539E+06 | 2.1133E+06 | 2.6488E+05
f16 | Max | 1.3876E+01 | 1.4988E+01 | 1.4849E+01 | 1.3506E+01 | 9.3280E+00
f16 | Min | 4.2410E-174 | 6.8743E+00 | 2.5133E-02 | 7.3524E-176 | 0
f16 | Mean | 1.3633E-02 | 7.2408E+00 | 2.5045E+00 | 1.3900E-02 | 5.1800E-03
f16 | Var | 3.3567E-01 | 7.7774E-01 | 1.0219E+00 | 3.4678E-01 | 1.7952E-01
f18 | Max | 3.6704E+01 | 3.6950E+01 | 2.8458E+01 | 2.6869E+01 | 2.4435E+01
f18 | Min | 2.0914E-11 | 9.7737E+00 | 3.3997E-03 | 5.9165E-12 | 7.5806E-01
f18 | Mean | 6.5733E-02 | 1.2351E+01 | 6.7478E-01 | 5.6520E-02 | 7.9392E-01
f18 | Var | 6.7543E-01 | 2.8057E+00 | 1.4666E+00 | 6.4874E-01 | 3.6928E-01


DMSDL-QBSA-RF classification model gets the best accuracy on the appendicitis dataset, which is up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is valid on most datasets and performs well on them.

4. Oil Layer Classification Application

4.1. Design of the Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1. The selected actual logging datasets are intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis and relatively independent. Each dataset is randomly divided into two parts of training and testing samples.

Step 2. In order to better understand the relationship between independent and dependent variables and reduce the sample information attributes, the continuous attributes of the dataset are discretized by using a greedy algorithm.
Step 3. In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize attribute reduction. After attribute reduction, the actual logging datasets are normalized to avoid computational saturation.
Step 4. For the DMSDL-QBSA-RF layer classification model, we input the actual logging dataset after attribute reduction, train with the DMSDL-QBSA-RF layer classification algorithm, and finally obtain the DMSDL-QBSA-RF layer classification model.

Table 7: Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
f10 | Max | 2.0101E+02 | 2.8292E+02 | 2.9899E+02 | 2.4244E+02 | 2.8533E+02
f10 | Min | 2.5455E-05 | 3.7717E+00 | 2.3690E+01 | 2.5455E-05 | 9.4751E+01
f10 | Mean | 3.4882E-01 | 8.0980E+00 | 2.5222E+01 | 4.2346E-01 | 9.4816E+01
f10 | Var | 5.7922E+00 | 1.0853E+01 | 1.5533E+01 | 5.6138E+00 | 2.8160E+00
f11 | Max | 4.7662E+00 | 4.7784E+00 | 1.1067E+01 | 8.7792E+00 | 8.1665E+00
f11 | Min | 0 | 3.8174E-01 | 0 | 0 | 0
f11 | Mean | 2.7200E-03 | 6.0050E-01 | 3.1540E-02 | 4.2587E-03 | 3.7800E-03
f11 | Var | 8.6860E-02 | 3.1980E-01 | 2.4862E-01 | 1.2032E-01 | 1.3420E-01
f12 | Max | 9.6893E+00 | 8.1811E+00 | 1.1635E+01 | 9.1576E+00 | 8.4720E+00
f12 | Min | 8.8818E-16 | 5.1646E-01 | 8.8818E-16 | 8.8818E-16 | 8.8818E-16
f12 | Mean | 9.5600E-03 | 6.9734E-01 | 3.4540E-02 | 9.9600E-03 | 2.8548E-03
f12 | Var | 2.1936E-01 | 6.1050E-01 | 2.5816E-01 | 2.1556E-01 | 1.1804E-01
f13 | Max | 4.4609E+00 | 4.9215E+00 | 4.1160E+00 | 1.9020E+00 | 1.6875E+00
f13 | Min | 0 | 1.3718E-01 | 0 | 0 | 0
f13 | Mean | 1.9200E-03 | 1.7032E-01 | 1.8240E-02 | 1.4800E-03 | 5.7618E-04
f13 | Var | 6.6900E-02 | 1.3032E-01 | 1.8202E-01 | 3.3900E-02 | 2.2360E-02
f14 | Max | 1.0045E+01 | 1.9266E+03 | 1.9212E+01 | 5.7939E+02 | 8.2650E+00
f14 | Min | 2.3558E-31 | 1.3188E-01 | 2.3558E-31 | 2.3558E-31 | 2.3558E-31
f14 | Mean | 4.1600E-03 | 3.5402E-01 | 1.0840E-02 | 6.1420E-02 | 1.3924E-03
f14 | Var | 1.7174E-01 | 1.9427E+01 | 3.9528E-01 | 5.8445E+00 | 8.7160E-02
f15 | Max | 6.5797E+04 | 4.4041E+03 | 1.4412E+05 | 8.6107E+03 | 2.6372E+00
f15 | Min | 1.3498E-31 | 9.1580E-02 | 1.3498E-31 | 1.3498E-31 | 7.7800E-03
f15 | Mean | 7.1736E+00 | 8.9370E-01 | 1.7440E+01 | 9.0066E-01 | 8.2551E-03
f15 | Var | 6.7678E+02 | 5.4800E+01 | 1.4742E+03 | 8.7683E+01 | 2.8820E-02
f16 | Max | 6.2468E-01 | 6.4488E-01 | 5.1564E-01 | 8.4452E-01 | 3.9560E-01
f16 | Min | 6.9981E-08 | 2.5000E-03 | 1.5518E-240 | 2.7655E-07 | 0
f16 | Mean | 2.7062E-04 | 6.9400E-03 | 6.8555E-04 | 2.0497E-04 | 6.1996E-05
f16 | Var | 1.0380E-02 | 1.7520E-02 | 8.4600E-03 | 1.0140E-02 | 4.4000E-03
f17 | Max | 5.1946E+00 | 3.6014E+00 | 2.3463E+00 | 6.9106E+00 | 1.2521E+00
f17 | Min | 2.6445E-11 | 2.6739E-02 | 0 | 1.0855E-10 | 0
f17 | Mean | 1.9343E-03 | 5.1800E-02 | 1.2245E-03 | 2.8193E-03 | 1.5138E-04
f17 | Var | 7.3540E-02 | 1.2590E-01 | 4.1620E-02 | 1.1506E-01 | 1.2699E-02
f18 | Max | 5.0214E-01 | 3.4034E-01 | 4.1400E-01 | 3.7422E-01 | 4.0295E-01
f18 | Min | 1.4998E-32 | 1.9167E-03 | 1.4998E-32 | 1.4998E-32 | 1.4998E-32
f18 | Mean | 1.0967E-04 | 4.1000E-03 | 1.8998E-04 | 1.4147E-04 | 6.0718E-05
f18 | Var | 6.1800E-03 | 1.0500E-02 | 6.5200E-03 | 5.7200E-03 | 4.4014E-03


Step 5. The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and we output the classification results.

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA was simulated and analyzed on benchmark functions, and in Section 3.3, the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA was tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted, recorded as W1, W2, and W3. W1 is a gas well in Xi'an (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depths and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before the DMSDL-QBSA-RF classification model is trained on the training dataset, as shown in Table 14. Then, these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.
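The normalization of Step 3 can be sketched with a simple min-max scaling of each logging attribute to [0, 1]; this is a generic illustration, since the paper does not spell out its exact scaling formula.

```python
def min_max_normalize(column):
    """Scale one logging attribute to [0, 1] to avoid computational
    saturation; a constant column maps to all zeros."""
    lo, hi = min(column), max(column)
    if hi == lo:
        return [0.0] * len(column)
    return [(v - lo) / (hi - lo) for v in column]

normalized = min_max_normalize([55.0, 75.0, 95.0])  # e.g., one logging curve
```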

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, the SVM model, the BSA-RF model, and the DMSDL-BSA-RF model. Here, the RF classification model is applied to the field of logging for the first time. In order to evaluate the performance of the recognition models, we select the following performance indicators:

\[ \mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(f_i - y_i\right)^2}, \qquad \mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\left|f_i - y_i\right| \tag{21} \]

Table 8: Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function | Term | GWO | WOA | SCA | GOA | SSA | DMSDL-QBSA
f1 | Max | 1.3396E+04 | 1.4767E+04 | 1.3310E+04 | 2.0099E+01 | 4.8745E+03 | 9.8570E-01
f1 | Min | 0 | 0 | 4.2905E-293 | 8.6468E-17 | 0 | 0
f1 | Mean | 3.5990E+00 | 4.7621E+00 | 1.4014E+02 | 7.0100E-02 | 4.5864E+00 | 1.0483E-04
f1 | Var | 1.7645E+02 | 2.0419E+02 | 8.5054E+02 | 4.4200E-01 | 1.4148E+02 | 9.8725E-03
f2 | Max | 3.6021E+02 | 2.5789E+03 | 6.5027E+01 | 9.3479E+01 | 3.4359E+01 | 3.2313E-01
f2 | Min | 0 | 0 | 9.8354E-192 | 2.8954E-03 | 2.4642E-181 | 0
f2 | Mean | 5.0667E-02 | 2.9480E-01 | 4.0760E-01 | 3.1406E+00 | 1.7000E-02 | 1.1278E-04
f2 | Var | 3.7270E+00 | 2.6091E+01 | 2.2746E+00 | 3.9264E+00 | 5.4370E-01 | 4.8000E-03
f3 | Max | 1.8041E+04 | 1.6789E+04 | 2.4921E+04 | 6.5697E+03 | 1.1382E+04 | 4.0855E+01
f3 | Min | 0 | 1.0581E-18 | 7.6116E-133 | 2.8796E-01 | 3.2956E-253 | 1.5918E-264
f3 | Mean | 7.4511E+00 | 2.8838E+02 | 4.0693E+02 | 4.8472E+02 | 9.2062E+00 | 4.8381E-03
f3 | Var | 2.6124E+02 | 1.4642E+03 | 1.5913E+03 | 7.1786E+02 | 2.9107E+02 | 4.1383E-01
f4 | Max | 2.1812E+09 | 5.4706E+09 | 8.4019E+09 | 1.1942E+09 | 4.9386E+08 | 7.5188E+01
f4 | Min | 4.9125E+00 | 3.5695E+00 | 5.9559E+00 | 2.2249E+02 | 4.9806E+00 | 2.9279E-13
f4 | Mean | 4.9592E+05 | 2.4802E+06 | 4.4489E+07 | 5.0021E+06 | 3.3374E+05 | 2.4033E-02
f4 | Var | 2.9484E+07 | 1.1616E+08 | 4.5682E+08 | 3.9698E+07 | 1.1952E+07 | 9.7253E-01
f5 | Max | 1.8222E+04 | 1.5374E+04 | 1.5874E+04 | 1.2132E+03 | 1.6361E+04 | 1.8007E+00
f5 | Min | 1.1334E-08 | 8.3228E-09 | 2.3971E-01 | 2.7566E-10 | 2.6159E-16 | 1.0272E-33
f5 | Mean | 5.1332E+00 | 5.9967E+00 | 1.2620E+02 | 5.8321E+01 | 8.8985E+00 | 2.3963E-04
f5 | Var | 2.3617E+02 | 2.3285E+02 | 8.8155E+02 | 1.0872E+02 | 2.9986E+02 | 1.8500E-02
f6 | Max | 7.4088E+00 | 8.3047E+00 | 8.8101E+00 | 6.8900E-01 | 4.4298E+00 | 2.5787E-01
f6 | Min | 1.8112E-05 | 3.9349E-05 | 4.8350E-05 | 8.9528E-02 | 4.0807E-05 | 1.0734E-04
f6 | Mean | 1.8333E-03 | 3.2667E-03 | 4.8400E-02 | 9.3300E-02 | 2.1000E-03 | 5.2825E-04
f6 | Var | 9.4267E-02 | 1.0077E-01 | 2.9410E-01 | 3.5900E-02 | 6.5500E-02 | 4.6000E-03
f7 | Max | 7.3626E+02 | 6.8488E+02 | 8.0796E+02 | 3.9241E+02 | 8.2036E+02 | 1.4770E+01
f7 | Min | 0 | 0 | 1.9441E-292 | 7.9956E-07 | 0 | 0
f7 | Mean | 2.0490E-01 | 2.8060E-01 | 4.9889E+00 | 1.6572E+01 | 2.7290E-01 | 1.8081E-03
f7 | Var | 9.5155E+00 | 1.0152E+01 | 3.5531E+01 | 2.3058E+01 | 1.0581E+01 | 1.6033E-01
f8 | Max | 1.2749E+03 | 5.9740E+02 | 3.2527E+02 | 2.3425E+02 | 2.0300E+02 | 2.0423E+01
f8 | Min | 0 | 4.3596E-35 | 1.5241E-160 | 3.6588E-05 | 1.0239E-244 | 0
f8 | Mean | 3.1317E-01 | 1.0582E+01 | 1.0457E+01 | 1.2497E+01 | 2.1870E-01 | 2.6947E-03
f8 | Var | 1.4416E+01 | 4.3485E+01 | 3.5021E+01 | 2.5766E+01 | 6.2362E+00 | 2.1290E-01


where yi and fi are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show the actual forecasting errors. Table 15 records the performance indicator data of each classification model, where the best results are marked in bold. The smaller the RMSE and MAE, the better the classification model performs.
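The two indicators of equation (21) can be computed directly; a minimal sketch:

```python
import math

def rmse_mae(f, y):
    """RMSE and MAE of equation (21), where f holds the values f_i
    and y the values y_i for the N samples."""
    n = len(f)
    diffs = [fi - yi for fi, yi in zip(f, y)]
    rmse = math.sqrt(sum(d * d for d in diffs) / n)
    mae = sum(abs(d) for d in diffs) / n
    return rmse, mae

rmse, mae = rmse_mae([1, 0, 1, 1], [1, 0, 0, 1])  # one wrong label in four
```

For 0/1 labels, MAE reduces to the misclassification rate and RMSE to its square root.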

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model gets the best recognition accuracy, and all of its accuracies exceed 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it also shows superior performance for oil and gas layer classification on the other performance indicators and different wells. Secondly, DMSDL-QBSA can improve the performance of RF: the parameters found by DMSDL-QBSA and used in the RF classification model improve the classification accuracy while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than those of the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means the depth has no oil or gas, and 1 means the depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results: it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is thus suitable for petroleum logging applications, greatly reduces the difficulty of oil exploration, and has good application prospects.

Table 9: Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function | Term | GWO | WOA | SCA | GOA | SSA | DMSDL-QBSA
f10 | Max | 2.9544E+03 | 2.9903E+03 | 2.7629E+03 | 2.9445E+03 | 3.2180E+03 | 3.0032E+03
f10 | Min | 1.3037E+03 | 8.6892E-04 | 1.5713E+03 | 1.3438E+03 | 1.2839E-04 | 1.2922E+03
f10 | Mean | 1.6053E+03 | 1.4339E+01 | 1.7860E+03 | 1.8562E+03 | 2.5055E+02 | 1.2960E+03
f10 | Var | 2.6594E+02 | 1.2243E+02 | 1.6564E+02 | 5.1605E+02 | 3.3099E+02 | 5.8691E+01
f11 | Max | 1.3792E+02 | 1.2293E+02 | 1.2313E+02 | 1.1249E+02 | 3.2180E+03 | 1.8455E+01
f11 | Min | 0 | 0 | 0 | 1.2437E+01 | 1.2839E-04 | 0
f11 | Mean | 4.4220E-01 | 1.1252E+00 | 9.9316E+00 | 2.8378E+01 | 2.5055E+02 | 3.2676E-03
f11 | Var | 4.3784E+00 | 6.5162E+00 | 1.9180E+01 | 1.6240E+01 | 3.3099E+02 | 1.9867E-01
f12 | Max | 2.0257E+01 | 2.0043E+01 | 1.9440E+01 | 1.6623E+01 | 3.2180E+03 | 1.9113E+00
f12 | Min | 4.4409E-15 | 3.2567E-15 | 3.2567E-15 | 2.3168E+00 | 1.2839E-04 | 8.8818E-16
f12 | Mean | 1.7200E-02 | 4.2200E-02 | 8.8870E-01 | 5.5339E+00 | 2.5055E+02 | 3.1275E-04
f12 | Var | 4.7080E-01 | 6.4937E-01 | 3.0887E+00 | 2.8866E+00 | 3.3099E+02 | 2.2433E-02
f13 | Max | 1.5246E+02 | 1.6106E+02 | 1.1187E+02 | 6.1505E+01 | 3.2180E+03 | 3.1560E-01
f13 | Min | 3.3000E-03 | 0 | 0 | 2.4147E-01 | 1.2839E-04 | 0
f13 | Mean | 4.6733E-02 | 9.3867E-02 | 1.2094E+00 | 3.7540E+00 | 2.5055E+02 | 3.3660E-05
f13 | Var | 1.9297E+00 | 2.6570E+00 | 6.8476E+00 | 4.1936E+00 | 3.3099E+02 | 3.1721E-03
f14 | Max | 9.5993E+07 | 9.9026E+07 | 5.9355E+07 | 6.1674E+06 | 3.2180E+03 | 6.4903E-01
f14 | Min | 3.8394E-09 | 1.1749E-08 | 9.6787E-03 | 1.8099E-04 | 1.2839E-04 | 4.7116E-32
f14 | Mean | 1.2033E+04 | 3.5007E+04 | 4.8303E+05 | 1.0465E+04 | 2.5055E+02 | 8.9321E-05
f14 | Var | 9.8272E+05 | 1.5889E+06 | 4.0068E+06 | 1.9887E+05 | 3.3099E+02 | 6.8667E-03
f15 | Max | 2.2691E+08 | 2.4717E+08 | 1.1346E+08 | 2.8101E+07 | 3.2180E+03 | 1.6407E-01
f15 | Min | 3.2467E-02 | 4.5345E-08 | 1.1922E-01 | 3.5465E-05 | 1.2839E-04 | 1.3498E-32
f15 | Mean | 2.9011E+04 | 4.3873E+04 | 6.5529E+05 | 7.2504E+04 | 2.5055E+02 | 6.7357E-05
f15 | Var | 2.3526E+06 | 2.7453E+06 | 7.1864E+06 | 1.2814E+06 | 3.3099E+02 | 2.4333E-03
f16 | Max | 1.7692E+01 | 1.7142E+01 | 1.6087E+01 | 8.7570E+00 | 3.2180E+03 | 1.0959E+00
f16 | Min | 2.6210E-07 | 0.0000E+00 | 6.2663E-155 | 1.0497E-02 | 1.2839E-04 | 0
f16 | Mean | 1.0133E-02 | 3.9073E-01 | 5.9003E-01 | 2.4770E+00 | 2.5055E+02 | 1.5200E-04
f16 | Var | 3.0110E-01 | 9.6267E-01 | 1.4701E+00 | 1.9985E+00 | 3.3099E+02 | 1.1633E-02
f18 | Max | 4.4776E+01 | 4.3588E+01 | 3.9095E+01 | 1.7041E+01 | 3.2180E+03 | 6.5613E-01
f18 | Min | 1.9360E-01 | 9.4058E-08 | 2.2666E-01 | 3.9111E+00 | 1.2839E-04 | 1.4998E-32
f18 | Mean | 2.1563E-01 | 4.7800E-02 | 9.8357E-01 | 5.4021E+00 | 2.5055E+02 | 9.4518E-05
f18 | Var | 5.8130E-01 | 1.0434E+00 | 3.0643E+00 | 1.6674E+00 | 3.3099E+02 | 7.0251E-03


Table 10 Comparison of numerical testing results on CEC2014 test sets (F1ndashF15 Dim 10)

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F1

Max 27411E+ 08 36316E+ 09 68993E+ 08 39664E+ 08 96209E+ 08Min 14794E+ 07 31738E+ 09 13205E+ 06 16929E+ 05 17687E+ 05Mean 31107E+ 07 32020E+ 09 67081E+ 06 13567E+ 06 19320E+ 06Var 12900E+ 07 40848E+ 07 26376E+ 07 10236E+ 07 13093E+ 07

F2

Max 87763E+ 09 13597E+ 10 13515E+ 10 16907E+ 10 17326E+ 10Min 17455E+ 09 11878E+ 10 36982E+ 04 21268E+ 05 14074E+ 05Mean 19206E+ 09 11951E+ 10 17449E+ 08 59354E+ 07 48412E+ 07Var 19900E+ 08 15615E+ 08 92365E+ 08 51642E+ 08 42984E+ 08

F3

Max 48974E+ 05 85863E+ 04 26323E+ 06 24742E+ 06 53828E+ 06Min 11067E+ 04 14616E+ 04 18878E+ 03 12967E+ 03 63986E+ 02Mean 13286E+ 04 14976E+ 04 10451E+ 04 35184E+ 03 30848E+ 03Var 65283E+ 03 19988E+ 03 45736E+ 04 41580E+ 04 80572E+ 04

F4

Max 35252E+ 03 97330E+ 03 45679E+ 03 49730E+ 03 43338E+ 03Min 50446E+ 02 84482E+ 03 40222E+ 02 40843E+ 02 41267E+ 02Mean 58061E+ 02 85355E+ 03 44199E+ 02 43908E+ 02 43621E+ 02Var 14590E+ 02 13305E+ 02 16766E+ 02 12212E+ 02 85548E+ 01

F5

Max 52111E+ 02 52106E+ 02 52075E+ 02 52098E+ 02 52110E+ 02Min 52001E+ 02 52038E+ 02 52027E+ 02 52006E+ 02 52007E+ 02Mean 52003E+ 02 52041E+ 02 52033E+ 02 52014E+ 02 52014E+ 02Var 78380Eminus 02 57620Eminus 02 56433Eminus 02 10577Eminus 01 99700Eminus 02

F6

Max 61243E+ 02 61299E+ 02 61374E+ 02 61424E+ 02 61569E+ 02Min 60881E+ 02 61157E+ 02 60514E+ 02 60288E+ 02 60257E+ 02Mean 60904E+ 02 61164E+ 02 60604E+ 02 60401E+ 02 60358E+ 02Var 25608Eminus 01 12632Eminus 01 10434E+ 00 11717E+ 00 13117E+ 00

F7

Max 87895E+ 02 10459E+ 03 91355E+ 02 10029E+ 03 93907E+ 02Min 74203E+ 02 10238E+ 03 70013E+ 02 70081E+ 02 70069E+ 02Mean 74322E+ 02 10253E+ 03 71184E+ 02 70332E+ 02 70290E+ 02Var 43160E+ 00 24258E+ 00 28075E+ 01 15135E+ 01 13815E+ 01

F8

Max 89972E+ 02 91783E+ 02 96259E+ 02 92720E+ 02 93391E+ 02Min 84904E+ 02 88172E+ 02 83615E+ 02 81042E+ 02 80877E+ 02Mean 85087E+ 02 88406E+ 02 85213E+ 02 81888E+ 02 81676E+ 02Var 21797E+ 00 44494E+ 00 86249E+ 00 92595E+ 00 97968E+ 00

F9

Max 10082E+ 03 99538E+ 02 10239E+ 03 10146E+ 03 10366E+ 03Min 93598E+ 02 96805E+ 02 92725E+ 02 92062E+ 02 91537E+ 02Mean 93902E+ 02 97034E+ 02 94290E+ 02 92754E+ 02 92335E+ 02Var 34172E+ 00 32502E+ 00 81321E+ 00 90492E+ 00 93860E+ 00

F10

Max 31087E+ 03 35943E+ 03 39105E+ 03 39116E+ 03 34795E+ 03Min 22958E+ 03 28792E+ 03 15807E+ 03 14802E+ 03 14316E+ 03Mean 18668E+ 03 29172E+ 03 17627E+ 03 16336E+ 03 16111E+ 03Var 32703E+ 01 61787E+ 01 22359E+ 02 19535E+ 02 22195E+ 02

F11

Max 36641E+ 03 34628E+ 03 40593E+ 03 35263E+ 03 37357E+ 03Min 24726E+ 03 28210E+ 03 17855E+ 03 14012E+ 03 12811E+ 03Mean 25553E+ 03 28614E+ 03 18532E+ 03 15790E+ 03 14869E+ 03Var 84488E+ 01 56351E+ 01 13201E+ 02 21085E+ 02 26682E+ 02

F12

Max 12044E+ 03 12051E+ 03 12053E+ 03 12055E+ 03 12044E+ 03Min 12006E+ 03 12014E+ 03 12004E+ 03 12003E+ 03 12017E+ 03Mean 12007E+ 03 12017E+ 03 12007E+ 03 12003E+ 03 12018E+ 03Var 14482Eminus 01 35583Eminus 01 25873Eminus 01 24643Eminus 01 21603Eminus 01

F13

Max 1.3049E+03 1.3072E+03 1.3073E+03 1.3056E+03 1.3061E+03
Min 1.3005E+03 1.3067E+03 1.3009E+03 1.3003E+03 1.3004E+03
Mean 1.3006E+03 1.3068E+03 1.3011E+03 1.3005E+03 1.3006E+03
Var 3.2018E-01 5.0767E-02 4.2173E-01 3.6570E-01 2.9913E-01

F14

Max 1.4395E+03 1.4565E+03 1.4775E+03 1.4749E+03 1.4493E+03
Min 1.4067E+03 1.4522E+03 1.4009E+03 1.4003E+03 1.4005E+03
Mean 1.4079E+03 1.4529E+03 1.4024E+03 1.4008E+03 1.4009E+03
Var 2.1699E+00 6.3013E-01 5.3198E+00 3.2578E+00 2.8527E+00

F15

Max 5.4068E+04 3.3586E+04 4.8370E+05 4.0007E+05 1.9050E+05
Min 1.9611E+03 2.1347E+04 1.5029E+03 1.5027E+03 1.5026E+03
Mean 1.9933E+03 2.2417E+04 2.7920E+03 1.5860E+03 1.5816E+03
Var 6.1622E+02 1.2832E+03 1.7802E+04 4.4091E+03 2.7233E+03

Computational Intelligence and Neuroscience 17

Table 11: Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F16

Max 1.6044E+03 1.6044E+03 1.6046E+03 1.6044E+03 1.6046E+03
Min 1.6034E+03 1.6041E+03 1.6028E+03 1.6021E+03 1.6019E+03
Mean 1.6034E+03 1.6041E+03 1.6032E+03 1.6024E+03 1.6024E+03
Var 3.3300E-02 3.5900E-02 2.5510E-01 3.7540E-01 3.6883E-01

F17

Max 1.3711E+07 1.6481E+06 3.3071E+07 4.6525E+07 8.4770E+06
Min 1.2526E+05 4.7216E+05 2.7261E+03 2.0177E+03 2.0097E+03
Mean 1.6342E+05 4.8499E+05 8.7769E+04 3.1084E+04 2.1095E+04
Var 2.0194E+05 2.9949E+04 1.2284E+06 8.7638E+05 2.0329E+05

F18

Max 7.7173E+07 3.5371E+07 6.1684E+08 1.4216E+08 6.6050E+08
Min 4.1934E+03 2.6103E+06 2.2743E+03 1.8192E+03 1.8288E+03
Mean 1.6509E+04 3.4523E+06 1.2227E+06 1.0781E+05 1.9475E+05
Var 7.9108E+05 1.4888E+06 2.1334E+07 3.6818E+06 1.0139E+07

F19

Max 1.9851E+03 2.5657E+03 2.0875E+03 1.9872E+03 1.9555E+03
Min 1.9292E+03 2.4816E+03 1.9027E+03 1.9023E+03 1.9028E+03
Mean 1.9299E+03 2.4834E+03 1.9044E+03 1.9032E+03 1.9036E+03
Var 1.0820E+00 3.8009E+00 1.1111E+01 3.3514E+00 1.4209E+00

F20

Max 8.3021E+06 2.0160E+07 1.0350E+07 6.1162E+07 1.2708E+08
Min 5.6288E+03 1.7838E+06 2.1570E+03 2.0408E+03 2.0241E+03
Mean 1.2260E+04 1.8138E+06 5.6957E+03 1.1337E+04 4.5834E+04
Var 1.0918E+05 5.9134E+05 1.3819E+05 6.4167E+05 2.1988E+06

F21

Max 2.4495E+06 1.7278E+09 2.0322E+06 1.3473E+07 2.6897E+07
Min 4.9016E+03 1.4049E+09 3.3699E+03 2.1842E+03 2.2314E+03
Mean 6.6613E+03 1.4153E+09 9.9472E+03 1.3972E+04 7.4587E+03
Var 4.1702E+04 2.2557E+07 3.3942E+04 2.5098E+05 2.8735E+05

F22

Max 2.8304E+03 4.9894E+03 3.1817E+03 3.1865E+03 3.2211E+03
Min 2.5070E+03 4.1011E+03 2.3492E+03 2.2442E+03 2.2314E+03
Mean 2.5081E+03 4.1175E+03 2.3694E+03 2.2962E+03 2.2687E+03
Var 5.4064E+00 3.8524E+01 4.3029E+01 4.8006E+01 5.1234E+01

F23

Max 2.8890E+03 2.6834E+03 2.8758E+03 3.0065E+03 2.9923E+03
Min 2.5000E+03 2.6031E+03 2.4870E+03 2.5000E+03 2.5000E+03
Mean 2.5004E+03 2.6088E+03 2.5326E+03 2.5010E+03 2.5015E+03
Var 9.9486E+00 8.6432E+00 5.7045E+01 1.4213E+01 1.7156E+01

F24

Max 2.6293E+03 2.6074E+03 2.6565E+03 2.6491E+03 2.6369E+03
Min 2.5816E+03 2.6049E+03 2.5557E+03 2.5246E+03 2.5251E+03
Mean 2.5829E+03 2.6052E+03 2.5671E+03 2.5337E+03 2.5338E+03
Var 1.3640E+00 3.3490E-01 8.9434E+00 1.3050E+01 1.1715E+01

F25

Max 2.7133E+03 2.7014E+03 2.7445E+03 2.7122E+03 2.7327E+03
Min 2.6996E+03 2.7003E+03 2.6784E+03 2.6789E+03 2.6635E+03
Mean 2.6996E+03 2.7004E+03 2.6831E+03 2.6903E+03 2.6894E+03
Var 2.3283E-01 9.9333E-02 4.9609E+00 7.1175E+00 1.2571E+01

F26

Max 2.7056E+03 2.8003E+03 2.7058E+03 2.7447E+03 2.7116E+03
Min 2.7008E+03 2.8000E+03 2.7003E+03 2.7002E+03 2.7002E+03
Mean 2.7010E+03 2.8000E+03 2.7005E+03 2.7003E+03 2.7003E+03
Var 3.1316E-01 1.6500E-02 3.9700E-01 1.1168E+00 3.5003E-01

F27

Max 3.4052E+03 5.7614E+03 3.3229E+03 3.4188E+03 3.4144E+03
Min 2.9000E+03 3.9113E+03 2.9698E+03 2.8347E+03 2.7054E+03
Mean 2.9038E+03 4.0351E+03 2.9816E+03 2.8668E+03 2.7219E+03
Var 3.2449E+01 2.0673E+02 3.4400E+01 6.2696E+01 6.1325E+01

F28

Max 4.4333E+03 5.4138E+03 4.5480E+03 4.3490E+03 4.8154E+03
Min 3.0000E+03 4.0092E+03 3.4908E+03 3.0000E+03 3.0000E+03
Mean 3.0079E+03 4.0606E+03 3.6004E+03 3.0063E+03 3.0065E+03
Var 6.8101E+01 9.2507E+01 8.5705E+01 4.2080E+01 5.3483E+01

F29

Max 2.5038E+07 1.6181E+08 5.9096E+07 7.0928E+07 6.4392E+07
Min 3.2066E+03 1.0014E+08 3.1755E+03 3.1005E+03 3.3287E+03
Mean 5.7057E+04 1.0476E+08 8.5591E+04 6.8388E+04 2.8663E+04
Var 4.9839E+05 6.7995E+06 1.7272E+06 1.5808E+06 8.6740E+05


[Figure 3 plots classification accuracy against ntree (0 to 1000, accuracy roughly 0.86 to 0.98) and against mtry (0 to 35, accuracy roughly 0.89 to 0.95).]

Figure 3: The effect of the two parameters on the performance of RF models: (a) the effect of ntree; (b) the effect of mtry.

Variable setting: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA
Output: the label of the test dataset, Test_Label; the final classification model, DMSDL-QBSA-RF
(1) Begin
(2) /* Build classification model based on DMSDL-QBSA-RF */
(3) Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4) Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5) While iter < itermax + 1 do
(6)   For i = 1 : ntree
(7)     Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)     Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)     Recursively generate each decision tree without pruning operations
(10)  End For
(11)  Update classification accuracy of RF: evaluate f(Xi)
(12)  Update gbest and Pgbest
(13)  [nbest, mbest] = gbest
(14)  iter = iter + 1
(15) End While
(16) /* Classify using RF model */
(17) For i = 1 : nbest
(18)   Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)   Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)   Recursively generate each decision tree without pruning operations
(21) End
(22) Return DMSDL-QBSA-RF classification model
(23) Classify the test dataset using equation (20)
(24) Calculate OOB_error
(25) End

ALGORITHM 2 DMSDL-QBSA-RF classification model
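At its core, the outer loop of Algorithm 2 is a swarm search over the two integer hyperparameters (ntree, mtry). The sketch below illustrates that loop in Python under stated assumptions: `fitness` is a hypothetical smooth surrogate standing in for the out-of-bag accuracy of a trained forest, and the position update is a simplified move-toward-gbest foraging step rather than the full DMSDL-QBSA update rules. The search ranges mirror Figure 3 (ntree up to 1000, mtry up to 35).

```python
import random

# Search ranges for the two RF hyperparameters optimized by DMSDL-QBSA.
NTREE_RANGE = (10, 1000)   # number of decision trees
MTRY_RANGE = (1, 35)       # number of predictor variables per split

def fitness(ntree, mtry):
    """Hypothetical stand-in for RF accuracy; a real run would train a
    forest and return its out-of-bag accuracy (higher is better)."""
    return -((ntree - 400) ** 2 / 1e6 + (mtry - 7) ** 2 / 100.0)

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def swarm_search(n_birds=20, iters=50, seed=1):
    rng = random.Random(seed)
    # Initialize bird positions uniformly in the search space.
    birds = [(rng.randint(*NTREE_RANGE), rng.randint(*MTRY_RANGE))
             for _ in range(n_birds)]
    gbest = max(birds, key=lambda b: fitness(*b))
    for _ in range(iters):
        new_birds = []
        for (ntree, mtry) in birds:
            # Simplified foraging step: drift toward the global best
            # position with a random perturbation.
            ntree = clamp(ntree + rng.randint(-20, 20)
                          + (gbest[0] - ntree) // 2, *NTREE_RANGE)
            mtry = clamp(mtry + rng.randint(-2, 2)
                         + (gbest[1] - mtry) // 2, *MTRY_RANGE)
            new_birds.append((ntree, mtry))
        birds = new_birds
        cand = max(birds, key=lambda b: fitness(*b))
        if fitness(*cand) > fitness(*gbest):
            gbest = cand
    return gbest  # plays the role of [nbest, mbest] in line (13)

nbest, mbest = swarm_search()
```

The returned pair corresponds to [nbest, mbest] in line (13) of Algorithm 2; a real run would replace `fitness` with RF training plus OOB evaluation.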

Table 11 Continued

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F30

Max 5.2623E+05 2.8922E+07 1.1938E+06 1.2245E+06 1.1393E+06
Min 5.9886E+03 1.9017E+07 3.7874E+03 3.6290E+03 3.6416E+03
Mean 7.4148E+03 2.0002E+07 5.5468E+03 4.3605E+03 4.1746E+03
Var 1.1434E+04 1.1968E+06 3.2255E+04 2.2987E+04 1.8202E+04


[Figure 4 flow: to-be-identified information → data selection and preprocessing (remove redundant attributes and pretreatment) → attribute discretization → attribute reduction and data normalization → build model based on DMSDL-QBSA-RF → RF classification → output.]

Figure 4: Block diagram of the oil layer classification system based on DMSDL-QBSA-RF.

Table 13: Distribution of the logging data.

Well | Training dataset: depth (m), data, oil/gas layers, dry layer | Test dataset: depth (m), data, oil/gas layers, dry layer
W1 | 3027–3058, 250, 203, 47 | 3250–3450, 1600, 237, 1363
W2 | 2642–2648, 30, 10, 20 | 2940–2978, 191, 99, 92
W3 | 1040–1059, 160, 47, 113 | 1120–1290, 1114, 96, 1018

Table 12: Comparison of numerical testing results on eight UCI datasets.

Dataset | RF (%) | BSA-RF (%) | DMSDL-BSA-RF (%) | DMSDL-QBSA-RF (%)
Blood | 75.40 | 80.80 | 79.46 | 81.25
Heart-statlog | 82.10 | 86.42 | 86.42 | 86.42
Sonar | 78.39 | 85.48 | 85.48 | 88.71
Appendicitis | 83.23 | 87.10 | 90.32 | 93.55
Cleve | 79.55 | 87.50 | 88.64 | 89.77
Magic | 88.95 | 89.85 | 90.25 | 89.32
Mammographic | 74.73 | 77.68 | 76.79 | 79.46
Australian | 85.83 | 88.84 | 88.84 | 90.78

Table 14: Attribute reduction results of the logging dataset.

W1 | Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI | Reduced attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2 | Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW | Reduced attributes: NEUTRO, PERM, POR, RESI, SW
W3 | Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1 | Reduced attributes: AC, GR, RI, RXO, SP

[Figure 5 shows attribute curves normalized to [0, 1] versus depth: (a) NPHI, SP, DT, LLD and (b) DEN, GR, LLS over 3250–3450 m for W1; (c) NEUTRO, PERM and (d) RESI, SW, POR over 2940–2980 m for W2; (e) GR, SP, RI and (f) RXO, AC over 1150–1300 m for W3.]

Figure 5: The normalized curves of attributes: (a) and (b) attribute normalization of W1; (c) and (d) attribute normalization of W2; (e) and (f) attribute normalization of W3.
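The curves in Figure 5 are scaled into [0, 1] before plotting. The paper does not spell out the scaling formula, so the sketch below assumes plain min-max normalization, the usual choice for this kind of comparison plot; the GR readings are invented for illustration.

```python
def min_max_normalize(values):
    """Scale a list of raw log readings into [0, 1] (min-max normalization)."""
    lo, hi = min(values), max(values)
    if hi == lo:                      # constant curve: map everything to 0
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

# Example: a short synthetic GR (gamma ray) segment, not real well data.
gr = [58.2, 101.7, 74.5, 120.3, 63.0]
norm = min_max_normalize(gr)
```

After scaling, curves measured in very different units (e.g., GR in API and DEN in g/cm3) become directly comparable on one axis, which is what the panels of Figure 5 rely on.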

Table 15: Performance of various well data.

Well | Classification model | RMSE | MAE | Accuracy (%) | Running time (s)
W1 | RF | 0.3326 | 0.1106 | 88.94 | 1.5167
W1 | SVM | 0.2681 | 0.0719 | 92.81 | 4.5861
W1 | BSA-RF | 0.3269 | 0.1069 | 89.31 | 1.8064
W1 | DMSDL-BSA-RF | 0.2806 | 0.0788 | 92.13 | 2.3728
W1 | DMSDL-QBSA-RF | 0.2449 | 0.0600 | 94.00 | 1.4663
W2 | RF | 0.4219 | 0.1780 | 82.20 | 3.1579
W2 | SVM | 0.2983 | 0.0890 | 91.10 | 4.2604
W2 | BSA-RF | 0.3963 | 0.1571 | 84.29 | 1.2979
W2 | DMSDL-BSA-RF | 0.2506 | 0.062827 | 93.72 | 1.6124
W2 | DMSDL-QBSA-RF | 0.2399 | 0.0576 | 94.24 | 1.2287
W3 | RF | 0.4028 | 0.1622 | 83.78 | 2.4971
W3 | SVM | 0.2507 | 0.0628 | 93.72 | 2.1027
W3 | BSA-RF | 0.3631 | 0.1318 | 86.81 | 1.3791
W3 | DMSDL-BSA-RF | 0.2341 | 0.0548 | 94.52 | 0.3125
W3 | DMSDL-QBSA-RF | 0.0519 | 0.0027 | 99.73 | 0.9513
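The RMSE, MAE, and accuracy columns of Table 15 follow directly from the predicted and actual layer labels; a straightforward sketch (the label vectors below are toy data, not the real well logs):

```python
import math

def rmse(actual, predicted):
    """Root mean squared error between two equal-length label vectors."""
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual))

def mae(actual, predicted):
    """Mean absolute error."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def accuracy(actual, predicted):
    """Fraction of correctly classified rows."""
    return sum(a == p for a, p in zip(actual, predicted)) / len(actual)

# 0 = dry layer, 1 = oil/gas layer (toy vectors for illustration)
actual    = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]
predicted = [1, 0, 1, 0, 0, 0, 1, 1, 1, 0]
```

For 0/1 labels, MAE equals the error rate and RMSE equals sqrt(MAE), which is consistent with Table 15 (e.g., in W1's RF row, 0.3326^2 ≈ 0.1106 = 1 - 0.8894).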



5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions were used to verify the effectiveness of the improved method. The experimental study of the effects of the three strategies on the performance of DMSDL-QBSA revealed that the hybrid method markedly improves on the original BSA and especially on the original DE. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions were used to verify the performance of DMSDL-QBSA more comprehensively, and it again showed excellent performance. Finally, the improved DMSDL-QBSA was used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than that of the original RF model. At the same time, it ran faster than the other four advanced classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Owing to the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA; therefore, improving its efficiency is a worthwhile direction. In future research, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), the Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), the Key Research and Development Project of Hebei Province, China (nos. 19210404D and 20351802D), the Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and the Project of Industry-University Cooperative Education of the Ministry of Education of China (no. 201801335014).


Figure 6: Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.
[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.
[4] X. Li, A new intelligent optimization method: artificial fish school algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.
[5] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.
[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.
[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673–687, 2016.
[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University - Computer and Information Sciences, vol. 32, pp. 514–521, 2020.
[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," Neural Computing and Applications, Springer, New York, NY, USA, 2019.
[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.
[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.
[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197–207, 2014.
[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.
[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17–28, 2014.
[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137–146, 2012.
[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, "Multi-strategy brain storm optimization algorithm with dynamic parameters adjustment," Applied Intelligence, 2010.
[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189–120201, 2019.
[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.
[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.
[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.
[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.
[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.
[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.
[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.
[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.
[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," IEEE Transactions on Evolutionary Computation, 2019.
[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.
[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.
[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.
[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.
[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.
[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.
[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.
[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.
[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.
[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China - Information Sciences, vol. 62, no. 7, 2019.
[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.
[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.
[41] S. Shahrzad, M. Seyedali, and L. Andrew, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.
[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22–34, 2020.
[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.


According to the results of Table 5, DMSDL-QBSA achieves the best performance on these 8 unimodal functions when Dim = 2. DMSDL-QBSA finds the minimum value of 0 on f1, f2, f3, f4, f5, f7, f8, and f9, and performs better on f2, f4, and f9 than DMSDL-BSA and DMSDL-PSO. DE does not perform well on these functions, but BSA performs relatively well on f1, f3, f6, f7, f8, and f9, mainly because BSA has better convergence performance in the early search. Clearly, DMSDL-QBSA can find the best values of the two parameters of RF that need to be optimized.

(2) Multimodal functions
From the numerical testing results on 8 multimodal functions in Table 6, we can see that DMSDL-QBSA can find the optimal solution for all multimodal functions and obtains the minimum value of 0 on f11, f13, and f16. DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, and f16; BSA works best on f10; and DMSDL-PSO does not perform very well. DMSDL-QBSA also has the best mean value and variance on most functions, mainly because it has a stronger global exploration capability based on the dynamic multi-swarm method and differential evolution. In summary, DMSDL-QBSA performs relatively well compared to the other algorithms when Dim = 10 and evidently has a relatively good global search capability.

The evolution curves of these algorithms on four multimodal functions (f12, f13, f17, and f18, Dim = 2) are depicted in Figure 2. DMSDL-QBSA finds the optimal solution within the same number of iterations. For f13 and f17, the fitness of DMSDL-QBSA continues to decline, whereas the original BSA and DE produce flat lines because of their poor global convergence ability. For f12 and f18, although DMSDL-QBSA is also trapped in a local optimum, it finds the minimum value among all the compared algorithms. Obviously, the convergence speed of DMSDL-QBSA is significantly faster than that of the other algorithms in the early stage, and the solution eventually found is the best. In general, owing to the enhanced diversity of the population, DMSDL-QBSA has a relatively balanced global search capability when Dim = 2.

Furthermore, from the numerical testing results on nine multimodal functions in Table 7, we can see that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f16, f17, and f18. DMSDL-QBSA obtains the minimum value of 0 on f11, f13, f16, and f17; BSA obtains the minimum value of 0 on f11, f13, and f17; DE does not obtain the minimum value of 0

Table 1: The 18 benchmark functions.

Name | Test function | Range
Sphere | $f_1(x)=\sum_{i=1}^{D} x_i^2$ | $[-100,100]^D$
Schwefel P2.22 | $f_2(x)=\sum_{i=1}^{D}|x_i|+\prod_{i=1}^{D}|x_i|$ | $[-10,10]^D$
Schwefel P1.2 | $f_3(x)=\sum_{i=1}^{D}(\sum_{j=1}^{i} x_j)^2$ | $[-100,100]^D$
Generalized Rosenbrock | $f_4(x)=\sum_{i=1}^{D-1}[100(x_i^2-x_{i+1})^2+(x_i-1)^2]$ | $[-100,100]^D$
Step | $f_5(x)=\sum_{i=1}^{D}(\lfloor x_i+0.5\rfloor)^2$ | $[-100,100]^D$
Noise | $f_6(x)=\sum_{i=1}^{D} i x_i^4+\mathrm{rand}[0,1)$ | $[-1.28,1.28]^D$
SumSquares | $f_7(x)=\sum_{i=1}^{D} i x_i^2$ | $[-10,10]^D$
Zakharov | $f_8(x)=\sum_{i=1}^{D} x_i^2+(\sum_{i=1}^{D} 0.5 i x_i)^2+(\sum_{i=1}^{D} 0.5 i x_i)^4$ | $[-10,10]^D$
Schaffer | $f_9(x,y)=0.5+(\sin^2(x^2-y^2)-0.5)/[1+0.001(x^2+y^2)]^2$ | $[-10,10]^D$
Generalized Schwefel 2.26 | $f_{10}(x)=418.9829D-\sum_{i=1}^{D} x_i \sin(\sqrt{|x_i|})$ | $[-500,500]^D$
Generalized Rastrigin | $f_{11}(x)=\sum_{i=1}^{D}[x_i^2-10\cos(2\pi x_i)+10]$ | $[-5.12,5.12]^D$
Ackley | $f_{12}(x)=20+e-20\exp(-0.2\sqrt{(1/D)\sum_{i=1}^{D} x_i^2})-\exp((1/D)\sum_{i=1}^{D}\cos(2\pi x_i))$ | $[-32,32]^D$
Generalized Griewank | $f_{13}(x)=\sum_{i=1}^{D} x_i^2/4000-\prod_{i=1}^{D}\cos(x_i/\sqrt{i})+1$ | $[-600,600]^D$
Generalized Penalized 1 | $f_{14}(x)=(\pi/D)\{10\sin^2(\pi y_1)+\sum_{i=1}^{D-1}(y_i-1)^2[1+10\sin^2(\pi y_{i+1})]+(y_D-1)^2\}+\sum_{i=1}^{D} u(x_i,10,100,4)$, where $y_i=1+(x_i+1)/4$ and $u(x_i,a,k,m)=k(x_i-a)^m$ if $x_i>a$; $0$ if $-a\le x_i\le a$; $k(-x_i-a)^m$ if $x_i<-a$ | $[-50,50]^D$
Generalized Penalized 2 | $f_{15}(x)=0.1\{\sin^2(3\pi x_1)+\sum_{i=1}^{D-1}(x_i-1)^2[1+\sin^2(3\pi x_{i+1})]+(x_D-1)^2[1+\sin^2(2\pi x_D)]\}+\sum_{i=1}^{D} u(x_i,5,100,4)$ | $[-50,50]^D$
Alpine | $f_{16}(x)=\sum_{i=1}^{D}|x_i\sin x_i+0.1x_i|$ | $[-10,10]^D$
Booth | $f_{17}(x)=(x_1+2x_2-7)^2+(2x_1+x_2-5)^2$ | $[-10,10]^D$
Levy | $f_{18}(x)=\sin^2(\pi y_1)+\sum_{i=1}^{D-1}(y_i-1)^2[1+10\sin^2(\pi y_i+1)]+(y_D-1)^2[1+\sin^2(2\pi y_D)]$, with $y_i=1+(x_i-1)/4$ | $[-10,10]^D$

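Several entries of Table 1 can be written out directly from their definitions. The sketch below implements Sphere (f1), Rastrigin (f11), and Ackley (f12) as a quick sanity check that each attains its known optimum of 0 at the origin:

```python
import math

def sphere(x):
    """f1: sum of squares; global minimum 0 at the origin."""
    return sum(v * v for v in x)

def rastrigin(x):
    """f11: highly multimodal; global minimum 0 at the origin."""
    return sum(v * v - 10 * math.cos(2 * math.pi * v) + 10 for v in x)

def ackley(x):
    """f12: global minimum 0 at the origin."""
    d = len(x)
    s1 = sum(v * v for v in x) / d
    s2 = sum(math.cos(2 * math.pi * v) for v in x) / d
    return 20 + math.e - 20 * math.exp(-0.2 * math.sqrt(s1)) - math.exp(s2)
```

Evaluating these at random points inside their stated ranges, with the origin as the known optimum, is a cheap way to validate an optimizer implementation before running the full benchmark suite.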

on any function, and DMSDL-BSA obtains the minimum value of 0 on f11 and f13. In summary, DMSDL-QBSA has a superior global search capability on most multimodal functions when Dim = 2. Obviously, DMSDL-QBSA can find the best values of the two parameters of RF that need to be optimized because of its excellent global search capability.

In this section, it can be seen from Figures 1 and 2 and Tables 4–7 that DMSDL-QBSA obtains the best function values in most cases. This indicates that the hybrid strategies added to BSA (the dynamic multi-swarm method, DE, and quantum behavior operators) lead the birds to move towards the best solutions, and that DMSDL-QBSA can search for the best two parameters of RF with higher accuracy and efficiency.

2.3.4. Comparison on Benchmark Functions with Popular Algorithms. To compare the timeliness and applicability of DMSDL-QBSA with several popular algorithms, such as GWO, WOA, SCA, GOA, and SSA, the 18 benchmark functions are applied; GWO, WOA, GOA, and SSA are swarm intelligence algorithms. In this experiment, the dimension of these functions is 10 and the number of function evaluations (FEs) is 100,000. The maximum value (Max), minimum value (Min), mean value (Mean), and variance (Var) obtained by the different algorithms are shown in Tables 8 and 9, where the best results are marked in bold.

Table 2: Summary of the CEC'14 test functions.

Type | No. | Function | $F_i^* = F_i(x^*)$

Unimodal functions:
F1 Rotated High-Conditioned Elliptic Function, 100
F2 Rotated Bent Cigar Function, 200
F3 Rotated Discus Function, 300

Simple multimodal functions:
F4 Shifted and Rotated Rosenbrock's Function, 400
F5 Shifted and Rotated Ackley's Function, 500
F6 Shifted and Rotated Weierstrass Function, 600
F7 Shifted and Rotated Griewank's Function, 700
F8 Shifted Rastrigin's Function, 800
F9 Shifted and Rotated Rastrigin's Function, 900
F10 Shifted Schwefel's Function, 1000
F11 Shifted and Rotated Schwefel's Function, 1100
F12 Shifted and Rotated Katsuura Function, 1200
F13 Shifted and Rotated HappyCat Function, 1300
F14 Shifted and Rotated HGBat Function, 1400
F15 Shifted and Rotated Expanded Griewank's plus Rosenbrock's Function, 1500
F16 Shifted and Rotated Expanded Scaffer's F6 Function, 1600

Hybrid functions:
F17 Hybrid Function 1 (N = 3), 1700
F18 Hybrid Function 2 (N = 3), 1800
F19 Hybrid Function 3 (N = 4), 1900
F20 Hybrid Function 4 (N = 4), 2000
F21 Hybrid Function 5 (N = 5), 2100
F22 Hybrid Function 6 (N = 5), 2200

Composition functions:
F23 Composition Function 1 (N = 5), 2300
F24 Composition Function 2 (N = 3), 2400
F25 Composition Function 3 (N = 3), 2500
F26 Composition Function 4 (N = 5), 2600
F27 Composition Function 5 (N = 5), 2700
F28 Composition Function 6 (N = 5), 2800
F29 Composition Function 7 (N = 3), 2900
F30 Composition Function 8 (N = 3), 3000

Search range: $[-100, 100]^D$

Table 3: Parameter settings.

BSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10
DE: F = 0.5, CR = 0.1
DMSDL-PSO: c1 = c2 = 1.49445, F = 0.5, CR = 0.1, sub-swarms = 10
DMSDL-BSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarms = 10
DMSDL-QBSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarms = 10


From the test results in Table 8, we can see that DMSDL-QBSA has the best performance on each unimodal function. GWO finds the value 0 on f1, f2, f3, f7, and f8; WOA obtains 0 on f1, f2, and f7; and SSA works best on f1 and f7. For the multimodal function evaluations, Table 9 shows that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, f16, and f18; SSA has the best performance on f10; GWO attains the minimum on f11; and WOA and SCA obtain the optimal value on f11 and f13. Obviously, compared with these popular algorithms, DMSDL-QBSA is a competitive algorithm, and the swarm intelligence algorithms perform better than the other algorithms. The results of Tables 8 and 9 show that DMSDL-QBSA has the best performance on most of the tested benchmark functions.

2.3.5. Comparison on CEC2014 Test Functions with Hybrid Algorithms. To compare the comprehensive performance of the proposed DMSDL-QBSA with several hybrid algorithms, such as BSA, DE, DMSDL-PSO, and DMSDL-BSA, the 30 CEC2014 test functions are applied. In this experiment, the dimension (Dim) is set to 10 and the number of function evaluations (FEs) is 100,000. Experimental comparisons of the maximum value (Max), minimum value (Min), mean value (Mean), and variance (Var) are given in Tables 10 and 11, where the best results are marked in bold.

Based on the mean value (Mean) on the CEC2014 test functions, DMSDL-QBSA has the best performance on F2, F3, F4, F6, F7, F8, F9, F10, F11, F15, F16, F17, F21, F26, F27, F29, and F30, while DMSDL-BSA shows an advantage on F1, F12,


Figure 1: Fitness value curves of the 5 hybrid algorithms on (a) f1, (b) f5, (c) f6, and (d) f9 (Dim = 2).


F13, F14, F18, F19, F20, F24, F25, and F28. According to the results, DMSDL-QBSA finds the minimal value on 17 CEC2014 test functions, DMSDL-BSA obtains the minimum value on F1, F12, F13, F14, F18, F19, F24, and F30, and DMSDL-PSO obtains the minimum value on F4, F7, and F23. Owing to its enhanced exploitation capability, DMSDL-QBSA is better than DMSDL-BSA and DMSDL-PSO on most functions. From the test results, it can be seen that DMSDL-QBSA performs better than BSA, DE, DMSDL-PSO, and DMSDL-BSA and obtains the optimal values. It can be concluded

that DMSDL-QBSA has better global search ability andbetter robustness on these test suites

3. Optimizing the RF Classification Model Based on the Improved BSA

3.1. RF Classification Model. RF, as proposed by Breiman et al., is an ensemble learning model based on bagging and random subspace methods. The whole modeling process includes building decision trees and the decision process. The process of constructing decision trees is mainly composed of

Figure 2: Fitness value curves of 5 hybrid algorithms (BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA; fitness value vs. iteration, log scales) on (a) f12 (Ackley), (b) f13 (Griewank), (c) f17 (Booth), and (d) f18 (Levy) (Dim = 2).


ntree decision trees, each of which consists of nonleaf nodes and leaf nodes. A leaf node is a child node of the node branch. It is supposed that the dataset has M attributes. When each leaf node of the decision tree needs to be segmented, mtry attributes are randomly selected from the M attributes as the reselected splitting variables of this node. This process can be defined as follows:

S_j = { P_i | i = 1, 2, …, m_try },  (18)

where S_j is the splitting variable of the j-th leaf node of the decision tree and P_i is the probability that the m_try reselected attributes are selected as the splitting attribute of the node.

The nonleaf node is a parent node that classifies training data as a left or right child node. The function of the k-th decision tree is as follows:

h_k(c | x) = { 0, if f_l(x_k) < f_l(x_k) + τ; 1, if f_l(x_k) ≥ f_l(x_k) + τ },  l = 1, 2, …, n_tree,  (19)

where c = 0 or 1: the symbol 0 indicates that the k-th row of data is classified with a negative label, and the symbol 1 indicates that the k-th row of data is classified with a positive label. Here, f_l is the training function of the l-th decision tree based on the splitting variable S, and x_k is the k-th row of data in the dataset obtained by random sampling with replacement. The symbol τ is a positive constant used as the threshold value of the training decision.

When decision processes are trained, each row of data is input into a leaf node of each decision tree. The average of the n_tree decision tree classification results is used as the final classification result. This process can be written mathematically as follows:

c_k = l × (1/n_tree) Σ_{k=1}^{n_tree} h_k(c | x),  (20)

where l is the number of decision trees that judged the k-th row of data as c.
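The node-level attribute reselection of equation (18) and the ensemble vote of equation (20) can be illustrated with a minimal sketch. The helper names `select_split_variables` and `ensemble_classify` are hypothetical, not from the paper:

```python
import random

def select_split_variables(attributes, mtry, rng=None):
    """Randomly reselect mtry candidate splitting attributes for one node (Eq. (18))."""
    rng = rng or random.Random(0)
    return rng.sample(attributes, mtry)

def ensemble_classify(tree_outputs):
    """Average the 0/1 votes of the n_tree trees (Eq. (20)) and threshold at 0.5."""
    return 1 if sum(tree_outputs) / len(tree_outputs) >= 0.5 else 0
```

With three trees voting [1, 1, 0], the averaged output 2/3 exceeds 0.5, so the row is classified as positive.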

From the above principle, we can see that two parameters, n_tree and m_try, mainly need to be determined in the RF modeling process. In order to verify the influence of these two parameters on the classification accuracy of the RF classification model, the Ionosphere dataset is used to test the influence of the two parameters on the performance of

Table 4: Comparison on nine unimodal functions with 5 hybrid algorithms (Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

f1

Max 1.0317E+04 1.2466E+04 1.7232E+04 1.0874E+04 7.4436E+01
Min 0 4.0214E+03 1.6580E−02 0 0
Mean 6.6202E+00 4.6622E+03 6.0643E+01 6.3444E+00 3.3920E−02
Var 2.0892E+02 8.2293E+02 7.3868E+02 1.9784E+02 1.2343E+00

f2

Max 3.3278E+02 1.8153E+02 8.0436E+01 4.9554E+02 2.9845E+01
Min 4.9286E−182 1.5889E+01 1.0536E−01 3.0074E−182 0
Mean 5.7340E−02 1.8349E+01 2.9779E+00 6.9700E−02 1.1220E−02
Var 3.5768E+00 4.8296E+00 2.4966E+00 5.1243E+00 4.2864E−01

f3

Max 1.3078E+04 1.3949E+04 1.9382E+04 1.2899E+04 8.4935E+01
Min 3.4873E−250 4.0327E+03 6.5860E−02 1.6352E−249 0
Mean 7.6735E+00 4.6130E+03 8.6149E+01 7.5623E+00 3.3260E−02
Var 2.4929E+02 8.4876E+02 8.1698E+02 2.4169E+02 1.2827E+00

f4

Max 2.5311E+09 2.3900E+09 4.8639E+09 3.7041E+09 3.5739E+08
Min 5.2310E+00 2.7690E+08 8.4802E+00 5.0021E+00 8.9799E+00
Mean 6.9192E+05 3.3334E+08 1.1841E+07 9.9162E+05 6.8518E+04
Var 3.6005E+07 1.4428E+08 1.8149E+08 4.5261E+07 4.0563E+06

f5

Max 1.1619E+04 1.3773E+04 1.6188E+04 1.3194E+04 5.1960E+03
Min 5.5043E−15 5.6109E+03 1.1894E−02 4.2090E−15 1.5157E+00
Mean 5.9547E+00 6.3278E+03 5.2064E+01 6.5198E+00 3.5440E+00
Var 2.0533E+02 9.6605E+02 6.2095E+02 2.2457E+02 7.8983E+01

f6

Max 3.2456E+00 7.3566E+00 8.9320E+00 2.8822E+00 1.4244E+00
Min 1.3994E−04 1.2186E+00 2.2482E−03 8.2911E−05 1.0911E−05
Mean 2.1509E−03 1.4021E+00 1.1982E−01 1.9200E−03 6.1476E−04
Var 5.3780E−02 3.8482E−01 3.5554E−01 5.0940E−02 1.9880E−02

f7

Max 4.7215E+02 6.7534E+02 5.6753E+02 5.3090E+02 2.3468E+02
Min 0 2.2001E+02 5.6300E−02 0 0
Mean 2.4908E−01 2.3377E+02 9.2909E+00 3.0558E−01 9.4500E−02
Var 8.5433E+00 3.3856E+01 2.2424E+01 1.0089E+01 3.5569E+00

f8

Max 3.2500E+02 2.4690E+02 2.7226E+02 2.8001E+02 1.7249E+02
Min 1.4678E−239 8.3483E+01 5.9820E−02 8.9624E−239 0
Mean 1.9072E−01 9.1050E+01 7.9923E+00 2.3232E−01 8.1580E−02
Var 6.3211E+00 1.3811E+01 1.7349E+01 6.4400E+00 2.9531E+00


the RF model, as shown in Figure 3, where the horizontal axes represent n_tree and m_try, respectively, and the vertical axis represents the accuracy of the RF classification model.

(1) Parameter analysis of ntree

When the number of predictor variables m_try is set to 6, the number of decision trees n_tree is cyclically set from 0 to 1000 at intervals of 20, and the evolution of the RF classification model accuracy with the change of n_tree is shown in Figure 3(a). From the curve in Figure 3(a), we can see that the accuracy of RF gradually improves as the number of decision trees increases. However, when the number of decision trees n_tree is greater than a certain value, the improvement of RF performance becomes gentle, without obvious gains, while the running time becomes longer.

(2) Parameter analysis of mtry

When the number of decision trees n_tree is set to 500, the number of predictor variables m_try is cyclically set from 1 to 32. The limit of m_try is set to 32 because the number of attributes of the Ionosphere dataset is 32. The obtained curve of RF classification model accuracy as m_try varies is shown in Figure 3(b). We can see that, as more splitting attributes are selected, the classification performance of RF gradually improves; but when the number of predictor variables m_try is greater than 9, the RF overfits and its accuracy begins to decrease. The main reason is that too many split attributes are selected, so that a large number of decision trees share the same splitting attributes, which reduces the diversity of the decision trees.
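The two parameter sweeps above amount to an exhaustive grid search over (n_tree, m_try). A minimal sketch of such a search is shown below; `score` stands for any user-supplied function that trains an RF with the given parameters and returns its test accuracy (the function name and toy objective are illustrative, not from the paper):

```python
def grid_search_rf_params(score, ntree_values, mtry_values):
    """Exhaustively score every (ntree, mtry) pair and keep the most accurate one."""
    best_params, best_acc = None, float("-inf")
    for ntree in ntree_values:
        for mtry in mtry_values:
            acc = score(ntree, mtry)
            if acc > best_acc:
                best_params, best_acc = (ntree, mtry), acc
    return best_params, best_acc

# Toy objective peaking at ntree = 500, mtry = 9, mimicking the trends in Figure 3.
toy = lambda n, m: -((n - 500) ** 2) - ((m - 9) ** 2)
params, acc = grid_search_rf_params(toy, range(20, 1001, 20), range(1, 33))
```

On this toy objective the search returns (500, 9), matching the values the Ionosphere experiment suggests. Such a full grid is expensive, which motivates the swarm-based optimization in the next subsection.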

Table 5: Comparison on nine unimodal functions with hybrid algorithms (Dim = 2).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

f1

Max 1.8411E+02 3.4713E+02 2.9347E+02 1.6918E+02 1.6789E+00
Min 0 5.7879E−01 0 0 3.2988E−238
Mean 5.5890E−02 1.1150E+00 1.6095E−01 3.6580E−02 4.2590E−03
Var 2.6628E+00 5.7218E+00 5.4561E+00 2.0867E+00 7.5858E−02

f2

Max 2.2980E+00 2.1935E+00 3.2363E+00 3.1492E+00 1.1020E+00
Min 5.5923E−266 8.2690E−02 2.9096E−242 3.4367E−241 0
Mean 9.2769E−04 9.3960E−02 7.4900E−03 1.2045E−03 2.1565E−04
Var 3.3310E−02 6.9080E−02 4.0100E−02 4.2190E−02 1.3130E−02

f3

Max 1.0647E+02 1.3245E+02 3.6203E+02 2.3793E+02 1.3089E+02
Min 0 5.9950E−01 0 0 0
Mean 2.3040E−02 8.5959E−01 2.5020E−01 6.3560E−02 1.7170E−02
Var 1.2892E+00 2.8747E+00 7.5569E+00 2.9203E+00 1.3518E+00

f4

Max 1.7097E+02 6.1375E+01 6.8210E+01 5.9141E+01 1.4726E+01
Min 1.6325E−21 4.0940E−02 8.2726E−13 3.4830E−25 0
Mean 2.2480E−02 9.3940E−02 1.5730E−02 1.1020E−02 1.4308E−02
Var 1.7987E+00 7.4859E−01 7.6015E−01 6.4984E−01 2.7598E−01

f5

Max 1.5719E+02 2.2513E+02 3.3938E+02 1.8946E+02 8.7078E+01
Min 0 7.0367E−01 0 0 0
Mean 3.4380E−02 1.8850E+00 1.7082E−01 5.0090E−02 1.1880E−02
Var 1.9018E+00 5.6163E+00 5.9868E+00 2.4994E+00 9.1749E−01

f6

Max 1.5887E−01 1.5649E−01 1.5919E−01 1.3461E−01 1.0139E−01
Min 2.5412E−05 4.5060E−04 5.9140E−05 4.1588E−05 7.3524E−06
Mean 2.3437E−04 1.3328E−03 6.0989E−04 2.3462E−04 9.2394E−05
Var 2.4301E−03 3.6700E−03 3.5200E−03 1.9117E−03 1.4664E−03

f7

Max 3.5804E+00 2.8236E+00 1.7372E+00 2.7513E+00 1.9411E+00
Min 0 7.6633E−03 0 0 0
Mean 8.5474E−04 1.6590E−02 8.6701E−04 7.6781E−04 3.1439E−04
Var 4.4630E−02 6.0390E−02 2.4090E−02 3.7520E−02 2.2333E−02

f8

Max 4.3247E+00 2.1924E+00 5.3555E+00 3.3944E+00 5.5079E−01
Min 0 8.6132E−03 0 0 0
Mean 1.1649E−03 1.9330E−02 1.7145E−03 7.3418E−04 6.9138E−05
Var 5.9280E−02 4.7800E−02 7.5810E−02 4.1414E−02 5.7489E−03

f9

Max 2.7030E−02 3.5200E−02 1.7240E−02 4.0480E−02 2.5230E−02
Min 0 5.0732E−03 0 0 0
Mean 6.1701E−05 6.2500E−03 8.8947E−04 8.4870E−05 2.7362E−05
Var 7.6990E−04 1.3062E−03 2.0400E−03 9.6160E−04 5.5610E−04


In summary, for the RF classification model to obtain the ideal optimal solution, the selection of the number of decision trees n_tree and the number of predictor variables m_try is very important, and the classification accuracy of the RF classification model can only be optimized through the comprehensive optimization of these two parameters. So it is necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we will optimize the RF classification model by the improved BSA proposed in Section 2.

3.2. RF Model Based on an Improved Bird Swarm Algorithm. The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) uses the improved bird swarm algorithm to optimize the RF classification model and introduces the training dataset into the training process of the RF classification model, finally obtaining the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters, n_tree and m_try, as the optimization target of DMSDL-QBSA, so as to obtain a set of grouping parameters that give the RF classification model the best classification accuracy. The specific algorithm steps are shown in Algorithm 2.
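One plausible shape for the two-dimensional fitness function described above is sketched below: a continuous swarm position is decoded to integer (n_tree, m_try) values, and the RF classification error is minimized. The bounds, helper names, and decoding scheme are assumptions for illustration; the paper does not specify them:

```python
def decode_position(position, ntree_bounds=(20, 1000), mtry_bounds=(1, 32)):
    """Round and clamp a 2-D continuous swarm position to integer (ntree, mtry)."""
    def clamp(v, lo, hi):
        return max(lo, min(hi, int(round(v))))
    return clamp(position[0], *ntree_bounds), clamp(position[1], *mtry_bounds)

def fitness(position, accuracy_of):
    """Classification error of the RF built from the decoded parameters (minimized)."""
    ntree, mtry = decode_position(position)
    return 1.0 - accuracy_of(ntree, mtry)
```

Each DMSDL-QBSA individual would then carry a 2-D position, and the swarm's best position at convergence yields the (n_tree, m_try) pair used to build the final model.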

3.3. Simulation Experiment and Analysis. In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare the improved classification model with the standard RF model, BSA-RF model, and DMSDL-BSA-RF model on 8 two-dimensional UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by BSA without quantum behavior. In our experiment, each dataset is divided into two parts: 70% of the dataset is used as the training set and the remaining 30% as the test set. The average classification accuracies of 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.
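The 70%/30% split used in this protocol can be sketched as follows (an illustrative helper; the paper does not give its splitting code, and the seed is arbitrary):

```python
import random

def split_70_30(rows, seed=42):
    """Shuffle the dataset and return a 70% training / 30% test split."""
    rng = random.Random(seed)
    shuffled = list(rows)
    rng.shuffle(shuffled)
    cut = int(0.7 * len(shuffled))
    return shuffled[:cut], shuffled[cut:]
```

Repeating this split with 10 different seeds and averaging the resulting test accuracies reproduces the evaluation protocol described above.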

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model gets the best accuracy on each UCI dataset except the magic dataset, on which the DMSDL-BSA-RF classification model obtains the best accuracy. Compared with the standard RF model, the DMSDL-QBSA-RF classification model improves the accuracy by about 10%. Finally, the

Table 6: Comparison on nine multimodal functions with hybrid algorithms (Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

f10

Max 2.8498E+03 2.8226E+03 3.0564E+03 2.7739E+03 2.8795E+03
Min 1.2553E+02 1.8214E+03 1.2922E+03 1.6446E+02 1.1634E+03
Mean 2.5861E+02 1.9229E+03 1.3185E+03 3.1119E+02 1.2729E+03
Var 2.4093E+02 1.2066E+02 1.2663E+02 2.5060E+02 1.2998E+02

f11

Max 1.2550E+02 1.0899E+02 1.1806E+02 1.1243E+02 9.1376E+01
Min 0 6.3502E+01 1.0751E+01 0 0
Mean 2.0417E−01 6.7394E+01 3.9864E+01 1.3732E−01 6.8060E−02
Var 3.5886E+00 5.8621E+00 1.3570E+01 3.0325E+00 2.0567E+00

f12

Max 2.0021E+01 1.9910E+01 1.9748E+01 1.9254E+01 1.8118E+01
Min 8.8818E−16 1.6575E+01 7.1700E−02 8.8818E−16 8.8818E−16
Mean 3.0500E−02 1.7157E+01 3.0367E+00 3.8520E−02 1.3420E−02
Var 5.8820E−01 5.2968E−01 1.6585E+00 6.4822E−01 4.2888E−01

f13

Max 1.0431E+02 1.3266E+02 1.5115E+02 1.2017E+02 6.1996E+01
Min 0 4.5742E+01 2.1198E−01 0 0
Mean 6.1050E−02 5.2056E+01 3.0613E+00 6.9340E−02 2.9700E−02
Var 1.8258E+00 8.3141E+00 1.5058E+01 2.2452E+00 1.0425E+00

f14

Max 8.4576E+06 3.0442E+07 5.3508E+07 6.2509E+07 8.5231E+06
Min 1.7658E−13 1.9816E+06 4.5685E−05 1.6961E−13 5.1104E−01
Mean 1.3266E+03 3.1857E+06 6.8165E+04 8.8667E+03 1.1326E+03
Var 9.7405E+04 1.4876E+06 1.4622E+06 6.4328E+05 8.7645E+04

f15

Max 1.8310E+08 1.4389E+08 1.8502E+08 1.4578E+08 2.5680E+07
Min 1.7942E−11 1.0497E+07 2.4500E−03 1.1248E−11 9.9870E−01
Mean 3.7089E+04 1.5974E+07 2.0226E+05 3.8852E+04 3.5739E+03
Var 2.0633E+06 1.0724E+07 4.6539E+06 2.1133E+06 2.6488E+05

f16

Max 1.3876E+01 1.4988E+01 1.4849E+01 1.3506E+01 9.3280E+00
Min 4.2410E−174 6.8743E+00 2.5133E−02 7.3524E−176 0
Mean 1.3633E−02 7.2408E+00 2.5045E+00 1.3900E−02 5.1800E−03
Var 3.3567E−01 7.7774E−01 1.0219E+00 3.4678E−01 1.7952E−01

f18

Max 3.6704E+01 3.6950E+01 2.8458E+01 2.6869E+01 2.4435E+01
Min 2.0914E−11 9.7737E+00 3.3997E−03 5.9165E−12 7.5806E−01
Mean 6.5733E−02 1.2351E+01 6.7478E−01 5.6520E−02 7.9392E−01
Var 6.7543E−01 2.8057E+00 1.4666E+00 6.4874E−01 3.6928E−01


DMSDL-QBSA-RF classification model obtains the best accuracy on the appendicitis dataset, which is up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is valid on most datasets and performs well on them.

4. Oil Layer Classification Application

4.1. Design of Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1: The selected actual logging datasets are intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis, and the dataset should be relatively independent. The dataset is randomly divided into two parts: training and testing samples.

Step 2: In order to better understand the relationship between independent variables and dependent variables and reduce the sample information attributes, the continuous attributes of the dataset should be discretized by using a greedy algorithm.
Step 3: In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize the attribute reduction. After attribute reduction, normalization of the actual logging datasets is carried out to avoid computational saturation.
Step 4: In the DMSDL-QBSA-RF layer classification model, we input the actual logging dataset after attribute reduction, use the DMSDL-QBSA-RF layer classification algorithm to train, and finally get the DMSDL-QBSA-RF layer classification model.

Table 7: Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

f10

Max 2.0101E+02 2.8292E+02 2.9899E+02 2.4244E+02 2.8533E+02
Min 2.5455E−05 3.7717E+00 2.3690E+01 2.5455E−05 9.4751E+01
Mean 3.4882E−01 8.0980E+00 2.5222E+01 4.2346E−01 9.4816E+01
Var 5.7922E+00 1.0853E+01 1.5533E+01 5.6138E+00 2.8160E+00

f11

Max 4.7662E+00 4.7784E+00 1.1067E+01 8.7792E+00 8.1665E+00
Min 0 3.8174E−01 0 0 0
Mean 2.7200E−03 6.0050E−01 3.1540E−02 4.2587E−03 3.7800E−03
Var 8.6860E−02 3.1980E−01 2.4862E−01 1.2032E−01 1.3420E−01

f12

Max 9.6893E+00 8.1811E+00 1.1635E+01 9.1576E+00 8.4720E+00
Min 8.8818E−16 5.1646E−01 8.8818E−16 8.8818E−16 8.8818E−16
Mean 9.5600E−03 6.9734E−01 3.4540E−02 9.9600E−03 2.8548E−03
Var 2.1936E−01 6.1050E−01 2.5816E−01 2.1556E−01 1.1804E−01

f13

Max 4.4609E+00 4.9215E+00 4.1160E+00 1.9020E+00 1.6875E+00
Min 0 1.3718E−01 0 0 0
Mean 1.9200E−03 1.7032E−01 1.8240E−02 1.4800E−03 5.7618E−04
Var 6.6900E−02 1.3032E−01 1.8202E−01 3.3900E−02 2.2360E−02

f14

Max 1.0045E+01 1.9266E+03 1.9212E+01 5.7939E+02 8.2650E+00
Min 2.3558E−31 1.3188E−01 2.3558E−31 2.3558E−31 2.3558E−31
Mean 4.1600E−03 3.5402E−01 1.0840E−02 6.1420E−02 1.3924E−03
Var 1.7174E−01 1.9427E+01 3.9528E−01 5.8445E+00 8.7160E−02

f15

Max 6.5797E+04 4.4041E+03 1.4412E+05 8.6107E+03 2.6372E+00
Min 1.3498E−31 9.1580E−02 1.3498E−31 1.3498E−31 7.7800E−03
Mean 7.1736E+00 8.9370E−01 1.7440E+01 9.0066E−01 8.2551E−03
Var 6.7678E+02 5.4800E+01 1.4742E+03 8.7683E+01 2.8820E−02

f16

Max 6.2468E−01 6.4488E−01 5.1564E−01 8.4452E−01 3.9560E−01
Min 6.9981E−08 2.5000E−03 1.5518E−240 2.7655E−07 0
Mean 2.7062E−04 6.9400E−03 6.8555E−04 2.0497E−04 6.1996E−05
Var 1.0380E−02 1.7520E−02 8.4600E−03 1.0140E−02 4.4000E−03

f17

Max 5.1946E+00 3.6014E+00 2.3463E+00 6.9106E+00 1.2521E+00
Min 2.6445E−11 2.6739E−02 0 1.0855E−10 0
Mean 1.9343E−03 5.1800E−02 1.2245E−03 2.8193E−03 1.5138E−04
Var 7.3540E−02 1.2590E−01 4.1620E−02 1.1506E−01 1.2699E−02

f18

Max 5.0214E−01 3.4034E−01 4.1400E−01 3.7422E−01 4.0295E−01
Min 1.4998E−32 1.9167E−03 1.4998E−32 1.4998E−32 1.4998E−32
Mean 1.0967E−04 4.1000E−03 1.8998E−04 1.4147E−04 6.0718E−05
Var 6.1800E−03 1.0500E−02 6.5200E−03 5.7200E−03 4.4014E−03


Step 5: The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and we output the classification results.
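The normalization in Step 3 can be sketched with a simple min-max scaling of each logging attribute into [0, 1]; this is an illustrative assumption, as the paper does not state the exact normalization formula:

```python
def min_max_normalize(column):
    """Scale one logging attribute into [0, 1] to avoid computational saturation."""
    lo, hi = min(column), max(column)
    if hi == lo:  # constant attribute: map every value to 0
        return [0.0 for _ in column]
    return [(v - lo) / (hi - lo) for v in column]
```

Applying it column by column yields the normalized curves plotted against depth in Figure 5.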

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA is simulated and analyzed on benchmark functions, and in Section 3.3 the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA is tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted, recorded as W1, W2, and W3. W1 is a gas well in Xi'an (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depths and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before the training of the DMSDL-QBSA-RF classification model on the training dataset, as shown in Table 14. Then these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, SVM model, BSA-RF model, and DMSDL-BSA-RF model. Here, the RF classification model is applied to the field of logging for the first time. In order to evaluate the performance of the recognition model, we select the following performance indicators:

RMSE = sqrt( (1/N) Σ_{i=1}^{N} (f_i − y_i)² ),
MAE = (1/N) Σ_{i=1}^{N} |f_i − y_i|,
(21)

Table 8: Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function Term GWO WOA SCA GOA SSA DMSDL-QBSA

f1

Max 1.3396E+04 1.4767E+04 1.3310E+04 2.0099E+01 4.8745E+03 9.8570E−01
Min 0 0 4.2905E−293 8.6468E−17 0 0
Mean 3.5990E+00 4.7621E+00 1.4014E+02 7.0100E−02 4.5864E+00 1.0483E−04
Var 1.7645E+02 2.0419E+02 8.5054E+02 4.4200E−01 1.4148E+02 9.8725E−03

f2

Max 3.6021E+02 2.5789E+03 6.5027E+01 9.3479E+01 3.4359E+01 3.2313E−01
Min 0 0 9.8354E−192 2.8954E−03 2.4642E−181 0
Mean 5.0667E−02 2.9480E−01 4.0760E−01 3.1406E+00 1.7000E−02 1.1278E−04
Var 3.7270E+00 2.6091E+01 2.2746E+00 3.9264E+00 5.4370E−01 4.8000E−03

f3

Max 1.8041E+04 1.6789E+04 2.4921E+04 6.5697E+03 1.1382E+04 4.0855E+01
Min 0 1.0581E−18 7.6116E−133 2.8796E−01 3.2956E−253 1.5918E−264
Mean 7.4511E+00 2.8838E+02 4.0693E+02 4.8472E+02 9.2062E+00 4.8381E−03
Var 2.6124E+02 1.4642E+03 1.5913E+03 7.1786E+02 2.9107E+02 4.1383E−01

f4

Max 2.1812E+09 5.4706E+09 8.4019E+09 1.1942E+09 4.9386E+08 7.5188E+01
Min 4.9125E+00 3.5695E+00 5.9559E+00 2.2249E+02 4.9806E+00 2.9279E−13
Mean 4.9592E+05 2.4802E+06 4.4489E+07 5.0021E+06 3.3374E+05 2.4033E−02
Var 2.9484E+07 1.1616E+08 4.5682E+08 3.9698E+07 1.1952E+07 9.7253E−01

f5

Max 1.8222E+04 1.5374E+04 1.5874E+04 1.2132E+03 1.6361E+04 1.8007E+00
Min 1.1334E−08 8.3228E−09 2.3971E−01 2.7566E−10 2.6159E−16 1.0272E−33
Mean 5.1332E+00 5.9967E+00 1.2620E+02 5.8321E+01 8.8985E+00 2.3963E−04
Var 2.3617E+02 2.3285E+02 8.8155E+02 1.0872E+02 2.9986E+02 1.8500E−02

f6

Max 7.4088E+00 8.3047E+00 8.8101E+00 6.8900E−01 4.4298E+00 2.5787E−01
Min 1.8112E−05 3.9349E−05 4.8350E−05 8.9528E−02 4.0807E−05 1.0734E−04
Mean 1.8333E−03 3.2667E−03 4.8400E−02 9.3300E−02 2.1000E−03 5.2825E−04
Var 9.4267E−02 1.0077E−01 2.9410E−01 3.5900E−02 6.5500E−02 4.6000E−03

f7

Max 7.3626E+02 6.8488E+02 8.0796E+02 3.9241E+02 8.2036E+02 1.4770E+01
Min 0 0 1.9441E−292 7.9956E−07 0 0
Mean 2.0490E−01 2.8060E−01 4.9889E+00 1.6572E+01 2.7290E−01 1.8081E−03
Var 9.5155E+00 1.0152E+01 3.5531E+01 2.3058E+01 1.0581E+01 1.6033E−01

f8

Max 1.2749E+03 5.9740E+02 3.2527E+02 2.3425E+02 2.0300E+02 2.0423E+01
Min 0 4.3596E−35 1.5241E−160 3.6588E−05 1.0239E−244 0
Mean 3.1317E−01 1.0582E+01 1.0457E+01 1.2497E+01 2.1870E−01 2.6947E−03
Var 1.4416E+01 4.3485E+01 3.5021E+01 2.5766E+01 6.2362E+00 2.1290E−01


where y_i and f_i are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show actual forecasting errors. Table 15 records the performance indicator data of each classification model, and the best results are marked in bold. The smaller the RMSE and MAE, the better the classification model performs.
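Equation (21) translates directly to code; a minimal sketch (function names are our own):

```python
import math

def rmse(expected, predicted):
    """Root-mean-square error between expected values y_i and outputs f_i (Eq. (21))."""
    n = len(expected)
    return math.sqrt(sum((f - y) ** 2 for y, f in zip(expected, predicted)) / n)

def mae(expected, predicted):
    """Mean absolute error between expected values and outputs (Eq. (21))."""
    n = len(expected)
    return sum(abs(f - y) for y, f in zip(expected, predicted)) / n
```

For 0/1 oil-layer labels, MAE coincides with the fraction of misclassified depths, which makes the values in Table 15 easy to interpret.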

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model achieves the best recognition accuracy, and all its accuracies are above 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it has superior performance for oil and gas layer classification on the other performance indicators and on different wells. Secondly, DMSDL-QBSA can improve the performance of RF, and the parameters found by DMSDL-QBSA used in the RF classification model can improve the classification accuracy while keeping the running speed

relatively fast at the same time. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than those of the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means this depth has no oil or gas and 1 means this depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results: it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is suitable for petroleum logging applications, greatly reduces the difficulty of oil exploration, and has good application prospects.

Table 9: Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function Term GWO WOA SCA GOA SSA DMSDL-QBSA

f10

Max 2.9544E+03 2.9903E+03 2.7629E+03 2.9445E+03 3.2180E+03 3.0032E+03
Min 1.3037E+03 8.6892E−04 1.5713E+03 1.3438E+03 1.2839E−04 1.2922E+03
Mean 1.6053E+03 1.4339E+01 1.7860E+03 1.8562E+03 2.5055E+02 1.2960E+03
Var 2.6594E+02 1.2243E+02 1.6564E+02 5.1605E+02 3.3099E+02 5.8691E+01

f11

Max 1.3792E+02 1.2293E+02 1.2313E+02 1.1249E+02 3.2180E+03 1.8455E+01
Min 0 0 0 1.2437E+01 1.2839E−04 0
Mean 4.4220E−01 1.1252E+00 9.9316E+00 2.8378E+01 2.5055E+02 3.2676E−03
Var 4.3784E+00 6.5162E+00 1.9180E+01 1.6240E+01 3.3099E+02 1.9867E−01

f12

Max 2.0257E+01 2.0043E+01 1.9440E+01 1.6623E+01 3.2180E+03 1.9113E+00
Min 4.4409E−15 3.2567E−15 3.2567E−15 2.3168E+00 1.2839E−04 8.8818E−16
Mean 1.7200E−02 4.2200E−02 8.8870E−01 5.5339E+00 2.5055E+02 3.1275E−04
Var 4.7080E−01 6.4937E−01 3.0887E+00 2.8866E+00 3.3099E+02 2.2433E−02

f13

Max 1.5246E+02 1.6106E+02 1.1187E+02 6.1505E+01 3.2180E+03 3.1560E−01
Min 3.3000E−03 0 0 2.4147E−01 1.2839E−04 0
Mean 4.6733E−02 9.3867E−02 1.2094E+00 3.7540E+00 2.5055E+02 3.3660E−05
Var 1.9297E+00 2.6570E+00 6.8476E+00 4.1936E+00 3.3099E+02 3.1721E−03

f14

Max 9.5993E+07 9.9026E+07 5.9355E+07 6.1674E+06 3.2180E+03 6.4903E−01
Min 3.8394E−09 1.1749E−08 9.6787E−03 1.8099E−04 1.2839E−04 4.7116E−32
Mean 1.2033E+04 3.5007E+04 4.8303E+05 1.0465E+04 2.5055E+02 8.9321E−05
Var 9.8272E+05 1.5889E+06 4.0068E+06 1.9887E+05 3.3099E+02 6.8667E−03

f15

Max 2.2691E+08 2.4717E+08 1.1346E+08 2.8101E+07 3.2180E+03 1.6407E−01
Min 3.2467E−02 4.5345E−08 1.1922E−01 3.5465E−05 1.2839E−04 1.3498E−32
Mean 2.9011E+04 4.3873E+04 6.5529E+05 7.2504E+04 2.5055E+02 6.7357E−05
Var 2.3526E+06 2.7453E+06 7.1864E+06 1.2814E+06 3.3099E+02 2.4333E−03

f16

Max 1.7692E+01 1.7142E+01 1.6087E+01 8.7570E+00 3.2180E+03 1.0959E+00
Min 2.6210E−07 0.0000E+00 6.2663E−155 1.0497E−02 1.2839E−04 0
Mean 1.0133E−02 3.9073E−01 5.9003E−01 2.4770E+00 2.5055E+02 1.5200E−04
Var 3.0110E−01 9.6267E−01 1.4701E+00 1.9985E+00 3.3099E+02 1.1633E−02

f18

Max 4.4776E+01 4.3588E+01 3.9095E+01 1.7041E+01 3.2180E+03 6.5613E−01
Min 1.9360E−01 9.4058E−08 2.2666E−01 3.9111E+00 1.2839E−04 1.4998E−32
Mean 2.1563E−01 4.7800E−02 9.8357E−01 5.4021E+00 2.5055E+02 9.4518E−05
Var 5.8130E−01 1.0434E+00 3.0643E+00 1.6674E+00 3.3099E+02 7.0251E−03


Table 10: Comparison of numerical testing results on CEC2014 test sets (F1–F15, Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F1

Max 2.7411E+08 3.6316E+09 6.8993E+08 3.9664E+08 9.6209E+08
Min 1.4794E+07 3.1738E+09 1.3205E+06 1.6929E+05 1.7687E+05
Mean 3.1107E+07 3.2020E+09 6.7081E+06 1.3567E+06 1.9320E+06
Var 1.2900E+07 4.0848E+07 2.6376E+07 1.0236E+07 1.3093E+07

F2

Max 8.7763E+09 1.3597E+10 1.3515E+10 1.6907E+10 1.7326E+10
Min 1.7455E+09 1.1878E+10 3.6982E+04 2.1268E+05 1.4074E+05
Mean 1.9206E+09 1.1951E+10 1.7449E+08 5.9354E+07 4.8412E+07
Var 1.9900E+08 1.5615E+08 9.2365E+08 5.1642E+08 4.2984E+08

F3

Max 4.8974E+05 8.5863E+04 2.6323E+06 2.4742E+06 5.3828E+06
Min 1.1067E+04 1.4616E+04 1.8878E+03 1.2967E+03 6.3986E+02
Mean 1.3286E+04 1.4976E+04 1.0451E+04 3.5184E+03 3.0848E+03
Var 6.5283E+03 1.9988E+03 4.5736E+04 4.1580E+04 8.0572E+04

F4

Max 3.5252E+03 9.7330E+03 4.5679E+03 4.9730E+03 4.3338E+03
Min 5.0446E+02 8.4482E+03 4.0222E+02 4.0843E+02 4.1267E+02
Mean 5.8061E+02 8.5355E+03 4.4199E+02 4.3908E+02 4.3621E+02
Var 1.4590E+02 1.3305E+02 1.6766E+02 1.2212E+02 8.5548E+01

F5

Max 5.2111E+02 5.2106E+02 5.2075E+02 5.2098E+02 5.2110E+02
Min 5.2001E+02 5.2038E+02 5.2027E+02 5.2006E+02 5.2007E+02
Mean 5.2003E+02 5.2041E+02 5.2033E+02 5.2014E+02 5.2014E+02
Var 7.8380E−02 5.7620E−02 5.6433E−02 1.0577E−01 9.9700E−02

F6

Max 6.1243E+02 6.1299E+02 6.1374E+02 6.1424E+02 6.1569E+02
Min 6.0881E+02 6.1157E+02 6.0514E+02 6.0288E+02 6.0257E+02
Mean 6.0904E+02 6.1164E+02 6.0604E+02 6.0401E+02 6.0358E+02
Var 2.5608E−01 1.2632E−01 1.0434E+00 1.1717E+00 1.3117E+00

F7

Max 8.7895E+02 1.0459E+03 9.1355E+02 1.0029E+03 9.3907E+02
Min 7.4203E+02 1.0238E+03 7.0013E+02 7.0081E+02 7.0069E+02
Mean 7.4322E+02 1.0253E+03 7.1184E+02 7.0332E+02 7.0290E+02
Var 4.3160E+00 2.4258E+00 2.8075E+01 1.5135E+01 1.3815E+01

F8

Max 8.9972E+02 9.1783E+02 9.6259E+02 9.2720E+02 9.3391E+02
Min 8.4904E+02 8.8172E+02 8.3615E+02 8.1042E+02 8.0877E+02
Mean 8.5087E+02 8.8406E+02 8.5213E+02 8.1888E+02 8.1676E+02
Var 2.1797E+00 4.4494E+00 8.6249E+00 9.2595E+00 9.7968E+00

F9

Max 1.0082E+03 9.9538E+02 1.0239E+03 1.0146E+03 1.0366E+03
Min 9.3598E+02 9.6805E+02 9.2725E+02 9.2062E+02 9.1537E+02
Mean 9.3902E+02 9.7034E+02 9.4290E+02 9.2754E+02 9.2335E+02
Var 3.4172E+00 3.2502E+00 8.1321E+00 9.0492E+00 9.3860E+00

F10

Max 3.1087E+03 3.5943E+03 3.9105E+03 3.9116E+03 3.4795E+03
Min 2.2958E+03 2.8792E+03 1.5807E+03 1.4802E+03 1.4316E+03
Mean 1.8668E+03 2.9172E+03 1.7627E+03 1.6336E+03 1.6111E+03
Var 3.2703E+01 6.1787E+01 2.2359E+02 1.9535E+02 2.2195E+02

F11

Max 3.6641E+03 3.4628E+03 4.0593E+03 3.5263E+03 3.7357E+03
Min 2.4726E+03 2.8210E+03 1.7855E+03 1.4012E+03 1.2811E+03
Mean 2.5553E+03 2.8614E+03 1.8532E+03 1.5790E+03 1.4869E+03
Var 8.4488E+01 5.6351E+01 1.3201E+02 2.1085E+02 2.6682E+02

F12

Max 1.2044E+03 1.2051E+03 1.2053E+03 1.2055E+03 1.2044E+03
Min 1.2006E+03 1.2014E+03 1.2004E+03 1.2003E+03 1.2017E+03
Mean 1.2007E+03 1.2017E+03 1.2007E+03 1.2003E+03 1.2018E+03
Var 1.4482E−01 3.5583E−01 2.5873E−01 2.4643E−01 2.1603E−01

F13

Max 1.3049E+03 1.3072E+03 1.3073E+03 1.3056E+03 1.3061E+03
Min 1.3005E+03 1.3067E+03 1.3009E+03 1.3003E+03 1.3004E+03
Mean 1.3006E+03 1.3068E+03 1.3011E+03 1.3005E+03 1.3006E+03
Var 3.2018E−01 5.0767E−02 4.2173E−01 3.6570E−01 2.9913E−01

F14

Max 1.4395E+03 1.4565E+03 1.4775E+03 1.4749E+03 1.4493E+03
Min 1.4067E+03 1.4522E+03 1.4009E+03 1.4003E+03 1.4005E+03
Mean 1.4079E+03 1.4529E+03 1.4024E+03 1.4008E+03 1.4009E+03
Var 2.1699E+00 6.3013E−01 5.3198E+00 3.2578E+00 2.8527E+00

F15

Max 5.4068E+04 3.3586E+04 4.8370E+05 4.0007E+05 1.9050E+05
Min 1.9611E+03 2.1347E+04 1.5029E+03 1.5027E+03 1.5026E+03
Mean 1.9933E+03 2.2417E+04 2.7920E+03 1.5860E+03 1.5816E+03
Var 6.1622E+02 1.2832E+03 1.7802E+04 4.4091E+03 2.7233E+03


Table 11: Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F16

Max 1.6044E+03 1.6044E+03 1.6046E+03 1.6044E+03 1.6046E+03
Min 1.6034E+03 1.6041E+03 1.6028E+03 1.6021E+03 1.6019E+03
Mean 1.6034E+03 1.6041E+03 1.6032E+03 1.6024E+03 1.6024E+03
Var 3.3300E−02 3.5900E−02 2.5510E−01 3.7540E−01 3.6883E−01

F17

Max 1.3711E+07 1.6481E+06 3.3071E+07 4.6525E+07 8.4770E+06
Min 1.2526E+05 4.7216E+05 2.7261E+03 2.0177E+03 2.0097E+03
Mean 1.6342E+05 4.8499E+05 8.7769E+04 3.1084E+04 2.1095E+04
Var 2.0194E+05 2.9949E+04 1.2284E+06 8.7638E+05 2.0329E+05

F18

Max 7.7173E+07 3.5371E+07 6.1684E+08 1.4216E+08 6.6050E+08
Min 4.1934E+03 2.6103E+06 2.2743E+03 1.8192E+03 1.8288E+03
Mean 1.6509E+04 3.4523E+06 1.2227E+06 1.0781E+05 1.9475E+05
Var 7.9108E+05 1.4888E+06 2.1334E+07 3.6818E+06 1.0139E+07

F19

Max 1.9851E+03 2.5657E+03 2.0875E+03 1.9872E+03 1.9555E+03
Min 1.9292E+03 2.4816E+03 1.9027E+03 1.9023E+03 1.9028E+03
Mean 1.9299E+03 2.4834E+03 1.9044E+03 1.9032E+03 1.9036E+03
Var 1.0820E+00 3.8009E+00 1.1111E+01 3.3514E+00 1.4209E+00

F20

Max 8.3021E+06 2.0160E+07 1.0350E+07 6.1162E+07 1.2708E+08
Min 5.6288E+03 1.7838E+06 2.1570E+03 2.0408E+03 2.0241E+03
Mean 1.2260E+04 1.8138E+06 5.6957E+03 1.1337E+04 4.5834E+04
Var 1.0918E+05 5.9134E+05 1.3819E+05 6.4167E+05 2.1988E+06

F21

Max 2.4495E+06 1.7278E+09 2.0322E+06 1.3473E+07 2.6897E+07
Min 4.9016E+03 1.4049E+09 3.3699E+03 2.1842E+03 2.2314E+03
Mean 6.6613E+03 1.4153E+09 9.9472E+03 1.3972E+04 7.4587E+03
Var 4.1702E+04 2.2557E+07 3.3942E+04 2.5098E+05 2.8735E+05

F22

Max 2.8304E+03 4.9894E+03 3.1817E+03 3.1865E+03 3.2211E+03
Min 2.5070E+03 4.1011E+03 2.3492E+03 2.2442E+03 2.2314E+03
Mean 2.5081E+03 4.1175E+03 2.3694E+03 2.2962E+03 2.2687E+03
Var 5.4064E+00 3.8524E+01 4.3029E+01 4.8006E+01 5.1234E+01

F23

Max 2.8890E+03 2.6834E+03 2.8758E+03 3.0065E+03 2.9923E+03
Min 2.5000E+03 2.6031E+03 2.4870E+03 2.5000E+03 2.5000E+03
Mean 2.5004E+03 2.6088E+03 2.5326E+03 2.5010E+03 2.5015E+03
Var 9.9486E+00 8.6432E+00 5.7045E+01 1.4213E+01 1.7156E+01

F24

Max 2.6293E+03 2.6074E+03 2.6565E+03 2.6491E+03 2.6369E+03
Min 2.5816E+03 2.6049E+03 2.5557E+03 2.5246E+03 2.5251E+03
Mean 2.5829E+03 2.6052E+03 2.5671E+03 2.5337E+03 2.5338E+03
Var 1.3640E+00 3.3490E-01 8.9434E+00 1.3050E+01 1.1715E+01

F25

Max 2.7133E+03 2.7014E+03 2.7445E+03 2.7122E+03 2.7327E+03
Min 2.6996E+03 2.7003E+03 2.6784E+03 2.6789E+03 2.6635E+03
Mean 2.6996E+03 2.7004E+03 2.6831E+03 2.6903E+03 2.6894E+03
Var 2.3283E-01 9.9333E-02 4.9609E+00 7.1175E+00 1.2571E+01

F26

Max 2.7056E+03 2.8003E+03 2.7058E+03 2.7447E+03 2.7116E+03
Min 2.7008E+03 2.8000E+03 2.7003E+03 2.7002E+03 2.7002E+03
Mean 2.7010E+03 2.8000E+03 2.7005E+03 2.7003E+03 2.7003E+03
Var 3.1316E-01 1.6500E-02 3.9700E-01 1.1168E+00 3.5003E-01

F27

Max 3.4052E+03 5.7614E+03 3.3229E+03 3.4188E+03 3.4144E+03
Min 2.9000E+03 3.9113E+03 2.9698E+03 2.8347E+03 2.7054E+03
Mean 2.9038E+03 4.0351E+03 2.9816E+03 2.8668E+03 2.7219E+03
Var 3.2449E+01 2.0673E+02 3.4400E+01 6.2696E+01 6.1325E+01

F28

Max 4.4333E+03 5.4138E+03 4.5480E+03 4.3490E+03 4.8154E+03
Min 3.0000E+03 4.0092E+03 3.4908E+03 3.0000E+03 3.0000E+03
Mean 3.0079E+03 4.0606E+03 3.6004E+03 3.0063E+03 3.0065E+03
Var 6.8101E+01 9.2507E+01 8.5705E+01 4.2080E+01 5.3483E+01

F29

Max 2.5038E+07 1.6181E+08 5.9096E+07 7.0928E+07 6.4392E+07
Min 3.2066E+03 1.0014E+08 3.1755E+03 3.1005E+03 3.3287E+03
Mean 5.7057E+04 1.0476E+08 8.5591E+04 6.8388E+04 2.8663E+04
Var 4.9839E+05 6.7995E+06 1.7272E+06 1.5808E+06 8.6740E+05


Figure 3: The effect of the two parameters on the performance of RF models: (a) the effect of ntree (classification accuracy against ntree from 0 to 1000); (b) the effect of mtry (classification accuracy against mtry from 0 to 35).

Variable setting: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF
(1) Begin
(2)   /* Build classification model based on DMSDL-QBSA-RF */
(3)   Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4)   Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5)   While iter < itermax + 1 do
(6)     For i = 1 : ntree
(7)       Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)       Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)       Recursively generate each decision tree without pruning operations
(10)    End For
(11)    Update the classification accuracy of RF: evaluate f(Xi)
(12)    Update gbest and Pgbest
(13)    [nbest, mbest] = gbest
(14)    iter = iter + 1
(15)  End While
(16)  /* Classify using the RF model */
(17)  For i = 1 : nbest
(18)    Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)    Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)    Recursively generate each decision tree without pruning operations
(21)  End
(22)  Return the DMSDL-QBSA-RF classification model
(23)  Classify the test dataset using equation (20)
(24)  Calculate OOB error
(25) End

Algorithm 2: DMSDL-QBSA-RF classification model.
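As a rough illustration of how the outer loop of Algorithm 2 drives the two RF parameters, the sketch below uses a plain bounded random perturbation as a stand-in for the quantum and differential-learning position updates. All names here (`optimize_rf_params`, `fitness`, `n_birds`) are hypothetical, not the authors' implementation; in the paper's setting, `fitness` would be the RF classification accuracy obtained with a given (ntree, mtry).

```python
import random

def optimize_rf_params(fitness, n_birds=5, iters=30,
                       ntree_range=(10, 1000), mtry_range=(1, 32), seed=1):
    """Sketch of Algorithm 2's outer loop: each bird position encodes
    (ntree, mtry); fitness(ntree, mtry) returns a score to maximize;
    the best position found, (nbest, mbest), is returned."""
    rng = random.Random(seed)
    birds = [[rng.randint(*ntree_range), rng.randint(*mtry_range)]
             for _ in range(n_birds)]
    gbest = tuple(birds[0])
    gbest_fit = fitness(*gbest)
    for _ in range(iters):
        for b in birds:
            # Stand-in for the quantum/differential position update:
            # a random perturbation kept inside the search bounds.
            b[0] = min(max(b[0] + rng.randint(-50, 50), ntree_range[0]), ntree_range[1])
            b[1] = min(max(b[1] + rng.randint(-2, 2), mtry_range[0]), mtry_range[1])
            f = fitness(b[0], b[1])
            if f > gbest_fit:
                gbest, gbest_fit = (b[0], b[1]), f
    return gbest  # (nbest, mbest), used to build the final forest
```

In the full model, steps (6)-(10) of Algorithm 2 would rebuild the forest inside `fitness` for every candidate position.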

Table 11: Continued.

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F30

Max 5.2623E+05 2.8922E+07 1.1938E+06 1.2245E+06 1.1393E+06
Min 5.9886E+03 1.9017E+07 3.7874E+03 3.6290E+03 3.6416E+03
Mean 7.4148E+03 2.0002E+07 5.5468E+03 4.3605E+03 4.1746E+03
Var 1.1434E+04 1.1968E+06 3.2255E+04 2.2987E+04 1.8202E+04


Figure 4: Block diagram of the oil layer classification system based on DMSDL-QBSA-RF. The to-be-identified information passes through data selection and preprocessing (removal of redundant attributes and pretreatment), attribute discretization, attribute reduction and data normalization, model building based on DMSDL-QBSA-RF, and RF classification before the output.

Table 13: Distribution of the logging data.

Well | Training dataset: Depth (m), Data, Oil/gas layers, Dry layer | Test dataset: Depth (m), Data, Oil/gas layers, Dry layer
W1 | 3027-3058, 250, 203, 47 | 3250-3450, 1600, 237, 1363
W2 | 2642-2648, 30, 10, 20 | 2940-2978, 191, 99, 92
W3 | 1040-1059, 160, 47, 113 | 1120-1290, 1114, 96, 1018

Table 12: Comparison of numerical testing results on eight UCI datasets.

Dataset | RF (%) | BSA-RF (%) | DMSDL-BSA-RF (%) | DMSDL-QBSA-RF (%)
Blood | 75.40 | 80.80 | 79.46 | 81.25
Heart-statlog | 82.10 | 86.42 | 86.42 | 86.42
Sonar | 78.39 | 85.48 | 85.48 | 88.71
Appendicitis | 83.23 | 87.10 | 90.32 | 93.55
Cleve | 79.55 | 87.50 | 88.64 | 89.77
Magic | 88.95 | 89.85 | 90.25 | 89.32
Mammographic | 74.73 | 77.68 | 76.79 | 79.46
Australian | 85.83 | 88.84 | 88.84 | 90.78

Table 14: Attribute reduction results of the logging dataset.

Well Attributes

W1 | Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
W1 | Reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2 | Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
W2 | Reduction attributes: NEUTRO, PERM, POR, RESI, SW
W3 | Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
W3 | Reduction attributes: AC, GR, RI, RXO, SP

Figure 5: The normalized curves of attributes: (a) and (b) attribute normalization of W1 (NPHI, SP, DT, LLD, DEN, GR, LLS over depth 3250-3450 m); (c) and (d) attribute normalization of W2 (NEUTRO, PERM, POR, RESI, SW over depth 2940-2980 m); (e) and (f) attribute normalization of W3 (GR, SP, RI, RXO, AC over depth 1150-1300 m).

Table 15: Performance of various well data.

Well | Classification model | RMSE | MAE | Accuracy (%) | Running time (s)

W1 | RF | 0.3326 | 0.1106 | 88.94 | 1.5167
W1 | SVM | 0.2681 | 0.0719 | 92.81 | 4.5861
W1 | BSA-RF | 0.3269 | 0.1069 | 89.31 | 1.8064
W1 | DMSDL-BSA-RF | 0.2806 | 0.0788 | 92.13 | 2.3728
W1 | DMSDL-QBSA-RF | 0.2449 | 0.0600 | 94.00 | 1.4663

W2 | RF | 0.4219 | 0.1780 | 82.20 | 3.1579
W2 | SVM | 0.2983 | 0.0890 | 91.10 | 4.2604
W2 | BSA-RF | 0.3963 | 0.1571 | 84.29 | 1.2979
W2 | DMSDL-BSA-RF | 0.2506 | 0.062827 | 93.72 | 1.6124
W2 | DMSDL-QBSA-RF | 0.2399 | 0.0576 | 94.24 | 1.2287

W3 | RF | 0.4028 | 0.1622 | 83.78 | 2.4971
W3 | SVM | 0.2507 | 0.0628 | 93.72 | 2.1027
W3 | BSA-RF | 0.3631 | 0.1318 | 86.81 | 1.3791
W3 | DMSDL-BSA-RF | 0.2341 | 0.0548 | 94.52 | 0.3125
W3 | DMSDL-QBSA-RF | 0.0519 | 0.0027 | 99.73 | 0.9513


5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of these three strategies on the performance of DMSDL-QBSA revealed that the hybrid method markedly improves on the original BSA, and especially on the original DE. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA in a more comprehensive manner, and DMSDL-QBSA also shows excellent performance on these test functions. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than that of the original RF model. At the same time, it runs faster than the other four advanced classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it still has some shortcomings that warrant further investigation. Because of the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA. Therefore, deploying the proposed algorithm so as to increase recognition efficiency is a worthwhile direction. In future research, the method presented in this paper can also be extended to solving discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical science diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), the Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), the Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), the Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and the Project of Industry-University Cooperative Education of the Ministry of Education of China (no. 201801335014).

Figure 6: Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.
[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.
[4] X. Li, A New Intelligent Optimization Method: Artificial Fish School Algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.
[5] X. S. Yang, "Firefly algorithms for multimodal optimization: stochastic algorithms, foundations and applications," in Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.
[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.
[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673-687, 2016.
[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University - Computer and Information Sciences, vol. 32, pp. 514-521, 2020.
[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," in Neural Computing and Applications, Springer, New York, NY, USA, 2019.
[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.
[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.
[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197-207, 2014.
[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.
[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17-28, 2014.
[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137-146, 2012.
[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, Multi-Strategy Brainstorm Optimization Algorithm with Dynamic Parameters Adjustment, Applied Intelligence, Guildford, UK, 2010.
[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189-120201, 2019.
[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723-8740, 2019.
[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1-15, 2017.
[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.
[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708-720, 2018.
[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5-32, 2001.
[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1-18, 2017.
[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707-130720, 2019.
[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421-428, 2006.
[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," in Proceedings of the IEEE Transactions on Evolutionary Computation, Piscataway, NJ, USA, February 2019.
[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290-304, 2020.
[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877-1890, 2018.
[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156-171, 2019.
[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.
[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209-221, 2018.
[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.
[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276-292, 2019.
[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501-511, 2018.
[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349-393, 2012.
[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China Information Sciences, vol. 62, no. 7, 2019.
[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.
[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.
[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51-67, 2016.
[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120-133, 2016.
[41] S. Shahrzad, M. Seyedali, and L. Andrew, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30-47, 2017.
[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22-34, 2020.
[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.



on any functions. DMSDL-BSA obtained the minimum value of 0 on f11 and f13. In summary, DMSDL-QBSA has a superior global search capability on most multimodal functions when Dim = 2. Obviously, DMSDL-QBSA can find the best two parameters for RF that need to be optimized because of its best global search capability.

In this section, it can be seen from Figures 1 and 2 and Tables 4-7 that DMSDL-QBSA obtains the best function values in most cases. This indicates that the hybrid strategies of BSA, namely the dynamic multi-swarm method, DE, and quantum behavior operators, lead the birds to move towards the best solutions, and that DMSDL-QBSA is well able to search for the best two parameters for RF with high accuracy and efficiency.

2.3.4. Comparison on Benchmark Functions with Popular Algorithms

To compare the timeliness and applicability of DMSDL-QBSA with several popular algorithms, such as GWO, WOA, SCA, GOA, and SSA, the 18 benchmark functions are applied; GWO, WOA, GOA, and SSA are swarm intelligence algorithms. In this experiment, the dimension size of these functions is 10 and the number of function evaluations (FEs) is 100,000. The maximum value (Max), minimum value (Min), mean value (Mean), and variance (Var) obtained by the different algorithms are shown in Tables 8 and 9, where the best results are marked in bold.

Table 2: Summary of the CEC'14 test functions.

Type | No. | Function | Fi* = Fi(x*)

Unimodal functions:
F1 Rotated High-Conditioned Elliptic Function 100
F2 Rotated Bent Cigar Function 200
F3 Rotated Discus Function 300

Simple multimodal functions

F4 Shifted and Rotated Rosenbrock's Function 400
F5 Shifted and Rotated Ackley's Function 500
F6 Shifted and Rotated Weierstrass Function 600
F7 Shifted and Rotated Griewank's Function 700
F8 Shifted Rastrigin's Function 800
F9 Shifted and Rotated Rastrigin's Function 900
F10 Shifted Schwefel's Function 1000
F11 Shifted and Rotated Schwefel's Function 1100
F12 Shifted and Rotated Katsuura Function 1200
F13 Shifted and Rotated HappyCat Function 1300
F14 Shifted and Rotated HGBat Function 1400
F15 Shifted and Rotated Expanded Griewank's plus Rosenbrock's Function 1500
F16 Shifted and Rotated Expanded Scaffer's F6 Function 1600

Hybrid functions

F17 Hybrid Function 1 (N = 3) 1700
F18 Hybrid Function 2 (N = 3) 1800
F19 Hybrid Function 3 (N = 4) 1900
F20 Hybrid Function 4 (N = 4) 2000
F21 Hybrid Function 5 (N = 5) 2100
F22 Hybrid Function 6 (N = 5) 2200

Composition functions

F23 Composition Function 1 (N = 5) 2300
F24 Composition Function 2 (N = 3) 2400
F25 Composition Function 3 (N = 3) 2500
F26 Composition Function 4 (N = 5) 2600
F27 Composition Function 5 (N = 5) 2700
F28 Composition Function 6 (N = 5) 2800
F29 Composition Function 7 (N = 3) 2900
F30 Composition Function 8 (N = 3) 3000

Search range: [-100, 100]^D

Table 3: Parameter settings.

Algorithm | Parameter settings
BSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10
DE: F = 0.5, CR = 0.1
DMSDL-PSO: c1 = c2 = 1.49445, F = 0.5, CR = 0.1, sub-swarm = 10
DMSDL-BSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarm = 10
DMSDL-QBSA: c1 = c2 = 1.5, a1 = a2 = 1, FQ = 10, F = 0.5, CR = 0.1, sub-swarm = 10


From the test results in Table 8, we can see that DMSDL-QBSA has the best performance on every unimodal function. GWO finds the value 0 on f1, f2, f3, f7, and f8; WOA obtains 0 on f1, f2, and f7; SSA works best on f1 and f7. For the multimodal function evaluations, Table 9 shows that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, f16, and f18; SSA has the best performance on f10, GWO attains the minimum on f11, and WOA and SCA obtain the optimal value on f11 and f13. Obviously, compared with these popular algorithms, DMSDL-QBSA is a competitive algorithm for solving these functions, and the swarm intelligence algorithms perform better than the other algorithms. The results of Tables 8 and 9 show that DMSDL-QBSA has the best performance on most of the tested benchmark functions.

2.3.5. Comparison on CEC2014 Test Functions with Hybrid Algorithms

To compare the comprehensive performance of the proposed DMSDL-QBSA with several hybrid algorithms, such as BSA, DE, DMSDL-PSO, and DMSDL-BSA, the 30 CEC2014 test functions are applied. In this experiment, the dimension size (Dim) is set to 10 and the number of function evaluations (FEs) is 100,000. Experimental comparisons including the maximum value (Max), minimum value (Min), mean value (Mean), and variance (Var) are given in Tables 10 and 11, where the best results are marked in bold.
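The four statistics reported in Tables 10 and 11 can be gathered with a short helper. This is a sketch under two assumptions the paper does not spell out: that the statistics summarize the final fitness values of repeated independent runs, and that the variance is the population variance (the paper does not state which estimator it uses). The helper name `summarize_runs` is ours.

```python
import statistics

def summarize_runs(values):
    """Max/Min/Mean/Var summary in the style of Tables 10 and 11,
    computed over the final fitness values of repeated runs."""
    return (max(values), min(values),
            statistics.mean(values), statistics.pvariance(values))
```

A sample-variance variant would use `statistics.variance` instead; for the run counts typical of such benchmarks, the difference is small.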

Based on the mean value (Mean) on the CEC2014 test functions, DMSDL-QBSA has the best performance on F2, F3, F4, F6, F7, F8, F9, F10, F11, F15, F16, F17, F21, F26, F27, F29, and F30, while DMSDL-BSA shows an advantage on F1, F12,

Figure 1: Fitness value curves of 5 hybrid algorithms (BSA, DE, DMSDL-PSO, DMSDL-BSA, DMSDL-QBSA) on (a) f1 (Sphere), (b) f5 (Step), (c) f6 (Noise), and (d) f9 (Schaffer) (Dim = 2).


F13, F14, F18, F19, F20, F24, F25, and F28. According to the results, we can observe that DMSDL-QBSA finds the minimal value on 17 of the CEC2014 test functions, DMSDL-BSA gets the minimum value on F1, F12, F13, F14, F18, F19, F24, and F30, and DMSDL-PSO obtains the minimum value on F4, F7, and F23. Owing to its enhanced exploitation capability, DMSDL-QBSA is better than DMSDL-BSA and DMSDL-PSO on most functions. From these test results, it can be seen that DMSDL-QBSA performs better than BSA, DE, DMSDL-PSO, and DMSDL-BSA, and it can be concluded that DMSDL-QBSA has better global search ability and better robustness on these test suites.

3. Optimize RF Classification Model Based on Improved BSA Algorithm

3.1. RF Classification Model

RF, as proposed by Breiman et al., is an ensemble learning model based on bagging and random subspace methods. The whole modeling process includes building decision trees and decision processes. The process of constructing decision trees is mainly composed of

Figure 2: Fitness value curves of 5 hybrid algorithms (BSA, DE, DMSDL-PSO, DMSDL-BSA, DMSDL-QBSA) on (a) f12 (Ackley), (b) f13 (Griewank), (c) f17 (Booth), and (d) f18 (Levy) (Dim = 2).


ntree decision trees, each of which consists of nonleaf nodes and leaf nodes. The leaf node is a child node of the node branch. It is supposed that the dataset has M attributes. When each leaf node of the decision tree needs to be segmented, mtry attributes are randomly selected from the M attributes as the reselected splitting variables of this node. This process can be defined as follows:

\[
S_j = \left\{ P_i \mid i = 1, 2, \ldots, m_{\mathrm{try}} \right\},
\tag{18}
\]

where S_j is the splitting variable of the j-th leaf node of the decision tree and P_i is the probability that the mtry reselected attributes are selected as the splitting attribute of the node.
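The random-subspace draw of equation (18) can be sketched as follows; `select_split_attributes` is a hypothetical helper name, assuming a uniform draw without replacement from the M available attributes.

```python
import random

def select_split_attributes(attributes, mtry, rng=None):
    """Random-subspace step of equation (18): draw mtry candidate
    splitting attributes from the M available ones, without replacement."""
    rng = rng or random.Random()
    return rng.sample(attributes, mtry)
```

Each leaf node then compares only these mtry candidates and keeps the best splitter, which is what makes the individual trees of the forest diverse.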

The nonleaf node is a parent node that classifies training data into the left or right child node. The function of the k-th decision tree is as follows:

\[
h_k(c \mid x) =
\begin{cases}
0, & f_l(x_k) < f_l(x_k) + \tau, \\
1, & f_l(x_k) \geq f_l(x_k) + \tau,
\end{cases}
\qquad l = 1, 2, \ldots, n_{\mathrm{tree}},
\tag{19}
\]

where c = 0 or 1: the symbol 0 indicates that the k-th row of data is classified with a negative label, and the symbol 1 indicates that the k-th row of data is classified with a positive label. Here, f_l is the training function of the l-th decision tree based on the splitting variable S, x_k is the k-th row of data in the dataset obtained by random sampling with replacement, and τ is a positive constant used as the threshold value of the training decision.

When the decision processes are trained, each row of data is input into a leaf node of each decision tree. The average of the ntree decision tree classification results is used as the final classification result. This process can be written mathematically as follows:

\[
c_k = l \times \frac{1}{n_{\mathrm{tree}}} \sum_{k=1}^{n_{\mathrm{tree}}} h_k(c \mid x),
\tag{20}
\]

where l is the number of decision trees that judged the k-th row of data as c.
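In effect, the averaging rule of equation (20) amounts to a majority vote over the ntree trees; a minimal sketch (the function name `forest_vote` is ours):

```python
def forest_vote(tree_votes):
    """Final label in the spirit of equation (20): average the 0/1
    votes of the ntree trees and output 1 when more than half vote 1."""
    return 1 if sum(tree_votes) / len(tree_votes) > 0.5 else 0
```

With this strict threshold a tie is resolved to the negative label; the paper does not specify its tie-breaking rule.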

From the above principle, we can see that the two parameters ntree and mtry mainly need to be determined in the RF modeling process. In order to verify the influence of these two parameters on the classification accuracy of the RF classification model, the Ionosphere dataset is used to test the influence of the two parameters on the performance of

Table 4: Comparison on nine unimodal functions with 5 hybrid algorithms (Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

f1

Max 1.0317E+04 1.2466E+04 1.7232E+04 1.0874E+04 7.4436E+01
Min 0 4.0214E+03 1.6580E-02 0 0
Mean 6.6202E+00 4.6622E+03 6.0643E+01 6.3444E+00 3.3920E-02
Var 2.0892E+02 8.2293E+02 7.3868E+02 1.9784E+02 1.2343E+00

f2

Max 3.3278E+02 1.8153E+02 8.0436E+01 4.9554E+02 2.9845E+01
Min 4.9286E-182 1.5889E+01 1.0536E-01 3.0074E-182 0
Mean 5.7340E-02 1.8349E+01 2.9779E+00 6.9700E-02 1.1220E-02
Var 3.5768E+00 4.8296E+00 2.4966E+00 5.1243E+00 4.2864E-01

f3

Max 1.3078E+04 1.3949E+04 1.9382E+04 1.2899E+04 8.4935E+01
Min 3.4873E-250 4.0327E+03 6.5860E-02 1.6352E-249 0
Mean 7.6735E+00 4.6130E+03 8.6149E+01 7.5623E+00 3.3260E-02
Var 2.4929E+02 8.4876E+02 8.1698E+02 2.4169E+02 1.2827E+00

f4

Max 2.5311E+09 2.3900E+09 4.8639E+09 3.7041E+09 3.5739E+08
Min 5.2310E+00 2.7690E+08 8.4802E+00 5.0021E+00 8.9799E+00
Mean 6.9192E+05 3.3334E+08 1.1841E+07 9.9162E+05 6.8518E+04
Var 3.6005E+07 1.4428E+08 1.8149E+08 4.5261E+07 4.0563E+06

f5

Max 1.1619E+04 1.3773E+04 1.6188E+04 1.3194E+04 5.1960E+03
Min 5.5043E-15 5.6109E+03 1.1894E-02 4.2090E-15 1.5157E+00
Mean 5.9547E+00 6.3278E+03 5.2064E+01 6.5198E+00 3.5440E+00
Var 2.0533E+02 9.6605E+02 6.2095E+02 2.2457E+02 7.8983E+01

f6

Max 3.2456E+00 7.3566E+00 8.9320E+00 2.8822E+00 1.4244E+00
Min 1.3994E-04 1.2186E+00 2.2482E-03 8.2911E-05 1.0911E-05
Mean 2.1509E-03 1.4021E+00 1.1982E-01 1.9200E-03 6.1476E-04
Var 5.3780E-02 3.8482E-01 3.5554E-01 5.0940E-02 1.9880E-02

f7

Max 4.7215E+02 6.7534E+02 5.6753E+02 5.3090E+02 2.3468E+02
Min 0 2.2001E+02 5.6300E-02 0 0
Mean 2.4908E-01 2.3377E+02 9.2909E+00 3.0558E-01 9.4500E-02
Var 8.5433E+00 3.3856E+01 2.2424E+01 1.0089E+01 3.5569E+00

f8

Max 3.2500E+02 2.4690E+02 2.7226E+02 2.8001E+02 1.7249E+02
Min 1.4678E-239 8.3483E+01 5.9820E-02 8.9624E-239 0
Mean 1.9072E-01 9.1050E+01 7.9923E+00 2.3232E-01 8.1580E-02
Var 6.3211E+00 1.3811E+01 1.7349E+01 6.4400E+00 2.9531E+00


the RF model, as shown in Figure 3, where the horizontal axes represent ntree and mtry, respectively, and the vertical axes represent the accuracy of the RF classification model.

(1) Parameter analysis of ntree

When the number of predictor variables mtry is set to 6, the number of decision trees ntree is cyclically set from 0 to 1000 at intervals of 20. The evolution of RF classification model accuracy with ntree is shown in Figure 3(a). From the curve, we can see that the accuracy of RF gradually improves as the number of decision trees increases. However, once ntree exceeds a certain value, the improvement in RF performance levels off without obvious gains, while the running time becomes longer.

(2) Parameter analysis of mtry

When the number of decision trees ntree is set to 500, the number of predictor variables mtry is cyclically set from 1 to 32. The limit of mtry is 32 because the Ionosphere dataset has 32 attributes. The obtained curve of RF classification model accuracy against mtry is shown in Figure 3(b). We can see that, as more splitting attributes are selected, the classification performance of RF gradually improves; but when mtry exceeds 9, the RF overfits and its accuracy begins to decrease. The main reason is that, with too many split attributes selected, a large number of decision trees share the same splitting attributes, which reduces the diversity of the decision trees.

Table 5: Comparison on nine unimodal functions with hybrid algorithms (Dim = 2).

Function  Term  BSA  DE  DMSDL-PSO  DMSDL-BSA  DMSDL-QBSA

f1
Max 1.8411E+02 3.4713E+02 2.9347E+02 1.6918E+02 1.6789E+00
Min 0 5.7879E−01 0 0 3.2988E−238
Mean 5.5890E−02 1.1150E+00 1.6095E−01 3.6580E−02 4.2590E−03
Var 2.6628E+00 5.7218E+00 5.4561E+00 2.0867E+00 7.5858E−02

f2
Max 2.2980E+00 2.1935E+00 3.2363E+00 3.1492E+00 1.1020E+00
Min 5.5923E−266 8.2690E−02 2.9096E−242 3.4367E−241 0
Mean 9.2769E−04 9.3960E−02 7.4900E−03 1.2045E−03 2.1565E−04
Var 3.3310E−02 6.9080E−02 4.0100E−02 4.2190E−02 1.3130E−02

f3
Max 1.0647E+02 1.3245E+02 3.6203E+02 2.3793E+02 1.3089E+02
Min 0 5.9950E−01 0 0 0
Mean 2.3040E−02 8.5959E−01 2.5020E−01 6.3560E−02 1.7170E−02
Var 1.2892E+00 2.8747E+00 7.5569E+00 2.9203E+00 1.3518E+00

f4
Max 1.7097E+02 6.1375E+01 6.8210E+01 5.9141E+01 1.4726E+01
Min 1.6325E−21 4.0940E−02 8.2726E−13 3.4830E−25 0
Mean 2.2480E−02 9.3940E−02 1.5730E−02 1.1020E−02 1.4308E−02
Var 1.7987E+00 7.4859E−01 7.6015E−01 6.4984E−01 2.7598E−01

f5
Max 1.5719E+02 2.2513E+02 3.3938E+02 1.8946E+02 8.7078E+01
Min 0 7.0367E−01 0 0 0
Mean 3.4380E−02 1.8850E+00 1.7082E−01 5.0090E−02 1.1880E−02
Var 1.9018E+00 5.6163E+00 5.9868E+00 2.4994E+00 9.1749E−01

f6
Max 1.5887E−01 1.5649E−01 1.5919E−01 1.3461E−01 1.0139E−01
Min 2.5412E−05 4.5060E−04 5.9140E−05 4.1588E−05 7.3524E−06
Mean 2.3437E−04 1.3328E−03 6.0989E−04 2.3462E−04 9.2394E−05
Var 2.4301E−03 3.6700E−03 3.5200E−03 1.9117E−03 1.4664E−03

f7
Max 3.5804E+00 2.8236E+00 1.7372E+00 2.7513E+00 1.9411E+00
Min 0 7.6633E−03 0 0 0
Mean 8.5474E−04 1.6590E−02 8.6701E−04 7.6781E−04 3.1439E−04
Var 4.4630E−02 6.0390E−02 2.4090E−02 3.7520E−02 2.2333E−02

f8
Max 4.3247E+00 2.1924E+00 5.3555E+00 3.3944E+00 5.5079E−01
Min 0 8.6132E−03 0 0 0
Mean 1.1649E−03 1.9330E−02 1.7145E−03 7.3418E−04 6.9138E−05
Var 5.9280E−02 4.7800E−02 7.5810E−02 4.1414E−02 5.7489E−03

f9
Max 2.7030E−02 3.5200E−02 1.7240E−02 4.0480E−02 2.5230E−02
Min 0 5.0732E−03 0 0 0
Mean 6.1701E−05 6.2500E−03 8.8947E−04 8.4870E−05 2.7362E−05
Var 7.6990E−04 1.3062E−03 2.0400E−03 9.6160E−04 5.5610E−04


In summary, for the RF classification model to obtain the ideal optimal solution, the selection of the number of decision trees ntree and the number of predictor variables mtry is very important, and the classification accuracy of the RF classification model can only be optimized by jointly tuning these two parameters. So it is necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we optimize the RF classification model with the improved BSA proposed in Section 2.

3.2. RF Model Based on an Improved Bird Swarm Algorithm. The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) uses the improved bird swarm algorithm to optimize the RF classification model and introduces the training dataset into the training process, finally obtaining the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters, ntree and mtry, as the optimization target of DMSDL-QBSA, so as to obtain a set of parameters that makes the RF classification model achieve the best classification accuracy. The specific algorithm steps are shown in Algorithm 2.
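The two-dimensional fitness function can be sketched as follows. This is a minimal illustration, not the authors' implementation: `make_rf_fitness`, `evaluate_rf`, and `toy_eval` are hypothetical names, and the toy evaluator simply peaks at an assumed optimum of (ntree = 500, mtry = 9).

```python
def make_rf_fitness(evaluate_rf):
    """Wrap an RF evaluator into the 2-D fitness that DMSDL-QBSA minimizes.

    evaluate_rf(ntree, mtry) -> classification accuracy in [0, 1].
    A bird position (x1, x2) is decoded into integer (ntree, mtry);
    fitness is the classification error, so lower is better.
    """
    def fitness(position):
        ntree = max(1, int(round(position[0])))
        mtry = max(1, int(round(position[1])))
        return 1.0 - evaluate_rf(ntree, mtry)
    return fitness

# Toy evaluator with a hypothetical optimum at ntree=500, mtry=9.
def toy_eval(ntree, mtry):
    return 0.95 - 0.0001 * abs(ntree - 500) - 0.005 * abs(mtry - 9)

fitness = make_rf_fitness(toy_eval)
```

In the actual model, `evaluate_rf` would train an RF with the decoded parameters and return its accuracy; the swarm then searches the (ntree, mtry) plane for the minimum error.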

3.3. Simulation Experiment and Analysis. In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare it with the standard RF model, the BSA-RF model, and the DMSDL-BSA-RF model on 8 two-dimensional UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by the improved BSA without quantum behavior. In our experiment, each dataset is divided into two parts: 70% of the dataset is used as the training set and the remaining 30% as the test set. The average classification accuracies of 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.
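The 70/30 split and the averaging over 10 independent runs can be sketched as below; `split_70_30` and `average_accuracy` are hypothetical helper names introduced here for illustration.

```python
import random

def split_70_30(data, seed):
    """Randomly split a dataset into 70% training and 30% test samples."""
    rng = random.Random(seed)
    idx = list(range(len(data)))
    rng.shuffle(idx)
    cut = int(0.7 * len(data))
    train = [data[i] for i in idx[:cut]]
    test = [data[i] for i in idx[cut:]]
    return train, test

def average_accuracy(run_once, n_runs=10):
    """Average the accuracy of n_runs independent runs, as in Table 12."""
    return sum(run_once(seed) for seed in range(n_runs)) / n_runs
```

Each `run_once(seed)` would split the data with that seed, train one of the compared models, and return its test accuracy.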

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model obtains the best accuracy on each UCI dataset except the Magic dataset, on which the DMSDL-BSA-RF classification model obtains the best accuracy. Then, compared with the standard RF model, the DMSDL-QBSA-RF classification model achieves better accuracy, increased by about 10%. Finally, the

Table 6: Comparison on nine multimodal functions with hybrid algorithms (Dim = 10).

Function  Term  BSA  DE  DMSDL-PSO  DMSDL-BSA  DMSDL-QBSA

f10
Max 2.8498E+03 2.8226E+03 3.0564E+03 2.7739E+03 2.8795E+03
Min 1.2553E+02 1.8214E+03 1.2922E+03 1.6446E+02 1.1634E+03
Mean 2.5861E+02 1.9229E+03 1.3185E+03 3.1119E+02 1.2729E+03
Var 2.4093E+02 1.2066E+02 1.2663E+02 2.5060E+02 1.2998E+02

f11
Max 1.2550E+02 1.0899E+02 1.1806E+02 1.1243E+02 9.1376E+01
Min 0 6.3502E+01 1.0751E+01 0 0
Mean 2.0417E−01 6.7394E+01 3.9864E+01 1.3732E−01 6.8060E−02
Var 3.5886E+00 5.8621E+00 1.3570E+01 3.0325E+00 2.0567E+00

f12
Max 2.0021E+01 1.9910E+01 1.9748E+01 1.9254E+01 1.8118E+01
Min 8.8818E−16 1.6575E+01 7.1700E−02 8.8818E−16 8.8818E−16
Mean 3.0500E−02 1.7157E+01 3.0367E+00 3.8520E−02 1.3420E−02
Var 5.8820E−01 5.2968E−01 1.6585E+00 6.4822E−01 4.2888E−01

f13
Max 1.0431E+02 1.3266E+02 1.5115E+02 1.2017E+02 6.1996E+01
Min 0 4.5742E+01 2.1198E−01 0 0
Mean 6.1050E−02 5.2056E+01 3.0613E+00 6.9340E−02 2.9700E−02
Var 1.8258E+00 8.3141E+00 1.5058E+01 2.2452E+00 1.0425E+00

f14
Max 8.4576E+06 3.0442E+07 5.3508E+07 6.2509E+07 8.5231E+06
Min 1.7658E−13 1.9816E+06 4.5685E−05 1.6961E−13 5.1104E−01
Mean 1.3266E+03 3.1857E+06 6.8165E+04 8.8667E+03 1.1326E+03
Var 9.7405E+04 1.4876E+06 1.4622E+06 6.4328E+05 8.7645E+04

f15
Max 1.8310E+08 1.4389E+08 1.8502E+08 1.4578E+08 2.5680E+07
Min 1.7942E−11 1.0497E+07 2.4500E−03 1.1248E−11 9.9870E−01
Mean 3.7089E+04 1.5974E+07 2.0226E+05 3.8852E+04 3.5739E+03
Var 2.0633E+06 1.0724E+07 4.6539E+06 2.1133E+06 2.6488E+05

f16
Max 1.3876E+01 1.4988E+01 1.4849E+01 1.3506E+01 9.3280E+00
Min 4.2410E−174 6.8743E+00 2.5133E−02 7.3524E−176 0
Mean 1.3633E−02 7.2408E+00 2.5045E+00 1.3900E−02 5.1800E−03
Var 3.3567E−01 7.7774E−01 1.0219E+00 3.4678E−01 1.7952E−01

f18
Max 3.6704E+01 3.6950E+01 2.8458E+01 2.6869E+01 2.4435E+01
Min 2.0914E−11 9.7737E+00 3.3997E−03 5.9165E−12 7.5806E−01
Mean 6.5733E−02 1.2351E+01 6.7478E−01 5.6520E−02 7.9392E−01
Var 6.7543E−01 2.8057E+00 1.4666E+00 6.4874E−01 3.6928E−01


DMSDL-QBSA-RF classification model achieves the best accuracy on the Appendicitis dataset, which is up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is valid on most datasets and performs well on them.

4. Oil Layer Classification Application

4.1. Design of Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1: The selection of the actual logging datasets should be intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis, and each dataset should be relatively independent. The dataset is randomly divided into two parts: training and testing samples.

Step 2: In order to better understand the relationship between independent and dependent variables and to reduce the sample information attributes, the continuous attributes of the dataset should be discretized using a greedy algorithm.
Step 3: In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize attribute reduction. After attribute reduction, the actual logging datasets are normalized to avoid computational saturation.
Step 4: In the DMSDL-QBSA-RF layer classification model, we input the actual logging dataset after attribute reduction, train with the DMSDL-QBSA-RF layer classification algorithm, and finally obtain the DMSDL-QBSA-RF layer classification model.
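The normalization in Step 3 can be sketched with a simple min-max scaler; this is an illustrative assumption, since the paper does not spell out its normalization formula, and `min_max_normalize` is a hypothetical helper name.

```python
def min_max_normalize(column):
    """Scale one logging attribute to [0, 1] to avoid computational saturation."""
    lo, hi = min(column), max(column)
    if hi == lo:                      # constant attribute: map to all zeros
        return [0.0 for _ in column]
    return [(v - lo) / (hi - lo) for v in column]
```

Applied column by column to the reduced attributes (e.g., GR, DT, SP for W1), this produces curves of the kind shown in Figure 5.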

Table 7: Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function  Term  BSA  DE  DMSDL-PSO  DMSDL-BSA  DMSDL-QBSA

f10
Max 2.0101E+02 2.8292E+02 2.9899E+02 2.4244E+02 2.8533E+02
Min 2.5455E−05 3.7717E+00 2.3690E+01 2.5455E−05 9.4751E+01
Mean 3.4882E−01 8.0980E+00 2.5222E+01 4.2346E−01 9.4816E+01
Var 5.7922E+00 1.0853E+01 1.5533E+01 5.6138E+00 2.8160E+00

f11
Max 4.7662E+00 4.7784E+00 1.1067E+01 8.7792E+00 8.1665E+00
Min 0 3.8174E−01 0 0 0
Mean 2.7200E−03 6.0050E−01 3.1540E−02 4.2587E−03 3.7800E−03
Var 8.6860E−02 3.1980E−01 2.4862E−01 1.2032E−01 1.3420E−01

f12
Max 9.6893E+00 8.1811E+00 1.1635E+01 9.1576E+00 8.4720E+00
Min 8.8818E−16 5.1646E−01 8.8818E−16 8.8818E−16 8.8818E−16
Mean 9.5600E−03 6.9734E−01 3.4540E−02 9.9600E−03 2.8548E−03
Var 2.1936E−01 6.1050E−01 2.5816E−01 2.1556E−01 1.1804E−01

f13
Max 4.4609E+00 4.9215E+00 4.1160E+00 1.9020E+00 1.6875E+00
Min 0 1.3718E−01 0 0 0
Mean 1.9200E−03 1.7032E−01 1.8240E−02 1.4800E−03 5.7618E−04
Var 6.6900E−02 1.3032E−01 1.8202E−01 3.3900E−02 2.2360E−02

f14
Max 1.0045E+01 1.9266E+03 1.9212E+01 5.7939E+02 8.2650E+00
Min 2.3558E−31 1.3188E−01 2.3558E−31 2.3558E−31 2.3558E−31
Mean 4.1600E−03 3.5402E−01 1.0840E−02 6.1420E−02 1.3924E−03
Var 1.7174E−01 1.9427E+01 3.9528E−01 5.8445E+00 8.7160E−02

f15
Max 6.5797E+04 4.4041E+03 1.4412E+05 8.6107E+03 2.6372E+00
Min 1.3498E−31 9.1580E−02 1.3498E−31 1.3498E−31 7.7800E−03
Mean 7.1736E+00 8.9370E−01 1.7440E+01 9.0066E−01 8.2551E−03
Var 6.7678E+02 5.4800E+01 1.4742E+03 8.7683E+01 2.8820E−02

f16
Max 6.2468E−01 6.4488E−01 5.1564E−01 8.4452E−01 3.9560E−01
Min 6.9981E−08 2.5000E−03 1.5518E−240 2.7655E−07 0
Mean 2.7062E−04 6.9400E−03 6.8555E−04 2.0497E−04 6.1996E−05
Var 1.0380E−02 1.7520E−02 8.4600E−03 1.0140E−02 4.4000E−03

f17
Max 5.1946E+00 3.6014E+00 2.3463E+00 6.9106E+00 1.2521E+00
Min 2.6445E−11 2.6739E−02 0 1.0855E−10 0
Mean 1.9343E−03 5.1800E−02 1.2245E−03 2.8193E−03 1.5138E−04
Var 7.3540E−02 1.2590E−01 4.1620E−02 1.1506E−01 1.2699E−02

f18
Max 5.0214E−01 3.4034E−01 4.1400E−01 3.7422E−01 4.0295E−01
Min 1.4998E−32 1.9167E−03 1.4998E−32 1.4998E−32 1.4998E−32
Mean 1.0967E−04 4.1000E−03 1.8998E−04 1.4147E−04 6.0718E−05
Var 6.1800E−03 1.0500E−02 6.5200E−03 5.7200E−03 4.4014E−03


Step 5: The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and the classification results are output.

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA was simulated and analyzed on benchmark functions, and in Section 3.3, the effectiveness of the RF classification model optimized by the proposed DMSDL-QBSA was tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted, recorded as W1, W2, and W3. W1 is a gas well in Xi'an (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depths and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before the training of the DMSDL-QBSA-RF classification model on the training dataset, as shown in Table 14. Then these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, the SVM model, the BSA-RF model, and the DMSDL-BSA-RF model. Here, the RF classification model is applied to the field of logging for the first time. In order to evaluate the performance of the recognition model, we select the following performance indicators:

RMSE = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(f_i - y_i\right)^2}, \qquad MAE = \frac{1}{N}\sum_{i=1}^{N}\left|f_i - y_i\right|,   (21)
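Equation (21) translates directly into code; the sketch below follows the paper's convention that f_i is the expected output and y_i the classification output.

```python
from math import sqrt

def rmse(f, y):
    """Root-mean-square error between expected outputs f and model outputs y."""
    n = len(f)
    return sqrt(sum((fi - yi) ** 2 for fi, yi in zip(f, y)) / n)

def mae(f, y):
    """Mean absolute error between expected outputs f and model outputs y."""
    n = len(f)
    return sum(abs(fi - yi) for fi, yi in zip(f, y)) / n
```

For binary 0/1 layer labels, MAE coincides with the misclassification rate, which is why smaller RMSE and MAE indicate a better classification model.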

Table 8: Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function  Term  GWO  WOA  SCA  GOA  SSA  DMSDL-QBSA

f1
Max 1.3396E+04 1.4767E+04 1.3310E+04 2.0099E+01 4.8745E+03 9.8570E−01
Min 0 0 4.2905E−293 8.6468E−17 0 0
Mean 3.5990E+00 4.7621E+00 1.4014E+02 7.0100E−02 4.5864E+00 1.0483E−04
Var 1.7645E+02 2.0419E+02 8.5054E+02 4.4200E−01 1.4148E+02 9.8725E−03

f2
Max 3.6021E+02 2.5789E+03 6.5027E+01 9.3479E+01 3.4359E+01 3.2313E−01
Min 0 0 9.8354E−192 2.8954E−03 2.4642E−181 0
Mean 5.0667E−02 2.9480E−01 4.0760E−01 3.1406E+00 1.7000E−02 1.1278E−04
Var 3.7270E+00 2.6091E+01 2.2746E+00 3.9264E+00 5.4370E−01 4.8000E−03

f3
Max 1.8041E+04 1.6789E+04 2.4921E+04 6.5697E+03 1.1382E+04 4.0855E+01
Min 0 1.0581E−18 7.6116E−133 2.8796E−01 3.2956E−253 1.5918E−264
Mean 7.4511E+00 2.8838E+02 4.0693E+02 4.8472E+02 9.2062E+00 4.8381E−03
Var 2.6124E+02 1.4642E+03 1.5913E+03 7.1786E+02 2.9107E+02 4.1383E−01

f4
Max 2.1812E+09 5.4706E+09 8.4019E+09 1.1942E+09 4.9386E+08 7.5188E+01
Min 4.9125E+00 3.5695E+00 5.9559E+00 2.2249E+02 4.9806E+00 2.9279E−13
Mean 4.9592E+05 2.4802E+06 4.4489E+07 5.0021E+06 3.3374E+05 2.4033E−02
Var 2.9484E+07 1.1616E+08 4.5682E+08 3.9698E+07 1.1952E+07 9.7253E−01

f5
Max 1.8222E+04 1.5374E+04 1.5874E+04 1.2132E+03 1.6361E+04 1.8007E+00
Min 1.1334E−08 8.3228E−09 2.3971E−01 2.7566E−10 2.6159E−16 1.0272E−33
Mean 5.1332E+00 5.9967E+00 1.2620E+02 5.8321E+01 8.8985E+00 2.3963E−04
Var 2.3617E+02 2.3285E+02 8.8155E+02 1.0872E+02 2.9986E+02 1.8500E−02

f6
Max 7.4088E+00 8.3047E+00 8.8101E+00 6.8900E−01 4.4298E+00 2.5787E−01
Min 1.8112E−05 3.9349E−05 4.8350E−05 8.9528E−02 4.0807E−05 1.0734E−04
Mean 1.8333E−03 3.2667E−03 4.8400E−02 9.3300E−02 2.1000E−03 5.2825E−04
Var 9.4267E−02 1.0077E−01 2.9410E−01 3.5900E−02 6.5500E−02 4.6000E−03

f7
Max 7.3626E+02 6.8488E+02 8.0796E+02 3.9241E+02 8.2036E+02 1.4770E+01
Min 0 0 1.9441E−292 7.9956E−07 0 0
Mean 2.0490E−01 2.8060E−01 4.9889E+00 1.6572E+01 2.7290E−01 1.8081E−03
Var 9.5155E+00 1.0152E+01 3.5531E+01 2.3058E+01 1.0581E+01 1.6033E−01

f8
Max 1.2749E+03 5.9740E+02 3.2527E+02 2.3425E+02 2.0300E+02 2.0423E+01
Min 0 4.3596E−35 1.5241E−160 3.6588E−05 1.0239E−244 0
Mean 3.1317E−01 1.0582E+01 1.0457E+01 1.2497E+01 2.1870E−01 2.6947E−03
Var 1.4416E+01 4.3485E+01 3.5021E+01 2.5766E+01 6.2362E+00 2.1290E−01


where y_i and f_i are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show the actual forecasting errors. Table 15 records the performance indicator data of each classification model, with the best results marked in bold. The smaller the RMSE and MAE, the better the classification model performs.

From the performance indicator data in Table 15, we can see that the DMSDL-QBSA-RF classification model achieves the best recognition accuracy, with all accuracies above 90%. The recognition accuracy of the proposed model for W3 reaches 99.73%, and the model also shows superior performance on the other indicators and wells. Secondly, DMSDL-QBSA can improve the performance of RF: the parameters found by DMSDL-QBSA allow the RF classification model to improve its classification accuracy while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than those of the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means that a depth has no oil or gas, and 1 means that it does.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results: it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is therefore suitable for petroleum logging applications, greatly reduces the difficulty of oil exploration, and has good application prospects.

Table 9: Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function  Term  GWO  WOA  SCA  GOA  SSA  DMSDL-QBSA

f10
Max 2.9544E+03 2.9903E+03 2.7629E+03 2.9445E+03 3.2180E+03 3.0032E+03
Min 1.3037E+03 8.6892E−04 1.5713E+03 1.3438E+03 1.2839E−04 1.2922E+03
Mean 1.6053E+03 1.4339E+01 1.7860E+03 1.8562E+03 2.5055E+02 1.2960E+03
Var 2.6594E+02 1.2243E+02 1.6564E+02 5.1605E+02 3.3099E+02 5.8691E+01

f11
Max 1.3792E+02 1.2293E+02 1.2313E+02 1.1249E+02 3.2180E+03 1.8455E+01
Min 0 0 0 1.2437E+01 1.2839E−04 0
Mean 4.4220E−01 1.1252E+00 9.9316E+00 2.8378E+01 2.5055E+02 3.2676E−03
Var 4.3784E+00 6.5162E+00 1.9180E+01 1.6240E+01 3.3099E+02 1.9867E−01

f12
Max 2.0257E+01 2.0043E+01 1.9440E+01 1.6623E+01 3.2180E+03 1.9113E+00
Min 4.4409E−15 3.2567E−15 3.2567E−15 2.3168E+00 1.2839E−04 8.8818E−16
Mean 1.7200E−02 4.2200E−02 8.8870E−01 5.5339E+00 2.5055E+02 3.1275E−04
Var 4.7080E−01 6.4937E−01 3.0887E+00 2.8866E+00 3.3099E+02 2.2433E−02

f13
Max 1.5246E+02 1.6106E+02 1.1187E+02 6.1505E+01 3.2180E+03 3.1560E−01
Min 3.3000E−03 0 0 2.4147E−01 1.2839E−04 0
Mean 4.6733E−02 9.3867E−02 1.2094E+00 3.7540E+00 2.5055E+02 3.3660E−05
Var 1.9297E+00 2.6570E+00 6.8476E+00 4.1936E+00 3.3099E+02 3.1721E−03

f14
Max 9.5993E+07 9.9026E+07 5.9355E+07 6.1674E+06 3.2180E+03 6.4903E−01
Min 3.8394E−09 1.1749E−08 9.6787E−03 1.8099E−04 1.2839E−04 4.7116E−32
Mean 1.2033E+04 3.5007E+04 4.8303E+05 1.0465E+04 2.5055E+02 8.9321E−05
Var 9.8272E+05 1.5889E+06 4.0068E+06 1.9887E+05 3.3099E+02 6.8667E−03

f15
Max 2.2691E+08 2.4717E+08 1.1346E+08 2.8101E+07 3.2180E+03 1.6407E−01
Min 3.2467E−02 4.5345E−08 1.1922E−01 3.5465E−05 1.2839E−04 1.3498E−32
Mean 2.9011E+04 4.3873E+04 6.5529E+05 7.2504E+04 2.5055E+02 6.7357E−05
Var 2.3526E+06 2.7453E+06 7.1864E+06 1.2814E+06 3.3099E+02 2.4333E−03

f16
Max 1.7692E+01 1.7142E+01 1.6087E+01 8.7570E+00 3.2180E+03 1.0959E+00
Min 2.6210E−07 0.0000E+00 6.2663E−155 1.0497E−02 1.2839E−04 0
Mean 1.0133E−02 3.9073E−01 5.9003E−01 2.4770E+00 2.5055E+02 1.5200E−04
Var 3.0110E−01 9.6267E−01 1.4701E+00 1.9985E+00 3.3099E+02 1.1633E−02

f18
Max 4.4776E+01 4.3588E+01 3.9095E+01 1.7041E+01 3.2180E+03 6.5613E−01
Min 1.9360E−01 9.4058E−08 2.2666E−01 3.9111E+00 1.2839E−04 1.4998E−32
Mean 2.1563E−01 4.7800E−02 9.8357E−01 5.4021E+00 2.5055E+02 9.4518E−05
Var 5.8130E−01 1.0434E+00 3.0643E+00 1.6674E+00 3.3099E+02 7.0251E−03


Table 10: Comparison of numerical testing results on CEC2014 test sets (F1–F15, Dim = 10).

Function  Term  BSA  DE  DMSDL-PSO  DMSDL-BSA  DMSDL-QBSA

F1
Max 2.7411E+08 3.6316E+09 6.8993E+08 3.9664E+08 9.6209E+08
Min 1.4794E+07 3.1738E+09 1.3205E+06 1.6929E+05 1.7687E+05
Mean 3.1107E+07 3.2020E+09 6.7081E+06 1.3567E+06 1.9320E+06
Var 1.2900E+07 4.0848E+07 2.6376E+07 1.0236E+07 1.3093E+07

F2
Max 8.7763E+09 1.3597E+10 1.3515E+10 1.6907E+10 1.7326E+10
Min 1.7455E+09 1.1878E+10 3.6982E+04 2.1268E+05 1.4074E+05
Mean 1.9206E+09 1.1951E+10 1.7449E+08 5.9354E+07 4.8412E+07
Var 1.9900E+08 1.5615E+08 9.2365E+08 5.1642E+08 4.2984E+08

F3
Max 4.8974E+05 8.5863E+04 2.6323E+06 2.4742E+06 5.3828E+06
Min 1.1067E+04 1.4616E+04 1.8878E+03 1.2967E+03 6.3986E+02
Mean 1.3286E+04 1.4976E+04 1.0451E+04 3.5184E+03 3.0848E+03
Var 6.5283E+03 1.9988E+03 4.5736E+04 4.1580E+04 8.0572E+04

F4
Max 3.5252E+03 9.7330E+03 4.5679E+03 4.9730E+03 4.3338E+03
Min 5.0446E+02 8.4482E+03 4.0222E+02 4.0843E+02 4.1267E+02
Mean 5.8061E+02 8.5355E+03 4.4199E+02 4.3908E+02 4.3621E+02
Var 1.4590E+02 1.3305E+02 1.6766E+02 1.2212E+02 8.5548E+01

F5
Max 5.2111E+02 5.2106E+02 5.2075E+02 5.2098E+02 5.2110E+02
Min 5.2001E+02 5.2038E+02 5.2027E+02 5.2006E+02 5.2007E+02
Mean 5.2003E+02 5.2041E+02 5.2033E+02 5.2014E+02 5.2014E+02
Var 7.8380E−02 5.7620E−02 5.6433E−02 1.0577E−01 9.9700E−02

F6
Max 6.1243E+02 6.1299E+02 6.1374E+02 6.1424E+02 6.1569E+02
Min 6.0881E+02 6.1157E+02 6.0514E+02 6.0288E+02 6.0257E+02
Mean 6.0904E+02 6.1164E+02 6.0604E+02 6.0401E+02 6.0358E+02
Var 2.5608E−01 1.2632E−01 1.0434E+00 1.1717E+00 1.3117E+00

F7
Max 8.7895E+02 1.0459E+03 9.1355E+02 1.0029E+03 9.3907E+02
Min 7.4203E+02 1.0238E+03 7.0013E+02 7.0081E+02 7.0069E+02
Mean 7.4322E+02 1.0253E+03 7.1184E+02 7.0332E+02 7.0290E+02
Var 4.3160E+00 2.4258E+00 2.8075E+01 1.5135E+01 1.3815E+01

F8
Max 8.9972E+02 9.1783E+02 9.6259E+02 9.2720E+02 9.3391E+02
Min 8.4904E+02 8.8172E+02 8.3615E+02 8.1042E+02 8.0877E+02
Mean 8.5087E+02 8.8406E+02 8.5213E+02 8.1888E+02 8.1676E+02
Var 2.1797E+00 4.4494E+00 8.6249E+00 9.2595E+00 9.7968E+00

F9
Max 1.0082E+03 9.9538E+02 1.0239E+03 1.0146E+03 1.0366E+03
Min 9.3598E+02 9.6805E+02 9.2725E+02 9.2062E+02 9.1537E+02
Mean 9.3902E+02 9.7034E+02 9.4290E+02 9.2754E+02 9.2335E+02
Var 3.4172E+00 3.2502E+00 8.1321E+00 9.0492E+00 9.3860E+00

F10
Max 3.1087E+03 3.5943E+03 3.9105E+03 3.9116E+03 3.4795E+03
Min 2.2958E+03 2.8792E+03 1.5807E+03 1.4802E+03 1.4316E+03
Mean 1.8668E+03 2.9172E+03 1.7627E+03 1.6336E+03 1.6111E+03
Var 3.2703E+01 6.1787E+01 2.2359E+02 1.9535E+02 2.2195E+02

F11
Max 3.6641E+03 3.4628E+03 4.0593E+03 3.5263E+03 3.7357E+03
Min 2.4726E+03 2.8210E+03 1.7855E+03 1.4012E+03 1.2811E+03
Mean 2.5553E+03 2.8614E+03 1.8532E+03 1.5790E+03 1.4869E+03
Var 8.4488E+01 5.6351E+01 1.3201E+02 2.1085E+02 2.6682E+02

F12
Max 1.2044E+03 1.2051E+03 1.2053E+03 1.2055E+03 1.2044E+03
Min 1.2006E+03 1.2014E+03 1.2004E+03 1.2003E+03 1.2017E+03
Mean 1.2007E+03 1.2017E+03 1.2007E+03 1.2003E+03 1.2018E+03
Var 1.4482E−01 3.5583E−01 2.5873E−01 2.4643E−01 2.1603E−01

F13
Max 1.3049E+03 1.3072E+03 1.3073E+03 1.3056E+03 1.3061E+03
Min 1.3005E+03 1.3067E+03 1.3009E+03 1.3003E+03 1.3004E+03
Mean 1.3006E+03 1.3068E+03 1.3011E+03 1.3005E+03 1.3006E+03
Var 3.2018E−01 5.0767E−02 4.2173E−01 3.6570E−01 2.9913E−01

F14
Max 1.4395E+03 1.4565E+03 1.4775E+03 1.4749E+03 1.4493E+03
Min 1.4067E+03 1.4522E+03 1.4009E+03 1.4003E+03 1.4005E+03
Mean 1.4079E+03 1.4529E+03 1.4024E+03 1.4008E+03 1.4009E+03
Var 2.1699E+00 6.3013E−01 5.3198E+00 3.2578E+00 2.8527E+00

F15
Max 5.4068E+04 3.3586E+04 4.8370E+05 4.0007E+05 1.9050E+05
Min 1.9611E+03 2.1347E+04 1.5029E+03 1.5027E+03 1.5026E+03
Mean 1.9933E+03 2.2417E+04 2.7920E+03 1.5860E+03 1.5816E+03
Var 6.1622E+02 1.2832E+03 1.7802E+04 4.4091E+03 2.7233E+03


Table 11: Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10).

Function  Term  BSA  DE  DMSDL-PSO  DMSDL-BSA  DMSDL-QBSA

F16
Max 1.6044E+03 1.6044E+03 1.6046E+03 1.6044E+03 1.6046E+03
Min 1.6034E+03 1.6041E+03 1.6028E+03 1.6021E+03 1.6019E+03
Mean 1.6034E+03 1.6041E+03 1.6032E+03 1.6024E+03 1.6024E+03
Var 3.3300E−02 3.5900E−02 2.5510E−01 3.7540E−01 3.6883E−01

F17
Max 1.3711E+07 1.6481E+06 3.3071E+07 4.6525E+07 8.4770E+06
Min 1.2526E+05 4.7216E+05 2.7261E+03 2.0177E+03 2.0097E+03
Mean 1.6342E+05 4.8499E+05 8.7769E+04 3.1084E+04 2.1095E+04
Var 2.0194E+05 2.9949E+04 1.2284E+06 8.7638E+05 2.0329E+05

F18
Max 7.7173E+07 3.5371E+07 6.1684E+08 1.4216E+08 6.6050E+08
Min 4.1934E+03 2.6103E+06 2.2743E+03 1.8192E+03 1.8288E+03
Mean 1.6509E+04 3.4523E+06 1.2227E+06 1.0781E+05 1.9475E+05
Var 7.9108E+05 1.4888E+06 2.1334E+07 3.6818E+06 1.0139E+07

F19
Max 1.9851E+03 2.5657E+03 2.0875E+03 1.9872E+03 1.9555E+03
Min 1.9292E+03 2.4816E+03 1.9027E+03 1.9023E+03 1.9028E+03
Mean 1.9299E+03 2.4834E+03 1.9044E+03 1.9032E+03 1.9036E+03
Var 1.0820E+00 3.8009E+00 1.1111E+01 3.3514E+00 1.4209E+00

F20
Max 8.3021E+06 2.0160E+07 1.0350E+07 6.1162E+07 1.2708E+08
Min 5.6288E+03 1.7838E+06 2.1570E+03 2.0408E+03 2.0241E+03
Mean 1.2260E+04 1.8138E+06 5.6957E+03 1.1337E+04 4.5834E+04
Var 1.0918E+05 5.9134E+05 1.3819E+05 6.4167E+05 2.1988E+06

F21
Max 2.4495E+06 1.7278E+09 2.0322E+06 1.3473E+07 2.6897E+07
Min 4.9016E+03 1.4049E+09 3.3699E+03 2.1842E+03 2.2314E+03
Mean 6.6613E+03 1.4153E+09 9.9472E+03 1.3972E+04 7.4587E+03
Var 4.1702E+04 2.2557E+07 3.3942E+04 2.5098E+05 2.8735E+05

F22
Max 2.8304E+03 4.9894E+03 3.1817E+03 3.1865E+03 3.2211E+03
Min 2.5070E+03 4.1011E+03 2.3492E+03 2.2442E+03 2.2314E+03
Mean 2.5081E+03 4.1175E+03 2.3694E+03 2.2962E+03 2.2687E+03
Var 5.4064E+00 3.8524E+01 4.3029E+01 4.8006E+01 5.1234E+01

F23
Max 2.8890E+03 2.6834E+03 2.8758E+03 3.0065E+03 2.9923E+03
Min 2.5000E+03 2.6031E+03 2.4870E+03 2.5000E+03 2.5000E+03
Mean 2.5004E+03 2.6088E+03 2.5326E+03 2.5010E+03 2.5015E+03
Var 9.9486E+00 8.6432E+00 5.7045E+01 1.4213E+01 1.7156E+01

F24
Max 2.6293E+03 2.6074E+03 2.6565E+03 2.6491E+03 2.6369E+03
Min 2.5816E+03 2.6049E+03 2.5557E+03 2.5246E+03 2.5251E+03
Mean 2.5829E+03 2.6052E+03 2.5671E+03 2.5337E+03 2.5338E+03
Var 1.3640E+00 3.3490E−01 8.9434E+00 1.3050E+01 1.1715E+01

F25
Max 2.7133E+03 2.7014E+03 2.7445E+03 2.7122E+03 2.7327E+03
Min 2.6996E+03 2.7003E+03 2.6784E+03 2.6789E+03 2.6635E+03
Mean 2.6996E+03 2.7004E+03 2.6831E+03 2.6903E+03 2.6894E+03
Var 2.3283E−01 9.9333E−02 4.9609E+00 7.1175E+00 1.2571E+01

F26
Max 2.7056E+03 2.8003E+03 2.7058E+03 2.7447E+03 2.7116E+03
Min 2.7008E+03 2.8000E+03 2.7003E+03 2.7002E+03 2.7002E+03
Mean 2.7010E+03 2.8000E+03 2.7005E+03 2.7003E+03 2.7003E+03
Var 3.1316E−01 1.6500E−02 3.9700E−01 1.1168E+00 3.5003E−01

F27
Max 3.4052E+03 5.7614E+03 3.3229E+03 3.4188E+03 3.4144E+03
Min 2.9000E+03 3.9113E+03 2.9698E+03 2.8347E+03 2.7054E+03
Mean 2.9038E+03 4.0351E+03 2.9816E+03 2.8668E+03 2.7219E+03
Var 3.2449E+01 2.0673E+02 3.4400E+01 6.2696E+01 6.1325E+01

F28
Max 4.4333E+03 5.4138E+03 4.5480E+03 4.3490E+03 4.8154E+03
Min 3.0000E+03 4.0092E+03 3.4908E+03 3.0000E+03 3.0000E+03
Mean 3.0079E+03 4.0606E+03 3.6004E+03 3.0063E+03 3.0065E+03
Var 6.8101E+01 9.2507E+01 8.5705E+01 4.2080E+01 5.3483E+01

F29
Max 2.5038E+07 1.6181E+08 5.9096E+07 7.0928E+07 6.4392E+07
Min 3.2066E+03 1.0014E+08 3.1755E+03 3.1005E+03 3.3287E+03
Mean 5.7057E+04 1.0476E+08 8.5591E+04 6.8388E+04 2.8663E+04
Var 4.9839E+05 6.7995E+06 1.7272E+06 1.5808E+06 8.6740E+05


Figure 3: The effect of the two parameters on the performance of RF models: (a) the effect of ntree (accuracy versus ntree, 0–1000); (b) the effect of mtry (accuracy versus mtry, 0–35).

Variable setting: number of iterations iter; bird positions Xi; local optimum Pi; global optimal position gbest; global optimum Pgbest; out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF
(1) Begin
(2) /* Build classification model based on DMSDL-QBSA-RF */
(3) Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4) Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5) While iter < itermax + 1 do
(6)   For i = 1 : ntree
(7)     Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)     Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)     Recursively generate each decision tree without pruning operations
(10)  End For
(11)  Update classification accuracy of RF: evaluate f(Xi)
(12)  Update gbest and Pgbest
(13)  [nbest, mbest] = gbest
(14)  iter = iter + 1
(15) End While
(16) /* Classify using RF model */
(17) For i = 1 : nbest
(18)   Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)   Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)   Recursively generate each decision tree without pruning operations
(21) End
(22) Return DMSDL-QBSA-RF classification model
(23) Classify the test dataset using equation (20)
(24) Calculate OOB_error
(25) End

Algorithm 2: DMSDL-QBSA-RF classification model.
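The bootstrap sampling and random attribute selection used when growing each tree in Algorithm 2 can be sketched as follows; these are hypothetical helper functions for illustration, not the authors' code.

```python
import random

def bootstrap_sample(dataset, rng=random):
    """Draw a training set of the same size by random sampling with replacement."""
    n = len(dataset)
    return [dataset[rng.randrange(n)] for _ in range(n)]

def random_attribute_subset(attributes, mtry, rng=random):
    """Pick mtry candidate split attributes at a node, without replacement."""
    return rng.sample(attributes, mtry)
```

Samples left out of a tree's bootstrap sample (roughly one third of the data on average) are the out-of-bag samples used to compute OOB_error in step (24).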

Table 11: Continued.

Function  Term  BSA  DE  DMSDL-PSO  DMSDL-BSA  DMSDL-QBSA

F30
Max 5.2623E+05 2.8922E+07 1.1938E+06 1.2245E+06 1.1393E+06
Min 5.9886E+03 1.9017E+07 3.7874E+03 3.6290E+03 3.6416E+03
Mean 7.4148E+03 2.0002E+07 5.5468E+03 4.3605E+03 4.1746E+03
Var 1.1434E+04 1.1968E+06 3.2255E+04 2.2987E+04 1.8202E+04


Figure 4: Block diagram of the oil layer classification system based on DMSDL-QBSA-RF (information to be identified → data selection and preprocessing, removing redundant attributes → attribute discretization → attribute reduction and data normalization → model building based on DMSDL-QBSA-RF → RF classification → output).

Table 13: Distribution of the logging data.

Well  Training dataset                                   Test dataset
      Depth (m)   Data  Oil/gas layers  Dry layers       Depth (m)   Data  Oil/gas layers  Dry layers
W1    3027~3058   250   203             47               3250~3450   1600  237             1363
W2    2642~2648   30    10              20               2940~2978   191   99              92
W3    1040~1059   160   47              113              1120~1290   1114  96              1018

Table 12: Comparison of numerical testing results on eight UCI datasets.

Dataset        RF (%)  BSA-RF (%)  DMSDL-BSA-RF (%)  DMSDL-QBSA-RF (%)
Blood          75.40   80.80       79.46             81.25
Heart-statlog  82.10   86.42       86.42             86.42
Sonar          78.39   85.48       85.48             88.71
Appendicitis   83.23   87.10       90.32             93.55
Cleve          79.55   87.50       88.64             89.77
Magic          88.95   89.85       90.25             89.32
Mammographic   74.73   77.68       76.79             79.46
Australian     85.83   88.84       88.84             90.78

Table 14: Attribute reduction results of the logging dataset.

Well  Attributes
W1    Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
      Reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2    Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
      Reduction attributes: NEUTRO, PERM, POR, RESI, SW
W3    Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
      Reduction attributes: AC, GR, RI, RXO, SP

[Figure 5(a) and 5(b): normalized curves of the W1 attributes (DT, SP, LLD, NPHI, GR, LLS, DEN) versus depth, 3250–3450 m.]


1

08

06

04

02

0

Nor

mal

ized

val

ue

Depth (m)2940 2945 2950 2955 2960 2965 2970 2975 2980

NEUTRO

PERM

(c)

Depth (m)

1

08

06

04

Nor

mal

ized

val

ue

02

02940 2945 2950 2955 2960 2965 2970 2975 2980

RESI

SW

POR

(d)

Nor

mal

ized

val

ue

1150 1200 1250 1300Depth (m)

1

08

06

04

02

0

GR

SPRI

(e)

1150 1200 1250 1300Depth (m)

Nor

mal

ized

val

ue

1

08

06

04

02

0

RXO

AC

(f )

Figure 5 +e normalized curves of attributes (a) and (b) attribute normalization ofW1 (c) and (d) attribute normalization ofW2 (e) and(f) attribute normalization of W3

Table 15: Performance of various well data.

Well  Classification model  RMSE    MAE       Accuracy (%)  Running time (s)
W1    RF                    0.3326  0.1106    88.94         1.5167
W1    SVM                   0.2681  0.0719    92.81         4.5861
W1    BSA-RF                0.3269  0.1069    89.31         1.8064
W1    DMSDL-BSA-RF          0.2806  0.0788    92.13         2.3728
W1    DMSDL-QBSA-RF         0.2449  0.0600    94.00         1.4663
W2    RF                    0.4219  0.1780    82.20         3.1579
W2    SVM                   0.2983  0.0890    91.10         4.2604
W2    BSA-RF                0.3963  0.1571    84.29         1.2979
W2    DMSDL-BSA-RF          0.2506  0.062827  93.72         1.6124
W2    DMSDL-QBSA-RF         0.2399  0.0576    94.24         1.2287
W3    RF                    0.4028  0.1622    83.78         2.4971
W3    SVM                   0.2507  0.0628    93.72         2.1027
W3    BSA-RF                0.3631  0.1318    86.81         1.3791
W3    DMSDL-BSA-RF          0.2341  0.0548    94.52         0.3125
W3    DMSDL-QBSA-RF         0.0519  0.0027    99.73         0.9513


5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of these three strategies on the performance of DMSDL-QBSA revealed that the hybrid method markedly improves the original BSA. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and DMSDL-QBSA also shows excellent performance on these functions. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, which is much higher than that of the original RF model. At the same time, it runs faster than the other four advanced classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Because of the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA; therefore, improving its recognition efficiency is a worthwhile direction. In future research, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), the Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), the Key Research and Development Project of Hebei Province, China (nos. 19210404D and 20351802D), the Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and the Project of Industry-University Cooperative Education of the Ministry of Education of China (no. 201801335014).

Figure 6: Classification results of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) the DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) the DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) the DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.

[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.

[4] X. Li, A new intelligent optimization method: artificial fish school algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.

[5] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.

[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.

[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673–687, 2016.

[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University-Computer and Information Sciences, vol. 32, pp. 514–521, 2020.

[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," Neural Computing and Applications, Springer, New York, NY, USA, 2019.

[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.

[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.

[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197–207, 2014.

[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.

[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17–28, 2014.

[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137–146, 2012.

[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, Multi-Strategy Brainstorm Optimization Algorithm with Dynamic Parameters Adjustment, Applied Intelligence, Guildford, UK, 2020.

[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189–120201, 2019.

[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.

[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.

[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.

[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.

[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.

[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.

[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.

[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.

[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," IEEE Transactions on Evolutionary Computation, 2019.

[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.

[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.

[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.

[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.

[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.

[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.

[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.

[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.

[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.

[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China Information Sciences, vol. 62, no. 7, 2019.

[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.

[41] S. Saremi, S. Mirjalili, and A. Lewis, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.

[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22–34, 2020.

[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.



From the test results in Table 8, we can see that DMSDL-QBSA has the best performance on each unimodal function. GWO finds the value 0 on f1, f2, f3, f7, and f8; WOA obtains 0 on f1, f2, and f7; SSA works best on f1 and f7. For the multimodal function evaluations, Table 9 shows that DMSDL-QBSA has the best performance on f11, f12, f13, f14, f15, f16, and f18; SSA has the best performance on f10; GWO reaches the minimum on f11; WOA and SCA obtain the optimal value on f11 and f13. Obviously, compared with these popular algorithms, DMSDL-QBSA is a competitive algorithm for solving several functions, and the swarm intelligence algorithms perform better than the other algorithms. The results of Tables 8 and 9 show that DMSDL-QBSA has the best performance on most of the tested benchmark functions.

2.3.5. Comparison on CEC2014 Test Functions with Hybrid Algorithms. To compare the comprehensive performance of the proposed DMSDL-QBSA with several hybrid algorithms, such as BSA, DE, DMSDL-PSO, and DMSDL-BSA, 30 CEC2014 test functions are applied. In this experiment, the dimension size (Dim) is set to 10, and the number of function evaluations (FEs) is 100,000. Experimental comparisons, including the maximum value (Max), the minimum value (Min), the mean value (Mean), and the variance (Var), are given in Tables 10 and 11, where the best results are marked in bold.

Based on the mean value (Mean) on the CEC2014 test functions, DMSDL-QBSA has the best performance on F2, F3, F4, F6, F7, F8, F9, F10, F11, F15, F16, F17, F21, F26, F27, F29, and F30, while DMSDL-BSA shows an advantage on F1, F12,

Figure 1: Fitness value curves of the 5 hybrid algorithms (BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA) on (a) f1 (Sphere), (b) f5 (Step), (c) f6 (Noise), and (d) f9 (Schaffer) (Dim = 2).


F13, F14, F18, F19, F20, F24, F25, and F28. According to the results, DMSDL-QBSA finds the minimal value on 17 CEC2014 test functions, DMSDL-BSA gets the minimum value on F1, F12, F13, F14, F18, F19, F24, and F30, and DMSDL-PSO obtains the minimum value on F4, F7, and F23. Owing to its enhanced exploitation capability, DMSDL-QBSA is better than DMSDL-BSA and DMSDL-PSO on most functions. From these results, it can be seen that DMSDL-QBSA performs better than BSA, DE, DMSDL-PSO, and DMSDL-BSA, and it can be concluded that DMSDL-QBSA has better global search ability and better robustness on these test suites.

3. Optimizing the RF Classification Model Based on the Improved BSA

3.1. RF Classification Model. RF, as proposed by Breiman [22], is an ensemble learning model based on bagging and random subspace methods. The whole modeling process includes building decision trees and decision processes. The process of constructing decision trees is mainly composed of

Figure 2: Fitness value curves of the 5 hybrid algorithms on (a) f12 (Ackley), (b) f13 (Griewank), (c) f17 (Booth), and (d) f18 (Levy) (Dim = 2).


ntree decision trees, each of which consists of nonleaf nodes and leaf nodes. A leaf node is a child node of a node branch. Suppose the dataset has M attributes. When a leaf node of a decision tree needs to be segmented, mtry attributes are randomly selected from the M attributes as the reselected splitting variables of this node. This process can be defined as follows:

S_j = { P_i | i = 1, 2, ..., m_try },  (18)

where S_j is the splitting variable of the j-th leaf node of the decision tree and P_i is the probability that the mtry reselected attributes are selected as the splitting attribute of the node.
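Equation (18) amounts to drawing mtry candidate attributes without replacement at each split. A minimal sketch in Python (the function name and the use of `random.sample` are illustrative assumptions, not the paper's implementation):

```python
import random

def select_split_candidates(attributes, mtry, rng=None):
    """Draw the mtry candidate splitting attributes for one leaf node
    from the M available attributes, without replacement (Eq. (18))."""
    rng = rng or random.Random()
    return rng.sample(attributes, mtry)

# e.g. 3 candidates out of the 7 reduced W1 logging attributes
candidates = select_split_candidates(
    ["GR", "DT", "SP", "LLD", "LLS", "DEN", "NPHI"], 3,
    rng=random.Random(0))
```

Because each tree's nodes draw their own candidate sets, different trees end up splitting on different attributes, which is what keeps the forest diverse.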

The nonleaf node is a parent node that classifies training data into a left or right child node. The function of the k-th decision tree is as follows:

h_k(c | x) = { 0, f_l(x_k) < f_l(x_k) + τ;  1, f_l(x_k) ≥ f_l(x_k) + τ },  l = 1, 2, ..., n_tree,  (19)

where c = 0 or 1: the symbol 0 indicates that the k-th row of data is classified with a negative label, and the symbol 1 indicates that the k-th row of data is classified with a positive label. Here, f_l is the training function of the l-th decision tree based on the splitting variable S, x_k is the k-th row of data in the dataset obtained by random sampling with replacement, and τ is a positive constant used as the threshold value of the training decision.

When decision processes are trained, each row of data is input into a leaf node of each decision tree. The average of the ntree decision tree classification results is used as the final classification result. This process can be written mathematically as follows:

c_k = l × (1/n_tree) Σ_{k=1}^{n_tree} h_k(c | x),  (20)

where l is the number of decision trees that judged the k-th row of data as c.
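For a single row, equations (19) and (20) reduce to averaging the 0/1 votes of the ntree trees and thresholding the result. A sketch under that reading (the 0.5 cut-off is an assumption; the paper averages the tree outputs but does not print its decision threshold):

```python
def forest_classify(tree_votes, threshold=0.5):
    """Average the 0/1 labels h_k produced by the n_tree trees for one
    row (Eq. (20)) and return the final class label."""
    avg = sum(tree_votes) / len(tree_votes)
    return 1 if avg >= threshold else 0

label = forest_classify([1, 1, 0, 1, 0])  # 3 of 5 trees vote positive
```

With a 0.5 threshold this is ordinary majority voting, the standard decision rule for a binary random forest.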

From the above principle, we can see that the two parameters ntree and mtry mainly need to be determined in the RF modeling process. In order to verify the influence of these two parameters on the classification accuracy of the RF classification model, the Ionosphere dataset is used to test the influence of the two parameters on the performance of

Table 4: Comparison on nine unimodal functions with 5 hybrid algorithms (Dim = 10).

Function  Term  BSA          DE          DMSDL-PSO   DMSDL-BSA    DMSDL-QBSA
f1   Max   1.0317E+04   1.2466E+04  1.7232E+04  1.0874E+04   7.4436E+01
f1   Min   0            4.0214E+03  1.6580E-02  0            0
f1   Mean  6.6202E+00   4.6622E+03  6.0643E+01  6.3444E+00   3.3920E-02
f1   Var   2.0892E+02   8.2293E+02  7.3868E+02  1.9784E+02   1.2343E+00
f2   Max   3.3278E+02   1.8153E+02  8.0436E+01  4.9554E+02   2.9845E+01
f2   Min   4.9286E-182  1.5889E+01  1.0536E-01  3.0074E-182  0
f2   Mean  5.7340E-02   1.8349E+01  2.9779E+00  6.9700E-02   1.1220E-02
f2   Var   3.5768E+00   4.8296E+00  2.4966E+00  5.1243E+00   4.2864E-01
f3   Max   1.3078E+04   1.3949E+04  1.9382E+04  1.2899E+04   8.4935E+01
f3   Min   3.4873E-250  4.0327E+03  6.5860E-02  1.6352E-249  0
f3   Mean  7.6735E+00   4.6130E+03  8.6149E+01  7.5623E+00   3.3260E-02
f3   Var   2.4929E+02   8.4876E+02  8.1698E+02  2.4169E+02   1.2827E+00
f4   Max   2.5311E+09   2.3900E+09  4.8639E+09  3.7041E+09   3.5739E+08
f4   Min   5.2310E+00   2.7690E+08  8.4802E+00  5.0021E+00   8.9799E+00
f4   Mean  6.9192E+05   3.3334E+08  1.1841E+07  9.9162E+05   6.8518E+04
f4   Var   3.6005E+07   1.4428E+08  1.8149E+08  4.5261E+07   4.0563E+06
f5   Max   1.1619E+04   1.3773E+04  1.6188E+04  1.3194E+04   5.1960E+03
f5   Min   5.5043E-15   5.6109E+03  1.1894E-02  4.2090E-15   1.5157E+00
f5   Mean  5.9547E+00   6.3278E+03  5.2064E+01  6.5198E+00   3.5440E+00
f5   Var   2.0533E+02   9.6605E+02  6.2095E+02  2.2457E+02   7.8983E+01
f6   Max   3.2456E+00   7.3566E+00  8.9320E+00  2.8822E+00   1.4244E+00
f6   Min   1.3994E-04   1.2186E+00  2.2482E-03  8.2911E-05   1.0911E-05
f6   Mean  2.1509E-03   1.4021E+00  1.1982E-01  1.9200E-03   6.1476E-04
f6   Var   5.3780E-02   3.8482E-01  3.5554E-01  5.0940E-02   1.9880E-02
f7   Max   4.7215E+02   6.7534E+02  5.6753E+02  5.3090E+02   2.3468E+02
f7   Min   0            2.2001E+02  5.6300E-02  0            0
f7   Mean  2.4908E-01   2.3377E+02  9.2909E+00  3.0558E-01   9.4500E-02
f7   Var   8.5433E+00   3.3856E+01  2.2424E+01  1.0089E+01   3.5569E+00
f8   Max   3.2500E+02   2.4690E+02  2.7226E+02  2.8001E+02   1.7249E+02
f8   Min   1.4678E-239  8.3483E+01  5.9820E-02  8.9624E-239  0
f8   Mean  1.9072E-01   9.1050E+01  7.9923E+00  2.3232E-01   8.1580E-02
f8   Var   6.3211E+00   1.3811E+01  1.7349E+01  6.4400E+00   2.9531E+00


the RF model, as shown in Figure 3, where the horizontal axes represent ntree and mtry, respectively, and the vertical axes represent the accuracy of the RF classification model.

(1) Parameter analysis of ntree. When the number of predictor variables mtry is set to 6, the number of decision trees ntree is cyclically set from 0 to 1000 at intervals of 20, and the evolution of the RF classification accuracy with ntree is shown in Figure 3(a). From the curve in Figure 3(a), we can see that the accuracy of RF gradually improves as the number of decision trees increases. However, once ntree exceeds a certain value, the improvement of RF performance becomes gentle without obvious gains, while the running time becomes longer.

(2) Parameter analysis of mtry. When the number of decision trees ntree is set to 500, the number of predictor variables mtry is cyclically set from 1 to 32; the limit of mtry is set to 32 because the Ionosphere dataset has 32 attributes. The obtained curve of RF classification accuracy against mtry is shown in Figure 3(b). We can see that, as more splitting attributes are selected, the classification performance of RF gradually improves; but when mtry is greater than 9, RF overfits and its accuracy begins to decrease. The main reason is that selecting too many split attributes causes a large number of decision trees to share the same splitting attributes, which reduces the diversity of the decision trees.
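Each of the two single-parameter sweeps above is simply an argmax search over a grid of candidate values. A generic harness is sketched below; the `toy_accuracy` curve is a purely hypothetical stand-in for the Ionosphere experiment, where `evaluate` would instead train an RF and return its cross-validated accuracy:

```python
def sweep_parameter(values, evaluate):
    """Score every candidate parameter value with the supplied fitness
    callable and return the best (value, score) pair."""
    scores = {v: evaluate(v) for v in values}
    best = max(scores, key=scores.get)
    return best, scores[best]

# Hypothetical accuracy curve peaking at ntree = 500 (illustrative only).
toy_accuracy = lambda ntree: 0.90 - abs(ntree - 500) / 5000.0

# Mirror the paper's sweep: ntree from 20 to 1000 in steps of 20.
best_ntree, best_acc = sweep_parameter(range(20, 1001, 20), toy_accuracy)
```

The same harness sweeps mtry by passing `range(1, 33)` and an accuracy function with ntree held fixed.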

Table 5: Comparison on nine unimodal functions with hybrid algorithms (Dim = 2).

Function  Term  BSA          DE          DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
f1   Max   1.8411E+02   3.4713E+02  2.9347E+02   1.6918E+02   1.6789E+00
f1   Min   0            5.7879E-01  0            0            3.2988E-238
f1   Mean  5.5890E-02   1.1150E+00  1.6095E-01   3.6580E-02   4.2590E-03
f1   Var   2.6628E+00   5.7218E+00  5.4561E+00   2.0867E+00   7.5858E-02
f2   Max   2.2980E+00   2.1935E+00  3.2363E+00   3.1492E+00   1.1020E+00
f2   Min   5.5923E-266  8.2690E-02  2.9096E-242  3.4367E-241  0
f2   Mean  9.2769E-04   9.3960E-02  7.4900E-03   1.2045E-03   2.1565E-04
f2   Var   3.3310E-02   6.9080E-02  4.0100E-02   4.2190E-02   1.3130E-02
f3   Max   1.0647E+02   1.3245E+02  3.6203E+02   2.3793E+02   1.3089E+02
f3   Min   0            5.9950E-01  0            0            0
f3   Mean  2.3040E-02   8.5959E-01  2.5020E-01   6.3560E-02   1.7170E-02
f3   Var   1.2892E+00   2.8747E+00  7.5569E+00   2.9203E+00   1.3518E+00
f4   Max   1.7097E+02   6.1375E+01  6.8210E+01   5.9141E+01   1.4726E+01
f4   Min   1.6325E-21   4.0940E-02  8.2726E-13   3.4830E-25   0
f4   Mean  2.2480E-02   9.3940E-02  1.5730E-02   1.1020E-02   1.4308E-02
f4   Var   1.7987E+00   7.4859E-01  7.6015E-01   6.4984E-01   2.7598E-01
f5   Max   1.5719E+02   2.2513E+02  3.3938E+02   1.8946E+02   8.7078E+01
f5   Min   0            7.0367E-01  0            0            0
f5   Mean  3.4380E-02   1.8850E+00  1.7082E-01   5.0090E-02   1.1880E-02
f5   Var   1.9018E+00   5.6163E+00  5.9868E+00   2.4994E+00   9.1749E-01
f6   Max   1.5887E-01   1.5649E-01  1.5919E-01   1.3461E-01   1.0139E-01
f6   Min   2.5412E-05   4.5060E-04  5.9140E-05   4.1588E-05   7.3524E-06
f6   Mean  2.3437E-04   1.3328E-03  6.0989E-04   2.3462E-04   9.2394E-05
f6   Var   2.4301E-03   3.6700E-03  3.5200E-03   1.9117E-03   1.4664E-03
f7   Max   3.5804E+00   2.8236E+00  1.7372E+00   2.7513E+00   1.9411E+00
f7   Min   0            7.6633E-03  0            0            0
f7   Mean  8.5474E-04   1.6590E-02  8.6701E-04   7.6781E-04   3.1439E-04
f7   Var   4.4630E-02   6.0390E-02  2.4090E-02   3.7520E-02   2.2333E-02
f8   Max   4.3247E+00   2.1924E+00  5.3555E+00   3.3944E+00   5.5079E-01
f8   Min   0            8.6132E-03  0            0            0
f8   Mean  1.1649E-03   1.9330E-02  1.7145E-03   7.3418E-04   6.9138E-05
f8   Var   5.9280E-02   4.7800E-02  7.5810E-02   4.1414E-02   5.7489E-03
f9   Max   2.7030E-02   3.5200E-02  1.7240E-02   4.0480E-02   2.5230E-02
f9   Min   0            5.0732E-03  0            0            0
f9   Mean  6.1701E-05   6.2500E-03  8.8947E-04   8.4870E-05   2.7362E-05
f9   Var   7.6990E-04   1.3062E-03  2.0400E-03   9.6160E-04   5.5610E-04


In summary, for the RF classification model to obtain the ideal optimal solution, the selection of the number of decision trees ntree and the number of predictor variables mtry is very important, and the classification accuracy of the RF classification model can only be optimized by tuning these two parameters jointly. It is therefore necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we optimize the RF classification model with the improved BSA proposed in Section 2.

3.2. RF Model Based on the Improved Bird Swarm Algorithm. The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) uses the improved bird swarm algorithm to optimize the RF classification model and introduces the training dataset into the training process of the RF classification model, finally yielding the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters ntree and mtry as the optimization target of DMSDL-QBSA, so as to obtain a set of parameters that gives the RF classification model the best

classification accuracy. The specific algorithm steps are shown in Algorithm 2.
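Algorithm 2 itself is not reproduced in this excerpt. As a minimal stand-in that shows the shape of the optimization loop, the sketch below searches the same two-dimensional (ntree, mtry) space with plain random search; in the paper this role is played by DMSDL-QBSA, and `fitness` would be the classification accuracy of an RF trained with the candidate parameters:

```python
import random

def tune_rf_params(fitness, ntree_bounds=(10, 1000), mtry_bounds=(1, 32),
                   iters=200, seed=1):
    """Search the 2-D (ntree, mtry) space for the candidate maximizing
    the supplied fitness (random-search stand-in for DMSDL-QBSA)."""
    rng = random.Random(seed)
    best, best_fit = None, float("-inf")
    for _ in range(iters):
        cand = (rng.randint(*ntree_bounds), rng.randint(*mtry_bounds))
        fit = fitness(cand)
        if fit > best_fit:
            best, best_fit = cand, fit
    return best, best_fit

# Toy fitness peaking near (500, 9) (illustrative only).
toy = lambda p: -abs(p[0] - 500) - 10 * abs(p[1] - 9)
(best_ntree, best_mtry), _ = tune_rf_params(toy)
```

Swapping the random-candidate line for a swarm update rule recovers the structure of the paper's optimizer while keeping the same fitness interface.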

3.3. Simulation Experiment and Analysis. In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare it with the standard RF model, the BSA-RF model, and the DMSDL-BSA-RF model on 8 two-class UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by BSA without quantum behavior. In our experiment, each dataset is divided into two parts: 70% of the dataset is used as the training set and the remaining 30% as the test set. The average classification accuracies of 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model obtains the best accuracy on each UCI dataset except the Magic dataset, on which the DMSDL-BSA-RF classification model is best. Compared with the standard RF model, the accuracy of the DMSDL-QBSA-RF classification model is better, increased by about 10%. Finally, the

Table 6: Comparison on nine multimodal functions with hybrid algorithms (Dim = 10).

Function  Term  BSA          DE          DMSDL-PSO   DMSDL-BSA    DMSDL-QBSA
f10  Max   2.8498E+03   2.8226E+03  3.0564E+03  2.7739E+03   2.8795E+03
f10  Min   1.2553E+02   1.8214E+03  1.2922E+03  1.6446E+02   1.1634E+03
f10  Mean  2.5861E+02   1.9229E+03  1.3185E+03  3.1119E+02   1.2729E+03
f10  Var   2.4093E+02   1.2066E+02  1.2663E+02  2.5060E+02   1.2998E+02
f11  Max   1.2550E+02   1.0899E+02  1.1806E+02  1.1243E+02   9.1376E+01
f11  Min   0            6.3502E+01  1.0751E+01  0            0
f11  Mean  2.0417E-01   6.7394E+01  3.9864E+01  1.3732E-01   6.8060E-02
f11  Var   3.5886E+00   5.8621E+00  1.3570E+01  3.0325E+00   2.0567E+00
f12  Max   2.0021E+01   1.9910E+01  1.9748E+01  1.9254E+01   1.8118E+01
f12  Min   8.8818E-16   1.6575E+01  7.1700E-02  8.8818E-16   8.8818E-16
f12  Mean  3.0500E-02   1.7157E+01  3.0367E+00  3.8520E-02   1.3420E-02
f12  Var   5.8820E-01   5.2968E-01  1.6585E+00  6.4822E-01   4.2888E-01
f13  Max   1.0431E+02   1.3266E+02  1.5115E+02  1.2017E+02   6.1996E+01
f13  Min   0            4.5742E+01  2.1198E-01  0            0
f13  Mean  6.1050E-02   5.2056E+01  3.0613E+00  6.9340E-02   2.9700E-02
f13  Var   1.8258E+00   8.3141E+00  1.5058E+01  2.2452E+00   1.0425E+00
f14  Max   8.4576E+06   3.0442E+07  5.3508E+07  6.2509E+07   8.5231E+06
f14  Min   1.7658E-13   1.9816E+06  4.5685E-05  1.6961E-13   5.1104E-01
f14  Mean  1.3266E+03   3.1857E+06  6.8165E+04  8.8667E+03   1.1326E+03
f14  Var   9.7405E+04   1.4876E+06  1.4622E+06  6.4328E+05   8.7645E+04
f15  Max   1.8310E+08   1.4389E+08  1.8502E+08  1.4578E+08   2.5680E+07
f15  Min   1.7942E-11   1.0497E+07  2.4500E-03  1.1248E-11   9.9870E-01
f15  Mean  3.7089E+04   1.5974E+07  2.0226E+05  3.8852E+04   3.5739E+03
f15  Var   2.0633E+06   1.0724E+07  4.6539E+06  2.1133E+06   2.6488E+05
f16  Max   1.3876E+01   1.4988E+01  1.4849E+01  1.3506E+01   9.3280E+00
f16  Min   4.2410E-174  6.8743E+00  2.5133E-02  7.3524E-176  0
f16  Mean  1.3633E-02   7.2408E+00  2.5045E+00  1.3900E-02   5.1800E-03
f16  Var   3.3567E-01   7.7774E-01  1.0219E+00  3.4678E-01   1.7952E-01
f18  Max   3.6704E+01   3.6950E+01  2.8458E+01  2.6869E+01   2.4435E+01
f18  Min   2.0914E-11   9.7737E+00  3.3997E-03  5.9165E-12   7.5806E-01
f18  Mean  6.5733E-02   1.2351E+01  6.7478E-01  5.6520E-02   7.9392E-01
f18  Var   6.7543E-01   2.8057E+00  1.4666E+00  6.4874E-01   3.6928E-01


DMSDL-QBSA-RF classification model achieves the best accuracy on the Appendicitis dataset, which is up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is valid on most datasets and performs well on them.

4. Oil Layer Classification Application

4.1. Design of Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1. The selection of the actual logging datasets should be intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis and relatively independent. The dataset is randomly divided into two parts of training and testing samples.

Step 2. In order to better capture the relationship between the independent and dependent variables and reduce the sample information attributes, the continuous attributes of the dataset are discretized using a greedy algorithm.

Step 3. In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize attribute reduction. After attribute reduction, the actual logging datasets are normalized to avoid computational saturation.

Step 4. The actual logging dataset after attribute reduction is input into the DMSDL-QBSA-RF layer classification algorithm for training, finally yielding the DMSDL-QBSA-RF layer classification model.
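Step 3's normalization is a standard per-attribute min-max rescale; the [0, 1] target range matches the vertical axes of the normalized curves in Figure 5. The function below is an illustrative sketch, not the paper's code; the all-zeros convention for a constant column is an assumption:

```python
def min_max_normalize(column):
    """Rescale one logging attribute to [0, 1] to avoid computational
    saturation (Step 3); a constant column maps to all zeros."""
    lo, hi = min(column), max(column)
    if hi == lo:
        return [0.0] * len(column)
    return [(x - lo) / (hi - lo) for x in column]

normalized = min_max_normalize([2.0, 4.0, 6.0])  # [0.0, 0.5, 1.0]
```

Applying this column by column to the reduced attributes of Table 14 yields curves of the kind plotted in Figure 5.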

Table 7: Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function  Term  BSA         DE          DMSDL-PSO    DMSDL-BSA   DMSDL-QBSA
f10       Max   2.0101E+02  2.8292E+02  2.9899E+02   2.4244E+02  2.8533E+02
          Min   2.5455E-05  3.7717E+00  2.3690E+01   2.5455E-05  9.4751E+01
          Mean  3.4882E-01  8.0980E+00  2.5222E+01   4.2346E-01  9.4816E+01
          Var   5.7922E+00  1.0853E+01  1.5533E+01   5.6138E+00  2.8160E+00
f11       Max   4.7662E+00  4.7784E+00  1.1067E+01   8.7792E+00  8.1665E+00
          Min   0           3.8174E-01  0            0           0
          Mean  2.7200E-03  6.0050E-01  3.1540E-02   4.2587E-03  3.7800E-03
          Var   8.6860E-02  3.1980E-01  2.4862E-01   1.2032E-01  1.3420E-01
f12       Max   9.6893E+00  8.1811E+00  1.1635E+01   9.1576E+00  8.4720E+00
          Min   8.8818E-16  5.1646E-01  8.8818E-16   8.8818E-16  8.8818E-16
          Mean  9.5600E-03  6.9734E-01  3.4540E-02   9.9600E-03  2.8548E-03
          Var   2.1936E-01  6.1050E-01  2.5816E-01   2.1556E-01  1.1804E-01
f13       Max   4.4609E+00  4.9215E+00  4.1160E+00   1.9020E+00  1.6875E+00
          Min   0           1.3718E-01  0            0           0
          Mean  1.9200E-03  1.7032E-01  1.8240E-02   1.4800E-03  5.7618E-04
          Var   6.6900E-02  1.3032E-01  1.8202E-01   3.3900E-02  2.2360E-02
f14       Max   1.0045E+01  1.9266E+03  1.9212E+01   5.7939E+02  8.2650E+00
          Min   2.3558E-31  1.3188E-01  2.3558E-31   2.3558E-31  2.3558E-31
          Mean  4.1600E-03  3.5402E-01  1.0840E-02   6.1420E-02  1.3924E-03
          Var   1.7174E-01  1.9427E+01  3.9528E-01   5.8445E+00  8.7160E-02
f15       Max   6.5797E+04  4.4041E+03  1.4412E+05   8.6107E+03  2.6372E+00
          Min   1.3498E-31  9.1580E-02  1.3498E-31   1.3498E-31  7.7800E-03
          Mean  7.1736E+00  8.9370E-01  1.7440E+01   9.0066E-01  8.2551E-03
          Var   6.7678E+02  5.4800E+01  1.4742E+03   8.7683E+01  2.8820E-02
f16       Max   6.2468E-01  6.4488E-01  5.1564E-01   8.4452E-01  3.9560E-01
          Min   6.9981E-08  2.5000E-03  1.5518E-240  2.7655E-07  0
          Mean  2.7062E-04  6.9400E-03  6.8555E-04   2.0497E-04  6.1996E-05
          Var   1.0380E-02  1.7520E-02  8.4600E-03   1.0140E-02  4.4000E-03
f17       Max   5.1946E+00  3.6014E+00  2.3463E+00   6.9106E+00  1.2521E+00
          Min   2.6445E-11  2.6739E-02  0            1.0855E-10  0
          Mean  1.9343E-03  5.1800E-02  1.2245E-03   2.8193E-03  1.5138E-04
          Var   7.3540E-02  1.2590E-01  4.1620E-02   1.1506E-01  1.2699E-02
f18       Max   5.0214E-01  3.4034E-01  4.1400E-01   3.7422E-01  4.0295E-01
          Min   1.4998E-32  1.9167E-03  1.4998E-32   1.4998E-32  1.4998E-32
          Mean  1.0967E-04  4.1000E-03  1.8998E-04   1.4147E-04  6.0718E-05
          Var   6.1800E-03  1.0500E-02  6.5200E-03   5.7200E-03  4.4014E-03

14 Computational Intelligence and Neuroscience

Step 5. The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and we output the classification results.
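The data preparation in Steps 1 and 3 can be sketched as min-max normalization of each retained attribute plus a random train/test split. The helper names below are illustrative, not from the paper:

```python
import numpy as np

def min_max_normalize(curve):
    """Scale one logging attribute (e.g. GR, SP, DEN) into [0, 1]
    to avoid computational saturation (Step 3)."""
    curve = np.asarray(curve, dtype=float)
    lo, hi = curve.min(), curve.max()
    return (curve - lo) / (hi - lo)

def random_split(n_samples, train_ratio=0.5, seed=0):
    """Randomly divide sample indices into training and testing parts (Step 1)."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    cut = int(n_samples * train_ratio)
    return idx[:cut], idx[cut:]

sp = np.array([20.0, 60.0, 40.0, 100.0])   # a toy SP curve
norm = min_max_normalize(sp)
print(norm)  # [0.   0.5  0.25 1.  ]
train_idx, test_idx = random_split(8, train_ratio=0.75)
print(len(train_idx), len(test_idx))  # 6 2
```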

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA is simulated and analyzed on benchmark functions, and in Section 3.3 the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA is tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted and recorded as W1, W2, and W3. W1 is a gas well in Xi'an (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depths and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before the training of the DMSDL-QBSA-RF classification model on the training dataset, as shown in Table 14. Then, these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, the SVM model, the BSA-RF model, and the DMSDL-BSA-RF model. Here, the RF classification model was first applied to the field of logging. In order to evaluate the performance of the recognition model, we select the following performance indicators:

\[
\text{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(f_i - y_i\right)^{2}}, \qquad
\text{MAE} = \frac{1}{N}\sum_{i=1}^{N}\left|f_i - y_i\right|, \tag{21}
\]

Table 8: Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100000).

Function  Term  GWO         WOA         SCA          GOA         SSA          DMSDL-QBSA
f1        Max   1.3396E+04  1.4767E+04  1.3310E+04   2.0099E+01  4.8745E+03   9.8570E-01
          Min   0           0           4.2905E-293  8.6468E-17  0            0
          Mean  3.5990E+00  4.7621E+00  1.4014E+02   7.0100E-02  4.5864E+00   1.0483E-04
          Var   1.7645E+02  2.0419E+02  8.5054E+02   4.4200E-01  1.4148E+02   9.8725E-03
f2        Max   3.6021E+02  2.5789E+03  6.5027E+01   9.3479E+01  3.4359E+01   3.2313E-01
          Min   0           0           9.8354E-192  2.8954E-03  2.4642E-181  0
          Mean  5.0667E-02  2.9480E-01  4.0760E-01   3.1406E+00  1.7000E-02   1.1278E-04
          Var   3.7270E+00  2.6091E+01  2.2746E+00   3.9264E+00  5.4370E-01   4.8000E-03
f3        Max   1.8041E+04  1.6789E+04  2.4921E+04   6.5697E+03  1.1382E+04   4.0855E+01
          Min   0           1.0581E-18  7.6116E-133  2.8796E-01  3.2956E-253  1.5918E-264
          Mean  7.4511E+00  2.8838E+02  4.0693E+02   4.8472E+02  9.2062E+00   4.8381E-03
          Var   2.6124E+02  1.4642E+03  1.5913E+03   7.1786E+02  2.9107E+02   4.1383E-01
f4        Max   2.1812E+09  5.4706E+09  8.4019E+09   1.1942E+09  4.9386E+08   7.5188E+01
          Min   4.9125E+00  3.5695E+00  5.9559E+00   2.2249E+02  4.9806E+00   2.9279E-13
          Mean  4.9592E+05  2.4802E+06  4.4489E+07   5.0021E+06  3.3374E+05   2.4033E-02
          Var   2.9484E+07  1.1616E+08  4.5682E+08   3.9698E+07  1.1952E+07   9.7253E-01
f5        Max   1.8222E+04  1.5374E+04  1.5874E+04   1.2132E+03  1.6361E+04   1.8007E+00
          Min   1.1334E-08  8.3228E-09  2.3971E-01   2.7566E-10  2.6159E-16   1.0272E-33
          Mean  5.1332E+00  5.9967E+00  1.2620E+02   5.8321E+01  8.8985E+00   2.3963E-04
          Var   2.3617E+02  2.3285E+02  8.8155E+02   1.0872E+02  2.9986E+02   1.8500E-02
f6        Max   7.4088E+00  8.3047E+00  8.8101E+00   6.8900E-01  4.4298E+00   2.5787E-01
          Min   1.8112E-05  3.9349E-05  4.8350E-05   8.9528E-02  4.0807E-05   1.0734E-04
          Mean  1.8333E-03  3.2667E-03  4.8400E-02   9.3300E-02  2.1000E-03   5.2825E-04
          Var   9.4267E-02  1.0077E-01  2.9410E-01   3.5900E-02  6.5500E-02   4.6000E-03
f7        Max   7.3626E+02  6.8488E+02  8.0796E+02   3.9241E+02  8.2036E+02   1.4770E+01
          Min   0           0           1.9441E-292  7.9956E-07  0            0
          Mean  2.0490E-01  2.8060E-01  4.9889E+00   1.6572E+01  2.7290E-01   1.8081E-03
          Var   9.5155E+00  1.0152E+01  3.5531E+01   2.3058E+01  1.0581E+01   1.6033E-01
f8        Max   1.2749E+03  5.9740E+02  3.2527E+02   2.3425E+02  2.0300E+02   2.0423E+01
          Min   0           4.3596E-35  1.5241E-160  3.6588E-05  1.0239E-244  0
          Mean  3.1317E-01  1.0582E+01  1.0457E+01   1.2497E+01  2.1870E-01   2.6947E-03
          Var   1.4416E+01  4.3485E+01  3.5021E+01   2.5766E+01  6.2362E+00   2.1290E-01


where yi and fi are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show the actual forecasting errors. Table 15 records the performance indicator data of each classification model, and the best results are marked in bold. The smaller the RMSE and MAE, the better the classification model performs.
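Equation (21) can be computed directly; a minimal NumPy sketch with binary labels (0 = no oil/gas, 1 = oil/gas, as in the oil-layer task):

```python
import numpy as np

def rmse(y_true, y_pred):
    # Root-mean-square error, first part of equation (21).
    diff = np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float)
    return float(np.sqrt(np.mean(diff ** 2)))

def mae(y_true, y_pred):
    # Mean absolute error, second part of equation (21).
    diff = np.asarray(y_pred, dtype=float) - np.asarray(y_true, dtype=float)
    return float(np.mean(np.abs(diff)))

y_true = [0, 1, 1, 0]
y_pred = [0, 1, 0, 0]   # one of four labels wrong
r = rmse(y_true, y_pred)
m = mae(y_true, y_pred)
print(r, m)  # 0.5 0.25
```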

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model achieves the best recognition accuracy, and all its accuracies are above 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it is also superior for oil and gas layer classification on the other performance indicators and the other wells. Second, DMSDL-QBSA can improve the performance of RF: the parameters found by DMSDL-QBSA and used in the RF classification model improve the classification accuracy while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than those of the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means this depth has no oil or gas, and 1 means this depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results: it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is thus suitable for petroleum logging applications, greatly reduces the difficulty of oil exploration, and has a good application prospect.

Table 9: Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100000).

Function  Term  GWO         WOA         SCA          GOA         SSA         DMSDL-QBSA
f10       Max   2.9544E+03  2.9903E+03  2.7629E+03   2.9445E+03  3.2180E+03  3.0032E+03
          Min   1.3037E+03  8.6892E-04  1.5713E+03   1.3438E+03  1.2839E-04  1.2922E+03
          Mean  1.6053E+03  1.4339E+01  1.7860E+03   1.8562E+03  2.5055E+02  1.2960E+03
          Var   2.6594E+02  1.2243E+02  1.6564E+02   5.1605E+02  3.3099E+02  5.8691E+01
f11       Max   1.3792E+02  1.2293E+02  1.2313E+02   1.1249E+02  3.2180E+03  1.8455E+01
          Min   0           0           0            1.2437E+01  1.2839E-04  0
          Mean  4.4220E-01  1.1252E+00  9.9316E+00   2.8378E+01  2.5055E+02  3.2676E-03
          Var   4.3784E+00  6.5162E+00  1.9180E+01   1.6240E+01  3.3099E+02  1.9867E-01
f12       Max   2.0257E+01  2.0043E+01  1.9440E+01   1.6623E+01  3.2180E+03  1.9113E+00
          Min   4.4409E-15  3.2567E-15  3.2567E-15   2.3168E+00  1.2839E-04  8.8818E-16
          Mean  1.7200E-02  4.2200E-02  8.8870E-01   5.5339E+00  2.5055E+02  3.1275E-04
          Var   4.7080E-01  6.4937E-01  3.0887E+00   2.8866E+00  3.3099E+02  2.2433E-02
f13       Max   1.5246E+02  1.6106E+02  1.1187E+02   6.1505E+01  3.2180E+03  3.1560E-01
          Min   3.3000E-03  0           0            2.4147E-01  1.2839E-04  0
          Mean  4.6733E-02  9.3867E-02  1.2094E+00   3.7540E+00  2.5055E+02  3.3660E-05
          Var   1.9297E+00  2.6570E+00  6.8476E+00   4.1936E+00  3.3099E+02  3.1721E-03
f14       Max   9.5993E+07  9.9026E+07  5.9355E+07   6.1674E+06  3.2180E+03  6.4903E-01
          Min   3.8394E-09  1.1749E-08  9.6787E-03   1.8099E-04  1.2839E-04  4.7116E-32
          Mean  1.2033E+04  3.5007E+04  4.8303E+05   1.0465E+04  2.5055E+02  8.9321E-05
          Var   9.8272E+05  1.5889E+06  4.0068E+06   1.9887E+05  3.3099E+02  6.8667E-03
f15       Max   2.2691E+08  2.4717E+08  1.1346E+08   2.8101E+07  3.2180E+03  1.6407E-01
          Min   3.2467E-02  4.5345E-08  1.1922E-01   3.5465E-05  1.2839E-04  1.3498E-32
          Mean  2.9011E+04  4.3873E+04  6.5529E+05   7.2504E+04  2.5055E+02  6.7357E-05
          Var   2.3526E+06  2.7453E+06  7.1864E+06   1.2814E+06  3.3099E+02  2.4333E-03
f16       Max   1.7692E+01  1.7142E+01  1.6087E+01   8.7570E+00  3.2180E+03  1.0959E+00
          Min   2.6210E-07  0.0000E+00  6.2663E-155  1.0497E-02  1.2839E-04  0
          Mean  1.0133E-02  3.9073E-01  5.9003E-01   2.4770E+00  2.5055E+02  1.5200E-04
          Var   3.0110E-01  9.6267E-01  1.4701E+00   1.9985E+00  3.3099E+02  1.1633E-02
f18       Max   4.4776E+01  4.3588E+01  3.9095E+01   1.7041E+01  3.2180E+03  6.5613E-01
          Min   1.9360E-01  9.4058E-08  2.2666E-01   3.9111E+00  1.2839E-04  1.4998E-32
          Mean  2.1563E-01  4.7800E-02  9.8357E-01   5.4021E+00  2.5055E+02  9.4518E-05
          Var   5.8130E-01  1.0434E+00  3.0643E+00   1.6674E+00  3.3099E+02  7.0251E-03


Table 10: Comparison of numerical testing results on CEC2014 test sets (F1–F15, Dim = 10).

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA
F1        Max   2.7411E+08  3.6316E+09  6.8993E+08  3.9664E+08  9.6209E+08
          Min   1.4794E+07  3.1738E+09  1.3205E+06  1.6929E+05  1.7687E+05
          Mean  3.1107E+07  3.2020E+09  6.7081E+06  1.3567E+06  1.9320E+06
          Var   1.2900E+07  4.0848E+07  2.6376E+07  1.0236E+07  1.3093E+07
F2        Max   8.7763E+09  1.3597E+10  1.3515E+10  1.6907E+10  1.7326E+10
          Min   1.7455E+09  1.1878E+10  3.6982E+04  2.1268E+05  1.4074E+05
          Mean  1.9206E+09  1.1951E+10  1.7449E+08  5.9354E+07  4.8412E+07
          Var   1.9900E+08  1.5615E+08  9.2365E+08  5.1642E+08  4.2984E+08
F3        Max   4.8974E+05  8.5863E+04  2.6323E+06  2.4742E+06  5.3828E+06
          Min   1.1067E+04  1.4616E+04  1.8878E+03  1.2967E+03  6.3986E+02
          Mean  1.3286E+04  1.4976E+04  1.0451E+04  3.5184E+03  3.0848E+03
          Var   6.5283E+03  1.9988E+03  4.5736E+04  4.1580E+04  8.0572E+04
F4        Max   3.5252E+03  9.7330E+03  4.5679E+03  4.9730E+03  4.3338E+03
          Min   5.0446E+02  8.4482E+03  4.0222E+02  4.0843E+02  4.1267E+02
          Mean  5.8061E+02  8.5355E+03  4.4199E+02  4.3908E+02  4.3621E+02
          Var   1.4590E+02  1.3305E+02  1.6766E+02  1.2212E+02  8.5548E+01
F5        Max   5.2111E+02  5.2106E+02  5.2075E+02  5.2098E+02  5.2110E+02
          Min   5.2001E+02  5.2038E+02  5.2027E+02  5.2006E+02  5.2007E+02
          Mean  5.2003E+02  5.2041E+02  5.2033E+02  5.2014E+02  5.2014E+02
          Var   7.8380E-02  5.7620E-02  5.6433E-02  1.0577E-01  9.9700E-02
F6        Max   6.1243E+02  6.1299E+02  6.1374E+02  6.1424E+02  6.1569E+02
          Min   6.0881E+02  6.1157E+02  6.0514E+02  6.0288E+02  6.0257E+02
          Mean  6.0904E+02  6.1164E+02  6.0604E+02  6.0401E+02  6.0358E+02
          Var   2.5608E-01  1.2632E-01  1.0434E+00  1.1717E+00  1.3117E+00
F7        Max   8.7895E+02  1.0459E+03  9.1355E+02  1.0029E+03  9.3907E+02
          Min   7.4203E+02  1.0238E+03  7.0013E+02  7.0081E+02  7.0069E+02
          Mean  7.4322E+02  1.0253E+03  7.1184E+02  7.0332E+02  7.0290E+02
          Var   4.3160E+00  2.4258E+00  2.8075E+01  1.5135E+01  1.3815E+01
F8        Max   8.9972E+02  9.1783E+02  9.6259E+02  9.2720E+02  9.3391E+02
          Min   8.4904E+02  8.8172E+02  8.3615E+02  8.1042E+02  8.0877E+02
          Mean  8.5087E+02  8.8406E+02  8.5213E+02  8.1888E+02  8.1676E+02
          Var   2.1797E+00  4.4494E+00  8.6249E+00  9.2595E+00  9.7968E+00
F9        Max   1.0082E+03  9.9538E+02  1.0239E+03  1.0146E+03  1.0366E+03
          Min   9.3598E+02  9.6805E+02  9.2725E+02  9.2062E+02  9.1537E+02
          Mean  9.3902E+02  9.7034E+02  9.4290E+02  9.2754E+02  9.2335E+02
          Var   3.4172E+00  3.2502E+00  8.1321E+00  9.0492E+00  9.3860E+00
F10       Max   3.1087E+03  3.5943E+03  3.9105E+03  3.9116E+03  3.4795E+03
          Min   2.2958E+03  2.8792E+03  1.5807E+03  1.4802E+03  1.4316E+03
          Mean  1.8668E+03  2.9172E+03  1.7627E+03  1.6336E+03  1.6111E+03
          Var   3.2703E+01  6.1787E+01  2.2359E+02  1.9535E+02  2.2195E+02
F11       Max   3.6641E+03  3.4628E+03  4.0593E+03  3.5263E+03  3.7357E+03
          Min   2.4726E+03  2.8210E+03  1.7855E+03  1.4012E+03  1.2811E+03
          Mean  2.5553E+03  2.8614E+03  1.8532E+03  1.5790E+03  1.4869E+03
          Var   8.4488E+01  5.6351E+01  1.3201E+02  2.1085E+02  2.6682E+02
F12       Max   1.2044E+03  1.2051E+03  1.2053E+03  1.2055E+03  1.2044E+03
          Min   1.2006E+03  1.2014E+03  1.2004E+03  1.2003E+03  1.2017E+03
          Mean  1.2007E+03  1.2017E+03  1.2007E+03  1.2003E+03  1.2018E+03
          Var   1.4482E-01  3.5583E-01  2.5873E-01  2.4643E-01  2.1603E-01
F13       Max   1.3049E+03  1.3072E+03  1.3073E+03  1.3056E+03  1.3061E+03
          Min   1.3005E+03  1.3067E+03  1.3009E+03  1.3003E+03  1.3004E+03
          Mean  1.3006E+03  1.3068E+03  1.3011E+03  1.3005E+03  1.3006E+03
          Var   3.2018E-01  5.0767E-02  4.2173E-01  3.6570E-01  2.9913E-01
F14       Max   1.4395E+03  1.4565E+03  1.4775E+03  1.4749E+03  1.4493E+03
          Min   1.4067E+03  1.4522E+03  1.4009E+03  1.4003E+03  1.4005E+03
          Mean  1.4079E+03  1.4529E+03  1.4024E+03  1.4008E+03  1.4009E+03
          Var   2.1699E+00  6.3013E-01  5.3198E+00  3.2578E+00  2.8527E+00
F15       Max   5.4068E+04  3.3586E+04  4.8370E+05  4.0007E+05  1.9050E+05
          Min   1.9611E+03  2.1347E+04  1.5029E+03  1.5027E+03  1.5026E+03
          Mean  1.9933E+03  2.2417E+04  2.7920E+03  1.5860E+03  1.5816E+03
          Var   6.1622E+02  1.2832E+03  1.7802E+04  4.4091E+03  2.7233E+03


Table 11: Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10).

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA
F16       Max   1.6044E+03  1.6044E+03  1.6046E+03  1.6044E+03  1.6046E+03
          Min   1.6034E+03  1.6041E+03  1.6028E+03  1.6021E+03  1.6019E+03
          Mean  1.6034E+03  1.6041E+03  1.6032E+03  1.6024E+03  1.6024E+03
          Var   3.3300E-02  3.5900E-02  2.5510E-01  3.7540E-01  3.6883E-01
F17       Max   1.3711E+07  1.6481E+06  3.3071E+07  4.6525E+07  8.4770E+06
          Min   1.2526E+05  4.7216E+05  2.7261E+03  2.0177E+03  2.0097E+03
          Mean  1.6342E+05  4.8499E+05  8.7769E+04  3.1084E+04  2.1095E+04
          Var   2.0194E+05  2.9949E+04  1.2284E+06  8.7638E+05  2.0329E+05
F18       Max   7.7173E+07  3.5371E+07  6.1684E+08  1.4216E+08  6.6050E+08
          Min   4.1934E+03  2.6103E+06  2.2743E+03  1.8192E+03  1.8288E+03
          Mean  1.6509E+04  3.4523E+06  1.2227E+06  1.0781E+05  1.9475E+05
          Var   7.9108E+05  1.4888E+06  2.1334E+07  3.6818E+06  1.0139E+07
F19       Max   1.9851E+03  2.5657E+03  2.0875E+03  1.9872E+03  1.9555E+03
          Min   1.9292E+03  2.4816E+03  1.9027E+03  1.9023E+03  1.9028E+03
          Mean  1.9299E+03  2.4834E+03  1.9044E+03  1.9032E+03  1.9036E+03
          Var   1.0820E+00  3.8009E+00  1.1111E+01  3.3514E+00  1.4209E+00
F20       Max   8.3021E+06  2.0160E+07  1.0350E+07  6.1162E+07  1.2708E+08
          Min   5.6288E+03  1.7838E+06  2.1570E+03  2.0408E+03  2.0241E+03
          Mean  1.2260E+04  1.8138E+06  5.6957E+03  1.1337E+04  4.5834E+04
          Var   1.0918E+05  5.9134E+05  1.3819E+05  6.4167E+05  2.1988E+06
F21       Max   2.4495E+06  1.7278E+09  2.0322E+06  1.3473E+07  2.6897E+07
          Min   4.9016E+03  1.4049E+09  3.3699E+03  2.1842E+03  2.2314E+03
          Mean  6.6613E+03  1.4153E+09  9.9472E+03  1.3972E+04  7.4587E+03
          Var   4.1702E+04  2.2557E+07  3.3942E+04  2.5098E+05  2.8735E+05
F22       Max   2.8304E+03  4.9894E+03  3.1817E+03  3.1865E+03  3.2211E+03
          Min   2.5070E+03  4.1011E+03  2.3492E+03  2.2442E+03  2.2314E+03
          Mean  2.5081E+03  4.1175E+03  2.3694E+03  2.2962E+03  2.2687E+03
          Var   5.4064E+00  3.8524E+01  4.3029E+01  4.8006E+01  5.1234E+01
F23       Max   2.8890E+03  2.6834E+03  2.8758E+03  3.0065E+03  2.9923E+03
          Min   2.5000E+03  2.6031E+03  2.4870E+03  2.5000E+03  2.5000E+03
          Mean  2.5004E+03  2.6088E+03  2.5326E+03  2.5010E+03  2.5015E+03
          Var   9.9486E+00  8.6432E+00  5.7045E+01  1.4213E+01  1.7156E+01
F24       Max   2.6293E+03  2.6074E+03  2.6565E+03  2.6491E+03  2.6369E+03
          Min   2.5816E+03  2.6049E+03  2.5557E+03  2.5246E+03  2.5251E+03
          Mean  2.5829E+03  2.6052E+03  2.5671E+03  2.5337E+03  2.5338E+03
          Var   1.3640E+00  3.3490E-01  8.9434E+00  1.3050E+01  1.1715E+01
F25       Max   2.7133E+03  2.7014E+03  2.7445E+03  2.7122E+03  2.7327E+03
          Min   2.6996E+03  2.7003E+03  2.6784E+03  2.6789E+03  2.6635E+03
          Mean  2.6996E+03  2.7004E+03  2.6831E+03  2.6903E+03  2.6894E+03
          Var   2.3283E-01  9.9333E-02  4.9609E+00  7.1175E+00  1.2571E+01
F26       Max   2.7056E+03  2.8003E+03  2.7058E+03  2.7447E+03  2.7116E+03
          Min   2.7008E+03  2.8000E+03  2.7003E+03  2.7002E+03  2.7002E+03
          Mean  2.7010E+03  2.8000E+03  2.7005E+03  2.7003E+03  2.7003E+03
          Var   3.1316E-01  1.6500E-02  3.9700E-01  1.1168E+00  3.5003E-01
F27       Max   3.4052E+03  5.7614E+03  3.3229E+03  3.4188E+03  3.4144E+03
          Min   2.9000E+03  3.9113E+03  2.9698E+03  2.8347E+03  2.7054E+03
          Mean  2.9038E+03  4.0351E+03  2.9816E+03  2.8668E+03  2.7219E+03
          Var   3.2449E+01  2.0673E+02  3.4400E+01  6.2696E+01  6.1325E+01
F28       Max   4.4333E+03  5.4138E+03  4.5480E+03  4.3490E+03  4.8154E+03
          Min   3.0000E+03  4.0092E+03  3.4908E+03  3.0000E+03  3.0000E+03
          Mean  3.0079E+03  4.0606E+03  3.6004E+03  3.0063E+03  3.0065E+03
          Var   6.8101E+01  9.2507E+01  8.5705E+01  4.2080E+01  5.3483E+01
F29       Max   2.5038E+07  1.6181E+08  5.9096E+07  7.0928E+07  6.4392E+07
          Min   3.2066E+03  1.0014E+08  3.1755E+03  3.1005E+03  3.3287E+03
          Mean  5.7057E+04  1.0476E+08  8.5591E+04  6.8388E+04  2.8663E+04
          Var   4.9839E+05  6.7995E+06  1.7272E+06  1.5808E+06  8.6740E+05


[Figure omitted: accuracy curves over ntree = 0–1000 in panel (a) and mtry = 0–35 in panel (b).]

Figure 3: The effect of the two parameters on the performance of RF models: (a) the effect of ntree; (b) the effect of mtry.

Variable setting: number of iterations iter; bird positions Xi; local optimum Pi; global optimal position gbest; global optimum Pgbest; and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA.
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF.
(1) Begin
(2)   /* Build classification model based on DMSDL-QBSA-RF */
(3)   Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4)   Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5)   While iter < itermax + 1 do
(6)     For i = 1 : ntree
(7)       Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)       Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)       Recursively generate each decision tree without pruning operations
(10)    End For
(11)    Update classification accuracy of RF: evaluate f(Xi)
(12)    Update gbest and Pgbest
(13)    [nbest, mbest] = gbest
(14)    iter = iter + 1
(15)  End While
(16)  /* Classify using RF model */
(17)  For i = 1 : nbest
(18)    Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)    Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)    Recursively generate each decision tree without pruning operations
(21)  End
(22)  Return DMSDL-QBSA-RF classification model
(23)  Classify the test dataset using equation (20)
(24)  Calculate OOB_error
(25) End

ALGORITHM 2: DMSDL-QBSA-RF classification model.
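The inner forest build of Algorithm 2 (bootstrap resampling per tree, a random subset of attributes tried at each split) is a standard random forest. A minimal sketch, assuming scikit-learn is available; nbest and mbest are hand-fixed stand-ins for the [nbest, mbest] = gbest pair an optimizer such as DMSDL-QBSA would return, and the dataset is synthetic:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Hand-picked stand-ins for the optimized (ntree, mtry) pair.
nbest, mbest = 100, 3

# Synthetic binary-classification data in place of a logging dataset.
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

# bootstrap=True resamples a training set per tree (line 7 of Algorithm 2);
# max_features=mbest limits the attributes compared at each split (line 8);
# oob_score=True computes the out-of-bag accuracy (line 24 uses its error).
rf = RandomForestClassifier(n_estimators=nbest, max_features=mbest,
                            bootstrap=True, oob_score=True, random_state=0)
rf.fit(X, y)
oob_error = 1.0 - rf.oob_score_
print(round(oob_error, 3))
```

The fitness f(Xi) in Algorithm 2 would be the classification accuracy of such a forest for the (ntree, mtry) encoded by bird position Xi.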

Table 11: Continued.

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA
F30       Max   5.2623E+05  2.8922E+07  1.1938E+06  1.2245E+06  1.1393E+06
          Min   5.9886E+03  1.9017E+07  3.7874E+03  3.6290E+03  3.6416E+03
          Mean  7.4148E+03  2.0002E+07  5.5468E+03  4.3605E+03  4.1746E+03
          Var   1.1434E+04  1.1968E+06  3.2255E+04  2.2987E+04  1.8202E+04


[Block diagram: to-be-identified information → data selection and preprocessing (remove redundant attributes and pretreatment) → attribute discretization → attribute reduction and data normalization → build model based on DMSDL-QBSA-RF → RF classification → output.]

Figure 4: Block diagram of the oil layer classification system based on DMSDL-QBSA-RF.

Table 13: Distribution of the logging data.

                Training dataset                               Test dataset
Well   Depth (m)   Data  Oil/gas layers  Dry layer   Depth (m)   Data  Oil/gas layers  Dry layer
W1     3027~3058   250   203             47          3250~3450   1600  237             1363
W2     2642~2648   30    10              20          2940~2978   191   99              92
W3     1040~1059   160   47              113         1120~1290   1114  96              1018

Table 12: Comparison of numerical testing results on eight UCI datasets.

Dataset        RF (%)  BSA-RF (%)  DMSDL-BSA-RF (%)  DMSDL-QBSA-RF (%)
Blood          75.40   80.80       79.46             81.25
Heart-statlog  82.10   86.42       86.42             86.42
Sonar          78.39   85.48       85.48             88.71
Appendicitis   83.23   87.10       90.32             93.55
Cleve          79.55   87.50       88.64             89.77
Magic          88.95   89.85       90.25             89.32
Mammographic   74.73   77.68       76.79             79.46
Australian     85.83   88.84       88.84             90.78

Table 14: Attribute reduction results of the logging dataset.

Well  Attributes
W1    Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
      Reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2    Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
      Reduction attributes: NEUTRO, PERM, POR, RESI, SW
W3    Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
      Reduction attributes: AC, GR, RI, RXO, SP

[Figure 5(a), (b): normalized curves of the W1 attributes (NPHI, SP, DT, LLD, DEN, GR, LLS) over depth 3250–3450 m.]


[Figure 5(c), (d): normalized curves of the W2 attributes (NEUTRO, PERM, POR, RESI, SW) over depth 2940–2980 m; (e), (f): normalized curves of the W3 attributes (GR, SP, RI, RXO, AC) over depth 1150–1300 m.]

Figure 5: The normalized curves of attributes: (a) and (b) attribute normalization of W1; (c) and (d) attribute normalization of W2; (e) and (f) attribute normalization of W3.

Table 15: Performance of various well data.

Well  Classification model  RMSE    MAE       Accuracy (%)  Running time (s)
W1    RF                    0.3326  0.1106    88.94         1.5167
      SVM                   0.2681  0.0719    92.81         4.5861
      BSA-RF                0.3269  0.1069    89.31         1.8064
      DMSDL-BSA-RF          0.2806  0.0788    92.13         2.3728
      DMSDL-QBSA-RF         0.2449  0.0600    94.00         1.4663
W2    RF                    0.4219  0.1780    82.20         3.1579
      SVM                   0.2983  0.0890    91.10         4.2604
      BSA-RF                0.3963  0.1571    84.29         1.2979
      DMSDL-BSA-RF          0.2506  0.062827  93.72         1.6124
      DMSDL-QBSA-RF         0.2399  0.0576    94.24         1.2287
W3    RF                    0.4028  0.1622    83.78         2.4971
      SVM                   0.2507  0.0628    93.72         2.1027
      BSA-RF                0.3631  0.1318    86.81         1.3791
      DMSDL-BSA-RF          0.2341  0.0548    94.52         0.3125
      DMSDL-QBSA-RF         0.0519  0.0027    99.73         0.9513

[Figure 6(a), (b): actual vs. DMSDL-QBSA-RF oil layer labels (0/1) of W1 over depth 3200–3450 m.]


5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of these three strategies on the performance of DMSDL-QBSA revealed that the hybrid method clearly improves on the original BSA, and especially on the original DE. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and DMSDL-QBSA also shows excellent performance on these functions. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than that of the original RF model. At the same time, it ran faster than the other four classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. In particular, due to the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA; therefore, increasing the recognition efficiency of the deployed algorithm is a worthwhile direction. In future research work, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical science diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request by contacting the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), the Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), the Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), the Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and the Project of Industry-University Cooperative Education of the Ministry of Education of China (no. 201801335014).

[Figure 6(c)–(f): actual vs. DMSDL-QBSA-RF oil layer labels (0/1) of W2 over depth 2940–2980 m and of W3 over depth 1140–1300 m.]

Figure 6: Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.

[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.

[4] X. Li, A new intelligent optimization method: artificial fish school algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.

[5] X. S. Yang, "Firefly algorithms for multimodal optimization: stochastic algorithms foundations and applications," in Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.

[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.

[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673–687, 2016.

[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University-Computer and Information Sciences, vol. 32, pp. 514–521, 2020.

[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," in Neural Computing and Applications, Springer, New York, NY, USA, 2019.

[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.

[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.

[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197–207, 2014.

[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.

[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17–28, 2014.

[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137–146, 2012.

[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, Multi-Strategy Brainstorm Optimization Algorithm with Dynamic Parameters Adjustment, Applied Intelligence, Guildford, UK, 2010.

[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189–120201, 2019.

[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.

[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.

[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.

[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.

[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.

[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.

[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.

[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.

[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multiobjective optimization problems," in Proceedings of the IEEE Transactions on Evolutionary Computation, Piscataway, NJ, USA, February 2019.

[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.

[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.

[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.

[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.

[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.

[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.

[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.

[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.

[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.

Computational Intelligence and Neuroscience 23

[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China–Information Sciences, vol. 62, no. 7, 2019.

[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.

[41] S. Shahrzad, M. Seyedali, and L. Andrew, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.

[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22–34, 2020.

[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.


F13, F14, F18, F19, F20, F24, F25, and F28. According to the results, we can observe that DMSDL-QBSA finds the minimal value on 17 CEC2014 test functions, DMSDL-BSA gets the minimum value on F1, F12, F13, F14, F18, F19, F24, and F30, and DMSDL-PSO obtains the minimum value on F4, F7, and F23. Owing to its enhanced exploitation capability, DMSDL-QBSA is better than DMSDL-BSA and DMSDL-PSO on most functions. From the results of the tests, it can be seen that DMSDL-QBSA performs better than BSA, DE, DMSDL-PSO, and DMSDL-BSA and obtains the optimal value. It can be concluded that DMSDL-QBSA has better global search ability and better robustness on these test suites.

3. Optimize RF Classification Model Based on Improved BSA Algorithm

3.1. RF Classification Model. RF, as proposed by Breiman et al., is an ensemble learning model based on bagging and random subspace methods. The whole modeling process includes building decision trees and decision processes. The process of constructing decision trees is mainly composed of

[Figure 2 omitted: four log-log convergence plots of fitness value versus iteration (10^0 to 10^4), panels (a) Ackley, (b) Griewank, (c) Booth, and (d) Levy, each comparing BSA, DE, DMSDL-PSO, DMSDL-BSA, and DMSDL-QBSA.]

Figure 2: Fitness value curves of 5 hybrid algorithms on (a) f12, (b) f13, (c) f17, and (d) f18 (Dim = 2).


ntree decision trees, each of which consists of nonleaf nodes and leaf nodes. The leaf node is a child node of the node branch. It is supposed that the dataset has M attributes. When each leaf node of the decision tree needs to be segmented, mtry attributes are randomly selected from the M attributes as the reselected splitting variables of this node. This process can be defined as follows:

S_j = { P_i | i = 1, 2, ..., mtry },  (18)

where S_j is the splitting variable of the j-th leaf node of the decision tree and P_i is the probability that the mtry reselected attributes are selected as the splitting attribute of the node.

The nonleaf node is a parent node that classifies training data into a left or right child node. The function of the k-th decision tree is as follows:

h_k(c | x) = { 0, f_l(x_k) < f_l(x_k) + τ
             { 1, f_l(x_k) ≥ f_l(x_k) + τ,   l = 1, 2, ..., ntree,  (19)

where c = 0 or 1; the symbol 0 indicates that the k-th row of data is classified as a negative label, and the symbol 1 indicates that the k-th row of data is classified as a positive label. Here, f_l is the training function of the l-th decision tree based on the splitting variable S, x_k is the k-th row of data in the dataset obtained by random sampling with replacement, and the symbol τ is a positive constant which is used as the threshold value of the training decision.

When decision processes are trained, each row of data will be input into a leaf node of each decision tree. The average of the ntree decision tree classification results is used as the final classification result. This process can be written mathematically as follows:

c_k = l × (1/ntree) Σ_{k=1}^{ntree} h_k(c | x),  (20)

where l is the number of decision trees which judged the k-th row of data as c.

From the above principle, we can see that it is mainly necessary to determine the two parameters ntree and mtry in the RF modeling process. In order to verify the influence of these two parameters on the classification accuracy of the RF classification model, the Ionosphere dataset is used to test the influence of the two parameters on the performance of

Table 4: Comparison on nine unimodal functions with 5 hybrid algorithms (Dim = 10).

Function  Term  BSA          DE          DMSDL-PSO   DMSDL-BSA    DMSDL-QBSA
f1        Max   1.0317E+04   1.2466E+04  1.7232E+04  1.0874E+04   7.4436E+01
          Min   0            4.0214E+03  1.6580E-02  0            0
          Mean  6.6202E+00   4.6622E+03  6.0643E+01  6.3444E+00   3.3920E-02
          Var   2.0892E+02   8.2293E+02  7.3868E+02  1.9784E+02   1.2343E+00
f2        Max   3.3278E+02   1.8153E+02  8.0436E+01  4.9554E+02   2.9845E+01
          Min   4.9286E-182  1.5889E+01  1.0536E-01  3.0074E-182  0
          Mean  5.7340E-02   1.8349E+01  2.9779E+00  6.9700E-02   1.1220E-02
          Var   3.5768E+00   4.8296E+00  2.4966E+00  5.1243E+00   4.2864E-01
f3        Max   1.3078E+04   1.3949E+04  1.9382E+04  1.2899E+04   8.4935E+01
          Min   3.4873E-250  4.0327E+03  6.5860E-02  1.6352E-249  0
          Mean  7.6735E+00   4.6130E+03  8.6149E+01  7.5623E+00   3.3260E-02
          Var   2.4929E+02   8.4876E+02  8.1698E+02  2.4169E+02   1.2827E+00
f4        Max   2.5311E+09   2.3900E+09  4.8639E+09  3.7041E+09   3.5739E+08
          Min   5.2310E+00   2.7690E+08  8.4802E+00  5.0021E+00   8.9799E+00
          Mean  6.9192E+05   3.3334E+08  1.1841E+07  9.9162E+05   6.8518E+04
          Var   3.6005E+07   1.4428E+08  1.8149E+08  4.5261E+07   4.0563E+06
f5        Max   1.1619E+04   1.3773E+04  1.6188E+04  1.3194E+04   5.1960E+03
          Min   5.5043E-15   5.6109E+03  1.1894E-02  4.2090E-15   1.5157E+00
          Mean  5.9547E+00   6.3278E+03  5.2064E+01  6.5198E+00   3.5440E+00
          Var   2.0533E+02   9.6605E+02  6.2095E+02  2.2457E+02   7.8983E+01
f6        Max   3.2456E+00   7.3566E+00  8.9320E+00  2.8822E+00   1.4244E+00
          Min   1.3994E-04   1.2186E+00  2.2482E-03  8.2911E-05   1.0911E-05
          Mean  2.1509E-03   1.4021E+00  1.1982E-01  1.9200E-03   6.1476E-04
          Var   5.3780E-02   3.8482E-01  3.5554E-01  5.0940E-02   1.9880E-02
f7        Max   4.7215E+02   6.7534E+02  5.6753E+02  5.3090E+02   2.3468E+02
          Min   0            2.2001E+02  5.6300E-02  0            0
          Mean  2.4908E-01   2.3377E+02  9.2909E+00  3.0558E-01   9.4500E-02
          Var   8.5433E+00   3.3856E+01  2.2424E+01  1.0089E+01   3.5569E+00
f8        Max   3.2500E+02   2.4690E+02  2.7226E+02  2.8001E+02   1.7249E+02
          Min   1.4678E-239  8.3483E+01  5.9820E-02  8.9624E-239  0
          Mean  1.9072E-01   9.1050E+01  7.9923E+00  2.3232E-01   8.1580E-02
          Var   6.3211E+00   1.3811E+01  1.7349E+01  6.4400E+00   2.9531E+00


the RF model, as shown in Figure 3, where the horizontal axes represent ntree and mtry, respectively, and the vertical axis represents the accuracy of the RF classification model.

(1) Parameter analysis of ntree

When the number of predictor variables mtry is set to 6, the number of decision trees ntree is cyclically set from 0 to 1000 at intervals of 20, and the resulting progress of RF classification accuracy with ntree is shown in Figure 3(a). From the curve in Figure 3(a), we can see that the accuracy of RF gradually improves as the number of decision trees increases. However, once ntree exceeds a certain value, the improvement in RF performance becomes gentle, with no obvious gain, while the running time becomes longer.

(2) Parameter analysis of mtry

When the number of decision trees ntree is set to 500, the number of predictor variables mtry is cyclically set from 1 to 32. The limit of mtry is set to 32 because the Ionosphere dataset has 32 attributes. The obtained curve of RF classification accuracy against mtry is shown in Figure 3(b). We can see that, with the increase of the selected splitting attributes, the classification performance of RF gradually improves, but when mtry is greater than 9, the RF overfits and its accuracy begins to decrease. The main reason is that too many split attributes are selected, so a large number of decision trees share the same splitting attributes, which reduces the diversity of decision trees.
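The two parameter sweeps above can be sketched with scikit-learn's RandomForestClassifier, where `n_estimators` plays the role of ntree and `max_features` the role of mtry. This is an illustrative sketch, not the paper's code: a synthetic 32-attribute dataset stands in for Ionosphere, and the grids are shortened for speed.

```python
# Sketch of the ntree / mtry sensitivity study; synthetic data stands in
# for the Ionosphere dataset used in the paper.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=300, n_features=32, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def rf_accuracy(ntree, mtry):
    """Train an RF with the given (ntree, mtry) pair; return test accuracy."""
    rf = RandomForestClassifier(n_estimators=ntree, max_features=mtry,
                                random_state=0)
    return rf.fit(X_tr, y_tr).score(X_te, y_te)

# Sweep ntree with mtry fixed, then sweep mtry with ntree fixed.
acc_vs_ntree = {n: rf_accuracy(n, 6) for n in range(20, 101, 40)}
acc_vs_mtry = {m: rf_accuracy(100, m) for m in (1, 6, 9, 16, 32)}
```

Plotting `acc_vs_ntree` and `acc_vs_mtry` reproduces the shape of the curves discussed for Figure 3: accuracy saturates as ntree grows and peaks at a moderate mtry.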

Table 5: Comparison on nine unimodal functions with hybrid algorithms (Dim = 2).

Function  Term  BSA          DE          DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
f1        Max   1.8411E+02   3.4713E+02  2.9347E+02   1.6918E+02   1.6789E+00
          Min   0            5.7879E-01  0            0            3.2988E-238
          Mean  5.5890E-02   1.1150E+00  1.6095E-01   3.6580E-02   4.2590E-03
          Var   2.6628E+00   5.7218E+00  5.4561E+00   2.0867E+00   7.5858E-02
f2        Max   2.2980E+00   2.1935E+00  3.2363E+00   3.1492E+00   1.1020E+00
          Min   5.5923E-266  8.2690E-02  2.9096E-242  3.4367E-241  0
          Mean  9.2769E-04   9.3960E-02  7.4900E-03   1.2045E-03   2.1565E-04
          Var   3.3310E-02   6.9080E-02  4.0100E-02   4.2190E-02   1.3130E-02
f3        Max   1.0647E+02   1.3245E+02  3.6203E+02   2.3793E+02   1.3089E+02
          Min   0            5.9950E-01  0            0            0
          Mean  2.3040E-02   8.5959E-01  2.5020E-01   6.3560E-02   1.7170E-02
          Var   1.2892E+00   2.8747E+00  7.5569E+00   2.9203E+00   1.3518E+00
f4        Max   1.7097E+02   6.1375E+01  6.8210E+01   5.9141E+01   1.4726E+01
          Min   1.6325E-21   4.0940E-02  8.2726E-13   3.4830E-25   0
          Mean  2.2480E-02   9.3940E-02  1.5730E-02   1.1020E-02   1.4308E-02
          Var   1.7987E+00   7.4859E-01  7.6015E-01   6.4984E-01   2.7598E-01
f5        Max   1.5719E+02   2.2513E+02  3.3938E+02   1.8946E+02   8.7078E+01
          Min   0            7.0367E-01  0            0            0
          Mean  3.4380E-02   1.8850E+00  1.7082E-01   5.0090E-02   1.1880E-02
          Var   1.9018E+00   5.6163E+00  5.9868E+00   2.4994E+00   9.1749E-01
f6        Max   1.5887E-01   1.5649E-01  1.5919E-01   1.3461E-01   1.0139E-01
          Min   2.5412E-05   4.5060E-04  5.9140E-05   4.1588E-05   7.3524E-06
          Mean  2.3437E-04   1.3328E-03  6.0989E-04   2.3462E-04   9.2394E-05
          Var   2.4301E-03   3.6700E-03  3.5200E-03   1.9117E-03   1.4664E-03
f7        Max   3.5804E+00   2.8236E+00  1.7372E+00   2.7513E+00   1.9411E+00
          Min   0            7.6633E-03  0            0            0
          Mean  8.5474E-04   1.6590E-02  8.6701E-04   7.6781E-04   3.1439E-04
          Var   4.4630E-02   6.0390E-02  2.4090E-02   3.7520E-02   2.2333E-02
f8        Max   4.3247E+00   2.1924E+00  5.3555E+00   3.3944E+00   5.5079E-01
          Min   0            8.6132E-03  0            0            0
          Mean  1.1649E-03   1.9330E-02  1.7145E-03   7.3418E-04   6.9138E-05
          Var   5.9280E-02   4.7800E-02  7.5810E-02   4.1414E-02   5.7489E-03
f9        Max   2.7030E-02   3.5200E-02  1.7240E-02   4.0480E-02   2.5230E-02
          Min   0            5.0732E-03  0            0            0
          Mean  6.1701E-05   6.2500E-03  8.8947E-04   8.4870E-05   2.7362E-05
          Var   7.6990E-04   1.3062E-03  2.0400E-03   9.6160E-04   5.5610E-04


In summary, for the RF classification model to obtain the ideal optimal solution, the selection of the number of decision trees ntree and the number of predictor variables mtry is very important, and the classification accuracy of the RF classification model can only be optimized by the comprehensive optimization of these two parameters. So it is necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we optimize the RF classification model with the improved BSA proposed in Section 2.

3.2. RF Model Based on an Improved Bird Swarm Algorithm. The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) uses the improved bird swarm algorithm to optimize the RF classification model; the training dataset is introduced into the training process of the RF classification model, finally yielding the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters ntree and mtry as the optimization target of DMSDL-QBSA, so as to obtain a set of parameters that gives the RF classification model the best classification accuracy. The specific algorithm steps are shown in Algorithm 2.
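The core of this coupling is the two-dimensional fitness function: the optimizer proposes an (ntree, mtry) pair and receives the cross-validated RF accuracy as fitness. The sketch below illustrates that interface only; since the full DMSDL-QBSA update rules are beyond a short snippet, a plain random search stands in for the optimizer, and the dataset is synthetic.

```python
# Sketch of Algorithm 2's fitness interface: RF accuracy as a function of
# the candidate (ntree, mtry) pair. Random search stands in for DMSDL-QBSA.
import random

from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=200, n_features=32, random_state=1)

def fitness(ntree, mtry):
    """Two-dimensional fitness: cross-validated accuracy of RF(ntree, mtry)."""
    rf = RandomForestClassifier(n_estimators=ntree, max_features=mtry,
                                random_state=1)
    return cross_val_score(rf, X, y, cv=3).mean()

random.seed(1)
best, best_fit = None, -1.0
for _ in range(10):               # each candidate plays the role of one "bird"
    cand = (random.randint(10, 100), random.randint(1, 32))
    f = fitness(*cand)
    if f > best_fit:
        best, best_fit = cand, f  # keep the best (ntree, mtry) seen so far
```

Replacing the random proposal step with the DMSDL-QBSA position update gives the DMSDL-QBSA-RF training loop described above.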

3.3. Simulation Experiment and Analysis. In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare it with the standard RF model, the BSA-RF model, and the DMSDL-BSA-RF model on 8 two-dimensional UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by BSA without quantum behavior. In our experiment, each dataset is divided into two parts: 70% of the dataset is used as the training set and the remaining 30% as the test set. The average classification accuracies of 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model gets the best accuracy on each UCI dataset except the magic dataset, on which the DMSDL-BSA-RF classification model gets the best accuracy. Compared with the standard RF model, the DMSDL-QBSA-RF classification model achieves better accuracy, improved by about 10%. Finally, the

Table 6: Comparison on nine multimodal functions with hybrid algorithms (Dim = 10).

Function  Term  BSA          DE          DMSDL-PSO   DMSDL-BSA    DMSDL-QBSA
f10       Max   2.8498E+03   2.8226E+03  3.0564E+03  2.7739E+03   2.8795E+03
          Min   1.2553E+02   1.8214E+03  1.2922E+03  1.6446E+02   1.1634E+03
          Mean  2.5861E+02   1.9229E+03  1.3185E+03  3.1119E+02   1.2729E+03
          Var   2.4093E+02   1.2066E+02  1.2663E+02  2.5060E+02   1.2998E+02
f11       Max   1.2550E+02   1.0899E+02  1.1806E+02  1.1243E+02   9.1376E+01
          Min   0            6.3502E+01  1.0751E+01  0            0
          Mean  2.0417E-01   6.7394E+01  3.9864E+01  1.3732E-01   6.8060E-02
          Var   3.5886E+00   5.8621E+00  1.3570E+01  3.0325E+00   2.0567E+00
f12       Max   2.0021E+01   1.9910E+01  1.9748E+01  1.9254E+01   1.8118E+01
          Min   8.8818E-16   1.6575E+01  7.1700E-02  8.8818E-16   8.8818E-16
          Mean  3.0500E-02   1.7157E+01  3.0367E+00  3.8520E-02   1.3420E-02
          Var   5.8820E-01   5.2968E-01  1.6585E+00  6.4822E-01   4.2888E-01
f13       Max   1.0431E+02   1.3266E+02  1.5115E+02  1.2017E+02   6.1996E+01
          Min   0            4.5742E+01  2.1198E-01  0            0
          Mean  6.1050E-02   5.2056E+01  3.0613E+00  6.9340E-02   2.9700E-02
          Var   1.8258E+00   8.3141E+00  1.5058E+01  2.2452E+00   1.0425E+00
f14       Max   8.4576E+06   3.0442E+07  5.3508E+07  6.2509E+07   8.5231E+06
          Min   1.7658E-13   1.9816E+06  4.5685E-05  1.6961E-13   5.1104E-01
          Mean  1.3266E+03   3.1857E+06  6.8165E+04  8.8667E+03   1.1326E+03
          Var   9.7405E+04   1.4876E+06  1.4622E+06  6.4328E+05   8.7645E+04
f15       Max   1.8310E+08   1.4389E+08  1.8502E+08  1.4578E+08   2.5680E+07
          Min   1.7942E-11   1.0497E+07  2.4500E-03  1.1248E-11   9.9870E-01
          Mean  3.7089E+04   1.5974E+07  2.0226E+05  3.8852E+04   3.5739E+03
          Var   2.0633E+06   1.0724E+07  4.6539E+06  2.1133E+06   2.6488E+05
f16       Max   1.3876E+01   1.4988E+01  1.4849E+01  1.3506E+01   9.3280E+00
          Min   4.2410E-174  6.8743E+00  2.5133E-02  7.3524E-176  0
          Mean  1.3633E-02   7.2408E+00  2.5045E+00  1.3900E-02   5.1800E-03
          Var   3.3567E-01   7.7774E-01  1.0219E+00  3.4678E-01   1.7952E-01
f18       Max   3.6704E+01   3.6950E+01  2.8458E+01  2.6869E+01   2.4435E+01
          Min   2.0914E-11   9.7737E+00  3.3997E-03  5.9165E-12   7.5806E-01
          Mean  6.5733E-02   1.2351E+01  6.7478E-01  5.6520E-02   7.9392E-01
          Var   6.7543E-01   2.8057E+00  1.4666E+00  6.4874E-01   3.6928E-01


DMSDL-QBSA-RF classification model achieves the best accuracy on the appendicitis dataset, which is up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is valid on most datasets and performs well on them.

4. Oil Layer Classification Application

4.1. Design of Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1. The selection of the actual logging datasets is intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis, and the dataset should be relatively independent. The dataset is randomly divided into two parts: training and testing samples.

Step 2. In order to better understand the relationship between independent variables and dependent variables and reduce the sample information attributes, the continuous attributes of the dataset should be discretized by using a greedy algorithm.

Step 3. In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize attribute reduction. After attribute reduction, the actual logging datasets are normalized to avoid computational saturation.

Step 4. In the DMSDL-QBSA-RF layer classification model, we input the actual logging dataset after attribute reduction, train with the DMSDL-QBSA-RF layer classification algorithm, and finally get the DMSDL-QBSA-RF layer classification model.
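The normalization in Step 3 is commonly implemented as min-max scaling of each attribute into [0, 1], which keeps all log attributes on a comparable scale and avoids computational saturation. A minimal sketch follows; the attribute values are made up, not actual logging data.

```python
# Minimal min-max normalization sketch for Step 3.
def min_max_normalize(column):
    """Scale a list of attribute values into [0, 1]."""
    lo, hi = min(column), max(column)
    if hi == lo:                      # constant column: map everything to 0.0
        return [0.0 for _ in column]
    return [(v - lo) / (hi - lo) for v in column]

gamma_ray = [55.2, 80.1, 120.7, 60.3]   # hypothetical log attribute values
normalized = min_max_normalize(gamma_ray)
```

Each retained attribute after reduction would be passed through the same transformation, using the minimum and maximum observed in the training set.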

Table 7: Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function  Term  BSA          DE          DMSDL-PSO    DMSDL-BSA   DMSDL-QBSA
f10       Max   2.0101E+02   2.8292E+02  2.9899E+02   2.4244E+02  2.8533E+02
          Min   2.5455E-05   3.7717E+00  2.3690E+01   2.5455E-05  9.4751E+01
          Mean  3.4882E-01   8.0980E+00  2.5222E+01   4.2346E-01  9.4816E+01
          Var   5.7922E+00   1.0853E+01  1.5533E+01   5.6138E+00  2.8160E+00
f11       Max   4.7662E+00   4.7784E+00  1.1067E+01   8.7792E+00  8.1665E+00
          Min   0            3.8174E-01  0            0           0
          Mean  2.7200E-03   6.0050E-01  3.1540E-02   4.2587E-03  3.7800E-03
          Var   8.6860E-02   3.1980E-01  2.4862E-01   1.2032E-01  1.3420E-01
f12       Max   9.6893E+00   8.1811E+00  1.1635E+01   9.1576E+00  8.4720E+00
          Min   8.8818E-16   5.1646E-01  8.8818E-16   8.8818E-16  8.8818E-16
          Mean  9.5600E-03   6.9734E-01  3.4540E-02   9.9600E-03  2.8548E-03
          Var   2.1936E-01   6.1050E-01  2.5816E-01   2.1556E-01  1.1804E-01
f13       Max   4.4609E+00   4.9215E+00  4.1160E+00   1.9020E+00  1.6875E+00
          Min   0            1.3718E-01  0            0           0
          Mean  1.9200E-03   1.7032E-01  1.8240E-02   1.4800E-03  5.7618E-04
          Var   6.6900E-02   1.3032E-01  1.8202E-01   3.3900E-02  2.2360E-02
f14       Max   1.0045E+01   1.9266E+03  1.9212E+01   5.7939E+02  8.2650E+00
          Min   2.3558E-31   1.3188E-01  2.3558E-31   2.3558E-31  2.3558E-31
          Mean  4.1600E-03   3.5402E-01  1.0840E-02   6.1420E-02  1.3924E-03
          Var   1.7174E-01   1.9427E+01  3.9528E-01   5.8445E+00  8.7160E-02
f15       Max   6.5797E+04   4.4041E+03  1.4412E+05   8.6107E+03  2.6372E+00
          Min   1.3498E-31   9.1580E-02  1.3498E-31   1.3498E-31  7.7800E-03
          Mean  7.1736E+00   8.9370E-01  1.7440E+01   9.0066E-01  8.2551E-03
          Var   6.7678E+02   5.4800E+01  1.4742E+03   8.7683E+01  2.8820E-02
f16       Max   6.2468E-01   6.4488E-01  5.1564E-01   8.4452E-01  3.9560E-01
          Min   6.9981E-08   2.5000E-03  1.5518E-240  2.7655E-07  0
          Mean  2.7062E-04   6.9400E-03  6.8555E-04   2.0497E-04  6.1996E-05
          Var   1.0380E-02   1.7520E-02  8.4600E-03   1.0140E-02  4.4000E-03
f17       Max   5.1946E+00   3.6014E+00  2.3463E+00   6.9106E+00  1.2521E+00
          Min   2.6445E-11   2.6739E-02  0            1.0855E-10  0
          Mean  1.9343E-03   5.1800E-02  1.2245E-03   2.8193E-03  1.5138E-04
          Var   7.3540E-02   1.2590E-01  4.1620E-02   1.1506E-01  1.2699E-02
f18       Max   5.0214E-01   3.4034E-01  4.1400E-01   3.7422E-01  4.0295E-01
          Min   1.4998E-32   1.9167E-03  1.4998E-32   1.4998E-32  1.4998E-32
          Mean  1.0967E-04   4.1000E-03  1.8998E-04   1.4147E-04  6.0718E-05
          Var   6.1800E-03   1.0500E-02  6.5200E-03   5.7200E-03  4.4014E-03


Step 5. The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and we output the classification results.

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA was simulated and analyzed on benchmark functions, and in Section 3.3, the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA was tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted and recorded as W1, W2, and W3. W1 is a gas well in Xian (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depths and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction is performed on the actual logging datasets before the DMSDL-QBSA-RF classification model is trained on the training dataset, as shown in Table 14. Then these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, the SVM model, the BSA-RF model, and the DMSDL-BSA-RF model. Here, the RF classification model is applied to the field of logging for the first time. In order to evaluate the performance of the recognition model, we select the following performance indicators:

RMSE = sqrt( (1/N) Σ_{i=1}^{N} (f_i − y_i)^2 ),

MAE = (1/N) Σ_{i=1}^{N} |f_i − y_i|,  (21)

Table 8: Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function  Term  GWO         WOA         SCA          GOA         SSA          DMSDL-QBSA
f1        Max   1.3396E+04  1.4767E+04  1.3310E+04   2.0099E+01  4.8745E+03   9.8570E-01
          Min   0           0           4.2905E-293  8.6468E-17  0            0
          Mean  3.5990E+00  4.7621E+00  1.4014E+02   7.0100E-02  4.5864E+00   1.0483E-04
          Var   1.7645E+02  2.0419E+02  8.5054E+02   4.4200E-01  1.4148E+02   9.8725E-03
f2        Max   3.6021E+02  2.5789E+03  6.5027E+01   9.3479E+01  3.4359E+01   3.2313E-01
          Min   0           0           9.8354E-192  2.8954E-03  2.4642E-181  0
          Mean  5.0667E-02  2.9480E-01  4.0760E-01   3.1406E+00  1.7000E-02   1.1278E-04
          Var   3.7270E+00  2.6091E+01  2.2746E+00   3.9264E+00  5.4370E-01   4.8000E-03
f3        Max   1.8041E+04  1.6789E+04  2.4921E+04   6.5697E+03  1.1382E+04   4.0855E+01
          Min   0           1.0581E-18  7.6116E-133  2.8796E-01  3.2956E-253  1.5918E-264
          Mean  7.4511E+00  2.8838E+02  4.0693E+02   4.8472E+02  9.2062E+00   4.8381E-03
          Var   2.6124E+02  1.4642E+03  1.5913E+03   7.1786E+02  2.9107E+02   4.1383E-01
f4        Max   2.1812E+09  5.4706E+09  8.4019E+09   1.1942E+09  4.9386E+08   7.5188E+01
          Min   4.9125E+00  3.5695E+00  5.9559E+00   2.2249E+02  4.9806E+00   2.9279E-13
          Mean  4.9592E+05  2.4802E+06  4.4489E+07   5.0021E+06  3.3374E+05   2.4033E-02
          Var   2.9484E+07  1.1616E+08  4.5682E+08   3.9698E+07  1.1952E+07   9.7253E-01
f5        Max   1.8222E+04  1.5374E+04  1.5874E+04   1.2132E+03  1.6361E+04   1.8007E+00
          Min   1.1334E-08  8.3228E-09  2.3971E-01   2.7566E-10  2.6159E-16   1.0272E-33
          Mean  5.1332E+00  5.9967E+00  1.2620E+02   5.8321E+01  8.8985E+00   2.3963E-04
          Var   2.3617E+02  2.3285E+02  8.8155E+02   1.0872E+02  2.9986E+02   1.8500E-02
f6        Max   7.4088E+00  8.3047E+00  8.8101E+00   6.8900E-01  4.4298E+00   2.5787E-01
          Min   1.8112E-05  3.9349E-05  4.8350E-05   8.9528E-02  4.0807E-05   1.0734E-04
          Mean  1.8333E-03  3.2667E-03  4.8400E-02   9.3300E-02  2.1000E-03   5.2825E-04
          Var   9.4267E-02  1.0077E-01  2.9410E-01   3.5900E-02  6.5500E-02   4.6000E-03
f7        Max   7.3626E+02  6.8488E+02  8.0796E+02   3.9241E+02  8.2036E+02   1.4770E+01
          Min   0           0           1.9441E-292  7.9956E-07  0            0
          Mean  2.0490E-01  2.8060E-01  4.9889E+00   1.6572E+01  2.7290E-01   1.8081E-03
          Var   9.5155E+00  1.0152E+01  3.5531E+01   2.3058E+01  1.0581E+01   1.6033E-01
f8        Max   1.2749E+03  5.9740E+02  3.2527E+02   2.3425E+02  2.0300E+02   2.0423E+01
          Min   0           4.3596E-35  1.5241E-160  3.6588E-05  1.0239E-244  0
          Mean  3.1317E-01  1.0582E+01  1.0457E+01   1.2497E+01  2.1870E-01   2.6947E-03
          Var   1.4416E+01  4.3485E+01  3.5021E+01   2.5766E+01  6.2362E+00   2.1290E-01


where y_i and f_i are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show the actual forecasting errors. Table 15 records the performance indicator data of each classification model, with the best results marked in bold. The smaller the RMSE and MAE, the better the classification model performs.
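Equation (21) can be computed directly as below; the 0/1 layer label vectors are hypothetical, chosen only to illustrate the two indicators.

```python
# Direct implementation of equation (21): RMSE and MAE between the
# expected outputs f_i and the model outputs y_i.
import math

def rmse(f, y):
    """Root mean square error between expected outputs f and outputs y."""
    return math.sqrt(sum((fi - yi) ** 2 for fi, yi in zip(f, y)) / len(y))

def mae(f, y):
    """Mean absolute error between expected outputs f and outputs y."""
    return sum(abs(fi - yi) for fi, yi in zip(f, y)) / len(y)

# Hypothetical 0/1 layer labels: expected vs. predicted (one mismatch).
expected = [1, 0, 1, 1, 0]
predicted = [1, 0, 0, 1, 0]
errors = (rmse(expected, predicted), mae(expected, predicted))
```

For 0/1 labels, MAE equals the misclassification rate, while RMSE penalizes the same mismatches on a square-root scale.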

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model gets the best recognition accuracy, with all accuracies above 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it shows superior performance for oil and gas layer classification on the other performance indicators and the other wells as well. Secondly, DMSDL-QBSA can improve the performance of RF: the parameters found by DMSDL-QBSA and used in the RF classification model improve the classification accuracy while keeping the running speed

relatively fast at the same time. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than those of the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means the depth has no oil or gas, and 1 means the depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results: it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is therefore suitable for petroleum logging applications, greatly reducing the difficulty of oil exploration, and it has good application prospects.

Table 9: Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function  Term  GWO         WOA         SCA          GOA         SSA         DMSDL-QBSA
f10       Max   2.9544E+03  2.9903E+03  2.7629E+03   2.9445E+03  3.2180E+03  3.0032E+03
          Min   1.3037E+03  8.6892E-04  1.5713E+03   1.3438E+03  1.2839E-04  1.2922E+03
          Mean  1.6053E+03  1.4339E+01  1.7860E+03   1.8562E+03  2.5055E+02  1.2960E+03
          Var   2.6594E+02  1.2243E+02  1.6564E+02   5.1605E+02  3.3099E+02  5.8691E+01
f11       Max   1.3792E+02  1.2293E+02  1.2313E+02   1.1249E+02  3.2180E+03  1.8455E+01
          Min   0           0           0            1.2437E+01  1.2839E-04  0
          Mean  4.4220E-01  1.1252E+00  9.9316E+00   2.8378E+01  2.5055E+02  3.2676E-03
          Var   4.3784E+00  6.5162E+00  1.9180E+01   1.6240E+01  3.3099E+02  1.9867E-01
f12       Max   2.0257E+01  2.0043E+01  1.9440E+01   1.6623E+01  3.2180E+03  1.9113E+00
          Min   4.4409E-15  3.2567E-15  3.2567E-15   2.3168E+00  1.2839E-04  8.8818E-16
          Mean  1.7200E-02  4.2200E-02  8.8870E-01   5.5339E+00  2.5055E+02  3.1275E-04
          Var   4.7080E-01  6.4937E-01  3.0887E+00   2.8866E+00  3.3099E+02  2.2433E-02
f13       Max   1.5246E+02  1.6106E+02  1.1187E+02   6.1505E+01  3.2180E+03  3.1560E-01
          Min   3.3000E-03  0           0            2.4147E-01  1.2839E-04  0
          Mean  4.6733E-02  9.3867E-02  1.2094E+00   3.7540E+00  2.5055E+02  3.3660E-05
          Var   1.9297E+00  2.6570E+00  6.8476E+00   4.1936E+00  3.3099E+02  3.1721E-03
f14       Max   9.5993E+07  9.9026E+07  5.9355E+07   6.1674E+06  3.2180E+03  6.4903E-01
          Min   3.8394E-09  1.1749E-08  9.6787E-03   1.8099E-04  1.2839E-04  4.7116E-32
          Mean  1.2033E+04  3.5007E+04  4.8303E+05   1.0465E+04  2.5055E+02  8.9321E-05
          Var   9.8272E+05  1.5889E+06  4.0068E+06   1.9887E+05  3.3099E+02  6.8667E-03
f15       Max   2.2691E+08  2.4717E+08  1.1346E+08   2.8101E+07  3.2180E+03  1.6407E-01
          Min   3.2467E-02  4.5345E-08  1.1922E-01   3.5465E-05  1.2839E-04  1.3498E-32
          Mean  2.9011E+04  4.3873E+04  6.5529E+05   7.2504E+04  2.5055E+02  6.7357E-05
          Var   2.3526E+06  2.7453E+06  7.1864E+06   1.2814E+06  3.3099E+02  2.4333E-03
f16       Max   1.7692E+01  1.7142E+01  1.6087E+01   8.7570E+00  3.2180E+03  1.0959E+00
          Min   2.6210E-07  0.0000E+00  6.2663E-155  1.0497E-02  1.2839E-04  0
          Mean  1.0133E-02  3.9073E-01  5.9003E-01   2.4770E+00  2.5055E+02  1.5200E-04
          Var   3.0110E-01  9.6267E-01  1.4701E+00   1.9985E+00  3.3099E+02  1.1633E-02
f18       Max   4.4776E+01  4.3588E+01  3.9095E+01   1.7041E+01  3.2180E+03  6.5613E-01
          Min   1.9360E-01  9.4058E-08  2.2666E-01   3.9111E+00  1.2839E-04  1.4998E-32
          Mean  2.1563E-01  4.7800E-02  9.8357E-01   5.4021E+00  2.5055E+02  9.4518E-05
          Var   5.8130E-01  1.0434E+00  3.0643E+00   1.6674E+00  3.3099E+02  7.0251E-03

16 Computational Intelligence and Neuroscience

Table 10: Comparison of numerical testing results on CEC2014 test sets (F1–F15, Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F1
Max 2.7411E+08 3.6316E+09 6.8993E+08 3.9664E+08 9.6209E+08
Min 1.4794E+07 3.1738E+09 1.3205E+06 1.6929E+05 1.7687E+05
Mean 3.1107E+07 3.2020E+09 6.7081E+06 1.3567E+06 1.9320E+06
Var 1.2900E+07 4.0848E+07 2.6376E+07 1.0236E+07 1.3093E+07

F2
Max 8.7763E+09 1.3597E+10 1.3515E+10 1.6907E+10 1.7326E+10
Min 1.7455E+09 1.1878E+10 3.6982E+04 2.1268E+05 1.4074E+05
Mean 1.9206E+09 1.1951E+10 1.7449E+08 5.9354E+07 4.8412E+07
Var 1.9900E+08 1.5615E+08 9.2365E+08 5.1642E+08 4.2984E+08

F3
Max 4.8974E+05 8.5863E+04 2.6323E+06 2.4742E+06 5.3828E+06
Min 1.1067E+04 1.4616E+04 1.8878E+03 1.2967E+03 6.3986E+02
Mean 1.3286E+04 1.4976E+04 1.0451E+04 3.5184E+03 3.0848E+03
Var 6.5283E+03 1.9988E+03 4.5736E+04 4.1580E+04 8.0572E+04

F4
Max 3.5252E+03 9.7330E+03 4.5679E+03 4.9730E+03 4.3338E+03
Min 5.0446E+02 8.4482E+03 4.0222E+02 4.0843E+02 4.1267E+02
Mean 5.8061E+02 8.5355E+03 4.4199E+02 4.3908E+02 4.3621E+02
Var 1.4590E+02 1.3305E+02 1.6766E+02 1.2212E+02 8.5548E+01

F5
Max 5.2111E+02 5.2106E+02 5.2075E+02 5.2098E+02 5.2110E+02
Min 5.2001E+02 5.2038E+02 5.2027E+02 5.2006E+02 5.2007E+02
Mean 5.2003E+02 5.2041E+02 5.2033E+02 5.2014E+02 5.2014E+02
Var 7.8380E-02 5.7620E-02 5.6433E-02 1.0577E-01 9.9700E-02

F6
Max 6.1243E+02 6.1299E+02 6.1374E+02 6.1424E+02 6.1569E+02
Min 6.0881E+02 6.1157E+02 6.0514E+02 6.0288E+02 6.0257E+02
Mean 6.0904E+02 6.1164E+02 6.0604E+02 6.0401E+02 6.0358E+02
Var 2.5608E-01 1.2632E-01 1.0434E+00 1.1717E+00 1.3117E+00

F7
Max 8.7895E+02 1.0459E+03 9.1355E+02 1.0029E+03 9.3907E+02
Min 7.4203E+02 1.0238E+03 7.0013E+02 7.0081E+02 7.0069E+02
Mean 7.4322E+02 1.0253E+03 7.1184E+02 7.0332E+02 7.0290E+02
Var 4.3160E+00 2.4258E+00 2.8075E+01 1.5135E+01 1.3815E+01

F8
Max 8.9972E+02 9.1783E+02 9.6259E+02 9.2720E+02 9.3391E+02
Min 8.4904E+02 8.8172E+02 8.3615E+02 8.1042E+02 8.0877E+02
Mean 8.5087E+02 8.8406E+02 8.5213E+02 8.1888E+02 8.1676E+02
Var 2.1797E+00 4.4494E+00 8.6249E+00 9.2595E+00 9.7968E+00

F9
Max 1.0082E+03 9.9538E+02 1.0239E+03 1.0146E+03 1.0366E+03
Min 9.3598E+02 9.6805E+02 9.2725E+02 9.2062E+02 9.1537E+02
Mean 9.3902E+02 9.7034E+02 9.4290E+02 9.2754E+02 9.2335E+02
Var 3.4172E+00 3.2502E+00 8.1321E+00 9.0492E+00 9.3860E+00

F10
Max 3.1087E+03 3.5943E+03 3.9105E+03 3.9116E+03 3.4795E+03
Min 2.2958E+03 2.8792E+03 1.5807E+03 1.4802E+03 1.4316E+03
Mean 1.8668E+03 2.9172E+03 1.7627E+03 1.6336E+03 1.6111E+03
Var 3.2703E+01 6.1787E+01 2.2359E+02 1.9535E+02 2.2195E+02

F11
Max 3.6641E+03 3.4628E+03 4.0593E+03 3.5263E+03 3.7357E+03
Min 2.4726E+03 2.8210E+03 1.7855E+03 1.4012E+03 1.2811E+03
Mean 2.5553E+03 2.8614E+03 1.8532E+03 1.5790E+03 1.4869E+03
Var 8.4488E+01 5.6351E+01 1.3201E+02 2.1085E+02 2.6682E+02

F12
Max 1.2044E+03 1.2051E+03 1.2053E+03 1.2055E+03 1.2044E+03
Min 1.2006E+03 1.2014E+03 1.2004E+03 1.2003E+03 1.2017E+03
Mean 1.2007E+03 1.2017E+03 1.2007E+03 1.2003E+03 1.2018E+03
Var 1.4482E-01 3.5583E-01 2.5873E-01 2.4643E-01 2.1603E-01

F13
Max 1.3049E+03 1.3072E+03 1.3073E+03 1.3056E+03 1.3061E+03
Min 1.3005E+03 1.3067E+03 1.3009E+03 1.3003E+03 1.3004E+03
Mean 1.3006E+03 1.3068E+03 1.3011E+03 1.3005E+03 1.3006E+03
Var 3.2018E-01 5.0767E-02 4.2173E-01 3.6570E-01 2.9913E-01

F14
Max 1.4395E+03 1.4565E+03 1.4775E+03 1.4749E+03 1.4493E+03
Min 1.4067E+03 1.4522E+03 1.4009E+03 1.4003E+03 1.4005E+03
Mean 1.4079E+03 1.4529E+03 1.4024E+03 1.4008E+03 1.4009E+03
Var 2.1699E+00 6.3013E-01 5.3198E+00 3.2578E+00 2.8527E+00

F15
Max 5.4068E+04 3.3586E+04 4.8370E+05 4.0007E+05 1.9050E+05
Min 1.9611E+03 2.1347E+04 1.5029E+03 1.5027E+03 1.5026E+03
Mean 1.9933E+03 2.2417E+04 2.7920E+03 1.5860E+03 1.5816E+03
Var 6.1622E+02 1.2832E+03 1.7802E+04 4.4091E+03 2.7233E+03


Table 11: Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F16
Max 1.6044E+03 1.6044E+03 1.6046E+03 1.6044E+03 1.6046E+03
Min 1.6034E+03 1.6041E+03 1.6028E+03 1.6021E+03 1.6019E+03
Mean 1.6034E+03 1.6041E+03 1.6032E+03 1.6024E+03 1.6024E+03
Var 3.3300E-02 3.5900E-02 2.5510E-01 3.7540E-01 3.6883E-01

F17
Max 1.3711E+07 1.6481E+06 3.3071E+07 4.6525E+07 8.4770E+06
Min 1.2526E+05 4.7216E+05 2.7261E+03 2.0177E+03 2.0097E+03
Mean 1.6342E+05 4.8499E+05 8.7769E+04 3.1084E+04 2.1095E+04
Var 2.0194E+05 2.9949E+04 1.2284E+06 8.7638E+05 2.0329E+05

F18
Max 7.7173E+07 3.5371E+07 6.1684E+08 1.4216E+08 6.6050E+08
Min 4.1934E+03 2.6103E+06 2.2743E+03 1.8192E+03 1.8288E+03
Mean 1.6509E+04 3.4523E+06 1.2227E+06 1.0781E+05 1.9475E+05
Var 7.9108E+05 1.4888E+06 2.1334E+07 3.6818E+06 1.0139E+07

F19
Max 1.9851E+03 2.5657E+03 2.0875E+03 1.9872E+03 1.9555E+03
Min 1.9292E+03 2.4816E+03 1.9027E+03 1.9023E+03 1.9028E+03
Mean 1.9299E+03 2.4834E+03 1.9044E+03 1.9032E+03 1.9036E+03
Var 1.0820E+00 3.8009E+00 1.1111E+01 3.3514E+00 1.4209E+00

F20
Max 8.3021E+06 2.0160E+07 1.0350E+07 6.1162E+07 1.2708E+08
Min 5.6288E+03 1.7838E+06 2.1570E+03 2.0408E+03 2.0241E+03
Mean 1.2260E+04 1.8138E+06 5.6957E+03 1.1337E+04 4.5834E+04
Var 1.0918E+05 5.9134E+05 1.3819E+05 6.4167E+05 2.1988E+06

F21
Max 2.4495E+06 1.7278E+09 2.0322E+06 1.3473E+07 2.6897E+07
Min 4.9016E+03 1.4049E+09 3.3699E+03 2.1842E+03 2.2314E+03
Mean 6.6613E+03 1.4153E+09 9.9472E+03 1.3972E+04 7.4587E+03
Var 4.1702E+04 2.2557E+07 3.3942E+04 2.5098E+05 2.8735E+05

F22
Max 2.8304E+03 4.9894E+03 3.1817E+03 3.1865E+03 3.2211E+03
Min 2.5070E+03 4.1011E+03 2.3492E+03 2.2442E+03 2.2314E+03
Mean 2.5081E+03 4.1175E+03 2.3694E+03 2.2962E+03 2.2687E+03
Var 5.4064E+00 3.8524E+01 4.3029E+01 4.8006E+01 5.1234E+01

F23
Max 2.8890E+03 2.6834E+03 2.8758E+03 3.0065E+03 2.9923E+03
Min 2.5000E+03 2.6031E+03 2.4870E+03 2.5000E+03 2.5000E+03
Mean 2.5004E+03 2.6088E+03 2.5326E+03 2.5010E+03 2.5015E+03
Var 9.9486E+00 8.6432E+00 5.7045E+01 1.4213E+01 1.7156E+01

F24
Max 2.6293E+03 2.6074E+03 2.6565E+03 2.6491E+03 2.6369E+03
Min 2.5816E+03 2.6049E+03 2.5557E+03 2.5246E+03 2.5251E+03
Mean 2.5829E+03 2.6052E+03 2.5671E+03 2.5337E+03 2.5338E+03
Var 1.3640E+00 3.3490E-01 8.9434E+00 1.3050E+01 1.1715E+01

F25
Max 2.7133E+03 2.7014E+03 2.7445E+03 2.7122E+03 2.7327E+03
Min 2.6996E+03 2.7003E+03 2.6784E+03 2.6789E+03 2.6635E+03
Mean 2.6996E+03 2.7004E+03 2.6831E+03 2.6903E+03 2.6894E+03
Var 2.3283E-01 9.9333E-02 4.9609E+00 7.1175E+00 1.2571E+01

F26
Max 2.7056E+03 2.8003E+03 2.7058E+03 2.7447E+03 2.7116E+03
Min 2.7008E+03 2.8000E+03 2.7003E+03 2.7002E+03 2.7002E+03
Mean 2.7010E+03 2.8000E+03 2.7005E+03 2.7003E+03 2.7003E+03
Var 3.1316E-01 1.6500E-02 3.9700E-01 1.1168E+00 3.5003E-01

F27
Max 3.4052E+03 5.7614E+03 3.3229E+03 3.4188E+03 3.4144E+03
Min 2.9000E+03 3.9113E+03 2.9698E+03 2.8347E+03 2.7054E+03
Mean 2.9038E+03 4.0351E+03 2.9816E+03 2.8668E+03 2.7219E+03
Var 3.2449E+01 2.0673E+02 3.4400E+01 6.2696E+01 6.1325E+01

F28
Max 4.4333E+03 5.4138E+03 4.5480E+03 4.3490E+03 4.8154E+03
Min 3.0000E+03 4.0092E+03 3.4908E+03 3.0000E+03 3.0000E+03
Mean 3.0079E+03 4.0606E+03 3.6004E+03 3.0063E+03 3.0065E+03
Var 6.8101E+01 9.2507E+01 8.5705E+01 4.2080E+01 5.3483E+01

F29
Max 2.5038E+07 1.6181E+08 5.9096E+07 7.0928E+07 6.4392E+07
Min 3.2066E+03 1.0014E+08 3.1755E+03 3.1005E+03 3.3287E+03
Mean 5.7057E+04 1.0476E+08 8.5591E+04 6.8388E+04 2.8663E+04
Var 4.9839E+05 6.7995E+06 1.7272E+06 1.5808E+06 8.6740E+05


Figure 3: The effect of the two parameters on the performance of RF models: (a) the effect of ntree (classification accuracy vs. ntree, 0 to 1000); (b) the effect of mtry (classification accuracy vs. mtry, 0 to 35).

Variable setting: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA.
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF.
(1) Begin
(2) /* Build classification model based on DMSDL-QBSA-RF */
(3) Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4) Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5) While iter < itermax + 1 do
(6)   For i = 1 : ntree
(7)     Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)     Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)     Recursively generate each decision tree without pruning operations
(10)  End For
(11)  Update the classification accuracy of RF: evaluate f(Xi)
(12)  Update gbest and Pgbest
(13)  [nbest, mbest] = gbest
(14)  iter = iter + 1
(15) End While
(16) /* Classify using the RF model */
(17) For i = 1 : nbest
(18)   Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)   Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)   Recursively generate each decision tree without pruning operations
(21) End
(22) Return the DMSDL-QBSA-RF classification model
(23) Classify the test dataset using equation (20)
(24) Calculate OOB_error
(25) End

Algorithm 2: DMSDL-QBSA-RF classification model.
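As a rough illustration (not the paper's own implementation), the bootstrap sampling of steps (7) and (18) and the out-of-bag error of step (24) can be reproduced with scikit-learn's RandomForestClassifier; the synthetic dataset below is an assumption standing in for the real logging data:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# Synthetic binary-labeled data as a stand-in for the well-logging dataset.
X, y = make_classification(n_samples=400, n_features=10, random_state=0)

# bootstrap=True draws each tree's training set by sampling with replacement,
# and oob_score=True scores every sample only on the trees that did not see
# it, which yields the out-of-bag (OOB) error used in Algorithm 2.
rf = RandomForestClassifier(n_estimators=100, max_features=3,
                            bootstrap=True, oob_score=True, random_state=0)
rf.fit(X, y)
oob_error = 1.0 - rf.oob_score_
print(round(oob_error, 3))
```

The OOB error is a nearly free generalization estimate, which is why Algorithm 2 can report it without a separate validation split.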

Table 11: Continued.

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

F30
Max 5.2623E+05 2.8922E+07 1.1938E+06 1.2245E+06 1.1393E+06
Min 5.9886E+03 1.9017E+07 3.7874E+03 3.6290E+03 3.6416E+03
Mean 7.4148E+03 2.0002E+07 5.5468E+03 4.3605E+03 4.1746E+03
Var 1.1434E+04 1.1968E+06 3.2255E+04 2.2987E+04 1.8202E+04


Figure 4: Block diagram of the oil layer classification system based on DMSDL-QBSA-RF (data selection and preprocessing, removal of redundant attributes and pretreatment, attribute discretization, attribute reduction and data normalization, model building based on DMSDL-QBSA-RF, RF classification of the information to be identified, and output).

Table 13: Distribution of the logging data.

Well | Training dataset: Depth (m), Data, Oil/gas layers, Dry layer | Test dataset: Depth (m), Data, Oil/gas layers, Dry layer
W1 | 3027~3058, 250, 203, 47 | 3250~3450, 1600, 237, 1363
W2 | 2642~2648, 30, 10, 20 | 2940~2978, 191, 99, 92
W3 | 1040~1059, 160, 47, 113 | 1120~1290, 1114, 96, 1018

Table 12: Comparison of numerical testing results on eight UCI datasets.

Dataset | RF (%) | BSA-RF (%) | DMSDL-BSA-RF (%) | DMSDL-QBSA-RF (%)
Blood | 75.40 | 80.80 | 79.46 | 81.25
Heart-statlog | 82.10 | 86.42 | 86.42 | 86.42
Sonar | 78.39 | 85.48 | 85.48 | 88.71
Appendicitis | 83.23 | 87.10 | 90.32 | 93.55
Cleve | 79.55 | 87.50 | 88.64 | 89.77
Magic | 88.95 | 89.85 | 90.25 | 89.32
Mammographic | 74.73 | 77.68 | 76.79 | 79.46
Australian | 85.83 | 88.84 | 88.84 | 90.78

Table 14: Attribute reduction results of the logging dataset.

W1
Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
Reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI

W2
Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
Reduction attributes: NEUTRO, PERM, POR, RESI, SW

W3
Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
Reduction attributes: AC, GR, RI, RXO, SP

Figure 5: The normalized curves of attributes: (a) and (b) attribute normalization of W1; (c) and (d) attribute normalization of W2; (e) and (f) attribute normalization of W3.

Table 15: Performance of various well data.

Well | Classification model | RMSE | MAE | Accuracy (%) | Running time (s)

W1
RF | 0.3326 | 0.1106 | 88.94 | 1.5167
SVM | 0.2681 | 0.0719 | 92.81 | 4.5861
BSA-RF | 0.3269 | 0.1069 | 89.31 | 1.8064
DMSDL-BSA-RF | 0.2806 | 0.0788 | 92.13 | 2.3728
DMSDL-QBSA-RF | 0.2449 | 0.0600 | 94.00 | 1.4663

W2
RF | 0.4219 | 0.1780 | 82.20 | 3.1579
SVM | 0.2983 | 0.0890 | 91.10 | 4.2604
BSA-RF | 0.3963 | 0.1571 | 84.29 | 1.2979
DMSDL-BSA-RF | 0.2506 | 0.062827 | 93.72 | 1.6124
DMSDL-QBSA-RF | 0.2399 | 0.0576 | 94.24 | 1.2287

W3
RF | 0.4028 | 0.1622 | 83.78 | 2.4971
SVM | 0.2507 | 0.0628 | 93.72 | 2.1027
BSA-RF | 0.3631 | 0.1318 | 86.81 | 1.3791
DMSDL-BSA-RF | 0.2341 | 0.0548 | 94.52 | 0.3125
DMSDL-QBSA-RF | 0.0519 | 0.0027 | 99.73 | 0.9513



5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of these three strategies on the performance of DMSDL-QBSA revealed that the hybrid method yields a clear improvement over the original BSA, and especially over the original DE. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and DMSDL-QBSA also shows excellent performance on these functions. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than that of the original RF model. At the same time, it runs faster than the other four advanced classification models on most wells.

Although the proposed DMSDL-QBSA has been proven to be effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Because it hybridizes three strategies, DMSDL-QBSA needs more time than the classical BSA, so deploying the proposed algorithm with higher recognition efficiency is a worthwhile direction. In future research, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical science diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and Project of Industry-University Cooperative Education of Ministry of Education of China (no. 201801335014).

Figure 6: Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3.




ntree decision trees, each of which consists of nonleaf nodes and leaf nodes. A leaf node is a child node of a node branch. Suppose that the dataset has M attributes. When a leaf node of a decision tree needs to be segmented, mtry attributes are randomly selected from the M attributes as the reselected splitting variables of this node. This process can be defined as follows:

S_j = {P_i | i = 1, 2, ..., mtry},   (18)

where S_j is the splitting variable of the j-th leaf node of the decision tree and P_i is the probability that the mtry reselected attributes are selected as the splitting attribute of the node.

A nonleaf node is a parent node that classifies training data into a left or right child node. The decision function of the k-th decision tree is as follows:

h_k(c | x) = 0, if f_l(x_k) < f_l(x_k) + τ; 1, if f_l(x_k) ≥ f_l(x_k) + τ, l = 1, 2, ..., ntree,   (19)

where c = 0 or 1; the symbol 0 indicates that the k-th row of data is classified with a negative label, and the symbol 1 indicates that the k-th row of data is classified with a positive label. Here, f_l is the training function of the l-th decision tree based on the splitting variable S, x_k is the k-th row of data in the dataset obtained by random sampling with replacement, and τ is a positive constant used as the threshold value of the training decision.

When the decision trees are trained, each row of data is input into a leaf node of each decision tree, and the average of the ntree decision tree classification results is used as the final classification result. This process can be written mathematically as follows:

c_k = l × (1 / ntree) × Σ_{k=1}^{ntree} h_k(c | x),   (20)

where l is the number of decision trees which judged the k-th row of data as c.

From the above principle, we can see that it is mainly necessary to determine the two parameters ntree and mtry in the RF modeling process. In order to verify the influence of these two parameters on the classification accuracy of the RF classification model, the Ionosphere dataset is used to test the influence of the two parameters on the performance of

Table 4: Comparison on nine unimodal functions with 5 hybrid algorithms (Dim = 10).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

f1
Max 1.0317E+04 1.2466E+04 1.7232E+04 1.0874E+04 7.4436E+01
Min 0 4.0214E+03 1.6580E-02 0 0
Mean 6.6202E+00 4.6622E+03 6.0643E+01 6.3444E+00 3.3920E-02
Var 2.0892E+02 8.2293E+02 7.3868E+02 1.9784E+02 1.2343E+00

f2
Max 3.3278E+02 1.8153E+02 8.0436E+01 4.9554E+02 2.9845E+01
Min 4.9286E-182 1.5889E+01 1.0536E-01 3.0074E-182 0
Mean 5.7340E-02 1.8349E+01 2.9779E+00 6.9700E-02 1.1220E-02
Var 3.5768E+00 4.8296E+00 2.4966E+00 5.1243E+00 4.2864E-01

f3
Max 1.3078E+04 1.3949E+04 1.9382E+04 1.2899E+04 8.4935E+01
Min 3.4873E-250 4.0327E+03 6.5860E-02 1.6352E-249 0
Mean 7.6735E+00 4.6130E+03 8.6149E+01 7.5623E+00 3.3260E-02
Var 2.4929E+02 8.4876E+02 8.1698E+02 2.4169E+02 1.2827E+00

f4
Max 2.5311E+09 2.3900E+09 4.8639E+09 3.7041E+09 3.5739E+08
Min 5.2310E+00 2.7690E+08 8.4802E+00 5.0021E+00 8.9799E+00
Mean 6.9192E+05 3.3334E+08 1.1841E+07 9.9162E+05 6.8518E+04
Var 3.6005E+07 1.4428E+08 1.8149E+08 4.5261E+07 4.0563E+06

f5
Max 1.1619E+04 1.3773E+04 1.6188E+04 1.3194E+04 5.1960E+03
Min 5.5043E-15 5.6109E+03 1.1894E-02 4.2090E-15 1.5157E+00
Mean 5.9547E+00 6.3278E+03 5.2064E+01 6.5198E+00 3.5440E+00
Var 2.0533E+02 9.6605E+02 6.2095E+02 2.2457E+02 7.8983E+01

f6
Max 3.2456E+00 7.3566E+00 8.9320E+00 2.8822E+00 1.4244E+00
Min 1.3994E-04 1.2186E+00 2.2482E-03 8.2911E-05 1.0911E-05
Mean 2.1509E-03 1.4021E+00 1.1982E-01 1.9200E-03 6.1476E-04
Var 5.3780E-02 3.8482E-01 3.5554E-01 5.0940E-02 1.9880E-02

f7
Max 4.7215E+02 6.7534E+02 5.6753E+02 5.3090E+02 2.3468E+02
Min 0 2.2001E+02 5.6300E-02 0 0
Mean 2.4908E-01 2.3377E+02 9.2909E+00 3.0558E-01 9.4500E-02
Var 8.5433E+00 3.3856E+01 2.2424E+01 1.0089E+01 3.5569E+00

f8
Max 3.2500E+02 2.4690E+02 2.7226E+02 2.8001E+02 1.7249E+02
Min 1.4678E-239 8.3483E+01 5.9820E-02 8.9624E-239 0
Mean 1.9072E-01 9.1050E+01 7.9923E+00 2.3232E-01 8.1580E-02
Var 6.3211E+00 1.3811E+01 1.7349E+01 6.4400E+00 2.9531E+00


the RF model, as shown in Figure 3, where the horizontal axes represent ntree and mtry, respectively, and the vertical axis represents the accuracy of the RF classification model.

(1) Parameter analysis of ntree

When the number of predictor variables mtry is set to 6, the number of decision trees ntree is cyclically set from 0 to 1000 at intervals of 20, and the evolution of the RF classification model accuracy with ntree is shown in Figure 3(a). From the curve in Figure 3(a), we can see that the accuracy of RF gradually improves as the number of decision trees increases. However, once ntree exceeds a certain value, the improvement of RF performance flattens out with no obvious gains, while the running time becomes longer.

(2) Parameter analysis of mtry

When the number of decision trees ntree is set to 500, the number of predictor variables mtry is cyclically set from 1 to 32; the limit of mtry is set to 32 because the Ionosphere dataset has 32 attributes. The obtained curve of RF classification model accuracy with mtry is shown in Figure 3(b). We can see that the classification performance of RF gradually improves as more splitting attributes are selected, but when mtry is greater than 9, RF overfits and its accuracy begins to decrease. The main reason is that selecting too many split attributes causes a large number of decision trees to share the same splitting attributes, which reduces the diversity of the decision trees.

Table 5: Comparison on nine unimodal functions with hybrid algorithms (Dim = 2).

Function Term BSA DE DMSDL-PSO DMSDL-BSA DMSDL-QBSA

f1
Max 1.8411E+02 3.4713E+02 2.9347E+02 1.6918E+02 1.6789E+00
Min 0 5.7879E-01 0 0 3.2988E-238
Mean 5.5890E-02 1.1150E+00 1.6095E-01 3.6580E-02 4.2590E-03
Var 2.6628E+00 5.7218E+00 5.4561E+00 2.0867E+00 7.5858E-02

f2
Max 2.2980E+00 2.1935E+00 3.2363E+00 3.1492E+00 1.1020E+00
Min 5.5923E-266 8.2690E-02 2.9096E-242 3.4367E-241 0
Mean 9.2769E-04 9.3960E-02 7.4900E-03 1.2045E-03 2.1565E-04
Var 3.3310E-02 6.9080E-02 4.0100E-02 4.2190E-02 1.3130E-02

f3
Max 1.0647E+02 1.3245E+02 3.6203E+02 2.3793E+02 1.3089E+02
Min 0 5.9950E-01 0 0 0
Mean 2.3040E-02 8.5959E-01 2.5020E-01 6.3560E-02 1.7170E-02
Var 1.2892E+00 2.8747E+00 7.5569E+00 2.9203E+00 1.3518E+00

f4
Max 1.7097E+02 6.1375E+01 6.8210E+01 5.9141E+01 1.4726E+01
Min 1.6325E-21 4.0940E-02 8.2726E-13 3.4830E-25 0
Mean 2.2480E-02 9.3940E-02 1.5730E-02 1.1020E-02 1.4308E-02
Var 1.7987E+00 7.4859E-01 7.6015E-01 6.4984E-01 2.7598E-01

f5
Max 1.5719E+02 2.2513E+02 3.3938E+02 1.8946E+02 8.7078E+01
Min 0 7.0367E-01 0 0 0
Mean 3.4380E-02 1.8850E+00 1.7082E-01 5.0090E-02 1.1880E-02
Var 1.9018E+00 5.6163E+00 5.9868E+00 2.4994E+00 9.1749E-01

f6
Max 1.5887E-01 1.5649E-01 1.5919E-01 1.3461E-01 1.0139E-01
Min 2.5412E-05 4.5060E-04 5.9140E-05 4.1588E-05 7.3524E-06
Mean 2.3437E-04 1.3328E-03 6.0989E-04 2.3462E-04 9.2394E-05
Var 2.4301E-03 3.6700E-03 3.5200E-03 1.9117E-03 1.4664E-03

f7
Max 3.5804E+00 2.8236E+00 1.7372E+00 2.7513E+00 1.9411E+00
Min 0 7.6633E-03 0 0 0
Mean 8.5474E-04 1.6590E-02 8.6701E-04 7.6781E-04 3.1439E-04
Var 4.4630E-02 6.0390E-02 2.4090E-02 3.7520E-02 2.2333E-02

f8
Max 4.3247E+00 2.1924E+00 5.3555E+00 3.3944E+00 5.5079E-01
Min 0 8.6132E-03 0 0 0
Mean 1.1649E-03 1.9330E-02 1.7145E-03 7.3418E-04 6.9138E-05
Var 5.9280E-02 4.7800E-02 7.5810E-02 4.1414E-02 5.7489E-03

f9
Max 2.7030E-02 3.5200E-02 1.7240E-02 4.0480E-02 2.5230E-02
Min 0 5.0732E-03 0 0 0
Mean 6.1701E-05 6.2500E-03 8.8947E-04 8.4870E-05 2.7362E-05
Var 7.6990E-04 1.3062E-03 2.0400E-03 9.6160E-04 5.5610E-04

12 Computational Intelligence and Neuroscience

In summary, for the RF classification model to obtain the ideal optimal solution, the selection of the number of decision trees ntree and the number of predictor variables mtry is very important, and the classification accuracy of the RF classification model can only be optimized by tuning these two parameters jointly. So it is necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we optimize the RF classification model with the improved BSA proposed in Section 2.
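As a point of reference for why a search procedure is needed at all, the naive alternative is to score every (ntree, mtry) pair exhaustively. The sketch below (plain Python; the `evaluate` callback standing in for an RF accuracy measurement is hypothetical) illustrates the two-dimensional search space that DMSDL-QBSA explores far more cheaply than a full grid.

```python
import itertools

def exhaustive_search(evaluate, ntree_grid, mtry_grid):
    """Score every (ntree, mtry) pair and keep the best.

    `evaluate(ntree, mtry)` is a hypothetical callback returning a
    classification accuracy in [0, 1]; in the paper, this role is played
    by the RF model's accuracy for a given parameter pair."""
    best_pair, best_acc = None, -1.0
    for ntree, mtry in itertools.product(ntree_grid, mtry_grid):
        acc = evaluate(ntree, mtry)
        if acc > best_acc:
            best_pair, best_acc = (ntree, mtry), acc
    return best_pair, best_acc

# Toy accuracy surface peaking at ntree = 500, mtry = 9 (mirroring Figure 3).
toy = lambda n, m: 1.0 - abs(n - 500) / 1000 - abs(m - 9) / 32
print(exhaustive_search(toy, range(100, 1001, 100), range(1, 33)))  # → ((500, 9), 1.0)
```

The grid above already costs 10 × 32 = 320 model evaluations; a swarm with a modest population reaches a comparable optimum with far fewer, which is the motivation for the DMSDL-QBSA-driven search in the next subsection.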

3.2. RF Model Based on an Improved Bird Swarm Algorithm. The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) uses the improved bird swarm algorithm to optimize the RF classification model and introduces the training dataset into the training process of the RF classification model, finally yielding the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters, ntree and mtry, as the optimization target of DMSDL-QBSA, so as to obtain the pair of parameters that gives the RF classification model the best classification accuracy. The specific algorithm steps are shown in Algorithm 2.
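A minimal sketch of such a two-dimensional fitness function follows (Python; the `rf_accuracy` callback is a hypothetical stand-in for training an RF with the given parameters and returning its accuracy). The swarm's continuous position vector is rounded and clamped into valid integer (ntree, mtry) values before evaluation.

```python
def rf_fitness(position, n_attributes, rf_accuracy):
    """Map a bird's 2-D position to valid RF parameters and score it.

    position     -- (ntree, mtry) as continuous swarm coordinates
    n_attributes -- upper bound for mtry (number of dataset attributes)
    rf_accuracy  -- hypothetical callback: accuracy of an RF built with
                    the given integer parameters
    """
    ntree = max(1, int(round(position[0])))
    mtry = min(max(1, int(round(position[1]))), n_attributes)
    return rf_accuracy(ntree, mtry)

# A bird sitting at (499.6, 9.4) is evaluated as ntree = 500, mtry = 9.
print(rf_fitness((499.6, 9.4), 32, lambda n, m: (n, m)))  # → (500, 9)
```

Clamping mtry to the attribute count mirrors the mtry ≤ 32 bound used for the Ionosphere dataset in Figure 3(b).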

3.3. Simulation Experiment and Analysis. In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare it with the standard RF model, the BSA-RF model, and the DMSDL-BSA-RF model on 8 two-dimensional UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by the improved BSA without quantum behavior. In our experiment, each dataset is divided into two parts: 70% of the dataset is used as the training set and the remaining 30% as the test set. The average classification accuracies of 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.
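The evaluation protocol can be sketched as repeated random 70/30 holdout (Python; the majority-vote `classify` baseline is only a placeholder for the real models, and the exact splitting procedure is an assumption since the paper does not specify it):

```python
import random

def average_accuracy(labels, classify, runs=10, train_frac=0.7, seed=1):
    """Mean test accuracy over `runs` independent random 70/30 splits.

    `classify` maps the training labels to one predicted label for every
    test item -- a majority-vote baseline stands in here for the
    RF-based models compared in Table 12."""
    rng = random.Random(seed)
    n = len(labels)
    accs = []
    for _ in range(runs):
        idx = list(range(n))
        rng.shuffle(idx)
        cut = int(train_frac * n)
        train, test = idx[:cut], idx[cut:]
        pred = classify([labels[i] for i in train])
        accs.append(sum(labels[i] == pred for i in test) / len(test))
    return sum(accs) / runs

majority = lambda ys: max(set(ys), key=ys.count)
# 70 positive / 30 negative samples: the baseline lands near 0.7 accuracy.
print(average_accuracy([1] * 70 + [0] * 30, majority))
```

Averaging over independent splits, as in Table 12, reduces the variance that a single lucky or unlucky split would introduce.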

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model obtains the best accuracy on each UCI dataset except the magic dataset, on which the DMSDL-BSA-RF classification model performs best. Compared with the standard RF model, the accuracy of the DMSDL-QBSA-RF classification model is better, increased by about 10%. Finally, the

Table 6. Comparison on nine multimodal functions with hybrid algorithms (Dim = 10).

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
f10       Max   2.8498E+03   2.8226E+03   3.0564E+03   2.7739E+03   2.8795E+03
          Min   1.2553E+02   1.8214E+03   1.2922E+03   1.6446E+02   1.1634E+03
          Mean  2.5861E+02   1.9229E+03   1.3185E+03   3.1119E+02   1.2729E+03
          Var   2.4093E+02   1.2066E+02   1.2663E+02   2.5060E+02   1.2998E+02
f11       Max   1.2550E+02   1.0899E+02   1.1806E+02   1.1243E+02   9.1376E+01
          Min   0            6.3502E+01   1.0751E+01   0            0
          Mean  2.0417E-01   6.7394E+01   3.9864E+01   1.3732E-01   6.8060E-02
          Var   3.5886E+00   5.8621E+00   1.3570E+01   3.0325E+00   2.0567E+00
f12       Max   2.0021E+01   1.9910E+01   1.9748E+01   1.9254E+01   1.8118E+01
          Min   8.8818E-16   1.6575E+01   7.1700E-02   8.8818E-16   8.8818E-16
          Mean  3.0500E-02   1.7157E+01   3.0367E+00   3.8520E-02   1.3420E-02
          Var   5.8820E-01   5.2968E-01   1.6585E+00   6.4822E-01   4.2888E-01
f13       Max   1.0431E+02   1.3266E+02   1.5115E+02   1.2017E+02   6.1996E+01
          Min   0            4.5742E+01   2.1198E-01   0            0
          Mean  6.1050E-02   5.2056E+01   3.0613E+00   6.9340E-02   2.9700E-02
          Var   1.8258E+00   8.3141E+00   1.5058E+01   2.2452E+00   1.0425E+00
f14       Max   8.4576E+06   3.0442E+07   5.3508E+07   6.2509E+07   8.5231E+06
          Min   1.7658E-13   1.9816E+06   4.5685E-05   1.6961E-13   5.1104E-01
          Mean  1.3266E+03   3.1857E+06   6.8165E+04   8.8667E+03   1.1326E+03
          Var   9.7405E+04   1.4876E+06   1.4622E+06   6.4328E+05   8.7645E+04
f15       Max   1.8310E+08   1.4389E+08   1.8502E+08   1.4578E+08   2.5680E+07
          Min   1.7942E-11   1.0497E+07   2.4500E-03   1.1248E-11   9.9870E-01
          Mean  3.7089E+04   1.5974E+07   2.0226E+05   3.8852E+04   3.5739E+03
          Var   2.0633E+06   1.0724E+07   4.6539E+06   2.1133E+06   2.6488E+05
f16       Max   1.3876E+01   1.4988E+01   1.4849E+01   1.3506E+01   9.3280E+00
          Min   4.2410E-174  6.8743E+00   2.5133E-02   7.3524E-176  0
          Mean  1.3633E-02   7.2408E+00   2.5045E+00   1.3900E-02   5.1800E-03
          Var   3.3567E-01   7.7774E-01   1.0219E+00   3.4678E-01   1.7952E-01
f18       Max   3.6704E+01   3.6950E+01   2.8458E+01   2.6869E+01   2.4435E+01
          Min   2.0914E-11   9.7737E+00   3.3997E-03   5.9165E-12   7.5806E-01
          Mean  6.5733E-02   1.2351E+01   6.7478E-01   5.6520E-02   7.9392E-01
          Var   6.7543E-01   2.8057E+00   1.4666E+00   6.4874E-01   3.6928E-01


DMSDL-QBSA-RF classification model achieves its best accuracy on the appendicitis dataset, up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is effective on most datasets and performs well on them.

4 Oil Layer Classification Application

4.1. Design of Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps.

Step 1. The selected actual logging datasets are intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis and relatively independent. Each dataset is randomly divided into two parts: training and testing samples.

Step 2. In order to better understand the relationship between the independent and dependent variables and to reduce the sample information attributes, the continuous attributes of the dataset are discretized using a greedy algorithm.

Step 3. In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize attribute reduction. After attribute reduction, the actual logging datasets are normalized to avoid computational saturation.

Step 4. In the DMSDL-QBSA-RF layer classification model, we input the actual logging dataset after attribute reduction, train with the DMSDL-QBSA-RF layer classification algorithm, and finally obtain the DMSDL-QBSA-RF layer classification model.
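Step 3's normalization can be sketched as a simple min-max scaling to [0, 1] (an assumption — the paper does not state the exact normalization formula, but Figure 5 plots all attributes in [0, 1]):

```python
def min_max_normalize(values):
    """Scale one logging-attribute column into [0, 1] to avoid
    computational saturation (Step 3); constant columns map to 0."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0] * len(values)
    return [(v - lo) / (hi - lo) for v in values]

print(min_max_normalize([2.0, 4.0, 6.0]))  # → [0.0, 0.5, 1.0]
```

Each attribute (e.g., GR, DT, SP in W1) would be scaled independently, so attributes with very different physical units become comparable inputs for the RF model.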

Table 7. Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
f10       Max   2.0101E+02   2.8292E+02   2.9899E+02   2.4244E+02   2.8533E+02
          Min   2.5455E-05   3.7717E+00   2.3690E+01   2.5455E-05   9.4751E+01
          Mean  3.4882E-01   8.0980E+00   2.5222E+01   4.2346E-01   9.4816E+01
          Var   5.7922E+00   1.0853E+01   1.5533E+01   5.6138E+00   2.8160E+00
f11       Max   4.7662E+00   4.7784E+00   1.1067E+01   8.7792E+00   8.1665E+00
          Min   0            3.8174E-01   0            0            0
          Mean  2.7200E-03   6.0050E-01   3.1540E-02   4.2587E-03   3.7800E-03
          Var   8.6860E-02   3.1980E-01   2.4862E-01   1.2032E-01   1.3420E-01
f12       Max   9.6893E+00   8.1811E+00   1.1635E+01   9.1576E+00   8.4720E+00
          Min   8.8818E-16   5.1646E-01   8.8818E-16   8.8818E-16   8.8818E-16
          Mean  9.5600E-03   6.9734E-01   3.4540E-02   9.9600E-03   2.8548E-03
          Var   2.1936E-01   6.1050E-01   2.5816E-01   2.1556E-01   1.1804E-01
f13       Max   4.4609E+00   4.9215E+00   4.1160E+00   1.9020E+00   1.6875E+00
          Min   0            1.3718E-01   0            0            0
          Mean  1.9200E-03   1.7032E-01   1.8240E-02   1.4800E-03   5.7618E-04
          Var   6.6900E-02   1.3032E-01   1.8202E-01   3.3900E-02   2.2360E-02
f14       Max   1.0045E+01   1.9266E+03   1.9212E+01   5.7939E+02   8.2650E+00
          Min   2.3558E-31   1.3188E-01   2.3558E-31   2.3558E-31   2.3558E-31
          Mean  4.1600E-03   3.5402E-01   1.0840E-02   6.1420E-02   1.3924E-03
          Var   1.7174E-01   1.9427E+01   3.9528E-01   5.8445E+00   8.7160E-02
f15       Max   6.5797E+04   4.4041E+03   1.4412E+05   8.6107E+03   2.6372E+00
          Min   1.3498E-31   9.1580E-02   1.3498E-31   1.3498E-31   7.7800E-03
          Mean  7.1736E+00   8.9370E-01   1.7440E+01   9.0066E-01   8.2551E-03
          Var   6.7678E+02   5.4800E+01   1.4742E+03   8.7683E+01   2.8820E-02
f16       Max   6.2468E-01   6.4488E-01   5.1564E-01   8.4452E-01   3.9560E-01
          Min   6.9981E-08   2.5000E-03   1.5518E-240  2.7655E-07   0
          Mean  2.7062E-04   6.9400E-03   6.8555E-04   2.0497E-04   6.1996E-05
          Var   1.0380E-02   1.7520E-02   8.4600E-03   1.0140E-02   4.4000E-03
f17       Max   5.1946E+00   3.6014E+00   2.3463E+00   6.9106E+00   1.2521E+00
          Min   2.6445E-11   2.6739E-02   0            1.0855E-10   0
          Mean  1.9343E-03   5.1800E-02   1.2245E-03   2.8193E-03   1.5138E-04
          Var   7.3540E-02   1.2590E-01   4.1620E-02   1.1506E-01   1.2699E-02
f18       Max   5.0214E-01   3.4034E-01   4.1400E-01   3.7422E-01   4.0295E-01
          Min   1.4998E-32   1.9167E-03   1.4998E-32   1.4998E-32   1.4998E-32
          Mean  1.0967E-04   4.1000E-03   1.8998E-04   1.4147E-04   6.0718E-05
          Var   6.1800E-03   1.0500E-02   6.5200E-03   5.7200E-03   4.4014E-03


Step 5. The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and we output the classification results.

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA was simulated and analyzed on benchmark functions, and in Section 3.3, the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA was tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted, recorded as W1, W2, and W3. W1 is a gas well in Xi'an (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depths and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before training the DMSDL-QBSA-RF classification model on the training dataset, as shown in Table 14. Then these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, the SVM model, the BSA-RF model, and the DMSDL-BSA-RF model. Here, the RF classification model is applied to the field of logging for the first time. In order to evaluate the performance of the recognition models, we select the following performance indicators:

\mathrm{RMSE} = \sqrt{\frac{1}{N} \sum_{i=1}^{N} \left( f_i - y_i \right)^2},

\mathrm{MAE} = \frac{1}{N} \sum_{i=1}^{N} \left| f_i - y_i \right|.          (21)
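Equation (21) translates directly into code; a minimal sketch in plain Python, with hypothetical binary labels (0 = no oil/gas at that depth, 1 = oil/gas present):

```python
import math

def rmse(f, y):
    """Root-mean-square error between expected outputs f and model outputs y."""
    return math.sqrt(sum((fi - yi) ** 2 for fi, yi in zip(f, y)) / len(f))

def mae(f, y):
    """Mean absolute error between expected outputs f and model outputs y."""
    return sum(abs(fi - yi) for fi, yi in zip(f, y)) / len(f)

expected  = [1, 0, 1, 1, 0]   # hypothetical oil/gas labels
predicted = [1, 0, 0, 1, 0]   # one of five depths misclassified
print(mae(expected, predicted))   # → 0.2
print(rmse(expected, predicted))  # → sqrt(0.2) ≈ 0.4472
```

For 0/1 labels, MAE equals the misclassification rate, which is why the RMSE and MAE columns in Table 15 move together with the accuracy column.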

Table 8. Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function  Term  GWO          WOA          SCA          GOA          SSA          DMSDL-QBSA
f1        Max   1.3396E+04   1.4767E+04   1.3310E+04   2.0099E+01   4.8745E+03   9.8570E-01
          Min   0            0            4.2905E-293  8.6468E-17   0            0
          Mean  3.5990E+00   4.7621E+00   1.4014E+02   7.0100E-02   4.5864E+00   1.0483E-04
          Var   1.7645E+02   2.0419E+02   8.5054E+02   4.4200E-01   1.4148E+02   9.8725E-03
f2        Max   3.6021E+02   2.5789E+03   6.5027E+01   9.3479E+01   3.4359E+01   3.2313E-01
          Min   0            0            9.8354E-192  2.8954E-03   2.4642E-181  0
          Mean  5.0667E-02   2.9480E-01   4.0760E-01   3.1406E+00   1.7000E-02   1.1278E-04
          Var   3.7270E+00   2.6091E+01   2.2746E+00   3.9264E+00   5.4370E-01   4.8000E-03
f3        Max   1.8041E+04   1.6789E+04   2.4921E+04   6.5697E+03   1.1382E+04   4.0855E+01
          Min   0            1.0581E-18   7.6116E-133  2.8796E-01   3.2956E-253  1.5918E-264
          Mean  7.4511E+00   2.8838E+02   4.0693E+02   4.8472E+02   9.2062E+00   4.8381E-03
          Var   2.6124E+02   1.4642E+03   1.5913E+03   7.1786E+02   2.9107E+02   4.1383E-01
f4        Max   2.1812E+09   5.4706E+09   8.4019E+09   1.1942E+09   4.9386E+08   7.5188E+01
          Min   4.9125E+00   3.5695E+00   5.9559E+00   2.2249E+02   4.9806E+00   2.9279E-13
          Mean  4.9592E+05   2.4802E+06   4.4489E+07   5.0021E+06   3.3374E+05   2.4033E-02
          Var   2.9484E+07   1.1616E+08   4.5682E+08   3.9698E+07   1.1952E+07   9.7253E-01
f5        Max   1.8222E+04   1.5374E+04   1.5874E+04   1.2132E+03   1.6361E+04   1.8007E+00
          Min   1.1334E-08   8.3228E-09   2.3971E-01   2.7566E-10   2.6159E-16   1.0272E-33
          Mean  5.1332E+00   5.9967E+00   1.2620E+02   5.8321E+01   8.8985E+00   2.3963E-04
          Var   2.3617E+02   2.3285E+02   8.8155E+02   1.0872E+02   2.9986E+02   1.8500E-02
f6        Max   7.4088E+00   8.3047E+00   8.8101E+00   6.8900E-01   4.4298E+00   2.5787E-01
          Min   1.8112E-05   3.9349E-05   4.8350E-05   8.9528E-02   4.0807E-05   1.0734E-04
          Mean  1.8333E-03   3.2667E-03   4.8400E-02   9.3300E-02   2.1000E-03   5.2825E-04
          Var   9.4267E-02   1.0077E-01   2.9410E-01   3.5900E-02   6.5500E-02   4.6000E-03
f7        Max   7.3626E+02   6.8488E+02   8.0796E+02   3.9241E+02   8.2036E+02   1.4770E+01
          Min   0            0            1.9441E-292  7.9956E-07   0            0
          Mean  2.0490E-01   2.8060E-01   4.9889E+00   1.6572E+01   2.7290E-01   1.8081E-03
          Var   9.5155E+00   1.0152E+01   3.5531E+01   2.3058E+01   1.0581E+01   1.6033E-01
f8        Max   1.2749E+03   5.9740E+02   3.2527E+02   2.3425E+02   2.0300E+02   2.0423E+01
          Min   0            4.3596E-35   1.5241E-160  3.6588E-05   1.0239E-244  0
          Mean  3.1317E-01   1.0582E+01   1.0457E+01   1.2497E+01   2.1870E-01   2.6947E-03
          Var   1.4416E+01   4.3485E+01   3.5021E+01   2.5766E+01   6.2362E+00   2.1290E-01


where y_i and f_i are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show the actual forecasting errors. Table 15 records the performance indicator data of each classification model, with the best results marked in bold. The smaller the RMSE and MAE, the better the classification model performs.

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model obtains the best recognition accuracy, and all its accuracies are above 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it also performs well on the other performance indicators and on different wells. Secondly, DMSDL-QBSA can improve the performance of RF: the parameters found by DMSDL-QBSA improve the classification accuracy of the RF classification model while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than those of the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distributions and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distributions. In addition, 0 means the depth has no oil or gas, and 1 means the depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results: it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is thus suitable for petroleum logging applications, greatly reduces the difficulty of oil exploration, and has a good application prospect.

Table 9. Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function  Term  GWO          WOA          SCA          GOA          SSA          DMSDL-QBSA
f10       Max   2.9544E+03   2.9903E+03   2.7629E+03   2.9445E+03   3.2180E+03   3.0032E+03
          Min   1.3037E+03   8.6892E-04   1.5713E+03   1.3438E+03   1.2839E-04   1.2922E+03
          Mean  1.6053E+03   1.4339E+01   1.7860E+03   1.8562E+03   2.5055E+02   1.2960E+03
          Var   2.6594E+02   1.2243E+02   1.6564E+02   5.1605E+02   3.3099E+02   5.8691E+01
f11       Max   1.3792E+02   1.2293E+02   1.2313E+02   1.1249E+02   3.2180E+03   1.8455E+01
          Min   0            0            0            1.2437E+01   1.2839E-04   0
          Mean  4.4220E-01   1.1252E+00   9.9316E+00   2.8378E+01   2.5055E+02   3.2676E-03
          Var   4.3784E+00   6.5162E+00   1.9180E+01   1.6240E+01   3.3099E+02   1.9867E-01
f12       Max   2.0257E+01   2.0043E+01   1.9440E+01   1.6623E+01   3.2180E+03   1.9113E+00
          Min   4.4409E-15   3.2567E-15   3.2567E-15   2.3168E+00   1.2839E-04   8.8818E-16
          Mean  1.7200E-02   4.2200E-02   8.8870E-01   5.5339E+00   2.5055E+02   3.1275E-04
          Var   4.7080E-01   6.4937E-01   3.0887E+00   2.8866E+00   3.3099E+02   2.2433E-02
f13       Max   1.5246E+02   1.6106E+02   1.1187E+02   6.1505E+01   3.2180E+03   3.1560E-01
          Min   3.3000E-03   0            0            2.4147E-01   1.2839E-04   0
          Mean  4.6733E-02   9.3867E-02   1.2094E+00   3.7540E+00   2.5055E+02   3.3660E-05
          Var   1.9297E+00   2.6570E+00   6.8476E+00   4.1936E+00   3.3099E+02   3.1721E-03
f14       Max   9.5993E+07   9.9026E+07   5.9355E+07   6.1674E+06   3.2180E+03   6.4903E-01
          Min   3.8394E-09   1.1749E-08   9.6787E-03   1.8099E-04   1.2839E-04   4.7116E-32
          Mean  1.2033E+04   3.5007E+04   4.8303E+05   1.0465E+04   2.5055E+02   8.9321E-05
          Var   9.8272E+05   1.5889E+06   4.0068E+06   1.9887E+05   3.3099E+02   6.8667E-03
f15       Max   2.2691E+08   2.4717E+08   1.1346E+08   2.8101E+07   3.2180E+03   1.6407E-01
          Min   3.2467E-02   4.5345E-08   1.1922E-01   3.5465E-05   1.2839E-04   1.3498E-32
          Mean  2.9011E+04   4.3873E+04   6.5529E+05   7.2504E+04   2.5055E+02   6.7357E-05
          Var   2.3526E+06   2.7453E+06   7.1864E+06   1.2814E+06   3.3099E+02   2.4333E-03
f16       Max   1.7692E+01   1.7142E+01   1.6087E+01   8.7570E+00   3.2180E+03   1.0959E+00
          Min   2.6210E-07   0.0000E+00   6.2663E-155  1.0497E-02   1.2839E-04   0
          Mean  1.0133E-02   3.9073E-01   5.9003E-01   2.4770E+00   2.5055E+02   1.5200E-04
          Var   3.0110E-01   9.6267E-01   1.4701E+00   1.9985E+00   3.3099E+02   1.1633E-02
f18       Max   4.4776E+01   4.3588E+01   3.9095E+01   1.7041E+01   3.2180E+03   6.5613E-01
          Min   1.9360E-01   9.4058E-08   2.2666E-01   3.9111E+00   1.2839E-04   1.4998E-32
          Mean  2.1563E-01   4.7800E-02   9.8357E-01   5.4021E+00   2.5055E+02   9.4518E-05
          Var   5.8130E-01   1.0434E+00   3.0643E+00   1.6674E+00   3.3099E+02   7.0251E-03


Table 10. Comparison of numerical testing results on CEC2014 test sets (F1-F15, Dim = 10).

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
F1        Max   2.7411E+08   3.6316E+09   6.8993E+08   3.9664E+08   9.6209E+08
          Min   1.4794E+07   3.1738E+09   1.3205E+06   1.6929E+05   1.7687E+05
          Mean  3.1107E+07   3.2020E+09   6.7081E+06   1.3567E+06   1.9320E+06
          Var   1.2900E+07   4.0848E+07   2.6376E+07   1.0236E+07   1.3093E+07
F2        Max   8.7763E+09   1.3597E+10   1.3515E+10   1.6907E+10   1.7326E+10
          Min   1.7455E+09   1.1878E+10   3.6982E+04   2.1268E+05   1.4074E+05
          Mean  1.9206E+09   1.1951E+10   1.7449E+08   5.9354E+07   4.8412E+07
          Var   1.9900E+08   1.5615E+08   9.2365E+08   5.1642E+08   4.2984E+08
F3        Max   4.8974E+05   8.5863E+04   2.6323E+06   2.4742E+06   5.3828E+06
          Min   1.1067E+04   1.4616E+04   1.8878E+03   1.2967E+03   6.3986E+02
          Mean  1.3286E+04   1.4976E+04   1.0451E+04   3.5184E+03   3.0848E+03
          Var   6.5283E+03   1.9988E+03   4.5736E+04   4.1580E+04   8.0572E+04
F4        Max   3.5252E+03   9.7330E+03   4.5679E+03   4.9730E+03   4.3338E+03
          Min   5.0446E+02   8.4482E+03   4.0222E+02   4.0843E+02   4.1267E+02
          Mean  5.8061E+02   8.5355E+03   4.4199E+02   4.3908E+02   4.3621E+02
          Var   1.4590E+02   1.3305E+02   1.6766E+02   1.2212E+02   8.5548E+01
F5        Max   5.2111E+02   5.2106E+02   5.2075E+02   5.2098E+02   5.2110E+02
          Min   5.2001E+02   5.2038E+02   5.2027E+02   5.2006E+02   5.2007E+02
          Mean  5.2003E+02   5.2041E+02   5.2033E+02   5.2014E+02   5.2014E+02
          Var   7.8380E-02   5.7620E-02   5.6433E-02   1.0577E-01   9.9700E-02
F6        Max   6.1243E+02   6.1299E+02   6.1374E+02   6.1424E+02   6.1569E+02
          Min   6.0881E+02   6.1157E+02   6.0514E+02   6.0288E+02   6.0257E+02
          Mean  6.0904E+02   6.1164E+02   6.0604E+02   6.0401E+02   6.0358E+02
          Var   2.5608E-01   1.2632E-01   1.0434E+00   1.1717E+00   1.3117E+00
F7        Max   8.7895E+02   1.0459E+03   9.1355E+02   1.0029E+03   9.3907E+02
          Min   7.4203E+02   1.0238E+03   7.0013E+02   7.0081E+02   7.0069E+02
          Mean  7.4322E+02   1.0253E+03   7.1184E+02   7.0332E+02   7.0290E+02
          Var   4.3160E+00   2.4258E+00   2.8075E+01   1.5135E+01   1.3815E+01
F8        Max   8.9972E+02   9.1783E+02   9.6259E+02   9.2720E+02   9.3391E+02
          Min   8.4904E+02   8.8172E+02   8.3615E+02   8.1042E+02   8.0877E+02
          Mean  8.5087E+02   8.8406E+02   8.5213E+02   8.1888E+02   8.1676E+02
          Var   2.1797E+00   4.4494E+00   8.6249E+00   9.2595E+00   9.7968E+00
F9        Max   1.0082E+03   9.9538E+02   1.0239E+03   1.0146E+03   1.0366E+03
          Min   9.3598E+02   9.6805E+02   9.2725E+02   9.2062E+02   9.1537E+02
          Mean  9.3902E+02   9.7034E+02   9.4290E+02   9.2754E+02   9.2335E+02
          Var   3.4172E+00   3.2502E+00   8.1321E+00   9.0492E+00   9.3860E+00
F10       Max   3.1087E+03   3.5943E+03   3.9105E+03   3.9116E+03   3.4795E+03
          Min   2.2958E+03   2.8792E+03   1.5807E+03   1.4802E+03   1.4316E+03
          Mean  1.8668E+03   2.9172E+03   1.7627E+03   1.6336E+03   1.6111E+03
          Var   3.2703E+01   6.1787E+01   2.2359E+02   1.9535E+02   2.2195E+02
F11       Max   3.6641E+03   3.4628E+03   4.0593E+03   3.5263E+03   3.7357E+03
          Min   2.4726E+03   2.8210E+03   1.7855E+03   1.4012E+03   1.2811E+03
          Mean  2.5553E+03   2.8614E+03   1.8532E+03   1.5790E+03   1.4869E+03
          Var   8.4488E+01   5.6351E+01   1.3201E+02   2.1085E+02   2.6682E+02
F12       Max   1.2044E+03   1.2051E+03   1.2053E+03   1.2055E+03   1.2044E+03
          Min   1.2006E+03   1.2014E+03   1.2004E+03   1.2003E+03   1.2017E+03
          Mean  1.2007E+03   1.2017E+03   1.2007E+03   1.2003E+03   1.2018E+03
          Var   1.4482E-01   3.5583E-01   2.5873E-01   2.4643E-01   2.1603E-01
F13       Max   1.3049E+03   1.3072E+03   1.3073E+03   1.3056E+03   1.3061E+03
          Min   1.3005E+03   1.3067E+03   1.3009E+03   1.3003E+03   1.3004E+03
          Mean  1.3006E+03   1.3068E+03   1.3011E+03   1.3005E+03   1.3006E+03
          Var   3.2018E-01   5.0767E-02   4.2173E-01   3.6570E-01   2.9913E-01
F14       Max   1.4395E+03   1.4565E+03   1.4775E+03   1.4749E+03   1.4493E+03
          Min   1.4067E+03   1.4522E+03   1.4009E+03   1.4003E+03   1.4005E+03
          Mean  1.4079E+03   1.4529E+03   1.4024E+03   1.4008E+03   1.4009E+03
          Var   2.1699E+00   6.3013E-01   5.3198E+00   3.2578E+00   2.8527E+00
F15       Max   5.4068E+04   3.3586E+04   4.8370E+05   4.0007E+05   1.9050E+05
          Min   1.9611E+03   2.1347E+04   1.5029E+03   1.5027E+03   1.5026E+03
          Mean  1.9933E+03   2.2417E+04   2.7920E+03   1.5860E+03   1.5816E+03
          Var   6.1622E+02   1.2832E+03   1.7802E+04   4.4091E+03   2.7233E+03


Table 11. Comparison of numerical testing results on CEC2014 test sets (F16-F30, Dim = 10).

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
F16       Max   1.6044E+03   1.6044E+03   1.6046E+03   1.6044E+03   1.6046E+03
          Min   1.6034E+03   1.6041E+03   1.6028E+03   1.6021E+03   1.6019E+03
          Mean  1.6034E+03   1.6041E+03   1.6032E+03   1.6024E+03   1.6024E+03
          Var   3.3300E-02   3.5900E-02   2.5510E-01   3.7540E-01   3.6883E-01
F17       Max   1.3711E+07   1.6481E+06   3.3071E+07   4.6525E+07   8.4770E+06
          Min   1.2526E+05   4.7216E+05   2.7261E+03   2.0177E+03   2.0097E+03
          Mean  1.6342E+05   4.8499E+05   8.7769E+04   3.1084E+04   2.1095E+04
          Var   2.0194E+05   2.9949E+04   1.2284E+06   8.7638E+05   2.0329E+05
F18       Max   7.7173E+07   3.5371E+07   6.1684E+08   1.4216E+08   6.6050E+08
          Min   4.1934E+03   2.6103E+06   2.2743E+03   1.8192E+03   1.8288E+03
          Mean  1.6509E+04   3.4523E+06   1.2227E+06   1.0781E+05   1.9475E+05
          Var   7.9108E+05   1.4888E+06   2.1334E+07   3.6818E+06   1.0139E+07
F19       Max   1.9851E+03   2.5657E+03   2.0875E+03   1.9872E+03   1.9555E+03
          Min   1.9292E+03   2.4816E+03   1.9027E+03   1.9023E+03   1.9028E+03
          Mean  1.9299E+03   2.4834E+03   1.9044E+03   1.9032E+03   1.9036E+03
          Var   1.0820E+00   3.8009E+00   1.1111E+01   3.3514E+00   1.4209E+00
F20       Max   8.3021E+06   2.0160E+07   1.0350E+07   6.1162E+07   1.2708E+08
          Min   5.6288E+03   1.7838E+06   2.1570E+03   2.0408E+03   2.0241E+03
          Mean  1.2260E+04   1.8138E+06   5.6957E+03   1.1337E+04   4.5834E+04
          Var   1.0918E+05   5.9134E+05   1.3819E+05   6.4167E+05   2.1988E+06
F21       Max   2.4495E+06   1.7278E+09   2.0322E+06   1.3473E+07   2.6897E+07
          Min   4.9016E+03   1.4049E+09   3.3699E+03   2.1842E+03   2.2314E+03
          Mean  6.6613E+03   1.4153E+09   9.9472E+03   1.3972E+04   7.4587E+03
          Var   4.1702E+04   2.2557E+07   3.3942E+04   2.5098E+05   2.8735E+05
F22       Max   2.8304E+03   4.9894E+03   3.1817E+03   3.1865E+03   3.2211E+03
          Min   2.5070E+03   4.1011E+03   2.3492E+03   2.2442E+03   2.2314E+03
          Mean  2.5081E+03   4.1175E+03   2.3694E+03   2.2962E+03   2.2687E+03
          Var   5.4064E+00   3.8524E+01   4.3029E+01   4.8006E+01   5.1234E+01
F23       Max   2.8890E+03   2.6834E+03   2.8758E+03   3.0065E+03   2.9923E+03
          Min   2.5000E+03   2.6031E+03   2.4870E+03   2.5000E+03   2.5000E+03
          Mean  2.5004E+03   2.6088E+03   2.5326E+03   2.5010E+03   2.5015E+03
          Var   9.9486E+00   8.6432E+00   5.7045E+01   1.4213E+01   1.7156E+01
F24       Max   2.6293E+03   2.6074E+03   2.6565E+03   2.6491E+03   2.6369E+03
          Min   2.5816E+03   2.6049E+03   2.5557E+03   2.5246E+03   2.5251E+03
          Mean  2.5829E+03   2.6052E+03   2.5671E+03   2.5337E+03   2.5338E+03
          Var   1.3640E+00   3.3490E-01   8.9434E+00   1.3050E+01   1.1715E+01
F25       Max   2.7133E+03   2.7014E+03   2.7445E+03   2.7122E+03   2.7327E+03
          Min   2.6996E+03   2.7003E+03   2.6784E+03   2.6789E+03   2.6635E+03
          Mean  2.6996E+03   2.7004E+03   2.6831E+03   2.6903E+03   2.6894E+03
          Var   2.3283E-01   9.9333E-02   4.9609E+00   7.1175E+00   1.2571E+01
F26       Max   2.7056E+03   2.8003E+03   2.7058E+03   2.7447E+03   2.7116E+03
          Min   2.7008E+03   2.8000E+03   2.7003E+03   2.7002E+03   2.7002E+03
          Mean  2.7010E+03   2.8000E+03   2.7005E+03   2.7003E+03   2.7003E+03
          Var   3.1316E-01   1.6500E-02   3.9700E-01   1.1168E+00   3.5003E-01
F27       Max   3.4052E+03   5.7614E+03   3.3229E+03   3.4188E+03   3.4144E+03
          Min   2.9000E+03   3.9113E+03   2.9698E+03   2.8347E+03   2.7054E+03
          Mean  2.9038E+03   4.0351E+03   2.9816E+03   2.8668E+03   2.7219E+03
          Var   3.2449E+01   2.0673E+02   3.4400E+01   6.2696E+01   6.1325E+01
F28       Max   4.4333E+03   5.4138E+03   4.5480E+03   4.3490E+03   4.8154E+03
          Min   3.0000E+03   4.0092E+03   3.4908E+03   3.0000E+03   3.0000E+03
          Mean  3.0079E+03   4.0606E+03   3.6004E+03   3.0063E+03   3.0065E+03
          Var   6.8101E+01   9.2507E+01   8.5705E+01   4.2080E+01   5.3483E+01
F29       Max   2.5038E+07   1.6181E+08   5.9096E+07   7.0928E+07   6.4392E+07
          Min   3.2066E+03   1.0014E+08   3.1755E+03   3.1005E+03   3.3287E+03
          Mean  5.7057E+04   1.0476E+08   8.5591E+04   6.8388E+04   2.8663E+04
          Var   4.9839E+05   6.7995E+06   1.7272E+06   1.5808E+06   8.6740E+05


Figure 3. The effect of the two parameters on the performance of RF models: (a) the effect of ntree (classification accuracy versus ntree, 0 to 1000); (b) the effect of mtry (classification accuracy versus mtry, 1 to 32).

Variable setting: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA.
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF.
(1)  Begin
(2)    /* Build classification model based on DMSDL-QBSA-RF */
(3)    Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4)    Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5)    While iter < itermax + 1 do
(6)      For i = 1 : ntree
(7)        Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)        Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)        Recursively generate each decision tree without pruning operations
(10)     End For
(11)     Update the classification accuracy of RF: evaluate f(Xi)
(12)     Update gbest and Pgbest
(13)     [nbest, mbest] = gbest
(14)     iter = iter + 1
(15)   End While
(16)   /* Classify using RF model */
(17)   For i = 1 : nbest
(18)     Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)     Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)     Recursively generate each decision tree without pruning operations
(21)   End
(22)   Return DMSDL-QBSA-RF classification model
(23)   Classify the test dataset using equation (20)
(24)   Calculate OOB error
(25) End

ALGORITHM 2: DMSDL-QBSA-RF classification model.
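Steps 7 and 18 of Algorithm 2 rely on bootstrap sampling, and the indices never drawn in a given draw form the out-of-bag (OOB) set on which step 24's OOB error is computed. A minimal sketch:

```python
import random

def bootstrap_sample(n, rng):
    """Draw a size-n training set by sampling with replacement
    (Bootstrap, steps 7/18 of Algorithm 2); the indices never drawn
    form the out-of-bag set used for the OOB error (step 24)."""
    chosen = [rng.randrange(n) for _ in range(n)]
    oob = sorted(set(range(n)) - set(chosen))
    return chosen, oob

rng = random.Random(0)
chosen, oob = bootstrap_sample(10, rng)
print(len(chosen), oob)  # in-bag size equals n; about 37% of indices are OOB on average
```

Because each tree sees a different bootstrap draw and a different random attribute subset, the trees stay diverse, which is exactly what an oversized mtry destroys (Figure 3(b)).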

Table 11. (Continued)

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
F30       Max   5.2623E+05   2.8922E+07   1.1938E+06   1.2245E+06   1.1393E+06
          Min   5.9886E+03   1.9017E+07   3.7874E+03   3.6290E+03   3.6416E+03
          Mean  7.4148E+03   2.0002E+07   5.5468E+03   4.3605E+03   4.1746E+03
          Var   1.1434E+04   1.1968E+06   3.2255E+04   2.2987E+04   1.8202E+04


Figure 4. Block diagram of the oil layer classification system based on DMSDL-QBSA-RF. The pipeline runs from data selection and preprocessing (removing redundant attributes and pretreating the information to be identified), through attribute discretization, attribute reduction and data normalization, and model building based on DMSDL-QBSA-RF, to RF classification and output.

Table 13. Distribution of the logging data.

        Training dataset                              Test dataset
Well    Depth (m)    Data   Oil/gas   Dry layer       Depth (m)    Data   Oil/gas   Dry layer
W1      3027-3058    250    203       47              3250-3450    1600   237       1363
W2      2642-2648    30     10        20              2940-2978    191    99        92
W3      1040-1059    160    47        113             1120-1290    1114   96        1018

Table 12. Comparison of numerical testing results on eight UCI datasets.

Dataset         RF (%)   BSA-RF (%)   DMSDL-BSA-RF (%)   DMSDL-QBSA-RF (%)
Blood           75.40    80.80        79.46              81.25
Heart-statlog   82.10    86.42        86.42              86.42
Sonar           78.39    85.48        85.48              88.71
Appendicitis    83.23    87.10        90.32              93.55
Cleve           79.55    87.50        88.64              89.77
Magic           88.95    89.85        90.25              89.32
Mammographic    74.73    77.68        76.79              79.46
Australian      85.83    88.84        88.84              90.78

Table 14. Attribute reduction results of the logging dataset.

Well   Attributes
W1     Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
       Reduced attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2     Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
       Reduced attributes: NEUTRO, PERM, POR, RESI, SW
W3     Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
       Reduced attributes: AC, GR, RI, RXO, SP

Figure 5. The normalized curves of attributes: (a) and (b) attribute normalization of W1 (NPHI, SP, DT, LLD; DEN, GR, LLS); (c) and (d) attribute normalization of W2 (NEUTRO, PERM; RESI, SW, POR); (e) and (f) attribute normalization of W3 (GR, SP, RI; RXO, AC). The horizontal axes are depth (m) and the vertical axes are normalized values in [0, 1].

Table 15. Performance on various well data.

Well   Classification model   RMSE     MAE        Accuracy (%)   Running time (s)
W1     RF                     0.3326   0.1106     88.94          1.5167
       SVM                    0.2681   0.0719     92.81          4.5861
       BSA-RF                 0.3269   0.1069     89.31          1.8064
       DMSDL-BSA-RF           0.2806   0.0788     92.13          2.3728
       DMSDL-QBSA-RF          0.2449   0.0600     94.00          1.4663
W2     RF                     0.4219   0.1780     82.20          3.1579
       SVM                    0.2983   0.0890     91.10          4.2604
       BSA-RF                 0.3963   0.1571     84.29          1.2979
       DMSDL-BSA-RF           0.2506   0.062827   93.72          1.6124
       DMSDL-QBSA-RF          0.2399   0.0576     94.24          1.2287
W3     RF                     0.4028   0.1622     83.78          2.4971
       SVM                    0.2507   0.0628     93.72          2.1027
       BSA-RF                 0.3631   0.1318     86.81          1.3791
       DMSDL-BSA-RF           0.2341   0.0548     94.52          0.3125
       DMSDL-QBSA-RF          0.0519   0.0027     99.73          0.9513


5 Conclusion

This paper presents an improved BSA, called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of the three strategies on the performance of DMSDL-QBSA reveals that the hybrid method markedly improves the original BSA. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and it again shows excellent performance. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than that of the original RF model, while its running speed is faster than that of the other four classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Owing to the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA; therefore, improving its recognition efficiency is a worthwhile direction. In future research, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request by contacting the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and Project of Industry-University Cooperative Education of Ministry of Education of China (no. 201801335014).

[Figure 6, panels (c)-(f): oil-layer labels (0/1) versus depth 2940-2980 m for well W2 and 1140-1300 m for well W3]

Figure 6: Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.

[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.

[4] X. Li, A new intelligent optimization method: artificial fish school algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.

[5] X. S. Yang, "Firefly algorithms for multimodal optimization: stochastic algorithms, foundations and applications," in Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.

[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.

[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673-687, 2016.

[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University-Computer and Information Sciences, vol. 32, pp. 514-521, 2020.

[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," Neural Computing and Applications, Springer, New York, NY, USA, 2019.

[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.

[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.

[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197-207, 2014.

[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.

[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17-28, 2014.

[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137-146, 2012.

[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, Multi-Strategy Brainstorm Optimization Algorithm with Dynamic Parameters Adjustment, Applied Intelligence, Guildford, UK, 2010.

[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189-120201, 2019.

[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723-8740, 2019.

[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1-15, 2017.

[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.

[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708-720, 2018.

[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5-32, 2001.

[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1-18, 2017.

[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707-130720, 2019.

[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421-428, 2006.

[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," in Proceedings of the IEEE Transactions on Evolutionary Computation, Piscataway, NJ, USA, February 2019.

[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290-304, 2020.

[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877-1890, 2018.

[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156-171, 2019.

[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.

[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209-221, 2018.

[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.

[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276-292, 2019.

[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501-511, 2018.

[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349-393, 2012.

[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China-Information Sciences, vol. 62, no. 7, 2019.

[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82-102, 1999.

[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46-61, 2014.

[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51-67, 2016.

[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120-133, 2016.

[41] S. Shahrzad, M. Seyedali, and L. Andrew, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30-47, 2017.

[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22-34, 2020.

[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.



the RF model, as shown in Figure 3, where the horizontal axis represents ntree and mtry, respectively, and the vertical axis represents the accuracy of the RF classification model.

(1) Parameter analysis of ntree

When the number of predictor variables mtry is set to 6, the number of decision trees ntree is cyclically set from 0 to 1000 at intervals of 20. The evolution of RF classification accuracy with ntree is shown in Figure 3(a). From the curve in Figure 3(a), we can see that the accuracy of RF gradually improves as the number of decision trees ntree increases. However, once ntree exceeds a certain value, RF performance plateaus without obvious improvement, while the running time keeps growing.

(2) Parameter analysis of mtry

When the number of decision trees ntree is set to 500, the number of predictor variables mtry is cyclically set from 1 to 32; the limit of mtry is 32 because the Ionosphere dataset has 32 attributes. The resulting curve of RF classification accuracy against mtry is shown in Figure 3(b). As the number of selected splitting attributes increases, the classification performance of RF gradually improves, but when mtry exceeds 9, RF overfits and its accuracy begins to decrease. The main reason is that selecting too many split attributes makes a large number of decision trees share the same splitting attributes, which reduces the diversity of the decision trees.
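The two parameter sweeps above can be reproduced with any RF implementation. The sketch below uses scikit-learn's RandomForestClassifier on a synthetic 32-attribute dataset as a stand-in for the Ionosphere data; the coarse grids and the helper name `sweep_accuracy` are our assumptions, while the fixed values (mtry = 6, ntree = 500) follow the text.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the 32-attribute Ionosphere dataset.
X, y = make_classification(n_samples=400, n_features=32, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def sweep_accuracy(ntree, mtry):
    """Test accuracy of an RF with ntree trees and mtry split attributes."""
    rf = RandomForestClassifier(n_estimators=ntree, max_features=mtry,
                                random_state=0).fit(X_tr, y_tr)
    return rf.score(X_te, y_te)

# (1) vary ntree with mtry fixed at 6 (a coarser grid than the paper's 20-step)
acc_ntree = {n: sweep_accuracy(n, 6) for n in (20, 100, 500)}
# (2) vary mtry with ntree fixed at 500
acc_mtry = {m: sweep_accuracy(500, m) for m in (1, 6, 9, 32)}
print(acc_ntree, acc_mtry)
```

Plotting these dictionaries against their keys reproduces curves of the same shape as Figures 3(a) and 3(b).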

Table 5: Comparison on nine unimodal functions with hybrid algorithms (Dim = 2).

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
f1        Max   1.8411E+02   3.4713E+02   2.9347E+02   1.6918E+02   1.6789E+00
          Min   0            5.7879E-01   0            0            3.2988E-238
          Mean  5.5890E-02   1.1150E+00   1.6095E-01   3.6580E-02   4.2590E-03
          Var   2.6628E+00   5.7218E+00   5.4561E+00   2.0867E+00   7.5858E-02
f2        Max   2.2980E+00   2.1935E+00   3.2363E+00   3.1492E+00   1.1020E+00
          Min   5.5923E-266  8.2690E-02   2.9096E-242  3.4367E-241  0
          Mean  9.2769E-04   9.3960E-02   7.4900E-03   1.2045E-03   2.1565E-04
          Var   3.3310E-02   6.9080E-02   4.0100E-02   4.2190E-02   1.3130E-02
f3        Max   1.0647E+02   1.3245E+02   3.6203E+02   2.3793E+02   1.3089E+02
          Min   0            5.9950E-01   0            0            0
          Mean  2.3040E-02   8.5959E-01   2.5020E-01   6.3560E-02   1.7170E-02
          Var   1.2892E+00   2.8747E+00   7.5569E+00   2.9203E+00   1.3518E+00
f4        Max   1.7097E+02   6.1375E+01   6.8210E+01   5.9141E+01   1.4726E+01
          Min   1.6325E-21   4.0940E-02   8.2726E-13   3.4830E-25   0
          Mean  2.2480E-02   9.3940E-02   1.5730E-02   1.1020E-02   1.4308E-02
          Var   1.7987E+00   7.4859E-01   7.6015E-01   6.4984E-01   2.7598E-01
f5        Max   1.5719E+02   2.2513E+02   3.3938E+02   1.8946E+02   8.7078E+01
          Min   0            7.0367E-01   0            0            0
          Mean  3.4380E-02   1.8850E+00   1.7082E-01   5.0090E-02   1.1880E-02
          Var   1.9018E+00   5.6163E+00   5.9868E+00   2.4994E+00   9.1749E-01
f6        Max   1.5887E-01   1.5649E-01   1.5919E-01   1.3461E-01   1.0139E-01
          Min   2.5412E-05   4.5060E-04   5.9140E-05   4.1588E-05   7.3524E-06
          Mean  2.3437E-04   1.3328E-03   6.0989E-04   2.3462E-04   9.2394E-05
          Var   2.4301E-03   3.6700E-03   3.5200E-03   1.9117E-03   1.4664E-03
f7        Max   3.5804E+00   2.8236E+00   1.7372E+00   2.7513E+00   1.9411E+00
          Min   0            7.6633E-03   0            0            0
          Mean  8.5474E-04   1.6590E-02   8.6701E-04   7.6781E-04   3.1439E-04
          Var   4.4630E-02   6.0390E-02   2.4090E-02   3.7520E-02   2.2333E-02
f8        Max   4.3247E+00   2.1924E+00   5.3555E+00   3.3944E+00   5.5079E-01
          Min   0            8.6132E-03   0            0            0
          Mean  1.1649E-03   1.9330E-02   1.7145E-03   7.3418E-04   6.9138E-05
          Var   5.9280E-02   4.7800E-02   7.5810E-02   4.1414E-02   5.7489E-03
f9        Max   2.7030E-02   3.5200E-02   1.7240E-02   4.0480E-02   2.5230E-02
          Min   0            5.0732E-03   0            0            0
          Mean  6.1701E-05   6.2500E-03   8.8947E-04   8.4870E-05   2.7362E-05
          Var   7.6990E-04   1.3062E-03   2.0400E-03   9.6160E-04   5.5610E-04


In summary, for the RF classification model to obtain the ideal optimal solution, the selection of the number of decision trees ntree and the number of predictor variables mtry is very important, and the classification accuracy of the RF model can only be optimized by tuning these two parameters jointly. It is therefore necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we optimize the RF classification model with the improved BSA proposed in Section 2.

3.2. RF Model Based on an Improved Bird Swarm Algorithm

The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) uses the improved bird swarm algorithm to optimize the RF classification model: the training dataset is introduced into the training process of the RF classification model, finally yielding the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters, ntree and mtry, as the optimization target of DMSDL-QBSA, so as to obtain a pair of parameters that gives the RF classification model the best classification accuracy. The specific algorithm steps are shown in Algorithm 2.
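The core idea, treating RF validation accuracy as a two-dimensional fitness of (ntree, mtry), can be sketched with a plain random search standing in for DMSDL-QBSA (the optimizer itself is defined earlier in the paper). The `rf_accuracy` stub below is a hypothetical synthetic surface, not the paper's actual fitness.

```python
import random

random.seed(0)

def rf_accuracy(ntree, mtry):
    """Hypothetical stand-in for RF validation accuracy at (ntree, mtry).
    In the paper this is the accuracy of an RF trained on the data; here a
    smooth synthetic surface peaking near ntree=500, mtry=9 is used."""
    return 1.0 - ((ntree - 500) / 1000) ** 2 - ((mtry - 9) / 32) ** 2

# Search over the same ranges as Section 3.1's parameter analysis.
best, best_fit = None, float("-inf")
for _ in range(200):
    ntree = random.randrange(20, 1001, 20)   # 20..1000 in steps of 20
    mtry = random.randint(1, 32)             # 1..32 split attributes
    fit = rf_accuracy(ntree, mtry)
    if fit > best_fit:
        best, best_fit = (ntree, mtry), fit

print(best, round(best_fit, 4))
```

In Algorithm 2 the random sampling loop is replaced by DMSDL-QBSA's position updates, but the fitness interface is the same.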

3.3. Simulation Experiment and Analysis

In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare it with the standard RF model, the BSA-RF model, and the DMSDL-BSA-RF model on 8 two-dimensional UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by the improved BSA without quantum behavior. In our experiment, each dataset is divided into two parts: 70% is used as the training set and the remaining 30% as the test set. The average classification accuracies over 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model gets the best accuracy on each UCI dataset except the magic dataset, on which the DMSDL-BSA-RF classification model performs best. Compared with the standard RF model, the accuracy of the DMSDL-QBSA-RF classification model is better, increased by about 10%. Finally, the

Table 6: Comparison on nine multimodal functions with hybrid algorithms (Dim = 10).

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
f10       Max   2.8498E+03   2.8226E+03   3.0564E+03   2.7739E+03   2.8795E+03
          Min   1.2553E+02   1.8214E+03   1.2922E+03   1.6446E+02   1.1634E+03
          Mean  2.5861E+02   1.9229E+03   1.3185E+03   3.1119E+02   1.2729E+03
          Var   2.4093E+02   1.2066E+02   1.2663E+02   2.5060E+02   1.2998E+02
f11       Max   1.2550E+02   1.0899E+02   1.1806E+02   1.1243E+02   9.1376E+01
          Min   0            6.3502E+01   1.0751E+01   0            0
          Mean  2.0417E-01   6.7394E+01   3.9864E+01   1.3732E-01   6.8060E-02
          Var   3.5886E+00   5.8621E+00   1.3570E+01   3.0325E+00   2.0567E+00
f12       Max   2.0021E+01   1.9910E+01   1.9748E+01   1.9254E+01   1.8118E+01
          Min   8.8818E-16   1.6575E+01   7.1700E-02   8.8818E-16   8.8818E-16
          Mean  3.0500E-02   1.7157E+01   3.0367E+00   3.8520E-02   1.3420E-02
          Var   5.8820E-01   5.2968E-01   1.6585E+00   6.4822E-01   4.2888E-01
f13       Max   1.0431E+02   1.3266E+02   1.5115E+02   1.2017E+02   6.1996E+01
          Min   0            4.5742E+01   2.1198E-01   0            0
          Mean  6.1050E-02   5.2056E+01   3.0613E+00   6.9340E-02   2.9700E-02
          Var   1.8258E+00   8.3141E+00   1.5058E+01   2.2452E+00   1.0425E+00
f14       Max   8.4576E+06   3.0442E+07   5.3508E+07   6.2509E+07   8.5231E+06
          Min   1.7658E-13   1.9816E+06   4.5685E-05   1.6961E-13   5.1104E-01
          Mean  1.3266E+03   3.1857E+06   6.8165E+04   8.8667E+03   1.1326E+03
          Var   9.7405E+04   1.4876E+06   1.4622E+06   6.4328E+05   8.7645E+04
f15       Max   1.8310E+08   1.4389E+08   1.8502E+08   1.4578E+08   2.5680E+07
          Min   1.7942E-11   1.0497E+07   2.4500E-03   1.1248E-11   9.9870E-01
          Mean  3.7089E+04   1.5974E+07   2.0226E+05   3.8852E+04   3.5739E+03
          Var   2.0633E+06   1.0724E+07   4.6539E+06   2.1133E+06   2.6488E+05
f16       Max   1.3876E+01   1.4988E+01   1.4849E+01   1.3506E+01   9.3280E+00
          Min   4.2410E-174  6.8743E+00   2.5133E-02   7.3524E-176  0
          Mean  1.3633E-02   7.2408E+00   2.5045E+00   1.3900E-02   5.1800E-03
          Var   3.3567E-01   7.7774E-01   1.0219E+00   3.4678E-01   1.7952E-01
f18       Max   3.6704E+01   3.6950E+01   2.8458E+01   2.6869E+01   2.4435E+01
          Min   2.0914E-11   9.7737E+00   3.3997E-03   5.9165E-12   7.5806E-01
          Mean  6.5733E-02   1.2351E+01   6.7478E-01   5.6520E-02   7.9392E-01
          Var   6.7543E-01   2.8057E+00   1.4666E+00   6.4874E-01   3.6928E-01


DMSDL-QBSA-RF classification model achieves the best accuracy on the appendicitis dataset, up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is valid on most datasets and performs well on them.
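The evaluation protocol of Section 3.3 (random 70/30 train/test split, accuracy averaged over 10 independent runs) can be sketched as follows; the classifier is left abstract as a hypothetical `train_and_score` callable, since any of the compared models can be plugged in.

```python
import random

def mean_accuracy(data, labels, train_and_score, runs=10, seed=0):
    """Average test accuracy over `runs` random 70/30 train/test splits."""
    rng = random.Random(seed)
    n = len(data)
    accs = []
    for _ in range(runs):
        idx = list(range(n))
        rng.shuffle(idx)
        cut = int(0.7 * n)                     # 70% training, 30% testing
        train, test = idx[:cut], idx[cut:]
        accs.append(train_and_score(train, test))
    return sum(accs) / len(accs)

# Demo with a stub scorer (placeholder for a trained RF variant).
acc = mean_accuracy(list(range(100)), [0] * 100, lambda tr, te: 0.9)
print(acc)
```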

4. Oil Layer Classification Application

4.1. Design of Oil Layer Classification System

The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1: The selected actual logging datasets are intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis and relatively independent. Each dataset is randomly divided into two parts: training and testing samples.

Step 2: In order to better understand the relationship between independent and dependent variables and to reduce the sample information attributes, the continuous attributes of the dataset are discretized using a greedy algorithm.

Step 3: In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize attribute reduction. After attribute reduction, the actual logging datasets are normalized to avoid computational saturation.

Step 4: In the DMSDL-QBSA-RF layer classification model, we input the actual logging dataset after attribute reduction, train with the DMSDL-QBSA-RF layer classification algorithm, and finally obtain the DMSDL-QBSA-RF layer classification model.

Table 7: Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function  Term  BSA          DE           DMSDL-PSO    DMSDL-BSA    DMSDL-QBSA
f10       Max   2.0101E+02   2.8292E+02   2.9899E+02   2.4244E+02   2.8533E+02
          Min   2.5455E-05   3.7717E+00   2.3690E+01   2.5455E-05   9.4751E+01
          Mean  3.4882E-01   8.0980E+00   2.5222E+01   4.2346E-01   9.4816E+01
          Var   5.7922E+00   1.0853E+01   1.5533E+01   5.6138E+00   2.8160E+00
f11       Max   4.7662E+00   4.7784E+00   1.1067E+01   8.7792E+00   8.1665E+00
          Min   0            3.8174E-01   0            0            0
          Mean  2.7200E-03   6.0050E-01   3.1540E-02   4.2587E-03   3.7800E-03
          Var   8.6860E-02   3.1980E-01   2.4862E-01   1.2032E-01   1.3420E-01
f12       Max   9.6893E+00   8.1811E+00   1.1635E+01   9.1576E+00   8.4720E+00
          Min   8.8818E-16   5.1646E-01   8.8818E-16   8.8818E-16   8.8818E-16
          Mean  9.5600E-03   6.9734E-01   3.4540E-02   9.9600E-03   2.8548E-03
          Var   2.1936E-01   6.1050E-01   2.5816E-01   2.1556E-01   1.1804E-01
f13       Max   4.4609E+00   4.9215E+00   4.1160E+00   1.9020E+00   1.6875E+00
          Min   0            1.3718E-01   0            0            0
          Mean  1.9200E-03   1.7032E-01   1.8240E-02   1.4800E-03   5.7618E-04
          Var   6.6900E-02   1.3032E-01   1.8202E-01   3.3900E-02   2.2360E-02
f14       Max   1.0045E+01   1.9266E+03   1.9212E+01   5.7939E+02   8.2650E+00
          Min   2.3558E-31   1.3188E-01   2.3558E-31   2.3558E-31   2.3558E-31
          Mean  4.1600E-03   3.5402E-01   1.0840E-02   6.1420E-02   1.3924E-03
          Var   1.7174E-01   1.9427E+01   3.9528E-01   5.8445E+00   8.7160E-02
f15       Max   6.5797E+04   4.4041E+03   1.4412E+05   8.6107E+03   2.6372E+00
          Min   1.3498E-31   9.1580E-02   1.3498E-31   1.3498E-31   7.7800E-03
          Mean  7.1736E+00   8.9370E-01   1.7440E+01   9.0066E-01   8.2551E-03
          Var   6.7678E+02   5.4800E+01   1.4742E+03   8.7683E+01   2.8820E-02
f16       Max   6.2468E-01   6.4488E-01   5.1564E-01   8.4452E-01   3.9560E-01
          Min   6.9981E-08   2.5000E-03   1.5518E-240  2.7655E-07   0
          Mean  2.7062E-04   6.9400E-03   6.8555E-04   2.0497E-04   6.1996E-05
          Var   1.0380E-02   1.7520E-02   8.4600E-03   1.0140E-02   4.4000E-03
f17       Max   5.1946E+00   3.6014E+00   2.3463E+00   6.9106E+00   1.2521E+00
          Min   2.6445E-11   2.6739E-02   0            1.0855E-10   0
          Mean  1.9343E-03   5.1800E-02   1.2245E-03   2.8193E-03   1.5138E-04
          Var   7.3540E-02   1.2590E-01   4.1620E-02   1.1506E-01   1.2699E-02
f18       Max   5.0214E-01   3.4034E-01   4.1400E-01   3.7422E-01   4.0295E-01
          Min   1.4998E-32   1.9167E-03   1.4998E-32   1.4998E-32   1.4998E-32
          Mean  1.0967E-04   4.1000E-03   1.8998E-04   1.4147E-04   6.0718E-05
          Var   6.1800E-03   1.0500E-02   6.5200E-03   5.7200E-03   4.4014E-03


Step 5: The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and we output the classification results.
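Step 3's normalization is a standard rescaling of each logging attribute to a common range; the paper does not spell out the formula, so the per-attribute min-max version below is an assumption, and the sample readings are illustrative only.

```python
def min_max_normalize(columns):
    """Rescale each attribute column to [0, 1]: x' = (x - min) / (max - min)."""
    normalized = []
    for col in columns:
        lo, hi = min(col), max(col)
        span = hi - lo or 1.0          # guard against constant columns
        normalized.append([(x - lo) / span for x in col])
    return normalized

# Example: two logging attributes with very different ranges.
gr = [20.0, 60.0, 100.0]     # e.g. gamma ray readings (hypothetical)
res = [2.0, 3.0, 12.0]       # e.g. resistivity readings (hypothetical)
print(min_max_normalize([gr, res]))
```

After this step every attribute lies in [0, 1], which avoids the computational saturation mentioned in Step 3.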

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application

In Section 2.3, the performance of the proposed DMSDL-QBSA was simulated and analyzed on benchmark functions, and in Section 3.3 the effectiveness of the RF classification model optimized by the proposed DMSDL-QBSA was verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted, recorded as W1, W2, and W3. W1 is a gas well in Xian (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depths and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before training the DMSDL-QBSA-RF classification model on the training dataset, as shown in Table 14. Then these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare it with several popular oil and gas layer classification models: the standard RF model, the SVM model, the BSA-RF model, and the DMSDL-BSA-RF model. Here, the RF classification model is applied to the field of logging for the first time. In order to evaluate the performance of the recognition model, we select the following performance indicators:

\[
\mathrm{RMSE}=\sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(f_i-y_i\right)^{2}},\qquad
\mathrm{MAE}=\frac{1}{N}\sum_{i=1}^{N}\left|f_i-y_i\right|, \tag{21}
\]
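Both indicators in (21) compute directly from the expected outputs f and the model outputs y:

```python
import math

def rmse(f, y):
    """Root mean square error between expected outputs f and model outputs y."""
    n = len(f)
    return math.sqrt(sum((fi - yi) ** 2 for fi, yi in zip(f, y)) / n)

def mae(f, y):
    """Mean absolute error between expected outputs f and model outputs y."""
    n = len(f)
    return sum(abs(fi - yi) for fi, yi in zip(f, y)) / n

# Toy example: 0/1 oil-layer labels versus predictions (one error in five).
expected = [1, 0, 1, 1, 0]
predicted = [1, 0, 0, 1, 0]
print(rmse(expected, predicted), mae(expected, predicted))
```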

Table 8: Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100000).

Function  Term  GWO          WOA          SCA          GOA          SSA          DMSDL-QBSA
f1        Max   1.3396E+04   1.4767E+04   1.3310E+04   2.0099E+01   4.8745E+03   9.8570E-01
          Min   0            0            4.2905E-293  8.6468E-17   0            0
          Mean  3.5990E+00   4.7621E+00   1.4014E+02   7.0100E-02   4.5864E+00   1.0483E-04
          Var   1.7645E+02   2.0419E+02   8.5054E+02   4.4200E-01   1.4148E+02   9.8725E-03
f2        Max   3.6021E+02   2.5789E+03   6.5027E+01   9.3479E+01   3.4359E+01   3.2313E-01
          Min   0            0            9.8354E-192  2.8954E-03   2.4642E-181  0
          Mean  5.0667E-02   2.9480E-01   4.0760E-01   3.1406E+00   1.7000E-02   1.1278E-04
          Var   3.7270E+00   2.6091E+01   2.2746E+00   3.9264E+00   5.4370E-01   4.8000E-03
f3        Max   1.8041E+04   1.6789E+04   2.4921E+04   6.5697E+03   1.1382E+04   4.0855E+01
          Min   0            1.0581E-18   7.6116E-133  2.8796E-01   3.2956E-253  1.5918E-264
          Mean  7.4511E+00   2.8838E+02   4.0693E+02   4.8472E+02   9.2062E+00   4.8381E-03
          Var   2.6124E+02   1.4642E+03   1.5913E+03   7.1786E+02   2.9107E+02   4.1383E-01
f4        Max   2.1812E+09   5.4706E+09   8.4019E+09   1.1942E+09   4.9386E+08   7.5188E+01
          Min   4.9125E+00   3.5695E+00   5.9559E+00   2.2249E+02   4.9806E+00   2.9279E-13
          Mean  4.9592E+05   2.4802E+06   4.4489E+07   5.0021E+06   3.3374E+05   2.4033E-02
          Var   2.9484E+07   1.1616E+08   4.5682E+08   3.9698E+07   1.1952E+07   9.7253E-01
f5        Max   1.8222E+04   1.5374E+04   1.5874E+04   1.2132E+03   1.6361E+04   1.8007E+00
          Min   1.1334E-08   8.3228E-09   2.3971E-01   2.7566E-10   2.6159E-16   1.0272E-33
          Mean  5.1332E+00   5.9967E+00   1.2620E+02   5.8321E+01   8.8985E+00   2.3963E-04
          Var   2.3617E+02   2.3285E+02   8.8155E+02   1.0872E+02   2.9986E+02   1.8500E-02
f6        Max   7.4088E+00   8.3047E+00   8.8101E+00   6.8900E-01   4.4298E+00   2.5787E-01
          Min   1.8112E-05   3.9349E-05   4.8350E-05   8.9528E-02   4.0807E-05   1.0734E-04
          Mean  1.8333E-03   3.2667E-03   4.8400E-02   9.3300E-02   2.1000E-03   5.2825E-04
          Var   9.4267E-02   1.0077E-01   2.9410E-01   3.5900E-02   6.5500E-02   4.6000E-03
f7        Max   7.3626E+02   6.8488E+02   8.0796E+02   3.9241E+02   8.2036E+02   1.4770E+01
          Min   0            0            1.9441E-292  7.9956E-07   0            0
          Mean  2.0490E-01   2.8060E-01   4.9889E+00   1.6572E+01   2.7290E-01   1.8081E-03
          Var   9.5155E+00   1.0152E+01   3.5531E+01   2.3058E+01   1.0581E+01   1.6033E-01
f8        Max   1.2749E+03   5.9740E+02   3.2527E+02   2.3425E+02   2.0300E+02   2.0423E+01
          Min   0            4.3596E-35   1.5241E-160  3.6588E-05   1.0239E-244  0
          Mean  3.1317E-01   1.0582E+01   1.0457E+01   1.2497E+01   2.1870E-01   2.6947E-03
          Var   1.4416E+01   4.3485E+01   3.5021E+01   2.5766E+01   6.2362E+00   2.1290E-01


where y_i and f_i are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show actual forecasting errors. Table 15 records the performance indicator data of each classification model, with the best results marked in bold. The smaller the RMSE and MAE, the better the classification model performs.

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model gets the best recognition accuracy, with all accuracies above 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it is also superior for oil and gas layer classification on the other performance indicators and wells. Secondly, DMSDL-QBSA improves the performance of RF: the parameters found by DMSDL-QBSA improve the classification accuracy of the RF model while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means the depth has no oil or gas and 1 means it has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results: it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is therefore suitable for petroleum logging applications, greatly reduces the difficulty of oil exploration, and has good application prospects.

Table 9: Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100000).

Function  Term  GWO          WOA          SCA          GOA          SSA          DMSDL-QBSA
f10       Max   2.9544E+03   2.9903E+03   2.7629E+03   2.9445E+03   3.2180E+03   3.0032E+03
          Min   1.3037E+03   8.6892E-04   1.5713E+03   1.3438E+03   1.2839E-04   1.2922E+03
          Mean  1.6053E+03   1.4339E+01   1.7860E+03   1.8562E+03   2.5055E+02   1.2960E+03
          Var   2.6594E+02   1.2243E+02   1.6564E+02   5.1605E+02   3.3099E+02   5.8691E+01
f11       Max   1.3792E+02   1.2293E+02   1.2313E+02   1.1249E+02   3.2180E+03   1.8455E+01
          Min   0            0            0            1.2437E+01   1.2839E-04   0
          Mean  4.4220E-01   1.1252E+00   9.9316E+00   2.8378E+01   2.5055E+02   3.2676E-03
          Var   4.3784E+00   6.5162E+00   1.9180E+01   1.6240E+01   3.3099E+02   1.9867E-01
f12       Max   2.0257E+01   2.0043E+01   1.9440E+01   1.6623E+01   3.2180E+03   1.9113E+00
          Min   4.4409E-15   3.2567E-15   3.2567E-15   2.3168E+00   1.2839E-04   8.8818E-16
          Mean  1.7200E-02   4.2200E-02   8.8870E-01   5.5339E+00   2.5055E+02   3.1275E-04
          Var   4.7080E-01   6.4937E-01   3.0887E+00   2.8866E+00   3.3099E+02   2.2433E-02
f13       Max   1.5246E+02   1.6106E+02   1.1187E+02   6.1505E+01   3.2180E+03   3.1560E-01
          Min   3.3000E-03   0            0            2.4147E-01   1.2839E-04   0
          Mean  4.6733E-02   9.3867E-02   1.2094E+00   3.7540E+00   2.5055E+02   3.3660E-05
          Var   1.9297E+00   2.6570E+00   6.8476E+00   4.1936E+00   3.3099E+02   3.1721E-03
f14       Max   9.5993E+07   9.9026E+07   5.9355E+07   6.1674E+06   3.2180E+03   6.4903E-01
          Min   3.8394E-09   1.1749E-08   9.6787E-03   1.8099E-04   1.2839E-04   4.7116E-32
          Mean  1.2033E+04   3.5007E+04   4.8303E+05   1.0465E+04   2.5055E+02   8.9321E-05
          Var   9.8272E+05   1.5889E+06   4.0068E+06   1.9887E+05   3.3099E+02   6.8667E-03
f15       Max   2.2691E+08   2.4717E+08   1.1346E+08   2.8101E+07   3.2180E+03   1.6407E-01
          Min   3.2467E-02   4.5345E-08   1.1922E-01   3.5465E-05   1.2839E-04   1.3498E-32
          Mean  2.9011E+04   4.3873E+04   6.5529E+05   7.2504E+04   2.5055E+02   6.7357E-05
          Var   2.3526E+06   2.7453E+06   7.1864E+06   1.2814E+06   3.3099E+02   2.4333E-03
f16       Max   1.7692E+01   1.7142E+01   1.6087E+01   8.7570E+00   3.2180E+03   1.0959E+00
          Min   2.6210E-07   0.0000E+00   6.2663E-155  1.0497E-02   1.2839E-04   0
          Mean  1.0133E-02   3.9073E-01   5.9003E-01   2.4770E+00   2.5055E+02   1.5200E-04
          Var   3.0110E-01   9.6267E-01   1.4701E+00   1.9985E+00   3.3099E+02   1.1633E-02
f18       Max   4.4776E+01   4.3588E+01   3.9095E+01   1.7041E+01   3.2180E+03   6.5613E-01
          Min   1.9360E-01   9.4058E-08   2.2666E-01   3.9111E+00   1.2839E-04   1.4998E-32
          Mean  2.1563E-01   4.7800E-02   9.8357E-01   5.4021E+00   2.5055E+02   9.4518E-05
          Var   5.8130E-01   1.0434E+00   3.0643E+00   1.6674E+00   3.3099E+02   7.0251E-03

16 Computational Intelligence and Neuroscience

Table 10: Comparison of numerical testing results on CEC2014 test sets (F1–F15, Dim = 10).

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA

F1        Max   2.7411E+08  3.6316E+09  6.8993E+08  3.9664E+08  9.6209E+08
          Min   1.4794E+07  3.1738E+09  1.3205E+06  1.6929E+05  1.7687E+05
          Mean  3.1107E+07  3.2020E+09  6.7081E+06  1.3567E+06  1.9320E+06
          Var   1.2900E+07  4.0848E+07  2.6376E+07  1.0236E+07  1.3093E+07

F2        Max   8.7763E+09  1.3597E+10  1.3515E+10  1.6907E+10  1.7326E+10
          Min   1.7455E+09  1.1878E+10  3.6982E+04  2.1268E+05  1.4074E+05
          Mean  1.9206E+09  1.1951E+10  1.7449E+08  5.9354E+07  4.8412E+07
          Var   1.9900E+08  1.5615E+08  9.2365E+08  5.1642E+08  4.2984E+08

F3        Max   4.8974E+05  8.5863E+04  2.6323E+06  2.4742E+06  5.3828E+06
          Min   1.1067E+04  1.4616E+04  1.8878E+03  1.2967E+03  6.3986E+02
          Mean  1.3286E+04  1.4976E+04  1.0451E+04  3.5184E+03  3.0848E+03
          Var   6.5283E+03  1.9988E+03  4.5736E+04  4.1580E+04  8.0572E+04

F4        Max   3.5252E+03  9.7330E+03  4.5679E+03  4.9730E+03  4.3338E+03
          Min   5.0446E+02  8.4482E+03  4.0222E+02  4.0843E+02  4.1267E+02
          Mean  5.8061E+02  8.5355E+03  4.4199E+02  4.3908E+02  4.3621E+02
          Var   1.4590E+02  1.3305E+02  1.6766E+02  1.2212E+02  8.5548E+01

F5        Max   5.2111E+02  5.2106E+02  5.2075E+02  5.2098E+02  5.2110E+02
          Min   5.2001E+02  5.2038E+02  5.2027E+02  5.2006E+02  5.2007E+02
          Mean  5.2003E+02  5.2041E+02  5.2033E+02  5.2014E+02  5.2014E+02
          Var   7.8380E-02  5.7620E-02  5.6433E-02  1.0577E-01  9.9700E-02

F6        Max   6.1243E+02  6.1299E+02  6.1374E+02  6.1424E+02  6.1569E+02
          Min   6.0881E+02  6.1157E+02  6.0514E+02  6.0288E+02  6.0257E+02
          Mean  6.0904E+02  6.1164E+02  6.0604E+02  6.0401E+02  6.0358E+02
          Var   2.5608E-01  1.2632E-01  1.0434E+00  1.1717E+00  1.3117E+00

F7        Max   8.7895E+02  1.0459E+03  9.1355E+02  1.0029E+03  9.3907E+02
          Min   7.4203E+02  1.0238E+03  7.0013E+02  7.0081E+02  7.0069E+02
          Mean  7.4322E+02  1.0253E+03  7.1184E+02  7.0332E+02  7.0290E+02
          Var   4.3160E+00  2.4258E+00  2.8075E+01  1.5135E+01  1.3815E+01

F8        Max   8.9972E+02  9.1783E+02  9.6259E+02  9.2720E+02  9.3391E+02
          Min   8.4904E+02  8.8172E+02  8.3615E+02  8.1042E+02  8.0877E+02
          Mean  8.5087E+02  8.8406E+02  8.5213E+02  8.1888E+02  8.1676E+02
          Var   2.1797E+00  4.4494E+00  8.6249E+00  9.2595E+00  9.7968E+00

F9        Max   1.0082E+03  9.9538E+02  1.0239E+03  1.0146E+03  1.0366E+03
          Min   9.3598E+02  9.6805E+02  9.2725E+02  9.2062E+02  9.1537E+02
          Mean  9.3902E+02  9.7034E+02  9.4290E+02  9.2754E+02  9.2335E+02
          Var   3.4172E+00  3.2502E+00  8.1321E+00  9.0492E+00  9.3860E+00

F10       Max   3.1087E+03  3.5943E+03  3.9105E+03  3.9116E+03  3.4795E+03
          Min   2.2958E+03  2.8792E+03  1.5807E+03  1.4802E+03  1.4316E+03
          Mean  1.8668E+03  2.9172E+03  1.7627E+03  1.6336E+03  1.6111E+03
          Var   3.2703E+01  6.1787E+01  2.2359E+02  1.9535E+02  2.2195E+02

F11       Max   3.6641E+03  3.4628E+03  4.0593E+03  3.5263E+03  3.7357E+03
          Min   2.4726E+03  2.8210E+03  1.7855E+03  1.4012E+03  1.2811E+03
          Mean  2.5553E+03  2.8614E+03  1.8532E+03  1.5790E+03  1.4869E+03
          Var   8.4488E+01  5.6351E+01  1.3201E+02  2.1085E+02  2.6682E+02

F12       Max   1.2044E+03  1.2051E+03  1.2053E+03  1.2055E+03  1.2044E+03
          Min   1.2006E+03  1.2014E+03  1.2004E+03  1.2003E+03  1.2017E+03
          Mean  1.2007E+03  1.2017E+03  1.2007E+03  1.2003E+03  1.2018E+03
          Var   1.4482E-01  3.5583E-01  2.5873E-01  2.4643E-01  2.1603E-01

F13       Max   1.3049E+03  1.3072E+03  1.3073E+03  1.3056E+03  1.3061E+03
          Min   1.3005E+03  1.3067E+03  1.3009E+03  1.3003E+03  1.3004E+03
          Mean  1.3006E+03  1.3068E+03  1.3011E+03  1.3005E+03  1.3006E+03
          Var   3.2018E-01  5.0767E-02  4.2173E-01  3.6570E-01  2.9913E-01

F14       Max   1.4395E+03  1.4565E+03  1.4775E+03  1.4749E+03  1.4493E+03
          Min   1.4067E+03  1.4522E+03  1.4009E+03  1.4003E+03  1.4005E+03
          Mean  1.4079E+03  1.4529E+03  1.4024E+03  1.4008E+03  1.4009E+03
          Var   2.1699E+00  6.3013E-01  5.3198E+00  3.2578E+00  2.8527E+00

F15       Max   5.4068E+04  3.3586E+04  4.8370E+05  4.0007E+05  1.9050E+05
          Min   1.9611E+03  2.1347E+04  1.5029E+03  1.5027E+03  1.5026E+03
          Mean  1.9933E+03  2.2417E+04  2.7920E+03  1.5860E+03  1.5816E+03
          Var   6.1622E+02  1.2832E+03  1.7802E+04  4.4091E+03  2.7233E+03


Table 11: Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10).

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA

F16       Max   1.6044E+03  1.6044E+03  1.6046E+03  1.6044E+03  1.6046E+03
          Min   1.6034E+03  1.6041E+03  1.6028E+03  1.6021E+03  1.6019E+03
          Mean  1.6034E+03  1.6041E+03  1.6032E+03  1.6024E+03  1.6024E+03
          Var   3.3300E-02  3.5900E-02  2.5510E-01  3.7540E-01  3.6883E-01

F17       Max   1.3711E+07  1.6481E+06  3.3071E+07  4.6525E+07  8.4770E+06
          Min   1.2526E+05  4.7216E+05  2.7261E+03  2.0177E+03  2.0097E+03
          Mean  1.6342E+05  4.8499E+05  8.7769E+04  3.1084E+04  2.1095E+04
          Var   2.0194E+05  2.9949E+04  1.2284E+06  8.7638E+05  2.0329E+05

F18       Max   7.7173E+07  3.5371E+07  6.1684E+08  1.4216E+08  6.6050E+08
          Min   4.1934E+03  2.6103E+06  2.2743E+03  1.8192E+03  1.8288E+03
          Mean  1.6509E+04  3.4523E+06  1.2227E+06  1.0781E+05  1.9475E+05
          Var   7.9108E+05  1.4888E+06  2.1334E+07  3.6818E+06  1.0139E+07

F19       Max   1.9851E+03  2.5657E+03  2.0875E+03  1.9872E+03  1.9555E+03
          Min   1.9292E+03  2.4816E+03  1.9027E+03  1.9023E+03  1.9028E+03
          Mean  1.9299E+03  2.4834E+03  1.9044E+03  1.9032E+03  1.9036E+03
          Var   1.0820E+00  3.8009E+00  1.1111E+01  3.3514E+00  1.4209E+00

F20       Max   8.3021E+06  2.0160E+07  1.0350E+07  6.1162E+07  1.2708E+08
          Min   5.6288E+03  1.7838E+06  2.1570E+03  2.0408E+03  2.0241E+03
          Mean  1.2260E+04  1.8138E+06  5.6957E+03  1.1337E+04  4.5834E+04
          Var   1.0918E+05  5.9134E+05  1.3819E+05  6.4167E+05  2.1988E+06

F21       Max   2.4495E+06  1.7278E+09  2.0322E+06  1.3473E+07  2.6897E+07
          Min   4.9016E+03  1.4049E+09  3.3699E+03  2.1842E+03  2.2314E+03
          Mean  6.6613E+03  1.4153E+09  9.9472E+03  1.3972E+04  7.4587E+03
          Var   4.1702E+04  2.2557E+07  3.3942E+04  2.5098E+05  2.8735E+05

F22       Max   2.8304E+03  4.9894E+03  3.1817E+03  3.1865E+03  3.2211E+03
          Min   2.5070E+03  4.1011E+03  2.3492E+03  2.2442E+03  2.2314E+03
          Mean  2.5081E+03  4.1175E+03  2.3694E+03  2.2962E+03  2.2687E+03
          Var   5.4064E+00  3.8524E+01  4.3029E+01  4.8006E+01  5.1234E+01

F23       Max   2.8890E+03  2.6834E+03  2.8758E+03  3.0065E+03  2.9923E+03
          Min   2.5000E+03  2.6031E+03  2.4870E+03  2.5000E+03  2.5000E+03
          Mean  2.5004E+03  2.6088E+03  2.5326E+03  2.5010E+03  2.5015E+03
          Var   9.9486E+00  8.6432E+00  5.7045E+01  1.4213E+01  1.7156E+01

F24       Max   2.6293E+03  2.6074E+03  2.6565E+03  2.6491E+03  2.6369E+03
          Min   2.5816E+03  2.6049E+03  2.5557E+03  2.5246E+03  2.5251E+03
          Mean  2.5829E+03  2.6052E+03  2.5671E+03  2.5337E+03  2.5338E+03
          Var   1.3640E+00  3.3490E-01  8.9434E+00  1.3050E+01  1.1715E+01

F25       Max   2.7133E+03  2.7014E+03  2.7445E+03  2.7122E+03  2.7327E+03
          Min   2.6996E+03  2.7003E+03  2.6784E+03  2.6789E+03  2.6635E+03
          Mean  2.6996E+03  2.7004E+03  2.6831E+03  2.6903E+03  2.6894E+03
          Var   2.3283E-01  9.9333E-02  4.9609E+00  7.1175E+00  1.2571E+01

F26       Max   2.7056E+03  2.8003E+03  2.7058E+03  2.7447E+03  2.7116E+03
          Min   2.7008E+03  2.8000E+03  2.7003E+03  2.7002E+03  2.7002E+03
          Mean  2.7010E+03  2.8000E+03  2.7005E+03  2.7003E+03  2.7003E+03
          Var   3.1316E-01  1.6500E-02  3.9700E-01  1.1168E+00  3.5003E-01

F27       Max   3.4052E+03  5.7614E+03  3.3229E+03  3.4188E+03  3.4144E+03
          Min   2.9000E+03  3.9113E+03  2.9698E+03  2.8347E+03  2.7054E+03
          Mean  2.9038E+03  4.0351E+03  2.9816E+03  2.8668E+03  2.7219E+03
          Var   3.2449E+01  2.0673E+02  3.4400E+01  6.2696E+01  6.1325E+01

F28       Max   4.4333E+03  5.4138E+03  4.5480E+03  4.3490E+03  4.8154E+03
          Min   3.0000E+03  4.0092E+03  3.4908E+03  3.0000E+03  3.0000E+03
          Mean  3.0079E+03  4.0606E+03  3.6004E+03  3.0063E+03  3.0065E+03
          Var   6.8101E+01  9.2507E+01  8.5705E+01  4.2080E+01  5.3483E+01

F29       Max   2.5038E+07  1.6181E+08  5.9096E+07  7.0928E+07  6.4392E+07
          Min   3.2066E+03  1.0014E+08  3.1755E+03  3.1005E+03  3.3287E+03
          Mean  5.7057E+04  1.0476E+08  8.5591E+04  6.8388E+04  2.8663E+04
          Var   4.9839E+05  6.7995E+06  1.7272E+06  1.5808E+06  8.6740E+05


Figure 3: The effect of the two parameters on the performance of RF models: (a) the effect of ntree (accuracy, roughly 0.86–0.98, plotted against ntree from 0 to 1000); (b) the effect of mtry (accuracy, roughly 0.89–0.95, plotted against mtry from 0 to 35).

Variable setting: number of iterations iter; bird positions Xi; local optimum Pi; global optimal position gbest; global optimum Pgbest; out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA.
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF.
(1) Begin
(2)   /* Build classification model based on DMSDL-QBSA-RF */
(3)   Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4)   Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5)   While iter < itermax + 1 do
(6)     For i = 1 : ntree
(7)       Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)       Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)       Recursively generate each decision tree without pruning operations
(10)    End For
(11)    Update the classification accuracy of RF: evaluate f(Xi)
(12)    Update gbest and Pgbest
(13)    [nbest, mbest] = gbest
(14)    iter = iter + 1
(15)  End While
(16)  /* Classify using the RF model */
(17)  For i = 1 : nbest
(18)    Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)    Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)    Recursively generate each decision tree without pruning operations
(21)  End
(22)  Return the DMSDL-QBSA-RF classification model
(23)  Classify the test dataset using equation (20)
(24)  Calculate OOB_error
(25) End

Algorithm 2: DMSDL-QBSA-RF classification model.
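As a rough illustration of the outer optimization loop in Algorithm 2, the sketch below searches for (ntree, mtry) with a deliberately simplified swarm. Everything here is an assumption for illustration: rf_fitness is a hypothetical toy surface standing in for the out-of-bag accuracy of a forest trained with the candidate parameters (lines (6)–(11) of the algorithm), and the position update is a plain contraction toward gbest rather than the full dynamic multi-swarm differential-learning quantum update.

```python
import random

def rf_fitness(ntree, mtry):
    # Hypothetical stand-in for RF out-of-bag accuracy; in the real model
    # this step trains a random forest and evaluates f(Xi).
    # Toy surface peaking near ntree = 500, mtry = 7 (assumed values).
    return 1.0 - ((ntree - 500) / 1000) ** 2 - ((mtry - 7) / 35) ** 2

def optimize_rf_params(n_birds=10, iter_max=50, seed=1):
    rng = random.Random(seed)
    # Positions Xi: integer pairs with ntree in [1, 1000] and mtry in [1, 35].
    swarm = [(rng.randint(1, 1000), rng.randint(1, 35)) for _ in range(n_birds)]
    gbest = max(swarm, key=lambda p: rf_fitness(*p))
    for _ in range(iter_max):
        moved = []
        for ntree, mtry in swarm:
            # Simplified move: halfway toward gbest plus a random perturbation,
            # clamped to the parameter bounds.
            ntree = min(1000, max(1, ntree + round(0.5 * (gbest[0] - ntree)) + rng.randint(-20, 20)))
            mtry = min(35, max(1, mtry + round(0.5 * (gbest[1] - mtry)) + rng.randint(-2, 2)))
            moved.append((ntree, mtry))
        swarm = moved
        best = max(swarm, key=lambda p: rf_fitness(*p))
        if rf_fitness(*best) > rf_fitness(*gbest):
            gbest = best  # keep the best-so-far pair, as in lines (12)-(13)
    return gbest  # (nbest, mbest), used to train the final forest

nbest, mbest = optimize_rf_params()
```

The returned (nbest, mbest) pair plays the role of gbest in line (13); in the full model it parameterizes the final forest built in lines (17)–(21).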

Table 11: Continued.

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA

F30       Max   5.2623E+05  2.8922E+07  1.1938E+06  1.2245E+06  1.1393E+06
          Min   5.9886E+03  1.9017E+07  3.7874E+03  3.6290E+03  3.6416E+03
          Mean  7.4148E+03  2.0002E+07  5.5468E+03  4.3605E+03  4.1746E+03
          Var   1.1434E+04  1.1968E+06  3.2255E+04  2.2987E+04  1.8202E+04


Figure 4: Block diagram of the oil layer classification system based on DMSDL-QBSA-RF. The pipeline proceeds from the information to be identified through data selection and preprocessing (removal of redundant attributes and pretreatment), attribute discretization, attribute reduction and data normalization, and model building based on DMSDL-QBSA-RF to RF classification and the output.

Table 13: Distribution of the logging data.

      Training dataset                              Test dataset
Well  Depth (m)   Data  Oil/gas layers  Dry layer   Depth (m)   Data  Oil/gas layers  Dry layer
W1    3027~3058   250   203             47          3250~3450   1600  237             1363
W2    2642~2648   30    10              20          2940~2978   191   99              92
W3    1040~1059   160   47              113         1120~1290   1114  96              1018

Table 12: Comparison of numerical testing results on eight UCI datasets.

Dataset        RF (%)  BSA-RF (%)  DMSDL-BSA-RF (%)  DMSDL-QBSA-RF (%)
Blood          75.40   80.80       79.46             81.25
Heart-statlog  82.10   86.42       86.42             86.42
Sonar          78.39   85.48       85.48             88.71
Appendicitis   83.23   87.10       90.32             93.55
Cleve          79.55   87.50       88.64             89.77
Magic          88.95   89.85       90.25             89.32
Mammographic   74.73   77.68       76.79             79.46
Australian     85.83   88.84       88.84             90.78

Table 14: Attribute reduction results of the logging dataset.

Well  Attributes
W1    Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
      Reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2    Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
      Reduction attributes: NEUTRO, PERM, POR, RESI, SW
W3    Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
      Reduction attributes: AC, GR, RI, RXO, SP

Figure 5: The normalized curves of attributes: (a) and (b) attribute normalization of W1 (NPHI, SP, DT, and LLD in (a); DEN, GR, and LLS in (b); depth 3250–3450 m); (c) and (d) attribute normalization of W2 (NEUTRO and PERM in (c); RESI, SW, and POR in (d); depth 2940–2980 m); (e) and (f) attribute normalization of W3 (GR, SP, and RI in (e); RXO and AC in (f); depth 1150–1300 m).

Table 15: Performance of various well data.

Well  Classification model  RMSE    MAE       Accuracy (%)  Running time (s)
W1    RF                    0.3326  0.1106    88.94         1.5167
      SVM                   0.2681  0.0719    92.81         4.5861
      BSA-RF                0.3269  0.1069    89.31         1.8064
      DMSDL-BSA-RF          0.2806  0.0788    92.13         2.3728
      DMSDL-QBSA-RF         0.2449  0.0600    94.00         1.4663
W2    RF                    0.4219  0.1780    82.20         3.1579
      SVM                   0.2983  0.0890    91.10         4.2604
      BSA-RF                0.3963  0.1571    84.29         1.2979
      DMSDL-BSA-RF          0.2506  0.062827  93.72         1.6124
      DMSDL-QBSA-RF         0.2399  0.0576    94.24         1.2287
W3    RF                    0.4028  0.1622    83.78         2.4971
      SVM                   0.2507  0.0628    93.72         2.1027
      BSA-RF                0.3631  0.1318    86.81         1.3791
      DMSDL-BSA-RF          0.2341  0.0548    94.52         0.3125
      DMSDL-QBSA-RF         0.0519  0.0027    99.73         0.9513


5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of the three strategies on the performance of DMSDL-QBSA revealed that the hybrid method has an excellent influence on improving the original BSA and clearly outperforms the original DE. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and DMSDL-QBSA again shows excellent performance. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than that of the original RF model. At the same time, it runs faster than the other four advanced classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Because of the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA. Therefore, deploying the proposed algorithm in ways that increase recognition efficiency is a worthwhile direction. In future research, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request by contact with the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), the Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), the Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), the Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and the Project of Industry-University Cooperative Education of the Ministry of Education of China (no. 201801335014).


Figure 6: Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.

[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.

[4] X. Li, A new intelligent optimization method: artificial fish school algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.

[5] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.

[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.

[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673–687, 2016.

[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University - Computer and Information Sciences, vol. 32, pp. 514–521, 2020.

[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," in Neural Computing and Applications, Springer, New York, NY, USA, 2019.

[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.

[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.

[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197–207, 2014.

[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.

[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17–28, 2014.

[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137–146, 2012.

[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, Multi-Strategy Brain Storm Optimization Algorithm with Dynamic Parameters Adjustment, Applied Intelligence, Guildford, UK, 2010.

[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189–120201, 2019.

[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.

[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.

[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.

[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.

[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.

[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.

[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.

[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.

[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," in Proceedings of the IEEE Transactions on Evolutionary Computation, Piscataway, NJ, USA, February 2019.

[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.

[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.

[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.

[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.

[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.

[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC 96), Nagoya, Japan, May 1996.

[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.

[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.

[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.

[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China - Information Sciences, vol. 62, no. 7, 2019.

[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.

[41] S. Shahrzad, M. Seyedali, and L. Andrew, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.

[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22–34, 2020.

[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.



In summary, for the RF classification model to obtain the ideal optimal solution, the selection of the number of decision trees ntree and the number of predictor variables mtry is very important, and the classification accuracy of the RF classification model can only be optimized through the comprehensive optimization of these two parameters. It is therefore necessary to use the proposed algorithm to find a suitable set of RF parameters. Next, we optimize the RF classification model with the improved BSA proposed in Section 2.

3.2. RF Model Based on an Improved Bird Swarm Algorithm. The improved bird swarm algorithm optimized RF classification model (DMSDL-QBSA-RF) uses the improved bird swarm algorithm to optimize the RF classification model: the training dataset is introduced into the training process of the RF classification model, finally yielding the DMSDL-QBSA-RF classification model. The main idea is to construct a two-dimensional fitness function containing RF's two parameters, ntree and mtry, as the optimization target of DMSDL-QBSA, so as to obtain a set of parameters that gives the RF classification model the best classification accuracy. The specific algorithm steps are shown in Algorithm 2.

3.3. Simulation Experiment and Analysis. In order to test the performance of the improved DMSDL-QBSA-RF classification model, we compare it with the standard RF model, the BSA-RF model, and the DMSDL-BSA-RF model on 8 two-dimensional UCI datasets. The DMSDL-BSA-RF classification model is an RF classification model optimized by BSA without quantum behavior. In our experiment, each dataset is divided into two parts: 70% of the dataset is used as the training set and the remaining 30% as the test set. The average classification accuracies of 10 independent runs of each model are recorded in Table 12, where the best results are marked in bold.
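The 70/30 partition described above can be sketched as a seeded shuffle-and-split; this is a minimal stand-in, since the paper does not specify exactly how the random partition was drawn.

```python
import random

def split_dataset(samples, train_frac=0.7, seed=42):
    """Shuffle a dataset and split it into training and test sets."""
    rng = random.Random(seed)          # fixed seed for reproducible runs
    idx = list(range(len(samples)))
    rng.shuffle(idx)
    cut = int(round(train_frac * len(samples)))
    train = [samples[i] for i in idx[:cut]]
    test = [samples[i] for i in idx[cut:]]
    return train, test

train, test = split_dataset(list(range(100)))  # 70 training, 30 test samples
```

Repeating the experiment 10 times with different seeds and averaging the resulting accuracies gives the per-model figures reported in Table 12.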

From the accuracy results in Table 12, we can see that the DMSDL-QBSA-RF classification model obtains the best accuracy on every UCI dataset except the magic dataset, on which the DMSDL-BSA-RF classification model achieves the best accuracy. Compared with the standard RF model, the accuracy of the DMSDL-QBSA-RF classification model is higher by about 10%. Finally, the

Table 6: Comparison on nine multimodal functions with hybrid algorithms (Dim = 10).

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA

f10       Max   2.8498E+03  2.8226E+03  3.0564E+03  2.7739E+03  2.8795E+03
          Min   1.2553E+02  1.8214E+03  1.2922E+03  1.6446E+02  1.1634E+03
          Mean  2.5861E+02  1.9229E+03  1.3185E+03  3.1119E+02  1.2729E+03
          Var   2.4093E+02  1.2066E+02  1.2663E+02  2.5060E+02  1.2998E+02

f11       Max   1.2550E+02  1.0899E+02  1.1806E+02  1.1243E+02  9.1376E+01
          Min   0           6.3502E+01  1.0751E+01  0           0
          Mean  2.0417E-01  6.7394E+01  3.9864E+01  1.3732E-01  6.8060E-02
          Var   3.5886E+00  5.8621E+00  1.3570E+01  3.0325E+00  2.0567E+00

f12       Max   2.0021E+01  1.9910E+01  1.9748E+01  1.9254E+01  1.8118E+01
          Min   8.8818E-16  1.6575E+01  7.1700E-02  8.8818E-16  8.8818E-16
          Mean  3.0500E-02  1.7157E+01  3.0367E+00  3.8520E-02  1.3420E-02
          Var   5.8820E-01  5.2968E-01  1.6585E+00  6.4822E-01  4.2888E-01

f13       Max   1.0431E+02  1.3266E+02  1.5115E+02  1.2017E+02  6.1996E+01
          Min   0           4.5742E+01  2.1198E-01  0           0
          Mean  6.1050E-02  5.2056E+01  3.0613E+00  6.9340E-02  2.9700E-02
          Var   1.8258E+00  8.3141E+00  1.5058E+01  2.2452E+00  1.0425E+00

f14       Max   8.4576E+06  3.0442E+07  5.3508E+07  6.2509E+07  8.5231E+06
          Min   1.7658E-13  1.9816E+06  4.5685E-05  1.6961E-13  5.1104E-01
          Mean  1.3266E+03  3.1857E+06  6.8165E+04  8.8667E+03  1.1326E+03
          Var   9.7405E+04  1.4876E+06  1.4622E+06  6.4328E+05  8.7645E+04

f15       Max   1.8310E+08  1.4389E+08  1.8502E+08  1.4578E+08  2.5680E+07
          Min   1.7942E-11  1.0497E+07  2.4500E-03  1.1248E-11  9.9870E-01
          Mean  3.7089E+04  1.5974E+07  2.0226E+05  3.8852E+04  3.5739E+03
          Var   2.0633E+06  1.0724E+07  4.6539E+06  2.1133E+06  2.6488E+05

f16       Max   1.3876E+01  1.4988E+01  1.4849E+01  1.3506E+01  9.3280E+00
          Min   4.2410E-174 6.8743E+00  2.5133E-02  7.3524E-176 0
          Mean  1.3633E-02  7.2408E+00  2.5045E+00  1.3900E-02  5.1800E-03
          Var   3.3567E-01  7.7774E-01  1.0219E+00  3.4678E-01  1.7952E-01

f18       Max   3.6704E+01  3.6950E+01  2.8458E+01  2.6869E+01  2.4435E+01
          Min   2.0914E-11  9.7737E+00  3.3997E-03  5.9165E-12  7.5806E-01
          Mean  6.5733E-02  1.2351E+01  6.7478E-01  5.6520E-02  7.9392E-01
          Var   6.7543E-01  2.8057E+00  1.4666E+00  6.4874E-01  3.6928E-01


DMSDL-QBSA-RF classification model achieves the best accuracy on the appendicitis dataset, up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is effective on most datasets and performs well on them.

4. Oil Layer Classification Application

4.1. Design of Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1. The selected actual logging datasets should be intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis and relatively independent. Each dataset is randomly divided into two parts of training and testing samples.

Step 2. In order to better understand the relationship between independent variables and dependent variables and to reduce the sample information attributes, the continuous attributes of the dataset are discretized using a greedy algorithm.

Step 3. In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize the attribute reduction. After attribute reduction, normalization of the actual logging datasets is carried out to avoid computational saturation.

Step 4. In the DMSDL-QBSA-RF layer classification model, we input the actual logging dataset after attribute reduction, use the DMSDL-QBSA-RF layer classification algorithm to train, and finally obtain the DMSDL-QBSA-RF layer classification model.
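The normalization in Step 3 can be illustrated with a simple min-max rescaling of each logging attribute to [0, 1]; the specific scaling rule is an assumption here, and the GR readings below are hypothetical values, not data from the paper.

```python
def min_max_normalize(curve):
    """Scale one logging attribute to [0, 1]; constant curves map to 0."""
    lo, hi = min(curve), max(curve)
    if hi == lo:
        return [0.0 for _ in curve]  # avoid division by zero
    return [(v - lo) / (hi - lo) for v in curve]

gr = [55.2, 80.1, 120.7, 63.4]   # hypothetical GR readings along a depth interval
gr_norm = min_max_normalize(gr)
```

Rescaling every attribute to a common range is what makes curves with very different physical units (e.g., GR and DEN) comparable on the shared axes of Figure 5.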

Table 7: Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA

f10       Max   2.0101E+02  2.8292E+02  2.9899E+02  2.4244E+02  2.8533E+02
          Min   2.5455E-05  3.7717E+00  2.3690E+01  2.5455E-05  9.4751E+01
          Mean  3.4882E-01  8.0980E+00  2.5222E+01  4.2346E-01  9.4816E+01
          Var   5.7922E+00  1.0853E+01  1.5533E+01  5.6138E+00  2.8160E+00

f11       Max   4.7662E+00  4.7784E+00  1.1067E+01  8.7792E+00  8.1665E+00
          Min   0           3.8174E-01  0           0           0
          Mean  2.7200E-03  6.0050E-01  3.1540E-02  4.2587E-03  3.7800E-03
          Var   8.6860E-02  3.1980E-01  2.4862E-01  1.2032E-01  1.3420E-01

f12       Max   9.6893E+00  8.1811E+00  1.1635E+01  9.1576E+00  8.4720E+00
          Min   8.8818E-16  5.1646E-01  8.8818E-16  8.8818E-16  8.8818E-16
          Mean  9.5600E-03  6.9734E-01  3.4540E-02  9.9600E-03  2.8548E-03
          Var   2.1936E-01  6.1050E-01  2.5816E-01  2.1556E-01  1.1804E-01

f13       Max   4.4609E+00  4.9215E+00  4.1160E+00  1.9020E+00  1.6875E+00
          Min   0           1.3718E-01  0           0           0
          Mean  1.9200E-03  1.7032E-01  1.8240E-02  1.4800E-03  5.7618E-04
          Var   6.6900E-02  1.3032E-01  1.8202E-01  3.3900E-02  2.2360E-02

f14       Max   1.0045E+01  1.9266E+03  1.9212E+01  5.7939E+02  8.2650E+00
          Min   2.3558E-31  1.3188E-01  2.3558E-31  2.3558E-31  2.3558E-31
          Mean  4.1600E-03  3.5402E-01  1.0840E-02  6.1420E-02  1.3924E-03
          Var   1.7174E-01  1.9427E+01  3.9528E-01  5.8445E+00  8.7160E-02

f15       Max   6.5797E+04  4.4041E+03  1.4412E+05  8.6107E+03  2.6372E+00
          Min   1.3498E-31  9.1580E-02  1.3498E-31  1.3498E-31  7.7800E-03
          Mean  7.1736E+00  8.9370E-01  1.7440E+01  9.0066E-01  8.2551E-03
          Var   6.7678E+02  5.4800E+01  1.4742E+03  8.7683E+01  2.8820E-02

f16       Max   6.2468E-01  6.4488E-01  5.1564E-01  8.4452E-01  3.9560E-01
          Min   6.9981E-08  2.5000E-03  1.5518E-240 2.7655E-07  0
          Mean  2.7062E-04  6.9400E-03  6.8555E-04  2.0497E-04  6.1996E-05
          Var   1.0380E-02  1.7520E-02  8.4600E-03  1.0140E-02  4.4000E-03

f17       Max   5.1946E+00  3.6014E+00  2.3463E+00  6.9106E+00  1.2521E+00
          Min   2.6445E-11  2.6739E-02  0           1.0855E-10  0
          Mean  1.9343E-03  5.1800E-02  1.2245E-03  2.8193E-03  1.5138E-04
          Var   7.3540E-02  1.2590E-01  4.1620E-02  1.1506E-01  1.2699E-02

f18       Max   5.0214E-01  3.4034E-01  4.1400E-01  3.7422E-01  4.0295E-01
          Min   1.4998E-32  1.9167E-03  1.4998E-32  1.4998E-32  1.4998E-32
          Mean  1.0967E-04  4.1000E-03  1.8998E-04  1.4147E-04  6.0718E-05
          Var   6.1800E-03  1.0500E-02  6.5200E-03  5.7200E-03  4.4014E-03

14 Computational Intelligence and Neuroscience

Step 5. The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and the classification results are output.

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA is simulated and analyzed on benchmark functions, and in Section 3.3, the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA is tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted, recorded as W1, W2, and W3. W1 is a gas well in Xi'an (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depths and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before the training of the DMSDL-QBSA-RF classification model on the training dataset, as shown in Table 14. Then, these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, the SVM model, the BSA-RF model, and the DMSDL-BSA-RF model. Here, the RF classification model is applied to the field of logging for the first time. In order to evaluate the performance of the recognition model, we select the following performance indicators:

\[
\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(f_i - y_i\right)^{2}}, \qquad
\mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\left|f_i - y_i\right| \tag{21}
\]
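The two indicators of equation (21) can be computed directly; a minimal sketch:

```python
import math

def rmse(f, y):
    """Root-mean-square error between expected outputs f and model outputs y."""
    n = len(f)
    return math.sqrt(sum((fi - yi) ** 2 for fi, yi in zip(f, y)) / n)

def mae(f, y):
    """Mean absolute error between expected outputs f and model outputs y."""
    n = len(f)
    return sum(abs(fi - yi) for fi, yi in zip(f, y)) / n

# Binary oil/gas labels for illustration: 1 = layer present, 0 = absent
expected = [1, 0, 1, 1]
predicted = [1, 0, 0, 1]
print(rmse(expected, predicted))  # 0.5
print(mae(expected, predicted))   # 0.25
```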

Table 8. Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function  Term  GWO         WOA         SCA          GOA         SSA          DMSDL-QBSA
f1        Max   1.3396E+04  1.4767E+04  1.3310E+04   2.0099E+01  4.8745E+03   9.8570E-01
          Min   0           0           4.2905E-293  8.6468E-17  0            0
          Mean  3.5990E+00  4.7621E+00  1.4014E+02   7.0100E-02  4.5864E+00   1.0483E-04
          Var   1.7645E+02  2.0419E+02  8.5054E+02   4.4200E-01  1.4148E+02   9.8725E-03
f2        Max   3.6021E+02  2.5789E+03  6.5027E+01   9.3479E+01  3.4359E+01   3.2313E-01
          Min   0           0           9.8354E-192  2.8954E-03  2.4642E-181  0
          Mean  5.0667E-02  2.9480E-01  4.0760E-01   3.1406E+00  1.7000E-02   1.1278E-04
          Var   3.7270E+00  2.6091E+01  2.2746E+00   3.9264E+00  5.4370E-01   4.8000E-03
f3        Max   1.8041E+04  1.6789E+04  2.4921E+04   6.5697E+03  1.1382E+04   4.0855E+01
          Min   0           1.0581E-18  7.6116E-133  2.8796E-01  3.2956E-253  1.5918E-264
          Mean  7.4511E+00  2.8838E+02  4.0693E+02   4.8472E+02  9.2062E+00   4.8381E-03
          Var   2.6124E+02  1.4642E+03  1.5913E+03   7.1786E+02  2.9107E+02   4.1383E-01
f4        Max   2.1812E+09  5.4706E+09  8.4019E+09   1.1942E+09  4.9386E+08   7.5188E+01
          Min   4.9125E+00  3.5695E+00  5.9559E+00   2.2249E+02  4.9806E+00   2.9279E-13
          Mean  4.9592E+05  2.4802E+06  4.4489E+07   5.0021E+06  3.3374E+05   2.4033E-02
          Var   2.9484E+07  1.1616E+08  4.5682E+08   3.9698E+07  1.1952E+07   9.7253E-01
f5        Max   1.8222E+04  1.5374E+04  1.5874E+04   1.2132E+03  1.6361E+04   1.8007E+00
          Min   1.1334E-08  8.3228E-09  2.3971E-01   2.7566E-10  2.6159E-16   1.0272E-33
          Mean  5.1332E+00  5.9967E+00  1.2620E+02   5.8321E+01  8.8985E+00   2.3963E-04
          Var   2.3617E+02  2.3285E+02  8.8155E+02   1.0872E+02  2.9986E+02   1.8500E-02
f6        Max   7.4088E+00  8.3047E+00  8.8101E+00   6.8900E-01  4.4298E+00   2.5787E-01
          Min   1.8112E-05  3.9349E-05  4.8350E-05   8.9528E-02  4.0807E-05   1.0734E-04
          Mean  1.8333E-03  3.2667E-03  4.8400E-02   9.3300E-02  2.1000E-03   5.2825E-04
          Var   9.4267E-02  1.0077E-01  2.9410E-01   3.5900E-02  6.5500E-02   4.6000E-03
f7        Max   7.3626E+02  6.8488E+02  8.0796E+02   3.9241E+02  8.2036E+02   1.4770E+01
          Min   0           0           1.9441E-292  7.9956E-07  0            0
          Mean  2.0490E-01  2.8060E-01  4.9889E+00   1.6572E+01  2.7290E-01   1.8081E-03
          Var   9.5155E+00  1.0152E+01  3.5531E+01   2.3058E+01  1.0581E+01   1.6033E-01
f8        Max   1.2749E+03  5.9740E+02  3.2527E+02   2.3425E+02  2.0300E+02   2.0423E+01
          Min   0           4.3596E-35  1.5241E-160  3.6588E-05  1.0239E-244  0
          Mean  3.1317E-01  1.0582E+01  1.0457E+01   1.2497E+01  2.1870E-01   2.6947E-03
          Var   1.4416E+01  4.3485E+01  3.5021E+01   2.5766E+01  6.2362E+00   2.1290E-01


where yi and fi are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show the actual forecasting errors. Table 15 records the performance indicator data of each classification model, and the best results are marked in bold. The smaller the RMSE and MAE, the better the classification model performs.

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model achieves the best recognition accuracy, and all its accuracies are above 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it also performs well on the other performance indicators and wells. Secondly, DMSDL-QBSA can improve the performance of RF: the parameters found by DMSDL-QBSA and used in the RF classification model improve the classification accuracy while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means that a depth has no oil or gas, and 1 means that a depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results: it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is therefore suitable for petroleum logging applications, greatly reduces the difficulty of oil exploration, and has a good application prospect.

Table 9. Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function  Term  GWO         WOA         SCA          GOA         SSA         DMSDL-QBSA
f10       Max   2.9544E+03  2.9903E+03  2.7629E+03   2.9445E+03  3.2180E+03  3.0032E+03
          Min   1.3037E+03  8.6892E-04  1.5713E+03   1.3438E+03  1.2839E-04  1.2922E+03
          Mean  1.6053E+03  1.4339E+01  1.7860E+03   1.8562E+03  2.5055E+02  1.2960E+03
          Var   2.6594E+02  1.2243E+02  1.6564E+02   5.1605E+02  3.3099E+02  5.8691E+01
f11       Max   1.3792E+02  1.2293E+02  1.2313E+02   1.1249E+02  3.2180E+03  1.8455E+01
          Min   0           0           0            1.2437E+01  1.2839E-04  0
          Mean  4.4220E-01  1.1252E+00  9.9316E+00   2.8378E+01  2.5055E+02  3.2676E-03
          Var   4.3784E+00  6.5162E+00  1.9180E+01   1.6240E+01  3.3099E+02  1.9867E-01
f12       Max   2.0257E+01  2.0043E+01  1.9440E+01   1.6623E+01  3.2180E+03  1.9113E+00
          Min   4.4409E-15  3.2567E-15  3.2567E-15   2.3168E+00  1.2839E-04  8.8818E-16
          Mean  1.7200E-02  4.2200E-02  8.8870E-01   5.5339E+00  2.5055E+02  3.1275E-04
          Var   4.7080E-01  6.4937E-01  3.0887E+00   2.8866E+00  3.3099E+02  2.2433E-02
f13       Max   1.5246E+02  1.6106E+02  1.1187E+02   6.1505E+01  3.2180E+03  3.1560E-01
          Min   3.3000E-03  0           0            2.4147E-01  1.2839E-04  0
          Mean  4.6733E-02  9.3867E-02  1.2094E+00   3.7540E+00  2.5055E+02  3.3660E-05
          Var   1.9297E+00  2.6570E+00  6.8476E+00   4.1936E+00  3.3099E+02  3.1721E-03
f14       Max   9.5993E+07  9.9026E+07  5.9355E+07   6.1674E+06  3.2180E+03  6.4903E-01
          Min   3.8394E-09  1.1749E-08  9.6787E-03   1.8099E-04  1.2839E-04  4.7116E-32
          Mean  1.2033E+04  3.5007E+04  4.8303E+05   1.0465E+04  2.5055E+02  8.9321E-05
          Var   9.8272E+05  1.5889E+06  4.0068E+06   1.9887E+05  3.3099E+02  6.8667E-03
f15       Max   2.2691E+08  2.4717E+08  1.1346E+08   2.8101E+07  3.2180E+03  1.6407E-01
          Min   3.2467E-02  4.5345E-08  1.1922E-01   3.5465E-05  1.2839E-04  1.3498E-32
          Mean  2.9011E+04  4.3873E+04  6.5529E+05   7.2504E+04  2.5055E+02  6.7357E-05
          Var   2.3526E+06  2.7453E+06  7.1864E+06   1.2814E+06  3.3099E+02  2.4333E-03
f16       Max   1.7692E+01  1.7142E+01  1.6087E+01   8.7570E+00  3.2180E+03  1.0959E+00
          Min   2.6210E-07  0.0000E+00  6.2663E-155  1.0497E-02  1.2839E-04  0
          Mean  1.0133E-02  3.9073E-01  5.9003E-01   2.4770E+00  2.5055E+02  1.5200E-04
          Var   3.0110E-01  9.6267E-01  1.4701E+00   1.9985E+00  3.3099E+02  1.1633E-02
f18       Max   4.4776E+01  4.3588E+01  3.9095E+01   1.7041E+01  3.2180E+03  6.5613E-01
          Min   1.9360E-01  9.4058E-08  2.2666E-01   3.9111E+00  1.2839E-04  1.4998E-32
          Mean  2.1563E-01  4.7800E-02  9.8357E-01   5.4021E+00  2.5055E+02  9.4518E-05
          Var   5.8130E-01  1.0434E+00  3.0643E+00   1.6674E+00  3.3099E+02  7.0251E-03


Table 10. Comparison of numerical testing results on CEC2014 test sets (F1–F15, Dim = 10).

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA
F1        Max   2.7411E+08  3.6316E+09  6.8993E+08  3.9664E+08  9.6209E+08
          Min   1.4794E+07  3.1738E+09  1.3205E+06  1.6929E+05  1.7687E+05
          Mean  3.1107E+07  3.2020E+09  6.7081E+06  1.3567E+06  1.9320E+06
          Var   1.2900E+07  4.0848E+07  2.6376E+07  1.0236E+07  1.3093E+07
F2        Max   8.7763E+09  1.3597E+10  1.3515E+10  1.6907E+10  1.7326E+10
          Min   1.7455E+09  1.1878E+10  3.6982E+04  2.1268E+05  1.4074E+05
          Mean  1.9206E+09  1.1951E+10  1.7449E+08  5.9354E+07  4.8412E+07
          Var   1.9900E+08  1.5615E+08  9.2365E+08  5.1642E+08  4.2984E+08
F3        Max   4.8974E+05  8.5863E+04  2.6323E+06  2.4742E+06  5.3828E+06
          Min   1.1067E+04  1.4616E+04  1.8878E+03  1.2967E+03  6.3986E+02
          Mean  1.3286E+04  1.4976E+04  1.0451E+04  3.5184E+03  3.0848E+03
          Var   6.5283E+03  1.9988E+03  4.5736E+04  4.1580E+04  8.0572E+04
F4        Max   3.5252E+03  9.7330E+03  4.5679E+03  4.9730E+03  4.3338E+03
          Min   5.0446E+02  8.4482E+03  4.0222E+02  4.0843E+02  4.1267E+02
          Mean  5.8061E+02  8.5355E+03  4.4199E+02  4.3908E+02  4.3621E+02
          Var   1.4590E+02  1.3305E+02  1.6766E+02  1.2212E+02  8.5548E+01
F5        Max   5.2111E+02  5.2106E+02  5.2075E+02  5.2098E+02  5.2110E+02
          Min   5.2001E+02  5.2038E+02  5.2027E+02  5.2006E+02  5.2007E+02
          Mean  5.2003E+02  5.2041E+02  5.2033E+02  5.2014E+02  5.2014E+02
          Var   7.8380E-02  5.7620E-02  5.6433E-02  1.0577E-01  9.9700E-02
F6        Max   6.1243E+02  6.1299E+02  6.1374E+02  6.1424E+02  6.1569E+02
          Min   6.0881E+02  6.1157E+02  6.0514E+02  6.0288E+02  6.0257E+02
          Mean  6.0904E+02  6.1164E+02  6.0604E+02  6.0401E+02  6.0358E+02
          Var   2.5608E-01  1.2632E-01  1.0434E+00  1.1717E+00  1.3117E+00
F7        Max   8.7895E+02  1.0459E+03  9.1355E+02  1.0029E+03  9.3907E+02
          Min   7.4203E+02  1.0238E+03  7.0013E+02  7.0081E+02  7.0069E+02
          Mean  7.4322E+02  1.0253E+03  7.1184E+02  7.0332E+02  7.0290E+02
          Var   4.3160E+00  2.4258E+00  2.8075E+01  1.5135E+01  1.3815E+01
F8        Max   8.9972E+02  9.1783E+02  9.6259E+02  9.2720E+02  9.3391E+02
          Min   8.4904E+02  8.8172E+02  8.3615E+02  8.1042E+02  8.0877E+02
          Mean  8.5087E+02  8.8406E+02  8.5213E+02  8.1888E+02  8.1676E+02
          Var   2.1797E+00  4.4494E+00  8.6249E+00  9.2595E+00  9.7968E+00
F9        Max   1.0082E+03  9.9538E+02  1.0239E+03  1.0146E+03  1.0366E+03
          Min   9.3598E+02  9.6805E+02  9.2725E+02  9.2062E+02  9.1537E+02
          Mean  9.3902E+02  9.7034E+02  9.4290E+02  9.2754E+02  9.2335E+02
          Var   3.4172E+00  3.2502E+00  8.1321E+00  9.0492E+00  9.3860E+00
F10       Max   3.1087E+03  3.5943E+03  3.9105E+03  3.9116E+03  3.4795E+03
          Min   2.2958E+03  2.8792E+03  1.5807E+03  1.4802E+03  1.4316E+03
          Mean  1.8668E+03  2.9172E+03  1.7627E+03  1.6336E+03  1.6111E+03
          Var   3.2703E+01  6.1787E+01  2.2359E+02  1.9535E+02  2.2195E+02
F11       Max   3.6641E+03  3.4628E+03  4.0593E+03  3.5263E+03  3.7357E+03
          Min   2.4726E+03  2.8210E+03  1.7855E+03  1.4012E+03  1.2811E+03
          Mean  2.5553E+03  2.8614E+03  1.8532E+03  1.5790E+03  1.4869E+03
          Var   8.4488E+01  5.6351E+01  1.3201E+02  2.1085E+02  2.6682E+02
F12       Max   1.2044E+03  1.2051E+03  1.2053E+03  1.2055E+03  1.2044E+03
          Min   1.2006E+03  1.2014E+03  1.2004E+03  1.2003E+03  1.2017E+03
          Mean  1.2007E+03  1.2017E+03  1.2007E+03  1.2003E+03  1.2018E+03
          Var   1.4482E-01  3.5583E-01  2.5873E-01  2.4643E-01  2.1603E-01
F13       Max   1.3049E+03  1.3072E+03  1.3073E+03  1.3056E+03  1.3061E+03
          Min   1.3005E+03  1.3067E+03  1.3009E+03  1.3003E+03  1.3004E+03
          Mean  1.3006E+03  1.3068E+03  1.3011E+03  1.3005E+03  1.3006E+03
          Var   3.2018E-01  5.0767E-02  4.2173E-01  3.6570E-01  2.9913E-01
F14       Max   1.4395E+03  1.4565E+03  1.4775E+03  1.4749E+03  1.4493E+03
          Min   1.4067E+03  1.4522E+03  1.4009E+03  1.4003E+03  1.4005E+03
          Mean  1.4079E+03  1.4529E+03  1.4024E+03  1.4008E+03  1.4009E+03
          Var   2.1699E+00  6.3013E-01  5.3198E+00  3.2578E+00  2.8527E+00
F15       Max   5.4068E+04  3.3586E+04  4.8370E+05  4.0007E+05  1.9050E+05
          Min   1.9611E+03  2.1347E+04  1.5029E+03  1.5027E+03  1.5026E+03
          Mean  1.9933E+03  2.2417E+04  2.7920E+03  1.5860E+03  1.5816E+03
          Var   6.1622E+02  1.2832E+03  1.7802E+04  4.4091E+03  2.7233E+03


Table 11. Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10).

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA
F16       Max   1.6044E+03  1.6044E+03  1.6046E+03  1.6044E+03  1.6046E+03
          Min   1.6034E+03  1.6041E+03  1.6028E+03  1.6021E+03  1.6019E+03
          Mean  1.6034E+03  1.6041E+03  1.6032E+03  1.6024E+03  1.6024E+03
          Var   3.3300E-02  3.5900E-02  2.5510E-01  3.7540E-01  3.6883E-01
F17       Max   1.3711E+07  1.6481E+06  3.3071E+07  4.6525E+07  8.4770E+06
          Min   1.2526E+05  4.7216E+05  2.7261E+03  2.0177E+03  2.0097E+03
          Mean  1.6342E+05  4.8499E+05  8.7769E+04  3.1084E+04  2.1095E+04
          Var   2.0194E+05  2.9949E+04  1.2284E+06  8.7638E+05  2.0329E+05
F18       Max   7.7173E+07  3.5371E+07  6.1684E+08  1.4216E+08  6.6050E+08
          Min   4.1934E+03  2.6103E+06  2.2743E+03  1.8192E+03  1.8288E+03
          Mean  1.6509E+04  3.4523E+06  1.2227E+06  1.0781E+05  1.9475E+05
          Var   7.9108E+05  1.4888E+06  2.1334E+07  3.6818E+06  1.0139E+07
F19       Max   1.9851E+03  2.5657E+03  2.0875E+03  1.9872E+03  1.9555E+03
          Min   1.9292E+03  2.4816E+03  1.9027E+03  1.9023E+03  1.9028E+03
          Mean  1.9299E+03  2.4834E+03  1.9044E+03  1.9032E+03  1.9036E+03
          Var   1.0820E+00  3.8009E+00  1.1111E+01  3.3514E+00  1.4209E+00
F20       Max   8.3021E+06  2.0160E+07  1.0350E+07  6.1162E+07  1.2708E+08
          Min   5.6288E+03  1.7838E+06  2.1570E+03  2.0408E+03  2.0241E+03
          Mean  1.2260E+04  1.8138E+06  5.6957E+03  1.1337E+04  4.5834E+04
          Var   1.0918E+05  5.9134E+05  1.3819E+05  6.4167E+05  2.1988E+06
F21       Max   2.4495E+06  1.7278E+09  2.0322E+06  1.3473E+07  2.6897E+07
          Min   4.9016E+03  1.4049E+09  3.3699E+03  2.1842E+03  2.2314E+03
          Mean  6.6613E+03  1.4153E+09  9.9472E+03  1.3972E+04  7.4587E+03
          Var   4.1702E+04  2.2557E+07  3.3942E+04  2.5098E+05  2.8735E+05
F22       Max   2.8304E+03  4.9894E+03  3.1817E+03  3.1865E+03  3.2211E+03
          Min   2.5070E+03  4.1011E+03  2.3492E+03  2.2442E+03  2.2314E+03
          Mean  2.5081E+03  4.1175E+03  2.3694E+03  2.2962E+03  2.2687E+03
          Var   5.4064E+00  3.8524E+01  4.3029E+01  4.8006E+01  5.1234E+01
F23       Max   2.8890E+03  2.6834E+03  2.8758E+03  3.0065E+03  2.9923E+03
          Min   2.5000E+03  2.6031E+03  2.4870E+03  2.5000E+03  2.5000E+03
          Mean  2.5004E+03  2.6088E+03  2.5326E+03  2.5010E+03  2.5015E+03
          Var   9.9486E+00  8.6432E+00  5.7045E+01  1.4213E+01  1.7156E+01
F24       Max   2.6293E+03  2.6074E+03  2.6565E+03  2.6491E+03  2.6369E+03
          Min   2.5816E+03  2.6049E+03  2.5557E+03  2.5246E+03  2.5251E+03
          Mean  2.5829E+03  2.6052E+03  2.5671E+03  2.5337E+03  2.5338E+03
          Var   1.3640E+00  3.3490E-01  8.9434E+00  1.3050E+01  1.1715E+01
F25       Max   2.7133E+03  2.7014E+03  2.7445E+03  2.7122E+03  2.7327E+03
          Min   2.6996E+03  2.7003E+03  2.6784E+03  2.6789E+03  2.6635E+03
          Mean  2.6996E+03  2.7004E+03  2.6831E+03  2.6903E+03  2.6894E+03
          Var   2.3283E-01  9.9333E-02  4.9609E+00  7.1175E+00  1.2571E+01
F26       Max   2.7056E+03  2.8003E+03  2.7058E+03  2.7447E+03  2.7116E+03
          Min   2.7008E+03  2.8000E+03  2.7003E+03  2.7002E+03  2.7002E+03
          Mean  2.7010E+03  2.8000E+03  2.7005E+03  2.7003E+03  2.7003E+03
          Var   3.1316E-01  1.6500E-02  3.9700E-01  1.1168E+00  3.5003E-01
F27       Max   3.4052E+03  5.7614E+03  3.3229E+03  3.4188E+03  3.4144E+03
          Min   2.9000E+03  3.9113E+03  2.9698E+03  2.8347E+03  2.7054E+03
          Mean  2.9038E+03  4.0351E+03  2.9816E+03  2.8668E+03  2.7219E+03
          Var   3.2449E+01  2.0673E+02  3.4400E+01  6.2696E+01  6.1325E+01
F28       Max   4.4333E+03  5.4138E+03  4.5480E+03  4.3490E+03  4.8154E+03
          Min   3.0000E+03  4.0092E+03  3.4908E+03  3.0000E+03  3.0000E+03
          Mean  3.0079E+03  4.0606E+03  3.6004E+03  3.0063E+03  3.0065E+03
          Var   6.8101E+01  9.2507E+01  8.5705E+01  4.2080E+01  5.3483E+01
F29       Max   2.5038E+07  1.6181E+08  5.9096E+07  7.0928E+07  6.4392E+07
          Min   3.2066E+03  1.0014E+08  3.1755E+03  3.1005E+03  3.3287E+03
          Mean  5.7057E+04  1.0476E+08  8.5591E+04  6.8388E+04  2.8663E+04
          Var   4.9839E+05  6.7995E+06  1.7272E+06  1.5808E+06  8.6740E+05



Figure 3. The effect of the two parameters on the performance of RF models: (a) the effect of ntree; (b) the effect of mtry.

Variable setting: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF
(1) Begin
(2)   /* Build classification model based on DMSDL-QBSA-RF */
(3)   Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4)   Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5)   While iter < itermax + 1 do
(6)     For i = 1 : ntree
(7)       Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)       Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)       Recursively generate each decision tree without pruning operations
(10)    End For
(11)    Update classification accuracy of RF; evaluate f(Xi)
(12)    Update gbest and Pgbest
(13)    [nbest, mbest] = gbest
(14)    iter = iter + 1
(15)  End While
(16)  /* Classify using RF model */
(17)  For i = 1 : nbest
(18)    Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)    Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)    Recursively generate each decision tree without pruning operations
(21)  End
(22)  Return DMSDL-QBSA-RF classification model
(23)  Classify the test dataset using equation (20)
(24)  Calculate OOB_error
(25) End

ALGORITHM 2: DMSDL-QBSA-RF classification model.
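The outer optimization loop of Algorithm 2 can be approximated with scikit-learn. In this sketch, a simple random search over ntree and mtry stands in for the DMSDL-QBSA search, the dataset is synthetic, and the out-of-bag score serves as the fitness, mirroring the OOB error in the algorithm:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
X, y = make_classification(n_samples=300, n_features=10, random_state=0)

best_acc, best_params = -1.0, None
for _ in range(10):  # stand-in for the DMSDL-QBSA search loop
    ntree = int(rng.integers(10, 200))   # number of decision trees
    mtry = int(rng.integers(1, 10))      # attributes sampled per split
    rf = RandomForestClassifier(n_estimators=ntree, max_features=mtry,
                                oob_score=True, random_state=0)
    rf.fit(X, y)
    acc = rf.oob_score_  # out-of-bag accuracy, i.e., 1 - OOB error
    if acc > best_acc:
        best_acc, best_params = acc, (ntree, mtry)

print(best_params, round(best_acc, 3))
```

A swarm-based optimizer would replace the random draws with position updates over the (ntree, mtry) search space, but the fitness evaluation per candidate is the same.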

Table 11. Continued.

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA
F30       Max   5.2623E+05  2.8922E+07  1.1938E+06  1.2245E+06  1.1393E+06
          Min   5.9886E+03  1.9017E+07  3.7874E+03  3.6290E+03  3.6416E+03
          Mean  7.4148E+03  2.0002E+07  5.5468E+03  4.3605E+03  4.1746E+03
          Var   1.1434E+04  1.1968E+06  3.2255E+04  2.2987E+04  1.8202E+04


Figure 4. Block diagram of the oil layer classification system based on DMSDL-QBSA-RF (data selection and preprocessing; removal of redundant attributes and pretreatment; attribute discretization; attribute reduction and data normalization; model building based on DMSDL-QBSA-RF; RF classification of the information to be identified; output).

Table 13. Distribution of the logging data.

            Training dataset                              Test dataset
Well   Depth (m)   Data  Oil/gas layers  Dry layer   Depth (m)   Data  Oil/gas layers  Dry layer
W1     3027–3058   250   203             47          3250–3450   1600  237             1363
W2     2642–2648   30    10              20          2940–2978   191   99              92
W3     1040–1059   160   47              113         1120–1290   1114  96              1018

Table 12. Comparison of numerical testing results on eight UCI datasets.

Dataset        RF (%)  BSA-RF (%)  DMSDL-BSA-RF (%)  DMSDL-QBSA-RF (%)
Blood          75.40   80.80       79.46             81.25
Heart-statlog  82.10   86.42       86.42             86.42
Sonar          78.39   85.48       85.48             88.71
Appendicitis   83.23   87.10       90.32             93.55
Cleve          79.55   87.50       88.64             89.77
Magic          88.95   89.85       90.25             89.32
Mammographic   74.73   77.68       76.79             79.46
Australian     85.83   88.84       88.84             90.78

Table 14. Attribute reduction results of the logging dataset.

Well  Attributes
W1    Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
      Reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2    Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
      Reduction attributes: NEUTRO, PERM, POR, RESI, SW
W3    Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
      Reduction attributes: AC, GR, RI, RXO, SP

Figure 5. The normalized curves of attributes (horizontal axis: depth (m); vertical axis: normalized value, 0–1): (a) NPHI, SP, DT, and LLD of W1; (b) DEN, GR, and LLS of W1; (c) NEUTRO and PERM of W2; (d) RESI, SW, and POR of W2; (e) GR, SP, and RI of W3; (f) RXO and AC of W3.

Table 15. Performance of various well data.

Well  Classification model  RMSE    MAE       Accuracy (%)  Running time (s)
W1    RF                    0.3326  0.1106    88.94         1.5167
      SVM                   0.2681  0.0719    92.81         4.5861
      BSA-RF                0.3269  0.1069    89.31         1.8064
      DMSDL-BSA-RF          0.2806  0.0788    92.13         2.3728
      DMSDL-QBSA-RF         0.2449  0.0600    94.00         1.4663
W2    RF                    0.4219  0.1780    82.20         3.1579
      SVM                   0.2983  0.0890    91.10         4.2604
      BSA-RF                0.3963  0.1571    84.29         1.2979
      DMSDL-BSA-RF          0.2506  0.062827  93.72         1.6124
      DMSDL-QBSA-RF         0.2399  0.0576    94.24         1.2287
W3    RF                    0.4028  0.1622    83.78         2.4971
      SVM                   0.2507  0.0628    93.72         2.1027
      BSA-RF                0.3631  0.1318    86.81         1.3791
      DMSDL-BSA-RF          0.2341  0.0548    94.52         0.3125
      DMSDL-QBSA-RF         0.0519  0.0027    99.73         0.9513



5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of these three strategies on the performance of DMSDL-QBSA revealed that the hybrid method markedly improves the original BSA. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and it also performs well on these harder problems. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem have proved that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than the original RF model. At the same time, it runs faster than the other four classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Because it hybridizes three strategies, DMSDL-QBSA needs more time than the classical BSA, so improving its efficiency is a worthwhile direction. In future research, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and Project of Industry-University Cooperative Education of Ministry of Education of China (no. 201801335014).


Figure 6. Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A P Engelbrecht Foundations of Computational SwarmIntelligence Beijing Tsinghua University Press BeijingChina 2019

[2] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquo inProceedings of the IEEE International Conference NeuralNetworks Perth Australia July 1995

[3] D Karaboga ldquoAn idea based on honey bee swarm for nu-merical optimizationrdquo Technical report-TR06 Erciyes Uni-versity Kayseri Turkey 2005

[4] X Li A new intelligent optimization method-artificial fishschool algorithm Doctoral thesis Zhejiang UniversityHangzhou China 2003

[5] X S Yang ldquoFirefly algorithms for multimodal optimizationstochastic algorithms foundations and applicationsrdquo inProceedings of the 5th International Symposium on StochasticAlgorithms SAGA Berlin Germany 2009

[6] Y Sharafi M A Khanesar and M Teshnehlab ldquoDiscretebinary cat swarm optimizationrdquo in Proceedings of the IEEEInternational Conference on Computer Control and Com-munication (IC4) Karachi Pakistan September 2013

[7] X B Meng X Z Gao L Lu Y Liu and H Z Zhang ldquoA newbio-inspired optimisation algorithm bird swarm algorithmrdquoJournal of Experimental amp eoretical Artificial Intelligencevol 28 no 4 pp 673ndash687 2016

[8] N Jatana and B Suri ldquoParticle swarm and genetic algorithmapplied to mutation testing for test data generation a com-parative evaluationrdquo Journal of King Saud University-Com-puter and Information Sciences vol 32 pp 514ndash521 2020

[9] H Chung and K Shin ldquoGenetic algorithm-optimized multi-channel convolutional neural network for stock marketpredictionrdquo in Neural Computing and Applications SpringerNew York NY USA 2019

[10] I Strumberger E Tuba M Zivkovic N Bacanin M Bekoand M Tuba ldquoDesigning convolutional neural network ar-chitecture by the firefly algorithmrdquo in Proceedings of the 2019IEEE International Young Engineers Forum (YEF-ECE) Costada Caparica Portugal May 2019

[11] I Strumberger N Bacanin M Tuba and E Tuba ldquoResourcescheduling in cloud computing based on a hybridized whaleoptimization algorithmrdquo Applied Sciences vol 9 no 22p 4893 2019

[12] M Tuba and N Bacanin ldquoImproved seeker optimizationalgorithm hybridized with firefly algorithm for constrainedoptimization problemsrdquo Neurocomputing vol 143 pp 197ndash207 2014

[13] I Strumberger M Minovic M Tuba and N Bacanin ldquoPer-formance of elephant herding optimization and tree growthalgorithm adapted for node localization in wireless sensornetworksrdquo Sensors vol 19 no 11 p 2515 2019

[14] X S Yang ldquoSwarm intelligence based algorithms a criticalanalysisrdquo Evolutionary Intelligence vol 7 no 1 pp 17ndash282014

[15] N Bacanin and M Tuba ldquoArtificial bee colony (ABC) al-gorithm for constrained optimization improved with geneticoperatorsrdquo Studies in Informatics and Control vol 21 no 2pp 137ndash146 2012

[16] J Liu H Peng Z Wu J Chen and C Deng Multi-StrategyBrainstormOptimization Algorithmwith Dynamic ParametersAdjustment Applied Intelligence Guildford UK 2010

[17] H Peng Y He and C Deng ldquoFirefly algorithm with lucif-erase inhibition mechanismrdquo IEEE Access vol 7pp 120189ndash120201 2019

[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.

[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.

[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.

[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.

[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.

[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.

[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.

[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.

[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," IEEE Transactions on Evolutionary Computation, Piscataway, NJ, USA, February 2019.

[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.

[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.

[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.

[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.

[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.

[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.

[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.

[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.

[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.

Computational Intelligence and Neuroscience 23

[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China–Information Sciences, vol. 62, no. 7, 2019.

[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.

[41] S. Shahrzad, M. Seyedali, and L. Andrew, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.

[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22–34, 2020.

[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.



The DMSDL-QBSA-RF classification model achieves the best accuracy on the appendicitis dataset, which is up to 93.55%. In summary, the DMSDL-QBSA-RF classification model is valid on most datasets and performs well on them.

4. Oil Layer Classification Application

4.1. Design of Oil Layer Classification System. The block diagram of the oil layer classification system based on the improved DMSDL-QBSA-RF is shown in Figure 4. The oil layer classification can be simplified into the following five steps:

Step 1: The selected actual logging datasets are intact and full-scale. At the same time, the datasets should be closely related to rock sample analysis, and the dataset should be relatively independent. The dataset is randomly divided into two parts: training and testing samples.

Step 2: In order to better understand the relationship between independent variables and dependent variables and to reduce the sample information attributes, the continuous attributes of the dataset are discretized using a greedy algorithm.

Step 3: In order to improve the calculation speed and classification accuracy, we use the covering rough set method [43] to realize attribute reduction. After attribute reduction, the actual logging datasets are normalized to avoid computational saturation.

Step 4: In the DMSDL-QBSA-RF layer classification model, we input the actual logging dataset after attribute reduction, use the DMSDL-QBSA-RF layer classification algorithm to train, and finally obtain the DMSDL-QBSA-RF layer classification model.

Table 7: Comparison on nine multimodal functions with hybrid algorithms (Dim = 2).

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
f10 | Max | 2.0101E+02 | 2.8292E+02 | 2.9899E+02 | 2.4244E+02 | 2.8533E+02
f10 | Min | 2.5455E-05 | 3.7717E+00 | 2.3690E+01 | 2.5455E-05 | 9.4751E+01
f10 | Mean | 3.4882E-01 | 8.0980E+00 | 2.5222E+01 | 4.2346E-01 | 9.4816E+01
f10 | Var | 5.7922E+00 | 1.0853E+01 | 1.5533E+01 | 5.6138E+00 | 2.8160E+00
f11 | Max | 4.7662E+00 | 4.7784E+00 | 1.1067E+01 | 8.7792E+00 | 8.1665E+00
f11 | Min | 0 | 3.8174E-01 | 0 | 0 | 0
f11 | Mean | 2.7200E-03 | 6.0050E-01 | 3.1540E-02 | 4.2587E-03 | 3.7800E-03
f11 | Var | 8.6860E-02 | 3.1980E-01 | 2.4862E-01 | 1.2032E-01 | 1.3420E-01
f12 | Max | 9.6893E+00 | 8.1811E+00 | 1.1635E+01 | 9.1576E+00 | 8.4720E+00
f12 | Min | 8.8818E-16 | 5.1646E-01 | 8.8818E-16 | 8.8818E-16 | 8.8818E-16
f12 | Mean | 9.5600E-03 | 6.9734E-01 | 3.4540E-02 | 9.9600E-03 | 2.8548E-03
f12 | Var | 2.1936E-01 | 6.1050E-01 | 2.5816E-01 | 2.1556E-01 | 1.1804E-01
f13 | Max | 4.4609E+00 | 4.9215E+00 | 4.1160E+00 | 1.9020E+00 | 1.6875E+00
f13 | Min | 0 | 1.3718E-01 | 0 | 0 | 0
f13 | Mean | 1.9200E-03 | 1.7032E-01 | 1.8240E-02 | 1.4800E-03 | 5.7618E-04
f13 | Var | 6.6900E-02 | 1.3032E-01 | 1.8202E-01 | 3.3900E-02 | 2.2360E-02
f14 | Max | 1.0045E+01 | 1.9266E+03 | 1.9212E+01 | 5.7939E+02 | 8.2650E+00
f14 | Min | 2.3558E-31 | 1.3188E-01 | 2.3558E-31 | 2.3558E-31 | 2.3558E-31
f14 | Mean | 4.1600E-03 | 3.5402E-01 | 1.0840E-02 | 6.1420E-02 | 1.3924E-03
f14 | Var | 1.7174E-01 | 1.9427E+01 | 3.9528E-01 | 5.8445E+00 | 8.7160E-02
f15 | Max | 6.5797E+04 | 4.4041E+03 | 1.4412E+05 | 8.6107E+03 | 2.6372E+00
f15 | Min | 1.3498E-31 | 9.1580E-02 | 1.3498E-31 | 1.3498E-31 | 7.7800E-03
f15 | Mean | 7.1736E+00 | 8.9370E-01 | 1.7440E+01 | 9.0066E-01 | 8.2551E-03
f15 | Var | 6.7678E+02 | 5.4800E+01 | 1.4742E+03 | 8.7683E+01 | 2.8820E-02
f16 | Max | 6.2468E-01 | 6.4488E-01 | 5.1564E-01 | 8.4452E-01 | 3.9560E-01
f16 | Min | 6.9981E-08 | 2.5000E-03 | 1.5518E-240 | 2.7655E-07 | 0
f16 | Mean | 2.7062E-04 | 6.9400E-03 | 6.8555E-04 | 2.0497E-04 | 6.1996E-05
f16 | Var | 1.0380E-02 | 1.7520E-02 | 8.4600E-03 | 1.0140E-02 | 4.4000E-03
f17 | Max | 5.1946E+00 | 3.6014E+00 | 2.3463E+00 | 6.9106E+00 | 1.2521E+00
f17 | Min | 2.6445E-11 | 2.6739E-02 | 0 | 1.0855E-10 | 0
f17 | Mean | 1.9343E-03 | 5.1800E-02 | 1.2245E-03 | 2.8193E-03 | 1.5138E-04
f17 | Var | 7.3540E-02 | 1.2590E-01 | 4.1620E-02 | 1.1506E-01 | 1.2699E-02
f18 | Max | 5.0214E-01 | 3.4034E-01 | 4.1400E-01 | 3.7422E-01 | 4.0295E-01
f18 | Min | 1.4998E-32 | 1.9167E-03 | 1.4998E-32 | 1.4998E-32 | 1.4998E-32
f18 | Mean | 1.0967E-04 | 4.1000E-03 | 1.8998E-04 | 1.4147E-04 | 6.0718E-05
f18 | Var | 6.1800E-03 | 1.0500E-02 | 6.5200E-03 | 5.7200E-03 | 4.4014E-03


Step 5: The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and the classification results are output.
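The five steps above can be sketched in code. This is a minimal illustration assuming scikit-learn and NumPy are available; the function name `classify_oil_layers` and the 50/50 split are hypothetical, and the DMSDL-QBSA-tuned parameters (number of trees and split attributes) are passed in as plain arguments rather than produced by the optimizer:

```python
# Hypothetical end-to-end sketch of the five-step pipeline:
# select data -> (discretize/reduce upstream) -> normalize -> train RF -> classify.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import MinMaxScaler

def classify_oil_layers(X, y, ntree=100, mtry=3, seed=0):
    # Step 1: randomly divide the dataset into training and testing samples
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, test_size=0.5, random_state=seed)
    # Step 3: normalize attributes (discretization and attribute
    # reduction are assumed to have been applied upstream)
    scaler = MinMaxScaler().fit(X_tr)
    X_tr, X_te = scaler.transform(X_tr), scaler.transform(X_te)
    # Step 4: train the RF with the two tuned parameters
    rf = RandomForestClassifier(n_estimators=ntree, max_features=mtry,
                                random_state=seed)
    rf.fit(X_tr, y_tr)
    # Step 5: identify the whole section and output the labels
    return rf.predict(X_te), y_te
```
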

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA was simulated and analyzed on benchmark functions, and in Section 3.3, the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA was tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted and recorded as W1, W2, and W3. W1 is a gas well in Xian (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depths and the corresponding rock sample analysis samples of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before the training of the DMSDL-QBSA-RF classification model on the training dataset, as shown in Table 14. Then these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.
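The normalization of the reduced attributes can be sketched as a simple per-column min-max scaling; `min_max_normalize` is a hypothetical helper written for illustration, not a routine from the paper:

```python
import numpy as np

def min_max_normalize(X):
    """Scale each logging attribute (column) into [0, 1] to avoid
    computational saturation before training the model."""
    X = np.asarray(X, dtype=float)
    col_min = X.min(axis=0)
    col_range = X.max(axis=0) - col_min
    col_range[col_range == 0] = 1.0  # guard against constant columns
    return (X - col_min) / col_range
```
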

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, SVM model, BSA-RF model, and DMSDL-BSA-RF model. Here, the RF classification model was first applied to the field of logging. In order to evaluate the performance of the recognition model, we select the following performance indicators:

\[
\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(f_i - y_i\right)^2}, \qquad
\mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\left|f_i - y_i\right|, \tag{21}
\]

Table 8: Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function | Term | GWO | WOA | SCA | GOA | SSA | DMSDL-QBSA
f1 | Max | 1.3396E+04 | 1.4767E+04 | 1.3310E+04 | 2.0099E+01 | 4.8745E+03 | 9.8570E-01
f1 | Min | 0 | 0 | 4.2905E-293 | 8.6468E-17 | 0 | 0
f1 | Mean | 3.5990E+00 | 4.7621E+00 | 1.4014E+02 | 7.0100E-02 | 4.5864E+00 | 1.0483E-04
f1 | Var | 1.7645E+02 | 2.0419E+02 | 8.5054E+02 | 4.4200E-01 | 1.4148E+02 | 9.8725E-03
f2 | Max | 3.6021E+02 | 2.5789E+03 | 6.5027E+01 | 9.3479E+01 | 3.4359E+01 | 3.2313E-01
f2 | Min | 0 | 0 | 9.8354E-192 | 2.8954E-03 | 2.4642E-181 | 0
f2 | Mean | 5.0667E-02 | 2.9480E-01 | 4.0760E-01 | 3.1406E+00 | 1.7000E-02 | 1.1278E-04
f2 | Var | 3.7270E+00 | 2.6091E+01 | 2.2746E+00 | 3.9264E+00 | 5.4370E-01 | 4.8000E-03
f3 | Max | 1.8041E+04 | 1.6789E+04 | 2.4921E+04 | 6.5697E+03 | 1.1382E+04 | 4.0855E+01
f3 | Min | 0 | 1.0581E-18 | 7.6116E-133 | 2.8796E-01 | 3.2956E-253 | 1.5918E-264
f3 | Mean | 7.4511E+00 | 2.8838E+02 | 4.0693E+02 | 4.8472E+02 | 9.2062E+00 | 4.8381E-03
f3 | Var | 2.6124E+02 | 1.4642E+03 | 1.5913E+03 | 7.1786E+02 | 2.9107E+02 | 4.1383E-01
f4 | Max | 2.1812E+09 | 5.4706E+09 | 8.4019E+09 | 1.1942E+09 | 4.9386E+08 | 7.5188E+01
f4 | Min | 4.9125E+00 | 3.5695E+00 | 5.9559E+00 | 2.2249E+02 | 4.9806E+00 | 2.9279E-13
f4 | Mean | 4.9592E+05 | 2.4802E+06 | 4.4489E+07 | 5.0021E+06 | 3.3374E+05 | 2.4033E-02
f4 | Var | 2.9484E+07 | 1.1616E+08 | 4.5682E+08 | 3.9698E+07 | 1.1952E+07 | 9.7253E-01
f5 | Max | 1.8222E+04 | 1.5374E+04 | 1.5874E+04 | 1.2132E+03 | 1.6361E+04 | 1.8007E+00
f5 | Min | 1.1334E-08 | 8.3228E-09 | 2.3971E-01 | 2.7566E-10 | 2.6159E-16 | 1.0272E-33
f5 | Mean | 5.1332E+00 | 5.9967E+00 | 1.2620E+02 | 5.8321E+01 | 8.8985E+00 | 2.3963E-04
f5 | Var | 2.3617E+02 | 2.3285E+02 | 8.8155E+02 | 1.0872E+02 | 2.9986E+02 | 1.8500E-02
f6 | Max | 7.4088E+00 | 8.3047E+00 | 8.8101E+00 | 6.8900E-01 | 4.4298E+00 | 2.5787E-01
f6 | Min | 1.8112E-05 | 3.9349E-05 | 4.8350E-05 | 8.9528E-02 | 4.0807E-05 | 1.0734E-04
f6 | Mean | 1.8333E-03 | 3.2667E-03 | 4.8400E-02 | 9.3300E-02 | 2.1000E-03 | 5.2825E-04
f6 | Var | 9.4267E-02 | 1.0077E-01 | 2.9410E-01 | 3.5900E-02 | 6.5500E-02 | 4.6000E-03
f7 | Max | 7.3626E+02 | 6.8488E+02 | 8.0796E+02 | 3.9241E+02 | 8.2036E+02 | 1.4770E+01
f7 | Min | 0 | 0 | 1.9441E-292 | 7.9956E-07 | 0 | 0
f7 | Mean | 2.0490E-01 | 2.8060E-01 | 4.9889E+00 | 1.6572E+01 | 2.7290E-01 | 1.8081E-03
f7 | Var | 9.5155E+00 | 1.0152E+01 | 3.5531E+01 | 2.3058E+01 | 1.0581E+01 | 1.6033E-01
f8 | Max | 1.2749E+03 | 5.9740E+02 | 3.2527E+02 | 2.3425E+02 | 2.0300E+02 | 2.0423E+01
f8 | Min | 0 | 4.3596E-35 | 1.5241E-160 | 3.6588E-05 | 1.0239E-244 | 0
f8 | Mean | 3.1317E-01 | 1.0582E+01 | 1.0457E+01 | 1.2497E+01 | 2.1870E-01 | 2.6947E-03
f8 | Var | 1.4416E+01 | 4.3485E+01 | 3.5021E+01 | 2.5766E+01 | 6.2362E+00 | 2.1290E-01


where y_i and f_i are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show actual forecasting errors. Table 15 records the performance indicator data of each classification model, and the best results are marked in bold. The smaller the RMSE and MAE, the better the classification model performs.
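The two indicators in equation (21) are straightforward to compute; the following is a minimal NumPy sketch, with illustrative function names:

```python
import numpy as np

def rmse(y_pred, y_true):
    """Root mean square error between predictions and expected labels."""
    y_pred, y_true = np.asarray(y_pred, float), np.asarray(y_true, float)
    return float(np.sqrt(np.mean((y_pred - y_true) ** 2)))

def mae(y_pred, y_true):
    """Mean absolute error between predictions and expected labels."""
    y_pred, y_true = np.asarray(y_pred, float), np.asarray(y_true, float)
    return float(np.mean(np.abs(y_pred - y_true)))
```
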

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model obtains the best recognition accuracy, and all its accuracies are above 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it also has superior performance for oil and gas layer classification on the other performance indicators and different wells. Secondly, DMSDL-QBSA can improve the performance of RF, and the parameters found by DMSDL-QBSA for the RF classification model improve the classification accuracy while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means the depth has no oil or gas, and 1 means the depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results. It can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is suitable for petroleum logging applications, greatly reduces the difficulty of oil exploration, and has a good application prospect.

Table 9: Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

Function | Term | GWO | WOA | SCA | GOA | SSA | DMSDL-QBSA
f10 | Max | 2.9544E+03 | 2.9903E+03 | 2.7629E+03 | 2.9445E+03 | 3.2180E+03 | 3.0032E+03
f10 | Min | 1.3037E+03 | 8.6892E-04 | 1.5713E+03 | 1.3438E+03 | 1.2839E-04 | 1.2922E+03
f10 | Mean | 1.6053E+03 | 1.4339E+01 | 1.7860E+03 | 1.8562E+03 | 2.5055E+02 | 1.2960E+03
f10 | Var | 2.6594E+02 | 1.2243E+02 | 1.6564E+02 | 5.1605E+02 | 3.3099E+02 | 5.8691E+01
f11 | Max | 1.3792E+02 | 1.2293E+02 | 1.2313E+02 | 1.1249E+02 | 3.2180E+03 | 1.8455E+01
f11 | Min | 0 | 0 | 0 | 1.2437E+01 | 1.2839E-04 | 0
f11 | Mean | 4.4220E-01 | 1.1252E+00 | 9.9316E+00 | 2.8378E+01 | 2.5055E+02 | 3.2676E-03
f11 | Var | 4.3784E+00 | 6.5162E+00 | 1.9180E+01 | 1.6240E+01 | 3.3099E+02 | 1.9867E-01
f12 | Max | 2.0257E+01 | 2.0043E+01 | 1.9440E+01 | 1.6623E+01 | 3.2180E+03 | 1.9113E+00
f12 | Min | 4.4409E-15 | 3.2567E-15 | 3.2567E-15 | 2.3168E+00 | 1.2839E-04 | 8.8818E-16
f12 | Mean | 1.7200E-02 | 4.2200E-02 | 8.8870E-01 | 5.5339E+00 | 2.5055E+02 | 3.1275E-04
f12 | Var | 4.7080E-01 | 6.4937E-01 | 3.0887E+00 | 2.8866E+00 | 3.3099E+02 | 2.2433E-02
f13 | Max | 1.5246E+02 | 1.6106E+02 | 1.1187E+02 | 6.1505E+01 | 3.2180E+03 | 3.1560E-01
f13 | Min | 3.3000E-03 | 0 | 0 | 2.4147E-01 | 1.2839E-04 | 0
f13 | Mean | 4.6733E-02 | 9.3867E-02 | 1.2094E+00 | 3.7540E+00 | 2.5055E+02 | 3.3660E-05
f13 | Var | 1.9297E+00 | 2.6570E+00 | 6.8476E+00 | 4.1936E+00 | 3.3099E+02 | 3.1721E-03
f14 | Max | 9.5993E+07 | 9.9026E+07 | 5.9355E+07 | 6.1674E+06 | 3.2180E+03 | 6.4903E-01
f14 | Min | 3.8394E-09 | 1.1749E-08 | 9.6787E-03 | 1.8099E-04 | 1.2839E-04 | 4.7116E-32
f14 | Mean | 1.2033E+04 | 3.5007E+04 | 4.8303E+05 | 1.0465E+04 | 2.5055E+02 | 8.9321E-05
f14 | Var | 9.8272E+05 | 1.5889E+06 | 4.0068E+06 | 1.9887E+05 | 3.3099E+02 | 6.8667E-03
f15 | Max | 2.2691E+08 | 2.4717E+08 | 1.1346E+08 | 2.8101E+07 | 3.2180E+03 | 1.6407E-01
f15 | Min | 3.2467E-02 | 4.5345E-08 | 1.1922E-01 | 3.5465E-05 | 1.2839E-04 | 1.3498E-32
f15 | Mean | 2.9011E+04 | 4.3873E+04 | 6.5529E+05 | 7.2504E+04 | 2.5055E+02 | 6.7357E-05
f15 | Var | 2.3526E+06 | 2.7453E+06 | 7.1864E+06 | 1.2814E+06 | 3.3099E+02 | 2.4333E-03
f16 | Max | 1.7692E+01 | 1.7142E+01 | 1.6087E+01 | 8.7570E+00 | 3.2180E+03 | 1.0959E+00
f16 | Min | 2.6210E-07 | 0.0000E+00 | 6.2663E-155 | 1.0497E-02 | 1.2839E-04 | 0
f16 | Mean | 1.0133E-02 | 3.9073E-01 | 5.9003E-01 | 2.4770E+00 | 2.5055E+02 | 1.5200E-04
f16 | Var | 3.0110E-01 | 9.6267E-01 | 1.4701E+00 | 1.9985E+00 | 3.3099E+02 | 1.1633E-02
f18 | Max | 4.4776E+01 | 4.3588E+01 | 3.9095E+01 | 1.7041E+01 | 3.2180E+03 | 6.5613E-01
f18 | Min | 1.9360E-01 | 9.4058E-08 | 2.2666E-01 | 3.9111E+00 | 1.2839E-04 | 1.4998E-32
f18 | Mean | 2.1563E-01 | 4.7800E-02 | 9.8357E-01 | 5.4021E+00 | 2.5055E+02 | 9.4518E-05
f18 | Var | 5.8130E-01 | 1.0434E+00 | 3.0643E+00 | 1.6674E+00 | 3.3099E+02 | 7.0251E-03


Table 10: Comparison of numerical testing results on CEC2014 test sets (F1–F15, Dim = 10).

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
F1 | Max | 2.7411E+08 | 3.6316E+09 | 6.8993E+08 | 3.9664E+08 | 9.6209E+08
F1 | Min | 1.4794E+07 | 3.1738E+09 | 1.3205E+06 | 1.6929E+05 | 1.7687E+05
F1 | Mean | 3.1107E+07 | 3.2020E+09 | 6.7081E+06 | 1.3567E+06 | 1.9320E+06
F1 | Var | 1.2900E+07 | 4.0848E+07 | 2.6376E+07 | 1.0236E+07 | 1.3093E+07
F2 | Max | 8.7763E+09 | 1.3597E+10 | 1.3515E+10 | 1.6907E+10 | 1.7326E+10
F2 | Min | 1.7455E+09 | 1.1878E+10 | 3.6982E+04 | 2.1268E+05 | 1.4074E+05
F2 | Mean | 1.9206E+09 | 1.1951E+10 | 1.7449E+08 | 5.9354E+07 | 4.8412E+07
F2 | Var | 1.9900E+08 | 1.5615E+08 | 9.2365E+08 | 5.1642E+08 | 4.2984E+08
F3 | Max | 4.8974E+05 | 8.5863E+04 | 2.6323E+06 | 2.4742E+06 | 5.3828E+06
F3 | Min | 1.1067E+04 | 1.4616E+04 | 1.8878E+03 | 1.2967E+03 | 6.3986E+02
F3 | Mean | 1.3286E+04 | 1.4976E+04 | 1.0451E+04 | 3.5184E+03 | 3.0848E+03
F3 | Var | 6.5283E+03 | 1.9988E+03 | 4.5736E+04 | 4.1580E+04 | 8.0572E+04
F4 | Max | 3.5252E+03 | 9.7330E+03 | 4.5679E+03 | 4.9730E+03 | 4.3338E+03
F4 | Min | 5.0446E+02 | 8.4482E+03 | 4.0222E+02 | 4.0843E+02 | 4.1267E+02
F4 | Mean | 5.8061E+02 | 8.5355E+03 | 4.4199E+02 | 4.3908E+02 | 4.3621E+02
F4 | Var | 1.4590E+02 | 1.3305E+02 | 1.6766E+02 | 1.2212E+02 | 8.5548E+01
F5 | Max | 5.2111E+02 | 5.2106E+02 | 5.2075E+02 | 5.2098E+02 | 5.2110E+02
F5 | Min | 5.2001E+02 | 5.2038E+02 | 5.2027E+02 | 5.2006E+02 | 5.2007E+02
F5 | Mean | 5.2003E+02 | 5.2041E+02 | 5.2033E+02 | 5.2014E+02 | 5.2014E+02
F5 | Var | 7.8380E-02 | 5.7620E-02 | 5.6433E-02 | 1.0577E-01 | 9.9700E-02
F6 | Max | 6.1243E+02 | 6.1299E+02 | 6.1374E+02 | 6.1424E+02 | 6.1569E+02
F6 | Min | 6.0881E+02 | 6.1157E+02 | 6.0514E+02 | 6.0288E+02 | 6.0257E+02
F6 | Mean | 6.0904E+02 | 6.1164E+02 | 6.0604E+02 | 6.0401E+02 | 6.0358E+02
F6 | Var | 2.5608E-01 | 1.2632E-01 | 1.0434E+00 | 1.1717E+00 | 1.3117E+00
F7 | Max | 8.7895E+02 | 1.0459E+03 | 9.1355E+02 | 1.0029E+03 | 9.3907E+02
F7 | Min | 7.4203E+02 | 1.0238E+03 | 7.0013E+02 | 7.0081E+02 | 7.0069E+02
F7 | Mean | 7.4322E+02 | 1.0253E+03 | 7.1184E+02 | 7.0332E+02 | 7.0290E+02
F7 | Var | 4.3160E+00 | 2.4258E+00 | 2.8075E+01 | 1.5135E+01 | 1.3815E+01
F8 | Max | 8.9972E+02 | 9.1783E+02 | 9.6259E+02 | 9.2720E+02 | 9.3391E+02
F8 | Min | 8.4904E+02 | 8.8172E+02 | 8.3615E+02 | 8.1042E+02 | 8.0877E+02
F8 | Mean | 8.5087E+02 | 8.8406E+02 | 8.5213E+02 | 8.1888E+02 | 8.1676E+02
F8 | Var | 2.1797E+00 | 4.4494E+00 | 8.6249E+00 | 9.2595E+00 | 9.7968E+00
F9 | Max | 1.0082E+03 | 9.9538E+02 | 1.0239E+03 | 1.0146E+03 | 1.0366E+03
F9 | Min | 9.3598E+02 | 9.6805E+02 | 9.2725E+02 | 9.2062E+02 | 9.1537E+02
F9 | Mean | 9.3902E+02 | 9.7034E+02 | 9.4290E+02 | 9.2754E+02 | 9.2335E+02
F9 | Var | 3.4172E+00 | 3.2502E+00 | 8.1321E+00 | 9.0492E+00 | 9.3860E+00
F10 | Max | 3.1087E+03 | 3.5943E+03 | 3.9105E+03 | 3.9116E+03 | 3.4795E+03
F10 | Min | 2.2958E+03 | 2.8792E+03 | 1.5807E+03 | 1.4802E+03 | 1.4316E+03
F10 | Mean | 1.8668E+03 | 2.9172E+03 | 1.7627E+03 | 1.6336E+03 | 1.6111E+03
F10 | Var | 3.2703E+01 | 6.1787E+01 | 2.2359E+02 | 1.9535E+02 | 2.2195E+02
F11 | Max | 3.6641E+03 | 3.4628E+03 | 4.0593E+03 | 3.5263E+03 | 3.7357E+03
F11 | Min | 2.4726E+03 | 2.8210E+03 | 1.7855E+03 | 1.4012E+03 | 1.2811E+03
F11 | Mean | 2.5553E+03 | 2.8614E+03 | 1.8532E+03 | 1.5790E+03 | 1.4869E+03
F11 | Var | 8.4488E+01 | 5.6351E+01 | 1.3201E+02 | 2.1085E+02 | 2.6682E+02
F12 | Max | 1.2044E+03 | 1.2051E+03 | 1.2053E+03 | 1.2055E+03 | 1.2044E+03
F12 | Min | 1.2006E+03 | 1.2014E+03 | 1.2004E+03 | 1.2003E+03 | 1.2017E+03
F12 | Mean | 1.2007E+03 | 1.2017E+03 | 1.2007E+03 | 1.2003E+03 | 1.2018E+03
F12 | Var | 1.4482E-01 | 3.5583E-01 | 2.5873E-01 | 2.4643E-01 | 2.1603E-01
F13 | Max | 1.3049E+03 | 1.3072E+03 | 1.3073E+03 | 1.3056E+03 | 1.3061E+03
F13 | Min | 1.3005E+03 | 1.3067E+03 | 1.3009E+03 | 1.3003E+03 | 1.3004E+03
F13 | Mean | 1.3006E+03 | 1.3068E+03 | 1.3011E+03 | 1.3005E+03 | 1.3006E+03
F13 | Var | 3.2018E-01 | 5.0767E-02 | 4.2173E-01 | 3.6570E-01 | 2.9913E-01
F14 | Max | 1.4395E+03 | 1.4565E+03 | 1.4775E+03 | 1.4749E+03 | 1.4493E+03
F14 | Min | 1.4067E+03 | 1.4522E+03 | 1.4009E+03 | 1.4003E+03 | 1.4005E+03
F14 | Mean | 1.4079E+03 | 1.4529E+03 | 1.4024E+03 | 1.4008E+03 | 1.4009E+03
F14 | Var | 2.1699E+00 | 6.3013E-01 | 5.3198E+00 | 3.2578E+00 | 2.8527E+00
F15 | Max | 5.4068E+04 | 3.3586E+04 | 4.8370E+05 | 4.0007E+05 | 1.9050E+05
F15 | Min | 1.9611E+03 | 2.1347E+04 | 1.5029E+03 | 1.5027E+03 | 1.5026E+03
F15 | Mean | 1.9933E+03 | 2.2417E+04 | 2.7920E+03 | 1.5860E+03 | 1.5816E+03
F15 | Var | 6.1622E+02 | 1.2832E+03 | 1.7802E+04 | 4.4091E+03 | 2.7233E+03


Table 11: Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10).

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
F16 | Max | 1.6044E+03 | 1.6044E+03 | 1.6046E+03 | 1.6044E+03 | 1.6046E+03
F16 | Min | 1.6034E+03 | 1.6041E+03 | 1.6028E+03 | 1.6021E+03 | 1.6019E+03
F16 | Mean | 1.6034E+03 | 1.6041E+03 | 1.6032E+03 | 1.6024E+03 | 1.6024E+03
F16 | Var | 3.3300E-02 | 3.5900E-02 | 2.5510E-01 | 3.7540E-01 | 3.6883E-01
F17 | Max | 1.3711E+07 | 1.6481E+06 | 3.3071E+07 | 4.6525E+07 | 8.4770E+06
F17 | Min | 1.2526E+05 | 4.7216E+05 | 2.7261E+03 | 2.0177E+03 | 2.0097E+03
F17 | Mean | 1.6342E+05 | 4.8499E+05 | 8.7769E+04 | 3.1084E+04 | 2.1095E+04
F17 | Var | 2.0194E+05 | 2.9949E+04 | 1.2284E+06 | 8.7638E+05 | 2.0329E+05
F18 | Max | 7.7173E+07 | 3.5371E+07 | 6.1684E+08 | 1.4216E+08 | 6.6050E+08
F18 | Min | 4.1934E+03 | 2.6103E+06 | 2.2743E+03 | 1.8192E+03 | 1.8288E+03
F18 | Mean | 1.6509E+04 | 3.4523E+06 | 1.2227E+06 | 1.0781E+05 | 1.9475E+05
F18 | Var | 7.9108E+05 | 1.4888E+06 | 2.1334E+07 | 3.6818E+06 | 1.0139E+07
F19 | Max | 1.9851E+03 | 2.5657E+03 | 2.0875E+03 | 1.9872E+03 | 1.9555E+03
F19 | Min | 1.9292E+03 | 2.4816E+03 | 1.9027E+03 | 1.9023E+03 | 1.9028E+03
F19 | Mean | 1.9299E+03 | 2.4834E+03 | 1.9044E+03 | 1.9032E+03 | 1.9036E+03
F19 | Var | 1.0820E+00 | 3.8009E+00 | 1.1111E+01 | 3.3514E+00 | 1.4209E+00
F20 | Max | 8.3021E+06 | 2.0160E+07 | 1.0350E+07 | 6.1162E+07 | 1.2708E+08
F20 | Min | 5.6288E+03 | 1.7838E+06 | 2.1570E+03 | 2.0408E+03 | 2.0241E+03
F20 | Mean | 1.2260E+04 | 1.8138E+06 | 5.6957E+03 | 1.1337E+04 | 4.5834E+04
F20 | Var | 1.0918E+05 | 5.9134E+05 | 1.3819E+05 | 6.4167E+05 | 2.1988E+06
F21 | Max | 2.4495E+06 | 1.7278E+09 | 2.0322E+06 | 1.3473E+07 | 2.6897E+07
F21 | Min | 4.9016E+03 | 1.4049E+09 | 3.3699E+03 | 2.1842E+03 | 2.2314E+03
F21 | Mean | 6.6613E+03 | 1.4153E+09 | 9.9472E+03 | 1.3972E+04 | 7.4587E+03
F21 | Var | 4.1702E+04 | 2.2557E+07 | 3.3942E+04 | 2.5098E+05 | 2.8735E+05
F22 | Max | 2.8304E+03 | 4.9894E+03 | 3.1817E+03 | 3.1865E+03 | 3.2211E+03
F22 | Min | 2.5070E+03 | 4.1011E+03 | 2.3492E+03 | 2.2442E+03 | 2.2314E+03
F22 | Mean | 2.5081E+03 | 4.1175E+03 | 2.3694E+03 | 2.2962E+03 | 2.2687E+03
F22 | Var | 5.4064E+00 | 3.8524E+01 | 4.3029E+01 | 4.8006E+01 | 5.1234E+01
F23 | Max | 2.8890E+03 | 2.6834E+03 | 2.8758E+03 | 3.0065E+03 | 2.9923E+03
F23 | Min | 2.5000E+03 | 2.6031E+03 | 2.4870E+03 | 2.5000E+03 | 2.5000E+03
F23 | Mean | 2.5004E+03 | 2.6088E+03 | 2.5326E+03 | 2.5010E+03 | 2.5015E+03
F23 | Var | 9.9486E+00 | 8.6432E+00 | 5.7045E+01 | 1.4213E+01 | 1.7156E+01
F24 | Max | 2.6293E+03 | 2.6074E+03 | 2.6565E+03 | 2.6491E+03 | 2.6369E+03
F24 | Min | 2.5816E+03 | 2.6049E+03 | 2.5557E+03 | 2.5246E+03 | 2.5251E+03
F24 | Mean | 2.5829E+03 | 2.6052E+03 | 2.5671E+03 | 2.5337E+03 | 2.5338E+03
F24 | Var | 1.3640E+00 | 3.3490E-01 | 8.9434E+00 | 1.3050E+01 | 1.1715E+01
F25 | Max | 2.7133E+03 | 2.7014E+03 | 2.7445E+03 | 2.7122E+03 | 2.7327E+03
F25 | Min | 2.6996E+03 | 2.7003E+03 | 2.6784E+03 | 2.6789E+03 | 2.6635E+03
F25 | Mean | 2.6996E+03 | 2.7004E+03 | 2.6831E+03 | 2.6903E+03 | 2.6894E+03
F25 | Var | 2.3283E-01 | 9.9333E-02 | 4.9609E+00 | 7.1175E+00 | 1.2571E+01
F26 | Max | 2.7056E+03 | 2.8003E+03 | 2.7058E+03 | 2.7447E+03 | 2.7116E+03
F26 | Min | 2.7008E+03 | 2.8000E+03 | 2.7003E+03 | 2.7002E+03 | 2.7002E+03
F26 | Mean | 2.7010E+03 | 2.8000E+03 | 2.7005E+03 | 2.7003E+03 | 2.7003E+03
F26 | Var | 3.1316E-01 | 1.6500E-02 | 3.9700E-01 | 1.1168E+00 | 3.5003E-01
F27 | Max | 3.4052E+03 | 5.7614E+03 | 3.3229E+03 | 3.4188E+03 | 3.4144E+03
F27 | Min | 2.9000E+03 | 3.9113E+03 | 2.9698E+03 | 2.8347E+03 | 2.7054E+03
F27 | Mean | 2.9038E+03 | 4.0351E+03 | 2.9816E+03 | 2.8668E+03 | 2.7219E+03
F27 | Var | 3.2449E+01 | 2.0673E+02 | 3.4400E+01 | 6.2696E+01 | 6.1325E+01
F28 | Max | 4.4333E+03 | 5.4138E+03 | 4.5480E+03 | 4.3490E+03 | 4.8154E+03
F28 | Min | 3.0000E+03 | 4.0092E+03 | 3.4908E+03 | 3.0000E+03 | 3.0000E+03
F28 | Mean | 3.0079E+03 | 4.0606E+03 | 3.6004E+03 | 3.0063E+03 | 3.0065E+03
F28 | Var | 6.8101E+01 | 9.2507E+01 | 8.5705E+01 | 4.2080E+01 | 5.3483E+01
F29 | Max | 2.5038E+07 | 1.6181E+08 | 5.9096E+07 | 7.0928E+07 | 6.4392E+07
F29 | Min | 3.2066E+03 | 1.0014E+08 | 3.1755E+03 | 3.1005E+03 | 3.3287E+03
F29 | Mean | 5.7057E+04 | 1.0476E+08 | 8.5591E+04 | 6.8388E+04 | 2.8663E+04
F29 | Var | 4.9839E+05 | 6.7995E+06 | 1.7272E+06 | 1.5808E+06 | 8.6740E+05


Figure 3: The effect of the two parameters on the performance of RF models: (a) the effect of ntree; (b) the effect of mtry.

Variable setting: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and out-of-bag error OOB_error.
Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA.
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF.
(1) Begin
(2) /* Build classification model based on DMSDL-QBSA-RF */
(3) Initialize the positions of N birds using equations (16) and (17), Xi (i = 1, 2, ..., N)
(4) Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5) While iter < itermax + 1 do
(6)   For i = 1 : ntree
(7)     Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)     Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)     Recursively generate each decision tree without pruning operations
(10)  End For
(11)  Update classification accuracy of RF; evaluate f(Xi)
(12)  Update gbest and Pgbest
(13)  [nbest, mbest] = gbest
(14)  iter = iter + 1
(15) End While
(16) /* Classify using RF model */
(17) For i = 1 : nbest
(18)   Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)   Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)   Recursively generate each decision tree without pruning operations
(21) End
(22) Return DMSDL-QBSA-RF classification model
(23) Classify the test dataset using equation (20)
(24) Calculate OOB error
(25) End

ALGORITHM 2: DMSDL-QBSA-RF classification model.
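A compressed sketch of the outer loop of Algorithm 2 is given below, assuming scikit-learn. Each candidate position encodes the pair (ntree, mtry), and the fitness is the out-of-bag accuracy (1 minus the OOB error computed in the algorithm). For brevity, a plain random search stands in for the DMSDL-QBSA position-update rules; the function name `tune_rf_oob` and the search ranges are assumptions for illustration:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def tune_rf_oob(X, y, n_candidates=10, seed=0):
    """Search (ntree, mtry) for the RF, scoring each candidate by
    out-of-bag accuracy; the best pair plays the role of gbest."""
    rng = np.random.default_rng(seed)
    best, best_fit = None, -np.inf
    for _ in range(n_candidates):
        ntree = int(rng.integers(50, 301))           # candidate number of trees
        mtry = int(rng.integers(1, X.shape[1] + 1))  # candidate split attributes
        rf = RandomForestClassifier(n_estimators=ntree, max_features=mtry,
                                    bootstrap=True, oob_score=True,
                                    random_state=seed).fit(X, y)
        if rf.oob_score_ > best_fit:                 # keep the global best
            best, best_fit = (ntree, mtry), rf.oob_score_
    return best, best_fit
```

The trees themselves are grown fully (scikit-learn does not prune by default), matching steps (7)–(9) of the pseudocode.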

Table 11: Continued.

Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA
F30 | Max | 5.2623E+05 | 2.8922E+07 | 1.1938E+06 | 1.2245E+06 | 1.1393E+06
F30 | Min | 5.9886E+03 | 1.9017E+07 | 3.7874E+03 | 3.6290E+03 | 3.6416E+03
F30 | Mean | 7.4148E+03 | 2.0002E+07 | 5.5468E+03 | 4.3605E+03 | 4.1746E+03
F30 | Var | 1.1434E+04 | 1.1968E+06 | 3.2255E+04 | 2.2987E+04 | 1.8202E+04


[Figure: flow from data selection and preprocessing (removal of redundant attributes and pretreatment of the information to be identified), through attribute discretization, attribute reduction and data normalization, and model building based on DMSDL-QBSA-RF, to RF classification and the output.]

Figure 4: Block diagram of the oil layer classification system based on DMSDL-QBSA-RF.

Table 13: Distribution of the logging data.

Well | Training depth (m) | Data | Oil/gas layers | Dry layer | Test depth (m) | Data | Oil/gas layers | Dry layer
W1 | 3027–3058 | 250 | 203 | 47 | 3250–3450 | 1600 | 237 | 1363
W2 | 2642–2648 | 30 | 10 | 20 | 2940–2978 | 191 | 99 | 92
W3 | 1040–1059 | 160 | 47 | 113 | 1120–1290 | 1114 | 96 | 1018

Table 12: Comparison of numerical testing results on eight UCI datasets.

Dataset | RF (%) | BSA-RF (%) | DMSDL-BSA-RF (%) | DMSDL-QBSA-RF (%)
Blood | 75.40 | 80.80 | 79.46 | 81.25
Heart-statlog | 82.10 | 86.42 | 86.42 | 86.42
Sonar | 78.39 | 85.48 | 85.48 | 88.71
Appendicitis | 83.23 | 87.10 | 90.32 | 93.55
Cleve | 79.55 | 87.50 | 88.64 | 89.77
Magic | 88.95 | 89.85 | 90.25 | 89.32
Mammographic | 74.73 | 77.68 | 76.79 | 79.46
Australian | 85.83 | 88.84 | 88.84 | 90.78

Table 14: Attribute reduction results of the logging dataset.

Well | Attribute set | Attributes
W1 | Actual | GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
W1 | Reduced | GR, DT, SP, LLD, LLS, DEN, NPHI
W2 | Actual | DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
W2 | Reduced | NEUTRO, PERM, POR, RESI, SW
W3 | Actual | AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
W3 | Reduced | AC, GR, RI, RXO, SP

Figure 5: The normalized curves of the attributes: (a) and (b) attribute normalization of W1; (c) and (d) attribute normalization of W2; (e) and (f) attribute normalization of W3.

Table 15: Performance of various well data.

Well | Classification model | RMSE | MAE | Accuracy (%) | Running time (s)
W1 | RF | 0.3326 | 0.1106 | 88.94 | 1.5167
W1 | SVM | 0.2681 | 0.0719 | 92.81 | 4.5861
W1 | BSA-RF | 0.3269 | 0.1069 | 89.31 | 1.8064
W1 | DMSDL-BSA-RF | 0.2806 | 0.0788 | 92.13 | 2.3728
W1 | DMSDL-QBSA-RF | 0.2449 | 0.0600 | 94.00 | 1.4663
W2 | RF | 0.4219 | 0.1780 | 82.20 | 3.1579
W2 | SVM | 0.2983 | 0.0890 | 91.10 | 4.2604
W2 | BSA-RF | 0.3963 | 0.1571 | 84.29 | 1.2979
W2 | DMSDL-BSA-RF | 0.2506 | 0.062827 | 93.72 | 1.6124
W2 | DMSDL-QBSA-RF | 0.2399 | 0.0576 | 94.24 | 1.2287
W3 | RF | 0.4028 | 0.1622 | 83.78 | 2.4971
W3 | SVM | 0.2507 | 0.0628 | 93.72 | 2.1027
W3 | BSA-RF | 0.3631 | 0.1318 | 86.81 | 1.3791
W3 | DMSDL-BSA-RF | 0.2341 | 0.0548 | 94.52 | 0.3125
W3 | DMSDL-QBSA-RF | 0.0519 | 0.0027 | 99.73 | 0.9513



5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of these three strategies on the performance of DMSDL-QBSA revealed that the hybrid method markedly improves the original BSA. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA can provide more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA in a more comprehensive manner, and DMSDL-QBSA again shows excellent performance on these test functions. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem have proved that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, and this accuracy is much higher than that of the original RF model. At the same time, the running speed is faster than that of the other four advanced classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Due to the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA; therefore, deploying the proposed algorithm with higher recognition efficiency is a worthwhile direction. In future research work, the method presented in this paper can also be extended to solving discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical science diagnosis, is also interesting future work.

Data Availability

All data included in this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), the Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), the Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), the Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and the Project of Industry-University Cooperative Education of the Ministry of Education of China (no. 201801335014).

1

0

Labe

ls

2940 2945 2950 2955 2960Depth (m)

2965 2970 2975 2980

(c)

2940 2945 2950 2955 2960Depth (m)

2965 2970 2975 2980

1

0

Labe

ls

(d)

Labe

ls

1140 1160 1180 1200 1220Depth (m)

1240 1260 1280 13000

1

(e)

1140 1160 1180 1200 1220 1240 1260 1280 1300Depth (m)

Labe

ls

0

1

(f )

Figure 6 Classification of DMSDL-QBSA-RF (a) the actual oil layer distribution ofW1 (b) DMSDL-QBSA-RF oil layer distribution ofW1(c) the actual oil layer distribution of W2 (d) DMSDL-QBSA-RF oil layer distribution of 2 (e) the actual oil layer distribution of 3 (f )DMSDL-QBSA-RF oil layer distribution of 3

22 Computational Intelligence and Neuroscience

References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.

[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.

[4] X. Li, A new intelligent optimization method: artificial fish school algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.

[5] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.

[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.

[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673–687, 2016.

[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University - Computer and Information Sciences, vol. 32, pp. 514–521, 2020.

[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," Neural Computing and Applications, Springer, New York, NY, USA, 2019.

[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.

[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.

[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197–207, 2014.

[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.

[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17–28, 2014.

[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137–146, 2012.

[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, "Multi-strategy brain storm optimization algorithm with dynamic parameters adjustment," Applied Intelligence, Guildford, UK, 2010.

[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189–120201, 2019.

[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.

[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.

[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.

[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.

[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.

[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.

[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.

[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.

[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," IEEE Transactions on Evolutionary Computation, Piscataway, NJ, USA, February 2019.

[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.

[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.

[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.

[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.

[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.

[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.

[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.

[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.

[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.

[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China Information Sciences, vol. 62, no. 7, 2019.

[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.

[41] S. Shahrzad, M. Seyedali, and L. Andrew, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.

[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22–34, 2020.

[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.


Step 5. The whole oil section is identified by the trained DMSDL-QBSA-RF layer classification model, and the classification results are output.

In order to verify the application effect of the DMSDL-QBSA-RF layer classification model, we select three actual logging datasets of oil and gas wells for training and testing.

4.2. Practical Application. In Section 2.3, the performance of the proposed DMSDL-QBSA is simulated and analyzed on benchmark functions, and in Section 3.3 the effectiveness of the improved RF classification model optimized by the proposed DMSDL-QBSA is tested and verified on two-dimensional UCI datasets. In order to test the application effect of the improved DMSDL-QBSA-RF layer classification model, three actual logging datasets are adopted, recorded as W1, W2, and W3. W1 is a gas well in Xi'an (China), W2 is a gas well in Shanxi (China), and W3 is an oil well in Xinjiang (China). The depth ranges and the corresponding rock sample analysis data of the three wells selected in the experiment are shown in Table 13.

Attribute reduction on the actual logging datasets is performed before the training of the DMSDL-QBSA-RF classification model on the training dataset, as shown in Table 14. Then these attributes are normalized, as shown in Figure 5, where the horizontal axis represents the depth and the vertical axis represents the normalized value.
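Figure 5 shows each attribute scaled into [0, 1]. The exact scaling scheme is not spelled out in this section, so the sketch below assumes plain per-attribute min-max normalization; `min_max_normalize` and the GR readings are illustrative, not taken from the paper.

```python
def min_max_normalize(values):
    """Scale one attribute's readings into [0, 1] via min-max normalization.

    Assumption: each log attribute is normalized independently; a constant
    column is mapped to all zeros to avoid division by zero.
    """
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

# Example: a short run of hypothetical GR readings along depth
gr = [52.0, 78.5, 61.0, 95.0, 52.0]
norm_gr = min_max_normalize(gr)
print(norm_gr[0], norm_gr[3])  # 0.0 1.0 (column min and max)
```

Applied column by column to the reduced attribute set of each well, this yields curves like those plotted in Figure 5.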

The logging dataset after attribute reduction and normalization is used to train the oil and gas layer classification model. In order to measure the performance of the DMSDL-QBSA-RF classification model, we compare the improved classification model with several popular oil and gas layer classification models: the standard RF model, the SVM model, the BSA-RF model, and the DMSDL-BSA-RF model. Here, the RF classification model is applied to the logging field for the first time. In order to evaluate the performance of the recognition model, we select the following performance indicators:

$$\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{i=1}^{N}\left(f_{i}-y_{i}\right)^{2}}, \qquad \mathrm{MAE} = \frac{1}{N}\sum_{i=1}^{N}\left|f_{i}-y_{i}\right|. \quad (21)$$
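Equation (21) translates directly into a few lines of Python; the label vectors below are hypothetical and only illustrate the computation.

```python
import math

def rmse(f, y):
    """Root mean square error between expected outputs f and model outputs y."""
    return math.sqrt(sum((fi - yi) ** 2 for fi, yi in zip(f, y)) / len(f))

def mae(f, y):
    """Mean absolute error between expected outputs f and model outputs y."""
    return sum(abs(fi - yi) for fi, yi in zip(f, y)) / len(f)

# Hypothetical 0/1 layer labels: expected vs. predicted
expected  = [1, 0, 1, 1, 0, 0, 1, 0]
predicted = [1, 0, 0, 1, 0, 1, 1, 0]
print(rmse(expected, predicted))  # sqrt(2/8) = 0.5
print(mae(expected, predicted))   # 2/8 = 0.25
```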

Table 8. Comparison on 8 unimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

| Function | Term | GWO | WOA | SCA | GOA | SSA | DMSDL-QBSA |
|---|---|---|---|---|---|---|---|
| f1 | Max | 1.3396E+04 | 1.4767E+04 | 1.3310E+04 | 2.0099E+01 | 4.8745E+03 | 9.8570E-01 |
| f1 | Min | 0 | 0 | 4.2905E-293 | 8.6468E-17 | 0 | 0 |
| f1 | Mean | 3.5990E+00 | 4.7621E+00 | 1.4014E+02 | 7.0100E-02 | 4.5864E+00 | 1.0483E-04 |
| f1 | Var | 1.7645E+02 | 2.0419E+02 | 8.5054E+02 | 4.4200E-01 | 1.4148E+02 | 9.8725E-03 |
| f2 | Max | 3.6021E+02 | 2.5789E+03 | 6.5027E+01 | 9.3479E+01 | 3.4359E+01 | 3.2313E-01 |
| f2 | Min | 0 | 0 | 9.8354E-192 | 2.8954E-03 | 2.4642E-181 | 0 |
| f2 | Mean | 5.0667E-02 | 2.9480E-01 | 4.0760E-01 | 3.1406E+00 | 1.7000E-02 | 1.1278E-04 |
| f2 | Var | 3.7270E+00 | 2.6091E+01 | 2.2746E+00 | 3.9264E+00 | 5.4370E-01 | 4.8000E-03 |
| f3 | Max | 1.8041E+04 | 1.6789E+04 | 2.4921E+04 | 6.5697E+03 | 1.1382E+04 | 4.0855E+01 |
| f3 | Min | 0 | 1.0581E-18 | 7.6116E-133 | 2.8796E-01 | 3.2956E-253 | 1.5918E-264 |
| f3 | Mean | 7.4511E+00 | 2.8838E+02 | 4.0693E+02 | 4.8472E+02 | 9.2062E+00 | 4.8381E-03 |
| f3 | Var | 2.6124E+02 | 1.4642E+03 | 1.5913E+03 | 7.1786E+02 | 2.9107E+02 | 4.1383E-01 |
| f4 | Max | 2.1812E+09 | 5.4706E+09 | 8.4019E+09 | 1.1942E+09 | 4.9386E+08 | 7.5188E+01 |
| f4 | Min | 4.9125E+00 | 3.5695E+00 | 5.9559E+00 | 2.2249E+02 | 4.9806E+00 | 2.9279E-13 |
| f4 | Mean | 4.9592E+05 | 2.4802E+06 | 4.4489E+07 | 5.0021E+06 | 3.3374E+05 | 2.4033E-02 |
| f4 | Var | 2.9484E+07 | 1.1616E+08 | 4.5682E+08 | 3.9698E+07 | 1.1952E+07 | 9.7253E-01 |
| f5 | Max | 1.8222E+04 | 1.5374E+04 | 1.5874E+04 | 1.2132E+03 | 1.6361E+04 | 1.8007E+00 |
| f5 | Min | 1.1334E-08 | 8.3228E-09 | 2.3971E-01 | 2.7566E-10 | 2.6159E-16 | 1.0272E-33 |
| f5 | Mean | 5.1332E+00 | 5.9967E+00 | 1.2620E+02 | 5.8321E+01 | 8.8985E+00 | 2.3963E-04 |
| f5 | Var | 2.3617E+02 | 2.3285E+02 | 8.8155E+02 | 1.0872E+02 | 2.9986E+02 | 1.8500E-02 |
| f6 | Max | 7.4088E+00 | 8.3047E+00 | 8.8101E+00 | 6.8900E-01 | 4.4298E+00 | 2.5787E-01 |
| f6 | Min | 1.8112E-05 | 3.9349E-05 | 4.8350E-05 | 8.9528E-02 | 4.0807E-05 | 1.0734E-04 |
| f6 | Mean | 1.8333E-03 | 3.2667E-03 | 4.8400E-02 | 9.3300E-02 | 2.1000E-03 | 5.2825E-04 |
| f6 | Var | 9.4267E-02 | 1.0077E-01 | 2.9410E-01 | 3.5900E-02 | 6.5500E-02 | 4.6000E-03 |
| f7 | Max | 7.3626E+02 | 6.8488E+02 | 8.0796E+02 | 3.9241E+02 | 8.2036E+02 | 1.4770E+01 |
| f7 | Min | 0 | 0 | 1.9441E-292 | 7.9956E-07 | 0 | 0 |
| f7 | Mean | 2.0490E-01 | 2.8060E-01 | 4.9889E+00 | 1.6572E+01 | 2.7290E-01 | 1.8081E-03 |
| f7 | Var | 9.5155E+00 | 1.0152E+01 | 3.5531E+01 | 2.3058E+01 | 1.0581E+01 | 1.6033E-01 |
| f8 | Max | 1.2749E+03 | 5.9740E+02 | 3.2527E+02 | 2.3425E+02 | 2.0300E+02 | 2.0423E+01 |
| f8 | Min | 0 | 4.3596E-35 | 1.5241E-160 | 3.6588E-05 | 1.0239E-244 | 0 |
| f8 | Mean | 3.1317E-01 | 1.0582E+01 | 1.0457E+01 | 1.2497E+01 | 2.1870E-01 | 2.6947E-03 |
| f8 | Var | 1.4416E+01 | 4.3485E+01 | 3.5021E+01 | 2.5766E+01 | 6.2362E+00 | 2.1290E-01 |


where y_i and f_i are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show actual forecasting errors. Table 15 records the performance indicator data of each classification model, with the best results marked in bold. The smaller the RMSE and MAE, the better the classification model performs.

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model achieves the best recognition accuracy, with all accuracies above 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it also performs well for oil and gas layer classification on the other performance indicators and the other wells. Secondly, DMSDL-QBSA can improve the performance of RF: the parameters found by DMSDL-QBSA and used in the RF classification model improve the classification accuracy while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means the given depth has no oil or gas and 1 means it has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the oil test results: it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is therefore suitable for petroleum logging applications, which greatly reduces the difficulty of oil exploration, and it has good application prospects.
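The agreement pictured in Figure 6 is simply the per-depth match rate between the actual and predicted 0/1 layer labels. A minimal sketch, with hypothetical labels (`layer_agreement` is an illustrative helper, not code from the paper):

```python
def layer_agreement(actual, predicted):
    """Fraction of depth samples where the predicted 0/1 layer label
    matches the actual label, i.e. the accuracy visualised in Figure 6."""
    assert len(actual) == len(predicted)
    hits = sum(1 for a, p in zip(actual, predicted) if a == p)
    return hits / len(actual)

# Hypothetical labels at five consecutive depth samples
actual    = [0, 1, 1, 0, 0]
predicted = [0, 1, 0, 0, 0]
print(layer_agreement(actual, predicted))  # 0.8
```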

Table 9. Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000).

| Function | Term | GWO | WOA | SCA | GOA | SSA | DMSDL-QBSA |
|---|---|---|---|---|---|---|---|
| f10 | Max | 2.9544E+03 | 2.9903E+03 | 2.7629E+03 | 2.9445E+03 | 3.2180E+03 | 3.0032E+03 |
| f10 | Min | 1.3037E+03 | 8.6892E-04 | 1.5713E+03 | 1.3438E+03 | 1.2839E-04 | 1.2922E+03 |
| f10 | Mean | 1.6053E+03 | 1.4339E+01 | 1.7860E+03 | 1.8562E+03 | 2.5055E+02 | 1.2960E+03 |
| f10 | Var | 2.6594E+02 | 1.2243E+02 | 1.6564E+02 | 5.1605E+02 | 3.3099E+02 | 5.8691E+01 |
| f11 | Max | 1.3792E+02 | 1.2293E+02 | 1.2313E+02 | 1.1249E+02 | 3.2180E+03 | 1.8455E+01 |
| f11 | Min | 0 | 0 | 0 | 1.2437E+01 | 1.2839E-04 | 0 |
| f11 | Mean | 4.4220E-01 | 1.1252E+00 | 9.9316E+00 | 2.8378E+01 | 2.5055E+02 | 3.2676E-03 |
| f11 | Var | 4.3784E+00 | 6.5162E+00 | 1.9180E+01 | 1.6240E+01 | 3.3099E+02 | 1.9867E-01 |
| f12 | Max | 2.0257E+01 | 2.0043E+01 | 1.9440E+01 | 1.6623E+01 | 3.2180E+03 | 1.9113E+00 |
| f12 | Min | 4.4409E-15 | 3.2567E-15 | 3.2567E-15 | 2.3168E+00 | 1.2839E-04 | 8.8818E-16 |
| f12 | Mean | 1.7200E-02 | 4.2200E-02 | 8.8870E-01 | 5.5339E+00 | 2.5055E+02 | 3.1275E-04 |
| f12 | Var | 4.7080E-01 | 6.4937E-01 | 3.0887E+00 | 2.8866E+00 | 3.3099E+02 | 2.2433E-02 |
| f13 | Max | 1.5246E+02 | 1.6106E+02 | 1.1187E+02 | 6.1505E+01 | 3.2180E+03 | 3.1560E-01 |
| f13 | Min | 3.3000E-03 | 0 | 0 | 2.4147E-01 | 1.2839E-04 | 0 |
| f13 | Mean | 4.6733E-02 | 9.3867E-02 | 1.2094E+00 | 3.7540E+00 | 2.5055E+02 | 3.3660E-05 |
| f13 | Var | 1.9297E+00 | 2.6570E+00 | 6.8476E+00 | 4.1936E+00 | 3.3099E+02 | 3.1721E-03 |
| f14 | Max | 9.5993E+07 | 9.9026E+07 | 5.9355E+07 | 6.1674E+06 | 3.2180E+03 | 6.4903E-01 |
| f14 | Min | 3.8394E-09 | 1.1749E-08 | 9.6787E-03 | 1.8099E-04 | 1.2839E-04 | 4.7116E-32 |
| f14 | Mean | 1.2033E+04 | 3.5007E+04 | 4.8303E+05 | 1.0465E+04 | 2.5055E+02 | 8.9321E-05 |
| f14 | Var | 9.8272E+05 | 1.5889E+06 | 4.0068E+06 | 1.9887E+05 | 3.3099E+02 | 6.8667E-03 |
| f15 | Max | 2.2691E+08 | 2.4717E+08 | 1.1346E+08 | 2.8101E+07 | 3.2180E+03 | 1.6407E-01 |
| f15 | Min | 3.2467E-02 | 4.5345E-08 | 1.1922E-01 | 3.5465E-05 | 1.2839E-04 | 1.3498E-32 |
| f15 | Mean | 2.9011E+04 | 4.3873E+04 | 6.5529E+05 | 7.2504E+04 | 2.5055E+02 | 6.7357E-05 |
| f15 | Var | 2.3526E+06 | 2.7453E+06 | 7.1864E+06 | 1.2814E+06 | 3.3099E+02 | 2.4333E-03 |
| f16 | Max | 1.7692E+01 | 1.7142E+01 | 1.6087E+01 | 8.7570E+00 | 3.2180E+03 | 1.0959E+00 |
| f16 | Min | 2.6210E-07 | 0.0000E+00 | 6.2663E-155 | 1.0497E-02 | 1.2839E-04 | 0 |
| f16 | Mean | 1.0133E-02 | 3.9073E-01 | 5.9003E-01 | 2.4770E+00 | 2.5055E+02 | 1.5200E-04 |
| f16 | Var | 3.0110E-01 | 9.6267E-01 | 1.4701E+00 | 1.9985E+00 | 3.3099E+02 | 1.1633E-02 |
| f18 | Max | 4.4776E+01 | 4.3588E+01 | 3.9095E+01 | 1.7041E+01 | 3.2180E+03 | 6.5613E-01 |
| f18 | Min | 1.9360E-01 | 9.4058E-08 | 2.2666E-01 | 3.9111E+00 | 1.2839E-04 | 1.4998E-32 |
| f18 | Mean | 2.1563E-01 | 4.7800E-02 | 9.8357E-01 | 5.4021E+00 | 2.5055E+02 | 9.4518E-05 |
| f18 | Var | 5.8130E-01 | 1.0434E+00 | 3.0643E+00 | 1.6674E+00 | 3.3099E+02 | 7.0251E-03 |


Table 10. Comparison of numerical testing results on CEC2014 test sets (F1–F15, Dim = 10).

| Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA |
|---|---|---|---|---|---|---|
| F1 | Max | 2.7411E+08 | 3.6316E+09 | 6.8993E+08 | 3.9664E+08 | 9.6209E+08 |
| F1 | Min | 1.4794E+07 | 3.1738E+09 | 1.3205E+06 | 1.6929E+05 | 1.7687E+05 |
| F1 | Mean | 3.1107E+07 | 3.2020E+09 | 6.7081E+06 | 1.3567E+06 | 1.9320E+06 |
| F1 | Var | 1.2900E+07 | 4.0848E+07 | 2.6376E+07 | 1.0236E+07 | 1.3093E+07 |
| F2 | Max | 8.7763E+09 | 1.3597E+10 | 1.3515E+10 | 1.6907E+10 | 1.7326E+10 |
| F2 | Min | 1.7455E+09 | 1.1878E+10 | 3.6982E+04 | 2.1268E+05 | 1.4074E+05 |
| F2 | Mean | 1.9206E+09 | 1.1951E+10 | 1.7449E+08 | 5.9354E+07 | 4.8412E+07 |
| F2 | Var | 1.9900E+08 | 1.5615E+08 | 9.2365E+08 | 5.1642E+08 | 4.2984E+08 |
| F3 | Max | 4.8974E+05 | 8.5863E+04 | 2.6323E+06 | 2.4742E+06 | 5.3828E+06 |
| F3 | Min | 1.1067E+04 | 1.4616E+04 | 1.8878E+03 | 1.2967E+03 | 6.3986E+02 |
| F3 | Mean | 1.3286E+04 | 1.4976E+04 | 1.0451E+04 | 3.5184E+03 | 3.0848E+03 |
| F3 | Var | 6.5283E+03 | 1.9988E+03 | 4.5736E+04 | 4.1580E+04 | 8.0572E+04 |
| F4 | Max | 3.5252E+03 | 9.7330E+03 | 4.5679E+03 | 4.9730E+03 | 4.3338E+03 |
| F4 | Min | 5.0446E+02 | 8.4482E+03 | 4.0222E+02 | 4.0843E+02 | 4.1267E+02 |
| F4 | Mean | 5.8061E+02 | 8.5355E+03 | 4.4199E+02 | 4.3908E+02 | 4.3621E+02 |
| F4 | Var | 1.4590E+02 | 1.3305E+02 | 1.6766E+02 | 1.2212E+02 | 8.5548E+01 |
| F5 | Max | 5.2111E+02 | 5.2106E+02 | 5.2075E+02 | 5.2098E+02 | 5.2110E+02 |
| F5 | Min | 5.2001E+02 | 5.2038E+02 | 5.2027E+02 | 5.2006E+02 | 5.2007E+02 |
| F5 | Mean | 5.2003E+02 | 5.2041E+02 | 5.2033E+02 | 5.2014E+02 | 5.2014E+02 |
| F5 | Var | 7.8380E-02 | 5.7620E-02 | 5.6433E-02 | 1.0577E-01 | 9.9700E-02 |
| F6 | Max | 6.1243E+02 | 6.1299E+02 | 6.1374E+02 | 6.1424E+02 | 6.1569E+02 |
| F6 | Min | 6.0881E+02 | 6.1157E+02 | 6.0514E+02 | 6.0288E+02 | 6.0257E+02 |
| F6 | Mean | 6.0904E+02 | 6.1164E+02 | 6.0604E+02 | 6.0401E+02 | 6.0358E+02 |
| F6 | Var | 2.5608E-01 | 1.2632E-01 | 1.0434E+00 | 1.1717E+00 | 1.3117E+00 |
| F7 | Max | 8.7895E+02 | 1.0459E+03 | 9.1355E+02 | 1.0029E+03 | 9.3907E+02 |
| F7 | Min | 7.4203E+02 | 1.0238E+03 | 7.0013E+02 | 7.0081E+02 | 7.0069E+02 |
| F7 | Mean | 7.4322E+02 | 1.0253E+03 | 7.1184E+02 | 7.0332E+02 | 7.0290E+02 |
| F7 | Var | 4.3160E+00 | 2.4258E+00 | 2.8075E+01 | 1.5135E+01 | 1.3815E+01 |
| F8 | Max | 8.9972E+02 | 9.1783E+02 | 9.6259E+02 | 9.2720E+02 | 9.3391E+02 |
| F8 | Min | 8.4904E+02 | 8.8172E+02 | 8.3615E+02 | 8.1042E+02 | 8.0877E+02 |
| F8 | Mean | 8.5087E+02 | 8.8406E+02 | 8.5213E+02 | 8.1888E+02 | 8.1676E+02 |
| F8 | Var | 2.1797E+00 | 4.4494E+00 | 8.6249E+00 | 9.2595E+00 | 9.7968E+00 |
| F9 | Max | 1.0082E+03 | 9.9538E+02 | 1.0239E+03 | 1.0146E+03 | 1.0366E+03 |
| F9 | Min | 9.3598E+02 | 9.6805E+02 | 9.2725E+02 | 9.2062E+02 | 9.1537E+02 |
| F9 | Mean | 9.3902E+02 | 9.7034E+02 | 9.4290E+02 | 9.2754E+02 | 9.2335E+02 |
| F9 | Var | 3.4172E+00 | 3.2502E+00 | 8.1321E+00 | 9.0492E+00 | 9.3860E+00 |
| F10 | Max | 3.1087E+03 | 3.5943E+03 | 3.9105E+03 | 3.9116E+03 | 3.4795E+03 |
| F10 | Min | 2.2958E+03 | 2.8792E+03 | 1.5807E+03 | 1.4802E+03 | 1.4316E+03 |
| F10 | Mean | 1.8668E+03 | 2.9172E+03 | 1.7627E+03 | 1.6336E+03 | 1.6111E+03 |
| F10 | Var | 3.2703E+01 | 6.1787E+01 | 2.2359E+02 | 1.9535E+02 | 2.2195E+02 |
| F11 | Max | 3.6641E+03 | 3.4628E+03 | 4.0593E+03 | 3.5263E+03 | 3.7357E+03 |
| F11 | Min | 2.4726E+03 | 2.8210E+03 | 1.7855E+03 | 1.4012E+03 | 1.2811E+03 |
| F11 | Mean | 2.5553E+03 | 2.8614E+03 | 1.8532E+03 | 1.5790E+03 | 1.4869E+03 |
| F11 | Var | 8.4488E+01 | 5.6351E+01 | 1.3201E+02 | 2.1085E+02 | 2.6682E+02 |
| F12 | Max | 1.2044E+03 | 1.2051E+03 | 1.2053E+03 | 1.2055E+03 | 1.2044E+03 |
| F12 | Min | 1.2006E+03 | 1.2014E+03 | 1.2004E+03 | 1.2003E+03 | 1.2017E+03 |
| F12 | Mean | 1.2007E+03 | 1.2017E+03 | 1.2007E+03 | 1.2003E+03 | 1.2018E+03 |
| F12 | Var | 1.4482E-01 | 3.5583E-01 | 2.5873E-01 | 2.4643E-01 | 2.1603E-01 |
| F13 | Max | 1.3049E+03 | 1.3072E+03 | 1.3073E+03 | 1.3056E+03 | 1.3061E+03 |
| F13 | Min | 1.3005E+03 | 1.3067E+03 | 1.3009E+03 | 1.3003E+03 | 1.3004E+03 |
| F13 | Mean | 1.3006E+03 | 1.3068E+03 | 1.3011E+03 | 1.3005E+03 | 1.3006E+03 |
| F13 | Var | 3.2018E-01 | 5.0767E-02 | 4.2173E-01 | 3.6570E-01 | 2.9913E-01 |
| F14 | Max | 1.4395E+03 | 1.4565E+03 | 1.4775E+03 | 1.4749E+03 | 1.4493E+03 |
| F14 | Min | 1.4067E+03 | 1.4522E+03 | 1.4009E+03 | 1.4003E+03 | 1.4005E+03 |
| F14 | Mean | 1.4079E+03 | 1.4529E+03 | 1.4024E+03 | 1.4008E+03 | 1.4009E+03 |
| F14 | Var | 2.1699E+00 | 6.3013E-01 | 5.3198E+00 | 3.2578E+00 | 2.8527E+00 |
| F15 | Max | 5.4068E+04 | 3.3586E+04 | 4.8370E+05 | 4.0007E+05 | 1.9050E+05 |
| F15 | Min | 1.9611E+03 | 2.1347E+04 | 1.5029E+03 | 1.5027E+03 | 1.5026E+03 |
| F15 | Mean | 1.9933E+03 | 2.2417E+04 | 2.7920E+03 | 1.5860E+03 | 1.5816E+03 |
| F15 | Var | 6.1622E+02 | 1.2832E+03 | 1.7802E+04 | 4.4091E+03 | 2.7233E+03 |


Table 11. Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10).

| Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA |
|---|---|---|---|---|---|---|
| F16 | Max | 1.6044E+03 | 1.6044E+03 | 1.6046E+03 | 1.6044E+03 | 1.6046E+03 |
| F16 | Min | 1.6034E+03 | 1.6041E+03 | 1.6028E+03 | 1.6021E+03 | 1.6019E+03 |
| F16 | Mean | 1.6034E+03 | 1.6041E+03 | 1.6032E+03 | 1.6024E+03 | 1.6024E+03 |
| F16 | Var | 3.3300E-02 | 3.5900E-02 | 2.5510E-01 | 3.7540E-01 | 3.6883E-01 |
| F17 | Max | 1.3711E+07 | 1.6481E+06 | 3.3071E+07 | 4.6525E+07 | 8.4770E+06 |
| F17 | Min | 1.2526E+05 | 4.7216E+05 | 2.7261E+03 | 2.0177E+03 | 2.0097E+03 |
| F17 | Mean | 1.6342E+05 | 4.8499E+05 | 8.7769E+04 | 3.1084E+04 | 2.1095E+04 |
| F17 | Var | 2.0194E+05 | 2.9949E+04 | 1.2284E+06 | 8.7638E+05 | 2.0329E+05 |
| F18 | Max | 7.7173E+07 | 3.5371E+07 | 6.1684E+08 | 1.4216E+08 | 6.6050E+08 |
| F18 | Min | 4.1934E+03 | 2.6103E+06 | 2.2743E+03 | 1.8192E+03 | 1.8288E+03 |
| F18 | Mean | 1.6509E+04 | 3.4523E+06 | 1.2227E+06 | 1.0781E+05 | 1.9475E+05 |
| F18 | Var | 7.9108E+05 | 1.4888E+06 | 2.1334E+07 | 3.6818E+06 | 1.0139E+07 |
| F19 | Max | 1.9851E+03 | 2.5657E+03 | 2.0875E+03 | 1.9872E+03 | 1.9555E+03 |
| F19 | Min | 1.9292E+03 | 2.4816E+03 | 1.9027E+03 | 1.9023E+03 | 1.9028E+03 |
| F19 | Mean | 1.9299E+03 | 2.4834E+03 | 1.9044E+03 | 1.9032E+03 | 1.9036E+03 |
| F19 | Var | 1.0820E+00 | 3.8009E+00 | 1.1111E+01 | 3.3514E+00 | 1.4209E+00 |
| F20 | Max | 8.3021E+06 | 2.0160E+07 | 1.0350E+07 | 6.1162E+07 | 1.2708E+08 |
| F20 | Min | 5.6288E+03 | 1.7838E+06 | 2.1570E+03 | 2.0408E+03 | 2.0241E+03 |
| F20 | Mean | 1.2260E+04 | 1.8138E+06 | 5.6957E+03 | 1.1337E+04 | 4.5834E+04 |
| F20 | Var | 1.0918E+05 | 5.9134E+05 | 1.3819E+05 | 6.4167E+05 | 2.1988E+06 |
| F21 | Max | 2.4495E+06 | 1.7278E+09 | 2.0322E+06 | 1.3473E+07 | 2.6897E+07 |
| F21 | Min | 4.9016E+03 | 1.4049E+09 | 3.3699E+03 | 2.1842E+03 | 2.2314E+03 |
| F21 | Mean | 6.6613E+03 | 1.4153E+09 | 9.9472E+03 | 1.3972E+04 | 7.4587E+03 |
| F21 | Var | 4.1702E+04 | 2.2557E+07 | 3.3942E+04 | 2.5098E+05 | 2.8735E+05 |
| F22 | Max | 2.8304E+03 | 4.9894E+03 | 3.1817E+03 | 3.1865E+03 | 3.2211E+03 |
| F22 | Min | 2.5070E+03 | 4.1011E+03 | 2.3492E+03 | 2.2442E+03 | 2.2314E+03 |
| F22 | Mean | 2.5081E+03 | 4.1175E+03 | 2.3694E+03 | 2.2962E+03 | 2.2687E+03 |
| F22 | Var | 5.4064E+00 | 3.8524E+01 | 4.3029E+01 | 4.8006E+01 | 5.1234E+01 |
| F23 | Max | 2.8890E+03 | 2.6834E+03 | 2.8758E+03 | 3.0065E+03 | 2.9923E+03 |
| F23 | Min | 2.5000E+03 | 2.6031E+03 | 2.4870E+03 | 2.5000E+03 | 2.5000E+03 |
| F23 | Mean | 2.5004E+03 | 2.6088E+03 | 2.5326E+03 | 2.5010E+03 | 2.5015E+03 |
| F23 | Var | 9.9486E+00 | 8.6432E+00 | 5.7045E+01 | 1.4213E+01 | 1.7156E+01 |
| F24 | Max | 2.6293E+03 | 2.6074E+03 | 2.6565E+03 | 2.6491E+03 | 2.6369E+03 |
| F24 | Min | 2.5816E+03 | 2.6049E+03 | 2.5557E+03 | 2.5246E+03 | 2.5251E+03 |
| F24 | Mean | 2.5829E+03 | 2.6052E+03 | 2.5671E+03 | 2.5337E+03 | 2.5338E+03 |
| F24 | Var | 1.3640E+00 | 3.3490E-01 | 8.9434E+00 | 1.3050E+01 | 1.1715E+01 |
| F25 | Max | 2.7133E+03 | 2.7014E+03 | 2.7445E+03 | 2.7122E+03 | 2.7327E+03 |
| F25 | Min | 2.6996E+03 | 2.7003E+03 | 2.6784E+03 | 2.6789E+03 | 2.6635E+03 |
| F25 | Mean | 2.6996E+03 | 2.7004E+03 | 2.6831E+03 | 2.6903E+03 | 2.6894E+03 |
| F25 | Var | 2.3283E-01 | 9.9333E-02 | 4.9609E+00 | 7.1175E+00 | 1.2571E+01 |
| F26 | Max | 2.7056E+03 | 2.8003E+03 | 2.7058E+03 | 2.7447E+03 | 2.7116E+03 |
| F26 | Min | 2.7008E+03 | 2.8000E+03 | 2.7003E+03 | 2.7002E+03 | 2.7002E+03 |
| F26 | Mean | 2.7010E+03 | 2.8000E+03 | 2.7005E+03 | 2.7003E+03 | 2.7003E+03 |
| F26 | Var | 3.1316E-01 | 1.6500E-02 | 3.9700E-01 | 1.1168E+00 | 3.5003E-01 |
| F27 | Max | 3.4052E+03 | 5.7614E+03 | 3.3229E+03 | 3.4188E+03 | 3.4144E+03 |
| F27 | Min | 2.9000E+03 | 3.9113E+03 | 2.9698E+03 | 2.8347E+03 | 2.7054E+03 |
| F27 | Mean | 2.9038E+03 | 4.0351E+03 | 2.9816E+03 | 2.8668E+03 | 2.7219E+03 |
| F27 | Var | 3.2449E+01 | 2.0673E+02 | 3.4400E+01 | 6.2696E+01 | 6.1325E+01 |
| F28 | Max | 4.4333E+03 | 5.4138E+03 | 4.5480E+03 | 4.3490E+03 | 4.8154E+03 |
| F28 | Min | 3.0000E+03 | 4.0092E+03 | 3.4908E+03 | 3.0000E+03 | 3.0000E+03 |
| F28 | Mean | 3.0079E+03 | 4.0606E+03 | 3.6004E+03 | 3.0063E+03 | 3.0065E+03 |
| F28 | Var | 6.8101E+01 | 9.2507E+01 | 8.5705E+01 | 4.2080E+01 | 5.3483E+01 |
| F29 | Max | 2.5038E+07 | 1.6181E+08 | 5.9096E+07 | 7.0928E+07 | 6.4392E+07 |
| F29 | Min | 3.2066E+03 | 1.0014E+08 | 3.1755E+03 | 3.1005E+03 | 3.3287E+03 |
| F29 | Mean | 5.7057E+04 | 1.0476E+08 | 8.5591E+04 | 6.8388E+04 | 2.8663E+04 |
| F29 | Var | 4.9839E+05 | 6.7995E+06 | 1.7272E+06 | 1.5808E+06 | 8.6740E+05 |


Figure 3. The effect of the two parameters on the performance of RF models: (a) the effect of ntree (classification accuracy over ntree = 0–1000); (b) the effect of mtry (classification accuracy over mtry = 0–35).

Variable setting: number of iterations iter; bird positions Xi; local optimum Pi; global optimal position gbest; global optimum Pgbest; out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA.
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF.
(1) Begin
(2)   /* Build classification model based on DMSDL-QBSA-RF */
(3)   Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4)   Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5)   While iter < itermax + 1 do
(6)     For i = 1 : ntree
(7)       Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)       Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)       Recursively generate each decision tree without pruning operations
(10)    End For
(11)    Update classification accuracy of RF: evaluate f(Xi)
(12)    Update gbest and Pgbest
(13)    [nbest, mbest] = gbest
(14)    iter = iter + 1
(15)  End While
(16)  /* Classify using RF model */
(17)  For i = 1 : nbest
(18)    Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)    Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)    Recursively generate each decision tree without pruning operations
(21)  End
(22)  Return DMSDL-QBSA-RF classification model
(23)  Classify the test dataset using equation (20)
(24)  Calculate OOB error
(25) End

Algorithm 2: DMSDL-QBSA-RF classification model.
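Steps (7), (18), and (24) of Algorithm 2 rest on bootstrap sampling and the resulting out-of-bag (OOB) set. The sketch below shows only that sampling mechanism in stdlib Python; `sample_bootstrap` is an illustrative helper, not code from the paper, and the tree growing itself is omitted.

```python
import random

def sample_bootstrap(n, rng):
    """Draw a bootstrap sample of size n (with replacement) from indices
    0..n-1 and return it together with the out-of-bag (OOB) indices, i.e.
    the examples never drawn, which Algorithm 2 uses to estimate OOB error."""
    in_bag = [rng.randrange(n) for _ in range(n)]
    oob = sorted(set(range(n)) - set(in_bag))
    return in_bag, oob

rng = random.Random(42)
in_bag, oob = sample_bootstrap(250, rng)  # e.g. the 250 training samples of W1
print(len(in_bag))      # always 250
print(len(oob) / 250)   # roughly 1/e of the examples end up out-of-bag
```

Each tree gets its own bootstrap sample, and the OOB error is the misclassification rate of each example measured only on the trees that never saw it.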

Table 11 (continued).

| Function | Term | BSA | DE | DMSDL-PSO | DMSDL-BSA | DMSDL-QBSA |
|---|---|---|---|---|---|---|
| F30 | Max | 5.2623E+05 | 2.8922E+07 | 1.1938E+06 | 1.2245E+06 | 1.1393E+06 |
| F30 | Min | 5.9886E+03 | 1.9017E+07 | 3.7874E+03 | 3.6290E+03 | 3.6416E+03 |
| F30 | Mean | 7.4148E+03 | 2.0002E+07 | 5.5468E+03 | 4.3605E+03 | 4.1746E+03 |
| F30 | Var | 1.1434E+04 | 1.1968E+06 | 3.2255E+04 | 2.2987E+04 | 1.8202E+04 |


Figure 4. Block diagram of the oil layer classification system based on DMSDL-QBSA-RF: data selection and preprocessing (remove redundant attributes and pretreatment) → attribute discretization → attribute reduction and data normalization → model building based on DMSDL-QBSA-RF → RF classification of the information to be identified → output.

Table 13. Distribution of the logging data.

| Well | Training depth (m) | Data | Oil/gas layers | Dry layer | Test depth (m) | Data | Oil/gas layers | Dry layer |
|---|---|---|---|---|---|---|---|---|
| W1 | 3027–3058 | 250 | 203 | 47 | 3250–3450 | 1600 | 237 | 1363 |
| W2 | 2642–2648 | 30 | 10 | 20 | 2940–2978 | 191 | 99 | 92 |
| W3 | 1040–1059 | 160 | 47 | 113 | 1120–1290 | 1114 | 96 | 1018 |

Table 12. Comparison of numerical testing results on eight UCI datasets.

| Dataset | RF (%) | BSA-RF (%) | DMSDL-BSA-RF (%) | DMSDL-QBSA-RF (%) |
|---|---|---|---|---|
| Blood | 75.40 | 80.80 | 79.46 | 81.25 |
| Heart-statlog | 82.10 | 86.42 | 86.42 | 86.42 |
| Sonar | 78.39 | 85.48 | 85.48 | 88.71 |
| Appendicitis | 83.23 | 87.10 | 90.32 | 93.55 |
| Cleve | 79.55 | 87.50 | 88.64 | 89.77 |
| Magic | 88.95 | 89.85 | 90.25 | 89.32 |
| Mammographic | 74.73 | 77.68 | 76.79 | 79.46 |
| Australian | 85.83 | 88.84 | 88.84 | 90.78 |

Table 14. Attribute reduction results of the logging dataset.

| Well | Actual attributes | Reduced attributes |
|---|---|---|
| W1 | GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI | GR, DT, SP, LLD, LLS, DEN, NPHI |
| W2 | DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW | NEUTRO, PERM, POR, RESI, SW |
| W3 | AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1 | AC, GR, RI, RXO, SP |

Figure 5. The normalized curves of attributes: (a) and (b) attribute normalization of W1 (curves NPHI, SP, DT, LLD and DEN, GR, LLS over depth 3250–3450 m); (c) and (d) attribute normalization of W2 (curves NEUTRO, PERM and RESI, SW, POR over depth 2940–2980 m); (e) and (f) attribute normalization of W3 (curves GR, SP, RI and RXO, AC over depth 1140–1300 m). In each panel the horizontal axis is depth (m) and the vertical axis is the normalized value (0–1).

Table 15. Performance of various well data.

| Well | Classification model | RMSE | MAE | Accuracy (%) | Running time (s) |
|---|---|---|---|---|---|
| W1 | RF | 0.3326 | 0.1106 | 88.94 | 1.5167 |
| W1 | SVM | 0.2681 | 0.0719 | 92.81 | 4.5861 |
| W1 | BSA-RF | 0.3269 | 0.1069 | 89.31 | 1.8064 |
| W1 | DMSDL-BSA-RF | 0.2806 | 0.0788 | 92.13 | 2.3728 |
| W1 | DMSDL-QBSA-RF | 0.2449 | 0.0600 | 94.00 | 1.4663 |
| W2 | RF | 0.4219 | 0.1780 | 82.20 | 3.1579 |
| W2 | SVM | 0.2983 | 0.0890 | 91.10 | 4.2604 |
| W2 | BSA-RF | 0.3963 | 0.1571 | 84.29 | 1.2979 |
| W2 | DMSDL-BSA-RF | 0.2506 | 0.062827 | 93.72 | 1.6124 |
| W2 | DMSDL-QBSA-RF | 0.2399 | 0.0576 | 94.24 | 1.2287 |
| W3 | RF | 0.4028 | 0.1622 | 83.78 | 2.4971 |
| W3 | SVM | 0.2507 | 0.0628 | 93.72 | 2.1027 |
| W3 | BSA-RF | 0.3631 | 0.1318 | 86.81 | 1.3791 |
| W3 | DMSDL-BSA-RF | 0.2341 | 0.0548 | 94.52 | 0.3125 |
| W3 | DMSDL-QBSA-RF | 0.0519 | 0.0027 | 99.73 | 0.9513 |


5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of the three strategies on the performance of DMSDL-QBSA revealed that the hybrid method has an excellent influence on improving the original BSA, and especially the original DE. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and DMSDL-QBSA again shows excellent performance on these functions. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than the original RF model, while running faster than the other four advanced classification models on most wells.

Although the proposed DMSDL-QBSA has been provento be effective in solving general optimization problemsDMSDL-QBSA has some shortcomings that warrant furtherinvestigation And in DMSDL-QBSA due to the hybrid of

three strategies DMSDL-QBSA has needed more time thanthe classical BSA +erefore deploying the proposed algo-rithm to increase recognition efficiency is a worthwhiledirection In the future research work the method presentedin this paper can also be extended to solving discrete op-timization problems and multiobjective optimizationproblems Furthermore applying the proposed DMSDL-QBSA-RF model to other fields such as financial predictionand biomedical science diagnosis is also an interesting futurework

Data Availability

All data included in this study are available upon request bycontact with the corresponding author

Conflicts of Interest

+e authors declare that they have no conflicts of interest

Acknowledgments

+is work was supported by the National Natural ScienceFoundation of China (no U1813222) Tianjin Natural Sci-ence Foundation China (no 18JCYBJC16500) Key Re-search and Development Project from Hebei ProvinceChina (nos 19210404D and 20351802D) Key ResearchProject of Science and Technology from the Ministry ofEducation of Hebei Province China (no ZD2019010) andProject of Industry-University Cooperative Education ofMinistry of Education of China (no 201801335014)

1

0

Labe

ls

2940 2945 2950 2955 2960Depth (m)

2965 2970 2975 2980

(c)

2940 2945 2950 2955 2960Depth (m)

2965 2970 2975 2980

1

0

Labe

ls

(d)

Labe

ls

1140 1160 1180 1200 1220Depth (m)

1240 1260 1280 13000

1

(e)

1140 1160 1180 1200 1220 1240 1260 1280 1300Depth (m)

Labe

ls

0

1

(f )

Figure 6 Classification of DMSDL-QBSA-RF (a) the actual oil layer distribution ofW1 (b) DMSDL-QBSA-RF oil layer distribution ofW1(c) the actual oil layer distribution of W2 (d) DMSDL-QBSA-RF oil layer distribution of 2 (e) the actual oil layer distribution of 3 (f )DMSDL-QBSA-RF oil layer distribution of 3

22 Computational Intelligence and Neuroscience



where yi and fi are the classification output value and the expected output value, respectively.

RMSE is used to evaluate the accuracy of each classification model, and MAE is used to show the actual forecasting errors. Table 15 records the performance indicator data of each classification model, with the best results marked in bold. The smaller the RMSE and MAE, the better the classification model performs.
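The two indicators can be sketched as follows (variable names are illustrative, not from the paper). For the binary 0/1 oil-layer labels used here, MAE reduces to the misclassification rate, so accuracy = 1 − MAE and RMSE = sqrt(MAE); this is consistent with Table 15 (e.g., for RF on W1, 0.3326² ≈ 0.1106 = 1 − 0.8894).

```python
import math

def rmse(y_pred, y_true):
    """Root-mean-square error between predicted and expected labels."""
    n = len(y_true)
    return math.sqrt(sum((p - t) ** 2 for p, t in zip(y_pred, y_true)) / n)

def mae(y_pred, y_true):
    """Mean absolute error between predicted and expected labels."""
    n = len(y_true)
    return sum(abs(p - t) for p, t in zip(y_pred, y_true)) / n

# Illustrative 0/1 oil-layer labels: 2 of 10 samples are misclassified,
# so MAE = 0.2 and RMSE = sqrt(0.2).
pred = [1, 0, 1, 1, 0, 0, 1, 0, 0, 0]
true = [1, 0, 1, 0, 0, 0, 1, 0, 1, 0]
print(mae(pred, true), rmse(pred, true))
```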

From the performance indicator data of each classification model in Table 15, we can see that the DMSDL-QBSA-RF classification model obtains the best recognition accuracy, and all its accuracies are above 90%. The recognition accuracy of the proposed classification model for W3 is up to 99.73%, and it also shows superior performance for oil and gas layer classification on the other performance indicators and wells. Secondly, DMSDL-QBSA can improve the performance of RF: the parameters found by DMSDL-QBSA and used in the RF classification model improve the classification accuracy while keeping the running speed relatively fast. For example, the running times of the DMSDL-QBSA-RF classification model for W1 and W2 are, respectively, 0.0504 seconds and 1.9292 seconds faster than the original RF classification model. Based on the above results, the proposed classification model is better than the traditional RF and SVM models in oil layer classification. The comparison of oil layer classification results is shown in Figure 6, where (a), (c), and (e) represent the actual oil layer distribution and (b), (d), and (f) represent the DMSDL-QBSA-RF oil layer distribution. In addition, 0 means that a depth has no oil or gas and 1 means that a depth has oil or gas.

From Figure 6, we can see that the oil layer distribution identified by the DMSDL-QBSA-RF classification model differs little from the actual well-test results: it can accurately identify the distribution of oil and gas in a well. The DMSDL-QBSA-RF model is therefore suitable for petroleum logging applications, greatly reducing the difficulty of oil exploration, and it has a good application prospect.
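The 0/1 label curves of Figure 6 can be turned back into depth intervals. A small illustrative helper (not from the paper; the depth step is an assumption) that groups consecutive depth samples labelled 1 into oil/gas intervals:

```python
def oil_intervals(depths, labels):
    """Group consecutive depth samples labelled 1 (oil/gas) into (start, end) intervals."""
    intervals, start = [], None
    for depth, label in zip(depths, labels):
        if label == 1 and start is None:
            start = depth                     # an oil/gas interval opens
        elif label == 0 and start is not None:
            intervals.append((start, prev))   # interval closes at the previous sample
            start = None
        prev = depth
    if start is not None:                     # interval still open at the last sample
        intervals.append((start, prev))
    return intervals

depths = [3250.0 + 0.125 * i for i in range(9)]  # 0.125 m sampling step (illustrative)
labels = [0, 1, 1, 1, 0, 0, 1, 1, 0]
print(oil_intervals(depths, labels))  # → [(3250.125, 3250.375), (3250.75, 3250.875)]
```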

Table 9 Comparison on 8 multimodal functions with popular algorithms (Dim = 10, FEs = 100,000)

Function  Term  GWO         WOA         SCA          GOA         SSA         DMSDL-QBSA
f10       Max   2.9544E+03  2.9903E+03  2.7629E+03   2.9445E+03  3.2180E+03  3.0032E+03
          Min   1.3037E+03  8.6892E-04  1.5713E+03   1.3438E+03  1.2839E-04  1.2922E+03
          Mean  1.6053E+03  1.4339E+01  1.7860E+03   1.8562E+03  2.5055E+02  1.2960E+03
          Var   2.6594E+02  1.2243E+02  1.6564E+02   5.1605E+02  3.3099E+02  5.8691E+01
f11       Max   1.3792E+02  1.2293E+02  1.2313E+02   1.1249E+02  3.2180E+03  1.8455E+01
          Min   0           0           0            1.2437E+01  1.2839E-04  0
          Mean  4.4220E-01  1.1252E+00  9.9316E+00   2.8378E+01  2.5055E+02  3.2676E-03
          Var   4.3784E+00  6.5162E+00  1.9180E+01   1.6240E+01  3.3099E+02  1.9867E-01
f12       Max   2.0257E+01  2.0043E+01  1.9440E+01   1.6623E+01  3.2180E+03  1.9113E+00
          Min   4.4409E-15  3.2567E-15  3.2567E-15   2.3168E+00  1.2839E-04  8.8818E-16
          Mean  1.7200E-02  4.2200E-02  8.8870E-01   5.5339E+00  2.5055E+02  3.1275E-04
          Var   4.7080E-01  6.4937E-01  3.0887E+00   2.8866E+00  3.3099E+02  2.2433E-02
f13       Max   1.5246E+02  1.6106E+02  1.1187E+02   6.1505E+01  3.2180E+03  3.1560E-01
          Min   3.3000E-03  0           0            2.4147E-01  1.2839E-04  0
          Mean  4.6733E-02  9.3867E-02  1.2094E+00   3.7540E+00  2.5055E+02  3.3660E-05
          Var   1.9297E+00  2.6570E+00  6.8476E+00   4.1936E+00  3.3099E+02  3.1721E-03
f14       Max   9.5993E+07  9.9026E+07  5.9355E+07   6.1674E+06  3.2180E+03  6.4903E-01
          Min   3.8394E-09  1.1749E-08  9.6787E-03   1.8099E-04  1.2839E-04  4.7116E-32
          Mean  1.2033E+04  3.5007E+04  4.8303E+05   1.0465E+04  2.5055E+02  8.9321E-05
          Var   9.8272E+05  1.5889E+06  4.0068E+06   1.9887E+05  3.3099E+02  6.8667E-03
f15       Max   2.2691E+08  2.4717E+08  1.1346E+08   2.8101E+07  3.2180E+03  1.6407E-01
          Min   3.2467E-02  4.5345E-08  1.1922E-01   3.5465E-05  1.2839E-04  1.3498E-32
          Mean  2.9011E+04  4.3873E+04  6.5529E+05   7.2504E+04  2.5055E+02  6.7357E-05
          Var   2.3526E+06  2.7453E+06  7.1864E+06   1.2814E+06  3.3099E+02  2.4333E-03
f16       Max   1.7692E+01  1.7142E+01  1.6087E+01   8.7570E+00  3.2180E+03  1.0959E+00
          Min   2.6210E-07  0.0000E+00  6.2663E-155  1.0497E-02  1.2839E-04  0
          Mean  1.0133E-02  3.9073E-01  5.9003E-01   2.4770E+00  2.5055E+02  1.5200E-04
          Var   3.0110E-01  9.6267E-01  1.4701E+00   1.9985E+00  3.3099E+02  1.1633E-02
f18       Max   4.4776E+01  4.3588E+01  3.9095E+01   1.7041E+01  3.2180E+03  6.5613E-01
          Min   1.9360E-01  9.4058E-08  2.2666E-01   3.9111E+00  1.2839E-04  1.4998E-32
          Mean  2.1563E-01  4.7800E-02  9.8357E-01   5.4021E+00  2.5055E+02  9.4518E-05
          Var   5.8130E-01  1.0434E+00  3.0643E+00   1.6674E+00  3.3099E+02  7.0251E-03

16 Computational Intelligence and Neuroscience

Table 10 Comparison of numerical testing results on CEC2014 test sets (F1–F15, Dim = 10)

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA
F1        Max   2.7411E+08  3.6316E+09  6.8993E+08  3.9664E+08  9.6209E+08
          Min   1.4794E+07  3.1738E+09  1.3205E+06  1.6929E+05  1.7687E+05
          Mean  3.1107E+07  3.2020E+09  6.7081E+06  1.3567E+06  1.9320E+06
          Var   1.2900E+07  4.0848E+07  2.6376E+07  1.0236E+07  1.3093E+07
F2        Max   8.7763E+09  1.3597E+10  1.3515E+10  1.6907E+10  1.7326E+10
          Min   1.7455E+09  1.1878E+10  3.6982E+04  2.1268E+05  1.4074E+05
          Mean  1.9206E+09  1.1951E+10  1.7449E+08  5.9354E+07  4.8412E+07
          Var   1.9900E+08  1.5615E+08  9.2365E+08  5.1642E+08  4.2984E+08
F3        Max   4.8974E+05  8.5863E+04  2.6323E+06  2.4742E+06  5.3828E+06
          Min   1.1067E+04  1.4616E+04  1.8878E+03  1.2967E+03  6.3986E+02
          Mean  1.3286E+04  1.4976E+04  1.0451E+04  3.5184E+03  3.0848E+03
          Var   6.5283E+03  1.9988E+03  4.5736E+04  4.1580E+04  8.0572E+04
F4        Max   3.5252E+03  9.7330E+03  4.5679E+03  4.9730E+03  4.3338E+03
          Min   5.0446E+02  8.4482E+03  4.0222E+02  4.0843E+02  4.1267E+02
          Mean  5.8061E+02  8.5355E+03  4.4199E+02  4.3908E+02  4.3621E+02
          Var   1.4590E+02  1.3305E+02  1.6766E+02  1.2212E+02  8.5548E+01
F5        Max   5.2111E+02  5.2106E+02  5.2075E+02  5.2098E+02  5.2110E+02
          Min   5.2001E+02  5.2038E+02  5.2027E+02  5.2006E+02  5.2007E+02
          Mean  5.2003E+02  5.2041E+02  5.2033E+02  5.2014E+02  5.2014E+02
          Var   7.8380E-02  5.7620E-02  5.6433E-02  1.0577E-01  9.9700E-02
F6        Max   6.1243E+02  6.1299E+02  6.1374E+02  6.1424E+02  6.1569E+02
          Min   6.0881E+02  6.1157E+02  6.0514E+02  6.0288E+02  6.0257E+02
          Mean  6.0904E+02  6.1164E+02  6.0604E+02  6.0401E+02  6.0358E+02
          Var   2.5608E-01  1.2632E-01  1.0434E+00  1.1717E+00  1.3117E+00
F7        Max   8.7895E+02  1.0459E+03  9.1355E+02  1.0029E+03  9.3907E+02
          Min   7.4203E+02  1.0238E+03  7.0013E+02  7.0081E+02  7.0069E+02
          Mean  7.4322E+02  1.0253E+03  7.1184E+02  7.0332E+02  7.0290E+02
          Var   4.3160E+00  2.4258E+00  2.8075E+01  1.5135E+01  1.3815E+01
F8        Max   8.9972E+02  9.1783E+02  9.6259E+02  9.2720E+02  9.3391E+02
          Min   8.4904E+02  8.8172E+02  8.3615E+02  8.1042E+02  8.0877E+02
          Mean  8.5087E+02  8.8406E+02  8.5213E+02  8.1888E+02  8.1676E+02
          Var   2.1797E+00  4.4494E+00  8.6249E+00  9.2595E+00  9.7968E+00
F9        Max   1.0082E+03  9.9538E+02  1.0239E+03  1.0146E+03  1.0366E+03
          Min   9.3598E+02  9.6805E+02  9.2725E+02  9.2062E+02  9.1537E+02
          Mean  9.3902E+02  9.7034E+02  9.4290E+02  9.2754E+02  9.2335E+02
          Var   3.4172E+00  3.2502E+00  8.1321E+00  9.0492E+00  9.3860E+00
F10       Max   3.1087E+03  3.5943E+03  3.9105E+03  3.9116E+03  3.4795E+03
          Min   2.2958E+03  2.8792E+03  1.5807E+03  1.4802E+03  1.4316E+03
          Mean  1.8668E+03  2.9172E+03  1.7627E+03  1.6336E+03  1.6111E+03
          Var   3.2703E+01  6.1787E+01  2.2359E+02  1.9535E+02  2.2195E+02
F11       Max   3.6641E+03  3.4628E+03  4.0593E+03  3.5263E+03  3.7357E+03
          Min   2.4726E+03  2.8210E+03  1.7855E+03  1.4012E+03  1.2811E+03
          Mean  2.5553E+03  2.8614E+03  1.8532E+03  1.5790E+03  1.4869E+03
          Var   8.4488E+01  5.6351E+01  1.3201E+02  2.1085E+02  2.6682E+02
F12       Max   1.2044E+03  1.2051E+03  1.2053E+03  1.2055E+03  1.2044E+03
          Min   1.2006E+03  1.2014E+03  1.2004E+03  1.2003E+03  1.2017E+03
          Mean  1.2007E+03  1.2017E+03  1.2007E+03  1.2003E+03  1.2018E+03
          Var   1.4482E-01  3.5583E-01  2.5873E-01  2.4643E-01  2.1603E-01
F13       Max   1.3049E+03  1.3072E+03  1.3073E+03  1.3056E+03  1.3061E+03
          Min   1.3005E+03  1.3067E+03  1.3009E+03  1.3003E+03  1.3004E+03
          Mean  1.3006E+03  1.3068E+03  1.3011E+03  1.3005E+03  1.3006E+03
          Var   3.2018E-01  5.0767E-02  4.2173E-01  3.6570E-01  2.9913E-01
F14       Max   1.4395E+03  1.4565E+03  1.4775E+03  1.4749E+03  1.4493E+03
          Min   1.4067E+03  1.4522E+03  1.4009E+03  1.4003E+03  1.4005E+03
          Mean  1.4079E+03  1.4529E+03  1.4024E+03  1.4008E+03  1.4009E+03
          Var   2.1699E+00  6.3013E-01  5.3198E+00  3.2578E+00  2.8527E+00
F15       Max   5.4068E+04  3.3586E+04  4.8370E+05  4.0007E+05  1.9050E+05
          Min   1.9611E+03  2.1347E+04  1.5029E+03  1.5027E+03  1.5026E+03
          Mean  1.9933E+03  2.2417E+04  2.7920E+03  1.5860E+03  1.5816E+03
          Var   6.1622E+02  1.2832E+03  1.7802E+04  4.4091E+03  2.7233E+03


Table 11 Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10)

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA
F16       Max   1.6044E+03  1.6044E+03  1.6046E+03  1.6044E+03  1.6046E+03
          Min   1.6034E+03  1.6041E+03  1.6028E+03  1.6021E+03  1.6019E+03
          Mean  1.6034E+03  1.6041E+03  1.6032E+03  1.6024E+03  1.6024E+03
          Var   3.3300E-02  3.5900E-02  2.5510E-01  3.7540E-01  3.6883E-01
F17       Max   1.3711E+07  1.6481E+06  3.3071E+07  4.6525E+07  8.4770E+06
          Min   1.2526E+05  4.7216E+05  2.7261E+03  2.0177E+03  2.0097E+03
          Mean  1.6342E+05  4.8499E+05  8.7769E+04  3.1084E+04  2.1095E+04
          Var   2.0194E+05  2.9949E+04  1.2284E+06  8.7638E+05  2.0329E+05
F18       Max   7.7173E+07  3.5371E+07  6.1684E+08  1.4216E+08  6.6050E+08
          Min   4.1934E+03  2.6103E+06  2.2743E+03  1.8192E+03  1.8288E+03
          Mean  1.6509E+04  3.4523E+06  1.2227E+06  1.0781E+05  1.9475E+05
          Var   7.9108E+05  1.4888E+06  2.1334E+07  3.6818E+06  1.0139E+07
F19       Max   1.9851E+03  2.5657E+03  2.0875E+03  1.9872E+03  1.9555E+03
          Min   1.9292E+03  2.4816E+03  1.9027E+03  1.9023E+03  1.9028E+03
          Mean  1.9299E+03  2.4834E+03  1.9044E+03  1.9032E+03  1.9036E+03
          Var   1.0820E+00  3.8009E+00  1.1111E+01  3.3514E+00  1.4209E+00
F20       Max   8.3021E+06  2.0160E+07  1.0350E+07  6.1162E+07  1.2708E+08
          Min   5.6288E+03  1.7838E+06  2.1570E+03  2.0408E+03  2.0241E+03
          Mean  1.2260E+04  1.8138E+06  5.6957E+03  1.1337E+04  4.5834E+04
          Var   1.0918E+05  5.9134E+05  1.3819E+05  6.4167E+05  2.1988E+06
F21       Max   2.4495E+06  1.7278E+09  2.0322E+06  1.3473E+07  2.6897E+07
          Min   4.9016E+03  1.4049E+09  3.3699E+03  2.1842E+03  2.2314E+03
          Mean  6.6613E+03  1.4153E+09  9.9472E+03  1.3972E+04  7.4587E+03
          Var   4.1702E+04  2.2557E+07  3.3942E+04  2.5098E+05  2.8735E+05
F22       Max   2.8304E+03  4.9894E+03  3.1817E+03  3.1865E+03  3.2211E+03
          Min   2.5070E+03  4.1011E+03  2.3492E+03  2.2442E+03  2.2314E+03
          Mean  2.5081E+03  4.1175E+03  2.3694E+03  2.2962E+03  2.2687E+03
          Var   5.4064E+00  3.8524E+01  4.3029E+01  4.8006E+01  5.1234E+01
F23       Max   2.8890E+03  2.6834E+03  2.8758E+03  3.0065E+03  2.9923E+03
          Min   2.5000E+03  2.6031E+03  2.4870E+03  2.5000E+03  2.5000E+03
          Mean  2.5004E+03  2.6088E+03  2.5326E+03  2.5010E+03  2.5015E+03
          Var   9.9486E+00  8.6432E+00  5.7045E+01  1.4213E+01  1.7156E+01
F24       Max   2.6293E+03  2.6074E+03  2.6565E+03  2.6491E+03  2.6369E+03
          Min   2.5816E+03  2.6049E+03  2.5557E+03  2.5246E+03  2.5251E+03
          Mean  2.5829E+03  2.6052E+03  2.5671E+03  2.5337E+03  2.5338E+03
          Var   1.3640E+00  3.3490E-01  8.9434E+00  1.3050E+01  1.1715E+01
F25       Max   2.7133E+03  2.7014E+03  2.7445E+03  2.7122E+03  2.7327E+03
          Min   2.6996E+03  2.7003E+03  2.6784E+03  2.6789E+03  2.6635E+03
          Mean  2.6996E+03  2.7004E+03  2.6831E+03  2.6903E+03  2.6894E+03
          Var   2.3283E-01  9.9333E-02  4.9609E+00  7.1175E+00  1.2571E+01
F26       Max   2.7056E+03  2.8003E+03  2.7058E+03  2.7447E+03  2.7116E+03
          Min   2.7008E+03  2.8000E+03  2.7003E+03  2.7002E+03  2.7002E+03
          Mean  2.7010E+03  2.8000E+03  2.7005E+03  2.7003E+03  2.7003E+03
          Var   3.1316E-01  1.6500E-02  3.9700E-01  1.1168E+00  3.5003E-01
F27       Max   3.4052E+03  5.7614E+03  3.3229E+03  3.4188E+03  3.4144E+03
          Min   2.9000E+03  3.9113E+03  2.9698E+03  2.8347E+03  2.7054E+03
          Mean  2.9038E+03  4.0351E+03  2.9816E+03  2.8668E+03  2.7219E+03
          Var   3.2449E+01  2.0673E+02  3.4400E+01  6.2696E+01  6.1325E+01
F28       Max   4.4333E+03  5.4138E+03  4.5480E+03  4.3490E+03  4.8154E+03
          Min   3.0000E+03  4.0092E+03  3.4908E+03  3.0000E+03  3.0000E+03
          Mean  3.0079E+03  4.0606E+03  3.6004E+03  3.0063E+03  3.0065E+03
          Var   6.8101E+01  9.2507E+01  8.5705E+01  4.2080E+01  5.3483E+01
F29       Max   2.5038E+07  1.6181E+08  5.9096E+07  7.0928E+07  6.4392E+07
          Min   3.2066E+03  1.0014E+08  3.1755E+03  3.1005E+03  3.3287E+03
          Mean  5.7057E+04  1.0476E+08  8.5591E+04  6.8388E+04  2.8663E+04
          Var   4.9839E+05  6.7995E+06  1.7272E+06  1.5808E+06  8.6740E+05


Figure 3 The effect of the two parameters on the performance of RF models: (a) the effect of ntree (accuracy versus ntree, 0–1000); (b) the effect of mtry (accuracy versus mtry, 0–35).

Variable settings: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF
(1)  Begin
(2)  /* Build classification model based on DMSDL-QBSA-RF */
(3)  Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4)  Calculate fitness f(Xi) (i = 1, 2, ..., N); set Xi to be Pi and find Pgbest
(5)  While iter < itermax + 1 do
(6)    For i = 1 to ntree
(7)      Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)      Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)      Recursively generate each decision tree without pruning operations
(10)   End For
(11)   Update the classification accuracy of RF: evaluate f(Xi)
(12)   Update gbest and Pgbest
(13)   [nbest, mbest] = gbest
(14)   iter = iter + 1
(15)  End While
(16)  /* Classify using the RF model */
(17)  For i = 1 to nbest
(18)    Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)    Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)    Recursively generate each decision tree without pruning operations
(21)  End For
(22)  Return the DMSDL-QBSA-RF classification model
(23)  Classify the test dataset using equation (20)
(24)  Calculate OOB_error
(25)  End

Algorithm 2: DMSDL-QBSA-RF classification model
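The outer optimization loop above can be sketched in a few lines. This is a simplified stand-in, not the authors' implementation: the DMSDL-QBSA position update is reduced to a random perturbation pulled toward the global best, and the out-of-bag fitness of a real random forest is replaced by a synthetic surface peaked at a known (ntree, mtry); to use it in practice, replace `oob_fitness` with training an RF and returning its out-of-bag accuracy.

```python
import random

random.seed(1)

def oob_fitness(ntree, mtry):
    # Stand-in for "train an RF with (ntree, mtry) and return OOB accuracy";
    # synthetic surface with its maximum at ntree = 400, mtry = 7 (assumed values).
    return -((ntree - 400) / 1000) ** 2 - ((mtry - 7) / 35) ** 2

N, iters = 10, 50
birds = [(random.randint(1, 1000), random.randint(1, 35)) for _ in range(N)]
gbest = max(birds, key=lambda b: oob_fitness(*b))

for _ in range(iters):
    new_birds = []
    for ntree, mtry in birds:
        # Move each bird toward the global best with a random perturbation
        # (crude replacement for the DMSDL-QBSA update rules).
        ntree = min(1000, max(1, ntree + random.randint(-50, 50) + (gbest[0] - ntree) // 2))
        mtry = min(35, max(1, mtry + random.randint(-3, 3) + (gbest[1] - mtry) // 2))
        new_birds.append((ntree, mtry))
    birds = new_birds
    best = max(birds, key=lambda b: oob_fitness(*b))
    if oob_fitness(*best) > oob_fitness(*gbest):
        gbest = best  # gbest only ever improves

nbest, mbest = gbest  # optimized number of trees and predictor variables
print(nbest, mbest)
```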

Table 11 Continued

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA
F30       Max   5.2623E+05  2.8922E+07  1.1938E+06  1.2245E+06  1.1393E+06
          Min   5.9886E+03  1.9017E+07  3.7874E+03  3.6290E+03  3.6416E+03
          Mean  7.4148E+03  2.0002E+07  5.5468E+03  4.3605E+03  4.1746E+03
          Var   1.1434E+04  1.1968E+06  3.2255E+04  2.2987E+04  1.8202E+04


Figure 4 Block diagram of the oil layer classification system based on DMSDL-QBSA-RF: to-be-identified information → data selection and preprocessing (remove redundant attributes and pretreatment) → attribute discretization → attribute reduction and data normalization → build model based on DMSDL-QBSA-RF → RF classification → output.

Table 13 Distribution of the logging data

      Training dataset                               Test dataset
Well  Depth (m)   Data  Oil/gas layers  Dry layer    Depth (m)   Data  Oil/gas layers  Dry layer
W1    3027-3058   250   203             47           3250-3450   1600  237             1363
W2    2642-2648   30    10              20           2940-2978   191   99              92
W3    1040-1059   160   47              113          1120-1290   1114  96              1018

Table 12 Comparison of numerical testing results on eight UCI datasets

Dataset        RF (%)  BSA-RF (%)  DMSDL-BSA-RF (%)  DMSDL-QBSA-RF (%)
Blood          75.40   80.80       79.46             81.25
Heart-statlog  82.10   86.42       86.42             86.42
Sonar          78.39   85.48       85.48             88.71
Appendicitis   83.23   87.10       90.32             93.55
Cleve          79.55   87.50       88.64             89.77
Magic          88.95   89.85       90.25             89.32
Mammographic   74.73   77.68       76.79             79.46
Australian     85.83   88.84       88.84             90.78

Table 14 Attribute reduction results of the logging dataset

Well  Attributes
W1    Actual attributes:    GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
      Reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2    Actual attributes:    DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
      Reduction attributes: NEUTRO, PERM, POR, RESI, SW
W3    Actual attributes:    AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
      Reduction attributes: AC, GR, RI, RXO, SP
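The paper does not spell out the normalization formula used after attribute reduction; a common choice consistent with the [0, 1] axis range of the curves in Figure 5 is min-max scaling, sketched below (the GR readings are illustrative, not logging data from the paper).

```python
def min_max_normalize(values):
    """Scale one logging attribute to the range [0, 1]."""
    lo, hi = min(values), max(values)
    if hi == lo:
        return [0.0 for _ in values]  # constant attribute carries no information
    return [(v - lo) / (hi - lo) for v in values]

gr = [45.2, 80.1, 120.7, 60.3]  # illustrative GR readings along depth
print(min_max_normalize(gr))
```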

Figure 5 The normalized curves of attributes: (a) and (b) attribute normalization of W1 (NPHI, SP, DT, LLD, DEN, GR, LLS); (c) and (d) attribute normalization of W2 (NEUTRO, PERM, RESI, SW, POR); (e) and (f) attribute normalization of W3 (GR, SP, RI, RXO, AC).

Table 15 Performance of various well data

Well  Classification model  RMSE    MAE     Accuracy (%)  Running time (s)
W1    RF                    0.3326  0.1106  88.94         1.5167
      SVM                   0.2681  0.0719  92.81         4.5861
      BSA-RF                0.3269  0.1069  89.31         1.8064
      DMSDL-BSA-RF          0.2806  0.0788  92.13         2.3728
      DMSDL-QBSA-RF         0.2449  0.0600  94.00         1.4663
W2    RF                    0.4219  0.1780  82.20         3.1579
      SVM                   0.2983  0.0890  91.10         4.2604
      BSA-RF                0.3963  0.1571  84.29         1.2979
      DMSDL-BSA-RF          0.2506  0.0628  93.72         1.6124
      DMSDL-QBSA-RF         0.2399  0.0576  94.24         1.2287
W3    RF                    0.4028  0.1622  83.78         2.4971
      SVM                   0.2507  0.0628  93.72         2.1027
      BSA-RF                0.3631  0.1318  86.81         1.3791
      DMSDL-BSA-RF          0.2341  0.0548  94.52         0.3125
      DMSDL-QBSA-RF         0.0519  0.0027  99.73         0.9513


5 Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of these three strategies on the performance of DMSDL-QBSA revealed that the hybrid method has an excellent influence on improving the original GOA and especially the original DE. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and DMSDL-QBSA again shows excellent performance. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than the original RF model, while running faster than the other four advanced classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Due to the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA. Therefore, deploying the proposed algorithm so as to increase recognition efficiency is a worthwhile direction. In future research work, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical science diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and Project of Industry-University Cooperative Education of Ministry of Education of China (no. 201801335014).

Figure 6 Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3.


[17] H Peng Y He and C Deng ldquoFirefly algorithm with lucif-erase inhibition mechanismrdquo IEEE Access vol 7pp 120189ndash120201 2019

[18] H Peng C Deng and Z Wu ldquoBest neighbor guided artificialbee colony algorithm for continuous optimization problemsrdquoSoft Computing vol 23 no 18 pp 8723ndash8740 2019

[19] C Xu and R Yang ldquoParameter estimation for chaotic systemsusing improved bird awarm algorithmrdquo Modern PhysicsLetters B vol 31 no 36 pp 1ndash15 2017

[20] J Yang and Y Liu ldquoSeparation of signals with same frequencybased on improved bird swarm algorithmrdquo in Proceedings ofthe International Symposium on Distributed Computing andApplications for Business Engineering and Science WuxiChina August 2018

[21] X Wang Y Deng and H Duan ldquoEdge-based target detectionfor unmanned aerial vehicles using competitive bird swarmalgorithmrdquo Aerospace Science amp Technology vol 78pp 708ndash720 2018

[22] L Breiman ldquoRandom forestsrdquo Machine Learning vol 45no 1 pp 5ndash32 2001

[23] L Ma and S Fan ldquoCURE-SMOTE algorithm and hybridalgorithm for feature selection and parameter optimizationbased on random forestsrdquo BMC Bioinformatics vol 18 no 1pp 1ndash18 2017

[24] J Hou Q Li and S Meng ldquoA differential privacy protectionrandom forestrdquo IEEE Access vol 7 pp 130707ndash130720 2019

[25] G L Liu S J Han and X H Zhao ldquoOptimisation algorithmsfor spatially constrained forest planningrdquo Ecological Model-ling vol 194 no 4 pp 421ndash428 2006

[26] D Gong B Xu and Y Zhang ldquoA similarity-based cooper-ative co-evolutionary algorithm for dynamic interval multi-objective optimization problemsrdquo in Proceedings of the IEEETransactions on Evolutionary Computation Piscataway NJUSA February 2019

[27] M Rong D Gong and W Pedrycz ldquoA multi-model pre-diction method for dynamic multi-objective evolutionaryoptimizationrdquo IEEE Transactions on Evolutionary Compu-tation vol 20 no 24 pp 290ndash304 2020

[28] B Xu Y Zhang D Gong Y Guo and M Rong ldquoEnvi-ronment sensitivity-based cooperative co-evolutionary algo-rithms for dynamic multi-objective optimizationrdquo IEEE-ACM Transactions on Computational Biology and Bio-informatics vol 15 no 6 pp 1877ndash1890 2018

[29] Y Guo H Yang M Chen J Cheng and D Gong ldquoEnsembleprediction-based dynamic robust multi-objective optimiza-tionmethodsrdquo Swarm and Evolutionary Computation vol 48pp 156ndash171 2019

[30] J J Liang and P N Suganthan ldquoDynamic multi-swarmparticle swarm optimizer with local searchrdquo in Proceedings ofthe IEEE Congress on Evolutionary Computation EdinburghScotland 2005

[31] Y Chen L Li and H Peng ldquoDynamic multi-swarm differ-ential learning particle swarm optimizerrdquo Swarm and Evo-lutionary Computation vol 39 pp 209ndash221 2018

[32] R Storn ldquoDifferential evolution design of an IIR-filterrdquo inProceedings of the IEEE International Conference on Evolu-tionary Computation (ICEC 96) Nagoya Japan May 1996

[33] Z Liu X Ji and Y Yang ldquoHierarchical differential evolutionalgorithm combined with multi-cross operationrdquo ExpertSystems with Applications vol 130 pp 276ndash292 2019

[34] H Peng Z Guo and C Deng ldquoEnhancing differentialevolution with random neighbors based strategyrdquo Journal ofComputational Science vol 26 pp 501ndash511 2018

[35] J Sun W Fang X Wu V Palade and W Xu ldquoQuantum-behaved particle swarm optimization analysis of individualparticle behavior and parameter selectionrdquo EvolutionaryComputation vol 20 no 3 pp 349ndash393 2012

Computational Intelligence and Neuroscience 23

[36] B Chen H Lei H Shen Y Liu and Y Lu ldquoA hybridquantum-based PIO algorithm for global numerical optimi-zationrdquo Science ChinandashInformation Sciences vol 62 no 72019

[37] X Yao Y Liu and G Lin ldquoEvolutionary programming madefasterrdquo IEEE Transactions on Evolutionary Computationvol 3 no 2 pp 82ndash102 1999

[38] S Mirjalili S M Mirjalili and A Lewis ldquoGrey wolf opti-mizerrdquo Advances in Engineering Software vol 69 pp 46ndash612014

[39] S Mirjalili and A Lewis ldquo+e whale optimization algorithmrdquoAdvances in Engineering Software vol 95 pp 51ndash67 2016

[40] S Mirjalili ldquoSCA a sine cosine algorithm for solving opti-mization problemsrdquo Knowledge Based Systems vol 95pp 120ndash133 2016

[41] S Shahrzad M Seyedali and L Andrew ldquoGrasshopper op-timisation algorithm theory and applicationrdquo Advances inEngineering Software vol 105 pp 30ndash47 2017

[42] J Xue and B Shen ldquoA novel swarm intelligence optimizationapproach sparrow search algorithmrdquo Systems Science ampControl Engineering An Open Access Journal vol 8 no 1pp 22ndash34 2020

[43] J Bai K Xia Y Lin and PWu ldquoAttribute reduction based onconsistent covering rough set and its applicationrdquoComplexityvol 2017 Article ID 8986917 9 pages 2017

24 Computational Intelligence and Neuroscience

Page 17: DynamicMulti-SwarmDifferentialLearningQuantumBird ...downloads.hindawi.com/journals/cin/2020/6858541.pdfexploitation process of the original seeker optimization algorithm (SOA) approach

Table 10 Comparison of numerical testing results on CEC2014 test sets (F1–F15, Dim = 10)

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA
F1        Max   2.7411E+08  3.6316E+09  6.8993E+08  3.9664E+08  9.6209E+08
          Min   1.4794E+07  3.1738E+09  1.3205E+06  1.6929E+05  1.7687E+05
          Mean  3.1107E+07  3.2020E+09  6.7081E+06  1.3567E+06  1.9320E+06
          Var   1.2900E+07  4.0848E+07  2.6376E+07  1.0236E+07  1.3093E+07
F2        Max   8.7763E+09  1.3597E+10  1.3515E+10  1.6907E+10  1.7326E+10
          Min   1.7455E+09  1.1878E+10  3.6982E+04  2.1268E+05  1.4074E+05
          Mean  1.9206E+09  1.1951E+10  1.7449E+08  5.9354E+07  4.8412E+07
          Var   1.9900E+08  1.5615E+08  9.2365E+08  5.1642E+08  4.2984E+08
F3        Max   4.8974E+05  8.5863E+04  2.6323E+06  2.4742E+06  5.3828E+06
          Min   1.1067E+04  1.4616E+04  1.8878E+03  1.2967E+03  6.3986E+02
          Mean  1.3286E+04  1.4976E+04  1.0451E+04  3.5184E+03  3.0848E+03
          Var   6.5283E+03  1.9988E+03  4.5736E+04  4.1580E+04  8.0572E+04
F4        Max   3.5252E+03  9.7330E+03  4.5679E+03  4.9730E+03  4.3338E+03
          Min   5.0446E+02  8.4482E+03  4.0222E+02  4.0843E+02  4.1267E+02
          Mean  5.8061E+02  8.5355E+03  4.4199E+02  4.3908E+02  4.3621E+02
          Var   1.4590E+02  1.3305E+02  1.6766E+02  1.2212E+02  8.5548E+01
F5        Max   5.2111E+02  5.2106E+02  5.2075E+02  5.2098E+02  5.2110E+02
          Min   5.2001E+02  5.2038E+02  5.2027E+02  5.2006E+02  5.2007E+02
          Mean  5.2003E+02  5.2041E+02  5.2033E+02  5.2014E+02  5.2014E+02
          Var   7.8380E-02  5.7620E-02  5.6433E-02  1.0577E-01  9.9700E-02
F6        Max   6.1243E+02  6.1299E+02  6.1374E+02  6.1424E+02  6.1569E+02
          Min   6.0881E+02  6.1157E+02  6.0514E+02  6.0288E+02  6.0257E+02
          Mean  6.0904E+02  6.1164E+02  6.0604E+02  6.0401E+02  6.0358E+02
          Var   2.5608E-01  1.2632E-01  1.0434E+00  1.1717E+00  1.3117E+00
F7        Max   8.7895E+02  1.0459E+03  9.1355E+02  1.0029E+03  9.3907E+02
          Min   7.4203E+02  1.0238E+03  7.0013E+02  7.0081E+02  7.0069E+02
          Mean  7.4322E+02  1.0253E+03  7.1184E+02  7.0332E+02  7.0290E+02
          Var   4.3160E+00  2.4258E+00  2.8075E+01  1.5135E+01  1.3815E+01
F8        Max   8.9972E+02  9.1783E+02  9.6259E+02  9.2720E+02  9.3391E+02
          Min   8.4904E+02  8.8172E+02  8.3615E+02  8.1042E+02  8.0877E+02
          Mean  8.5087E+02  8.8406E+02  8.5213E+02  8.1888E+02  8.1676E+02
          Var   2.1797E+00  4.4494E+00  8.6249E+00  9.2595E+00  9.7968E+00
F9        Max   1.0082E+03  9.9538E+02  1.0239E+03  1.0146E+03  1.0366E+03
          Min   9.3598E+02  9.6805E+02  9.2725E+02  9.2062E+02  9.1537E+02
          Mean  9.3902E+02  9.7034E+02  9.4290E+02  9.2754E+02  9.2335E+02
          Var   3.4172E+00  3.2502E+00  8.1321E+00  9.0492E+00  9.3860E+00
F10       Max   3.1087E+03  3.5943E+03  3.9105E+03  3.9116E+03  3.4795E+03
          Min   2.2958E+03  2.8792E+03  1.5807E+03  1.4802E+03  1.4316E+03
          Mean  1.8668E+03  2.9172E+03  1.7627E+03  1.6336E+03  1.6111E+03
          Var   3.2703E+01  6.1787E+01  2.2359E+02  1.9535E+02  2.2195E+02
F11       Max   3.6641E+03  3.4628E+03  4.0593E+03  3.5263E+03  3.7357E+03
          Min   2.4726E+03  2.8210E+03  1.7855E+03  1.4012E+03  1.2811E+03
          Mean  2.5553E+03  2.8614E+03  1.8532E+03  1.5790E+03  1.4869E+03
          Var   8.4488E+01  5.6351E+01  1.3201E+02  2.1085E+02  2.6682E+02
F12       Max   1.2044E+03  1.2051E+03  1.2053E+03  1.2055E+03  1.2044E+03
          Min   1.2006E+03  1.2014E+03  1.2004E+03  1.2003E+03  1.2017E+03
          Mean  1.2007E+03  1.2017E+03  1.2007E+03  1.2003E+03  1.2018E+03
          Var   1.4482E-01  3.5583E-01  2.5873E-01  2.4643E-01  2.1603E-01
F13       Max   1.3049E+03  1.3072E+03  1.3073E+03  1.3056E+03  1.3061E+03
          Min   1.3005E+03  1.3067E+03  1.3009E+03  1.3003E+03  1.3004E+03
          Mean  1.3006E+03  1.3068E+03  1.3011E+03  1.3005E+03  1.3006E+03
          Var   3.2018E-01  5.0767E-02  4.2173E-01  3.6570E-01  2.9913E-01
F14       Max   1.4395E+03  1.4565E+03  1.4775E+03  1.4749E+03  1.4493E+03
          Min   1.4067E+03  1.4522E+03  1.4009E+03  1.4003E+03  1.4005E+03
          Mean  1.4079E+03  1.4529E+03  1.4024E+03  1.4008E+03  1.4009E+03
          Var   2.1699E+00  6.3013E-01  5.3198E+00  3.2578E+00  2.8527E+00
F15       Max   5.4068E+04  3.3586E+04  4.8370E+05  4.0007E+05  1.9050E+05
          Min   1.9611E+03  2.1347E+04  1.5029E+03  1.5027E+03  1.5026E+03
          Mean  1.9933E+03  2.2417E+04  2.7920E+03  1.5860E+03  1.5816E+03
          Var   6.1622E+02  1.2832E+03  1.7802E+04  4.4091E+03  2.7233E+03

Computational Intelligence and Neuroscience 17

Table 11 Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10)

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA
F16       Max   1.6044E+03  1.6044E+03  1.6046E+03  1.6044E+03  1.6046E+03
          Min   1.6034E+03  1.6041E+03  1.6028E+03  1.6021E+03  1.6019E+03
          Mean  1.6034E+03  1.6041E+03  1.6032E+03  1.6024E+03  1.6024E+03
          Var   3.3300E-02  3.5900E-02  2.5510E-01  3.7540E-01  3.6883E-01
F17       Max   1.3711E+07  1.6481E+06  3.3071E+07  4.6525E+07  8.4770E+06
          Min   1.2526E+05  4.7216E+05  2.7261E+03  2.0177E+03  2.0097E+03
          Mean  1.6342E+05  4.8499E+05  8.7769E+04  3.1084E+04  2.1095E+04
          Var   2.0194E+05  2.9949E+04  1.2284E+06  8.7638E+05  2.0329E+05
F18       Max   7.7173E+07  3.5371E+07  6.1684E+08  1.4216E+08  6.6050E+08
          Min   4.1934E+03  2.6103E+06  2.2743E+03  1.8192E+03  1.8288E+03
          Mean  1.6509E+04  3.4523E+06  1.2227E+06  1.0781E+05  1.9475E+05
          Var   7.9108E+05  1.4888E+06  2.1334E+07  3.6818E+06  1.0139E+07
F19       Max   1.9851E+03  2.5657E+03  2.0875E+03  1.9872E+03  1.9555E+03
          Min   1.9292E+03  2.4816E+03  1.9027E+03  1.9023E+03  1.9028E+03
          Mean  1.9299E+03  2.4834E+03  1.9044E+03  1.9032E+03  1.9036E+03
          Var   1.0820E+00  3.8009E+00  1.1111E+01  3.3514E+00  1.4209E+00
F20       Max   8.3021E+06  2.0160E+07  1.0350E+07  6.1162E+07  1.2708E+08
          Min   5.6288E+03  1.7838E+06  2.1570E+03  2.0408E+03  2.0241E+03
          Mean  1.2260E+04  1.8138E+06  5.6957E+03  1.1337E+04  4.5834E+04
          Var   1.0918E+05  5.9134E+05  1.3819E+05  6.4167E+05  2.1988E+06
F21       Max   2.4495E+06  1.7278E+09  2.0322E+06  1.3473E+07  2.6897E+07
          Min   4.9016E+03  1.4049E+09  3.3699E+03  2.1842E+03  2.2314E+03
          Mean  6.6613E+03  1.4153E+09  9.9472E+03  1.3972E+04  7.4587E+03
          Var   4.1702E+04  2.2557E+07  3.3942E+04  2.5098E+05  2.8735E+05
F22       Max   2.8304E+03  4.9894E+03  3.1817E+03  3.1865E+03  3.2211E+03
          Min   2.5070E+03  4.1011E+03  2.3492E+03  2.2442E+03  2.2314E+03
          Mean  2.5081E+03  4.1175E+03  2.3694E+03  2.2962E+03  2.2687E+03
          Var   5.4064E+00  3.8524E+01  4.3029E+01  4.8006E+01  5.1234E+01
F23       Max   2.8890E+03  2.6834E+03  2.8758E+03  3.0065E+03  2.9923E+03
          Min   2.5000E+03  2.6031E+03  2.4870E+03  2.5000E+03  2.5000E+03
          Mean  2.5004E+03  2.6088E+03  2.5326E+03  2.5010E+03  2.5015E+03
          Var   9.9486E+00  8.6432E+00  5.7045E+01  1.4213E+01  1.7156E+01
F24       Max   2.6293E+03  2.6074E+03  2.6565E+03  2.6491E+03  2.6369E+03
          Min   2.5816E+03  2.6049E+03  2.5557E+03  2.5246E+03  2.5251E+03
          Mean  2.5829E+03  2.6052E+03  2.5671E+03  2.5337E+03  2.5338E+03
          Var   1.3640E+00  3.3490E-01  8.9434E+00  1.3050E+01  1.1715E+01
F25       Max   2.7133E+03  2.7014E+03  2.7445E+03  2.7122E+03  2.7327E+03
          Min   2.6996E+03  2.7003E+03  2.6784E+03  2.6789E+03  2.6635E+03
          Mean  2.6996E+03  2.7004E+03  2.6831E+03  2.6903E+03  2.6894E+03
          Var   2.3283E-01  9.9333E-02  4.9609E+00  7.1175E+00  1.2571E+01
F26       Max   2.7056E+03  2.8003E+03  2.7058E+03  2.7447E+03  2.7116E+03
          Min   2.7008E+03  2.8000E+03  2.7003E+03  2.7002E+03  2.7002E+03
          Mean  2.7010E+03  2.8000E+03  2.7005E+03  2.7003E+03  2.7003E+03
          Var   3.1316E-01  1.6500E-02  3.9700E-01  1.1168E+00  3.5003E-01
F27       Max   3.4052E+03  5.7614E+03  3.3229E+03  3.4188E+03  3.4144E+03
          Min   2.9000E+03  3.9113E+03  2.9698E+03  2.8347E+03  2.7054E+03
          Mean  2.9038E+03  4.0351E+03  2.9816E+03  2.8668E+03  2.7219E+03
          Var   3.2449E+01  2.0673E+02  3.4400E+01  6.2696E+01  6.1325E+01
F28       Max   4.4333E+03  5.4138E+03  4.5480E+03  4.3490E+03  4.8154E+03
          Min   3.0000E+03  4.0092E+03  3.4908E+03  3.0000E+03  3.0000E+03
          Mean  3.0079E+03  4.0606E+03  3.6004E+03  3.0063E+03  3.0065E+03
          Var   6.8101E+01  9.2507E+01  8.5705E+01  4.2080E+01  5.3483E+01
F29       Max   2.5038E+07  1.6181E+08  5.9096E+07  7.0928E+07  6.4392E+07
          Min   3.2066E+03  1.0014E+08  3.1755E+03  3.1005E+03  3.3287E+03
          Mean  5.7057E+04  1.0476E+08  8.5591E+04  6.8388E+04  2.8663E+04
          Var   4.9839E+05  6.7995E+06  1.7272E+06  1.5808E+06  8.6740E+05


Figure 3 The effect of the two parameters on the performance of RF models: (a) the effect of ntree (accuracy 0.86–0.98 over ntree 0–1000); (b) the effect of mtry (accuracy 0.89–0.95 over mtry 0–35)

Variable setting: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF
(1) Begin
(2)   /* Build classification model based on DMSDL-QBSA-RF */
(3)   Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4)   Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5)   While iter < itermax + 1 do
(6)     For i = 1 to ntree
(7)       Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)       Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)       Recursively generate each decision tree without pruning operations
(10)    End For
(11)    Update the classification accuracy of RF: evaluate f(Xi)
(12)    Update gbest and Pgbest
(13)    [nbest, mbest] = gbest
(14)    iter = iter + 1
(15)  End While
(16)  /* Classify using the RF model */
(17)  For i = 1 to nbest
(18)    Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)    Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)    Recursively generate each decision tree without pruning operations
(21)  End For
(22)  Return the DMSDL-QBSA-RF classification model
(23)  Classify the test dataset using equation (20)
(24)  Calculate OOB_error
(25) End

ALGORITHM 2 DMSDL-QBSA-RF classification model
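The outer optimization loop of Algorithm 2 can be sketched in plain Python. This is a minimal illustration, not the authors' implementation: the random-forest fitness is replaced by a hypothetical stand-in function, and only a plain move-toward-best swarm update is shown (the dynamic multi-swarm, differential-learning, and quantum steps are omitted).

```python
import random

def fitness(ntree, mtry):
    # Hypothetical stand-in for RF classification accuracy; in the paper
    # this would be the accuracy of a random forest trained with ntree
    # trees and mtry candidate split attributes.
    return -((ntree - 400) ** 2 / 1e5 + (mtry - 7) ** 2)

def swarm_search(n_birds=10, iters=50, seed=1):
    rng = random.Random(seed)
    # Each bird encodes a candidate (ntree, mtry) pair.
    birds = [(rng.randint(10, 1000), rng.randint(1, 35)) for _ in range(n_birds)]
    best = max(birds, key=lambda b: fitness(*b))
    for _ in range(iters):
        new_birds = []
        for ntree, mtry in birds:
            # Move each bird toward the global best with a random step size.
            ntree += int((best[0] - ntree) * rng.random())
            mtry += int((best[1] - mtry) * rng.random())
            new_birds.append((max(10, ntree), max(1, min(35, mtry))))
        birds = new_birds
        cand = max(birds, key=lambda b: fitness(*b))
        if fitness(*cand) > fitness(*best):
            best = cand
    return best  # the tuned (nbest, mbest) pair

nbest, mbest = swarm_search()
print(nbest, mbest)
```

The returned pair plays the role of (nbest, mbest) in steps (13) and (17)–(21) of Algorithm 2: the final forest is rebuilt with those values.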

Table 11 Continued

Function  Term  BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA
F30       Max   5.2623E+05  2.8922E+07  1.1938E+06  1.2245E+06  1.1393E+06
          Min   5.9886E+03  1.9017E+07  3.7874E+03  3.6290E+03  3.6416E+03
          Mean  7.4148E+03  2.0002E+07  5.5468E+03  4.3605E+03  4.1746E+03
          Var   1.1434E+04  1.1968E+06  3.2255E+04  2.2987E+04  1.8202E+04


Figure 4 Block diagram of the oil layer classification system based on DMSDL-QBSA-RF (stages: data selection and preprocessing; removal of redundant attributes and pretreatment; attribute discretization; attribute reduction and data normalization; model building based on DMSDL-QBSA-RF; RF classification of the information to be identified; output)
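The stages of the Figure 4 block diagram can be read as a chain of data transformations. The sketch below is schematic only; the function bodies are illustrative placeholders, not the paper's actual preprocessing rules:

```python
def preprocess(raw):
    # Data selection and preprocessing: drop missing readings.
    return [r for r in raw if r is not None]

def discretize(data, step=10.0):
    # Attribute discretization: snap each value to a grid of width `step`.
    return [round(v / step) * step for v in data]

def normalize(data):
    # Data normalization to [0, 1] (attribute reduction omitted here).
    lo, hi = min(data), max(data)
    return [(v - lo) / (hi - lo) for v in data]

def pipeline(raw):
    # Compose the stages in the order of the block diagram.
    return normalize(discretize(preprocess(raw)))

print(pipeline([55.2, None, 80.1, 62.7, 91.4]))
```

The normalized output would then be fed to the DMSDL-QBSA-RF model for classification.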

Table 13 Distribution of the logging data

Well  Training dataset                               Test dataset
      Depth (m)  Data  Oil/gas layers  Dry layer     Depth (m)  Data  Oil/gas layers  Dry layer
W1    3027–3058  250   203             47            3250–3450  1600  237             1363
W2    2642–2648  30    10              20            2940–2978  191   99              92
W3    1040–1059  160   47              113           1120–1290  1114  96              1018

Table 12 Comparison of numerical testing results on eight UCI datasets

Dataset        RF (%)  BSA-RF (%)  DMSDL-BSA-RF (%)  DMSDL-QBSA-RF (%)
Blood          75.40   80.80       79.46             81.25
Heart-statlog  82.10   86.42       86.42             86.42
Sonar          78.39   85.48       85.48             88.71
Appendicitis   83.23   87.10       90.32             93.55
Cleve          79.55   87.50       88.64             89.77
Magic          88.95   89.85       90.25             89.32
Mammographic   74.73   77.68       76.79             79.46
Australian     85.83   88.84       88.84             90.78

Table 14 Attribute reduction results of the logging dataset

W1  Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
    Reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2  Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
    Reduction attributes: NEUTRO, PERM, POR, RESI, SW
W3  Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
    Reduction attributes: AC, GR, RI, RXO, SP

Figure 5 The normalized curves of attributes: (a, b) attribute normalization of W1 (NPHI, SP, DT, LLD, DEN, GR, LLS over depth 3250–3450 m); (c, d) attribute normalization of W2 (NEUTRO, PERM, RESI, SW, POR over depth 2940–2980 m); (e, f) attribute normalization of W3 (GR, SP, RI, RXO, AC over depth 1150–1300 m)
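The attribute curves in Figure 5 are scaled to the [0, 1] range. A common way to produce such curves is min-max normalization, sketched below; the paper does not state its exact normalization formula, so this is an assumption, and the GR readings are made up:

```python
def min_max_normalize(values):
    """Scale a list of attribute readings to the [0, 1] range."""
    lo, hi = min(values), max(values)
    if hi == lo:  # constant curve: map everything to 0
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

gr_log = [55.2, 80.1, 62.7, 91.4, 70.0]  # hypothetical GR readings
print(min_max_normalize(gr_log))
```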

Table 15 Performance of various well data

Well  Classification model  RMSE    MAE       Accuracy (%)  Running time (s)
W1    RF                    0.3326  0.1106    88.94         1.5167
      SVM                   0.2681  0.0719    92.81         4.5861
      BSA-RF                0.3269  0.1069    89.31         1.8064
      DMSDL-BSA-RF          0.2806  0.0788    92.13         2.3728
      DMSDL-QBSA-RF         0.2449  0.0600    94.00         1.4663
W2    RF                    0.4219  0.1780    82.20         3.1579
      SVM                   0.2983  0.0890    91.10         4.2604
      BSA-RF                0.3963  0.1571    84.29         1.2979
      DMSDL-BSA-RF          0.2506  0.062827  93.72         1.6124
      DMSDL-QBSA-RF         0.2399  0.0576    94.24         1.2287
W3    RF                    0.4028  0.1622    83.78         2.4971
      SVM                   0.2507  0.0628    93.72         2.1027
      BSA-RF                0.3631  0.1318    86.81         1.3791
      DMSDL-BSA-RF          0.2341  0.0548    94.52         0.3125
      DMSDL-QBSA-RF         0.0519  0.0027    99.73         0.9513
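The RMSE, MAE, and accuracy columns of Table 15 can be computed from predicted and actual 0/1 layer labels as follows; this is a generic sketch with made-up labels, not the paper's data:

```python
import math

def evaluate(actual, predicted):
    """Return (RMSE, MAE, accuracy) for paired label sequences."""
    n = len(actual)
    errors = [a - p for a, p in zip(actual, predicted)]
    rmse = math.sqrt(sum(e * e for e in errors) / n)
    mae = sum(abs(e) for e in errors) / n
    accuracy = sum(a == p for a, p in zip(actual, predicted)) / n
    return rmse, mae, accuracy

actual    = [1, 0, 1, 1, 0, 0, 1, 0]   # made-up oil-layer labels
predicted = [1, 0, 1, 0, 0, 0, 1, 1]
rmse, mae, acc = evaluate(actual, predicted)
print(rmse, mae, acc)
```

Note that for strictly 0/1 labels MAE equals the misclassification rate, RMSE is its square root, and accuracy is 1 minus MAE, which is consistent with the rows of Table 15.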


5 Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method; the experimental study of the effects of the three strategies revealed that the hybrid method markedly improves the performance of the original BSA. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and DMSDL-QBSA again shows excellent performance. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than the original RF model, while running faster than the other four classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Because it hybridizes three strategies, DMSDL-QBSA requires more computation time than the classical BSA, so improving its efficiency is a worthwhile direction. In future work, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical diagnosis, is also interesting future work.

Data Availability

All data included in this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), the Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), the Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), the Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and the Project of Industry-University Cooperative Education of the Ministry of Education of China (no. 201801335014).

Figure 6 Classification of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) DMSDL-QBSA-RF oil layer distribution of W3


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.
[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.
[4] X. Li, A new intelligent optimization method: artificial fish school algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.
[5] X. S. Yang, "Firefly algorithms for multimodal optimization: stochastic algorithms, foundations and applications," in Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.
[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.
[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673–687, 2016.
[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University - Computer and Information Sciences, vol. 32, pp. 514–521, 2020.
[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," Neural Computing and Applications, Springer, New York, NY, USA, 2019.
[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.
[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.
[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197–207, 2014.
[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.
[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17–28, 2014.
[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137–146, 2012.
[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, Multi-Strategy Brainstorm Optimization Algorithm with Dynamic Parameters Adjustment, Applied Intelligence, Guildford, UK, 2010.
[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189–120201, 2019.
[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.
[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.
[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.
[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.
[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.
[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.
[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.
[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.
[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," IEEE Transactions on Evolutionary Computation, 2019.
[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.
[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.
[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.
[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.
[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.
[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.
[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.
[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.
[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.
[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China - Information Sciences, vol. 62, no. 7, 2019.
[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.
[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.
[41] S. Shahrzad, M. Seyedali, and L. Andrew, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.
[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22–34, 2020.
[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.

24 Computational Intelligence and Neuroscience

Page 18: DynamicMulti-SwarmDifferentialLearningQuantumBird ...downloads.hindawi.com/journals/cin/2020/6858541.pdfexploitation process of the original seeker optimization algorithm (SOA) approach

Table 11: Comparison of numerical testing results on CEC2014 test sets (F16–F30, Dim = 10).

Function  Term   BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA

F16       Max    1.6044E+03  1.6044E+03  1.6046E+03  1.6044E+03  1.6046E+03
          Min    1.6034E+03  1.6041E+03  1.6028E+03  1.6021E+03  1.6019E+03
          Mean   1.6034E+03  1.6041E+03  1.6032E+03  1.6024E+03  1.6024E+03
          Var    3.3300E-02  3.5900E-02  2.5510E-01  3.7540E-01  3.6883E-01

F17       Max    1.3711E+07  1.6481E+06  3.3071E+07  4.6525E+07  8.4770E+06
          Min    1.2526E+05  4.7216E+05  2.7261E+03  2.0177E+03  2.0097E+03
          Mean   1.6342E+05  4.8499E+05  8.7769E+04  3.1084E+04  2.1095E+04
          Var    2.0194E+05  2.9949E+04  1.2284E+06  8.7638E+05  2.0329E+05

F18       Max    7.7173E+07  3.5371E+07  6.1684E+08  1.4216E+08  6.6050E+08
          Min    4.1934E+03  2.6103E+06  2.2743E+03  1.8192E+03  1.8288E+03
          Mean   1.6509E+04  3.4523E+06  1.2227E+06  1.0781E+05  1.9475E+05
          Var    7.9108E+05  1.4888E+06  2.1334E+07  3.6818E+06  1.0139E+07

F19       Max    1.9851E+03  2.5657E+03  2.0875E+03  1.9872E+03  1.9555E+03
          Min    1.9292E+03  2.4816E+03  1.9027E+03  1.9023E+03  1.9028E+03
          Mean   1.9299E+03  2.4834E+03  1.9044E+03  1.9032E+03  1.9036E+03
          Var    1.0820E+00  3.8009E+00  1.1111E+01  3.3514E+00  1.4209E+00

F20       Max    8.3021E+06  2.0160E+07  1.0350E+07  6.1162E+07  1.2708E+08
          Min    5.6288E+03  1.7838E+06  2.1570E+03  2.0408E+03  2.0241E+03
          Mean   1.2260E+04  1.8138E+06  5.6957E+03  1.1337E+04  4.5834E+04
          Var    1.0918E+05  5.9134E+05  1.3819E+05  6.4167E+05  2.1988E+06

F21       Max    2.4495E+06  1.7278E+09  2.0322E+06  1.3473E+07  2.6897E+07
          Min    4.9016E+03  1.4049E+09  3.3699E+03  2.1842E+03  2.2314E+03
          Mean   6.6613E+03  1.4153E+09  9.9472E+03  1.3972E+04  7.4587E+03
          Var    4.1702E+04  2.2557E+07  3.3942E+04  2.5098E+05  2.8735E+05

F22       Max    2.8304E+03  4.9894E+03  3.1817E+03  3.1865E+03  3.2211E+03
          Min    2.5070E+03  4.1011E+03  2.3492E+03  2.2442E+03  2.2314E+03
          Mean   2.5081E+03  4.1175E+03  2.3694E+03  2.2962E+03  2.2687E+03
          Var    5.4064E+00  3.8524E+01  4.3029E+01  4.8006E+01  5.1234E+01

F23       Max    2.8890E+03  2.6834E+03  2.8758E+03  3.0065E+03  2.9923E+03
          Min    2.5000E+03  2.6031E+03  2.4870E+03  2.5000E+03  2.5000E+03
          Mean   2.5004E+03  2.6088E+03  2.5326E+03  2.5010E+03  2.5015E+03
          Var    9.9486E+00  8.6432E+00  5.7045E+01  1.4213E+01  1.7156E+01

F24       Max    2.6293E+03  2.6074E+03  2.6565E+03  2.6491E+03  2.6369E+03
          Min    2.5816E+03  2.6049E+03  2.5557E+03  2.5246E+03  2.5251E+03
          Mean   2.5829E+03  2.6052E+03  2.5671E+03  2.5337E+03  2.5338E+03
          Var    1.3640E+00  3.3490E-01  8.9434E+00  1.3050E+01  1.1715E+01

F25       Max    2.7133E+03  2.7014E+03  2.7445E+03  2.7122E+03  2.7327E+03
          Min    2.6996E+03  2.7003E+03  2.6784E+03  2.6789E+03  2.6635E+03
          Mean   2.6996E+03  2.7004E+03  2.6831E+03  2.6903E+03  2.6894E+03
          Var    2.3283E-01  9.9333E-02  4.9609E+00  7.1175E+00  1.2571E+01

F26       Max    2.7056E+03  2.8003E+03  2.7058E+03  2.7447E+03  2.7116E+03
          Min    2.7008E+03  2.8000E+03  2.7003E+03  2.7002E+03  2.7002E+03
          Mean   2.7010E+03  2.8000E+03  2.7005E+03  2.7003E+03  2.7003E+03
          Var    3.1316E-01  1.6500E-02  3.9700E-01  1.1168E+00  3.5003E-01

F27       Max    3.4052E+03  5.7614E+03  3.3229E+03  3.4188E+03  3.4144E+03
          Min    2.9000E+03  3.9113E+03  2.9698E+03  2.8347E+03  2.7054E+03
          Mean   2.9038E+03  4.0351E+03  2.9816E+03  2.8668E+03  2.7219E+03
          Var    3.2449E+01  2.0673E+02  3.4400E+01  6.2696E+01  6.1325E+01

F28       Max    4.4333E+03  5.4138E+03  4.5480E+03  4.3490E+03  4.8154E+03
          Min    3.0000E+03  4.0092E+03  3.4908E+03  3.0000E+03  3.0000E+03
          Mean   3.0079E+03  4.0606E+03  3.6004E+03  3.0063E+03  3.0065E+03
          Var    6.8101E+01  9.2507E+01  8.5705E+01  4.2080E+01  5.3483E+01

F29       Max    2.5038E+07  1.6181E+08  5.9096E+07  7.0928E+07  6.4392E+07
          Min    3.2066E+03  1.0014E+08  3.1755E+03  3.1005E+03  3.3287E+03
          Mean   5.7057E+04  1.0476E+08  8.5591E+04  6.8388E+04  2.8663E+04
          Var    4.9839E+05  6.7995E+06  1.7272E+06  1.5808E+06  8.6740E+05

18 Computational Intelligence and Neuroscience


Figure 3: The effect of the two parameters on the performance of RF models: (a) the effect of ntree; (b) the effect of mtry.
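Figure 3 sweeps these two RF hyperparameters independently. The same sweep can be sketched with scikit-learn, where ntree corresponds to `n_estimators` and mtry to `max_features`; the library choice and the synthetic dataset are assumptions, since the paper does not name an implementation:

```python
# Sketch: sweeping the two RF hyperparameters from Figure 3.
# Assumption: scikit-learn (ntree -> n_estimators, mtry -> max_features)
# and a synthetic dataset stand in for the paper's setup.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

for ntree in (50, 200, 500):
    for mtry in (2, 4, 8):
        rf = RandomForestClassifier(n_estimators=ntree, max_features=mtry,
                                    random_state=0)
        acc = cross_val_score(rf, X, y, cv=3).mean()
        print(f"ntree={ntree:4d} mtry={mtry} accuracy={acc:.3f}")
```

Accuracy typically rises quickly with ntree and then plateaus, which is why the optimizer below only needs a bounded search range for both parameters.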

Variable setting: number of iterations iter, bird positions Xi, local optimum Pi, global optimal position gbest, global optimum Pgbest, and out-of-bag error OOB_error.

Input: training dataset Train, Train_Label; test dataset Test; the parameters of DMSDL-QBSA
Output: the label of the test dataset Test_Label; the final classification model DMSDL-QBSA-RF
(1) Begin
(2)   /* Build classification model based on DMSDL-QBSA-RF */
(3)   Initialize the positions of N birds using equations (16) and (17): Xi (i = 1, 2, ..., N)
(4)   Calculate fitness f(Xi) (i = 1, 2, ..., N), set Xi to be Pi, and find Pgbest
(5)   While iter < itermax + 1 do
(6)     For i = 1 : ntree
(7)       Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(8)       Select mtry attributes randomly at each leaf node, compare the attributes, and select the best one
(9)       Recursively generate each decision tree without pruning operations
(10)    End For
(11)    Update classification accuracy of RF: evaluate f(Xi)
(12)    Update gbest and Pgbest
(13)    [nbest, mbest] = gbest
(14)    iter = iter + 1
(15)  End While
(16)  /* Classify using RF model */
(17)  For i = 1 : nbest
(18)    Give each tree a training set of size N by random sampling with replacement based on Bootstrap
(19)    Select mbest attributes randomly at each leaf node, compare the attributes, and select the best one
(20)    Recursively generate each decision tree without pruning operations
(21)  End
(22)  Return DMSDL-QBSA-RF classification model
(23)  Classify the test dataset using equation (20)
(24)  Calculate OOB_error
(25) End

ALGORITHM 2: DMSDL-QBSA-RF classification model.
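The wiring of Algorithm 2 — an optimizer proposing (ntree, mtry) pairs and scoring each by the forest's out-of-bag accuracy — can be sketched as follows. The DMSDL-QBSA position updates themselves are not reproduced; a seeded random search stands in for the bird-swarm iterations purely to show the fitness-evaluation loop, and scikit-learn plus a synthetic dataset are assumptions:

```python
# Sketch of Algorithm 2's outer loop: an optimizer proposes (ntree, mtry)
# and the RF's out-of-bag accuracy is the fitness. Assumption: a random
# search stands in for the DMSDL-QBSA position updates, which are defined
# earlier in the paper and not reproduced here.
import random
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=400, n_features=20, random_state=1)

def fitness(ntree, mtry):
    rf = RandomForestClassifier(n_estimators=ntree, max_features=mtry,
                                bootstrap=True, oob_score=True,
                                random_state=1).fit(X, y)
    return rf.oob_score_          # equals 1 - OOB_error

rng = random.Random(1)
gbest, pgbest = None, -1.0
for _ in range(10):               # stands in for the swarm iterations
    ntree = rng.randint(50, 500)
    mtry = rng.randint(1, 20)
    f = fitness(ntree, mtry)
    if f > pgbest:                # keep the best position found so far
        gbest, pgbest = (ntree, mtry), f

nbest, mbest = gbest              # step (13): [nbest, mbest] = gbest
final_model = RandomForestClassifier(n_estimators=nbest, max_features=mbest,
                                     random_state=1).fit(X, y)
```

Using the OOB score as fitness avoids a separate validation split, because each tree is evaluated only on the samples its bootstrap resample left out.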

Table 11: Continued.

Function  Term   BSA         DE          DMSDL-PSO   DMSDL-BSA   DMSDL-QBSA

F30       Max    5.2623E+05  2.8922E+07  1.1938E+06  1.2245E+06  1.1393E+06
          Min    5.9886E+03  1.9017E+07  3.7874E+03  3.6290E+03  3.6416E+03
          Mean   7.4148E+03  2.0002E+07  5.5468E+03  4.3605E+03  4.1746E+03
          Var    1.1434E+04  1.1968E+06  3.2255E+04  2.2987E+04  1.8202E+04


[Figure 4 block diagram: data selection and preprocessing (remove redundant attributes and pretreatment of the information to be identified) → attribute discretization → attribute reduction and data normalization → build model based on DMSDL-QBSA-RF → RF classification → output.]

Figure 4: Block diagram of the oil layer classification system based on DMSDL-QBSA-RF.

Table 13: Distribution of the logging data.

      Training dataset                              Test dataset
Well  Depth (m)   Data  Oil/gas layers  Dry layer   Depth (m)   Data  Oil/gas layers  Dry layer
W1    3027∼3058   250   203             47          3250∼3450   1600  237             1363
W2    2642∼2648   30    10              20          2940∼2978   191   99              92
W3    1040∼1059   160   47              113         1120∼1290   1114  96              1018

Table 12: Comparison of numerical testing results on eight UCI datasets.

Dataset        RF (%)  BSA-RF (%)  DMSDL-BSA-RF (%)  DMSDL-QBSA-RF (%)
Blood          75.40   80.80       79.46             81.25
Heart-statlog  82.10   86.42       86.42             86.42
Sonar          78.39   85.48       85.48             88.71
Appendicitis   83.23   87.10       90.32             93.55
Cleve          79.55   87.50       88.64             89.77
Magic          88.95   89.85       90.25             89.32
Mammographic   74.73   77.68       76.79             79.46
Australian     85.83   88.84       88.84             90.78

Table 14: Attribute reduction results of the logging dataset.

Well  Attributes
W1    Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
      Reduction attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2    Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
      Reduction attributes: NEUTRO, PERM, POR, RESI, SW
W3    Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
      Reduction attributes: AC, GR, RI, RXO, SP

Figure 5: The normalized curves of attributes: (a) and (b) attribute normalization of W1 (NPHI, SP, DT, LLD, DEN, GR, LLS); (c) and (d) attribute normalization of W2 (NEUTRO, PERM, RESI, SW, POR); (e) and (f) attribute normalization of W3 (GR, SP, RI, RXO, AC).
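The curves in Figure 5 are scaled to the [0, 1] range before classification. A minimal min-max normalization sketch follows; the linear scaling rule and the sample GR readings are assumptions, since the paper does not spell out the exact formula:

```python
# Min-max normalization of a logging attribute to [0, 1], as plotted in
# Figure 5. Assumption: plain linear scaling over the observed range.
def min_max_normalize(values):
    lo, hi = min(values), max(values)
    if hi == lo:                        # constant curve: map to zeros
        return [0.0 for _ in values]
    return [(v - lo) / (hi - lo) for v in values]

gr_curve = [52.1, 78.4, 61.0, 95.3, 70.2]   # hypothetical GR readings
print(min_max_normalize(gr_curve))
```

Normalizing each attribute independently keeps curves with very different physical units (e.g. GR versus LLD) on a common scale, so no single attribute dominates the classifier.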

Table 15: Performance of various well data.

Well  Classification model  RMSE    MAE       Accuracy (%)  Running time (s)
W1    RF                    0.3326  0.1106    88.94         1.5167
      SVM                   0.2681  0.0719    92.81         4.5861
      BSA-RF                0.3269  0.1069    89.31         1.8064
      DMSDL-BSA-RF          0.2806  0.0788    92.13         2.3728
      DMSDL-QBSA-RF         0.2449  0.0600    94.00         1.4663
W2    RF                    0.4219  0.1780    82.20         3.1579
      SVM                   0.2983  0.0890    91.10         4.2604
      BSA-RF                0.3963  0.1571    84.29         1.2979
      DMSDL-BSA-RF          0.2506  0.062827  93.72         1.6124
      DMSDL-QBSA-RF         0.2399  0.0576    94.24         1.2287
W3    RF                    0.4028  0.1622    83.78         2.4971
      SVM                   0.2507  0.0628    93.72         2.1027
      BSA-RF                0.3631  0.1318    86.81         1.3791
      DMSDL-BSA-RF          0.2341  0.0548    94.52         0.3125
      DMSDL-QBSA-RF         0.0519  0.0027    99.73         0.9513
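The RMSE, MAE, and accuracy columns in Table 15 follow directly from the predicted and actual 0/1 layer labels; note that for binary labels RMSE² equals MAE, which the table confirms (e.g. 0.3326² ≈ 0.1106 for RF on W1). A minimal sketch with hypothetical labels:

```python
# RMSE, MAE, and accuracy for binary layer labels, as reported in Table 15.
import math

def evaluate(actual, predicted):
    n = len(actual)
    errs = [a - p for a, p in zip(actual, predicted)]
    rmse = math.sqrt(sum(e * e for e in errs) / n)
    mae = sum(abs(e) for e in errs) / n
    acc = sum(a == p for a, p in zip(actual, predicted)) / n
    return rmse, mae, acc

actual    = [1, 0, 1, 1, 0, 0, 1, 0]    # hypothetical layer labels
predicted = [1, 0, 1, 0, 0, 0, 1, 1]
rmse, mae, acc = evaluate(actual, predicted)
print(f"RMSE={rmse:.4f} MAE={mae:.4f} accuracy={acc*100:.2f}%")
# prints "RMSE=0.5000 MAE=0.2500 accuracy=75.00%"
```

Because each error is 0 or ±1 for binary labels, MAE equals the misclassification rate and accuracy equals 100·(1 − MAE), consistent with every row of the table.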


5. Conclusion

This paper presents an improved BSA called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of the three strategies on the performance of DMSDL-QBSA revealed that the hybrid method clearly improves on the original BSA, and especially on the original DE. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and it again shows excellent performance. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than the original RF model, while its running speed is faster than the other four advanced classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Because it hybridizes three strategies, DMSDL-QBSA needs more time than the classical BSA; deploying the proposed algorithm so as to increase recognition efficiency is therefore a worthwhile direction. In future research, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical diagnosis, is also interesting future work.

Data Availability

All data included in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and Project of Industry-University Cooperative Education of Ministry of Education of China (no. 201801335014).

Figure 6: Classification results of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) the DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) the DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) the DMSDL-QBSA-RF oil layer distribution of W3.


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.

[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.

[4] X. Li, A New Intelligent Optimization Method: Artificial Fish School Algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.

[5] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.

[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.

[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673–687, 2016.

[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University - Computer and Information Sciences, vol. 32, pp. 514–521, 2020.

[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," Neural Computing and Applications, Springer, New York, NY, USA, 2019.

[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.

[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.

[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197–207, 2014.

[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.

[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17–28, 2014.

[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137–146, 2012.

[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, Multi-Strategy Brainstorm Optimization Algorithm with Dynamic Parameters Adjustment, Applied Intelligence, Guildford, UK, 2010.

[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189–120201, 2019.

[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.

[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.

[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.

[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.

[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.

[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.

[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.

[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.

[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," IEEE Transactions on Evolutionary Computation, Piscataway, NJ, USA, February 2019.

[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.

[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.

[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.

[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.

[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.

[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.

[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.

[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.

[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.


[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China Information Sciences, vol. 62, no. 7, 2019.

[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.

[41] S. Shahrzad, M. Seyedali, and L. Andrew, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.

[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering: An Open Access Journal, vol. 8, no. 1, pp. 22–34, 2020.

[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.


[31] Y Chen L Li and H Peng ldquoDynamic multi-swarm differ-ential learning particle swarm optimizerrdquo Swarm and Evo-lutionary Computation vol 39 pp 209ndash221 2018

[32] R Storn ldquoDifferential evolution design of an IIR-filterrdquo inProceedings of the IEEE International Conference on Evolu-tionary Computation (ICEC 96) Nagoya Japan May 1996

[33] Z Liu X Ji and Y Yang ldquoHierarchical differential evolutionalgorithm combined with multi-cross operationrdquo ExpertSystems with Applications vol 130 pp 276ndash292 2019

[34] H Peng Z Guo and C Deng ldquoEnhancing differentialevolution with random neighbors based strategyrdquo Journal ofComputational Science vol 26 pp 501ndash511 2018

[35] J Sun W Fang X Wu V Palade and W Xu ldquoQuantum-behaved particle swarm optimization analysis of individualparticle behavior and parameter selectionrdquo EvolutionaryComputation vol 20 no 3 pp 349ndash393 2012

Computational Intelligence and Neuroscience 23


[Figure 4 flowchart: information to be identified → data selection and preprocessing (remove redundant attributes and pretreatment) → attribute discretization → attribute reduction and data normalization → build model based on DMSDL-QBSA-RF → RF classification → output.]

Figure 4: Block diagram of the oil layer classification system based on DMSDL-QBSA-RF.
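As the block diagram shows, every retained logging attribute is scaled to [0, 1] before it reaches the RF classifier (the resulting curves appear in Figure 5). A minimal min-max normalization sketch in Python; the GR readings below are hypothetical, not data from the paper:

```python
def min_max_normalize(column):
    """Scale one logging attribute to [0, 1], column by column."""
    lo, hi = min(column), max(column)
    span = (hi - lo) or 1.0  # guard against a constant column
    return [(v - lo) / span for v in column]

# Hypothetical raw GR readings at a few depth samples.
gr = [45.0, 60.0, 120.0, 82.5]
print(min_max_normalize(gr))  # [0.0, 0.2, 1.0, 0.5]
```

Applied column by column to each well's reduced attributes, this produces the 0–1 curves plotted against depth in Figure 5.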

Table 13: Distribution of the logging data.

Well  Training dataset: depth (m), data, oil/gas layers, dry layers  |  Test dataset: depth (m), data, oil/gas layers, dry layers
W1    3027–3058, 250, 203, 47    |  3250–3450, 1600, 237, 1363
W2    2642–2648, 30, 10, 20      |  2940–2978, 191, 99, 92
W3    1040–1059, 160, 47, 113    |  1120–1290, 1114, 96, 1018

Table 12: Comparison of numerical testing results on eight UCI datasets.

Dataset        RF (%)  BSA-RF (%)  DMSDL-BSA-RF (%)  DMSDL-QBSA-RF (%)
Blood          75.40   80.80       79.46             81.25
Heart-statlog  82.10   86.42       86.42             86.42
Sonar          78.39   85.48       85.48             88.71
Appendicitis   83.23   87.10       90.32             93.55
Cleve          79.55   87.50       88.64             89.77
Magic          88.95   89.85       90.25             89.32
Mammographic   74.73   77.68       76.79             79.46
Australian     85.83   88.84       88.84             90.78

Table 14: Attribute reduction results of the logging dataset.

W1  Actual attributes: GR, DT, SP, WQ, LLD, LLS, DEN, NPHI, PE, U, TH, K, CALI
    Reduced attributes: GR, DT, SP, LLD, LLS, DEN, NPHI
W2  Actual attributes: DENSITY, GAMM, VCLOK, NEUTRO, PERM, POR, RESI, SONIC, SP, SW
    Reduced attributes: NEUTRO, PERM, POR, RESI, SW
W3  Actual attributes: AC, CNL, DEN, GR, RT, RI, RXO, SP, R2M, R025, BZSP, RA2, C1, C2, CALI, RINC, PORT, VCL, VMA1, VMA6, RHOG, SW, VO, WO, PORE, VXO, VW, AC1
    Reduced attributes: AC, GR, RI, RXO, SP
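The reductions above are produced with a consistent covering rough-set method [43]. As a simplified stand-in (not the paper's algorithm), a greedy correlation filter illustrates the same idea of discarding near-redundant logging attributes; every column value below is made up for illustration:

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length columns."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def reduce_attributes(table, threshold=0.95):
    """Greedy filter: keep an attribute only if it is not nearly
    collinear with an attribute that was already kept."""
    kept = {}
    for name, column in table.items():
        if all(abs(pearson(column, kc)) < threshold for kc in kept.values()):
            kept[name] = column
    return list(kept)

# Hypothetical logging columns: LLD and LLS nearly duplicate each other.
table = {
    "GR":  [45, 60, 120, 82, 90],
    "LLD": [10, 12, 30, 25, 20],
    "LLS": [10.1, 12.2, 29.8, 24.9, 20.3],
}
print(reduce_attributes(table))  # ['GR', 'LLD']
```

The threshold is a free parameter here; the rough-set reduction in the paper instead keeps the smallest attribute subset that preserves classification consistency.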

[Figure 5 panels: normalized value (0–1) versus depth (m). (a) NPHI, SP, DT, LLD of W1; (b) DEN, GR, LLS of W1; (c) NEUTRO, PERM of W2; (d) RESI, SW, POR of W2; (e) GR, SP, RI of W3; (f) RXO, AC of W3.]

Figure 5: The normalized curves of attributes. (a, b) Attribute normalization of W1; (c, d) attribute normalization of W2; (e, f) attribute normalization of W3.

Table 15: Performance of various well data.

Well  Classification model  RMSE    MAE       Accuracy (%)  Running time (s)
W1    RF                    0.3326  0.1106    88.94         1.5167
      SVM                   0.2681  0.0719    92.81         4.5861
      BSA-RF                0.3269  0.1069    89.31         1.8064
      DMSDL-BSA-RF          0.2806  0.0788    92.13         2.3728
      DMSDL-QBSA-RF         0.2449  0.0600    94.00         1.4663
W2    RF                    0.4219  0.1780    82.20         3.1579
      SVM                   0.2983  0.0890    91.10         4.2604
      BSA-RF                0.3963  0.1571    84.29         1.2979
      DMSDL-BSA-RF          0.2506  0.062827  93.72         1.6124
      DMSDL-QBSA-RF         0.2399  0.0576    94.24         1.2287
W3    RF                    0.4028  0.1622    83.78         2.4971
      SVM                   0.2507  0.0628    93.72         2.1027
      BSA-RF                0.3631  0.1318    86.81         1.3791
      DMSDL-BSA-RF          0.2341  0.0548    94.52         0.3125
      DMSDL-QBSA-RF         0.0519  0.0027    99.73         0.9513
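The RMSE, MAE, and accuracy columns in Table 15 follow their standard definitions; a self-contained sketch with hypothetical labels (1 = oil/gas layer, 0 = dry layer):

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error."""
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true))

def mae(y_true, y_pred):
    """Mean absolute error."""
    return sum(abs(t - p) for t, p in zip(y_true, y_pred)) / len(y_true)

def accuracy(y_true, y_pred):
    """Fraction of depth samples labeled correctly."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical ground truth and model output for eight depth samples.
y_true = [1, 0, 1, 1, 0, 0, 1, 0]
y_pred = [1, 0, 1, 0, 0, 0, 1, 1]
print(accuracy(y_true, y_pred))  # 0.75
print(mae(y_true, y_pred))       # 0.25
print(rmse(y_true, y_pred))      # 0.5
```

For 0/1 labels, MAE equals the error rate and RMSE its square root, which is why the two error columns in Table 15 move together.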


5. Conclusion

This paper presents an improved BSA, called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. An experimental study of the effects of the three strategies on the performance of DMSDL-QBSA revealed that the hybrid method markedly improves on the original BSA. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and it again shows excellent performance on these functions. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than that of the original RF model, while its running speed is faster than that of the other four classification models on most wells.

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Because it hybridizes three strategies, DMSDL-QBSA needs more time than the classical BSA, so improving its recognition efficiency is a worthwhile direction. In future research, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical diagnosis, is also interesting future work.
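For readers who want a concrete picture of the quantum behavior discussed above, the following is a minimal quantum-behaved position update in the spirit of Sun et al. [35], minimizing a sphere function. It is a sketch only: the full DMSDL-QBSA additionally uses dynamic multi-swarm regrouping and differential-evolution mutation, both omitted here, and all parameter values are illustrative:

```python
import math
import random

def sphere(x):
    """Benchmark objective f(x) = sum of squares; minimum 0 at the origin."""
    return sum(v * v for v in x)

def quantum_search(f, dim=2, pop=20, iters=300, alpha=0.75, seed=1):
    """Quantum-behaved update: x = p +/- alpha * |mbest - x| * ln(1/u)."""
    random.seed(seed)
    xs = [[random.uniform(-10, 10) for _ in range(dim)] for _ in range(pop)]
    pbest = [list(x) for x in xs]          # personal bests
    gbest = min(pbest, key=f)              # global best
    for _ in range(iters):
        # mbest: mean of all personal bests, one value per dimension
        mbest = [sum(p[d] for p in pbest) / pop for d in range(dim)]
        for i, x in enumerate(xs):
            for d in range(dim):
                phi = random.random()
                u = 1.0 - random.random()  # u in (0, 1], keeps log finite
                attractor = phi * pbest[i][d] + (1 - phi) * gbest[d]
                step = alpha * abs(mbest[d] - x[d]) * math.log(1 / u)
                x[d] = attractor + step if random.random() < 0.5 else attractor - step
            if f(x) < f(pbest[i]):
                pbest[i] = list(x)
        gbest = min(pbest, key=f)
    return gbest

best = quantum_search(sphere)
print(sphere(best))  # typically very close to 0
```

The sampled step length can occasionally be large, which is the mechanism by which the quantum behavior helps the swarm escape local optima while the attractor pulls it toward the best solutions found.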

Data Availability

All data included in this study are available upon request from the corresponding author.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), the Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), the Key Research and Development Project from Hebei Province, China (nos. 19210404D and 20351802D), the Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and the Project of Industry-University Cooperative Education of the Ministry of Education of China (no. 201801335014).

[Figure 6 panels: oil layer labels (0/1) versus depth (m) for wells W1–W3.]

Figure 6: Classification of DMSDL-QBSA-RF. (a) The actual oil layer distribution of W1; (b) the DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) the DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) the DMSDL-QBSA-RF oil layer distribution of W3.

References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.
[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.
[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.
[4] X. Li, A new intelligent optimization method: artificial fish school algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.
[5] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.
[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.
[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673–687, 2016.
[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University - Computer and Information Sciences, vol. 32, pp. 514–521, 2020.
[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," Neural Computing and Applications, Springer, New York, NY, USA, 2019.
[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.
[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.
[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197–207, 2014.
[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.
[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17–28, 2014.
[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137–146, 2012.
[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, Multi-Strategy Brainstorm Optimization Algorithm with Dynamic Parameters Adjustment, Applied Intelligence, Guildford, UK, 2010.
[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189–120201, 2019.
[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.
[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.
[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.
[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.
[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.
[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.
[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.
[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.
[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," IEEE Transactions on Evolutionary Computation, 2019.
[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 20, no. 24, pp. 290–304, 2020.
[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.
[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.
[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.
[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.
[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.
[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.
[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.
[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.
[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China Information Sciences, vol. 62, no. 7, 2019.
[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.
[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.
[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.
[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 95, pp. 120–133, 2016.
[41] S. Saremi, S. Mirjalili, and A. Lewis, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.
[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering, vol. 8, no. 1, pp. 22–34, 2020.
[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.

24 Computational Intelligence and Neuroscience

Page 21: DynamicMulti-SwarmDifferentialLearningQuantumBird ...downloads.hindawi.com/journals/cin/2020/6858541.pdfexploitation process of the original seeker optimization algorithm (SOA) approach

1

08

06

04

02

0

Nor

mal

ized

val

ue

Depth (m)2940 2945 2950 2955 2960 2965 2970 2975 2980

NEUTRO

PERM

(c)

Depth (m)

1

08

06

04

Nor

mal

ized

val

ue

02

02940 2945 2950 2955 2960 2965 2970 2975 2980

RESI

SW

POR

(d)

Nor

mal

ized

val

ue

1150 1200 1250 1300Depth (m)

1

08

06

04

02

0

GR

SPRI

(e)

1150 1200 1250 1300Depth (m)

Nor

mal

ized

val

ue

1

08

06

04

02

0

RXO

AC

(f )

Figure 5 +e normalized curves of attributes (a) and (b) attribute normalization ofW1 (c) and (d) attribute normalization ofW2 (e) and(f) attribute normalization of W3

Table 15 Performance of various well data

Well Classification model RMSE MAE Accuracy () Running time (s)

W1

RF 03326 01106 8894 15167SVM 02681 00719 9281 45861

BSA-RF 03269 01069 8931 18064DMSDL-BSA-RF 02806 00788 9213 23728DMSDL-QBSA-RF 02449 00600 9400 14663

W2

RF 04219 01780 8220 31579SVM 02983 00890 9110 42604

BSA-RF 03963 01571 8429 12979DMSDL-BSA-RF 02506 0062827 9372 16124DMSDL-QBSA-RF 02399 00576 9424 12287

W3

RF 04028 01622 8378 24971SVM 02507 00628 9372 21027

BSA-RF 03631 01318 8681 13791DMSDL-BSA-RF 02341 00548 9452 03125DMSDL-QBSA-RF 00519 00027 9973 09513

Labe

ls

Depth (m)

1

03200 3250 3300 3350 3400 3450

(a)

3200 3250 3300 3350 3400 3450

Labe

ls

1

0

(b)

Figure 6 Continued

Computational Intelligence and Neuroscience 21

5 Conclusion

+is paper presents an improved BSA called DMSDL-QBSA which employed the dynamic multi-swarmmethoddifferential evolution and quantum behavior to enhancethe global and the local exploration capabilities of originalBSA First 18 classical benchmark functions are used toverify the effectiveness of the improved method +e ex-perimental study of the effects of these three strategies onthe performance of DMSDL-QBSA revealed that the hybridmethod has an excellent influence to improve the im-provement of original GOA and especially original DESecond compared with the popular intelligence algo-rithms such as GWO WOA SCA GOA and SSA theDMSDL-QBSA can provide more competitive results onthe 18 classical benchmark functions Additionally 30complex CEC2014 test functions are used to better verifythe performance of DMSDL-QBSA in a more compre-hensively manner +e DMSDL-QBSA can show moreexcellent performance on the 18 classical benchmarkfunctions Finally the improved DMSDL-QBSA is used tooptimize the parameters of RF Experimental results onactual oil logging prediction problem have proved that theclassification accuracy of the established DMSDL-QBSA-RF classification model can get 9400 9424 and 9973on these wells and the accuracy is much higher than theoriginal RF model At the same time the running speedperformed faster than other four advanced classificationmodels on most wells

Although the proposed DMSDL-QBSA has been provento be effective in solving general optimization problemsDMSDL-QBSA has some shortcomings that warrant furtherinvestigation And in DMSDL-QBSA due to the hybrid of

three strategies DMSDL-QBSA has needed more time thanthe classical BSA +erefore deploying the proposed algo-rithm to increase recognition efficiency is a worthwhiledirection In the future research work the method presentedin this paper can also be extended to solving discrete op-timization problems and multiobjective optimizationproblems Furthermore applying the proposed DMSDL-QBSA-RF model to other fields such as financial predictionand biomedical science diagnosis is also an interesting futurework

Data Availability

All data included in this study are available upon request bycontact with the corresponding author

Conflicts of Interest

+e authors declare that they have no conflicts of interest

Acknowledgments

+is work was supported by the National Natural ScienceFoundation of China (no U1813222) Tianjin Natural Sci-ence Foundation China (no 18JCYBJC16500) Key Re-search and Development Project from Hebei ProvinceChina (nos 19210404D and 20351802D) Key ResearchProject of Science and Technology from the Ministry ofEducation of Hebei Province China (no ZD2019010) andProject of Industry-University Cooperative Education ofMinistry of Education of China (no 201801335014)

1

0

Labe

ls

2940 2945 2950 2955 2960Depth (m)

2965 2970 2975 2980

(c)

2940 2945 2950 2955 2960Depth (m)

2965 2970 2975 2980

1

0

Labe

ls

(d)

Labe

ls

1140 1160 1180 1200 1220Depth (m)

1240 1260 1280 13000

1

(e)

1140 1160 1180 1200 1220 1240 1260 1280 1300Depth (m)

Labe

ls

0

1

(f )

Figure 6 Classification of DMSDL-QBSA-RF (a) the actual oil layer distribution ofW1 (b) DMSDL-QBSA-RF oil layer distribution ofW1(c) the actual oil layer distribution of W2 (d) DMSDL-QBSA-RF oil layer distribution of 2 (e) the actual oil layer distribution of 3 (f )DMSDL-QBSA-RF oil layer distribution of 3

22 Computational Intelligence and Neuroscience

References

[1] A P Engelbrecht Foundations of Computational SwarmIntelligence Beijing Tsinghua University Press BeijingChina 2019

[2] J Kennedy and R Eberhart ldquoParticle swarm optimizationrdquo inProceedings of the IEEE International Conference NeuralNetworks Perth Australia July 1995

[3] D Karaboga ldquoAn idea based on honey bee swarm for nu-merical optimizationrdquo Technical report-TR06 Erciyes Uni-versity Kayseri Turkey 2005

[4] X Li A new intelligent optimization method-artificial fishschool algorithm Doctoral thesis Zhejiang UniversityHangzhou China 2003

[5] X S Yang ldquoFirefly algorithms for multimodal optimizationstochastic algorithms foundations and applicationsrdquo inProceedings of the 5th International Symposium on StochasticAlgorithms SAGA Berlin Germany 2009

[6] Y Sharafi M A Khanesar and M Teshnehlab ldquoDiscretebinary cat swarm optimizationrdquo in Proceedings of the IEEEInternational Conference on Computer Control and Com-munication (IC4) Karachi Pakistan September 2013

[7] X B Meng X Z Gao L Lu Y Liu and H Z Zhang ldquoA newbio-inspired optimisation algorithm bird swarm algorithmrdquoJournal of Experimental amp eoretical Artificial Intelligencevol 28 no 4 pp 673ndash687 2016

[8] N Jatana and B Suri ldquoParticle swarm and genetic algorithmapplied to mutation testing for test data generation a com-parative evaluationrdquo Journal of King Saud University-Com-puter and Information Sciences vol 32 pp 514ndash521 2020

[9] H Chung and K Shin ldquoGenetic algorithm-optimized multi-channel convolutional neural network for stock marketpredictionrdquo in Neural Computing and Applications SpringerNew York NY USA 2019

[10] I Strumberger E Tuba M Zivkovic N Bacanin M Bekoand M Tuba ldquoDesigning convolutional neural network ar-chitecture by the firefly algorithmrdquo in Proceedings of the 2019IEEE International Young Engineers Forum (YEF-ECE) Costada Caparica Portugal May 2019

[11] I Strumberger N Bacanin M Tuba and E Tuba ldquoResourcescheduling in cloud computing based on a hybridized whaleoptimization algorithmrdquo Applied Sciences vol 9 no 22p 4893 2019

[12] M Tuba and N Bacanin ldquoImproved seeker optimizationalgorithm hybridized with firefly algorithm for constrainedoptimization problemsrdquo Neurocomputing vol 143 pp 197ndash207 2014

[13] I Strumberger M Minovic M Tuba and N Bacanin ldquoPer-formance of elephant herding optimization and tree growthalgorithm adapted for node localization in wireless sensornetworksrdquo Sensors vol 19 no 11 p 2515 2019

[14] X S Yang ldquoSwarm intelligence based algorithms a criticalanalysisrdquo Evolutionary Intelligence vol 7 no 1 pp 17ndash282014

[15] N Bacanin and M Tuba ldquoArtificial bee colony (ABC) al-gorithm for constrained optimization improved with geneticoperatorsrdquo Studies in Informatics and Control vol 21 no 2pp 137ndash146 2012

[16] J Liu H Peng Z Wu J Chen and C Deng Multi-StrategyBrainstormOptimization Algorithmwith Dynamic ParametersAdjustment Applied Intelligence Guildford UK 2010

[17] H Peng Y He and C Deng ldquoFirefly algorithm with lucif-erase inhibition mechanismrdquo IEEE Access vol 7pp 120189ndash120201 2019

[18] H Peng C Deng and Z Wu ldquoBest neighbor guided artificialbee colony algorithm for continuous optimization problemsrdquoSoft Computing vol 23 no 18 pp 8723ndash8740 2019

[19] C Xu and R Yang ldquoParameter estimation for chaotic systemsusing improved bird awarm algorithmrdquo Modern PhysicsLetters B vol 31 no 36 pp 1ndash15 2017

[20] J Yang and Y Liu ldquoSeparation of signals with same frequencybased on improved bird swarm algorithmrdquo in Proceedings ofthe International Symposium on Distributed Computing andApplications for Business Engineering and Science WuxiChina August 2018

[21] X Wang Y Deng and H Duan ldquoEdge-based target detectionfor unmanned aerial vehicles using competitive bird swarmalgorithmrdquo Aerospace Science amp Technology vol 78pp 708ndash720 2018

[22] L Breiman ldquoRandom forestsrdquo Machine Learning vol 45no 1 pp 5ndash32 2001

[23] L Ma and S Fan ldquoCURE-SMOTE algorithm and hybridalgorithm for feature selection and parameter optimizationbased on random forestsrdquo BMC Bioinformatics vol 18 no 1pp 1ndash18 2017

[24] J Hou Q Li and S Meng ldquoA differential privacy protectionrandom forestrdquo IEEE Access vol 7 pp 130707ndash130720 2019

[25] G L Liu S J Han and X H Zhao ldquoOptimisation algorithmsfor spatially constrained forest planningrdquo Ecological Model-ling vol 194 no 4 pp 421ndash428 2006

[26] D Gong B Xu and Y Zhang ldquoA similarity-based cooper-ative co-evolutionary algorithm for dynamic interval multi-objective optimization problemsrdquo in Proceedings of the IEEETransactions on Evolutionary Computation Piscataway NJUSA February 2019

[27] M Rong D Gong and W Pedrycz ldquoA multi-model pre-diction method for dynamic multi-objective evolutionaryoptimizationrdquo IEEE Transactions on Evolutionary Compu-tation vol 20 no 24 pp 290ndash304 2020

[28] B Xu Y Zhang D Gong Y Guo and M Rong ldquoEnvi-ronment sensitivity-based cooperative co-evolutionary algo-rithms for dynamic multi-objective optimizationrdquo IEEE-ACM Transactions on Computational Biology and Bio-informatics vol 15 no 6 pp 1877ndash1890 2018

[29] Y Guo H Yang M Chen J Cheng and D Gong ldquoEnsembleprediction-based dynamic robust multi-objective optimiza-tionmethodsrdquo Swarm and Evolutionary Computation vol 48pp 156ndash171 2019

[30] J J Liang and P N Suganthan ldquoDynamic multi-swarmparticle swarm optimizer with local searchrdquo in Proceedings ofthe IEEE Congress on Evolutionary Computation EdinburghScotland 2005

[31] Y Chen L Li and H Peng ldquoDynamic multi-swarm differ-ential learning particle swarm optimizerrdquo Swarm and Evo-lutionary Computation vol 39 pp 209ndash221 2018

[32] R Storn ldquoDifferential evolution design of an IIR-filterrdquo inProceedings of the IEEE International Conference on Evolu-tionary Computation (ICEC 96) Nagoya Japan May 1996

[33] Z Liu X Ji and Y Yang ldquoHierarchical differential evolutionalgorithm combined with multi-cross operationrdquo ExpertSystems with Applications vol 130 pp 276ndash292 2019

[34] H Peng Z Guo and C Deng ldquoEnhancing differentialevolution with random neighbors based strategyrdquo Journal ofComputational Science vol 26 pp 501ndash511 2018

[35] J Sun W Fang X Wu V Palade and W Xu ldquoQuantum-behaved particle swarm optimization analysis of individualparticle behavior and parameter selectionrdquo EvolutionaryComputation vol 20 no 3 pp 349ndash393 2012

Computational Intelligence and Neuroscience 23


5 Conclusion

This paper presents an improved BSA, called DMSDL-QBSA, which employs the dynamic multi-swarm method, differential evolution, and quantum behavior to enhance the global and local exploration capabilities of the original BSA. First, 18 classical benchmark functions are used to verify the effectiveness of the improved method. The experimental study of the effects of these three strategies on the performance of DMSDL-QBSA revealed that the hybrid method has an excellent effect on improving the original GOA and especially the original DE. Second, compared with popular intelligence algorithms such as GWO, WOA, SCA, GOA, and SSA, DMSDL-QBSA provides more competitive results on the 18 classical benchmark functions. Additionally, 30 complex CEC2014 test functions are used to verify the performance of DMSDL-QBSA more comprehensively, and DMSDL-QBSA again exhibits excellent performance. Finally, the improved DMSDL-QBSA is used to optimize the parameters of RF. Experimental results on an actual oil logging prediction problem show that the classification accuracy of the established DMSDL-QBSA-RF classification model reaches 94.00%, 94.24%, and 99.73% on the three wells, much higher than that of the original RF model. At the same time, it runs faster than the other four advanced classification models on most wells.
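The quantum behavior referred to above follows the quantum-behaved particle swarm model of Sun et al. [35]. As a rough illustration, a minimal Python sketch of one QPSO-style position update is given below; the function name, the contraction-expansion coefficient `alpha`, and the list-based representation are illustrative assumptions, not the paper's exact implementation.

```python
import math
import random

def quantum_update(x, pbest, gbest, mbest, alpha=0.75):
    """One QPSO-style quantum-behaved position update (cf. ref. [35]).

    x      -- current position of one individual
    pbest  -- its personal best position
    gbest  -- the swarm's global best position
    mbest  -- mean of all personal best positions
    alpha  -- contraction-expansion coefficient (assumed value)
    """
    new_x = []
    for d in range(len(x)):
        phi = random.random()
        # Local attractor: a random convex combination of pbest and gbest.
        p = phi * pbest[d] + (1.0 - phi) * gbest[d]
        u = 1.0 - random.random()  # in (0, 1], avoids log(0)
        # Step drawn around the attractor, scaled by the spread |mbest - x|.
        step = alpha * abs(mbest[d] - x[d]) * math.log(1.0 / u)
        new_x.append(p + step if random.random() < 0.5 else p - step)
    return new_x
```

Because the logarithmic step lets an individual land anywhere in the search space with nonzero probability, replacing part of the birds' foraging movement with this update is what gives the quantum strategy its stronger global search.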

Although the proposed DMSDL-QBSA has been proven effective in solving general optimization problems, it has some shortcomings that warrant further investigation. Because of the hybrid of three strategies, DMSDL-QBSA needs more time than the classical BSA; therefore, deploying the proposed algorithm to increase recognition efficiency is a worthwhile direction. In future research work, the method presented in this paper can also be extended to discrete optimization problems and multiobjective optimization problems. Furthermore, applying the proposed DMSDL-QBSA-RF model to other fields, such as financial prediction and biomedical science diagnosis, is also interesting future work.

Data Availability

All data included in this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the National Natural Science Foundation of China (no. U1813222), the Tianjin Natural Science Foundation, China (no. 18JCYBJC16500), the Key Research and Development Project of Hebei Province, China (nos. 19210404D and 20351802D), the Key Research Project of Science and Technology from the Ministry of Education of Hebei Province, China (no. ZD2019010), and the Project of Industry-University Cooperative Education of the Ministry of Education of China (no. 201801335014).

[Figure 6: Classification results of DMSDL-QBSA-RF: (a) the actual oil layer distribution of W1; (b) the DMSDL-QBSA-RF oil layer distribution of W1; (c) the actual oil layer distribution of W2; (d) the DMSDL-QBSA-RF oil layer distribution of W2; (e) the actual oil layer distribution of W3; (f) the DMSDL-QBSA-RF oil layer distribution of W3. Each panel plots binary labels (0/1) against depth (m).]


References

[1] A. P. Engelbrecht, Foundations of Computational Swarm Intelligence, Tsinghua University Press, Beijing, China, 2019.

[2] J. Kennedy and R. Eberhart, "Particle swarm optimization," in Proceedings of the IEEE International Conference on Neural Networks, Perth, Australia, July 1995.

[3] D. Karaboga, "An idea based on honey bee swarm for numerical optimization," Technical Report TR06, Erciyes University, Kayseri, Turkey, 2005.

[4] X. Li, A New Intelligent Optimization Method: Artificial Fish School Algorithm, Doctoral thesis, Zhejiang University, Hangzhou, China, 2003.

[5] X. S. Yang, "Firefly algorithms for multimodal optimization," in Stochastic Algorithms: Foundations and Applications, Proceedings of the 5th International Symposium on Stochastic Algorithms (SAGA), Berlin, Germany, 2009.

[6] Y. Sharafi, M. A. Khanesar, and M. Teshnehlab, "Discrete binary cat swarm optimization," in Proceedings of the IEEE International Conference on Computer, Control and Communication (IC4), Karachi, Pakistan, September 2013.

[7] X. B. Meng, X. Z. Gao, L. Lu, Y. Liu, and H. Z. Zhang, "A new bio-inspired optimisation algorithm: bird swarm algorithm," Journal of Experimental & Theoretical Artificial Intelligence, vol. 28, no. 4, pp. 673–687, 2016.

[8] N. Jatana and B. Suri, "Particle swarm and genetic algorithm applied to mutation testing for test data generation: a comparative evaluation," Journal of King Saud University - Computer and Information Sciences, vol. 32, pp. 514–521, 2020.

[9] H. Chung and K. Shin, "Genetic algorithm-optimized multi-channel convolutional neural network for stock market prediction," Neural Computing and Applications, Springer, New York, NY, USA, 2019.

[10] I. Strumberger, E. Tuba, M. Zivkovic, N. Bacanin, M. Beko, and M. Tuba, "Designing convolutional neural network architecture by the firefly algorithm," in Proceedings of the 2019 IEEE International Young Engineers Forum (YEF-ECE), Costa da Caparica, Portugal, May 2019.

[11] I. Strumberger, N. Bacanin, M. Tuba, and E. Tuba, "Resource scheduling in cloud computing based on a hybridized whale optimization algorithm," Applied Sciences, vol. 9, no. 22, p. 4893, 2019.

[12] M. Tuba and N. Bacanin, "Improved seeker optimization algorithm hybridized with firefly algorithm for constrained optimization problems," Neurocomputing, vol. 143, pp. 197–207, 2014.

[13] I. Strumberger, M. Minovic, M. Tuba, and N. Bacanin, "Performance of elephant herding optimization and tree growth algorithm adapted for node localization in wireless sensor networks," Sensors, vol. 19, no. 11, p. 2515, 2019.

[14] X. S. Yang, "Swarm intelligence based algorithms: a critical analysis," Evolutionary Intelligence, vol. 7, no. 1, pp. 17–28, 2014.

[15] N. Bacanin and M. Tuba, "Artificial bee colony (ABC) algorithm for constrained optimization improved with genetic operators," Studies in Informatics and Control, vol. 21, no. 2, pp. 137–146, 2012.

[16] J. Liu, H. Peng, Z. Wu, J. Chen, and C. Deng, "Multi-strategy brain storm optimization algorithm with dynamic parameters adjustment," Applied Intelligence, 2020.

[17] H. Peng, Y. He, and C. Deng, "Firefly algorithm with luciferase inhibition mechanism," IEEE Access, vol. 7, pp. 120189–120201, 2019.

[18] H. Peng, C. Deng, and Z. Wu, "Best neighbor guided artificial bee colony algorithm for continuous optimization problems," Soft Computing, vol. 23, no. 18, pp. 8723–8740, 2019.

[19] C. Xu and R. Yang, "Parameter estimation for chaotic systems using improved bird swarm algorithm," Modern Physics Letters B, vol. 31, no. 36, pp. 1–15, 2017.

[20] J. Yang and Y. Liu, "Separation of signals with same frequency based on improved bird swarm algorithm," in Proceedings of the International Symposium on Distributed Computing and Applications for Business Engineering and Science, Wuxi, China, August 2018.

[21] X. Wang, Y. Deng, and H. Duan, "Edge-based target detection for unmanned aerial vehicles using competitive bird swarm algorithm," Aerospace Science & Technology, vol. 78, pp. 708–720, 2018.

[22] L. Breiman, "Random forests," Machine Learning, vol. 45, no. 1, pp. 5–32, 2001.

[23] L. Ma and S. Fan, "CURE-SMOTE algorithm and hybrid algorithm for feature selection and parameter optimization based on random forests," BMC Bioinformatics, vol. 18, no. 1, pp. 1–18, 2017.

[24] J. Hou, Q. Li, and S. Meng, "A differential privacy protection random forest," IEEE Access, vol. 7, pp. 130707–130720, 2019.

[25] G. L. Liu, S. J. Han, and X. H. Zhao, "Optimisation algorithms for spatially constrained forest planning," Ecological Modelling, vol. 194, no. 4, pp. 421–428, 2006.

[26] D. Gong, B. Xu, and Y. Zhang, "A similarity-based cooperative co-evolutionary algorithm for dynamic interval multi-objective optimization problems," IEEE Transactions on Evolutionary Computation, 2019.

[27] M. Rong, D. Gong, and W. Pedrycz, "A multi-model prediction method for dynamic multi-objective evolutionary optimization," IEEE Transactions on Evolutionary Computation, vol. 24, no. 2, pp. 290–304, 2020.

[28] B. Xu, Y. Zhang, D. Gong, Y. Guo, and M. Rong, "Environment sensitivity-based cooperative co-evolutionary algorithms for dynamic multi-objective optimization," IEEE/ACM Transactions on Computational Biology and Bioinformatics, vol. 15, no. 6, pp. 1877–1890, 2018.

[29] Y. Guo, H. Yang, M. Chen, J. Cheng, and D. Gong, "Ensemble prediction-based dynamic robust multi-objective optimization methods," Swarm and Evolutionary Computation, vol. 48, pp. 156–171, 2019.

[30] J. J. Liang and P. N. Suganthan, "Dynamic multi-swarm particle swarm optimizer with local search," in Proceedings of the IEEE Congress on Evolutionary Computation, Edinburgh, Scotland, 2005.

[31] Y. Chen, L. Li, and H. Peng, "Dynamic multi-swarm differential learning particle swarm optimizer," Swarm and Evolutionary Computation, vol. 39, pp. 209–221, 2018.

[32] R. Storn, "Differential evolution design of an IIR-filter," in Proceedings of the IEEE International Conference on Evolutionary Computation (ICEC '96), Nagoya, Japan, May 1996.

[33] Z. Liu, X. Ji, and Y. Yang, "Hierarchical differential evolution algorithm combined with multi-cross operation," Expert Systems with Applications, vol. 130, pp. 276–292, 2019.

[34] H. Peng, Z. Guo, and C. Deng, "Enhancing differential evolution with random neighbors based strategy," Journal of Computational Science, vol. 26, pp. 501–511, 2018.

[35] J. Sun, W. Fang, X. Wu, V. Palade, and W. Xu, "Quantum-behaved particle swarm optimization: analysis of individual particle behavior and parameter selection," Evolutionary Computation, vol. 20, no. 3, pp. 349–393, 2012.


[36] B. Chen, H. Lei, H. Shen, Y. Liu, and Y. Lu, "A hybrid quantum-based PIO algorithm for global numerical optimization," Science China Information Sciences, vol. 62, no. 7, 2019.

[37] X. Yao, Y. Liu, and G. Lin, "Evolutionary programming made faster," IEEE Transactions on Evolutionary Computation, vol. 3, no. 2, pp. 82–102, 1999.

[38] S. Mirjalili, S. M. Mirjalili, and A. Lewis, "Grey wolf optimizer," Advances in Engineering Software, vol. 69, pp. 46–61, 2014.

[39] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[40] S. Mirjalili, "SCA: a sine cosine algorithm for solving optimization problems," Knowledge-Based Systems, vol. 96, pp. 120–133, 2016.

[41] S. Saremi, S. Mirjalili, and A. Lewis, "Grasshopper optimisation algorithm: theory and application," Advances in Engineering Software, vol. 105, pp. 30–47, 2017.

[42] J. Xue and B. Shen, "A novel swarm intelligence optimization approach: sparrow search algorithm," Systems Science & Control Engineering, vol. 8, no. 1, pp. 22–34, 2020.

[43] J. Bai, K. Xia, Y. Lin, and P. Wu, "Attribute reduction based on consistent covering rough set and its application," Complexity, vol. 2017, Article ID 8986917, 9 pages, 2017.
