

Research Article

A Hybrid Approach Integrating Multiple ICEEMDANs, WOA, and RVFL Networks for Economic and Financial Time Series Forecasting

Jiang Wu, Tengfei Zhou, and Taiyong Li

School of Economic Information Engineering, Southwestern University of Finance and Economics, Chengdu 611130, China

Correspondence should be addressed to Taiyong Li; litaiyong@gmail.com

Received 8 May 2020; Revised 11 August 2020; Accepted 6 October 2020; Published 22 October 2020

Academic Editor: Lei Xie

Copyright © 2020 Jiang Wu et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

The fluctuations of economic and financial time series are influenced by various kinds of factors and usually demonstrate strong nonstationarity and high complexity. Therefore, accurately forecasting economic and financial time series is always a challenging research topic. In this study, a novel multidecomposition and self-optimizing hybrid approach integrating multiple improved complete ensemble empirical mode decompositions with adaptive noise (ICEEMDANs), the whale optimization algorithm (WOA), and random vector functional link (RVFL) neural networks, namely MICEEMDAN-WOA-RVFL, is developed to predict economic and financial time series. First, we employ ICEEMDAN with random parameters to separate the original time series into a group of comparatively simple subseries multiple times. Second, we construct RVFL networks to forecast each subseries individually. Considering the complex parameter settings of RVFL networks, we utilize WOA to search for the optimal parameters of the RVFL networks simultaneously. Then, we aggregate the prediction results of the individual decomposed subseries as the prediction results of each decomposition, respectively, and finally integrate the prediction results of all the decompositions as the final ensemble prediction results. The proposed MICEEMDAN-WOA-RVFL remarkably outperforms the compared single and ensemble benchmark models in terms of forecasting accuracy and stability, as demonstrated by experiments conducted using various economic and financial time series, including West Texas Intermediate (WTI) crude oil prices, the US dollar/Euro foreign exchange rate (USD/EUR), US industrial production (IP), and the Shanghai stock exchange composite index (SSEC).

1. Introduction

Economic and financial time series, such as price movements, stock market indices, and exchange rates, are usually characterized by strong nonlinearity and high complexity, since they are influenced by a number of extrinsic and intrinsic factors including economic conditions, political events, and even sudden crises [1, 2]. Economic and financial time series forecasting always plays a vital role in social and economic development and is of great economic importance to both individuals and countries. Therefore, economic and financial time series forecasting is always a very active research area.

In extant research, various forecasting methods have been proposed to forecast various economic and financial time series. These forecasting methods mainly include statistical and artificial intelligence (AI) approaches. The frequently used statistical approaches for economic and financial time series forecasting include the error correction model (ECM) [3], hidden Markov model (HMM) [4], random walk (RW) model [5], autoregressive moving average (ARMA) model [6], autoregressive integrated moving average (ARIMA) model [7], and generalized autoregressive conditional heteroskedasticity (GARCH) model [8, 9]. Lanza et al. forecasted the series of crude oil prices in two distinct areas using the ECM [3]. Hassan and Nath developed the HMM approach for forecasting stock prices for interrelated markets [4]. Kilian and Taylor analyzed the advantage of RW in exchange rate forecasting [5]. Rout et al. integrated ARMA with differential evolution (DE) to develop a hybrid model for exchange rate forecasting [6]. Mondal et al. conducted a study on the effectiveness of the ARIMA model on the forecasting of 56 Indian stocks from different sectors [7]. Alberg et al. conducted a comprehensive analysis of stock indices using various GARCH models, and the experimental results showed that the asymmetric GARCH model enhanced the overall prediction performance [9].

Hindawi Complexity, Volume 2020, Article ID 9318308, 17 pages. https://doi.org/10.1155/2020/9318308

Since most economic and financial time series involve the complex characteristics of strong nonlinearity and nonstationarity, it is difficult to obtain satisfactory forecasting accuracy with statistical approaches. Hence, various AI approaches were proposed for economic and financial time series forecasting. These AI forecasting approaches include the artificial neural network (ANN) [10, 11], support vector machine (SVM) [12, 13], extreme learning machine (ELM) [14], random vector functional link (RVFL) neural network [15], and recurrent neural network (RNN) [16]. Pradhan and Kumar utilized an ANN to forecast the foreign exchange rate in India, and the experimental results indicated that the ANN could effectively forecast the exchange rate [10]. Das and Padhy forecasted the commodity futures contract index using the SVM, and the empirical analysis showed that the proposed model was effective and achieved satisfactory prediction performance [12]. Li et al. made stock price predictions using the ELM, and the comparison results showed that the ELM with radial basis function (RBF) kernels achieved better prediction performance with faster speed than back propagation neural networks (BPNNs) [14]. Moudiki et al. employed quasirandomized functional link networks for various time series forecasting tasks, and the proposed approach could generate more robust prediction results [15]. Baek and Kim employed long short-term memory (LSTM) for stock index forecasting, and the results confirmed that the LSTM model had excellent prediction accuracy [16].

In order to effectively improve prediction accuracy, various hybrid forecasting models were designed for economic and financial time series forecasting. Babu and Reddy combined ARIMA and nonlinear ANN models to develop a novel hybrid ARIMA-ANN model, and the experiments on electricity price and stock index data indicated that the proposed ARIMA-ANN had higher prediction accuracy [17]. Kumar and Thenmozhi compared three different hybrid models for the forecasting of stock index returns and concluded that the ARIMA-SVM model could obtain the highest prediction accuracy [18]. Hsu built a hybrid model based on a back propagation neural network (BPNN) and genetic programming (GP) for stock/futures price forecasting, and the empirical analysis showed that the proposed hybrid model could effectively improve prediction accuracy [19]. These hybrid models are able to fully exploit the potential of single models and thus obtain better prediction accuracy than single models.

Due to the complexity of the original economic and financial time series, forecasting on the original time series can hardly obtain satisfactory prediction accuracy. To reduce the complexity of the original time series, a framework of "decomposition and ensemble" is widely utilized in the field of time series forecasting. The framework includes three stages: decomposition, forecasting, and ensemble. The original time series is first separated into a sum of subseries; then a prediction model is used to forecast each subseries; and finally the predictions of all the subseries are aggregated as the final prediction results. Decomposition, as the first step, is very important for enhancing the performance of the ensemble model. The widely used decomposition approaches include wavelet decomposition (WD), variational mode decomposition (VMD), and empirical mode decomposition (EMD) class methods. Lahmiri combined VMD with a general regression neural network (GRNN) to develop a novel ensemble forecasting model, and the experimental results suggested that VMD outperformed EMD for the prediction of economic and financial time series [20]. Kao et al. integrated WD, support vector regression (SVR), and multivariate adaptive regression splines (MARS) to develop an ensemble forecasting model to forecast stock prices, and the proposed model obtained better prediction accuracy [21]. In the second stage of the framework of "decomposition and ensemble", various optimization approaches were introduced to enhance the performance of predictors. Li et al. proposed a ridge regression (RR) with DE to forecast crude oil prices and obtained excellent forecasting accuracy [22]. Bagheri et al. introduced quantum-behaved particle swarm optimization (QPSO) to tune the adaptive network-based fuzzy inference system (ANFIS) for financial time series forecasting [23]. Wang et al. employed the brain storm optimization (BSO) algorithm to optimize SVR, and the results indicated that the developed approach was effective in stock market analysis [24].

In the "decomposition and ensemble" framework, the decomposition and prediction approaches greatly influence the final prediction results. Considering the powerful decomposition ability of ICEEMDAN, the excellent search efficiency of WOA, and the accurate forecasting ability of the RVFL network, we develop a novel ensemble prediction model integrating multiple ICEEMDANs, WOA, and the RVFL network, namely MICEEMDAN-WOA-RVFL, for economic and financial time series forecasting. Firstly, ICEEMDAN with random parameters is utilized to divide the original economic and financial time series into a sum of subseries. Secondly, the RVFL network is applied to forecast each decomposed subseries individually, and WOA is used to optimize the parameter values of the RVFL network simultaneously. Finally, the predictions of all individual subseries are aggregated as the prediction values of one process of decomposition and ensemble. From our observations, we find that the decomposition in the first stage suffers from uncertainties with considerable randomness, which can lead to differences and instability in the prediction results. In addition, extensive literature has shown that combining multiple forecasts can effectively enhance prediction accuracy [25, 26]. Therefore, we randomize the decomposition parameter values of ICEEMDAN in the first stage, repeat the abovementioned processes multiple times, and integrate the results of the multiple decompositions and ensembles as the final prediction values. We expect that the multiple decomposition strategy can reduce the randomness of a single decomposition and further improve the ensemble prediction stability and accuracy.


The main contributions of this paper are as follows. (1) We propose a new multidecomposition and self-optimizing ensemble prediction model integrating multiple ICEEMDANs, WOA, and RVFL networks for economic and financial time series forecasting. As far as we know, this is the first time that this combination has been developed for economic and financial time series forecasting. (2) To further enhance forecasting accuracy and stability, we utilize multiple differentiated ICEEMDANs to decompose the original economic and financial time series and finally ensemble the predictions of all decompositions as the final predictions. (3) WOA is introduced for the first time to optimize the various parameters of RVFL networks. (4) The empirical results on four different types of economic and financial time series show that our proposed MICEEMDAN-WOA-RVFL significantly enhances prediction performance in terms of forecasting accuracy and stability.

The novelty of the proposed MICEEMDAN-WOA-RVFL is threefold: (1) a novel hybrid model integrating multiple ICEEMDANs, WOA, and RVFL networks is designed for economic and financial time series forecasting; (2) the multiple decomposition strategy is proposed for the first time to overcome the randomness of a single decomposition and to improve prediction accuracy and stability; and (3) WOA is applied for the first time to optimizing RVFL networks to improve the performance of individual forecasting.

The remainder of the paper is organized as follows. Section 2 offers a brief introduction to ICEEMDAN, WOA, and the RVFL network. Section 3 provides the architecture and the detailed implementation of the proposed MICEEMDAN-WOA-RVFL. Section 4 analyzes the empirical results on various economic and financial time series. Section 5 discusses some details of the developed prediction model, and Section 6 concludes this paper.

2. Preliminaries

2.1. Improved Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (ICEEMDAN). Empirical mode decomposition (EMD), an adaptive time-frequency analysis approach for nonstationary signals, was designed by Huang et al. [27]. EMD separates an original time series into a sum of "intrinsic mode functions" (IMFs) and one residue, and thus it can simplify time series analysis. Due to some drawbacks of EMD, such as mode mixing, EEMD [28] and CEEMDAN [29] have been proposed to improve decomposition performance and have been applied in various fields [30–32]. In spite of that, these decomposition methods still have some remaining problems. To solve them, an improved CEEMDAN (ICEEMDAN) was developed by Colominas et al. [33].

Let E_k(·) be the operator that generates the kth mode using EMD, M(·) be the operator that generates the local mean of a series, ⟨·⟩ denote averaging over realizations, and w^(i) be a realization of zero-mean, unit-variance noise. When x is the original signal, the detailed decomposition process of ICEEMDAN is as follows:

(i) Step 1: employ EMD to compute the local means of I realizations x^(i) = x + β₀E₁(w^(i)) to obtain the first residue r₁ = ⟨M(x^(i))⟩, with β₀ > 0.

(ii) Step 2: calculate the first mode, IMF₁ = x − r₁.

(iii) Step 3: calculate the average of the local means of the realizations as the second residue, r₂ = ⟨M(r₁ + β₁E₂(w^(i)))⟩, and the second mode, IMF₂ = r₁ − r₂.

(iv) Step 4: compute the kth residue for k = 3, …, K: r_k = ⟨M(r_{k−1} + β_{k−1}E_k(w^(i)))⟩, and the kth mode, IMF_k = r_{k−1} − r_k.

(v) Step 5: go to Step 4 for the next k.

Since ICEEMDAN can effectively decompose the original time series, it has been frequently introduced into various time series forecasting tasks [34–36]. In our study, we employ ICEEMDAN to separate the original economic and financial time series into a sum of simpler subseries for subsequent forecasting.
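The residue recursion in Steps 1–5 can be sketched in a few lines of Python. This is a structural illustration only, not the paper's Matlab implementation: a moving average stands in for EMD's local-mean operator M(·), and raw Gaussian noise stands in for the EMD modes E_k(w^(i)) of the noise; the function and parameter names are hypothetical. What the sketch does show faithfully is the telescoping property of the recursion: the extracted subseries and the final residue sum back exactly to the original signal.

```python
import numpy as np

def local_mean(s, w=11):
    # Stand-in for the EMD local-mean operator M(.): a simple moving average.
    # (Real ICEEMDAN uses the mean of the upper and lower envelopes.)
    return np.convolve(s, np.ones(w) / w, mode="same")

def iceemdan_sketch(x, K=4, I=8, beta=0.2, seed=0):
    # Structural sketch of the ICEEMDAN recursion (Steps 1-5).
    rng = np.random.default_rng(seed)
    # Step 1: average the local means of I noise-assisted realizations.
    r = np.mean([local_mean(x + beta * rng.standard_normal(x.size))
                 for _ in range(I)], axis=0)
    imfs = [x - r]                        # Step 2: IMF1 = x - r1
    for _ in range(2, K + 1):             # Steps 3-5: r_k = <M(r_{k-1} + noise)>
        r_next = np.mean([local_mean(r + beta * rng.standard_normal(x.size))
                          for _ in range(I)], axis=0)
        imfs.append(r - r_next)           # IMF_k = r_{k-1} - r_k
        r = r_next
    return imfs, r

t = np.linspace(0, 1, 512)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * t
imfs, residue = iceemdan_sketch(x)
# The telescoping sum x = IMF1 + ... + IMF_K + residue holds by construction.
print(np.allclose(np.sum(imfs, axis=0) + residue, x))  # True
```

Because each IMF is defined as the difference of consecutive residues, the reconstruction is exact regardless of how the local mean is approximated, which is why the "decomposition and ensemble" framework can sum subseries predictions back into a forecast of the original series.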

2.2. Whale Optimization Algorithm (WOA). The whale optimization algorithm (WOA) is an optimization method that outperforms particle swarm optimization (PSO), ant colony optimization (ACO), the gravitational search algorithm (GSA), and fast evolutionary programming (FEP) in optimization performance [37, 38]. Simulating the hunting process of whales, WOA includes three main operators: encircling prey, bubble-net foraging, and search for prey. In each iteration, individuals update their positions toward the best individual of the last iteration, which can be formulated as follows:

P(t + 1) = P_best(t) − A · |C · P_best(t) − P(t)|,  (1)

where t represents the tth iteration, A and C are two coefficient vectors, P_best is the best individual found so far, P is the position of an individual, |·| represents the absolute value, and · indicates element-by-element multiplication.

In the exploitation (bubble-net foraging) phase, the individual position is updated based on its distance to the best individual by simulating the helix-shaped movement of whales, which is formulated as follows:

P(t + 1) = D · e^(bl) · cos(2πl) + P_best(t),  (2)

where D = |P_best(t) − P(t)| represents the distance between the best individual obtained so far and the ith individual, l is a random number in [−1, 1], and b is a constant used to define the shape of the logarithmic spiral.

In the exploration (search for prey) phase, the individual position is updated using a randomly selected individual. The mathematical model is as follows:

P(t + 1) = P_rand − A · |C · P_rand − P(t)|,  (3)

where P_rand represents a randomly selected individual. The detailed flowchart of WOA is illustrated in Figure 1,

where p is a random number in [0, 1]. Due to its very competitive search ability, WOA has been widely applied in various fields [39–41]. Therefore, we take advantage of the effective search ability of WOA to seek the optimal parameters for RVFL networks.
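A minimal WOA following Eqs. (1)–(3) and the branching in Figure 1 (Eq. (2) when p ≥ 0.5; otherwise Eq. (1) when |A| < 1 and Eq. (3) when |A| ≥ 1) can be sketched as below. This is an illustrative Python sketch, not the paper's implementation: the linear decay of a from 2 to 0, the spiral constant b = 1, and the sphere test function are conventional assumptions not stated in this section.

```python
import numpy as np

def woa(fitness, dim, n_whales=20, n_iter=200, lb=-5.0, ub=5.0, b=1.0, seed=0):
    """Minimal whale optimization algorithm following Eqs. (1)-(3)."""
    rng = np.random.default_rng(seed)
    P = rng.uniform(lb, ub, (n_whales, dim))      # whale positions
    best = min(P, key=fitness).copy()             # best individual P_best
    for t in range(n_iter):
        a = 2 * (1 - t / n_iter)                  # a decays linearly from 2 to 0
        for i in range(n_whales):
            p, l = rng.random(), rng.uniform(-1, 1)
            A = 2 * a * rng.random(dim) - a       # coefficient vector A
            C = 2 * rng.random(dim)               # coefficient vector C
            if p >= 0.5:                          # bubble-net spiral, Eq. (2)
                D = np.abs(best - P[i])
                P[i] = D * np.exp(b * l) * np.cos(2 * np.pi * l) + best
            elif np.linalg.norm(A) < 1:           # encircling prey, Eq. (1)
                P[i] = best - A * np.abs(C * best - P[i])
            else:                                 # search for prey, Eq. (3)
                rand = P[rng.integers(n_whales)]
                P[i] = rand - A * np.abs(C * rand - P[i])
            P[i] = np.clip(P[i], lb, ub)
        cand = min(P, key=fitness)
        if fitness(cand) < fitness(best):         # keep the best individual
            best = cand.copy()
    return best

sphere = lambda x: float(np.sum(x ** 2))          # toy objective, minimum at 0
best = woa(sphere, dim=3)
print(round(sphere(best), 6))
```

In the proposed model the fitness function would instead be the training RMSE of an RVFL network, with the whale positions encoding the RVFL parameter values.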


2.3. Random Vector Functional Link (RVFL) Neural Network. As a modification of the multilayer perceptron (MLP) model, the random vector functional link (RVFL) neural network was proposed by Pao et al. [42]. The RVFL neural network is able to overcome the slow convergence, overfitting, and local minima inherent in traditional gradient-based learning algorithms. Like the MLP, the architecture of the RVFL neural network includes three layers, as illustrated in Figure 2. The main modification in the RVFL network lies in the connections in the network structure. Since the RVFL network has direct connections from the input layer to the output layer, it can perform better than a network without direct links [43, 44].

The neurons in the hidden layer, known as enhancement nodes, calculate the sum of all the output of the input layer neurons and obtain their output with an activation function:

h_m = g( Σ_{n=1}^{N} w_nm · i_n + b_m ),  (4)

where w_nm denotes the weight between i_n and h_m, b_m represents the bias of the mth neuron in the hidden layer, and g(·) represents an activation function.

The output layer neurons integrate all the output from the hidden layer and input layer neurons, and the final output is

o_l = Σ_{m=1}^{M} w_ml · h_m + Σ_{n=1}^{N} w_nl · i_n,  (5)

where w_ml represents the weight between h_m and o_l and w_nl indicates the weight between i_n and o_l.

To enhance the training efficiency, the RVFL neural network utilizes a given distribution to fix the values of w_nm and b_m and obtains the weights w_ml and w_nl by minimizing the system error

E = (1/(2P)) Σ_{j=1}^{P} ( t^(j) − B d^(j) )²,  (6)

where P indicates the number of training samples, t^(j) are the target values, B is the combination of w_ml and w_nl, and d^(j) represents the combined vector of hidden and input layer outputs.

The RVFL neural network has demonstrated an extremely efficient and fast forecasting ability and has been frequently used in time series forecasting [44, 45].
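The RVFL construction in Eqs. (4)–(6) can be sketched compactly: random, fixed hidden weights and biases; enhancement nodes with an activation; a combined feature vector d of hidden outputs and direct input links; and output weights B solved in closed form by least squares. This is a minimal illustrative sketch (the tanh activation, uniform weight distribution, and all names are assumptions, not the paper's settings).

```python
import numpy as np

def rvfl_fit(X, T, M=50, seed=0):
    """Minimal RVFL: fixed random hidden layer (Eq. (4)); output weights B
    acting on d = [hidden, input] solved by least squares (Eqs. (5)-(6))."""
    rng = np.random.default_rng(seed)
    N = X.shape[1]
    W = rng.uniform(-1, 1, (N, M))      # fixed input-to-hidden weights w_nm
    bias = rng.uniform(-1, 1, M)        # fixed hidden biases b_m
    H = np.tanh(X @ W + bias)           # enhancement nodes, Eq. (4)
    D = np.hstack([H, X])               # direct input-to-output links, Eq. (5)
    B, *_ = np.linalg.lstsq(D, T, rcond=None)   # minimize Eq. (6)
    return W, bias, B

def rvfl_predict(X, W, bias, B):
    return np.hstack([np.tanh(X @ W + bias), X]) @ B

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (200, 2))
T = np.sin(X[:, 0]) + X[:, 1] ** 2      # toy nonlinear target
W, bias, B = rvfl_fit(X, T)
pred = rvfl_predict(X, W, bias, B)
print(float(np.sqrt(np.mean((pred - T) ** 2))))   # small training RMSE
```

The closed-form solve is what makes the RVFL fast relative to gradient-based MLP training: only B is learned, while w_nm and b_m stay at their randomly drawn values.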

3. MICEEMDAN-WOA-RVFL: The Proposed Approach for Economic and Financial Time Series Forecasting

Referring to the framework of "decomposition and ensemble", we design a multidecomposition and self-optimizing hybrid model that integrates multiple ICEEMDANs, WOA, and RVFL networks, termed MICEEMDAN-WOA-RVFL, to forecast economic and financial time series. The architecture of the proposed hybrid model is illustrated in Figure 3.

Our proposed MICEEMDAN-WOA-RVFL takes advantage of the idea of "divide and conquer" that is frequently used in time series forecasting, image processing, fault diagnosis, and so on [46–53]. A procedure of "decomposition and ensemble" in MICEEMDAN-WOA-RVFL is as follows:

(i) Stage 1: decomposition. ICEEMDAN is employed to separate the original time series into several subseries (i.e., several IMFs and one residue).

Figure 1: The flowchart of the whale optimization algorithm (WOA). (The population is initialized and the best individual P_best is determined; in each iteration, A, C, p, and l are updated for each individual, which then moves by Eq. (2) when p ≥ 0.5, by Eq. (1) when p < 0.5 and |A| < 1, and by Eq. (3) when p < 0.5 and |A| ≥ 1; P_best is updated until the maximum number of iterations is reached.)


(ii) Stage 2: individual forecasting. Each decomposed subseries is divided into a training dataset and a testing dataset; then an RVFL network with WOA optimization is developed on each training dataset independently, and finally the developed RVFL model is applied to each testing dataset. The reason why we select the RVFL network as the predictor is its powerful forecasting ability demonstrated in extant research [15, 34, 44]. Since the parameter setting of the RVFL network plays an important role in the prediction performance, we introduce WOA to seek the optimal parameter values for the RVFL network in the forecasting stage.

(iii) Stage 3: ensemble. The predictions of all the decomposed subseries are aggregated as the final prediction results of one "decomposition and ensemble" using addition aggregation.

In this study, to enhance both the accuracy and stability of the final prediction, we generate random values for the decomposition parameters of ICEEMDAN in the decomposition stage, including the number of realizations (Nr), the noise standard deviation (Nsd), and the maximum number of sifting iterations (Maxsi), for each decomposition; repeat the procedure of "decomposition and ensemble" M times; and finally combine all the results of the multiple "decompositions and ensembles" using the RMSE-weighted method as the final prediction results. The corresponding weight of the ith forecasting model is as follows:

Weight_i = (1/RMSE_i) / Σ_{j=1}^{M} (1/RMSE_j),  (7)

where M denotes the number of individual models and RMSE_i indicates the RMSE value of the ith forecasting model in the training process.
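Eq. (7) simply gives each decomposition-and-ensemble run a weight proportional to the inverse of its training RMSE, so more accurate runs contribute more to the final forecast and the weights sum to one. A short sketch (the RMSE values and prediction arrays below are hypothetical):

```python
import numpy as np

def rmse_weights(rmses):
    """Eq. (7): Weight_i = (1/RMSE_i) / sum_j (1/RMSE_j)."""
    inv = 1.0 / np.asarray(rmses, dtype=float)
    return inv / inv.sum()

# Hypothetical training RMSEs of M = 3 decomposition-and-ensemble runs.
w = rmse_weights([0.5, 1.0, 2.0])
print(w)                  # the most accurate run gets the largest weight
print(float(w.sum()))     # weights sum to 1

# Final prediction: RMSE-weighted combination of the runs' predictions.
preds = np.array([[10.0, 11.0],     # run 1 predictions for two test points
                  [10.4, 11.2],     # run 2
                  [9.8, 10.6]])     # run 3
final = w @ preds
```

With RMSEs 0.5, 1.0, and 2.0 the weights are 4/7, 2/7, and 1/7, so the inverse-error weighting halves a run's influence each time its training RMSE doubles.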

Figure 3: The flowchart of the proposed MICEEMDAN-WOA-RVFL. (Each of the m ICEEMDANs with randomized parameters decomposes the economic and financial time series into IMF1, …, IMFn and a residue; each subseries is forecasted by a WOA-optimized RVFL network; the subseries predictions are combined by addition into per-decomposition predicted results, which are then ensembled into the final predicted results.)

Figure 2: The architecture of the random vector functional link (RVFL) neural network, with input layer (i_1, …, i_N), hidden layer (h_1, …, h_M), output layer (o_1, …, o_L), and weights w_nm, b_m, w_ml, and w_nl.


Although some recent studies also employ the RVFL network for time series forecasting, they obviously differ from the current study in the decomposition technique and network optimization: (1) they divide the original time series using WD or EMD, and (2) they construct RVFL networks using fixed parameter values. Unlike the previous studies, our study decomposes economic and financial time series using ICEEMDAN and searches for the optimal parameter values of RVFL networks based on WOA. Furthermore, the previous research mainly focuses on dividing the original time series by one single decomposition or dual decomposition [20, 35, 54]. In dual decomposition, the original signal is first decomposed into several components, and then the high-frequency components continue to be decomposed into other components using the same or a different decomposition method. Essentially, the dual decomposition process belongs to one decomposition, just including two different decomposition stages. Unlike the previous research, one main improvement in this study is the multiple decomposition strategy, which can successfully overcome the randomness of one single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach. To our knowledge, it is the first time that the multiple decomposition strategy has been developed for the forecasting of economic and financial time series.

4. Experimental Results

4.1. Data Description. As we know, economic and financial time series are influenced by various factors, sometimes rising and dropping sharply within a short time. The dramatic fluctuations usually lead to significant nonlinearity and nonstationarity of the time series. To comprehensively evaluate the effectiveness of the proposed MICEEMDAN-WOA-RVFL, we choose four different time series, including the West Texas Intermediate crude oil spot price (WTI), US dollar/Euro foreign exchange rate (USD/EUR), US industrial production (IP), and Shanghai stock exchange composite index (SSEC), as the experimental datasets. The first three datasets can be accessed via the website of St. Louis Fed Research [55], and the last one can be obtained via the website of NetEase [56].

Each time series is separated into two subdatasets: the first 80% for training and the last 20% for testing. Table 1 shows the divided samples of the abovementioned four economic and financial time series.
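The 80/20 split is chronological, not shuffled, so the test set lies strictly after the training set in time. A minimal sketch (the helper name is ours; the sample count is the WTI size from Table 1, which reproduces the 6912/1729 training/testing split reported there):

```python
import numpy as np

def chrono_split(series, train_frac=0.8):
    """Chronological split: the first 80% of observations form the training
    set and the last 20% the testing set (no shuffling)."""
    cut = int(len(series) * train_frac)
    return series[:cut], series[cut:]

wti = np.arange(8641)              # stand-in for the 8641 WTI observations
train, test = chrono_split(wti)
print(len(train), len(test))       # 6912 1729, matching Table 1
```

Keeping the split chronological avoids look-ahead bias: none of the future test-period values can leak into the training of the RVFL networks.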

We utilize ICEEMDAN to decompose these time series into groups of relatively simple subseries. Figure 4 offers an example of the decomposition of the WTI dataset using ICEEMDAN.

4.2. Evaluation Indices. In this study, we use four evaluation metrics, including the mean absolute percent error (MAPE), the root mean squared error (RMSE), the directional statistic (Dstat), and the Diebold–Mariano (DM) test, to assess the performance of the proposed model. Among them, MAPE and RMSE are used to assess the forecasting error, defined as follows:

MAPE = (1/N) Σ_{t=1}^{N} |(O_t − P_t)/O_t| × 100,  (8)

RMSE = √( (1/N) Σ_{t=1}^{N} (O_t − P_t)² ),  (9)

where N is the size of the evaluated samples, O_t denotes the actual values, and P_t represents the predicted values at time t. The lower the values of RMSE and MAPE, the better the prediction model.

The Dstat indicates the performance of direction prediction, which is formulated as follows:

Dstat = (1/N) Σ_{t=1}^{N} d_t × 100,  (10)

where d_t = 1 if (P_{t+1} − O_t)(O_{t+1} − O_t) ≥ 0 and d_t = 0 otherwise. A higher value of Dstat indicates a more accurate direction prediction.
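Eqs. (8)–(10) translate directly into code. A small sketch with toy actual and predicted values (the arrays are hypothetical, chosen only to exercise the three metrics):

```python
import numpy as np

def mape(o, p):
    """Eq. (8): mean absolute percent error."""
    return float(np.mean(np.abs((o - p) / o)) * 100)

def rmse(o, p):
    """Eq. (9): root mean squared error."""
    return float(np.sqrt(np.mean((o - p) ** 2)))

def dstat(o, p):
    """Eq. (10): percentage of correctly predicted directions.
    d_t = 1 when (P_{t+1} - O_t)(O_{t+1} - O_t) >= 0, i.e. the forecast
    moves in the same direction as the actual series."""
    hits = (p[1:] - o[:-1]) * (o[1:] - o[:-1]) >= 0
    return float(np.mean(hits) * 100)

o = np.array([100.0, 102.0, 101.0, 103.0])   # actual values O_t
p = np.array([101.0, 101.5, 101.5, 102.0])   # predicted values P_t
print(round(mape(o, p), 3), round(rmse(o, p), 3), round(dstat(o, p), 1))
```

Note that Dstat compares each forecast against the previous actual value, so a model can score well on MAPE and RMSE yet poorly on Dstat (or vice versa), which is why the paper reports all three.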

Furthermore, to test the significance of the differences in prediction performance between pairs of models, we employ the Diebold–Mariano (DM) test in this study.

4.3. Experimental Settings. In this study, we compare the proposed MICEEMDAN-WOA-RVFL with several state-of-the-art forecasting models, including single models and ensemble models. Among all these models, the single models include one popular statistical model, RW, and two popular AI models, BPNN and least squares SVR (LSSVR). The ensemble models derive from the combination of the single models and the decomposition method ICEEMDAN.

The detailed parameters of all the prediction models, the decomposition approach ICEEMDAN, and the optimization method WOA used in the experiments are shown in Table 2. The parameter values of BPNN, LSSVR, RVFL, and ICEEMDAN refer to the previous literature [22, 34, 45].

All experiments were conducted using Matlab R2019b on a PC with 64-bit Microsoft Windows 10, 8 GB RAM, and a 1.8 GHz i7-8565U CPU.

4.4. Results and Analysis. We compare the forecasting performance of six prediction models, including three single models (RW, LSSVR, and BPNN) and three ensemble models (ICEEMDAN-RW, ICEEMDAN-LSSVR, and ICEEMDAN-BPNN), with that of our proposed MICEEMDAN-WOA-RVFL in terms of MAPE, RMSE, and Dstat. Due to the different horizons, we train different forecasting models separately; that is to say, we use the proposed scheme for each horizon to train a different model. Tables 3–5 report the experimental results in terms of each evaluation index with 1-, 3-, and 6-horizon, respectively.

From Table 3, we can see that the proposed MICEEMDAN-WOA-RVFL obtains the lowest (best) MAPE values with all the horizons in all the datasets. Among all the compared single models, RW obtains the best MAPE values with all the horizons in all the datasets,


Figure 4: The WTI crude oil price series and the corresponding decomposed subseries (IMF1–IMF11 and one residue) by ICEEMDAN, 2 January 1986 to 15 April 2020.

Table 1: Samples of economic and financial time series.

Dataset  Type     Set           Size  Date range
WTI      Daily    Sample set    8641  2 January 1986 – 15 April 2020
                  Training set  6912  2 January 1986 – 24 May 2013
                  Testing set   1729  28 May 2013 – 15 April 2020
USDEUR   Daily    Sample set    5341  4 January 1999 – 10 April 2020
                  Training set  4272  4 January 1999 – 30 December 2015
                  Testing set   1069  31 December 2015 – 10 April 2020
IP       Monthly  Sample set    1215  January 1919 – March 2020
                  Training set   972  January 1919 – December 1999
                  Testing set    243  January 2000 – March 2020
SSEC     Daily    Sample set    7172  19 December 1990 – 21 April 2020
                  Training set  5737  19 December 1990 – 4 June 2014
                  Testing set   1435  5 June 2014 – 21 April 2020


demonstrating that it is better than LSSVR and BPNN for forecasting economic and financial time series. Of all the ensemble models, the ICEEMDAN-LSSVR and ICEEMDAN-BPNN models obtain close MAPE values, obviously better than those of ICEEMDAN-RW.

The RMSE values for the four time series datasets are listed in Table 4. From this table, we can find that the proposed MICEEMDAN-WOA-RVFL outperforms all the single and ensemble models with all the horizons in all the datasets. The statistical model RW obtains the best RMSE values in 10 out of 12 cases, demonstrating that it is more suitable for economic and financial time series forecasting than LSSVR and BPNN among all the single models. As to the ensemble models, the proposed MICEEMDAN-WOA-RVFL obtains lower RMSE values than the compared ensemble models,

demonstrating that the former is more effective for economic and financial time series forecasting.

Table 5 shows the directional statistic Dstat, and we can see that MICEEMDAN-WOA-RVFL achieves the highest Dstat values in all 12 cases, indicating that it has better performance in direction forecasting. Amongst the single prediction models, LSSVR and RW each obtain the best Dstat values in 5 cases, better than BPNN. Similarly, the ICEEMDAN-LSSVR and ICEEMDAN-BPNN models obtain close Dstat values, obviously better than the ICEEMDAN-RW model in all 12 cases.

From all the prediction results, we can find that all the ensemble prediction models except ICEEMDAN-RW greatly outperform the corresponding single prediction models in all 12 cases, showing that the framework of decomposition and ensemble is an effective tool for improving the forecasting

Table 2: The settings for the parameters.

Method               Parameter  Value                                            Description
ICEEMDAN             Nsd        0.2                                              Noise standard deviation
                     Nr         100                                              Number of realizations
                     Maxsi      5000                                             Maximum number of sifting iterations
LSSVR                Rp         {2^-10, 2^-9, ..., 2^11, 2^12}                   Regularization parameter
                     WidRBF     {2^-10, 2^-9, ..., 2^11, 2^12}                   Width of the RBF kernel
BPNN                 Nhe        10                                               Number of hidden neurons
                     Maxte      1000                                             Maximum training epochs
                     Lr         0.0001                                           Learning rate
WOA                  Pop        40                                               Population size
                     Maxgen     100                                              Maximum generation
MICEEMDAN-WOA-RVFL   Nsd        [0.01, 0.4]                                      Noise standard deviation in ICEEMDAN
                     Nr         [50, 500]                                        Number of realizations in ICEEMDAN
                     Maxsi      [2000, 8000]                                     Maximum number of sifting iterations in ICEEMDAN
                     Nhe        [5, 30]                                          Number of hidden neurons in RVFL
                     Func       {sigmoid, sine, hardlim, tribas, radbas, sign}   Activation function in RVFL
                     Mod        {1: regularized least squares, 2: Moore–Penrose pseudoinverse}   Mode in RVFL
                     Lag        [3, 20]                                          Lag in RVFL
                     Bias       {true, false}                                    Bias in RVFL
                     Rand       {1: Gaussian, 2: Uniform}                        Random type in RVFL
                     Scale      [0.1, 1]                                         Scale value in RVFL
                     ScaleMode  {1: scale the features for all neurons, 2: scale the features for each hidden neuron, 3: scale the range of the randomization for uniform distribution}   Scale mode in RVFL

Table 3: The mean absolute percent error (MAPE) values of different prediction models.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  RW      LSSVR   BPNN    ICEEMDAN-RW  ICEEMDAN-LSSVR  ICEEMDAN-BPNN
WTI      1        0.0036              0.0176  0.0177  0.0182  0.0200       0.0044          0.0055
         3        0.0080              0.0307  0.0309  0.0320  0.0331       0.0092          0.0094
         6        0.0113              0.0441  0.0449  0.0457  0.0458       0.1202          0.0130
USDEUR   1        0.0006              0.0034  0.0034  0.0034  0.0044       0.0010          0.0010
         3        0.0015              0.0063  0.0063  0.0063  0.0071       0.0020          0.0021
         6        0.0022              0.0087  0.0088  0.0087  0.0094       0.0032          0.0031
IP       1        0.0012              0.0049  0.0057  0.0056  0.0053       0.0024          0.0022
         3        0.0023              0.0102  0.0140  0.0123  0.0104       0.0033          0.0033
         6        0.0032              0.0183  0.0279  0.0235  0.0184       0.0043          0.0049
SSEC     1        0.0020              0.0096  0.0096  0.0097  0.0111       0.0026          0.0026
         3        0.0044              0.0178  0.0178  0.0179  0.0185       0.0047          0.0051
         6        0.0065              0.0259  0.0259  0.0269  0.0268       0.0072          0.0074

performance. Of all the compared models, the proposed MICEEMDAN-WOA-RVFL obtains the highest Dstat values and the lowest MAPE and RMSE values on all the time series datasets, showing that it is completely superior to the benchmark prediction models. Furthermore, for each prediction model, the MAPE and RMSE values increase while the Dstat values decrease with the horizon. This demonstrates that it is easier to forecast time series with a short horizon than with a long one. It is worth noting that the proposed MICEEMDAN-WOA-RVFL still achieves relatively good MAPE and RMSE values among the compared models as the horizon increases. For example, while the proposed model obtains a Dstat of 0.7737 with horizon 6 on the WTI dataset, it achieves relatively low MAPE (0.0113) and RMSE (0.8146) values, indicating that the predicted values are very close to the real values although the proposed model misses the direction about 22.63% of the time. In other words, the proposed MICEEMDAN-WOA-RVFL can still achieve satisfactory forecasting accuracy with a long horizon.

Furthermore, we can find that the multiple decomposition strategy does not improve the forecasts of the RW model. One possible explanation is simply that RW assumes that the past movement or trend of a time series cannot be used to predict its future movement; thus, it cannot take advantage of the historical data and the diversity of multiple decompositions to make future predictions. Therefore, when we aggregate the multiple RW model predictions of all the decomposed subseries, we just integrate several random predictions and thus cannot significantly improve the ensemble prediction results. In contrast, LSSVR and

BPNN, as well as RVFL, can fully use all the historical data and the diversity of multiple decompositions to make future predictions. Specifically, multiple decompositions using different parameters generate many groups of different decomposed subseries, and this diversity of decompositions can successfully overcome the randomness of a single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach.

In addition, the Diebold–Mariano (DM) test is utilized to evaluate whether the forecasting accuracy of the proposed MICEEMDAN-WOA-RVFL significantly outperforms those of the other compared models. Table 6 shows the statistics and p values (in brackets).

On one hand, the DM statistics between the ensemble prediction models and their corresponding single predictors are much lower than zero, and the corresponding p values are almost equal to zero with all the horizons, except for the RW model, showing that the architecture of "decomposition and ensemble" contributes to greatly improving prediction accuracy and that the combination of ICEEMDAN and AI predictors is more effective for economic and financial time series forecasting.

On the other hand, the DM test results on the predictions for all four time series datasets indicate that MICEEMDAN-WOA-RVFL is significantly better than the single models and the other ensemble models with all the horizons, and the corresponding p values are much lower than 0.01 in all the cases.

In summary, the DM test results demonstrate that the combination of multiple ICEEMDANs, RVFL networks, and

Table 4: The root mean squared error (RMSE) values of different prediction models.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  RW        LSSVR     BPNN      ICEEMDAN-RW  ICEEMDAN-LSSVR  ICEEMDAN-BPNN
WTI      1        0.2715              1.3196    1.3228    1.3744    1.8134       0.3467          0.4078
         3        0.5953              2.1784    2.1929    2.2867    2.5041       0.6754          0.7620
         6        0.8146              3.0737    3.1271    3.1845    3.2348       0.8692          0.9462
USDEUR   1        0.0009              0.0051    0.0052    0.0052    0.0099       0.0015          0.0016
         3        0.0022              0.0091    0.0092    0.0092    0.0129       0.0031          0.0031
         6        0.0033              0.0127    0.0128    0.0127    0.0154       0.0047          0.0046
IP       1        0.2114              0.7528    0.8441    0.8230    0.8205       0.3861          0.4175
         3        0.3875              1.4059    1.8553    1.8311    1.4401       0.5533          0.5294
         6        0.5340              2.4459    3.6015    2.8116    2.4569       0.6408          0.8264
SSEC     1        10.8115             50.0742   49.8800   50.8431   63.3951      12.6460         13.7928
         3        22.2983             89.1834   88.5290   89.1781   93.2887      25.2526         29.7784
         6        35.7201             131.6386  131.3487  133.5647  137.2598     40.4357         49.6341

Table 5: The directional statistic (Dstat) values of different prediction models.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  RW      LSSVR   BPNN    ICEEMDAN-RW  ICEEMDAN-LSSVR  ICEEMDAN-BPNN
WTI      1        0.9381              0.4815  0.5243  0.5191  0.4977       0.9190          0.9097
         3        0.8576              0.4873  0.5087  0.5012  0.4907       0.8300          0.8449
         6        0.7737              0.5116  0.4902  0.4948  0.5185       0.7714          0.7575
USDEUR   1        0.9410              0.5019  0.4719  0.4897  0.5056       0.8998          0.9026
         3        0.8502              0.4888  0.4897  0.4925  0.4916       0.8118          0.8024
         6        0.7828              0.4991  0.5318  0.5253  0.5047       0.7023          0.7154
IP       1        0.9256              0.5579  0.5909  0.5785  0.5620       0.8636          0.8760
         3        0.8802              0.6983  0.5455  0.4876  0.6529       0.8430          0.8058
         6        0.8141              0.6198  0.5537  0.5661  0.6405       0.7355          0.7645
SSEC     1        0.9156              0.4944  0.5049  0.5098  0.4979       0.9024          0.9002
         3        0.8396              0.5042  0.5021  0.5007  0.4965       0.8222          0.8145
         6        0.7587              0.4833  0.5063  0.4965  0.4805       0.7448          0.7455


Table 6: Results of the Diebold–Mariano (DM) test. Each entry is the DM statistic of the tested model against a benchmark, with the p value in brackets; a negative statistic favors the tested model.

WTI, horizon 1:
MICEEMDAN-WOA-RVFL: vs. ICEEMDAN-LSSVR -4.9081 (≤0.0001); vs. ICEEMDAN-BPNN -7.0030 (≤0.0001); vs. ICEEMDAN-RW -6.1452 (≤0.0001); vs. LSSVR -10.7240 (≤0.0001); vs. BPNN -10.3980 (≤0.0001); vs. RW -10.6210 (≤0.0001)
ICEEMDAN-LSSVR: vs. ICEEMDAN-BPNN -3.5925 (0.0003); vs. ICEEMDAN-RW -6.0620 (≤0.0001); vs. LSSVR -10.5640 (≤0.0001); vs. BPNN -10.3680 (≤0.0001); vs. RW -10.4660 (≤0.0001)
ICEEMDAN-BPNN: vs. ICEEMDAN-RW -5.9718 (≤0.0001); vs. LSSVR -10.2070 (≤0.0001); vs. BPNN -9.9529 (≤0.0001); vs. RW -10.0960 (≤0.0001)
ICEEMDAN-RW: vs. LSSVR 3.0776 (0.0021); vs. BPNN 2.7682 (0.0057); vs. RW 3.0951 (0.0020)
LSSVR: vs. BPNN -1.8480 (0.0648); vs. RW 1.0058 (0.3147)
BPNN: vs. RW 1.9928 (0.0464)

WTI, horizon 3:
MICEEMDAN-WOA-RVFL: vs. ICEEMDAN-LSSVR -4.9136 (≤0.0001); vs. ICEEMDAN-BPNN -3.6146 (0.0003); vs. ICEEMDAN-RW -10.8660 (≤0.0001); vs. LSSVR -16.5990 (≤0.0001); vs. BPNN -15.6260 (≤0.0001); vs. RW -16.6720 (≤0.0001)
ICEEMDAN-LSSVR: vs. ICEEMDAN-BPNN -2.1143 (0.0346); vs. ICEEMDAN-RW -10.7560 (≤0.0001); vs. LSSVR -16.7020 (≤0.0001); vs. BPNN -15.6780 (≤0.0001); vs. RW -16.7760 (≤0.0001)
ICEEMDAN-BPNN: vs. ICEEMDAN-RW -10.5250 (≤0.0001); vs. LSSVR -16.2160 (≤0.0001); vs. BPNN -15.3500 (≤0.0001); vs. RW -16.2430 (≤0.0001)
ICEEMDAN-RW: vs. LSSVR 3.0818 (0.0021); vs. BPNN 2.0869 (0.0370); vs. RW 3.2155 (0.0012)
LSSVR: vs. BPNN -2.7750 (0.0056); vs. RW 2.3836 (0.0173)
BPNN: vs. RW 3.0743 (0.0021)

WTI, horizon 6:
MICEEMDAN-WOA-RVFL: vs. ICEEMDAN-LSSVR -3.9302 (≤0.0001); vs. ICEEMDAN-BPNN -5.1332 (≤0.0001); vs. ICEEMDAN-RW -18.9130 (≤0.0001); vs. LSSVR -19.0620 (≤0.0001); vs. BPNN -21.9930 (≤0.0001); vs. RW -19.5080 (≤0.0001)
ICEEMDAN-LSSVR: vs. ICEEMDAN-BPNN -3.4653 (0.0005); vs. ICEEMDAN-RW -18.9170 (≤0.0001); vs. LSSVR -19.1680 (≤0.0001); vs. BPNN -22.0120 (≤0.0001); vs. RW -19.5420 (≤0.0001)
ICEEMDAN-BPNN: vs. ICEEMDAN-RW -18.8380 (≤0.0001); vs. LSSVR -19.1650 (≤0.0001); vs. BPNN -21.9930 (≤0.0001); vs. RW -19.5200 (≤0.0001)
ICEEMDAN-RW: vs. LSSVR 2.6360 (0.0085); vs. BPNN 0.5129 (0.6081); vs. RW 4.1421 (≤0.0001)
LSSVR: vs. BPNN -2.2124 (0.0271); vs. RW 3.9198 (≤0.0001)
BPNN: vs. RW 4.3545 (≤0.0001)

USDEUR, horizon 1:
MICEEMDAN-WOA-RVFL: vs. ICEEMDAN-LSSVR -8.8860 (≤0.0001); vs. ICEEMDAN-BPNN -8.6031 (≤0.0001); vs. ICEEMDAN-RW -4.4893 (≤0.0001); vs. LSSVR -15.4300 (≤0.0001); vs. BPNN -15.4100 (≤0.0001); vs. RW -15.3440 (≤0.0001)
ICEEMDAN-LSSVR: vs. ICEEMDAN-BPNN -0.8819 (0.3780); vs. ICEEMDAN-RW -4.4230 (≤0.0001); vs. LSSVR -14.7210 (≤0.0001); vs. BPNN -14.7320 (≤0.0001); vs. RW -14.6280 (≤0.0001)
ICEEMDAN-BPNN: vs. ICEEMDAN-RW -4.4158 (≤0.0001); vs. LSSVR -14.6310 (≤0.0001); vs. BPNN -14.6440 (≤0.0001); vs. RW -14.5310 (≤0.0001)
ICEEMDAN-RW: vs. LSSVR 3.3089 (0.0010); vs. BPNN 3.2912 (0.0010); vs. RW 3.3244 (0.0009)
LSSVR: vs. BPNN -1.8732 (0.0613); vs. RW 2.9472 (0.0033)
BPNN: vs. RW 3.4378 (0.0006)

USDEUR, horizon 3:
MICEEMDAN-WOA-RVFL: vs. ICEEMDAN-LSSVR -10.1370 (≤0.0001); vs. ICEEMDAN-BPNN -7.9933 (≤0.0001); vs. ICEEMDAN-RW -5.7359 (≤0.0001); vs. LSSVR -18.3380 (≤0.0001); vs. BPNN -18.6460 (≤0.0001); vs. RW -18.2090 (≤0.0001)
ICEEMDAN-LSSVR: vs. ICEEMDAN-BPNN -1.4797 (0.1392); vs. ICEEMDAN-RW -5.5972 (≤0.0001); vs. LSSVR -17.9030 (≤0.0001); vs. BPNN -18.1920 (≤0.0001); vs. RW -17.7730 (≤0.0001)
ICEEMDAN-BPNN: vs. ICEEMDAN-RW -5.6039 (≤0.0001); vs. LSSVR -17.8370 (≤0.0001); vs. BPNN -18.1350 (≤0.0001); vs. RW -17.6910 (≤0.0001)
ICEEMDAN-RW: vs. LSSVR 3.0048 (0.0027); vs. BPNN 2.9552 (0.0032); vs. RW 3.0206 (0.0026)
LSSVR: vs. BPNN -1.4077 (0.1595); vs. RW 1.5242 (0.1278)
BPNN: vs. RW 2.1240 (0.0339)

USDEUR, horizon 6:
MICEEMDAN-WOA-RVFL: vs. ICEEMDAN-LSSVR -11.1070 (≤0.0001); vs. ICEEMDAN-BPNN -10.0570 (≤0.0001); vs. ICEEMDAN-RW -7.4594 (≤0.0001); vs. LSSVR -18.3010 (≤0.0001); vs. BPNN -18.4640 (≤0.0001); vs. RW -18.4940 (≤0.0001)
ICEEMDAN-LSSVR: vs. ICEEMDAN-BPNN 1.9379 (0.0529); vs. ICEEMDAN-RW -7.0858 (≤0.0001); vs. LSSVR -17.4490 (≤0.0001); vs. BPNN -17.5970 (≤0.0001); vs. RW -17.6370 (≤0.0001)
ICEEMDAN-BPNN: vs. ICEEMDAN-RW -7.1244 (≤0.0001); vs. LSSVR -17.5850 (≤0.0001); vs. BPNN -17.7340 (≤0.0001); vs. RW -17.7690 (≤0.0001)
ICEEMDAN-RW: vs. LSSVR 2.5399 (0.0112); vs. BPNN 2.6203 (0.0089); vs. RW 2.6207 (0.0089)
LSSVR: vs. BPNN 2.2946 (0.0220); vs. RW 2.5913 (0.0097)
BPNN: vs. RW -0.2090 (0.8345)

IP, horizon 1:
MICEEMDAN-WOA-RVFL: vs. ICEEMDAN-LSSVR -3.4304 (0.0007); vs. ICEEMDAN-BPNN -2.4253 (0.0098); vs. ICEEMDAN-RW -3.4369 (0.0007); vs. LSSVR -3.5180 (0.0005); vs. BPNN -4.1486 (≤0.0001); vs. RW -3.3215 (0.0015)
ICEEMDAN-LSSVR: vs. ICEEMDAN-BPNN -0.6714 (0.5026); vs. ICEEMDAN-RW -3.2572 (0.0013); vs. LSSVR -3.3188 (0.0010); vs. BPNN -4.0082 (≤0.0001); vs. RW -2.9573 (0.0034)
ICEEMDAN-BPNN: vs. ICEEMDAN-RW -3.6265 (0.0004); vs. LSSVR -3.7455 (0.0002); vs. BPNN -4.6662 (≤0.0001); vs. RW -3.3790 (0.0008)
ICEEMDAN-RW: vs. LSSVR -0.7438 (0.4577); vs. BPNN -0.0502 (0.9600); vs. RW 2.6089 (0.0096)
LSSVR: vs. BPNN 0.4094 (0.6826); vs. RW 3.4803 (0.0006)
BPNN: vs. RW 1.6021 (0.1104)

IP, horizon 3:
MICEEMDAN-WOA-RVFL: vs. ICEEMDAN-LSSVR -4.5687 (≤0.0001); vs. ICEEMDAN-BPNN -3.7062 (0.0003); vs. ICEEMDAN-RW -5.9993 (≤0.0001); vs. LSSVR -6.8919 (≤0.0001); vs. BPNN -5.5008 (≤0.0001); vs. RW -5.8037 (≤0.0001)
ICEEMDAN-LSSVR: vs. ICEEMDAN-BPNN 1.1969 (0.2325); vs. ICEEMDAN-RW -5.5286 (≤0.0001); vs. LSSVR -6.6325 (≤0.0001); vs. BPNN -5.2260 (≤0.0001); vs. RW -5.3260 (≤0.0001)
ICEEMDAN-BPNN: vs. ICEEMDAN-RW -5.7199 (≤0.0001); vs. LSSVR -6.7722 (≤0.0001); vs. BPNN -5.2955 (≤0.0001); vs. RW -5.5099 (≤0.0001)
ICEEMDAN-RW: vs. LSSVR -5.6536 (≤0.0001); vs. BPNN -3.6721 (0.0003); vs. RW 1.7609 (0.0795)
LSSVR: vs. BPNN 0.2479 (0.8044); vs. RW 6.3126 (≤0.0001)
BPNN: vs. RW 3.9699 (≤0.0001)

IP, horizon 6:
MICEEMDAN-WOA-RVFL: vs. ICEEMDAN-LSSVR -5.9152 (≤0.0001); vs. ICEEMDAN-BPNN -3.5717 (0.0004); vs. ICEEMDAN-RW -5.9430 (≤0.0001); vs. LSSVR -7.5672 (≤0.0001); vs. BPNN -10.8900 (≤0.0001); vs. RW -5.8450 (≤0.0001)
ICEEMDAN-LSSVR: vs. ICEEMDAN-BPNN -2.5163 (0.0125); vs. ICEEMDAN-RW -5.8465 (≤0.0001); vs. LSSVR -7.5185 (≤0.0001); vs. BPNN -10.7140 (≤0.0001); vs. RW -5.7483 (≤0.0001)
ICEEMDAN-BPNN: vs. ICEEMDAN-RW -5.6135 (≤0.0001); vs. LSSVR -7.4370 (≤0.0001); vs. BPNN -10.3470 (≤0.0001); vs. RW -5.5254 (≤0.0001)
ICEEMDAN-RW: vs. LSSVR -7.6719 (≤0.0001); vs. BPNN -2.2903 (0.0229); vs. RW 0.8079 (0.4200)
LSSVR: vs. BPNN 3.8137 (0.0002); vs. RW 7.7861 (≤0.0001)
BPNN: vs. RW 2.3351 (0.0204)

SSEC, horizon 1:
MICEEMDAN-WOA-RVFL: vs. ICEEMDAN-LSSVR -4.0255 (≤0.0001); vs. ICEEMDAN-BPNN -3.7480 (0.0004); vs. ICEEMDAN-RW -7.9407 (≤0.0001); vs. LSSVR -10.8090 (≤0.0001); vs. BPNN -10.3070 (≤0.0001); vs. RW -10.7330 (≤0.0001)
ICEEMDAN-LSSVR: vs. ICEEMDAN-BPNN -1.5083 (0.1317); vs. ICEEMDAN-RW -7.8406 (≤0.0001); vs. LSSVR -10.5410 (≤0.0001); vs. BPNN -10.0710 (≤0.0001); vs. RW -10.5020 (≤0.0001)
ICEEMDAN-BPNN: vs. ICEEMDAN-RW -7.7986 (≤0.0001); vs. LSSVR -10.5470 (≤0.0001); vs. BPNN -10.0580 (≤0.0001); vs. RW -10.5030 (≤0.0001)
ICEEMDAN-RW: vs. LSSVR 3.5571 (0.0004); vs. BPNN 3.2632 (0.0011); vs. RW 3.5150 (0.0005)
LSSVR: vs. BPNN -1.0197 (0.3081); vs. RW -0.6740 (0.5004)
BPNN: vs. RW 0.8078 (0.4193)

SSEC, horizon 3:
MICEEMDAN-WOA-RVFL: vs. ICEEMDAN-LSSVR -3.8915 (0.0001); vs. ICEEMDAN-BPNN -3.7312 (0.0002); vs. ICEEMDAN-RW -10.5890 (≤0.0001); vs. LSSVR -10.5480 (≤0.0001); vs. BPNN -10.7470 (≤0.0001); vs. RW -10.1260 (≤0.0001)
ICEEMDAN-LSSVR: vs. ICEEMDAN-BPNN -2.4145 (0.0159); vs. ICEEMDAN-RW -10.5830 (≤0.0001); vs. LSSVR -10.5240 (≤0.0001); vs. BPNN -10.7760 (≤0.0001); vs. RW -10.1110 (≤0.0001)
ICEEMDAN-BPNN: vs. ICEEMDAN-RW -10.2420 (≤0.0001); vs. LSSVR -10.1860 (≤0.0001); vs. BPNN -10.4400 (≤0.0001); vs. RW -9.7595 (≤0.0001)
ICEEMDAN-RW: vs. LSSVR 3.4031 (0.0007); vs. BPNN 2.3198 (0.0205); vs. RW 3.1912 (0.0014)
LSSVR: vs. BPNN -0.5931 (0.5532); vs. RW -1.2284 (0.2194)
BPNN: vs. RW -0.0042 (0.9966)

SSEC, horizon 6:
MICEEMDAN-WOA-RVFL: vs. ICEEMDAN-LSSVR -2.9349 (0.0034); vs. ICEEMDAN-BPNN -3.7544 (0.0002); vs. ICEEMDAN-RW -10.2390 (≤0.0001); vs. LSSVR -10.1970 (≤0.0001); vs. BPNN -11.0150 (≤0.0001); vs. RW -10.1630 (≤0.0001)
ICEEMDAN-LSSVR: vs. ICEEMDAN-BPNN -2.5830 (0.0099); vs. ICEEMDAN-RW -10.1390 (≤0.0001); vs. LSSVR -10.0940 (≤0.0001); vs. BPNN -10.9630 (≤0.0001); vs. RW -10.0690 (≤0.0001)
ICEEMDAN-BPNN: vs. ICEEMDAN-RW -9.6418 (≤0.0001); vs. LSSVR -9.5562 (≤0.0001); vs. BPNN -10.4090 (≤0.0001); vs. RW -9.5122 (≤0.0001)
ICEEMDAN-RW: vs. LSSVR 2.3167 (0.0207); vs. BPNN 1.3656 (0.1723); vs. RW 2.2227 (0.0264)
LSSVR: vs. BPNN -1.9027 (0.0573); vs. RW -0.5814 (0.5611)
BPNN: vs. RW 1.5707 (0.1165)

WOA optimization can significantly enhance the prediction accuracy of economic and financial time series forecasting.

5. Discussion

To better investigate the proposed MICEEMDAN-WOA-RVFL, we further discuss the developed prediction model in this section, including the comparison of single and multiple decompositions, the optimization effectiveness of WOA, and the impact of ensemble size.

5.1. Comparison of Single Decomposition and Multiple Decompositions. One of the main novelties of this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and improve the prediction accuracy and stability of the developed forecasting model. To evaluate the effectiveness of the multiple decomposition strategy, we compare the prediction results of MICEEMDAN-WOA-RVFL and ICEEMDAN-WOA-RVFL. The former ensembles the prediction results of M (M = 100) individual ICEEMDAN decompositions with random parameters, while the latter employs only one ICEEMDAN decomposition. We randomly choose 5 out of these 100 decompositions and execute ICEEMDAN-WOA-RVFL for time series forecasting. Tables 7–9 report the MAPE, RMSE, and Dstat values of MICEEMDAN-WOA-RVFL and the five single-decomposition ICEEMDAN-WOA-RVFL models, together with the mean values of these five models.

On one hand, compared with the prediction results of the five single decompositions and their mean, the proposed MICEEMDAN-WOA-RVFL achieves the lowest MAPE and the highest Dstat values in all 12 cases and the lowest RMSE values in 11 out of 12 cases, indicating that the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and improve the ensemble prediction accuracy.

On the other hand, we can find that the multiple decomposition strategy can greatly improve the stability of the prediction model. For example, the MAPE values of the five single-decomposition models with horizon 6 on the IP dataset range from 0.0034 to 0.1114, indicating that different single decompositions can produce relatively large differences in prediction results. When we employ the multiple decomposition strategy, we can overcome the randomness of a single decomposition and thus enhance prediction stability.

In summary, the experimental results suggest that the multiple decomposition strategy and prediction ensemble can effectively enhance prediction accuracy and stability. The main reasons for the prediction improvement lie in three aspects: (1) the multiple decomposition can reduce the

randomness of a single decomposition and simultaneously generate groups of differential subseries; (2) predictions using these groups of differential subseries can achieve diverse prediction results; and (3) the selection and ensemble of these diverse prediction results can ensure both accuracy and diversity and thus improve the final ensemble prediction.

5.2. The Optimization Effectiveness of WOA. When we use RVFL networks to construct predictors, a number of parameters need to be set in advance. In this study, WOA is introduced to search for the optimal parameter values for the RVFL predictors using its powerful optimization ability. To investigate the optimization effectiveness of WOA for parameter search, we compare the proposed MICEEMDAN-WOA-RVFL with MICEEMDAN-RVFL without WOA optimization. Following the literature [45], we fixed the number of hidden neurons Nhe = 100, the activation function Func = sigmoid, and the random type Rand = Gaussian in MICEEMDAN-RVFL. The MAPE, RMSE, and Dstat values are reported in Tables 10–12, respectively.
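For reference, the core WOA update rules (shrinking encirclement of the best solution, a random-whale exploration move, and a logarithmic spiral) can be sketched on a generic box-constrained objective. Mapping this onto the mixed discrete-continuous RVFL search space of Table 2 would additionally require an encoding/decoding step, which is omitted here; scalar coefficients A and C are a common simplification of the original per-dimension formulation:

```python
import numpy as np

def woa_minimize(f, dim, lb, ub, pop=30, iters=200, seed=0):
    """Whale optimization algorithm sketch for a box-constrained objective f."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, size=(pop, dim))
    fits = np.array([f(x) for x in X])
    best, best_fit = X[fits.argmin()].copy(), fits.min()
    for t in range(iters):
        a = 2.0 - 2.0 * t / iters                 # encircling coefficient decays 2 -> 0
        for i in range(pop):
            A = 2.0 * a * rng.random() - a
            C = 2.0 * rng.random()
            if rng.random() < 0.5:
                if abs(A) < 1:                    # exploit: encircle the best whale
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                             # explore: move relative to a random whale
                    Xr = X[rng.integers(pop)]
                    X[i] = Xr - A * np.abs(C * Xr - X[i])
            else:                                 # spiral update toward the best whale
                l = rng.uniform(-1.0, 1.0)
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2.0 * np.pi * l) + best
            X[i] = np.clip(X[i], lb, ub)
            fi = f(X[i])
            if fi < best_fit:
                best, best_fit = X[i].copy(), fi
    return best, best_fit
```

In the MICEEMDAN-WOA-RVFL setting, f would be a validation error (e.g., RMSE) of an RVFL predictor built from a candidate parameter vector.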

As listed in Tables 10 and 11, on all four time series datasets the prediction performance of the proposed MICEEMDAN-WOA-RVFL model in terms of MAPE and RMSE is better than or equal to that of the MICEEMDAN-RVFL model without WOA optimization in all 12 cases, except for the RMSE value with horizon 1 on the SSEC dataset. In addition, MICEEMDAN-WOA-RVFL obtains higher Dstat values in 10 out of 12 cases, as can be seen in Table 12. All these results indicate that WOA can effectively search for the optimal parameter settings for RVFL networks, further improving the overall prediction performance.
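The predictor being tuned keeps its input-to-hidden weights random and untrained, concatenates the hidden features with direct input links, and solves only the output weights in closed form. A minimal sketch follows; the tanh activation and the fixed ridge penalty are illustrative choices, whereas Table 2 lists the options WOA actually searches over:

```python
import numpy as np

def rvfl_fit_predict(X_tr, y_tr, X_te, n_hidden=20, lam=1e-3, seed=0):
    """RVFL sketch: fixed random hidden layer plus direct links;
    output weights solved by regularized least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X_tr.shape[1], n_hidden))   # random input weights, never trained
    b = rng.normal(size=n_hidden)                    # random biases
    D_tr = np.hstack([X_tr, np.tanh(X_tr @ W + b)])  # direct links + hidden features
    beta = np.linalg.solve(D_tr.T @ D_tr + lam * np.eye(D_tr.shape[1]), D_tr.T @ y_tr)
    D_te = np.hstack([X_te, np.tanh(X_te @ W + b)])
    return D_te @ beta
```

In the MICEEMDAN-WOA-RVFL pipeline, lagged values of each decomposed subseries would form X and the next value y, with the lag, the number of hidden neurons, the activation, and the solution mode drawn from the ranges in Table 2.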

5.3. The Impact of Ensemble Size. Previous research has demonstrated that an ensemble strategy using all individual prediction models is unlikely to work well and that the selection of individual prediction models contributes to improving the ensemble prediction performance [57]. In this study, we sort all individual prediction models based on their past performance (RMSE values) and then select the top N percent as the ensemble to construct the ensemble prediction model. To further investigate the impact of ensemble size on ensemble prediction, we use different ensemble sizes (es = 10%, 20%, ..., 100%) to select the top N percent of individual forecasting models to develop the ensemble prediction model and conduct the one-step-ahead forecasting experiment on the four time series datasets. The results are demonstrated in Figure 5.

We can see that MICEEMDAN-WOA-RVFL obtains the best forecasting performance on the WTI, IP, and SSEC datasets when the ensemble size es is in the range of 20%–40%, and on the USDEUR dataset when the ensemble size es


Table 7: The mean absolute percent error (MAPE) values of single decomposition and multiple decompositions.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  ICEEMDAN-WOA-RVFL1  ICEEMDAN-WOA-RVFL2  ICEEMDAN-WOA-RVFL3  ICEEMDAN-WOA-RVFL4  ICEEMDAN-WOA-RVFL5  Mean
WTI      1        0.0036              0.0043              0.0045              0.0045              0.0041              0.0040              0.0043
         3        0.0080              0.0088              0.0084              0.0084              0.0082              0.0083              0.0084
         6        0.0113              0.0132              0.0115              0.0113              0.0118              0.0120              0.0112
USDEUR   1        0.0006              0.0011              0.0010              0.0008              0.0010              0.0008              0.0009
         3        0.0015              0.0021              0.0021              0.0015              0.0017              0.0015              0.0018
         6        0.0022              0.0033              0.0022              0.0023              0.0023              0.0024              0.0025
IP       1        0.0012              0.0016              0.0016              0.0017              0.0017              0.0015              0.0016
         3        0.0023              0.0024              0.0026              0.0030              0.0028              0.0024              0.0026
         6        0.0032              0.0036              0.0034              0.1114              0.0036              0.0041              0.0252
SSEC     1        0.0020              0.0024              0.0023              0.0026              0.0023              0.0023              0.0024
         3        0.0044              0.0045              0.0046              0.0052              0.0045              0.0047              0.0047
         6        0.0065              0.0067              0.0070              0.0085              0.0068              0.0075              0.0073

Table 8: The root mean squared error (RMSE) values of single decomposition and multiple decompositions.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  ICEEMDAN-WOA-RVFL1  ICEEMDAN-WOA-RVFL2  ICEEMDAN-WOA-RVFL3  ICEEMDAN-WOA-RVFL4  ICEEMDAN-WOA-RVFL5  Mean
WTI      1        0.2715              0.3316              0.3376              0.3352              0.3191              0.3148              0.3277
         3        0.5953              0.6557              0.6214              0.6200              0.6058              0.6133              0.6232
         6        0.8146              0.9490              0.8239              0.8199              0.8508              0.8597              0.8607
USDEUR   1        0.0009              0.0017              0.0017              0.0012              0.0015              0.0012              0.0015
         3        0.0022              0.0031              0.0034              0.0023              0.0025              0.0023              0.0027
         6        0.0033              0.0048              0.0033              0.0035              0.0035              0.0036              0.0037
IP       1        0.2114              0.2894              0.3080              0.3656              0.3220              0.2589              0.3088
         3        0.3875              0.4098              0.4507              0.4809              0.4776              0.4027              0.4443
         6        0.5340              0.5502              0.5563              1.0081              0.5697              0.6466              0.6661
SSEC     1        10.8115             11.5515             13.0728             16.7301             12.2465             14.3671             13.5936
         3        22.2983             22.0603             22.8140             33.9937             23.9755             25.8004             25.7288
         6        35.7201             37.0255             41.9033             55.5066             38.6140             48.2131             44.2525

Table 9: The directional statistic (Dstat) values of single decomposition and multiple decompositions.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  ICEEMDAN-WOA-RVFL1  ICEEMDAN-WOA-RVFL2  ICEEMDAN-WOA-RVFL3  ICEEMDAN-WOA-RVFL4  ICEEMDAN-WOA-RVFL5  Mean
WTI      1        0.9381              0.9201              0.9161              0.9184              0.9323              0.9271              0.9228
         3        0.8576              0.8414              0.8548              0.8385              0.8553              0.8513              0.8483
         6        0.7737              0.7419              0.7703              0.7668              0.7627              0.7616              0.7607
USDEUR   1        0.9410              0.9008              0.9270              0.9204              0.9073              0.9157              0.9142
         3        0.8502              0.8052              0.8418              0.8399              0.8296              0.8427              0.8318
         6        0.7828              0.6826              0.7809              0.7819              0.7772              0.7706              0.7586
IP       1        0.9256              0.8967              0.8926              0.9132              0.8967              0.9174              0.9033
         3        0.8802              0.8760              0.8430              0.8223              0.8471              0.8802              0.8537
         6        0.8141              0.8017              0.8017              0.7066              0.7851              0.8017              0.7794
SSEC     1        0.9156              0.9052              0.9052              0.9052              0.9087              0.9135              0.9076
         3        0.8396              0.8222              0.8292              0.8208              0.8368              0.8264              0.8271
         6        0.7587              0.7462              0.7441              0.7134              0.7594              0.7448              0.7416

Table 10: The mean absolute percent error (MAPE) values with and without WOA optimization.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  MICEEMDAN-RVFL
WTI      1        0.0036              0.0037
         3        0.0080              0.0082
         6        0.0113              0.0118
USDEUR   1        0.0006              0.0006
         3        0.0015              0.0015
         6        0.0022              0.0023
IP       1        0.0012              0.0013
         3        0.0023              0.0023
         6        0.0032              0.0036
SSEC     1        0.0020              0.0020
         3        0.0044              0.0044
         6        0.0065              0.0066


Table 11: The root mean squared error (RMSE) values with and without WOA optimization.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  MICEEMDAN-RVFL
WTI      1        0.2715              0.2824
         3        0.5953              0.6079
         6        0.8146              0.8485
USDEUR   1        0.0009              0.0010
         3        0.0022              0.0022
         6        0.0033              0.0034
IP       1        0.2114              0.2362
         3        0.3875              0.4033
         6        0.5340              0.5531
SSEC     1        10.8115             10.7616
         3        22.2983             22.7456
         6        35.7201             36.3150

Table 12: The directional statistic (Dstat) values with and without WOA optimization.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  MICEEMDAN-RVFL
WTI      1        0.9381              0.9358
         3        0.8576              0.8565
         6        0.7737              0.7656
USDEUR   1        0.9410              0.9317
         3        0.8502              0.8455
         6        0.7828              0.7762
IP       1        0.9256              0.9174
         3        0.8802              0.8430
         6        0.8141              0.7893
SSEC     1        0.9156              0.9191
         3        0.8396              0.8351
         6        0.7587              0.7594


is in the range of 10%–40%, in terms of MAPE, RMSE, and Dstat. When es is greater than 40%, the MAPE and RMSE values continue to worsen and become the worst when the ensemble size grows to 100%. The experimental results indicate that the ensemble size has a significant overall impact on ensemble prediction and that an ideal range of ensemble size is about 20% to 40%.
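The selection-and-ensemble step described above can be sketched as ranking the per-decomposition prediction series by past RMSE and averaging the top es percent; all names here are illustrative:

```python
import numpy as np

def select_and_ensemble(preds, past_rmse, es=30):
    """preds: (M, T) array of prediction series from M decompositions;
    past_rmse: length-M past-performance scores (lower is better);
    es: ensemble size in percent. Returns the average of the top-es% series."""
    preds = np.asarray(preds, float)
    M = len(past_rmse)
    k = max(1, int(round(M * es / 100.0)))
    top = np.argsort(past_rmse)[:k]          # indices of the smallest RMSEs
    return preds[top].mean(axis=0)
```

With es = 100 this degenerates to averaging all M series, which is the setting the experiments above find to perform worst.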

6. Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. MICEEMDAN-WOA-RVFL first uses ICEEMDAN to separate the original economic and financial time series into groups of subseries multiple times, and RVFL networks are then used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve prediction accuracy. Thirdly, the predictions of the subseries in each decomposition are aggregated by addition into the forecasting result of that decomposition. Finally, the prediction results of the individual decompositions are selected based on their RMSE values and combined into the final prediction results.

As far as we know, this is the first time that WOA is employed to search for the optimal parameters of RVFL networks and that the multiple decomposition strategy is introduced into time series forecasting. The empirical results indicate that

(1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy in various economic and financial time series forecasting tasks; (2) WOA can effectively search for optimal parameters for RVFL networks and improve the prediction performance of economic and financial time series forecasting; and (3) the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and enhance the prediction accuracy and stability of the developed prediction model.

We will extend our study in two directions in the future: (1) applying MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble method of individual forecasting models to further enhance prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and

Figure 5: The impact of ensemble size with one-step-ahead forecasting (MAPE, RMSE, and Dstat versus ensemble size for the WTI, USDEUR, IP, and SSEC datasets).


the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365–374, 2017.

[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.

[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831–848, 2005.

[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192–196, IEEE, Warsaw, Poland, September 2005.

[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?" Journal of International Economics, vol. 60, no. 1, pp. 85–107, 2003.

[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7–18, 2014.

[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13–29, 2014.

[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331–350, 2010.

[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201–1208, 2008.

[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111–117, 2010.

[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1–9, 2013.

[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97–111, 2018.

[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28–39, 2014.

[14] X. Li, H. Xie, R. Wang et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67–78, 2016.

[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22–42, 2018.

[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457–480, 2018.

[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27–38, 2014.

[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284–308, 2014.

[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.

[20] S. Lahmiri, "A variational mode decompoisition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268–273, 2016.

[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228–1244, 2013.

[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603–3628, 2019.

[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235–6250, 2014.

[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164–178, 2016.

[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754–762, 2016.

[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1–13, 2007.

[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903–995, 1998.

[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1–41, 2009.

[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144–4147, IEEE, Prague, Czech Republic, May 2011.

[29] M E Torres M A Colominas G Schlotthauer andP Flandrin ldquoA complete ensemble empirical mode decom-position with adaptive noiserdquo in Proceedings of the 2011 IEEEInternational Conference on Acoustics Speech and SignalProcessing (ICASSP) pp 4144ndash4147 IEEE Prague CzechRepublic May 2011

16 Complexity

[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140–165, 2020.

[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882–1905, 2018.

[32] T. Li, M. Zhou, C. Guo, et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.

[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19–29, 2014.

[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852–1872, 2020.

[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.

[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205–1225, 2019.

[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1–10, 2020.

[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1–15, 2018.

[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm, Multi-Objective Whale Optimization Algorithm, for wind speed forecasting," Applied Energy, vol. 208, pp. 344–360, 2017.

[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250–260, 2019.

[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163–180, 1994.

[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094–1105, 2016.

[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078–1093, 2016.

[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097–1108, 2018.

[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319–340, 2019.

[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5–24, 2020.

[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.

[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983–994, 2020.

[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49–58, 2019.

[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.

[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165–4172, 2019.

[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319–7327, 2020.

[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525–541, 2018.

[55] FRED website, https://fred.stlouisfed.org/.

[56] NetEase website, http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.

[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31–53, 2006.



forecasting of 56 Indian stocks from different sectors [7]. Alberg et al. conducted a comprehensive analysis of stock indices using various GARCH models, and the experimental results showed that the asymmetric GARCH model enhanced the overall prediction performance [9].

Since most economic and financial time series exhibit the complex characteristics of strong nonlinearity and nonstationarity, it is difficult to obtain satisfactory forecasting accuracy with statistical approaches. Hence, various AI approaches have been proposed for economic and financial time series forecasting. These AI forecasting approaches include the artificial neural network (ANN) [10, 11], support vector machine (SVM) [12, 13], extreme learning machine (ELM) [14], random vector functional link (RVFL) neural network [15], and recurrent neural network (RNN) [16]. Pradhan and Kumar utilized an ANN to forecast the foreign exchange rate in India, and the experimental results indicated that the ANN could effectively forecast the exchange rate [10]. Das and Padhy forecasted the commodity futures contract index using the SVM, and the empirical analysis showed that the proposed model was effective and achieved satisfactory prediction performance [12]. Li et al. predicted stock prices using the ELM, and the comparison results showed that the ELM with radial basis function (RBF) kernels achieved better prediction performance with faster speed than back propagation neural networks (BPNNs) [14]. Moudiki et al. employed quasi-randomized functional link networks for various time series forecasting tasks, and the proposed approach could generate more robust prediction results [15]. Baek and Kim employed long short-term memory (LSTM) for stock index forecasting, and the results confirmed that the LSTM model had excellent prediction accuracy [16].

In order to effectively improve prediction accuracy, various hybrid forecasting models have been designed for economic and financial time series forecasting. Babu and Reddy combined ARIMA and nonlinear ANN models to develop a novel hybrid ARIMA-ANN model, and the experiments on electricity prices and stock indices indicated that the proposed ARIMA-ANN had higher prediction accuracy [17]. Kumar and Thenmozhi compared three different hybrid models for the forecasting of stock index returns and concluded that the ARIMA-SVM model could obtain the highest prediction accuracy [18]. Hsu built a hybrid model based on a back propagation neural network (BPNN) and genetic programming (GP) for stock/futures price forecasting, and the empirical analysis showed that the proposed hybrid model could effectively improve the prediction accuracy [19]. These hybrid models are able to fully exploit the potential of single models and thus obtain better prediction accuracy than single models.

Due to the complexity of original economic and financial time series, forecasting directly on the original series can hardly achieve satisfactory prediction accuracy. To reduce the complexity of the original time series, a framework of "decomposition and ensemble" is widely utilized in the field of time series forecasting. The framework includes three stages: decomposition, forecasting, and ensemble. The original time series is first separated into a sum of subseries, then a prediction model is used to forecast each subseries, and finally the predictions of all the subseries are aggregated as the final prediction results. Decomposition, as the first step, is very important for enhancing the performance of the ensemble model. The widely used decomposition approaches include wavelet decomposition (WD), variational mode decomposition (VMD), and empirical mode decomposition (EMD) class methods. Lahmiri combined VMD with a general regression neural network (GRNN) to develop a novel ensemble forecasting model, and the experimental results suggested that VMD outperformed EMD for the prediction of economic and financial time series [20]. Kao et al. integrated WD, support vector regression (SVR), and multivariate adaptive regression splines (MARS) to develop an ensemble forecasting model for stock prices, and the proposed model obtained better prediction accuracy [21]. In the second stage of the "decomposition and ensemble" framework, various optimization approaches have been introduced to enhance the performance of predictors. Li et al. proposed a ridge regression (RR) with differential evolution (DE) to forecast crude oil prices and obtained excellent forecasting accuracy [22]. Bagheri et al. introduced quantum-behaved particle swarm optimization (QPSO) to tune the adaptive network-based fuzzy inference system (ANFIS) for financial time series forecasting [23]. Wang et al. employed the brain storm optimization (BSO) algorithm to optimize SVR, and the results indicated that the developed approach was effective in stock market analysis [24].

In the "decomposition and ensemble" framework, the decomposition approaches and prediction approaches greatly influence the final prediction results. Considering the powerful decomposition ability of ICEEMDAN, the excellent search efficiency of WOA, and the accurate forecasting ability of the RVFL network, we develop a novel ensemble prediction model integrating multiple ICEEMDANs, WOA, and the RVFL network, namely, MICEEMDAN-WOA-RVFL, for economic and financial time series forecasting. Firstly, ICEEMDAN with random parameters is utilized to divide the original economic and financial time series into a sum of subseries. Secondly, the RVFL network is applied to forecast each decomposed subseries individually, and WOA is used to optimize the parameter values of the RVFL network simultaneously. Finally, the predictions of all individual subseries are aggregated as the prediction values of one process of decomposition and ensemble. From our observations, we find that the decomposition in the first stage suffers from uncertainties with considerable randomness, which can lead to differences and instability in the prediction results. In addition, extensive literature has shown that combining multiple forecasts can effectively enhance prediction accuracy [25, 26]. Therefore, we randomize the decomposition parameter values of ICEEMDAN in the first stage, repeat the abovementioned processes multiple times, and integrate the results of the multiple decompositions and ensembles as the final prediction values. We expect that the multiple decomposition strategy can reduce the randomness of a single decomposition and further improve the ensemble prediction stability and accuracy.


The main contributions of this paper are as follows. (1) We propose a new multidecomposition and self-optimizing ensemble prediction model integrating multiple ICEEMDANs, WOA, and RVFL networks for economic and financial time series forecasting. As far as we know, this is the first time that this combination has been developed for economic and financial time series forecasting. (2) To further enhance forecasting accuracy and stability, we utilize multiple differentiated ICEEMDANs to decompose the original economic and financial time series and finally ensemble the predictions of all decompositions as the final predictions. (3) WOA is introduced for the first time to optimize the various parameters of RVFL networks. (4) The empirical results on four different types of economic and financial time series show that our proposed MICEEMDAN-WOA-RVFL significantly enhances the prediction performance in terms of forecasting accuracy and stability.

The novelty of the proposed MICEEMDAN-WOA-RVFL is threefold: (1) a novel hybrid model integrating multiple ICEEMDANs, WOA, and RVFL networks is designed for economic and financial time series forecasting; (2) the multiple decomposition strategy is proposed for the first time to overcome the randomness of a single decomposition and to improve prediction accuracy and stability; and (3) WOA is first applied to optimizing RVFL networks to improve the performance of individual forecasting.

The remainder of the paper is organized as follows. Section 2 offers a brief introduction to the ICEEMDAN, WOA, and RVFL network. Section 3 provides the architecture and the detailed implementation of the proposed MICEEMDAN-WOA-RVFL. Section 4 analyzes the empirical results on various economic and financial time series forecasting tasks. Section 5 discusses some details of the developed prediction model, and Section 6 concludes this paper.

2. Preliminaries

2.1. Improved Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (ICEEMDAN). Empirical mode decomposition (EMD), an adaptive time-frequency analysis approach for nonstationary signals, was designed by Huang et al. [27]. EMD separates an original time series into a sum of "intrinsic mode functions" (IMFs) and one residue, and thus it can simplify time series analysis. Due to some drawbacks of EMD, such as mode mixing, EEMD [28] and CEEMDAN [29] have been proposed to improve decomposition performance and applied in various fields [30–32]. In spite of that, these decomposition methods still have some remaining problems. To solve them, an improved CEEMDAN (ICEEMDAN) was developed by Colominas et al. [33].

Let E_k(·) be the operator that generates the kth mode using EMD, M(·) be the operator that generates the local mean of a series, and w^(i) be a realization of zero-mean, unit-variance white noise. When x is the original signal, the detailed decomposition process of ICEEMDAN is as follows:

(i) Step 1: employ EMD to compute the local means of the I noise-added realizations x^(i) = x + β_0 E_1(w^(i)), with β_0 > 0, and average them to obtain the first residue r_1 = ⟨M(x^(i))⟩.

(ii) Step 2: calculate the first mode IMF_1 = x − r_1.

(iii) Step 3: calculate the average of the local means of the realizations r_1 + β_1 E_2(w^(i)) as the second residue, r_2 = ⟨M(r_1 + β_1 E_2(w^(i)))⟩, and the second mode IMF_2 = r_1 − r_2.

(iv) Step 4: compute the kth residue r_k = ⟨M(r_{k−1} + β_{k−1} E_k(w^(i)))⟩ and the kth mode IMF_k = r_{k−1} − r_k, for k = 3, ..., K.

(v) Step 5: go to Step 4 for the next k.

Since ICEEMDAN can effectively decompose the original time series, it has been frequently introduced into various time series forecasting tasks [34–36]. In our study, we employ ICEEMDAN to separate the original economic and financial time series into a sum of simpler subseries for subsequent forecasting.
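Steps 1–5 above can be sketched in code. The sketch below is a toy illustration, not a faithful ICEEMDAN: a centered moving average stands in for the EMD local-mean operator M(·), and plain Gaussian noise stands in for the noise modes E_k(w^(i)); only the residue/IMF loop structure matches the steps above.

```python
import random

def local_mean(x, width=5):
    """Toy stand-in for the local-mean operator M(.): a centered moving average."""
    n = len(x)
    out = []
    for t in range(n):
        lo, hi = max(0, t - width), min(n, t + width + 1)
        out.append(sum(x[lo:hi]) / (hi - lo))
    return out

def iceemdan_sketch(x, num_modes=3, num_realizations=20, beta=0.2, seed=42):
    """Sketch of the ICEEMDAN residue/IMF loop (Steps 1-5), with white noise
    standing in for the EMD-derived noise modes E_k(w^(i))."""
    rng = random.Random(seed)
    imfs, residue = [], list(x)
    for _ in range(num_modes):
        # Average the local means of the noise-added realizations -> next residue.
        next_residue = [0.0] * len(x)
        for _ in range(num_realizations):
            noisy = [v + beta * rng.gauss(0.0, 1.0) for v in residue]
            m = local_mean(noisy)
            next_residue = [a + b / num_realizations for a, b in zip(next_residue, m)]
        # Each mode is the difference between consecutive residues.
        imfs.append([a - b for a, b in zip(residue, next_residue)])
        residue = next_residue
    return imfs, residue

series = [float(i % 7) + 0.01 * i for i in range(60)]
imfs, residue = iceemdan_sketch(series)
# By the telescoping construction IMF_k = r_{k-1} - r_k, the modes and the final
# residue sum back to the original series regardless of the local-mean operator.
recon = [sum(vals) + r for vals, r in zip(zip(*imfs), residue)]
assert all(abs(a - b) < 1e-9 for a, b in zip(recon, series))
```

The exact additive reconstruction is the property the "decomposition and ensemble" framework relies on: forecasting each subseries and summing the forecasts targets the original series.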

2.2. Whale Optimization Algorithm (WOA). The whale optimization algorithm (WOA) is an optimization method that outperforms particle swarm optimization (PSO), ant colony optimization (ACO), the gravitational search algorithm (GSA), and fast evolutionary programming (FEP) in optimization performance [37, 38]. Simulating the hunting process of whales, WOA includes three main operators: encircling prey, bubble-net foraging, and search for prey. In each iteration, individuals update their positions toward the best individual of the last iteration, which can be formulated as follows:

P(t + 1) = P_best(t) − A · |C · P_best(t) − P(t)|,  (1)

where t represents the tth iteration, A and C are two coefficient vectors, P_best is the best individual found so far, P is the position of an individual, |·| represents the absolute value, and · indicates an element-by-element multiplication.

In the exploitation (bubble-net foraging) phase, the individual position is updated based on its distance to the best individual by simulating the helix-shaped movement of whales, which is formulated as follows:

P(t + 1) = D · e^{bl} · cos(2πl) + P_best(t),  (2)

where D = |P_best(t) − P(t)| represents the distance between the best individual obtained so far and the ith individual, l is a random number in [−1, 1], and b is a constant that defines the shape of the logarithmic spiral.

In the exploration (search for prey) phase, the individual position is updated using a randomly selected individual. The mathematical model is as follows:

P(t + 1) = P_rand − A · |C · P_rand − P(t)|,  (3)

where P_rand represents a randomly selected individual. The detailed flowchart of WOA is illustrated in Figure 1, where p is a random number in [0, 1] that decides between the spiral update and the other two updates. Due to its very competitive search ability, WOA has been widely applied in various fields [39–41]. Therefore, we take advantage of the effective search ability of WOA to seek the optimal parameters for RVFL networks.
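Equations (1)–(3) and the iteration scheme can be condensed into a short, self-contained sketch. The control-parameter choices (population size, the linear decrease of a from 2 to 0, b = 1) follow the standard WOA of Mirjalili and Lewis [38]; the clamping of positions to the search bounds is an assumption added here for robustness.

```python
import math
import random

def woa_minimize(fitness, dim, bounds, pop_size=20, max_iter=60, b=1.0, seed=0):
    """Minimal WOA sketch: encircling prey (eq. (1)), bubble-net spiral (eq. (2)),
    and random search (eq. (3)), with coefficient a decreasing linearly 2 -> 0."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    best = min(pop, key=fitness)[:]
    for t in range(max_iter):
        a = 2.0 - 2.0 * t / max_iter
        for i, P in enumerate(pop):
            A = 2.0 * a * rng.random() - a
            C = 2.0 * rng.random()
            p, l = rng.random(), rng.uniform(-1.0, 1.0)
            if p >= 0.5:      # spiral update toward the best individual, eq. (2)
                new = [abs(best[d] - P[d]) * math.exp(b * l) * math.cos(2.0 * math.pi * l)
                       + best[d] for d in range(dim)]
            elif abs(A) < 1:  # exploitation: encircle the best individual, eq. (1)
                new = [best[d] - A * abs(C * best[d] - P[d]) for d in range(dim)]
            else:             # exploration: move relative to a random individual, eq. (3)
                rand = pop[rng.randrange(pop_size)]
                new = [rand[d] - A * abs(C * rand[d] - P[d]) for d in range(dim)]
            pop[i] = [min(hi, max(lo, v)) for v in new]  # keep inside the search space
        cand = min(pop, key=fitness)[:]
        if fitness(cand) < fitness(best):
            best = cand
    return best

# Toy usage: minimize the 2-D sphere function over [-5, 5]^2; the optimum is the origin.
def sphere(x):
    return sum(v * v for v in x)

sol = woa_minimize(sphere, dim=2, bounds=(-5.0, 5.0))
```

In the proposed model, `fitness` would instead evaluate an RVFL network's training error for a candidate parameter vector.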


2.3. Random Vector Functional Link (RVFL) Neural Network. As a modification of the multilayer perceptron (MLP) model, the random vector functional link (RVFL) neural network was proposed by Pao et al. [42]. The RVFL neural network is able to overcome the slow convergence, overfitting, and local minima inherent in traditional gradient-based learning algorithms. Like the MLP, the architecture of the RVFL neural network includes three layers, as illustrated in Figure 2. The main modification in the RVFL network lies in the connections in the network structure. Since the RVFL network has direct connections from the input layer to the output layer, it can perform better than a network without direct links [43, 44].

The neurons in the hidden layer, known as enhancement nodes, calculate the weighted sum of the outputs of the input layer neurons and obtain their output with an activation function:

h_m = g(∑_{n=1}^{N} w_{nm} i_n + b_m),  (4)

where w_{nm} denotes the weight between i_n and h_m, b_m represents the bias of the mth neuron in the hidden layer, and g(·) represents an activation function.

The output layer neurons integrate all the outputs from the hidden layer and input layer neurons, and the final output is

o_l = ∑_{m=1}^{M} w_{ml} h_m + ∑_{n=1}^{N} w_{nl} i_n,  (5)

where w_{ml} represents the weight between h_m and o_l, and w_{nl} indicates the weight between i_n and o_l.

To enhance the training efficiency, the RVFL neural network fixes the values of w_{nm} and b_m, drawn from a given distribution, and obtains the weights w_{ml} and w_{nl} by minimizing the system error

E = (1/2P) ∑_{j=1}^{P} (t^(j) − B d^(j))²,  (6)

where P indicates the number of training samples, t^(j) are the target values, B is the combination of w_{ml} and w_{nl}, and d^(j) is the combined vector of the hidden layer outputs and the inputs for the jth sample.

The RVFL neural network has demonstrated extremely efficient and fast forecasting ability and has been frequently used in time series forecasting [44, 45].
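A minimal RVFL sketch follows, assuming tanh enhancement nodes and a plain least-squares solve for the output weights; the paper's WOA-optimized parameter settings are not reproduced here, and the lag count and hidden-layer size are illustrative choices.

```python
import numpy as np

def train_rvfl(X, y, n_hidden=30, seed=0):
    """Minimal RVFL sketch: random fixed input-to-hidden weights (w_nm, b_m),
    direct input-output links, and output weights solved by least squares (eq. (6))."""
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # fixed random weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # fixed random biases
    H = np.tanh(X @ W + b)                                   # enhancement nodes, eq. (4)
    D = np.hstack([H, X])                                    # hidden + direct links, eq. (5)
    beta, *_ = np.linalg.lstsq(D, y, rcond=None)             # closed-form output weights
    return W, b, beta

def predict_rvfl(X, W, b, beta):
    D = np.hstack([np.tanh(X @ W + b), X])
    return D @ beta

# Toy usage: one-step-ahead forecasting of a smooth series with lagged values as inputs.
series = np.sin(np.linspace(0.0, 20.0, 300))
lags = 4
X = np.stack([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]
W, b, beta = train_rvfl(X[:250], y[:250])
pred = predict_rvfl(X[250:], W, b, beta)
```

Because only the output weights are solved (in closed form), training is non-iterative, which is what makes the RVFL cheap enough to retrain once per subseries and per WOA candidate.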

3. MICEEMDAN-WOA-RVFL: The Proposed Approach for Economic and Financial Time Series Forecasting

Referring to the framework of "decomposition and ensemble," we design a multidecomposition and self-optimizing hybrid model that integrates multiple ICEEMDANs, WOA, and RVFL networks, termed MICEEMDAN-WOA-RVFL, to forecast economic and financial time series. The architecture of the proposed hybrid model is illustrated in Figure 3.

Our proposed MICEEMDAN-WOA-RVFL takes advantage of the idea of "divide and conquer" that is frequently used in time series forecasting, image processing, fault diagnosis, and so on [46–53]. A procedure of "decomposition and ensemble" in the MICEEMDAN-WOA-RVFL is as follows:

(i) Stage 1: decomposition. ICEEMDAN is employed to separate the original time series into several subseries (i.e., several IMFs and one residue).

Figure 1: The flowchart of the whale optimization algorithm (WOA). The population P_i (i = 1, 2, ..., n) is initialized and the fitness of each individual is calculated to obtain the best individual P_best. While t is less than the maximum number of iterations, A, C, p, and l are updated for each individual; each position is updated by equation (2) if p ≥ 0.5, by equation (1) if p < 0.5 and |A| < 1, and by equation (3) otherwise; the fitness is then recalculated, P_best is updated, and t = t + 1. Finally, the best individual P_best is returned.


(ii) Stage 2: individual forecasting. Each decomposed subseries is divided into a training dataset and a testing dataset; the RVFL network with WOA optimization is developed on each training dataset independently, and the developed RVFL model is then applied to each testing dataset. The reason why we select the RVFL network as the predictor is its powerful forecasting ability demonstrated in extant research [15, 34, 44]. Since the parameter setting of the RVFL network plays an important role in the prediction performance, we introduce WOA to seek the optimal parameter values for the RVFL network in the forecasting stage.

(iii) Stage 3: ensemble. The predictions of all the decomposed subseries are aggregated as the final prediction results of one "decomposition and ensemble" using addition aggregation.

In this study, to enhance both the accuracy and stability of the final prediction, we generate random values for the decomposition parameters of ICEEMDAN in the decomposition stage, including the number of realizations (Nr), the noise standard deviation (Nsd), and the maximum number of sifting iterations (Maxsi), for each decomposition; repeat the procedure of "decomposition and ensemble" M times; and finally combine all the results of the multiple "decompositions and ensembles" using the RMSE-weighted method as the final prediction results. The corresponding weight of the ith forecasting model is

Weight_i = (1/RMSE_i) / ∑_{j=1}^{M} (1/RMSE_j),  (7)

where M denotes the number of individual models and RMSE_i indicates the RMSE value of the ith forecasting model in the training process.
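Equation (7) amounts to an inverse-RMSE weighted average of the M decomposition-ensemble outputs; the three RMSE values and prediction series below are made-up illustrations:

```python
def rmse_weights(rmse_values):
    """Equation (7): weight each model by its inverse training RMSE, normalized."""
    inv = [1.0 / r for r in rmse_values]
    total = sum(inv)
    return [v / total for v in inv]

def combine(predictions, weights):
    """Weighted combination of the M decomposition-ensemble prediction series."""
    return [sum(w * p[t] for w, p in zip(weights, predictions))
            for t in range(len(predictions[0]))]

# Toy usage with three hypothetical decomposition runs:
w = rmse_weights([0.5, 1.0, 2.0])   # lower training RMSE -> larger weight
final = combine([[10.0, 11.0], [12.0, 13.0], [14.0, 15.0]], w)
```

The weights always sum to one, so the combination stays on the same scale as the individual forecasts while favoring the runs that fit the training data best.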

Figure 3: The flowchart of the proposed MICEEMDAN-WOA-RVFL. Each of the m ICEEMDANs with randomized parameters decomposes the economic and financial time series into IMF1, ..., IMFn and a residue; each subseries is forecast by a WOA-optimized RVFL network; the subseries predictions are aggregated by addition into the predicted results of each decomposition; and the m sets of predicted results are finally ensembled into the final predicted results.

Figure 2: The architecture of the random vector functional link (RVFL) neural network, with input layer neurons i_1, ..., i_N, hidden layer neurons h_1, ..., h_M, output layer neurons o_1, ..., o_L, and weights w_{nm}, b_m, w_{ml}, and w_{nl}.


Although some recent studies also employ the RVFL network for time series forecasting, they obviously differ from the current study in the decomposition technique and network optimization: (1) they divide the original time series using WD or EMD; (2) they construct RVFL networks using fixed parameter values. Unlike these previous studies, our study decomposes economic and financial time series using ICEEMDAN and searches for the optimal parameter values of RVFL networks with WOA. Furthermore, the previous research mainly focuses on dividing the original time series by a single decomposition or a dual decomposition [20, 35, 54]. In dual decomposition, the original signal is first decomposed into several components, and then the high-frequency components continue to be decomposed into other components using the same or a different decomposition method. Essentially, the dual decomposition process belongs to one decomposition, just including two different decomposition stages. Unlike the previous research, one main improvement in this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach. To our knowledge, this is the first time that the multiple decomposition strategy has been developed for the forecasting of economic and financial time series.

4. Experimental Results

4.1. Data Description. As we know, economic and financial time series are influenced by various factors, sometimes rising and falling sharply within a short time. The dramatic fluctuations usually lead to significant nonlinearity and nonstationarity of the time series. To comprehensively evaluate the effectiveness of the proposed MICEEMDAN-WOA-RVFL, we choose four different time series, including the West Texas Intermediate crude oil spot price (WTI), the US dollar/Euro foreign exchange rate (USD/EUR), US industrial production (IP), and the Shanghai stock exchange composite index (SSEC), as the experimental datasets. The first three datasets can be accessed via the website of St. Louis Fed Research [55], and the last one can be obtained via the website of NetEase [56].

Each time series is separated into two subdatasets: the first 80% for training and the last 20% for testing. Table 1 shows the divided samples of the abovementioned four economic and financial time series.

We utilize ICEEMDAN to decompose these time series into groups of relatively simple subseries. Figure 4 offers an example of the decomposition of the WTI dataset using ICEEMDAN.

4.2. Evaluation Indices. In this study, we use four evaluation metrics, including the mean absolute percent error (MAPE), the root mean squared error (RMSE), the directional statistic (Dstat), and the Diebold–Mariano (DM) test, to assess the performance of the proposed model. Among them, MAPE and RMSE are used to assess the forecasting error, defined as follows:

MAPE = (1/N) ∑_{t=1}^{N} |(O_t − P_t)/O_t| × 100,  (8)

RMSE = √((1/N) ∑_{t=1}^{N} (O_t − P_t)²),  (9)

where N is the size of the evaluated samples, O_t denotes the actual values, and P_t represents the predicted values at time t. The lower the values of RMSE and MAPE, the better the prediction model.

The Dstat indicates the performance of direction prediction, which is formulated as follows:

Dstat = (1/N) ∑_{i=1}^{N} d_i × 100,  (10)

where d_i = 1 if (P_{t+1} − O_t)(O_{t+1} − O_t) ≥ 0, and d_i = 0 otherwise. A higher value of Dstat indicates a more accurate direction prediction.

Furthermore, to test the significance of the differences in prediction performance between pairs of models, we employ the Diebold–Mariano (DM) test in this study.
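The three error metrics can be computed directly from their definitions in equations (8)–(10); a small sketch with made-up actual and predicted values:

```python
import math

def mape(actual, pred):
    """Equation (8): mean absolute percent error."""
    n = len(actual)
    return sum(abs((o - p) / o) for o, p in zip(actual, pred)) * 100.0 / n

def rmse(actual, pred):
    """Equation (9): root mean squared error."""
    n = len(actual)
    return math.sqrt(sum((o - p) ** 2 for o, p in zip(actual, pred)) / n)

def dstat(actual, pred):
    """Equation (10): share of correctly predicted directions, in percent.
    d = 1 when (P_{t+1} - O_t)(O_{t+1} - O_t) >= 0."""
    hits = sum(1 for t in range(len(actual) - 1)
               if (pred[t + 1] - actual[t]) * (actual[t + 1] - actual[t]) >= 0)
    return hits * 100.0 / (len(actual) - 1)

# Toy usage with made-up values:
o = [50.0, 52.0, 51.0, 53.0]
p = [50.5, 51.5, 51.5, 52.5]
```

The DM test is not sketched here; it compares the loss differential of two forecasts and is available in standard econometrics packages.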

4.3. Experimental Settings. In this study, we compare the proposed MICEEMDAN-WOA-RVFL with several state-of-the-art forecasting models, including single models and ensemble models. Among all these models, the single models include one popular statistical model, RW, and two popular AI models, BPNN and least squares SVR (LSSVR). The ensemble models derive from the combination of the single models with the decomposition method ICEEMDAN.

The detailed parameters of all prediction models, the decomposition approach ICEEMDAN, and the optimization method WOA in the experiments are shown in Table 2. The parameter values of BPNN, LSSVR, RVFL, and ICEEMDAN refer to the previous literature [22, 34, 45].

All experiments were conducted using Matlab R2019b on a PC with 64-bit Microsoft Windows 10, 8 GB RAM, and a 1.8 GHz i7-8565U CPU.

4.4. Results and Analysis. We compare the forecasting performance of six prediction models, including three single models (RW, LSSVR, and BPNN) and three ensemble models (ICEEMDAN-RW, ICEEMDAN-LSSVR, and ICEEMDAN-BPNN), with that of our proposed MICEEMDAN-WOA-RVFL in terms of MAPE, RMSE, and Dstat. Due to the different horizons, we train the forecasting models separately; that is to say, we use the proposed scheme to train a different model for each horizon. Tables 3–5 report the experimental results in terms of each evaluation index with 1-, 3-, and 6-step horizons, respectively.

From Table 3, we can see that the proposed MICEEMDAN-WOA-RVFL obtains the lowest (the best) MAPE values for all horizons on all datasets. Among all the compared single models, RW obtains the best MAPE values for all horizons on all datasets,


Figure 4: The WTI crude oil price series and the corresponding subseries (IMF1–IMF11 and the residue) decomposed by ICEEMDAN, from 2 January 1986 to 15 April 2020.

Table 1: Samples of economic and financial time series.

Dataset  Type     Subset        Size   Date
WTI      Daily    Sample set    8641   2 January 1986–15 April 2020
                  Training set  6912   2 January 1986–24 May 2013
                  Testing set   1729   28 May 2013–15 April 2020
USDEUR   Daily    Sample set    5341   4 January 1999–10 April 2020
                  Training set  4272   4 January 1999–30 December 2015
                  Testing set   1069   31 December 2015–10 April 2020
IP       Monthly  Sample set    1215   January 1919–March 2020
                  Training set  972    January 1919–December 1999
                  Testing set   243    January 2000–March 2020
SSEC     Daily    Sample set    7172   19 December 1990–21 April 2020
                  Training set  5737   19 December 1990–4 June 2014
                  Testing set   1435   5 June 2014–21 April 2020


demonstrating that it is better than LSSVR and BPNN for the forecasting of economic and financial time series. Of all the ensemble models, the ICEEMDAN-LSSVR model and the ICEEMDAN-BPNN model obtain close MAPE values, obviously better than ICEEMDAN-RW.

The RMSE values of the four time series datasets are listed in Table 4. From this table, we can find that the proposed MICEEMDAN-WOA-RVFL outperforms all the single and ensemble models with all the horizons in all the datasets. The statistical model RW obtains the best RMSE values in 10 out of 12 cases, demonstrating that it is more suitable for economic and financial time series forecasting than LSSVR and BPNN among all the single models. As to the ensemble models, the proposed MICEEMDAN-WOA-RVFL obtains lower RMSE values than the compared ensemble models,

demonstrating that the former is more effective for economic and financial time series forecasting.

Table 5 shows the directional statistic Dstat, and we can see that MICEEMDAN-WOA-RVFL achieves the highest Dstat values in all the 12 cases, indicating that it has better performance in direction forecasting. Amongst the single prediction models, LSSVR and RW obtain the best Dstat values in 5 cases each, better than BPNN. Similarly, the ICEEMDAN-LSSVR and ICEEMDAN-BPNN models obtain close Dstat values, obviously better than the ICEEMDAN-RW model in all the 12 cases.

From all the prediction results, we can find that all the ensemble prediction models except ICEEMDAN-RW greatly outperform the corresponding single prediction models in all the 12 cases, showing that the framework of decomposition and ensemble is an effective tool for improving the forecasting

Table 2: The settings for the parameters.

| Method | Parameter | Description |
|---|---|---|
| ICEEMDAN | Nsd = 0.2 | Noise standard deviation |
| | Nr = 100 | Number of realizations |
| | Maxsi = 5000 | Maximum number of sifting iterations |
| LSSVR | Rp ∈ {2^-10, 2^-9, ..., 2^11, 2^12} | Regularization parameter |
| | WidRBF ∈ {2^-10, 2^-9, ..., 2^11, 2^12} | Width of the RBF kernel |
| BPNN | Nhe = 10 | Number of hidden neurons |
| | Maxte = 1000 | Maximum training epochs |
| | Lr = 0.0001 | Learning rate |
| WOA | Pop = 40 | Population size |
| | Maxgen = 100 | Maximum generation |
| MICEEMDAN-WOA-RVFL | Nsd ∈ [0.01, 0.4] | Noise standard deviation in ICEEMDAN |
| | Nr ∈ [50, 500] | Number of realizations in ICEEMDAN |
| | Maxsi ∈ [2000, 8000] | Maximum number of sifting iterations in ICEEMDAN |
| | Nhe ∈ [5, 30] | Number of hidden neurons in RVFL |
| | Func ∈ {sigmoid, sine, hardlim, tribas, radbas, sign} | Activation function in RVFL |
| | Mod ∈ {1: regularized least square, 2: Moore-Penrose pseudoinverse} | Mode in RVFL |
| | Lag ∈ [3, 20] | Lag in RVFL |
| | Bias ∈ {true, false} | Bias in RVFL |
| | Rand ∈ {1: Gaussian, 2: uniform} | Random type in RVFL |
| | Scale ∈ [0.1, 1] | Scale value in RVFL |
| | ScaleMode ∈ {1: scale the features for all neurons, 2: scale the features for each hidden neuron, 3: scale the range of the randomization for uniform distribution} | Scale mode in RVFL |

Table 3: The mean absolute percent error (MAPE) values of different prediction models.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN |
|---|---|---|---|---|---|---|---|---|
| WTI | 1 | 0.0036 | 0.0176 | 0.0177 | 0.0182 | 0.0200 | 0.0044 | 0.0055 |
| | 3 | 0.0080 | 0.0307 | 0.0309 | 0.0320 | 0.0331 | 0.0092 | 0.0094 |
| | 6 | 0.0113 | 0.0441 | 0.0449 | 0.0457 | 0.0458 | 0.1202 | 0.0130 |
| USDEUR | 1 | 0.0006 | 0.0034 | 0.0034 | 0.0034 | 0.0044 | 0.0010 | 0.0010 |
| | 3 | 0.0015 | 0.0063 | 0.0063 | 0.0063 | 0.0071 | 0.0020 | 0.0021 |
| | 6 | 0.0022 | 0.0087 | 0.0088 | 0.0087 | 0.0094 | 0.0032 | 0.0031 |
| IP | 1 | 0.0012 | 0.0049 | 0.0057 | 0.0056 | 0.0053 | 0.0024 | 0.0022 |
| | 3 | 0.0023 | 0.0102 | 0.0140 | 0.0123 | 0.0104 | 0.0033 | 0.0033 |
| | 6 | 0.0032 | 0.0183 | 0.0279 | 0.0235 | 0.0184 | 0.0043 | 0.0049 |
| SSEC | 1 | 0.0020 | 0.0096 | 0.0096 | 0.0097 | 0.0111 | 0.0026 | 0.0026 |
| | 3 | 0.0044 | 0.0178 | 0.0178 | 0.0179 | 0.0185 | 0.0047 | 0.0051 |
| | 6 | 0.0065 | 0.0259 | 0.0259 | 0.0269 | 0.0268 | 0.0072 | 0.0074 |


performance. Of all the compared models, the proposed MICEEMDAN-WOA-RVFL obtains the highest Dstat values and the lowest MAPE and RMSE values in all the time series datasets, showing that it is completely superior to the benchmark prediction models. Furthermore, for each prediction model, the MAPE and RMSE values increase while the Dstat values decrease with the horizon. This demonstrates that it is easier to forecast time series with a short horizon than with a long one. It is worth noting that the proposed MICEEMDAN-WOA-RVFL still achieves relatively good MAPE and RMSE values with the increase of horizon among the compared models. For example, when the proposed model obtains a Dstat of 0.7737 with horizon 6 in the WTI dataset, it achieves a relatively low MAPE (0.0113) and RMSE (0.8146), indicating that the prediction values are very close to the real values although the proposed model misses the direction in about 22.63% of cases. In other words, the proposed MICEEMDAN-WOA-RVFL can still achieve satisfactory forecasting accuracy with a long horizon.

Furthermore, we can find that the multiple decomposition strategy does not improve the forecast for the RW model. One possible explanation is simply that RW assumes that the past movement or trend of a time series cannot be used to predict its future movement, and thus it cannot take advantage of the historical data and the diversity of multiple decompositions to make the future prediction. Therefore, when we aggregate the multiple RW model predictions of all the decomposed subseries, we just integrate several random predictions and thus cannot significantly improve the ensemble prediction results. In contrast, the LSSVR and

BPNN, as well as RVFL, can fully use all the historical data and the diversity of multiple decompositions to make the future prediction. Specifically, the multiple decomposition using different parameters generates many groups of different decomposed subseries, and this diversity of decomposition can successfully overcome the randomness of one single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach.
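The multiple decomposition strategy described above can be sketched as follows. Note that `decompose` and `forecast_subseries` are toy stand-ins (a noise-assisted smoothing split and a persistence forecast) introduced only to show the random-parameter loop and the additive-then-averaging aggregation; the paper uses the full ICEEMDAN algorithm and a WOA-tuned RVFL network per subseries.

```python
import numpy as np

def decompose(series, noise_std, n_realizations, rng):
    """Toy stand-in for ICEEMDAN: a noise-assisted split into a high-frequency
    component and a smooth residue (the real algorithm yields many IMFs)."""
    kernel = np.ones(5) / 5
    smooth = np.convolve(series, kernel, mode="same")
    for _ in range(n_realizations):
        noisy = series + rng.normal(0.0, noise_std * series.std(), series.size)
        smooth = 0.5 * (smooth + np.convolve(noisy, kernel, mode="same"))
    return [series - smooth, smooth]

def forecast_subseries(subseries):
    """Toy stand-in predictor (persistence forecast)."""
    return subseries[-1]

def multi_decomposition_forecast(series, n_decomps=100, seed=0):
    """Decompose the series n_decomps times with random parameters (ranges
    from Table 2), forecast each subseries individually, aggregate each
    decomposition's forecasts by addition, and average across decompositions."""
    rng = np.random.default_rng(seed)
    per_decomp = []
    for _ in range(n_decomps):
        nsd = rng.uniform(0.01, 0.4)        # random noise standard deviation
        nr = int(rng.integers(50, 500))     # random number of realizations
        comps = decompose(series, nsd, nr, rng)
        per_decomp.append(sum(forecast_subseries(c) for c in comps))
    return float(np.mean(per_decomp))
```

Because each decomposition's subseries sum back to the original series, the per-decomposition forecasts stay on the scale of the original series, and averaging across decompositions smooths out the randomness of any single parameter draw.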

In addition, the Diebold-Mariano (DM) test is utilized to evaluate whether the forecasting accuracy of the proposed MICEEMDAN-WOA-RVFL significantly outperforms those of the other compared models. Table 6 shows the statistics and p values (in brackets).
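A compact sketch of the DM test under squared-error loss is given below. This is a generic formulation (rectangular-kernel long-run variance with horizon minus one autocovariance lags and a normal limit); the paper does not restate its exact loss function or variance estimator in this excerpt.

```python
import numpy as np
from math import erf, sqrt

def diebold_mariano(errors_a, errors_b, horizon=1):
    """Diebold-Mariano test of equal predictive accuracy under squared-error
    loss. A negative statistic means model A has the lower loss."""
    d = np.asarray(errors_a, float) ** 2 - np.asarray(errors_b, float) ** 2
    t, d_bar = d.size, d.mean()
    # Autocovariances of the loss differential up to lag horizon - 1.
    gamma = [np.mean((d[k:] - d_bar) * (d[:t - k] - d_bar)) for k in range(horizon)]
    dm = d_bar / sqrt((gamma[0] + 2 * sum(gamma[1:])) / t)
    p = 1 - erf(abs(dm) / sqrt(2))   # two-sided p value from the normal limit
    return dm, p
```

Reading Table 6 with this convention, a large negative statistic with a p value near zero means the row ("tested") model forecasts significantly more accurately than the column model.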

On one hand, the DM statistical values between the ensemble prediction models and their corresponding single predictors are much lower than zero, and the corresponding p values are almost equal to zero with all the horizons, except for the RW model, showing that the architecture of "decomposition and ensemble" contributes to greatly improving prediction accuracy and that the combination of ICEEMDAN and AI predictors is more effective for economic and financial time series forecasting.

On the other hand, the DM test results on the predictions of all the four time series datasets indicate that MICEEMDAN-WOA-RVFL is significantly better than the single models and the other ensemble models with all the horizons, and the corresponding p values are much lower than 0.01 in all the cases.

In summary, the DM test results demonstrate that the combination of multiple ICEEMDANs, RVFL networks, and

Table 4: The root mean squared error (RMSE) values of different prediction models.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN |
|---|---|---|---|---|---|---|---|---|
| WTI | 1 | 0.2715 | 1.3196 | 1.3228 | 1.3744 | 1.8134 | 0.3467 | 0.4078 |
| | 3 | 0.5953 | 2.1784 | 2.1929 | 2.2867 | 2.5041 | 0.6754 | 0.7620 |
| | 6 | 0.8146 | 3.0737 | 3.1271 | 3.1845 | 3.2348 | 0.8692 | 0.9462 |
| USDEUR | 1 | 0.0009 | 0.0051 | 0.0052 | 0.0052 | 0.0099 | 0.0015 | 0.0016 |
| | 3 | 0.0022 | 0.0091 | 0.0092 | 0.0092 | 0.0129 | 0.0031 | 0.0031 |
| | 6 | 0.0033 | 0.0127 | 0.0128 | 0.0127 | 0.0154 | 0.0047 | 0.0046 |
| IP | 1 | 0.2114 | 0.7528 | 0.8441 | 0.8230 | 0.8205 | 0.3861 | 0.4175 |
| | 3 | 0.3875 | 1.4059 | 1.8553 | 1.8311 | 1.4401 | 0.5533 | 0.5294 |
| | 6 | 0.5340 | 2.4459 | 3.6015 | 2.8116 | 2.4569 | 0.6408 | 0.8264 |
| SSEC | 1 | 10.8115 | 50.0742 | 49.8800 | 50.8431 | 63.3951 | 12.6460 | 13.7928 |
| | 3 | 22.2983 | 89.1834 | 88.5290 | 89.1781 | 93.2887 | 25.2526 | 29.7784 |
| | 6 | 35.7201 | 131.6386 | 131.3487 | 133.5647 | 137.2598 | 40.4357 | 49.6341 |

Table 5: The directional statistic (Dstat) values of different prediction models.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN |
|---|---|---|---|---|---|---|---|---|
| WTI | 1 | 0.9381 | 0.4815 | 0.5243 | 0.5191 | 0.4977 | 0.9190 | 0.9097 |
| | 3 | 0.8576 | 0.4873 | 0.5087 | 0.5012 | 0.4907 | 0.8300 | 0.8449 |
| | 6 | 0.7737 | 0.5116 | 0.4902 | 0.4948 | 0.5185 | 0.7714 | 0.7575 |
| USDEUR | 1 | 0.9410 | 0.5019 | 0.4719 | 0.4897 | 0.5056 | 0.8998 | 0.9026 |
| | 3 | 0.8502 | 0.4888 | 0.4897 | 0.4925 | 0.4916 | 0.8118 | 0.8024 |
| | 6 | 0.7828 | 0.4991 | 0.5318 | 0.5253 | 0.5047 | 0.7023 | 0.7154 |
| IP | 1 | 0.9256 | 0.5579 | 0.5909 | 0.5785 | 0.5620 | 0.8636 | 0.8760 |
| | 3 | 0.8802 | 0.6983 | 0.5455 | 0.4876 | 0.6529 | 0.8430 | 0.8058 |
| | 6 | 0.8141 | 0.6198 | 0.5537 | 0.5661 | 0.6405 | 0.7355 | 0.7645 |
| SSEC | 1 | 0.9156 | 0.4944 | 0.5049 | 0.5098 | 0.4979 | 0.9024 | 0.9002 |
| | 3 | 0.8396 | 0.5042 | 0.5021 | 0.5007 | 0.4965 | 0.8222 | 0.8145 |
| | 6 | 0.7587 | 0.4833 | 0.5063 | 0.4965 | 0.4805 | 0.7448 | 0.7455 |


Table 6: Results of the Diebold-Mariano (DM) test. Each cell reports the DM statistic of the tested (row) model against the column model, with the p value in parentheses.

| Dataset | Horizon | Tested model | ICEEMDAN-LSSVR | ICEEMDAN-BPNN | ICEEMDAN-RW | LSSVR | BPNN | RW |
|---|---|---|---|---|---|---|---|---|
| WTI | 1 | MICEEMDAN-WOA-RVFL | −4.9081 (≤0.0001) | −7.0030 (≤0.0001) | −6.1452 (≤0.0001) | −10.7240 (≤0.0001) | −10.3980 (≤0.0001) | −10.6210 (≤0.0001) |
| | | ICEEMDAN-LSSVR | | −3.5925 (0.0003) | −6.0620 (≤0.0001) | −10.5640 (≤0.0001) | −10.3680 (≤0.0001) | −10.4660 (≤0.0001) |
| | | ICEEMDAN-BPNN | | | −5.9718 (≤0.0001) | −10.2070 (≤0.0001) | −9.9529 (≤0.0001) | −10.0960 (≤0.0001) |
| | | ICEEMDAN-RW | | | | 3.0776 (0.0021) | 2.7682 (0.0057) | 3.0951 (0.0020) |
| | | LSSVR | | | | | −1.8480 (0.0648) | 1.0058 (0.3147) |
| | | BPNN | | | | | | 1.9928 (0.0464) |
| | 3 | MICEEMDAN-WOA-RVFL | −4.9136 (≤0.0001) | −3.6146 (0.0003) | −10.8660 (≤0.0001) | −16.5990 (≤0.0001) | −15.6260 (≤0.0001) | −16.6720 (≤0.0001) |
| | | ICEEMDAN-LSSVR | | −2.1143 (0.0346) | −10.7560 (≤0.0001) | −16.7020 (≤0.0001) | −15.6780 (≤0.0001) | −16.7760 (≤0.0001) |
| | | ICEEMDAN-BPNN | | | −10.5250 (≤0.0001) | −16.2160 (≤0.0001) | −15.3500 (≤0.0001) | −16.2430 (≤0.0001) |
| | | ICEEMDAN-RW | | | | 3.0818 (0.0021) | 2.0869 (0.0370) | 3.2155 (0.0012) |
| | | LSSVR | | | | | −2.7750 (0.0056) | 2.3836 (0.0173) |
| | | BPNN | | | | | | 3.0743 (0.0021) |
| | 6 | MICEEMDAN-WOA-RVFL | −3.9302 (≤0.0001) | −5.1332 (≤0.0001) | −18.9130 (≤0.0001) | −19.0620 (≤0.0001) | −21.9930 (≤0.0001) | −19.5080 (≤0.0001) |
| | | ICEEMDAN-LSSVR | | −3.4653 (0.0005) | −18.9170 (≤0.0001) | −19.1680 (≤0.0001) | −22.0120 (≤0.0001) | −19.5420 (≤0.0001) |
| | | ICEEMDAN-BPNN | | | −18.8380 (≤0.0001) | −19.1650 (≤0.0001) | −21.9930 (≤0.0001) | −19.5200 (≤0.0001) |
| | | ICEEMDAN-RW | | | | 2.6360 (0.0085) | 0.5129 (0.6081) | 4.1421 (≤0.0001) |
| | | LSSVR | | | | | −2.2124 (0.0271) | 3.9198 (≤0.0001) |
| | | BPNN | | | | | | 4.3545 (≤0.0001) |
| USDEUR | 1 | MICEEMDAN-WOA-RVFL | −8.8860 (≤0.0001) | −8.6031 (≤0.0001) | −4.4893 (≤0.0001) | −15.4300 (≤0.0001) | −15.4100 (≤0.0001) | −15.3440 (≤0.0001) |
| | | ICEEMDAN-LSSVR | | −0.8819 (0.3780) | −4.4230 (≤0.0001) | −14.7210 (≤0.0001) | −14.7320 (≤0.0001) | −14.6280 (≤0.0001) |
| | | ICEEMDAN-BPNN | | | −4.4158 (≤0.0001) | −14.6310 (≤0.0001) | −14.6440 (≤0.0001) | −14.5310 (≤0.0001) |
| | | ICEEMDAN-RW | | | | 3.3089 (0.0010) | 3.2912 (0.0010) | 3.3244 (0.0009) |
| | | LSSVR | | | | | −1.8732 (0.06132) | 2.9472 (0.0033) |
| | | BPNN | | | | | | 3.4378 (0.0006) |
| | 3 | MICEEMDAN-WOA-RVFL | −10.1370 (≤0.0001) | −7.9933 (≤0.0001) | −5.7359 (≤0.0001) | −18.3380 (≤0.0001) | −18.6460 (≤0.0001) | −18.2090 (≤0.0001) |
| | | ICEEMDAN-LSSVR | | −1.4797 (0.1392) | −5.5972 (≤0.0001) | −17.9030 (≤0.0001) | −18.1920 (≤0.0001) | −17.7730 (≤0.0001) |
| | | ICEEMDAN-BPNN | | | −5.6039 (≤0.0001) | −17.8370 (≤0.0001) | −18.1350 (≤0.0001) | −17.6910 (≤0.0001) |
| | | ICEEMDAN-RW | | | | 3.0048 (0.0027) | 2.9552 (0.0032) | 3.0206 (0.0026) |
| | | LSSVR | | | | | −1.4077 (0.1595) | 1.5242 (0.1278) |
| | | BPNN | | | | | | 2.1240 (0.0339) |
| | 6 | MICEEMDAN-WOA-RVFL | −11.1070 (≤0.0001) | −10.0570 (≤0.0001) | −7.4594 (≤0.0001) | −18.3010 (≤0.0001) | −18.4640 (≤0.0001) | −18.4940 (≤0.0001) |
| | | ICEEMDAN-LSSVR | | 1.9379 (0.0529) | −7.0858 (≤0.0001) | −17.4490 (≤0.0001) | −17.5970 (≤0.0001) | −17.6370 (≤0.0001) |
| | | ICEEMDAN-BPNN | | | −7.1244 (≤0.0001) | −17.5850 (≤0.0001) | −17.7340 (≤0.0001) | −17.7690 (≤0.0001) |
| | | ICEEMDAN-RW | | | | 2.5399 (0.0112) | 2.6203 (0.0089) | 2.6207 (0.0089) |
| | | LSSVR | | | | | 2.2946 (0.0220) | 2.5913 (0.0097) |
| | | BPNN | | | | | | −0.2090 (0.8345) |
| IP | 1 | MICEEMDAN-WOA-RVFL | −3.4304 (0.0007) | −2.4253 (0.0098) | −3.4369 (0.0007) | −3.5180 (0.0005) | −4.1486 (≤0.0001) | −3.3215 (0.0015) |
| | | ICEEMDAN-LSSVR | | −0.6714 (0.5026) | −3.2572 (0.0013) | −3.3188 (0.0010) | −4.0082 (≤0.0001) | −2.9573 (0.0034) |
| | | ICEEMDAN-BPNN | | | −3.6265 (0.0004) | −3.7455 (0.0002) | −4.6662 (≤0.0001) | −3.3790 (0.0008) |
| | | ICEEMDAN-RW | | | | −0.7438 (0.4577) | −0.0502 (0.9600) | 2.6089 (0.0096) |
| | | LSSVR | | | | | 0.4094 (0.6826) | 3.4803 (0.0006) |
| | | BPNN | | | | | | 1.6021 (0.1104) |
| | 3 | MICEEMDAN-WOA-RVFL | −4.5687 (≤0.0001) | −3.7062 (0.0003) | −5.9993 (≤0.0001) | −6.8919 (≤0.0001) | −5.5008 (≤0.0001) | −5.8037 (≤0.0001) |
| | | ICEEMDAN-LSSVR | | 1.1969 (0.2325) | −5.5286 (≤0.0001) | −6.6325 (≤0.0001) | −5.2260 (≤0.0001) | −5.3260 (≤0.0001) |
| | | ICEEMDAN-BPNN | | | −5.7199 (≤0.0001) | −6.7722 (≤0.0001) | −5.2955 (≤0.0001) | −5.5099 (≤0.0001) |
| | | ICEEMDAN-RW | | | | −5.6536 (≤0.0001) | −3.6721 (0.0003) | 1.7609 (0.0795) |
| | | LSSVR | | | | | 0.2479 (0.8044) | 6.3126 (≤0.0001) |
| | | BPNN | | | | | | 3.9699 (≤0.0001) |
| | 6 | MICEEMDAN-WOA-RVFL | −5.9152 (≤0.0001) | −3.5717 (0.0004) | −5.9430 (≤0.0001) | −7.5672 (≤0.0001) | −10.8900 (≤0.0001) | −5.8450 (≤0.0001) |
| | | ICEEMDAN-LSSVR | | −2.5163 (0.0125) | −5.8465 (≤0.0001) | −7.5185 (≤0.0001) | −10.7140 (≤0.0001) | −5.7483 (≤0.0001) |
| | | ICEEMDAN-BPNN | | | −5.6135 (≤0.0001) | −7.4370 (≤0.0001) | −10.3470 (≤0.0001) | −5.5254 (≤0.0001) |
| | | ICEEMDAN-RW | | | | −7.6719 (≤0.0001) | −2.2903 (0.02287) | 0.8079 (0.4200) |
| | | LSSVR | | | | | 3.8137 (0.0002) | 7.7861 (≤0.0001) |
| | | BPNN | | | | | | 2.3351 (0.0204) |
| SSEC | 1 | MICEEMDAN-WOA-RVFL | −4.0255 (≤0.0001) | −3.7480 (0.0004) | −7.9407 (≤0.0001) | −10.8090 (≤0.0001) | −10.3070 (≤0.0001) | −10.7330 (≤0.0001) |
| | | ICEEMDAN-LSSVR | | −1.5083 (0.1317) | −7.8406 (≤0.0001) | −10.5410 (≤0.0001) | −10.0710 (≤0.0001) | −10.5020 (≤0.0001) |
| | | ICEEMDAN-BPNN | | | −7.7986 (≤0.0001) | −10.5470 (≤0.0001) | −10.0580 (≤0.0001) | −10.5030 (≤0.0001) |
| | | ICEEMDAN-RW | | | | 3.5571 (0.0004) | 3.2632 (0.0011) | 3.5150 (0.0005) |
| | | LSSVR | | | | | −1.0197 (0.3081) | −0.6740 (0.5004) |
| | | BPNN | | | | | | 0.8078 (0.4193) |
| | 3 | MICEEMDAN-WOA-RVFL | −3.8915 (0.0001) | −3.7312 (0.0002) | −10.5890 (≤0.0001) | −10.5480 (≤0.0001) | −10.7470 (≤0.0001) | −10.1260 (≤0.0001) |
| | | ICEEMDAN-LSSVR | | −2.4145 (0.0159) | −10.5830 (≤0.0001) | −10.5240 (≤0.0001) | −10.7760 (≤0.0001) | −10.1110 (≤0.0001) |
| | | ICEEMDAN-BPNN | | | −10.2420 (≤0.0001) | −10.1860 (≤0.0001) | −10.4400 (≤0.0001) | −9.7595 (≤0.0001) |
| | | ICEEMDAN-RW | | | | 3.4031 (0.0007) | 2.3198 (0.0205) | 3.1912 (0.0014) |
| | | LSSVR | | | | | −0.5931 (0.5532) | −1.2284 (0.2194) |
| | | BPNN | | | | | | −0.0042 (0.9966) |
| | 6 | MICEEMDAN-WOA-RVFL | −2.9349 (0.0034) | −3.7544 (0.0002) | −10.2390 (≤0.0001) | −10.1970 (≤0.0001) | −11.0150 (≤0.0001) | −10.1630 (≤0.0001) |
| | | ICEEMDAN-LSSVR | | −2.5830 (0.0099) | −10.1390 (≤0.0001) | −10.0940 (≤0.0001) | −10.9630 (≤0.0001) | −10.0690 (≤0.0001) |
| | | ICEEMDAN-BPNN | | | −9.6418 (≤0.0001) | −9.5562 (≤0.0001) | −10.4090 (≤0.0001) | −9.5122 (≤0.0001) |
| | | ICEEMDAN-RW | | | | 2.3167 (0.0207) | 1.3656 (0.1723) | 2.2227 (0.02639) |
| | | LSSVR | | | | | −1.9027 (0.05728) | −0.5814 (0.5611) |
| | | BPNN | | | | | | 1.5707 (0.1165) |


WOA optimization can significantly enhance the prediction accuracy of economic and financial time series forecasting.

5. Discussion

To better investigate the proposed MICEEMDAN-WOA-RVFL, we further discuss the developed prediction model in this section, including the comparison of single decomposition and multiple decompositions, the optimization effectiveness of WOA, and the impact of ensemble size.

5.1. Comparison of Single Decomposition and Multiple Decompositions. One of the main novelties of this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and improve the prediction accuracy and stability of the developed forecasting model. To evaluate the effectiveness of the multiple decomposition strategy, we compare the prediction results of MICEEMDAN-WOA-RVFL and ICEEMDAN-WOA-RVFL. The former ensembles the prediction results of M (M = 100) individual ICEEMDAN decompositions with random parameters, while the latter employs only one ICEEMDAN decomposition. We randomly choose 5 out of these 100 decompositions and execute ICEEMDAN-WOA-RVFL for time series forecasting. Tables 7-9 report the MAPE, RMSE, and Dstat values of MICEEMDAN-WOA-RVFL and the five single-decomposition ICEEMDAN-WOA-RVFL models, as well as the corresponding mean values of these five single-decomposition models.

On one hand, compared with the prediction results of the five single decompositions and their mean prediction results, the proposed MICEEMDAN-WOA-RVFL achieves the lowest MAPE and the highest Dstat values in all the 12 cases, and the lowest RMSE values in 11 out of the 12 cases, indicating that the multiple decomposition strategy can successfully overcome the randomness of single decomposition and improve the ensemble prediction accuracy.

On the other hand, we can find that the multiple decomposition strategy can greatly improve the stability of the prediction model. For example, the MAPE values of the five single-decomposition models with horizon 6 in the IP time series dataset range from 0.0034 to 0.1114, indicating that different single decompositions can produce relatively large differences in prediction results. When we employ the multiple decomposition strategy, we can overcome the randomness of single decomposition and thus enhance prediction stability.

In summary, the experimental results suggest that the multiple decomposition strategy and prediction ensemble can effectively enhance prediction accuracy and stability. The main reasons for the prediction improvement lie in three aspects: (1) the multiple decomposition can reduce the

randomness of one single decomposition and simultaneously generate groups of differentiated subseries; (2) predictions using these groups of differentiated subseries can achieve diverse prediction results; and (3) the selection and ensemble of these diverse prediction results can ensure both accuracy and diversity and thus improve the final ensemble prediction.

5.2. The Optimization Effectiveness of WOA. When we use RVFL networks to construct predictors, a number of parameters need to be set in advance. In this study, WOA is introduced to search the optimal parameter values for the RVFL predictors using its powerful optimization ability. To investigate the optimization effectiveness of WOA for parameter search, we compare the proposed MICEEMDAN-WOA-RVFL with MICEEMDAN-RVFL without WOA optimization. Following the literature [45], we fixed the number of hidden neurons Nhe = 100, the activation function Func = sigmoid, and the random type Rand = Gaussian in MICEEMDAN-RVFL. The MAPE, RMSE, and Dstat values are reported in Tables 10-12, respectively.
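The WOA search loop (encircling the current best, random exploration, and the bubble-net spiral) over continuous parameter ranges such as those in Table 2 can be sketched as follows. This is a generic WOA implementation, not the authors' exact code; integer or categorical RVFL parameters (e.g. Nhe, Func) would need rounding or decoding inside the objective function.

```python
import numpy as np

def woa_minimize(objective, bounds, pop=40, max_gen=100, seed=0):
    """Minimal Whale Optimization Algorithm sketch for continuous parameters.
    `bounds` is a list of (low, high) pairs, one per parameter dimension."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, float).T
    X = rng.uniform(lo, hi, (pop, lo.size))          # whale positions
    fit = np.apply_along_axis(objective, 1, X)
    best = X[fit.argmin()].copy()
    for gen in range(max_gen):
        a = 2 - 2 * gen / max_gen                    # decreases linearly 2 -> 0
        for i in range(pop):
            r1, r2 = rng.random(lo.size), rng.random(lo.size)
            A, C = 2 * a * r1 - a, 2 * r2
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1):            # encircle the best whale
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                                # explore a random whale
                    rand = X[rng.integers(pop)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                                    # bubble-net spiral update
                l = rng.uniform(-1, 1, lo.size)
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lo, hi)
        fit = np.apply_along_axis(objective, 1, X)
        if fit.min() < objective(best):
            best = X[fit.argmin()].copy()
    return best
```

In the proposed model the objective would be the validation RMSE of an RVFL network trained with the candidate parameter vector, so each WOA generation evaluates `pop` candidate RVFL configurations.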

In all the four time series datasets, the prediction performance of the proposed MICEEMDAN-WOA-RVFL model is better than or equal to that of the MICEEMDAN-RVFL model without WOA optimization in terms of MAPE and RMSE in all the 12 cases, except for the RMSE value with horizon 1 in the SSEC dataset, as listed in Tables 10 and 11. In addition, MICEEMDAN-WOA-RVFL obtains higher Dstat values in 10 out of 12 cases, as can be seen in Table 12. All these results indicate that WOA can effectively search the optimal parameter settings for RVFL networks, further improving the overall prediction performance.

5.3. The Impact of Ensemble Size. Previous research has demonstrated that an ensemble strategy using all individual prediction models is unlikely to work well and that the selection of individual prediction models contributes to improving the ensemble prediction performance [57]. In this study, we sort all individual prediction models based on their past performance (RMSE values) and then select the top N percent as the ensemble to construct the ensemble prediction model. To further investigate the impact of ensemble size on ensemble prediction, we use different ensemble sizes (es = 10, 20, ..., 100) to select the top N percent of individual forecasting models to develop the ensemble prediction model and conduct the one-step-ahead forecasting experiment on the four time series datasets. The results are demonstrated in Figure 5.
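The top-N-percent selection described above can be sketched as follows; this is a minimal illustration with a simple-average ensemble, which is an assumption where the excerpt does not specify the combination rule.

```python
import numpy as np

def select_and_ensemble(predictions, past_rmse, ensemble_size=30):
    """Keep the best `ensemble_size` percent of individual models (ranked by
    past RMSE, lower is better) and average their forecasts.
    `predictions`: shape (n_models, n_test), one forecast series per model."""
    n_models = len(past_rmse)
    n_keep = max(1, round(n_models * ensemble_size / 100))
    keep = np.argsort(past_rmse)[:n_keep]              # indices of best models
    return np.asarray(predictions, float)[keep].mean(axis=0)
```

Scanning `ensemble_size` from 10 to 100 reproduces the experiment behind Figure 5: small sizes keep only the strongest models, while size 100 falls back to averaging every individual model.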

We can see that MICEEMDAN-WOA-RVFL obtains the best forecasting performance in the WTI, IP, and SSEC datasets when the ensemble size es is in the range of 20-40, and in the USDEUR dataset when the ensemble size es


Table 7: The mean absolute percent error (MAPE) values of single decomposition and multiple decompositions.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean |
|---|---|---|---|---|---|---|---|---|
| WTI | 1 | 0.0036 | 0.0043 | 0.0045 | 0.0045 | 0.0041 | 0.0040 | 0.0043 |
| | 3 | 0.0080 | 0.0088 | 0.0084 | 0.0084 | 0.0082 | 0.0083 | 0.0084 |
| | 6 | 0.0113 | 0.0132 | 0.0115 | 0.0113 | 0.0118 | 0.0120 | 0.0112 |
| USDEUR | 1 | 0.0006 | 0.0011 | 0.0010 | 0.0008 | 0.0010 | 0.0008 | 0.0009 |
| | 3 | 0.0015 | 0.0021 | 0.0021 | 0.0015 | 0.0017 | 0.0015 | 0.0018 |
| | 6 | 0.0022 | 0.0033 | 0.0022 | 0.0023 | 0.0023 | 0.0024 | 0.0025 |
| IP | 1 | 0.0012 | 0.0016 | 0.0016 | 0.0017 | 0.0017 | 0.0015 | 0.0016 |
| | 3 | 0.0023 | 0.0024 | 0.0026 | 0.0030 | 0.0028 | 0.0024 | 0.0026 |
| | 6 | 0.0032 | 0.0036 | 0.0034 | 0.1114 | 0.0036 | 0.0041 | 0.0252 |
| SSEC | 1 | 0.0020 | 0.0024 | 0.0023 | 0.0026 | 0.0023 | 0.0023 | 0.0024 |
| | 3 | 0.0044 | 0.0045 | 0.0046 | 0.0052 | 0.0045 | 0.0047 | 0.0047 |
| | 6 | 0.0065 | 0.0067 | 0.0070 | 0.0085 | 0.0068 | 0.0075 | 0.0073 |

Table 8: The root mean squared error (RMSE) values of single decomposition and multiple decompositions.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean |
|---|---|---|---|---|---|---|---|---|
| WTI | 1 | 0.2715 | 0.3316 | 0.3376 | 0.3352 | 0.3191 | 0.3148 | 0.3277 |
| | 3 | 0.5953 | 0.6557 | 0.6214 | 0.6200 | 0.6058 | 0.6133 | 0.6232 |
| | 6 | 0.8146 | 0.9490 | 0.8239 | 0.8199 | 0.8508 | 0.8597 | 0.8607 |
| USDEUR | 1 | 0.0009 | 0.0017 | 0.0017 | 0.0012 | 0.0015 | 0.0012 | 0.0015 |
| | 3 | 0.0022 | 0.0031 | 0.0034 | 0.0023 | 0.0025 | 0.0023 | 0.0027 |
| | 6 | 0.0033 | 0.0048 | 0.0033 | 0.0035 | 0.0035 | 0.0036 | 0.0037 |
| IP | 1 | 0.2114 | 0.2894 | 0.3080 | 0.3656 | 0.3220 | 0.2589 | 0.3088 |
| | 3 | 0.3875 | 0.4098 | 0.4507 | 0.4809 | 0.4776 | 0.4027 | 0.4443 |
| | 6 | 0.5340 | 0.5502 | 0.5563 | 1.0081 | 0.5697 | 0.6466 | 0.6661 |
| SSEC | 1 | 10.8115 | 11.5515 | 13.0728 | 16.7301 | 12.2465 | 14.3671 | 13.5936 |
| | 3 | 22.2983 | 22.0603 | 22.8140 | 33.9937 | 23.9755 | 25.8004 | 25.7288 |
| | 6 | 35.7201 | 37.0255 | 41.9033 | 55.5066 | 38.6140 | 48.2131 | 44.2525 |

Table 9: The directional statistic (Dstat) values of single decomposition and multiple decompositions.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean |
|---|---|---|---|---|---|---|---|---|
| WTI | 1 | 0.9381 | 0.9201 | 0.9161 | 0.9184 | 0.9323 | 0.9271 | 0.9228 |
| | 3 | 0.8576 | 0.8414 | 0.8548 | 0.8385 | 0.8553 | 0.8513 | 0.8483 |
| | 6 | 0.7737 | 0.7419 | 0.7703 | 0.7668 | 0.7627 | 0.7616 | 0.7607 |
| USDEUR | 1 | 0.9410 | 0.9008 | 0.9270 | 0.9204 | 0.9073 | 0.9157 | 0.9142 |
| | 3 | 0.8502 | 0.8052 | 0.8418 | 0.8399 | 0.8296 | 0.8427 | 0.8318 |
| | 6 | 0.7828 | 0.6826 | 0.7809 | 0.7819 | 0.7772 | 0.7706 | 0.7586 |
| IP | 1 | 0.9256 | 0.8967 | 0.8926 | 0.9132 | 0.8967 | 0.9174 | 0.9033 |
| | 3 | 0.8802 | 0.8760 | 0.8430 | 0.8223 | 0.8471 | 0.8802 | 0.8537 |
| | 6 | 0.8141 | 0.8017 | 0.8017 | 0.7066 | 0.7851 | 0.8017 | 0.7794 |
| SSEC | 1 | 0.9156 | 0.9052 | 0.9052 | 0.9052 | 0.9087 | 0.9135 | 0.9076 |
| | 3 | 0.8396 | 0.8222 | 0.8292 | 0.8208 | 0.8368 | 0.8264 | 0.8271 |
| | 6 | 0.7587 | 0.7462 | 0.7441 | 0.7134 | 0.7594 | 0.7448 | 0.7416 |

Table 10: The mean absolute percent error (MAPE) values with and without WOA optimization.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL |
|---|---|---|---|
| WTI | 1 | 0.0036 | 0.0037 |
| | 3 | 0.0080 | 0.0082 |
| | 6 | 0.0113 | 0.0118 |
| USDEUR | 1 | 0.0006 | 0.0006 |
| | 3 | 0.0015 | 0.0015 |
| | 6 | 0.0022 | 0.0023 |
| IP | 1 | 0.0012 | 0.0013 |
| | 3 | 0.0023 | 0.0023 |
| | 6 | 0.0032 | 0.0036 |
| SSEC | 1 | 0.0020 | 0.0020 |
| | 3 | 0.0044 | 0.0044 |
| | 6 | 0.0065 | 0.0066 |


Table 11: The root mean squared error (RMSE) values with and without WOA optimization.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL |
|---|---|---|---|
| WTI | 1 | 0.2715 | 0.2824 |
| | 3 | 0.5953 | 0.6079 |
| | 6 | 0.8146 | 0.8485 |
| USDEUR | 1 | 0.0009 | 0.0010 |
| | 3 | 0.0022 | 0.0022 |
| | 6 | 0.0033 | 0.0034 |
| IP | 1 | 0.2114 | 0.2362 |
| | 3 | 0.3875 | 0.4033 |
| | 6 | 0.5340 | 0.5531 |
| SSEC | 1 | 10.8115 | 10.7616 |
| | 3 | 22.2983 | 22.7456 |
| | 6 | 35.7201 | 36.3150 |

Table 12: The directional statistic (Dstat) values with and without WOA optimization.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL |
|---|---|---|---|
| WTI | 1 | 0.9381 | 0.9358 |
| | 3 | 0.8576 | 0.8565 |
| | 6 | 0.7737 | 0.7656 |
| USDEUR | 1 | 0.9410 | 0.9317 |
| | 3 | 0.8502 | 0.8455 |
| | 6 | 0.7828 | 0.7762 |
| IP | 1 | 0.9256 | 0.9174 |
| | 3 | 0.8802 | 0.8430 |
| | 6 | 0.8141 | 0.7893 |
| SSEC | 1 | 0.9156 | 0.9191 |
| | 3 | 0.8396 | 0.8351 |
| | 6 | 0.7587 | 0.7594 |

Figure 5: The impact of ensemble size with one-step-ahead forecasting. Panels (a)-(d) show the MAPE, (e)-(h) the RMSE, and (i)-(l) the Dstat values for the WTI, USDEUR, IP, and SSEC datasets as the ensemble size varies from 10% to 100%.

is in the range of 10-40, in terms of MAPE, RMSE, and Dstat. When es is greater than 40, the MAPE and RMSE values continue to worsen and become the worst when the ensemble size grows to 100. The experimental results indicate that the ensemble size has an overall significant impact on ensemble prediction and that an ideal range of ensemble size is about 20 to 40.

6. Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. The MICEEMDAN-WOA-RVFL first uses ICEEMDAN to separate the original economic and financial time series into groups of subseries many times, and then RVFL networks are used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve the prediction accuracy. Thirdly, the predictions of the subseries in each decomposition are integrated into the forecasting results of that decomposition using addition. Finally, the prediction results of the decompositions are selected based on RMSE values and combined as the final prediction results.

As far as we know, this is the first time that WOA is employed for the optimal parameter search for RVFL networks and that the multiple decomposition strategy is introduced in time series forecasting. The empirical results indicate that

(1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy in various economic and financial time series forecasting; (2) WOA can effectively search optimal parameters for RVFL networks and improve the prediction performance of economic and financial time series forecasting; and (3) the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and enhance the prediction accuracy and stability of the developed prediction model.

We will extend our study in two aspects in the future: (1) applying the MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble method of individual forecasting models to further enhance the prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y Hui W-K Wong Z Bai and Z-Z Zhu ldquoA new non-linearity test to circumvent the limitation of Volterra ex-pansion with applicationrdquo Journal of the Korean StatisticalSociety vol 46 no 3 pp 365ndash374 2017

[2] R Adhikari and R K Agrawal ldquoA combination of artificialneural network and random walk models for financial timeseries forecastingrdquo Neural Computing and Applicationsvol 24 no 6 pp 1441ndash1449 2014

[3] A Lanza M Manera and M Giovannini ldquoModeling andforecasting cointegrated relationships among heavy oil andproduct pricesrdquo Energy Economics vol 27 no 6 pp 831ndash8482005

[4] M R Hassan and B Nath ldquoStock market forecasting usinghidden Markov model a new approachrdquo in Proceedings of the5th International Conference on Intelligent Systems Design andApplications (ISDArsquo05) pp 192ndash196 IEEE Warsaw PolandSeptember 2005

[5] L Kilian and M P Taylor ldquoWhy is it so difficult to beat therandom walk forecast of exchange ratesrdquo Journal of Inter-national Economics vol 60 no 1 pp 85ndash107 2003

[6] M Rout B Majhi R Majhi and G Panda ldquoForecasting ofcurrency exchange rates using an adaptive ARMAmodel withdifferential evolution based trainingrdquo Journal of King SaudUniversity-Computer and Information Sciences vol 26 no 1pp 7ndash18 2014

[7] P Mondal L Shit and S Goswami ldquoStudy of effectiveness oftime series modeling (ARIMA) in forecasting stock pricesrdquoInternational Journal of Computer Science Engineering andApplications vol 4 no 2 pp 13ndash29 2014

[8] A A Drakos G P Kouretas and L P Zarangas ldquoForecastingfinancial volatility of the Athens stock exchange daily returnsan application of the asymmetric normal mixture GARCHmodelrdquo International Journal of Finance amp Economics vol 15pp 331ndash350 2010

[9] D Alberg H Shalit and R Yosef ldquoEstimating stock marketvolatility using asymmetric GARCH modelsrdquo Applied Fi-nancial Economics vol 18 no 15 pp 1201ndash1208 2008

[10] R P Pradhan and R Kumar ldquoForecasting exchange rate inIndia an application of artificial neural network modelrdquoJournal of Mathematics Research vol 2 pp 111ndash117 2010

[11] S T A Niaki and S Hoseinzade ldquoForecasting SampP 500 indexusing artificial neural networks and design of experimentsrdquoJournal of Industrial Engineering International vol 9 pp 1ndash92013

[12] S P Das and S Padhy ldquoA novel hybrid model using teaching-learning-based optimization and a support vector machine forcommodity futures index forecastingrdquo International Journalof Machine Learning and Cybernetics vol 9 no 1 pp 97ndash1112018

[13] M K Okasha ldquoUsing support vector machines in financialtime series forecastingrdquo International Journal of Statistics andApplications vol 4 pp 28ndash39 2014

[14] X Li H Xie R Wang et al ldquoEmpirical analysis stock marketprediction via extreme learning machinerdquo Neural Computingand Applications vol 27 no 1 pp 67ndash78 2016

[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22–42, 2018.

[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457–480, 2018.

[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27–38, 2014.

[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284–308, 2014.

[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.

[20] S. Lahmiri, "A variational mode decomposition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268–273, 2016.

[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228–1244, 2013.

[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603–3628, 2019.

[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235–6250, 2014.

[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164–178, 2016.

[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754–762, 2016.

[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1–13, 2007.

[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London, Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903–995, 1998.

[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1–41, 2009.

[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144–4147, IEEE, Prague, Czech Republic, May 2011.

16 Complexity

[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140–165, 2020.

[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882–1905, 2018.

[32] T. Li, M. Zhou, C. Guo et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.

[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19–29, 2014.

[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852–1872, 2020.

[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.

[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205–1225, 2019.

[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1–10, 2020.

[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1–15, 2018.

[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm-Multi-Objective Whale Optimization Algorithm for wind speed forecasting," Applied Energy, vol. 208, pp. 344–360, 2017.

[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250–260, 2019.

[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163–180, 1994.

[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094–1105, 2016.

[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078–1093, 2016.

[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097–1108, 2018.

[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319–340, 2019.

[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5–24, 2020.

[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.

[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983–994, 2020.

[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49–58, 2019.

[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.

[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165–4172, 2019.

[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319–7327, 2020.

[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525–541, 2018.

[55] Fred Website, https://fred.stlouisfed.org.

[56] NetEase Website, http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.

[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31–53, 2006.



The main contributions of this paper are as follows: (1) we propose a new multidecomposition and self-optimizing ensemble prediction model integrating multiple ICEEMDANs, WOA, and RVFL networks for economic and financial time series forecasting; as far as we know, this is the first time that this novel combination is developed for economic and financial time series forecasting. (2) To further enhance forecasting accuracy and stability, we utilize multiple differentiated ICEEMDANs to decompose the original economic and financial time series and finally ensemble the predictions of all decompositions as the final predictions. (3) WOA is introduced for the first time to optimize the various parameters of RVFL networks. (4) The empirical results on four different types of economic and financial time series show that our proposed MICEEMDAN-WOA-RVFL significantly enhances the prediction performance in terms of forecasting accuracy and stability.

The novelty of the proposed MICEEMDAN-WOA-RVFL is three-fold: (1) a novel hybrid model integrating multiple ICEEMDANs, WOA, and RVFL networks is designed for economic and financial time series forecasting; (2) the multiple decomposition strategy is proposed for the first time to overcome the randomness of a single decomposition and to improve prediction accuracy and stability; and (3) WOA is first applied to optimizing RVFL networks to improve the performance of individual forecasting.

The remainder of the paper is organized as follows. Section 2 offers a brief introduction to the ICEEMDAN, WOA, and RVFL network. Section 3 provides the architecture and the detailed implementation of the proposed MICEEMDAN-WOA-RVFL. Section 4 analyzes the empirical results on various economic and financial time series forecasting. Section 5 discusses some details of the developed prediction model, and Section 6 concludes this paper.

2. Preliminaries

2.1. Improved Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (ICEEMDAN). Empirical mode decomposition (EMD), an adaptive time-frequency analysis approach for nonstationary signals, was designed by Huang et al. [27]. EMD separates an original time series into a sum of "intrinsic mode functions" (IMFs) and one residue, and thus it can simplify time series analysis. Due to some drawbacks of EMD, such as mode mixing, EEMD [28] and CEEMDAN [29] have been proposed to improve decomposition performance and have been applied in various fields [30–32]. In spite of that, these decomposition methods still have some new problems. To solve these problems, an improved CEEMDAN (ICEEMDAN) was developed by Colominas et al. [33].

Let Ek(·) be the operator which generates the kth mode using EMD, M(·) be the operator which generates the local mean of a series, ⟨·⟩ denote averaging over the noise realizations, and w(i) be the ith realization of zero-mean, unit-variance white noise. When x is the original signal, the detailed decomposition process of ICEEMDAN is as follows:

(i) Step 1: employ EMD to compute the local means of I realizations x(i) = x + β0E1(w(i)), and average them to obtain the first residue r1 = ⟨M(x(i))⟩, with β0 > 0.

(ii) Step 2: calculate the first mode IMF1 = x − r1.

(iii) Step 3: calculate the average of the local means of the noise-added realizations as the second residue r2 = ⟨M(r1 + β1E2(w(i)))⟩, and the second mode IMF2 = r1 − r2.

(iv) Step 4: for k = 3, ..., K, compute the kth residue rk = ⟨M(rk−1 + βk−1Ek(w(i)))⟩ and the kth mode IMFk = rk−1 − rk.

(v) Step 5: go to Step 4 for the next k.

Since ICEEMDAN can effectively decompose the original time series, it has been frequently introduced into various time series forecasting tasks [34–36]. In our study, we employ ICEEMDAN to separate the original economic and financial time series into a sum of simpler subseries for subsequent forecasting.
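The residue recursion above can be sketched in code. The following is a minimal illustration, not the paper's implementation: the EMD-based operators M(·) and E1(·) are replaced by a crude moving-average stand-in (a faithful implementation would use EMD sifting, e.g., via a dedicated EMD package), but the recursion structure and the exact-reconstruction property IMF1 + ... + IMFK + rK = x carry over:

```python
import numpy as np

def local_mean(s, win=11):
    """Stand-in for the EMD local-mean operator M(.): a centered moving
    average (real ICEEMDAN uses the mean of upper/lower envelopes)."""
    kernel = np.ones(win) / win
    return np.convolve(s, kernel, mode="same")

def iceemdan_sketch(x, n_modes=4, I=50, beta=0.2, seed=0):
    """Illustrative ICEEMDAN-style recursion: r_k = <M(r_{k-1} + beta*E_k(w))>,
    IMF_k = r_{k-1} - r_k.  E_k(w) is approximated here by w - M(w)."""
    rng = np.random.default_rng(seed)
    imfs, r_prev = [], x
    for _ in range(n_modes):
        means = []
        for _ in range(I):
            w = rng.standard_normal(len(x))
            e = w - local_mean(w)                 # crude stand-in for E_k(w)
            means.append(local_mean(r_prev + beta * e))
        r_k = np.mean(means, axis=0)              # average of the local means
        imfs.append(r_prev - r_k)                 # mode = previous residue - residue
        r_prev = r_k
    return np.array(imfs), r_prev                 # modes and final residue

t = np.linspace(0, 1, 512)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * t           # toy "price" series
imfs, residue = iceemdan_sketch(x)
# by construction the decomposition is exact: x == sum(IMFs) + residue
print(np.allclose(imfs.sum(axis=0) + residue, x))
```

Because each mode is defined as the difference of consecutive residues, the sum of all modes plus the final residue telescopes back to the original series, which is the property the "decomposition and ensemble" framework relies on.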

2.2. Whale Optimization Algorithm (WOA). The whale optimization algorithm (WOA) is a type of optimization method which outperforms particle swarm optimization (PSO), ant colony optimization (ACO), the gravitational search algorithm (GSA), and fast evolutionary programming (FEP) in optimization performance [37, 38]. Simulating the hunting process of whales, WOA includes three main operators: encircling prey, bubble-net foraging, and search for prey. In each iteration, individuals update their positions toward the best individual of the last iteration, which can be formulated as follows:

P(t + 1) = Pbest(t) − A · |C · Pbest(t) − P(t)|,  (1)

where t represents the tth iteration, A and C are two coefficient vectors, Pbest is the best individual obtained so far, P is the position of an individual, | · | represents the absolute value, and · indicates an element-by-element multiplication.

In the exploitation (bubble-net foraging) phase, the individual position is updated based on its distance to the best individual by simulating the helix-shaped movement of whales, which is formulated as follows:

P(t + 1) = D · e^(bl) · cos(2πl) + Pbest(t),  (2)

where D = |Pbest(t) − P(t)| represents the distance between the best individual obtained so far and the ith individual, l is a random number in [−1, 1], and b is a constant which is used to define the shape of the logarithmic spiral.

In the exploration (search for prey) phase, the individual position is updated using a randomly selected individual. The mathematical model is as follows:

P(t + 1) = Prand − A · |C · Prand − P(t)|,  (3)

where Prand represents a randomly selected individual. The detailed flowchart of WOA is illustrated in Figure 1.

In the flowchart, p is a random number in [0, 1] which switches between the encircling/searching updates and the spiral update. Due to its very competitive search ability, WOA has been widely applied in various fields [39–41]. Therefore, we take advantage of the effective search ability of WOA to seek the optimal parameters for RVFL networks.
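The three position updates in equations (1)–(3) can be sketched as follows. This is an illustrative minimal WOA, assuming the standard coefficient scheme A = 2a·r − a and C = 2r with a decreasing linearly from 2 to 0 [38]; it is not the exact configuration used in the paper's experiments:

```python
import numpy as np

def woa(fitness, dim, bounds, pop=40, max_iter=100, b=1.0, seed=0):
    """Minimal WOA sketch following Eqs. (1)-(3): encircling prey,
    spiral bubble-net foraging, and random search for prey."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    P = rng.uniform(lo, hi, (pop, dim))            # whale positions
    fit = np.array([fitness(p) for p in P])
    best = P[fit.argmin()].copy()                  # Pbest so far

    for t in range(max_iter):
        a = 2 - 2 * t / max_iter                   # a decreases linearly 2 -> 0
        for i in range(pop):
            A = 2 * a * rng.random(dim) - a
            C = 2 * rng.random(dim)
            p = rng.random()
            if p < 0.5:
                if np.all(np.abs(A) < 1):          # exploitation: Eq. (1)
                    P[i] = best - A * np.abs(C * best - P[i])
                else:                              # exploration: Eq. (3)
                    rand = P[rng.integers(pop)]
                    P[i] = rand - A * np.abs(C * rand - P[i])
            else:                                  # spiral update: Eq. (2)
                l = rng.uniform(-1, 1)
                D = np.abs(best - P[i])
                P[i] = D * np.exp(b * l) * np.cos(2 * np.pi * l) + best
            P[i] = np.clip(P[i], lo, hi)
        fit = np.array([fitness(p) for p in P])
        if fit.min() < fitness(best):
            best = P[fit.argmin()].copy()
    return best, fitness(best)

# usage: minimize the sphere function in 5 dimensions
best, val = woa(lambda x: np.sum(x ** 2), dim=5, bounds=(-10, 10))
print(val)  # close to 0
```

In the hybrid model, `fitness` would instead evaluate the training RMSE of an RVFL network built with the candidate parameter vector.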


2.3. Random Vector Functional Link (RVFL) Neural Network. As a kind of modification of the multilayer perceptron (MLP) model, the random vector functional link (RVFL) neural network was proposed by Pao et al. [42]. The RVFL neural network is able to overcome the slow convergence, overfitting, and local minima inherent in traditional gradient-based learning algorithms. Like the MLP, the architecture of the RVFL neural network includes three layers, which is illustrated in Figure 2. The main modification in the RVFL network lies in the connections of the network structure. Since the RVFL network has direct connections from the input layer to the output layer, it can perform better compared to networks with no direct links [43, 44].

The neurons in the hidden layer, known as enhancement nodes, calculate the weighted sum of the outputs of the input layer neurons and obtain their output with an activation function:

h_m = g( Σ_{n=1}^{N} w_nm · i_n + b_m ),  (4)

where w_nm denotes the weight between i_n and h_m, b_m represents the bias of the mth neuron in the hidden layer, and g(·) represents an activation function.

The output layer neurons integrate all the outputs from the hidden layer and input layer neurons, and the final output is

o_l = Σ_{m=1}^{M} w_ml · h_m + Σ_{n=1}^{N} w_nl · i_n,  (5)

where w_ml represents the weight between h_m and o_l, and w_nl indicates the weight between i_n and o_l.

To enhance the training efficiency, the RVFL neural network utilizes a given distribution to fix the values of w_nm and b_m and obtains the weights w_ml and w_nl by minimizing the system error

E = (1/2P) Σ_{j=1}^{P} ( t^(j) − B · d^(j) )²,  (6)

where P indicates the number of training samples, t are the target values, B is the combination of w_ml and w_nl, and d represents the combined vector of hidden and input outputs.

The RVFL neural network has demonstrated an extremely efficient and fast forecasting ability and has been frequently used in time series forecasting [44, 45].
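Equations (4)–(6) can be condensed into a short sketch. The class below is a minimal illustration, assuming a sigmoid activation and a ridge-regularized least-squares solution for the output weights; the paper's RVFL additionally searches the activation function, lag, and other settings via WOA:

```python
import numpy as np

class RVFL:
    """Minimal RVFL sketch per Eqs. (4)-(5): random fixed hidden weights,
    direct input-output links, output weights by regularized least squares."""
    def __init__(self, n_hidden=30, lam=1e-6, seed=0):
        self.n_hidden, self.lam, self.seed = n_hidden, lam, seed

    def fit(self, X, y):
        rng = np.random.default_rng(self.seed)
        self.W = rng.normal(size=(X.shape[1], self.n_hidden))  # w_nm, fixed
        self.b = rng.normal(size=self.n_hidden)                # b_m, fixed
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))       # Eq. (4), sigmoid
        D = np.hstack([H, X])                                  # d: hidden + direct links
        # regularized least squares for the output weights B (minimizes Eq. (6))
        self.B = np.linalg.solve(D.T @ D + self.lam * np.eye(D.shape[1]), D.T @ y)
        return self

    def predict(self, X):
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))
        return np.hstack([H, X]) @ self.B                      # Eq. (5)

# usage: one-step-ahead toy forecast with 5 lagged inputs
t = np.arange(300)
s = np.sin(0.1 * t) + 0.01 * t
lag = 5
X = np.stack([s[i:i + lag] for i in range(len(s) - lag)])
y = s[lag:]
model = RVFL().fit(X[:250], y[:250])
rmse = np.sqrt(np.mean((model.predict(X[250:]) - y[250:]) ** 2))
print(rmse)  # small on this smooth series
```

Because only the output weights are learned, and in closed form, training is a single linear solve, which is what makes the repeated WOA fitness evaluations in the hybrid model affordable.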

3. MICEEMDAN-WOA-RVFL: The Proposed Approach for Economic and Financial Time Series Forecasting

Referring to the framework of "decomposition and ensemble," we design a multidecomposition and self-optimizing hybrid model that integrates multiple ICEEMDANs, WOA, and RVFL networks, termed MICEEMDAN-WOA-RVFL, to forecast economic and financial time series. The architecture of the proposed hybrid model is illustrated in Figure 3.

Our proposed MICEEMDAN-WOA-RVFL takes advantage of the idea of "divide and conquer" that is frequently used in time series forecasting, image processing, fault diagnosis, and so on [46–53]. One procedure of "decomposition and ensemble" in the MICEEMDAN-WOA-RVFL is as follows:

(i) Stage 1, decomposition: ICEEMDAN is employed to separate the original time series into several subseries (i.e., several IMFs and one residue).

Figure 1: The flowchart of the whale optimization algorithm (WOA).


(ii) Stage 2, individual forecasting: each decomposed subseries is divided into a training dataset and a testing dataset; then an RVFL network with WOA optimization is developed on each training dataset independently, and finally the developed RVFL model is applied to each testing dataset. The reason why we select the RVFL network as the predictor is its powerful forecasting ability in extant research [15, 34, 44]. Since the parameter settings of the RVFL network play an important role in the prediction performance, we introduce WOA to seek the optimal parameter values for the RVFL network in the forecasting stage.

(iii) Stage 3, ensemble: the predictions of all the decomposed subseries are aggregated into the final prediction results of one "decomposition and ensemble" procedure using addition.

In this study, to enhance both the accuracy and stability of the final prediction, we generate random values for the decomposition parameters of ICEEMDAN in the decomposition stage, including the number of realizations (Nr), the noise standard deviation (Nsd), and the maximum number of sifting iterations (Maxsi), for each decomposition; we repeat the procedure of "decomposition and ensemble" M times and finally combine all the results of the multiple "decompositions and ensembles" using the RMSE-weighted method as the final prediction results. The corresponding weight of the ith forecasting model is as follows:

Weight_i = (1/RMSE_i) / ( Σ_{j=1}^{M} (1/RMSE_j) ),  (7)

where M denotes the number of individual models and RMSE_i indicates the RMSE value of the ith forecasting model in the training process.
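Equation (7) amounts to an inverse-RMSE weighted average of the M runs, which can be sketched as follows (the run RMSEs and predictions below are hypothetical, purely for illustration):

```python
import numpy as np

def rmse_weights(rmses):
    """Eq. (7): weight each of the M decomposition-ensemble runs by its
    inverse training RMSE, normalized so that the weights sum to one."""
    inv = 1.0 / np.asarray(rmses, dtype=float)
    return inv / inv.sum()

def combine(predictions, rmses):
    """Final forecast: RMSE-weighted average of the M runs' predictions."""
    w = rmse_weights(rmses)
    return np.average(np.asarray(predictions, dtype=float), axis=0, weights=w)

# usage: three hypothetical runs; the run with the lowest RMSE dominates
preds = [[10.0, 11.0], [12.0, 13.0], [11.0, 12.0]]
run_rmses = [0.5, 2.0, 1.0]
print(rmse_weights(run_rmses))   # weights sum to 1, first run gets the most
print(combine(preds, run_rmses))
```

This weighting lets runs whose random ICEEMDAN parameters happened to produce an easier decomposition contribute more to the final forecast.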

Figure 3: The flowchart of the proposed MICEEMDAN-WOA-RVFL.

Figure 2: The architecture of the random vector functional link (RVFL) neural network.


Although some recent studies also employ the RVFL network for time series forecasting, they obviously differ from the current study in the decomposition technique and network optimization: (1) they divide the original time series using WD or EMD; (2) they construct RVFL networks using fixed parameter values. Unlike the previous studies, our study decomposes economic and financial time series using ICEEMDAN and searches for the optimal parameter values of RVFL networks based on WOA. Furthermore, the previous research mainly focuses on dividing the original time series by one single decomposition or dual decomposition [20, 35, 54]. In dual decomposition, the original signal is first decomposed into several components, and then the high-frequency components continue to be decomposed into other components using the same or a different decomposition method. Essentially, the dual decomposition process still belongs to one decomposition, just including two different decomposition stages. Unlike the previous research, one main improvement in this study is the multiple decomposition strategy, which can successfully overcome the randomness of one single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach. To our knowledge, it is the first time that the multiple decomposition strategy is developed for the forecasting of economic and financial time series.

4. Experimental Results

4.1. Data Description. As we know, economic and financial time series are influenced by various factors, sometimes rising and dropping sharply within a short time. The dramatic fluctuations usually lead to significant nonlinearity and nonstationarity of the time series. To comprehensively evaluate the effectiveness of the proposed MICEEMDAN-WOA-RVFL, we choose four different time series, including the West Texas Intermediate crude oil spot price (WTI), the US dollar/Euro foreign exchange rate (USDEUR), US industrial production (IP), and the Shanghai stock exchange composite index (SSEC), as the experimental datasets. The first three datasets can be accessed via the website of St. Louis Fed Research [55], and the last one can be obtained via the website of NetEase [56].

Each time series is separated into two subdatasets: the first 80% for training and the last 20% for testing. Table 1 shows the divided samples of the abovementioned four economic and financial time series.

We utilize ICEEMDAN to decompose these time series into groups of relatively simple subseries. Figure 4 offers an example of the decomposition of the WTI dataset using ICEEMDAN.

4.2. Evaluation Indices. In this study, we use four evaluation metrics, including the mean absolute percent error (MAPE), the root mean squared error (RMSE), the directional statistic (Dstat), and the Diebold–Mariano (DM) test, to assess the performance of the proposed model. Among them, MAPE and RMSE are used to assess the forecasting error, defined as follows:

MAPE = (100/N) Σ_{t=1}^{N} |(O_t − P_t)/O_t|,  (8)

RMSE = sqrt( (1/N) Σ_{t=1}^{N} (O_t − P_t)² ),  (9)

where N is the size of the evaluated samples, O_t denotes the actual values, and P_t represents the predicted values at time t. The lower the values of RMSE and MAPE, the better the prediction model.

The Dstat indicates the performance of direction prediction, which is formulated as follows:

Dstat = (1/N) Σ_{i=1}^{N} d_i × 100,  (10)

where d_i = 1 if (P_{t+1} − O_t)(O_{t+1} − O_t) ≥ 0, and otherwise d_i = 0. A higher value of Dstat indicates a more accurate direction prediction.
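The three error metrics in equations (8)–(10) can be written compactly as follows (a minimal sketch; the toy series below is illustrative only):

```python
import numpy as np

def mape(o, p):
    """Eq. (8): mean absolute percent error."""
    o, p = np.asarray(o, float), np.asarray(p, float)
    return np.mean(np.abs((o - p) / o)) * 100

def rmse(o, p):
    """Eq. (9): root mean squared error."""
    o, p = np.asarray(o, float), np.asarray(p, float)
    return np.sqrt(np.mean((o - p) ** 2))

def dstat(o, p):
    """Eq. (10): directional statistic, the percentage of steps where the
    predicted move (P_{t+1} - O_t) has the same sign as the actual move."""
    o, p = np.asarray(o, float), np.asarray(p, float)
    d = (p[1:] - o[:-1]) * (o[1:] - o[:-1]) >= 0
    return d.mean() * 100

o = np.array([100.0, 102.0, 101.0, 103.0])   # actual values
p = np.array([100.5, 101.5, 101.5, 102.5])   # predicted values
print(mape(o, p), rmse(o, p), dstat(o, p))
```

Note that MAPE and RMSE measure the magnitude of errors, while Dstat only measures whether the direction of each move was predicted correctly, which is why the paper reports all three.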

Furthermore, to test the significance of the differences in prediction performance between pairs of models, we employ the Diebold–Mariano (DM) test in this study.

4.3. Experimental Settings. In this study, we compare the proposed MICEEMDAN-WOA-RVFL with several state-of-the-art forecasting models, including single models and ensemble models. Among these models, the single models include one popular statistical model, RW, and two popular AI models, BPNN and least squares SVR (LSSVR). The ensemble models derive from the combination of the single models and the decomposition method ICEEMDAN.

The detailed parameters of all prediction models, the decomposition approach ICEEMDAN, and the optimization method WOA used in the experiments are shown in Table 2. The parameter values of BPNN, LSSVR, RVFL, and ICEEMDAN refer to the previous literature [22, 34, 45].

All experiments were conducted using Matlab R2019b on a PC with 64-bit Microsoft Windows 10, 8 GB RAM, and a 1.8 GHz i7-8565U CPU.

4.4. Results and Analysis. We compare the forecasting performance of six prediction models, including three single models (RW, LSSVR, and BPNN) and three ensemble models (ICEEMDAN-RW, ICEEMDAN-LSSVR, and ICEEMDAN-BPNN), with that of our proposed MICEEMDAN-WOA-RVFL in terms of MAPE, RMSE, and Dstat. Due to the different horizons, we train different forecasting models separately; that is to say, we use the proposed scheme to train a different model for each horizon. Tables 3–5 report the experimental results in terms of each evaluation index with the 1-, 3-, and 6-horizon, respectively.

From Table 3, we can see that the proposed MICEEMDAN-WOA-RVFL obtains the lowest (the best) MAPE values for all horizons on all datasets. Among all the compared single models, RW obtains the best MAPE values for all horizons on all datasets,


Figure 4: The WTI crude oil price series and the corresponding decomposed subseries by ICEEMDAN.

Table 1: Samples of economic and financial time series.

Dataset  Type     Subset        Size  Date
WTI      Daily    Sample set    8641  2 January 1986 to 15 April 2020
                  Training set  6912  2 January 1986 to 24 May 2013
                  Testing set   1729  28 May 2013 to 15 April 2020
USDEUR   Daily    Sample set    5341  4 January 1999 to 10 April 2020
                  Training set  4272  4 January 1999 to 30 December 2015
                  Testing set   1069  31 December 2015 to 10 April 2020
IP       Monthly  Sample set    1215  January 1919 to March 2020
                  Training set  972   January 1919 to December 1999
                  Testing set   243   January 2000 to March 2020
SSEC     Daily    Sample set    7172  19 December 1990 to 21 April 2020
                  Training set  5737  19 December 1990 to 4 June 2014
                  Testing set   1435  5 June 2014 to 21 April 2020


demonstrating that it is better than LSSVR and BPNN for the forecasting of economic and financial time series. Of all the ensemble models, the ICEEMDAN-LSSVR model and the ICEEMDAN-BPNN model obtain close MAPE values, obviously better than ICEEMDAN-RW.

The RMSE values of the four time series datasets are listed in Table 4. From this table, we can find that the proposed MICEEMDAN-WOA-RVFL outperforms all the single and ensemble models for all horizons on all datasets. The statistical model RW obtains the best RMSE values in 10 out of 12 cases among all the single models, demonstrating that it is more suitable for economic and financial time series forecasting than LSSVR and BPNN. As to the ensemble models, the proposed MICEEMDAN-WOA-RVFL obtains lower RMSE values than the compared ensemble models, demonstrating that the former is more effective for economic and financial time series forecasting.

Table 5 shows the directional statistic Dstat, and we can see that MICEEMDAN-WOA-RVFL achieves the highest Dstat values in all 12 cases, indicating that it has better direction forecasting performance. Amongst the single prediction models, LSSVR and RW each obtain the best Dstat values in 5 cases, better than BPNN. Similarly, the ICEEMDAN-LSSVR model and the ICEEMDAN-BPNN model obtain close Dstat values, obviously better than the ICEEMDAN-RW model in all 12 cases.

From all the prediction results, we can find that all the ensemble prediction models except ICEEMDAN-RW greatly outperform the corresponding single prediction models in all 12 cases, showing that the framework of decomposition and ensemble is an effective tool for improving the forecasting

Table 2: The settings for the parameters.

Method              Parameters                                               Description
ICEEMDAN            Nsd = 0.2                                                Noise standard deviation
                    Nr = 100                                                 Number of realizations
                    Maxsi = 5000                                             Maximum number of sifting iterations
LSSVR               Rp in {2^-10, 2^-9, ..., 2^11, 2^12}                     Regularization parameter
                    WidRBF in {2^-10, 2^-9, ..., 2^11, 2^12}                 Width of the RBF kernel
BPNN                Nhe = 10                                                 Number of hidden neurons
                    Maxte = 1000                                             Maximum training epochs
                    Lr = 0.0001                                              Learning rate
WOA                 Pop = 40                                                 Population size
                    Maxgen = 100                                             Maximum generation
MICEEMDAN-WOA-RVFL  Nsd in [0.01, 0.4]                                       Noise standard deviation in ICEEMDAN
                    Nr in [50, 500]                                          Number of realizations in ICEEMDAN
                    Maxsi in [2000, 8000]                                    Maximum number of sifting iterations in ICEEMDAN
                    Nhe in [5, 30]                                           Number of hidden neurons in RVFL
                    Func in {sigmoid, sine, hardlim, tribas, radbas, sign}   Activation function in RVFL
                    Mod in {1: regularized least squares,                    Mode in RVFL
                            2: Moore–Penrose pseudoinverse}
                    Lag in [3, 20]                                           Lag in RVFL
                    Bias in {true, false}                                    Bias in RVFL
                    Rand in {1: Gaussian, 2: Uniform}                        Random type in RVFL
                    Scale in [0.1, 1]                                        Scale value in RVFL
                    ScaleMode in {1, 2, 3}                                   Scale mode in RVFL (1: scale the features for all neurons; 2: scale the features for each hidden neuron; 3: scale the range of the randomization for uniform distribution)

Table 3: The mean absolute percent error (MAPE) values of different prediction models.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  RW      LSSVR   BPNN    ICEEMDAN-RW  ICEEMDAN-LSSVR  ICEEMDAN-BPNN
WTI      1        0.0036              0.0176  0.0177  0.0182  0.0200       0.0044          0.0055
         3        0.0080              0.0307  0.0309  0.0320  0.0331       0.0092          0.0094
         6        0.0113              0.0441  0.0449  0.0457  0.0458       0.1202          0.0130
USDEUR   1        0.0006              0.0034  0.0034  0.0034  0.0044       0.0010          0.0010
         3        0.0015              0.0063  0.0063  0.0063  0.0071       0.0020          0.0021
         6        0.0022              0.0087  0.0088  0.0087  0.0094       0.0032          0.0031
IP       1        0.0012              0.0049  0.0057  0.0056  0.0053       0.0024          0.0022
         3        0.0023              0.0102  0.0140  0.0123  0.0104       0.0033          0.0033
         6        0.0032              0.0183  0.0279  0.0235  0.0184       0.0043          0.0049
SSEC     1        0.0020              0.0096  0.0096  0.0097  0.0111       0.0026          0.0026
         3        0.0044              0.0178  0.0178  0.0179  0.0185       0.0047          0.0051
         6        0.0065              0.0259  0.0259  0.0269  0.0268       0.0072          0.0074


performance. Of all the compared models, the proposed MICEEMDAN-WOA-RVFL obtains the highest Dstat values and the lowest MAPE and RMSE values on all the time series datasets, showing that it is completely superior to the benchmark prediction models. Furthermore, for each prediction model, the MAPE and RMSE values increase while the Dstat values decrease as the horizon grows. This demonstrates that it is easier to forecast time series with a short horizon than with a long one. It is worth noting that the proposed MICEEMDAN-WOA-RVFL still achieves relatively good MAPE and RMSE values as the horizon increases among the compared models. For example, when the proposed model obtains a Dstat of 0.7737 with horizon 6 on the WTI dataset, it achieves a relatively low MAPE (0.0113) and RMSE (0.8146), indicating that the predicted values are very close to the real values although the proposed model misses the direction in about 22.63% of cases. In other words, the proposed MICEEMDAN-WOA-RVFL can still achieve satisfactory forecasting accuracy with a long horizon.

Furthermore, we can find that the multiple decomposition strategy does not improve the forecast for the RW model. One possible explanation is simply that RW assumes that the past movement or trend of a time series cannot be used to predict its future movement, and thus it cannot take advantage of the historical data and the diversity of multiple decompositions to make future predictions. Therefore, when we aggregate the multiple RW model predictions of all the decomposed subseries, we just integrate several random predictions and thus cannot significantly improve the ensemble prediction results. In contrast, LSSVR and BPNN, as well as RVFL, can fully use all the historical data and the diversity of multiple decompositions to make future predictions. Specifically, the multiple decompositions using different parameters generate many groups of different decomposed subseries, and this diversity of decomposition can successfully overcome the randomness of one single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach.

In addition, the Diebold–Mariano (DM) test is utilized to evaluate whether the forecasting accuracy of the proposed MICEEMDAN-WOA-RVFL significantly outperforms those of the other compared models. Table 6 shows the statistics and p values (in brackets).

On one hand, the DM statistical values between the ensemble prediction models and their corresponding single predictors are much lower than zero, and the corresponding p values are almost equal to zero for all horizons except for the RW model, showing that the architecture of "decomposition and ensemble" contributes to greatly improving prediction accuracy and that the combination of ICEEMDAN and AI predictors is more effective for economic and financial time series forecasting.

On the other hand, the DM test results on the prediction of all four time series datasets indicate that MICEEMDAN-WOA-RVFL is significantly better than the single models and the other ensemble models for all horizons, and the corresponding p values are much lower than 0.01 in all cases.

In summary, the DM test results demonstrate that the combination of multiple ICEEMDANs, RVFL networks, and

Table 4: The root mean squared error (RMSE) values of different prediction models.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  RW        LSSVR     BPNN      ICEEMDAN-RW  ICEEMDAN-LSSVR  ICEEMDAN-BPNN
WTI      1        0.2715              1.3196    1.3228    1.3744    1.8134       0.3467          0.4078
         3        0.5953              2.1784    2.1929    2.2867    2.5041       0.6754          0.7620
         6        0.8146              3.0737    3.1271    3.1845    3.2348       0.8692          0.9462
USDEUR   1        0.0009              0.0051    0.0052    0.0052    0.0099       0.0015          0.0016
         3        0.0022              0.0091    0.0092    0.0092    0.0129       0.0031          0.0031
         6        0.0033              0.0127    0.0128    0.0127    0.0154       0.0047          0.0046
IP       1        0.2114              0.7528    0.8441    0.8230    0.8205       0.3861          0.4175
         3        0.3875              1.4059    1.8553    1.8311    1.4401       0.5533          0.5294
         6        0.5340              2.4459    3.6015    2.8116    2.4569       0.6408          0.8264
SSEC     1        10.8115             50.0742   49.8800   50.8431   63.3951      12.6460         13.7928
         3        22.2983             89.1834   88.5290   89.1781   93.2887      25.2526         29.7784
         6        35.7201             131.6386  131.3487  133.5647  137.2598     40.4357         49.6341

Table 5: The directional statistic (Dstat) values of different prediction models.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  RW      LSSVR   BPNN    ICEEMDAN-RW  ICEEMDAN-LSSVR  ICEEMDAN-BPNN
WTI      1        0.9381              0.4815  0.5243  0.5191  0.4977       0.9190          0.9097
         3        0.8576              0.4873  0.5087  0.5012  0.4907       0.8300          0.8449
         6        0.7737              0.5116  0.4902  0.4948  0.5185       0.7714          0.7575
USDEUR   1        0.9410              0.5019  0.4719  0.4897  0.5056       0.8998          0.9026
         3        0.8502              0.4888  0.4897  0.4925  0.4916       0.8118          0.8024
         6        0.7828              0.4991  0.5318  0.5253  0.5047       0.7023          0.7154
IP       1        0.9256              0.5579  0.5909  0.5785  0.5620       0.8636          0.8760
         3        0.8802              0.6983  0.5455  0.4876  0.6529       0.8430          0.8058
         6        0.8141              0.6198  0.5537  0.5661  0.6405       0.7355          0.7645
SSEC     1        0.9156              0.4944  0.5049  0.5098  0.4979       0.9024          0.9002
         3        0.8396              0.5042  0.5021  0.5007  0.4965       0.8222          0.8145
         6        0.7587              0.4833  0.5063  0.4965  0.4805       0.7448          0.7455

Complexity 9

Table 6. Results of the Diebold–Mariano (DM) test (p values in brackets).

Dataset | Horizon | Tested model | ICEEMDAN-LSSVR | ICEEMDAN-BPNN | ICEEMDAN-RW | LSSVR | BPNN | RW
WTI | 1 | MICEEMDAN-WOA-RVFL | -4.9081 (≤0.0001) | -7.0030 (≤0.0001) | -6.1452 (≤0.0001) | -10.7240 (≤0.0001) | -10.3980 (≤0.0001) | -10.6210 (≤0.0001)
WTI | 1 | ICEEMDAN-LSSVR | | -3.5925 (0.0003) | -6.0620 (≤0.0001) | -10.5640 (≤0.0001) | -10.3680 (≤0.0001) | -10.4660 (≤0.0001)
WTI | 1 | ICEEMDAN-BPNN | | | -5.9718 (≤0.0001) | -10.2070 (≤0.0001) | -9.9529 (≤0.0001) | -10.0960 (≤0.0001)
WTI | 1 | ICEEMDAN-RW | | | | 3.0776 (0.0021) | 2.7682 (0.0057) | 3.0951 (0.0020)
WTI | 1 | LSSVR | | | | | -1.8480 (0.0648) | 1.0058 (0.3147)
WTI | 1 | BPNN | | | | | | 1.9928 (0.0464)
WTI | 3 | MICEEMDAN-WOA-RVFL | -4.9136 (≤0.0001) | -3.6146 (0.0003) | -10.8660 (≤0.0001) | -16.5990 (≤0.0001) | -15.6260 (≤0.0001) | -16.6720 (≤0.0001)
WTI | 3 | ICEEMDAN-LSSVR | | -2.1143 (0.0346) | -10.7560 (≤0.0001) | -16.7020 (≤0.0001) | -15.6780 (≤0.0001) | -16.7760 (≤0.0001)
WTI | 3 | ICEEMDAN-BPNN | | | -10.5250 (≤0.0001) | -16.2160 (≤0.0001) | -15.3500 (≤0.0001) | -16.2430 (≤0.0001)
WTI | 3 | ICEEMDAN-RW | | | | 3.0818 (0.0021) | 2.0869 (0.0370) | 3.2155 (0.0012)
WTI | 3 | LSSVR | | | | | -2.7750 (0.0056) | 2.3836 (0.0173)
WTI | 3 | BPNN | | | | | | 3.0743 (0.0021)
WTI | 6 | MICEEMDAN-WOA-RVFL | -3.9302 (≤0.0001) | -5.1332 (≤0.0001) | -18.9130 (≤0.0001) | -19.0620 (≤0.0001) | -21.9930 (≤0.0001) | -19.5080 (≤0.0001)
WTI | 6 | ICEEMDAN-LSSVR | | -3.4653 (0.0005) | -18.9170 (≤0.0001) | -19.1680 (≤0.0001) | -22.0120 (≤0.0001) | -19.5420 (≤0.0001)
WTI | 6 | ICEEMDAN-BPNN | | | -18.8380 (≤0.0001) | -19.1650 (≤0.0001) | -21.9930 (≤0.0001) | -19.5200 (≤0.0001)
WTI | 6 | ICEEMDAN-RW | | | | 2.6360 (0.0085) | 0.5129 (0.6081) | 4.1421 (≤0.0001)
WTI | 6 | LSSVR | | | | | -2.2124 (0.0271) | 3.9198 (≤0.0001)
WTI | 6 | BPNN | | | | | | 4.3545 (≤0.0001)
USDEUR | 1 | MICEEMDAN-WOA-RVFL | -8.8860 (≤0.0001) | -8.6031 (≤0.0001) | -4.4893 (≤0.0001) | -15.4300 (≤0.0001) | -15.4100 (≤0.0001) | -15.3440 (≤0.0001)
USDEUR | 1 | ICEEMDAN-LSSVR | | -0.8819 (0.3780) | -4.4230 (≤0.0001) | -14.7210 (≤0.0001) | -14.7320 (≤0.0001) | -14.6280 (≤0.0001)
USDEUR | 1 | ICEEMDAN-BPNN | | | -4.4158 (≤0.0001) | -14.6310 (≤0.0001) | -14.6440 (≤0.0001) | -14.5310 (≤0.0001)
USDEUR | 1 | ICEEMDAN-RW | | | | 3.3089 (0.0010) | 3.2912 (0.0010) | 3.3244 (0.0009)
USDEUR | 1 | LSSVR | | | | | -1.8732 (0.06132) | 2.9472 (0.0033)
USDEUR | 1 | BPNN | | | | | | 3.4378 (0.0006)
USDEUR | 3 | MICEEMDAN-WOA-RVFL | -10.1370 (≤0.0001) | -7.9933 (≤0.0001) | -5.7359 (≤0.0001) | -18.3380 (≤0.0001) | -18.6460 (≤0.0001) | -18.2090 (≤0.0001)
USDEUR | 3 | ICEEMDAN-LSSVR | | -1.4797 (0.1392) | -5.5972 (≤0.0001) | -17.9030 (≤0.0001) | -18.1920 (≤0.0001) | -17.7730 (≤0.0001)
USDEUR | 3 | ICEEMDAN-BPNN | | | -5.6039 (≤0.0001) | -17.8370 (≤0.0001) | -18.1350 (≤0.0001) | -17.6910 (≤0.0001)
USDEUR | 3 | ICEEMDAN-RW | | | | 3.0048 (0.0027) | 2.9552 (0.0032) | 3.0206 (0.0026)
USDEUR | 3 | LSSVR | | | | | -1.4077 (0.1595) | 1.5242 (0.1278)
USDEUR | 3 | BPNN | | | | | | 2.1240 (0.0339)
USDEUR | 6 | MICEEMDAN-WOA-RVFL | -11.1070 (≤0.0001) | -10.0570 (≤0.0001) | -7.4594 (≤0.0001) | -18.3010 (≤0.0001) | -18.4640 (≤0.0001) | -18.4940 (≤0.0001)
USDEUR | 6 | ICEEMDAN-LSSVR | | 1.9379 (0.0529) | -7.0858 (≤0.0001) | -17.4490 (≤0.0001) | -17.5970 (≤0.0001) | -17.6370 (≤0.0001)
USDEUR | 6 | ICEEMDAN-BPNN | | | -7.1244 (≤0.0001) | -17.5850 (≤0.0001) | -17.7340 (≤0.0001) | -17.7690 (≤0.0001)
USDEUR | 6 | ICEEMDAN-RW | | | | 2.5399 (0.0112) | 2.6203 (0.0089) | 2.6207 (0.0089)
USDEUR | 6 | LSSVR | | | | | 2.2946 (0.0220) | 2.5913 (0.0097)
USDEUR | 6 | BPNN | | | | | | -0.2090 (0.8345)
IP | 1 | MICEEMDAN-WOA-RVFL | -3.4304 (0.0007) | -2.4253 (0.0098) | -3.4369 (0.0007) | -3.5180 (0.0005) | -4.1486 (≤0.0001) | -3.3215 (0.0015)
IP | 1 | ICEEMDAN-LSSVR | | -0.6714 (0.5026) | -3.2572 (0.0013) | -3.3188 (0.0010) | -4.0082 (≤0.0001) | -2.9573 (0.0034)
IP | 1 | ICEEMDAN-BPNN | | | -3.6265 (0.0004) | -3.7455 (0.0002) | -4.6662 (≤0.0001) | -3.3790 (0.0008)
IP | 1 | ICEEMDAN-RW | | | | -0.7438 (0.4577) | -0.0502 (0.9600) | 2.6089 (0.0096)
IP | 1 | LSSVR | | | | | 0.4094 (0.6826) | 3.4803 (0.0006)
IP | 1 | BPNN | | | | | | 1.6021 (0.1104)
IP | 3 | MICEEMDAN-WOA-RVFL | -4.5687 (≤0.0001) | -3.7062 (0.0003) | -5.9993 (≤0.0001) | -6.8919 (≤0.0001) | -5.5008 (≤0.0001) | -5.8037 (≤0.0001)
IP | 3 | ICEEMDAN-LSSVR | | 1.1969 (0.2325) | -5.5286 (≤0.0001) | -6.6325 (≤0.0001) | -5.2260 (≤0.0001) | -5.3260 (≤0.0001)
IP | 3 | ICEEMDAN-BPNN | | | -5.7199 (≤0.0001) | -6.7722 (≤0.0001) | -5.2955 (≤0.0001) | -5.5099 (≤0.0001)
IP | 3 | ICEEMDAN-RW | | | | -5.6536 (≤0.0001) | -3.6721 (0.0003) | 1.7609 (0.0795)
IP | 3 | LSSVR | | | | | 0.2479 (0.8044) | 6.3126 (≤0.0001)
IP | 3 | BPNN | | | | | | 3.9699 (≤0.0001)
IP | 6 | MICEEMDAN-WOA-RVFL | -5.9152 (≤0.0001) | -3.5717 (0.0004) | -5.9430 (≤0.0001) | -7.5672 (≤0.0001) | -10.8900 (≤0.0001) | -5.8450 (≤0.0001)
IP | 6 | ICEEMDAN-LSSVR | | -2.5163 (0.0125) | -5.8465 (≤0.0001) | -7.5185 (≤0.0001) | -10.7140 (≤0.0001) | -5.7483 (≤0.0001)
IP | 6 | ICEEMDAN-BPNN | | | -5.6135 (≤0.0001) | -7.4370 (≤0.0001) | -10.3470 (≤0.0001) | -5.5254 (≤0.0001)
IP | 6 | ICEEMDAN-RW | | | | -7.6719 (≤0.0001) | -2.2903 (0.02287) | 0.8079 (0.4200)
IP | 6 | LSSVR | | | | | 3.8137 (0.0002) | 7.7861 (≤0.0001)
IP | 6 | BPNN | | | | | | 2.3351 (0.0204)
SSEC | 1 | MICEEMDAN-WOA-RVFL | -4.0255 (≤0.0001) | -3.7480 (0.0004) | -7.9407 (≤0.0001) | -10.8090 (≤0.0001) | -10.3070 (≤0.0001) | -10.7330 (≤0.0001)
SSEC | 1 | ICEEMDAN-LSSVR | | -1.5083 (0.1317) | -7.8406 (≤0.0001) | -10.5410 (≤0.0001) | -10.0710 (≤0.0001) | -10.5020 (≤0.0001)
SSEC | 1 | ICEEMDAN-BPNN | | | -7.7986 (≤0.0001) | -10.5470 (≤0.0001) | -10.0580 (≤0.0001) | -10.5030 (≤0.0001)
SSEC | 1 | ICEEMDAN-RW | | | | 3.5571 (0.0004) | 3.2632 (0.0011) | 3.5150 (0.0005)
SSEC | 1 | LSSVR | | | | | -1.0197 (0.3081) | -0.6740 (0.5004)
SSEC | 1 | BPNN | | | | | | 0.8078 (0.4193)
SSEC | 3 | MICEEMDAN-WOA-RVFL | -3.8915 (0.0001) | -3.7312 (0.0002) | -10.5890 (≤0.0001) | -10.5480 (≤0.0001) | -10.7470 (≤0.0001) | -10.1260 (≤0.0001)
SSEC | 3 | ICEEMDAN-LSSVR | | -2.4145 (0.0159) | -10.5830 (≤0.0001) | -10.5240 (≤0.0001) | -10.7760 (≤0.0001) | -10.1110 (≤0.0001)
SSEC | 3 | ICEEMDAN-BPNN | | | -10.2420 (≤0.0001) | -10.1860 (≤0.0001) | -10.4400 (≤0.0001) | -9.7595 (≤0.0001)
SSEC | 3 | ICEEMDAN-RW | | | | 3.4031 (0.0007) | 2.3198 (0.0205) | 3.1912 (0.0014)
SSEC | 3 | LSSVR | | | | | -0.5931 (0.5532) | -1.2284 (0.2194)
SSEC | 3 | BPNN | | | | | | -0.0042 (0.9966)
SSEC | 6 | MICEEMDAN-WOA-RVFL | -2.9349 (0.0034) | -3.7544 (0.0002) | -10.2390 (≤0.0001) | -10.1970 (≤0.0001) | -11.0150 (≤0.0001) | -10.1630 (≤0.0001)
SSEC | 6 | ICEEMDAN-LSSVR | | -2.5830 (0.0099) | -10.1390 (≤0.0001) | -10.0940 (≤0.0001) | -10.9630 (≤0.0001) | -10.0690 (≤0.0001)
SSEC | 6 | ICEEMDAN-BPNN | | | -9.6418 (≤0.0001) | -9.5562 (≤0.0001) | -10.4090 (≤0.0001) | -9.5122 (≤0.0001)
SSEC | 6 | ICEEMDAN-RW | | | | 2.3167 (0.0207) | 1.3656 (0.1723) | 2.2227 (0.02639)
SSEC | 6 | LSSVR | | | | | -1.9027 (0.05728) | -0.5814 (0.5611)
SSEC | 6 | BPNN | | | | | | 1.5707 (0.1165)

WOA optimization can significantly enhance the prediction accuracy of economic and financial time series forecasting.

5. Discussion

To better investigate the proposed MICEEMDAN-WOA-RVFL, we further discuss the developed prediction model in this section, including the comparison of single decomposition and multiple decompositions, the optimization effectiveness of WOA, and the impact of ensemble size.

5.1. Comparison of Single Decomposition and Multiple Decompositions. One of the main novelties of this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and improve the prediction accuracy and stability of the developed forecasting model. To evaluate the effectiveness of the multiple decomposition strategy, we compare the prediction results of MICEEMDAN-WOA-RVFL and ICEEMDAN-WOA-RVFL. The former ensembles the prediction results of M (M = 100) individual ICEEMDAN decompositions with random parameters, while the latter employs only one ICEEMDAN decomposition. We randomly choose 5 out of these 100 decompositions and execute ICEEMDAN-WOA-RVFL for time series forecasting. Tables 7–9 report the MAPE, RMSE, and Dstat values of the MICEEMDAN-WOA-RVFL and the five single-decomposition ICEEMDAN-WOA-RVFL models, together with the mean values of these five single-decomposition models.

On one hand, compared with the prediction results of the five single decompositions and their mean, the proposed MICEEMDAN-WOA-RVFL achieves the lowest MAPE and the highest Dstat values in all 12 cases and the lowest RMSE values in 11 out of 12 cases, indicating that the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and improve the ensemble prediction accuracy.

On the other hand, we can find that the multiple decomposition strategy can greatly improve the stability of the prediction model. For example, the MAPE values of the five single-decomposition models with horizon 6 on the IP time series dataset range from 0.0034 to 0.1114, indicating that different single decompositions can produce considerably different prediction results. When we employ the multiple decomposition strategy, we can overcome the randomness of a single decomposition and thus enhance prediction stability.

In summary, the experimental results suggest that the multiple decomposition strategy and prediction ensemble can effectively enhance prediction accuracy and stability. The main reasons for the improvement lie in three aspects: (1) multiple decomposition reduces the randomness of a single decomposition and simultaneously generates groups of differentiated subseries; (2) predictions using these groups of differentiated subseries yield diverse prediction results; and (3) the selection and ensemble of these diverse prediction results ensure both accuracy and diversity and thus improve the final ensemble prediction.
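The aggregation step described above (sum the subseries forecasts within each decomposition, then combine across the M decompositions) can be sketched as follows. The function name and the plain averaging across runs are illustrative assumptions; the paper combines a selected subset of runs rather than all of them:

```python
import numpy as np

def aggregate_forecasts(runs):
    """Aggregate forecasts from M independent decomposition runs.

    runs : list of M arrays, each shaped (n_subseries, T), holding the
    subseries forecasts from one ICEEMDAN run. Because the decomposition
    is additive, subseries forecasts are summed within a run; the M
    run-level forecasts are then averaged.
    """
    per_run = np.array([np.sum(r, axis=0) for r in runs])   # shape (M, T)
    return per_run.mean(axis=0)
```

Each run's forecast already reconstructs the full series, so the cross-run average only smooths out decomposition randomness rather than changing the scale of the forecast.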

5.2. The Optimization Effectiveness of WOA. When we use RVFL networks to construct predictors, a number of parameters need to be set in advance. In this study, WOA is introduced to search the optimal parameter values for the RVFL predictors using its powerful optimization ability. To investigate the optimization effectiveness of WOA for parameter search, we compare the proposed MICEEMDAN-WOA-RVFL with MICEEMDAN-RVFL without WOA optimization. Following the literature [45], we fixed the number of hidden neurons Nh = 100, the activation function Func = sigmoid, and the random type Rand = Gaussian in MICEEMDAN-RVFL. The MAPE, RMSE, and Dstat values are reported in Tables 10–12, respectively.

On all four time series datasets, the prediction performance of the proposed MICEEMDAN-WOA-RVFL model is better than or equal to that of the MICEEMDAN-RVFL model without WOA optimization in terms of MAPE and RMSE in all 12 cases, except for the RMSE value with horizon 1 on the SSEC dataset, as listed in Tables 10 and 11. In addition, the MICEEMDAN-WOA-RVFL obtains higher Dstat values in 10 out of 12 cases, as can be seen in Table 12. All these results indicate that WOA can effectively search the optimal parameter settings for RVFL networks, further improving the overall prediction performance.
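For reference, a compact sketch of the WOA search loop used for this kind of parameter tuning (encircling prey, spiral bubble-net attack, and random search, following Mirjalili and Lewis), shown here minimizing a generic fitness function rather than the paper's actual RVFL validation error; all names and default settings are illustrative:

```python
import numpy as np

def woa_minimize(fitness, dim, lo, hi, n_whales=20, n_iter=200, b=1.0, seed=0):
    """Minimal whale optimization algorithm sketch for a box-bounded problem."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, (n_whales, dim))
    fits = np.array([fitness(x) for x in X])
    best, best_fit = X[fits.argmin()].copy(), fits.min()
    for t in range(n_iter):
        a = 2.0 * (1.0 - t / n_iter)                 # a decreases linearly 2 -> 0
        for i in range(n_whales):
            A = 2.0 * a * rng.random() - a
            C = 2.0 * rng.random()
            if rng.random() < 0.5:
                if abs(A) < 1:                        # exploitation: encircle best
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                                 # exploration: random whale
                    Xr = X[rng.integers(n_whales)]
                    X[i] = Xr - A * np.abs(C * Xr - X[i])
            else:                                     # spiral bubble-net move
                l = rng.uniform(-1.0, 1.0)
                X[i] = np.abs(best - X[i]) * np.exp(b * l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lo, hi)
            f = fitness(X[i])
            if f < best_fit:
                best, best_fit = X[i].copy(), f
    return best, best_fit
```

In the paper's setting, `fitness` would evaluate an RVFL network trained with the candidate parameter vector on a validation set; here any cheap objective can stand in for it.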

5.3. The Impact of Ensemble Size. Previous research has demonstrated that an ensemble strategy using all individual prediction models is unlikely to work well and that the selection of individual prediction models contributes to improving ensemble prediction performance [57]. In this study, we sort all individual prediction models based on their past performance (RMSE values) and then select the top N percent as the ensemble size to construct the ensemble prediction model. To further investigate the impact of ensemble size on ensemble prediction, we use different ensemble sizes (es = 10, 20, ..., 100) to select the top N percent of individual forecasting models to develop the ensemble prediction model and conduct the one-step-ahead forecasting experiment on the four time series datasets. The results are demonstrated in Figure 5.
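The top-N selection just described can be sketched as follows, assuming equal-weight averaging of the selected models' forecasts; the function and argument names are hypothetical:

```python
import numpy as np

def topn_ensemble(candidate_preds, val_rmse, es=30):
    """Select the top es% of candidate models by validation RMSE and
    average their test-set predictions.

    candidate_preds : list of prediction arrays, one per candidate model
    val_rmse        : past RMSE of each candidate (lower is better)
    es              : ensemble size, as a percentage of the candidates
    """
    k = max(1, int(round(len(candidate_preds) * es / 100.0)))
    chosen = np.argsort(val_rmse)[:k]                 # indices of the k best models
    return np.mean([candidate_preds[i] for i in chosen], axis=0), chosen
```

Sweeping `es` from 10 to 100 reproduces the kind of experiment summarized in Figure 5: a small, well-chosen subset tends to beat both a single model and the full pool.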

We can see that the MICEEMDAN-WOA-RVFL obtains the best forecasting performance on the WTI, IP, and SSEC datasets when the ensemble size es is in the range of 20–40, and on the USDEUR dataset when the ensemble size es


Table 7. The mean absolute percent error (MAPE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.0036 | 0.0043 | 0.0045 | 0.0045 | 0.0041 | 0.0040 | 0.0043
WTI | 3 | 0.0080 | 0.0088 | 0.0084 | 0.0084 | 0.0082 | 0.0083 | 0.0084
WTI | 6 | 0.0113 | 0.0132 | 0.0115 | 0.0113 | 0.0118 | 0.0120 | 0.0112
USDEUR | 1 | 0.0006 | 0.0011 | 0.0010 | 0.0008 | 0.0010 | 0.0008 | 0.0009
USDEUR | 3 | 0.0015 | 0.0021 | 0.0021 | 0.0015 | 0.0017 | 0.0015 | 0.0018
USDEUR | 6 | 0.0022 | 0.0033 | 0.0022 | 0.0023 | 0.0023 | 0.0024 | 0.0025
IP | 1 | 0.0012 | 0.0016 | 0.0016 | 0.0017 | 0.0017 | 0.0015 | 0.0016
IP | 3 | 0.0023 | 0.0024 | 0.0026 | 0.0030 | 0.0028 | 0.0024 | 0.0026
IP | 6 | 0.0032 | 0.0036 | 0.0034 | 0.1114 | 0.0036 | 0.0041 | 0.0252
SSEC | 1 | 0.0020 | 0.0024 | 0.0023 | 0.0026 | 0.0023 | 0.0023 | 0.0024
SSEC | 3 | 0.0044 | 0.0045 | 0.0046 | 0.0052 | 0.0045 | 0.0047 | 0.0047
SSEC | 6 | 0.0065 | 0.0067 | 0.0070 | 0.0085 | 0.0068 | 0.0075 | 0.0073

Table 8. The root mean squared error (RMSE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.2715 | 0.3316 | 0.3376 | 0.3352 | 0.3191 | 0.3148 | 0.3277
WTI | 3 | 0.5953 | 0.6557 | 0.6214 | 0.6200 | 0.6058 | 0.6133 | 0.6232
WTI | 6 | 0.8146 | 0.9490 | 0.8239 | 0.8199 | 0.8508 | 0.8597 | 0.8607
USDEUR | 1 | 0.0009 | 0.0017 | 0.0017 | 0.0012 | 0.0015 | 0.0012 | 0.0015
USDEUR | 3 | 0.0022 | 0.0031 | 0.0034 | 0.0023 | 0.0025 | 0.0023 | 0.0027
USDEUR | 6 | 0.0033 | 0.0048 | 0.0033 | 0.0035 | 0.0035 | 0.0036 | 0.0037
IP | 1 | 0.2114 | 0.2894 | 0.3080 | 0.3656 | 0.3220 | 0.2589 | 0.3088
IP | 3 | 0.3875 | 0.4098 | 0.4507 | 0.4809 | 0.4776 | 0.4027 | 0.4443
IP | 6 | 0.5340 | 0.5502 | 0.5563 | 1.0081 | 0.5697 | 0.6466 | 0.6661
SSEC | 1 | 10.8115 | 11.5515 | 13.0728 | 16.7301 | 12.2465 | 14.3671 | 13.5936
SSEC | 3 | 22.2983 | 22.0603 | 22.8140 | 33.9937 | 23.9755 | 25.8004 | 25.7288
SSEC | 6 | 35.7201 | 37.0255 | 41.9033 | 55.5066 | 38.6140 | 48.2131 | 44.2525

Table 9. The directional statistic (Dstat) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.9381 | 0.9201 | 0.9161 | 0.9184 | 0.9323 | 0.9271 | 0.9228
WTI | 3 | 0.8576 | 0.8414 | 0.8548 | 0.8385 | 0.8553 | 0.8513 | 0.8483
WTI | 6 | 0.7737 | 0.7419 | 0.7703 | 0.7668 | 0.7627 | 0.7616 | 0.7607
USDEUR | 1 | 0.9410 | 0.9008 | 0.9270 | 0.9204 | 0.9073 | 0.9157 | 0.9142
USDEUR | 3 | 0.8502 | 0.8052 | 0.8418 | 0.8399 | 0.8296 | 0.8427 | 0.8318
USDEUR | 6 | 0.7828 | 0.6826 | 0.7809 | 0.7819 | 0.7772 | 0.7706 | 0.7586
IP | 1 | 0.9256 | 0.8967 | 0.8926 | 0.9132 | 0.8967 | 0.9174 | 0.9033
IP | 3 | 0.8802 | 0.8760 | 0.8430 | 0.8223 | 0.8471 | 0.8802 | 0.8537
IP | 6 | 0.8141 | 0.8017 | 0.8017 | 0.7066 | 0.7851 | 0.8017 | 0.7794
SSEC | 1 | 0.9156 | 0.9052 | 0.9052 | 0.9052 | 0.9087 | 0.9135 | 0.9076
SSEC | 3 | 0.8396 | 0.8222 | 0.8292 | 0.8208 | 0.8368 | 0.8264 | 0.8271
SSEC | 6 | 0.7587 | 0.7462 | 0.7441 | 0.7134 | 0.7594 | 0.7448 | 0.7416

Table 10. The mean absolute percent error (MAPE) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.0036 | 0.0037
WTI | 3 | 0.0080 | 0.0082
WTI | 6 | 0.0113 | 0.0118
USDEUR | 1 | 0.0006 | 0.0006
USDEUR | 3 | 0.0015 | 0.0015
USDEUR | 6 | 0.0022 | 0.0023
IP | 1 | 0.0012 | 0.0013
IP | 3 | 0.0023 | 0.0023
IP | 6 | 0.0032 | 0.0036
SSEC | 1 | 0.0020 | 0.0020
SSEC | 3 | 0.0044 | 0.0044
SSEC | 6 | 0.0065 | 0.0066


Table 11. The root mean squared error (RMSE) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.2715 | 0.2824
WTI | 3 | 0.5953 | 0.6079
WTI | 6 | 0.8146 | 0.8485
USDEUR | 1 | 0.0009 | 0.0010
USDEUR | 3 | 0.0022 | 0.0022
USDEUR | 6 | 0.0033 | 0.0034
IP | 1 | 0.2114 | 0.2362
IP | 3 | 0.3875 | 0.4033
IP | 6 | 0.5340 | 0.5531
SSEC | 1 | 10.8115 | 10.7616
SSEC | 3 | 22.2983 | 22.7456
SSEC | 6 | 35.7201 | 36.3150

Table 12. The directional statistic (Dstat) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.9381 | 0.9358
WTI | 3 | 0.8576 | 0.8565
WTI | 6 | 0.7737 | 0.7656
USDEUR | 1 | 0.9410 | 0.9317
USDEUR | 3 | 0.8502 | 0.8455
USDEUR | 6 | 0.7828 | 0.7762
IP | 1 | 0.9256 | 0.9174
IP | 3 | 0.8802 | 0.8430
IP | 6 | 0.8141 | 0.7893
SSEC | 1 | 0.9156 | 0.9191
SSEC | 3 | 0.8396 | 0.8351
SSEC | 6 | 0.7587 | 0.7594



is in the range of 10–40, in terms of MAPE, RMSE, and Dstat. When es is greater than 40, the MAPE and RMSE values continue to worsen and become the worst when the ensemble size grows to 100. The experimental results indicate that the ensemble size has a significant overall impact on ensemble prediction and that an ideal range of ensemble size is about 20 to 40.

6. Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. The MICEEMDAN-WOA-RVFL first uses ICEEMDAN to separate the original economic and financial time series into groups of subseries multiple times, and then RVFL networks are used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve the prediction accuracy. Thirdly, the predictions of the subseries in each decomposition are integrated into the forecasting results of that decomposition using addition. Finally, the prediction results of the individual decompositions are selected based on RMSE values and combined as the final prediction results.

As far as we know, this is the first time that WOA is employed for the optimal parameter search for RVFL networks and that the multiple decomposition strategy is introduced in time series forecasting. The empirical results indicate that (1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy in various economic and financial time series forecasting tasks; (2) WOA can effectively search optimal parameters for RVFL networks and improve the prediction performance of economic and financial time series forecasting; and (3) the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and enhance the prediction accuracy and stability of the developed prediction model.

We will extend our study in two aspects in the future: (1) applying the MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble method of individual forecasting models to further enhance the prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and

Figure 5: The impact of ensemble size with one-step-ahead forecasting (MAPE, RMSE, and Dstat against ensemble size from 10 to 100 for the WTI, USDEUR, IP, and SSEC datasets).


the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365–374, 2017.

[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.

[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831–848, 2005.

[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192–196, IEEE, Warsaw, Poland, September 2005.

[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?" Journal of International Economics, vol. 60, no. 1, pp. 85–107, 2003.

[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7–18, 2014.

[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13–29, 2014.

[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331–350, 2010.

[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201–1208, 2008.

[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111–117, 2010.

[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1–9, 2013.

[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97–111, 2018.

[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28–39, 2014.

[14] X. Li, H. Xie, R. Wang et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67–78, 2016.

[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22–42, 2018.

[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457–480, 2018.

[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27–38, 2014.

[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN, and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284–308, 2014.

[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.

[20] S. Lahmiri, "A variational mode decompoisition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268–273, 2016.

[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228–1244, 2013.

[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603–3628, 2019.

[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235–6250, 2014.

[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164–178, 2016.

[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754–762, 2016.

[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1–13, 2007.

[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903–995, 1998.

[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1–41, 2009.

[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144–4147, IEEE, Prague, Czech Republic, May 2011.

[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140–165, 2020.

[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882–1905, 2018.

[32] T. Li, M. Zhou, C. Guo et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.

[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19–29, 2014.

[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852–1872, 2020.

[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.

[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205–1225, 2019.

[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1–10, 2020.

[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1–15, 2018.

[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm, Multi-Objective Whale Optimization Algorithm, for wind speed forecasting," Applied Energy, vol. 208, pp. 344–360, 2017.

[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250–260, 2019.

[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163–180, 1994.

[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094–1105, 2016.

[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078–1093, 2016.

[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097–1108, 2018.

[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319–340, 2019.

[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5–24, 2020.

[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.

[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983–994, 2020.

[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49–58, 2019.

[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.

[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165–4172, 2019.

[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319–7327, 2020.

[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525–541, 2018.

[55] FRED website, https://fred.stlouisfed.org.

[56] NetEase website, http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.

[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31–53, 2006.



23 RandomVector Functional Link (RVFL) Neural NetworkAs a kind of modification of the multilayer perceptron(MLP) model the random vector functional link (RVFL)neural network was proposed by Pao et al [42] )e RVFLneural network is able to overcome the slow convergenceoverfitting and local minimum inherently in the traditionalgradient-based learning algorithms Like MLP the archi-tecture of the RVFL neural network includes three layerswhich is illustrated in Figure 2)emain modification in theRVFL network lies in the connection in the networkstructure Since the RVFL network has the direct connec-tions from the input layer to the output layer it can performbetter compared to no direct link [43 44]

)e neurons in the hidden layer known as enhancementnodes calculate the sum of all the output of the input layerneurons and obtain their output with an activation function

hm g 1113944

N

n1wnmin + bm

⎛⎝ ⎞⎠ (4)

where $w_{nm}$ denotes the weight between $i_n$ and $h_m$, $b_m$ represents the bias of the $m$th neuron in the hidden layer, and $g(\cdot)$ represents an activation function.

The output-layer neurons integrate all the outputs from the hidden-layer and input-layer neurons, and the final output is

$$o_l = \sum_{m=1}^{M} w_{ml} h_m + \sum_{n=1}^{N} w_{nl} i_n, \qquad (5)$$

where $w_{ml}$ represents the weight between $h_m$ and $o_l$, and $w_{nl}$ indicates the weight between $i_n$ and $o_l$.

To enhance training efficiency, the RVFL neural network uses a given distribution to fix the values of $w_{nm}$ and $b_m$, and obtains the weights $w_{ml}$ and $w_{nl}$ by minimizing the system error:

$$E = \frac{1}{2P} \sum_{j=1}^{P} \left(t^{(j)} - B d^{(j)}\right)^{2}, \qquad (6)$$

where $P$ indicates the number of training samples, $t^{(j)}$ are the target values, $B$ is the combination of $w_{ml}$ and $w_{nl}$, and $d^{(j)}$ represents a combined vector.

The RVFL neural network has demonstrated extremely efficient and fast forecasting ability and has been frequently used in time series forecasting [44, 45].
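The fixed-random-weights training described by Eqs. (4)–(6) can be sketched in a few lines. The experiments in this paper were run in Matlab; the following is an illustrative Python sketch, not the authors' implementation, and the function names (`train_rvfl`, `predict_rvfl`), the sigmoid activation, and the ridge regularizer `reg` are assumptions for the example.

```python
import numpy as np

def train_rvfl(X, y, n_hidden=10, reg=1e-6, seed=0):
    """Fix random input-to-hidden weights (w_nm, b_m), then solve the output
    weights B (w_ml and w_nl stacked) in closed form by regularized least
    squares, minimizing the system error of Eq. (6)."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # fixed weights w_nm
    b = rng.normal(size=n_hidden)                 # fixed biases b_m
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))        # enhancement nodes, Eq. (4)
    D = np.hstack([H, X])                         # hidden outputs plus direct links
    B = np.linalg.solve(D.T @ D + reg * np.eye(D.shape[1]), D.T @ y)
    return W, b, B

def predict_rvfl(X, W, b, B):
    """Output layer combines hidden and direct input contributions, Eq. (5)."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.hstack([H, X]) @ B
```

Because of the direct input-to-output links, a linear trend is captured well even with few enhancement nodes, which is one motivation for preferring the RVFL structure over a plain randomized MLP.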

3. MICEEMDAN-WOA-RVFL: The Proposed Approach for Economic and Financial Time Series Forecasting

Referring to the framework of "decomposition and ensemble," we design a multidecomposition and self-optimizing hybrid model that integrates multiple ICEEMDANs, WOA, and RVFL networks, termed MICEEMDAN-WOA-RVFL, to forecast economic and financial time series. The architecture of the proposed hybrid model is illustrated in Figure 3.

Our proposed MICEEMDAN-WOA-RVFL takes advantage of the idea of "divide and conquer" that is frequently used in time series forecasting, image processing, fault diagnosis, and so on [46–53]. A procedure of "decomposition and ensemble" in the MICEEMDAN-WOA-RVFL is as follows:

(i) Stage 1, decomposition: ICEEMDAN is employed to separate the original time series into several subseries (i.e., several IMFs and one residue).

Figure 1: The flowchart of the whale optimization algorithm (WOA). The flowchart shows the following steps: initialize the whale population Pi (i = 1, 2, ..., n); calculate the fitness of each individual and obtain the best individual Pbest; while t is less than the maximum number of iterations, update A, C, p, and l for each individual and update the position of the current search agent by Eq. (1), Eq. (2), or Eq. (3), according to the values of p (threshold 0.5) and |A| (threshold 1); then recalculate the fitness of each individual, update the best individual Pbest, and set t = t + 1; finally, output the best individual Pbest.
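The loop of Figure 1 can be summarized in code. Since the paper's Eqs. (1)–(3) are not reproduced in this excerpt, the update formulas below follow the commonly published WOA (encircling, random search, and logarithmic spiral); this is a hedged Python sketch for minimization, with all names assumed.

```python
import numpy as np

def woa(fitness, dim, n_whales=30, max_iter=60, lb=-5.0, ub=5.0, seed=1):
    """Minimal WOA sketch: initialize the population, then repeatedly move each
    whale by encircling the best solution, exploring around a random whale, or
    spiraling toward the best, while tracking the best individual P_best."""
    rng = np.random.default_rng(seed)
    P = rng.uniform(lb, ub, size=(n_whales, dim))
    fits = np.array([fitness(p) for p in P])
    best, best_fit = P[fits.argmin()].copy(), fits.min()
    for t in range(max_iter):
        a = 2.0 * (1.0 - t / max_iter)            # a decreases linearly from 2 to 0
        for i in range(n_whales):
            A = 2.0 * a * rng.random() - a        # coefficient A in [-a, a]
            C = 2.0 * rng.random(dim)             # coefficient C
            if rng.random() < 0.5:
                if abs(A) < 1:                    # exploit: encircle the best solution
                    P[i] = best - A * np.abs(C * best - P[i])
                else:                             # explore: move around a random whale
                    rand = P[rng.integers(n_whales)]
                    P[i] = rand - A * np.abs(C * rand - P[i])
            else:                                 # spiral update toward the best
                l = rng.uniform(-1.0, 1.0)
                P[i] = np.abs(best - P[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            P[i] = np.clip(P[i], lb, ub)
            f = fitness(P[i])
            if f < best_fit:                      # update the best individual P_best
                best, best_fit = P[i].copy(), f
    return best, best_fit
```

For instance, minimizing the 2-D sphere function `sum(x**2)` with this sketch drives the best fitness close to zero within a few dozen iterations.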


(ii) Stage 2, individual forecasting: each decomposed subseries is divided into a training dataset and a testing dataset; an RVFL network with WOA optimization is then developed on each training dataset independently, and finally the developed RVFL model is applied to the corresponding testing dataset. The reason why we select the RVFL network as the predictor is its powerful forecasting ability in extant research [15, 34, 44]. Since the parameter settings of the RVFL network play an important role in prediction performance, we introduce WOA to seek the optimal parameter values for the RVFL network in the forecasting stage.

(iii) Stage 3, ensemble: the predictions of all the decomposed subseries are aggregated into the final prediction results of one "decomposition and ensemble" run using addition.

In this study, to enhance both the accuracy and stability of the final prediction, we generate random values for the decomposition parameters of ICEEMDAN in the decomposition stage, including the number of realizations (Nr), the noise standard deviation (Nsd), and the maximum number of sifting iterations (Maxsi), for each decomposition; repeat the procedure of "decomposition and ensemble" M times; and finally combine the results of the multiple "decompositions and ensembles" using the RMSE-weighted method as the final prediction results. The corresponding weight of the ith forecasting model is

$$\mathrm{Weight}_i = \frac{1/\mathrm{RMSE}_i}{\sum_{j=1}^{M} 1/\mathrm{RMSE}_j}, \qquad (7)$$

where M denotes the number of individual models and RMSE_i indicates the RMSE value of the ith forecasting model in the training process.
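Eq. (7) amounts to normalizing the reciprocals of the training RMSEs, so better-performing models receive proportionally larger weights. A minimal sketch (function name assumed):

```python
import numpy as np

def rmse_weighted_ensemble(predictions, train_rmses):
    """Combine M model predictions with weights inversely proportional to each
    model's training RMSE, as in Eq. (7).
    predictions: array-like of shape (M, T); train_rmses: length-M array."""
    inv = 1.0 / np.asarray(train_rmses, dtype=float)
    weights = inv / inv.sum()   # Weight_i = (1/RMSE_i) / sum_j (1/RMSE_j)
    combined = np.asarray(predictions, dtype=float).T @ weights
    return weights, combined
```

A model whose training RMSE is three times larger than another's thus receives one third of the other's weight.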

Figure 3: The flowchart of the proposed MICEEMDAN-WOA-RVFL. The economic and financial time series is decomposed m times by ICEEMDAN1, ..., ICEEMDANm, each with randomized parameters, into IMF1, ..., IMFn and a residue; each subseries is forecast by an RVFL network optimized by WOA; the subseries predictions of each decomposition are combined by addition into predicted results 1, ..., m, which are finally ensembled into the final predicted results.

Figure 2: The architecture of the random vector functional link (RVFL) neural network: an input layer (i1, ..., iN), a hidden layer (h1, ..., hM), and an output layer (o1, ..., oL), with input-to-hidden weights wnm and biases bm, hidden-to-output weights wml, and direct input-to-output weights wnl.


Although some recent studies also employ the RVFL network for time series forecasting, they obviously differ from the current study in the decomposition technique and network optimization: (1) they divide the original time series using WD or EMD; (2) they construct RVFL networks using fixed parameter values. Unlike these previous studies, our study decomposes economic and financial time series using ICEEMDAN and searches for the optimal parameter values of the RVFL networks with WOA. Furthermore, the previous research mainly focuses on dividing the original time series by one single decomposition or by dual decomposition [20, 35, 54]. In dual decomposition, the original signal is first decomposed into several components, and then the high-frequency components are further decomposed into other components using the same or a different decomposition method. Essentially, the dual decomposition process still belongs to one decomposition, just including two different decomposition stages. Unlike the previous research, one main improvement in this study is the multiple decomposition strategy, which can successfully overcome the randomness of one single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach. To our knowledge, this is the first time that the multiple decomposition strategy has been developed for the forecasting of economic and financial time series.
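The multiple decomposition strategy can be sketched as a loop that redraws the ICEEMDAN parameters on every run. In the sketch below, `decompose` and `forecast` are hypothetical placeholders standing in for ICEEMDAN and the WOA-tuned RVFL predictor, and the sampling ranges mirror the parameter ranges given later in Table 2.

```python
import numpy as np

def multi_decomposition_forecast(series, decompose, forecast, m=100, seed=0):
    """Run the decomposition m times with randomly drawn parameters, forecast
    each subseries, and sum the subseries forecasts (Stage 3 addition) to
    obtain one prediction per decomposition."""
    rng = np.random.default_rng(seed)
    per_run = []
    for _ in range(m):
        nsd = rng.uniform(0.01, 0.4)       # noise standard deviation (Nsd)
        nr = int(rng.integers(50, 501))    # number of realizations (Nr)
        maxsi = int(rng.integers(2000, 8001))  # max sifting iterations (Maxsi)
        subseries = decompose(series, nsd, nr, maxsi)
        per_run.append(sum(forecast(s) for s in subseries))
    return per_run
```

The m per-run predictions returned here are what the RMSE-weighted rule of Eq. (7) finally combines.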

4. Experimental Results

4.1. Data Description. As we know, economic and financial time series are influenced by various factors, sometimes rising and falling sharply in a short time. The dramatic fluctuations usually lead to significant nonlinearity and nonstationarity of the time series. To comprehensively evaluate the effectiveness of the proposed MICEEMDAN-WOA-RVFL, we choose four different time series, including the West Texas Intermediate crude oil spot price (WTI), the US dollar/Euro foreign exchange rate (USD/EUR), US industrial production (IP), and the Shanghai Stock Exchange Composite Index (SSEC), as the experimental datasets. The first three datasets can be accessed via the website of St. Louis Fed Research [55], and the last one can be obtained via the website of NetEase [56].

Each time series is separated into two subdatasets: the first 80% for training and the last 20% for testing. Table 1 shows the divided samples of the abovementioned four economic and financial time series.
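The chronological split can be reproduced with a one-line cut; with a training fraction of 0.8 this recovers, for example, the 6912/1729 split of the 8641-observation WTI sample in Table 1. A sketch (function name assumed):

```python
def split_series(series, train_frac=0.8):
    """Chronological 80/20 split: no shuffling, so the test set lies strictly
    after the training set in time."""
    cut = int(len(series) * train_frac)
    return series[:cut], series[cut:]
```

Keeping the split chronological avoids look-ahead leakage, which matters for financial series whose distribution shifts over time.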

We utilize ICEEMDAN to decompose these time series into groups of relatively simple subseries. Figure 4 offers an example of the decomposition of the WTI dataset using ICEEMDAN.

4.2. Evaluation Indices. In this study, we use four evaluation metrics, namely, the mean absolute percent error (MAPE), the root mean squared error (RMSE), the directional statistic (Dstat), and the Diebold–Mariano (DM) test, to assess the performance of the proposed model. Among them, MAPE and RMSE are used to assess the forecasting error, defined as follows:

$$\mathrm{MAPE} = \frac{1}{N} \sum_{t=1}^{N} \left| \frac{O_t - P_t}{O_t} \right| \times 100, \qquad (8)$$

$$\mathrm{RMSE} = \sqrt{\frac{1}{N} \sum_{t=1}^{N} \left(O_t - P_t\right)^{2}}, \qquad (9)$$

where N is the size of the evaluated sample, $O_t$ denotes the actual value, and $P_t$ represents the predicted value at time t. The lower the values of RMSE and MAPE, the better the prediction model.

The Dstat measures the performance of direction prediction and is formulated as follows:

$$D_{\mathrm{stat}} = \frac{1}{N} \sum_{t=1}^{N} d_t \times 100, \qquad (10)$$

where $d_t = 1$ if $(P_{t+1} - O_t)(O_{t+1} - O_t) \ge 0$ and $d_t = 0$ otherwise. A higher value of Dstat indicates a more accurate direction prediction.
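Eqs. (8)–(10) translate directly into code. A Python sketch (function names assumed; the boundary convention for the directional statistic, here averaging over the N − 1 available steps, varies slightly across papers):

```python
import numpy as np

def mape(o, p):
    """Mean absolute percent error, Eq. (8), reported as a percentage."""
    o, p = np.asarray(o, float), np.asarray(p, float)
    return np.mean(np.abs((o - p) / o)) * 100.0

def rmse(o, p):
    """Root mean squared error, Eq. (9)."""
    o, p = np.asarray(o, float), np.asarray(p, float)
    return float(np.sqrt(np.mean((o - p) ** 2)))

def dstat(o, p):
    """Directional statistic, Eq. (10): share of steps where the predicted
    move (P_{t+1} - O_t) has the same sign as the actual move (O_{t+1} - O_t)."""
    o, p = np.asarray(o, float), np.asarray(p, float)
    hit = (p[1:] - o[:-1]) * (o[1:] - o[:-1]) >= 0
    return hit.mean() * 100.0
```

A perfect forecast gives MAPE = 0, RMSE = 0, and Dstat = 100.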

Furthermore, to test the significance of the differences in prediction performance between pairs of models, we employ the Diebold–Mariano (DM) test in this study.

4.3. Experimental Settings. In this study, we compare the proposed MICEEMDAN-WOA-RVFL with several state-of-the-art forecasting models, including single models and ensemble models. Among them, the single models include one popular statistical model, RW, and two popular AI models, BPNN and least squares SVR (LSSVR). The ensemble models derive from the combination of the single models and the decomposition method ICEEMDAN.

The detailed parameters of all the prediction models, the decomposition approach ICEEMDAN, and the optimization method WOA used in the experiments are shown in Table 2. The parameter values of BPNN, LSSVR, RVFL, and ICEEMDAN follow the previous literature [22, 34, 45].

All experiments were conducted using Matlab R2019b on a PC with 64-bit Microsoft Windows 10, 8 GB RAM, and a 1.8 GHz i7-8565U CPU.

4.4. Results and Analysis. We compare the forecasting performance of six prediction models, including three single models (RW, LSSVR, and BPNN) and three ensemble models (ICEEMDAN-RW, ICEEMDAN-LSSVR, and ICEEMDAN-BPNN), with that of our proposed MICEEMDAN-WOA-RVFL in terms of MAPE, RMSE, and Dstat. Because of the different horizons, we train different forecasting models separately; that is, we use the proposed scheme to train a separate model for each horizon. Tables 3–5 report the experimental results in terms of each evaluation index for the 1-, 3-, and 6-step horizons, respectively.

From Table 3, we can see that the proposed MICEEMDAN-WOA-RVFL obtains the lowest (best) MAPE values for all horizons on all the datasets. Among all the compared single models, RW obtains the best MAPE values for all horizons on all the datasets,


Figure 4: The WTI crude oil price series (2 January 1986 to 15 April 2020) and the corresponding subseries decomposed by ICEEMDAN (IMF1–IMF11 and the residue).

Table 1: Samples of economic and financial time series.

Dataset | Type | Subset | Size | Date
WTI | Daily | Sample set | 8641 | 2 January 1986 to 15 April 2020
WTI | Daily | Training set | 6912 | 2 January 1986 to 24 May 2013
WTI | Daily | Testing set | 1729 | 28 May 2013 to 15 April 2020
USD/EUR | Daily | Sample set | 5341 | 4 January 1999 to 10 April 2020
USD/EUR | Daily | Training set | 4272 | 4 January 1999 to 30 December 2015
USD/EUR | Daily | Testing set | 1069 | 31 December 2015 to 10 April 2020
IP | Monthly | Sample set | 1215 | January 1919 to March 2020
IP | Monthly | Training set | 972 | January 1919 to December 1999
IP | Monthly | Testing set | 243 | January 2000 to March 2020
SSEC | Daily | Sample set | 7172 | 19 December 1990 to 21 April 2020
SSEC | Daily | Training set | 5737 | 19 December 1990 to 4 June 2014
SSEC | Daily | Testing set | 1435 | 5 June 2014 to 21 April 2020


demonstrating that it is better than LSSVR and BPNN for the forecasting of economic and financial time series. Of all the ensemble models, the ICEEMDAN-LSSVR and ICEEMDAN-BPNN models obtain close MAPE values, obviously better than ICEEMDAN-RW.

The RMSE values for the four time series datasets are listed in Table 4. From this table, we can find that the proposed MICEEMDAN-WOA-RVFL outperforms all the single and ensemble models for all horizons on all the datasets. Among the single models, the statistical model RW obtains the best RMSE values in 10 out of 12 cases, demonstrating that it is more suitable for economic and financial time series forecasting than LSSVR and BPNN. As for the ensemble models, the proposed MICEEMDAN-WOA-RVFL obtains lower RMSE values than the compared ensemble models, demonstrating that it is more effective for economic and financial time series forecasting.

Table 5 shows the directional statistic Dstat. We can see that MICEEMDAN-WOA-RVFL achieves the highest Dstat values in all 12 cases, indicating better direction-forecasting performance. Among the single prediction models, LSSVR and RW each obtain the best Dstat values in 5 cases, better than BPNN. Similarly, the ICEEMDAN-LSSVR and ICEEMDAN-BPNN models obtain close Dstat values, obviously better than the ICEEMDAN-RW model in all 12 cases.

From all the prediction results, we can find that all the ensemble prediction models except ICEEMDAN-RW greatly outperform the corresponding single prediction models in all 12 cases, showing that the framework of decomposition and ensemble is an effective tool for improving the forecasting

Table 2: The settings for the parameters.

Method | Parameter | Description
ICEEMDAN | Nsd = 0.2 | Noise standard deviation
ICEEMDAN | Nr = 100 | Number of realizations
ICEEMDAN | Maxsi = 5000 | Maximum number of sifting iterations
LSSVR | Rp in {2^-10, 2^-9, ..., 2^11, 2^12} | Regularization parameter
LSSVR | WidRBF in {2^-10, 2^-9, ..., 2^11, 2^12} | Width of the RBF kernel
BPNN | Nhe = 10 | Number of hidden neurons
BPNN | Maxte = 1000 | Maximum training epochs
BPNN | Lr = 0.0001 | Learning rate
WOA | Pop = 40 | Population size
WOA | Maxgen = 100 | Maximum generation
MICEEMDAN-WOA-RVFL | Nsd in [0.01, 0.4] | Noise standard deviation in ICEEMDAN
MICEEMDAN-WOA-RVFL | Nr in [50, 500] | Number of realizations in ICEEMDAN
MICEEMDAN-WOA-RVFL | Maxsi in [2000, 8000] | Maximum number of sifting iterations in ICEEMDAN
MICEEMDAN-WOA-RVFL | Nhe in [5, 30] | Number of hidden neurons in RVFL
MICEEMDAN-WOA-RVFL | Func in {sigmoid, sine, hardlim, tribas, radbas, sign} | Activation function in RVFL
MICEEMDAN-WOA-RVFL | Mod in {1: regularized least squares, 2: Moore–Penrose pseudoinverse} | Mode in RVFL
MICEEMDAN-WOA-RVFL | Lag in [3, 20] | Lag in RVFL
MICEEMDAN-WOA-RVFL | Bias in {true, false} | Bias in RVFL
MICEEMDAN-WOA-RVFL | Rand in {1: Gaussian, 2: Uniform} | Random type in RVFL
MICEEMDAN-WOA-RVFL | Scale in [0.1, 1] | Scale value in RVFL
MICEEMDAN-WOA-RVFL | ScaleMode in {1: scale the features for all neurons, 2: scale the features for each hidden neuron, 3: scale the range of the randomization for uniform distribution} | Scale mode in RVFL

Table 3: The mean absolute percent error (MAPE) values of different prediction models.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN
WTI | 1 | 0.0036 | 0.0176 | 0.0177 | 0.0182 | 0.0200 | 0.0044 | 0.0055
WTI | 3 | 0.0080 | 0.0307 | 0.0309 | 0.0320 | 0.0331 | 0.0092 | 0.0094
WTI | 6 | 0.0113 | 0.0441 | 0.0449 | 0.0457 | 0.0458 | 0.1202 | 0.0130
USD/EUR | 1 | 0.0006 | 0.0034 | 0.0034 | 0.0034 | 0.0044 | 0.0010 | 0.0010
USD/EUR | 3 | 0.0015 | 0.0063 | 0.0063 | 0.0063 | 0.0071 | 0.0020 | 0.0021
USD/EUR | 6 | 0.0022 | 0.0087 | 0.0088 | 0.0087 | 0.0094 | 0.0032 | 0.0031
IP | 1 | 0.0012 | 0.0049 | 0.0057 | 0.0056 | 0.0053 | 0.0024 | 0.0022
IP | 3 | 0.0023 | 0.0102 | 0.0140 | 0.0123 | 0.0104 | 0.0033 | 0.0033
IP | 6 | 0.0032 | 0.0183 | 0.0279 | 0.0235 | 0.0184 | 0.0043 | 0.0049
SSEC | 1 | 0.0020 | 0.0096 | 0.0096 | 0.0097 | 0.0111 | 0.0026 | 0.0026
SSEC | 3 | 0.0044 | 0.0178 | 0.0178 | 0.0179 | 0.0185 | 0.0047 | 0.0051
SSEC | 6 | 0.0065 | 0.0259 | 0.0259 | 0.0269 | 0.0268 | 0.0072 | 0.0074


performance. Of all the compared models, the proposed MICEEMDAN-WOA-RVFL obtains the highest Dstat values and the lowest MAPE and RMSE values on all the time series datasets, showing that it is completely superior to the benchmark prediction models. Furthermore, for each prediction model, the MAPE and RMSE values increase while the Dstat values decrease with the horizon. This demonstrates that it is easier to forecast time series with a short horizon than with a long one. It is worth noting that the proposed MICEEMDAN-WOA-RVFL still achieves relatively good MAPE and RMSE values as the horizon increases. For example, although the proposed model obtains a Dstat of 0.7737 with horizon 6 on the WTI dataset, missing the direction in about 22.63% of cases, it achieves a relatively low MAPE (0.0113) and RMSE (0.8146), indicating that the predicted values are very close to the real values. In other words, the proposed MICEEMDAN-WOA-RVFL can still achieve satisfactory forecasting accuracy with a long horizon.

Furthermore, we can find that the multiple decomposition strategy does not improve the forecasts of the RW model. One possible explanation is simply that RW assumes that the past movement or trend of a time series cannot be used to predict its future movement; thus, it cannot take advantage of the historical data and the diversity of multiple decompositions. Therefore, when we aggregate the multiple RW model predictions of all the decomposed subseries, we merely integrate several random predictions and thus cannot significantly improve the ensemble prediction results. In contrast, LSSVR and BPNN, as well as RVFL, can fully use the historical data and the diversity of multiple decompositions to make future predictions. Specifically, the multiple decompositions using different parameters generate many groups of different decomposed subseries, and this diversity of decomposition can successfully overcome the randomness of one single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach.

In addition, the Diebold–Mariano (DM) test is utilized to evaluate whether the forecasting accuracy of the proposed MICEEMDAN-WOA-RVFL significantly outperforms that of the other compared models. Table 6 shows the statistics and p values (in brackets).

On the one hand, the DM statistics between the ensemble prediction models and their corresponding single predictors are much lower than zero, and the corresponding p values are almost equal to zero for all horizons, except for the RW model, showing that the architecture of "decomposition and ensemble" contributes to greatly improving prediction accuracy and that the combination of ICEEMDAN and AI predictors is more effective for economic and financial time series forecasting.

On the other hand, the DM test results on all four time series datasets indicate that MICEEMDAN-WOA-RVFL is significantly better than the single models and the other ensemble models for all horizons, with the corresponding p values much lower than 0.01 in all cases.

In summary, the DM test results demonstrate that the combination of multiple ICEEMDANs, RVFL networks, and

Table 4: The root mean squared error (RMSE) values of different prediction models.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN
WTI | 1 | 0.2715 | 1.3196 | 1.3228 | 1.3744 | 1.8134 | 0.3467 | 0.4078
WTI | 3 | 0.5953 | 2.1784 | 2.1929 | 2.2867 | 2.5041 | 0.6754 | 0.7620
WTI | 6 | 0.8146 | 3.0737 | 3.1271 | 3.1845 | 3.2348 | 0.8692 | 0.9462
USD/EUR | 1 | 0.0009 | 0.0051 | 0.0052 | 0.0052 | 0.0099 | 0.0015 | 0.0016
USD/EUR | 3 | 0.0022 | 0.0091 | 0.0092 | 0.0092 | 0.0129 | 0.0031 | 0.0031
USD/EUR | 6 | 0.0033 | 0.0127 | 0.0128 | 0.0127 | 0.0154 | 0.0047 | 0.0046
IP | 1 | 0.2114 | 0.7528 | 0.8441 | 0.8230 | 0.8205 | 0.3861 | 0.4175
IP | 3 | 0.3875 | 1.4059 | 1.8553 | 1.8311 | 1.4401 | 0.5533 | 0.5294
IP | 6 | 0.5340 | 2.4459 | 3.6015 | 2.8116 | 2.4569 | 0.6408 | 0.8264
SSEC | 1 | 10.8115 | 50.0742 | 49.8800 | 50.8431 | 63.3951 | 12.6460 | 13.7928
SSEC | 3 | 22.2983 | 89.1834 | 88.5290 | 89.1781 | 93.2887 | 25.2526 | 29.7784
SSEC | 6 | 35.7201 | 131.6386 | 131.3487 | 133.5647 | 137.2598 | 40.4357 | 49.6341

Table 5: The directional statistic (Dstat) values of different prediction models.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN
WTI | 1 | 0.9381 | 0.4815 | 0.5243 | 0.5191 | 0.4977 | 0.9190 | 0.9097
WTI | 3 | 0.8576 | 0.4873 | 0.5087 | 0.5012 | 0.4907 | 0.8300 | 0.8449
WTI | 6 | 0.7737 | 0.5116 | 0.4902 | 0.4948 | 0.5185 | 0.7714 | 0.7575
USD/EUR | 1 | 0.9410 | 0.5019 | 0.4719 | 0.4897 | 0.5056 | 0.8998 | 0.9026
USD/EUR | 3 | 0.8502 | 0.4888 | 0.4897 | 0.4925 | 0.4916 | 0.8118 | 0.8024
USD/EUR | 6 | 0.7828 | 0.4991 | 0.5318 | 0.5253 | 0.5047 | 0.7023 | 0.7154
IP | 1 | 0.9256 | 0.5579 | 0.5909 | 0.5785 | 0.5620 | 0.8636 | 0.8760
IP | 3 | 0.8802 | 0.6983 | 0.5455 | 0.4876 | 0.6529 | 0.8430 | 0.8058
IP | 6 | 0.8141 | 0.6198 | 0.5537 | 0.5661 | 0.6405 | 0.7355 | 0.7645
SSEC | 1 | 0.9156 | 0.4944 | 0.5049 | 0.5098 | 0.4979 | 0.9024 | 0.9002
SSEC | 3 | 0.8396 | 0.5042 | 0.5021 | 0.5007 | 0.4965 | 0.8222 | 0.8145
SSEC | 6 | 0.7587 | 0.4833 | 0.5063 | 0.4965 | 0.4805 | 0.7448 | 0.7455


Table 6: Results of the Diebold–Mariano (DM) test. Each cell reports the DM statistic of the tested model against the column model, with the p value in brackets.

Columns: ICEEMDAN-LSSVR | ICEEMDAN-BPNN | ICEEMDAN-RW | LSSVR | BPNN | RW

WTI, horizon 1
MICEEMDAN-WOA-RVFL | −4.9081 (≤0.0001) | −7.0030 (≤0.0001) | −6.1452 (≤0.0001) | −10.7240 (≤0.0001) | −10.3980 (≤0.0001) | −10.6210 (≤0.0001)
ICEEMDAN-LSSVR | — | −3.5925 (0.0003) | −6.0620 (≤0.0001) | −10.5640 (≤0.0001) | −10.3680 (≤0.0001) | −10.4660 (≤0.0001)
ICEEMDAN-BPNN | — | — | −5.9718 (≤0.0001) | −10.2070 (≤0.0001) | −9.9529 (≤0.0001) | −10.0960 (≤0.0001)
ICEEMDAN-RW | — | — | — | 3.0776 (0.0021) | 2.7682 (0.0057) | 3.0951 (0.0020)
LSSVR | — | — | — | — | −1.8480 (0.0648) | 1.0058 (0.3147)
BPNN | — | — | — | — | — | 1.9928 (0.0464)

WTI, horizon 3
MICEEMDAN-WOA-RVFL | −4.9136 (≤0.0001) | −3.6146 (0.0003) | −10.8660 (≤0.0001) | −16.5990 (≤0.0001) | −15.6260 (≤0.0001) | −16.6720 (≤0.0001)
ICEEMDAN-LSSVR | — | −2.1143 (0.0346) | −10.7560 (≤0.0001) | −16.7020 (≤0.0001) | −15.6780 (≤0.0001) | −16.7760 (≤0.0001)
ICEEMDAN-BPNN | — | — | −10.5250 (≤0.0001) | −16.2160 (≤0.0001) | −15.3500 (≤0.0001) | −16.2430 (≤0.0001)
ICEEMDAN-RW | — | — | — | 3.0818 (0.0021) | 2.0869 (0.0370) | 3.2155 (0.0012)
LSSVR | — | — | — | — | −2.7750 (0.0056) | 2.3836 (0.0173)
BPNN | — | — | — | — | — | 3.0743 (0.0021)

WTI, horizon 6
MICEEMDAN-WOA-RVFL | −3.9302 (≤0.0001) | −5.1332 (≤0.0001) | −18.9130 (≤0.0001) | −19.0620 (≤0.0001) | −21.9930 (≤0.0001) | −19.5080 (≤0.0001)
ICEEMDAN-LSSVR | — | −3.4653 (0.0005) | −18.9170 (≤0.0001) | −19.1680 (≤0.0001) | −22.0120 (≤0.0001) | −19.5420 (≤0.0001)
ICEEMDAN-BPNN | — | — | −18.8380 (≤0.0001) | −19.1650 (≤0.0001) | −21.9930 (≤0.0001) | −19.5200 (≤0.0001)
ICEEMDAN-RW | — | — | — | 2.6360 (0.0085) | 0.5129 (0.6081) | 4.1421 (≤0.0001)
LSSVR | — | — | — | — | −2.2124 (0.0271) | 3.9198 (≤0.0001)
BPNN | — | — | — | — | — | 4.3545 (≤0.0001)

USD/EUR, horizon 1
MICEEMDAN-WOA-RVFL | −8.8860 (≤0.0001) | −8.6031 (≤0.0001) | −4.4893 (≤0.0001) | −15.4300 (≤0.0001) | −15.4100 (≤0.0001) | −15.3440 (≤0.0001)
ICEEMDAN-LSSVR | — | −0.8819 (0.3780) | −4.4230 (≤0.0001) | −14.7210 (≤0.0001) | −14.7320 (≤0.0001) | −14.6280 (≤0.0001)
ICEEMDAN-BPNN | — | — | −4.4158 (≤0.0001) | −14.6310 (≤0.0001) | −14.6440 (≤0.0001) | −14.5310 (≤0.0001)
ICEEMDAN-RW | — | — | — | 3.3089 (0.0010) | 3.2912 (0.0010) | 3.3244 (0.0009)
LSSVR | — | — | — | — | −1.8732 (0.06132) | 2.9472 (0.0033)
BPNN | — | — | — | — | — | 3.4378 (0.0006)

USD/EUR, horizon 3
MICEEMDAN-WOA-RVFL | −10.1370 (≤0.0001) | −7.9933 (≤0.0001) | −5.7359 (≤0.0001) | −18.3380 (≤0.0001) | −18.6460 (≤0.0001) | −18.2090 (≤0.0001)
ICEEMDAN-LSSVR | — | −1.4797 (0.1392) | −5.5972 (≤0.0001) | −17.9030 (≤0.0001) | −18.1920 (≤0.0001) | −17.7730 (≤0.0001)
ICEEMDAN-BPNN | — | — | −5.6039 (≤0.0001) | −17.8370 (≤0.0001) | −18.1350 (≤0.0001) | −17.6910 (≤0.0001)
ICEEMDAN-RW | — | — | — | 3.0048 (0.0027) | 2.9552 (0.0032) | 3.0206 (0.0026)
LSSVR | — | — | — | — | −1.4077 (0.1595) | 1.5242 (0.1278)
BPNN | — | — | — | — | — | 2.1240 (0.0339)

USD/EUR, horizon 6
MICEEMDAN-WOA-RVFL | −11.1070 (≤0.0001) | −10.0570 (≤0.0001) | −7.4594 (≤0.0001) | −18.3010 (≤0.0001) | −18.4640 (≤0.0001) | −18.4940 (≤0.0001)
ICEEMDAN-LSSVR | — | 1.9379 (0.0529) | −7.0858 (≤0.0001) | −17.4490 (≤0.0001) | −17.5970 (≤0.0001) | −17.6370 (≤0.0001)
ICEEMDAN-BPNN | — | — | −7.1244 (≤0.0001) | −17.5850 (≤0.0001) | −17.7340 (≤0.0001) | −17.7690 (≤0.0001)
ICEEMDAN-RW | — | — | — | 2.5399 (0.0112) | 2.6203 (0.0089) | 2.6207 (0.0089)
LSSVR | — | — | — | — | 2.2946 (0.0220) | 2.5913 (0.0097)
BPNN | — | — | — | — | — | −0.2090 (0.8345)

IP, horizon 1
MICEEMDAN-WOA-RVFL | −3.4304 (0.0007) | −2.4253 (0.0098) | −3.4369 (0.0007) | −3.5180 (0.0005) | −4.1486 (≤0.0001) | −3.3215 (0.0015)
ICEEMDAN-LSSVR | — | −0.6714 (0.5026) | −3.2572 (0.0013) | −3.3188 (0.0010) | −4.0082 (≤0.0001) | −2.9573 (0.0034)
ICEEMDAN-BPNN | — | — | −3.6265 (0.0004) | −3.7455 (0.0002) | −4.6662 (≤0.0001) | −3.3790 (0.0008)
ICEEMDAN-RW | — | — | — | −0.7438 (0.4577) | −0.0502 (0.9600) | 2.6089 (0.0096)
LSSVR | — | — | — | — | 0.4094 (0.6826) | 3.4803 (0.0006)
BPNN | — | — | — | — | — | 1.6021 (0.1104)

IP, horizon 3
MICEEMDAN-WOA-RVFL | −4.5687 (≤0.0001) | −3.7062 (0.0003) | −5.9993 (≤0.0001) | −6.8919 (≤0.0001) | −5.5008 (≤0.0001) | −5.8037 (≤0.0001)
ICEEMDAN-LSSVR | — | 1.1969 (0.2325) | −5.5286 (≤0.0001) | −6.6325 (≤0.0001) | −5.2260 (≤0.0001) | −5.3260 (≤0.0001)
ICEEMDAN-BPNN | — | — | −5.7199 (≤0.0001) | −6.7722 (≤0.0001) | −5.2955 (≤0.0001) | −5.5099 (≤0.0001)
ICEEMDAN-RW | — | — | — | −5.6536 (≤0.0001) | −3.6721 (0.0003) | 1.7609 (0.0795)
LSSVR | — | — | — | — | 0.2479 (0.8044) | 6.3126 (≤0.0001)
BPNN | — | — | — | — | — | 3.9699 (≤0.0001)

IP, horizon 6
MICEEMDAN-WOA-RVFL | −5.9152 (≤0.0001) | −3.5717 (0.0004) | −5.9430 (≤0.0001) | −7.5672 (≤0.0001) | −10.8900 (≤0.0001) | −5.8450 (≤0.0001)
ICEEMDAN-LSSVR | — | −2.5163 (0.0125) | −5.8465 (≤0.0001) | −7.5185 (≤0.0001) | −10.7140 (≤0.0001) | −5.7483 (≤0.0001)
ICEEMDAN-BPNN | — | — | −5.6135 (≤0.0001) | −7.4370 (≤0.0001) | −10.3470 (≤0.0001) | −5.5254 (≤0.0001)
ICEEMDAN-RW | — | — | — | −7.6719 (≤0.0001) | −2.2903 (0.02287) | 0.8079 (0.4200)
LSSVR | — | — | — | — | 3.8137 (0.0002) | 7.7861 (≤0.0001)
BPNN | — | — | — | — | — | 2.3351 (0.0204)

SSEC, horizon 1
MICEEMDAN-WOA-RVFL | −4.0255 (≤0.0001) | −3.7480 (0.0004) | −7.9407 (≤0.0001) | −10.8090 (≤0.0001) | −10.3070 (≤0.0001) | −10.7330 (≤0.0001)
ICEEMDAN-LSSVR | — | −1.5083 (0.1317) | −7.8406 (≤0.0001) | −10.5410 (≤0.0001) | −10.0710 (≤0.0001) | −10.5020 (≤0.0001)
ICEEMDAN-BPNN | — | — | −7.7986 (≤0.0001) | −10.5470 (≤0.0001) | −10.0580 (≤0.0001) | −10.5030 (≤0.0001)
ICEEMDAN-RW | — | — | — | 3.5571 (0.0004) | 3.2632 (0.0011) | 3.5150 (0.0005)
LSSVR | — | — | — | — | −1.0197 (0.3081) | −0.6740 (0.5004)
BPNN | — | — | — | — | — | 0.8078 (0.4193)

SSEC, horizon 3
MICEEMDAN-WOA-RVFL | −3.8915 (0.0001) | −3.7312 (0.0002) | −10.5890 (≤0.0001) | −10.5480 (≤0.0001) | −10.7470 (≤0.0001) | −10.1260 (≤0.0001)
ICEEMDAN-LSSVR | — | −2.4145 (0.0159) | −10.5830 (≤0.0001) | −10.5240 (≤0.0001) | −10.7760 (≤0.0001) | −10.1110 (≤0.0001)
ICEEMDAN-BPNN | — | — | −10.2420 (≤0.0001) | −10.1860 (≤0.0001) | −10.4400 (≤0.0001) | −9.7595 (≤0.0001)
ICEEMDAN-RW | — | — | — | 3.4031 (0.0007) | 2.3198 (0.0205) | 3.1912 (0.0014)
LSSVR | — | — | — | — | −0.5931 (0.5532) | −1.2284 (0.2194)
BPNN | — | — | — | — | — | −0.0042 (0.9966)

SSEC, horizon 6
MICEEMDAN-WOA-RVFL | −2.9349 (0.0034) | −3.7544 (0.0002) | −10.2390 (≤0.0001) | −10.1970 (≤0.0001) | −11.0150 (≤0.0001) | −10.1630 (≤0.0001)
ICEEMDAN-LSSVR | — | −2.5830 (0.0099) | −10.1390 (≤0.0001) | −10.0940 (≤0.0001) | −10.9630 (≤0.0001) | −10.0690 (≤0.0001)
ICEEMDAN-BPNN | — | — | −9.6418 (≤0.0001) | −9.5562 (≤0.0001) | −10.4090 (≤0.0001) | −9.5122 (≤0.0001)
ICEEMDAN-RW | — | — | — | 2.3167 (0.0207) | 1.3656 (0.1723) | 2.2227 (0.02639)
LSSVR | — | — | — | — | −1.9027 (0.05728) | −0.5814 (0.5611)
BPNN | — | — | — | — | — | 1.5707 (0.1165)


WOA optimization can significantly enhance the prediction accuracy of economic and financial time series forecasting.

5. Discussion

To better investigate the proposed MICEEMDAN-WOA-RVFL, we further discuss the developed prediction model in this section, including a comparison of single and multiple decompositions, the optimization effectiveness of WOA, and the impact of ensemble size.

5.1. Comparison of Single Decomposition and Multiple Decompositions. One of the main novelties of this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and improve the prediction accuracy and stability of the developed forecasting model. To evaluate the effectiveness of the multiple decomposition strategy, we compare the prediction results of MICEEMDAN-WOA-RVFL and ICEEMDAN-WOA-RVFL. The former ensembles the prediction results of M (M = 100) individual ICEEMDAN decompositions with random parameters, while the latter employs only one ICEEMDAN decomposition. We randomly choose 5 out of these 100 decompositions and execute ICEEMDAN-WOA-RVFL for time series forecasting. Tables 7–9 report the MAPE, RMSE, and Dstat values of MICEEMDAN-WOA-RVFL and of the five ICEEMDAN-WOA-RVFL models using single decompositions, together with the corresponding mean values of these five single-decomposition models.

On the one hand, compared with the prediction results of the five single decompositions and their mean, the proposed MICEEMDAN-WOA-RVFL achieves the lowest MAPE and the highest Dstat values in all 12 cases, and the lowest RMSE values in 11 out of 12 cases, indicating that the multiple decomposition strategy can successfully overcome the randomness of single decompositions and improve ensemble prediction accuracy.

On the other hand, we can find that the multiple decomposition strategy greatly improves the stability of the prediction model. For example, the MAPE values of the five single-decomposition models with horizon 6 on the IP dataset range from 0.0034 to 0.1114, indicating that different single decompositions can produce considerably different prediction results. When we employ the multiple decomposition strategy, we can overcome the randomness of single decompositions and thus enhance prediction stability.

In summary, the experimental results suggest that the multiple decomposition strategy and prediction ensemble can effectively enhance prediction accuracy and stability. The main reasons for the improvement lie in three aspects: (1) the multiple decompositions reduce the randomness of any single decomposition and simultaneously generate groups of differentiated subseries; (2) predictions based on these groups of differentiated subseries yield diverse prediction results; and (3) the selection and ensemble of these diverse prediction results ensure both accuracy and diversity, thus improving the final ensemble prediction.

5.2. The Optimization Effectiveness of WOA. When we use RVFL networks to construct predictors, a number of parameters need to be set in advance. In this study, WOA is introduced to search for the optimal parameter values of the RVFL predictors using its powerful optimization ability. To investigate the optimization effectiveness of WOA for parameter search, we compare the proposed MICEEMDAN-WOA-RVFL with MICEEMDAN-RVFL without WOA optimization. Following the literature [45], we fixed the number of hidden neurons Nhe = 100, the activation function Func = sigmoid, and the random type Rand = Gaussian in MICEEMDAN-RVFL. The MAPE, RMSE, and Dstat values are reported in Tables 10–12, respectively.

On all four time series datasets, the MAPE and RMSE performance of the proposed MICEEMDAN-WOA-RVFL model is better than or equal to that of the MICEEMDAN-RVFL model without WOA optimization in all 12 cases, except for the RMSE value with horizon 1 on the SSEC dataset, as listed in Tables 10 and 11. In addition, MICEEMDAN-WOA-RVFL obtains higher Dstat values in 10 out of 12 cases, as can be seen in Table 12. All these results indicate that WOA can effectively search for the optimal parameter settings of the RVFL networks, further improving overall prediction performance.
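One common way to let a continuous optimizer like WOA search the mixed parameter space of Table 2 is to keep each search agent in a unit cube and decode it into concrete RVFL settings before evaluating the fitness (e.g., the training RMSE). The decoder below is a hypothetical illustration of that idea, not the authors' encoding; only a subset of the Table 2 parameters is shown.

```python
# Hypothetical decoder mapping a WOA position vector in [0, 1]^4 to a subset
# of the RVFL hyperparameters searched in this study (ranges from Table 2).
FUNCS = ["sigmoid", "sine", "hardlim", "tribas", "radbas", "sign"]

def decode_position(pos):
    """Map a continuous search-agent position to concrete RVFL settings."""
    nhe = 5 + int(round(pos[0] * 25))                      # Nhe in [5, 30]
    func = FUNCS[min(int(pos[1] * len(FUNCS)), len(FUNCS) - 1)]
    lag = 3 + int(round(pos[2] * 17))                      # Lag in [3, 20]
    scale = 0.1 + pos[3] * 0.9                             # Scale in [0.1, 1]
    return {"Nhe": nhe, "Func": func, "Lag": lag, "Scale": scale}
```

The fitness passed to WOA would then train an RVFL with the decoded settings and return its training RMSE, so that lower fitness means better parameters.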

5.3. The Impact of Ensemble Size. Previous research has demonstrated that an ensemble strategy using all individual prediction models is unlikely to work well and that the selection of individual prediction models contributes to improving ensemble prediction performance [57]. In this study, we sort all individual prediction models based on their past performance (RMSE values) and then select the top N percent, the ensemble size, to construct the ensemble prediction model. To further investigate the impact of ensemble size on ensemble prediction, we use different ensemble sizes (es = 10%, 20%, ..., 100%) to select the top N percent of individual forecasting models, develop the corresponding ensemble prediction models, and conduct the one-step-ahead forecasting experiment on the four time series datasets. The results are demonstrated in Figure 5.

We can see that MICEEMDAN-WOA-RVFL obtains the best forecasting performance on the WTI, IP, and SSEC datasets when the ensemble size es is in the range of 20%–40%, and on the USD/EUR dataset when the ensemble size es
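Ranking the individual models by training RMSE and keeping the top N percent can be sketched as follows (function name assumed):

```python
import numpy as np

def select_top_models(train_rmses, ensemble_size=0.3):
    """Return the indices of the top `ensemble_size` fraction of individual
    models ranked by training RMSE (lower is better); the survivors are then
    combined with the RMSE-weighted rule of Eq. (7)."""
    train_rmses = np.asarray(train_rmses, dtype=float)
    k = max(1, int(round(len(train_rmses) * ensemble_size)))
    return np.argsort(train_rmses)[:k]
```

Selecting only the strongest individual decompositions trades some diversity for accuracy, which is why an intermediate ensemble size tends to outperform both extremes.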


Table 7: The mean absolute percent error (MAPE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.0036 | 0.0043 | 0.0045 | 0.0045 | 0.0041 | 0.0040 | 0.0043
WTI | 3 | 0.0080 | 0.0088 | 0.0084 | 0.0084 | 0.0082 | 0.0083 | 0.0084
WTI | 6 | 0.0113 | 0.0132 | 0.0115 | 0.0113 | 0.0118 | 0.0120 | 0.0112
USD/EUR | 1 | 0.0006 | 0.0011 | 0.0010 | 0.0008 | 0.0010 | 0.0008 | 0.0009
USD/EUR | 3 | 0.0015 | 0.0021 | 0.0021 | 0.0015 | 0.0017 | 0.0015 | 0.0018
USD/EUR | 6 | 0.0022 | 0.0033 | 0.0022 | 0.0023 | 0.0023 | 0.0024 | 0.0025
IP | 1 | 0.0012 | 0.0016 | 0.0016 | 0.0017 | 0.0017 | 0.0015 | 0.0016
IP | 3 | 0.0023 | 0.0024 | 0.0026 | 0.0030 | 0.0028 | 0.0024 | 0.0026
IP | 6 | 0.0032 | 0.0036 | 0.0034 | 0.1114 | 0.0036 | 0.0041 | 0.0252
SSEC | 1 | 0.0020 | 0.0024 | 0.0023 | 0.0026 | 0.0023 | 0.0023 | 0.0024
SSEC | 3 | 0.0044 | 0.0045 | 0.0046 | 0.0052 | 0.0045 | 0.0047 | 0.0047
SSEC | 6 | 0.0065 | 0.0067 | 0.0070 | 0.0085 | 0.0068 | 0.0075 | 0.0073

Table 8: The root mean squared error (RMSE) values of single decomposition and multiple decompositions. (MWR = MICEEMDAN-WOA-RVFL; IWR-k = ICEEMDAN-WOA-RVFLk.)

Dataset  Horizon  MWR      IWR-1    IWR-2    IWR-3    IWR-4    IWR-5    Mean
WTI      1        0.2715   0.3316   0.3376   0.3352   0.3191   0.3148   0.3277
WTI      3        0.5953   0.6557   0.6214   0.6200   0.6058   0.6133   0.6232
WTI      6        0.8146   0.9490   0.8239   0.8199   0.8508   0.8597   0.8607
USDEUR   1        0.0009   0.0017   0.0017   0.0012   0.0015   0.0012   0.0015
USDEUR   3        0.0022   0.0031   0.0034   0.0023   0.0025   0.0023   0.0027
USDEUR   6        0.0033   0.0048   0.0033   0.0035   0.0035   0.0036   0.0037
IP       1        0.2114   0.2894   0.3080   0.3656   0.3220   0.2589   0.3088
IP       3        0.3875   0.4098   0.4507   0.4809   0.4776   0.4027   0.4443
IP       6        0.5340   0.5502   0.5563   1.0081   0.5697   0.6466   0.6661
SSEC     1        10.8115  11.5515  13.0728  16.7301  12.2465  14.3671  13.5936
SSEC     3        22.2983  22.0603  22.8140  33.9937  23.9755  25.8004  25.7288
SSEC     6        35.7201  37.0255  41.9033  55.5066  38.6140  48.2131  44.2525

Table 9: The directional statistic (Dstat) values of single decomposition and multiple decompositions. (MWR = MICEEMDAN-WOA-RVFL; IWR-k = ICEEMDAN-WOA-RVFLk.)

Dataset  Horizon  MWR     IWR-1   IWR-2   IWR-3   IWR-4   IWR-5   Mean
WTI      1        0.9381  0.9201  0.9161  0.9184  0.9323  0.9271  0.9228
WTI      3        0.8576  0.8414  0.8548  0.8385  0.8553  0.8513  0.8483
WTI      6        0.7737  0.7419  0.7703  0.7668  0.7627  0.7616  0.7607
USDEUR   1        0.941   0.9008  0.9270  0.9204  0.9073  0.9157  0.9142
USDEUR   3        0.8502  0.8052  0.8418  0.8399  0.8296  0.8427  0.8318
USDEUR   6        0.7828  0.6826  0.7809  0.7819  0.7772  0.7706  0.7586
IP       1        0.9256  0.8967  0.8926  0.9132  0.8967  0.9174  0.9033
IP       3        0.8802  0.8760  0.8430  0.8223  0.8471  0.8802  0.8537
IP       6        0.8141  0.8017  0.8017  0.7066  0.7851  0.8017  0.7794
SSEC     1        0.9156  0.9052  0.9052  0.9052  0.9087  0.9135  0.9076
SSEC     3        0.8396  0.8222  0.8292  0.8208  0.8368  0.8264  0.8271
SSEC     6        0.7587  0.7462  0.7441  0.7134  0.7594  0.7448  0.7416

Table 10: The mean absolute percent error (MAPE) values with and without WOA optimization.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  MICEEMDAN-RVFL
WTI      1        0.0036              0.0037
WTI      3        0.0080              0.0082
WTI      6        0.0113              0.0118
USDEUR   1        0.0006              0.0006
USDEUR   3        0.0015              0.0015
USDEUR   6        0.0022              0.0023
IP       1        0.0012              0.0013
IP       3        0.0023              0.0023
IP       6        0.0032              0.0036
SSEC     1        0.0020              0.0020
SSEC     3        0.0044              0.0044
SSEC     6        0.0065              0.0066


Table 11: The root mean squared error (RMSE) values with and without WOA optimization.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  MICEEMDAN-RVFL
WTI      1        0.2715              0.2824
WTI      3        0.5953              0.6079
WTI      6        0.8146              0.8485
USDEUR   1        0.0009              0.0010
USDEUR   3        0.0022              0.0022
USDEUR   6        0.0033              0.0034
IP       1        0.2114              0.2362
IP       3        0.3875              0.4033
IP       6        0.5340              0.5531
SSEC     1        10.8115             10.7616
SSEC     3        22.2983             22.7456
SSEC     6        35.7201             36.3150

Table 12: The directional statistic (Dstat) values with and without WOA optimization.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  MICEEMDAN-RVFL
WTI      1        0.9381              0.9358
WTI      3        0.8576              0.8565
WTI      6        0.7737              0.7656
USDEUR   1        0.941               0.9317
USDEUR   3        0.8502              0.8455
USDEUR   6        0.7828              0.7762
IP       1        0.9256              0.9174
IP       3        0.8802              0.8430
IP       6        0.8141              0.7893
SSEC     1        0.9156              0.9191
SSEC     3        0.8396              0.8351
SSEC     6        0.7587              0.7594

[Figure 5, panels (a)–(f): MAPE against ensemble size (10–100%) for WTI (a), USDEUR (b), IP (c), and SSEC (d), and RMSE against ensemble size for WTI (e) and USDEUR (f).]


is in the range of 10–40, in terms of MAPE, RMSE, and Dstat. When the es is greater than 40, the MAPE and RMSE values continue to worsen and become the worst when the ensemble size grows to 100. The experimental results indicate that the ensemble size has an overall significant impact on ensemble prediction and that an ideal range of ensemble size is about 20 to 40.

6. Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. The MICEEMDAN-WOA-RVFL first uses ICEEMDAN to separate the original economic and financial time series into groups of subseries multiple times, and then RVFL networks are used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve the prediction accuracy. Thirdly, the predictions of the subseries in each decomposition are integrated into the forecasting results of that decomposition using addition. Finally, the prediction results of each decomposition are selected based on RMSE values and are combined as the final prediction results.

As far as we know, this is the first time that WOA has been employed for the optimal parameter search for RVFL networks and that the multiple decomposition strategy has been introduced in time series forecasting. The empirical results indicate that

(1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy in various economic and financial time series forecasting; (2) WOA can effectively search optimal parameters for RVFL networks and improve the prediction performance of economic and financial time series forecasting; and (3) the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and enhance the prediction accuracy and stability of the developed prediction model.

We will extend our study in two aspects in the future: (1) applying the MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble method of individual forecasting models to further enhance the prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and

[Figure 5, panels (g)–(l): RMSE against ensemble size (10–100%) for IP (g) and SSEC (h), and Dstat against ensemble size for WTI (i), USDEUR (j), IP (k), and SSEC (l).]

Figure 5: The impact of ensemble size with one-step-ahead forecasting.


the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365–374, 2017.
[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.
[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831–848, 2005.
[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192–196, IEEE, Warsaw, Poland, September 2005.
[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?" Journal of International Economics, vol. 60, no. 1, pp. 85–107, 2003.
[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7–18, 2014.
[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13–29, 2014.
[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331–350, 2010.
[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201–1208, 2008.
[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111–117, 2010.
[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1–9, 2013.
[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97–111, 2018.
[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28–39, 2014.
[14] X. Li, H. Xie, R. Wang et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67–78, 2016.
[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22–42, 2018.
[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457–480, 2018.
[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27–38, 2014.
[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN, and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284–308, 2014.
[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.
[20] S. Lahmiri, "A variational mode decomposition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268–273, 2016.
[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228–1244, 2013.
[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603–3628, 2019.
[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235–6250, 2014.
[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164–178, 2016.
[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754–762, 2016.
[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1–13, 2007.
[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903–995, 1998.
[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1–41, 2009.
[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144–4147, IEEE, Prague, Czech Republic, May 2011.


[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140–165, 2020.
[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882–1905, 2018.
[32] T. Li, M. Zhou, C. Guo et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.
[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19–29, 2014.
[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852–1872, 2020.
[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.
[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205–1225, 2019.
[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1–10, 2020.
[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.
[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1–15, 2018.
[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm-Multi-Objective Whale Optimization Algorithm-for wind speed forecasting," Applied Energy, vol. 208, pp. 344–360, 2017.
[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250–260, 2019.
[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163–180, 1994.
[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094–1105, 2016.
[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078–1093, 2016.
[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097–1108, 2018.
[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319–340, 2019.
[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5–24, 2020.
[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.
[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983–994, 2020.
[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49–58, 2019.
[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.
[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165–4172, 2019.
[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319–7327, 2020.
[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525–541, 2018.
[55] FRED Website: https://fred.stlouisfed.org.
[56] NetEase Website: http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.
[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31–53, 2006.



(ii) Stage 2: individual forecasting. Each decomposed subseries is divided into a training dataset and a testing dataset; then, the RVFL network with WOA optimization is developed on each training dataset independently, and finally, the developed RVFL model is applied to each testing dataset. The reason why we select the RVFL network as the predictor is its powerful forecasting ability in extant research [15, 34, 44]. Since the parameter setting of the RVFL network plays an important role in the prediction performance, we introduce WOA to seek the optimal parameter values for the RVFL network in the forecasting stage.

(iii) Stage 3: ensemble. The predictions of all the decomposed subseries are aggregated as the final prediction results of one "decomposition and ensemble" using addition aggregation.
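Stages 2 and 3 can be sketched as follows; `make_model` is a hypothetical placeholder for the WOA-optimized RVFL predictor, and the naive last-value model in the usage example is ours, used only to keep the sketch self-contained.

```python
import numpy as np

def forecast_one_decomposition(subseries_train, subseries_test_inputs, make_model):
    """One 'decomposition and ensemble' pass: fit one predictor per decomposed
    subseries (Stage 2), then sum the subseries forecasts (Stage 3, addition
    aggregation). `make_model(train)` returns a fitted callable predictor."""
    total = None
    for train, test_x in zip(subseries_train, subseries_test_inputs):
        model = make_model(train)                      # Stage 2: fit on training subseries
        pred = np.asarray(model(test_x), dtype=float)  # predict the testing subseries
        total = pred if total is None else total + pred
    return total                                       # Stage 3: additive ensemble

# usage with a deliberately naive per-subseries model (predicts the last training value)
make_naive = lambda train: (lambda xs: np.full(len(xs), train[-1], dtype=float))
summed = forecast_one_decomposition([[1.0, 2.0, 3.0], [10.0, 20.0, 30.0]],
                                    [[0, 0], [0, 0]], make_naive)
```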

In this study, to enhance both accuracy and stability of the final prediction, we generate random values for the decomposition parameters of ICEEMDAN in the decomposition stage, including the number of realizations (Nr), the noise standard deviation (Nsd), and the maximum number of sifting iterations (Maxsi), for each decomposition; repeat the procedure of "decomposition and ensemble" M times; and finally combine all the results of the multiple "decompositions and ensembles" using the RMSE-weighted method as the final prediction results. The corresponding weight of the ith forecasting model is as follows:

Weight_i = (1/RMSE_i) / (∑_{j=1}^{M} (1/RMSE_j)),    (7)

where M denotes the number of individual models and RMSE_i indicates the RMSE value of the ith forecasting model in the training process.
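A minimal sketch of this RMSE-weighted combination (Eq. (7)), with illustrative variable names:

```python
import numpy as np

def rmse_weights(rmse_values):
    """Eq. (7): Weight_i = (1/RMSE_i) / sum_j (1/RMSE_j), so runs with lower
    training RMSE receive larger weights, and the weights sum to one."""
    inv = 1.0 / np.asarray(rmse_values, dtype=float)
    return inv / inv.sum()

# three "decomposition and ensemble" runs with training RMSEs 0.2, 0.4, 0.4
w = rmse_weights([0.2, 0.4, 0.4])                      # -> [0.5, 0.25, 0.25]
preds = np.array([[10.0, 11.0], [12.0, 13.0], [14.0, 15.0]])
final = w @ preds                                      # weighted final prediction
```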

[Figure 3: the flowchart shows the economic and financial time series fed into m ICEEMDAN decompositions (ICEEMDAN1 ... ICEEMDANm), each with its own randomized parameters and each producing IMF1 ... IMFn and a residue; every subseries is forecast by a WOA-optimized RVFL network, the subseries predictions are combined by addition within each decomposition, and the per-decomposition predicted results are ensembled into the final predicted results.]

Figure 3: The flowchart of the proposed MICEEMDAN-WOA-RVFL.

[Figure 2: the network comprises input neurons i1 ... iN, hidden neurons h1 ... hM, and output neurons o1 ... oL, with weights w_nm and biases b_m into the hidden layer, weights w_ml from the hidden layer to the output layer, and direct input-output links w_nl.]

Figure 2: The architecture of the random vector functional link (RVFL) neural network.


Although some recent studies also employ the RVFL network for time series forecasting, they obviously differ from the current study in the decomposition technique and network optimization: (1) they divide the original time series using WD or EMD; (2) they construct RVFL networks using fixed parameter values. Unlike the previous studies, our study decomposes economic and financial time series using ICEEMDAN and searches the optimal parameter values of RVFL networks based on WOA. Furthermore, the previous research mainly focuses on dividing original time series by one single decomposition or dual decomposition [20, 35, 54]. In dual decomposition, the original signal is first decomposed into several components, and then the high-frequency components continue to be decomposed into other components using the same or a different decomposition method. Essentially, the dual decomposition process belongs to one decomposition, just including two different decomposition stages. Unlike the previous research, one main improvement in this study is the multiple decomposition strategy, which can successfully overcome the randomness of one single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach. To our knowledge, it is the first time that the multiple decomposition strategy has been developed for the forecasting of economic and financial time series.

4. Experimental Results

4.1. Data Description. As we know, economic and financial time series are influenced by various factors, sometimes rising and dropping in a short time. The dramatic fluctuations usually lead to significant nonlinearity and nonstationarity of the time series. To comprehensively evaluate the effectiveness of the proposed MICEEMDAN-WOA-RVFL, we choose four different time series, including the West Texas Intermediate crude oil spot price (WTI), US dollar/Euro foreign exchange rate (USDEUR), US industrial production (IP), and Shanghai stock exchange composite index (SSEC), as the experimental datasets. The first three datasets can be accessed via the website of St. Louis Fed Research [55], and the last one can be obtained via the website of NetEase [56].

Each time series is separated into two subdatasets: the first 80% for training and the last 20% for testing. Table 1 shows the divided samples of the abovementioned four economic and financial time series.

We utilize ICEEMDAN to decompose these time series into groups of relatively simple subseries. Figure 4 offers an example of the decomposition of the WTI dataset using ICEEMDAN.

4.2. Evaluation Indices. In this study, we use four evaluation metrics, including the mean absolute percent error (MAPE), the root mean squared error (RMSE), the directional statistic (Dstat), and the Diebold–Mariano (DM) test, to assess the performance of the proposed model. Among them, MAPE and RMSE are used to assess the forecasting error, defined as follows:

MAPE = (1/N) ∑_{t=1}^{N} |(O_t − P_t)/O_t| × 100,    (8)

RMSE = sqrt((1/N) ∑_{t=1}^{N} (O_t − P_t)^2),    (9)

where N is the size of the evaluated samples, O_t denotes the actual values, and P_t represents the predicted values at time t. The lower the values of RMSE and MAPE, the better the prediction models.

The Dstat indicates the performance of direction prediction, which is formulated as follows:

Dstat = (1/N) ∑_{i=1}^{N} d_i × 100,    (10)

where d_i = 1 if (P_{t+1} − O_t)(O_{t+1} − O_t) ≥ 0; otherwise, d_i = 0. A higher value of Dstat indicates a more accurate direction prediction.
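The three metrics can be implemented directly from Eqs. (8)–(10). In this sketch (our naming), the direction statistic is evaluated over the N − 1 consecutive pairs of observations:

```python
import numpy as np

def mape(o, p):
    """Eq. (8): mean absolute percent error between actuals o and predictions p."""
    o, p = np.asarray(o, dtype=float), np.asarray(p, dtype=float)
    return float(np.mean(np.abs((o - p) / o)) * 100)

def rmse(o, p):
    """Eq. (9): root mean squared error."""
    o, p = np.asarray(o, dtype=float), np.asarray(p, dtype=float)
    return float(np.sqrt(np.mean((o - p) ** 2)))

def dstat(o, p):
    """Eq. (10): percentage of correctly predicted directions; d = 1 when the
    predicted one-step change and the actual one-step change share a sign."""
    o, p = np.asarray(o, dtype=float), np.asarray(p, dtype=float)
    d = (p[1:] - o[:-1]) * (o[1:] - o[:-1]) >= 0
    return float(d.mean() * 100)
```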

Furthermore, to test the significance of the prediction performance of pairs of models, we employ the Diebold–Mariano (DM) test in this study.

4.3. Experimental Settings. In this study, we compare the proposed MICEEMDAN-WOA-RVFL with several state-of-the-art forecasting models, including single models and ensemble models. Among all these models, the single models include one popular statistical model, RW, and two popular AI models, BPNN and least squares SVR (LSSVR). The ensemble models derive from the combination of the single models and the decomposition method ICEEMDAN.

The detailed parameters of all prediction models, the decomposition approach ICEEMDAN, and the optimization method WOA in the experiments are shown in Table 2. The parameter values of BPNN, LSSVR, RVFL, and ICEEMDAN refer to the previous literature [22, 34, 45].

All experiments were conducted using Matlab R2019b on a PC with 64-bit Microsoft Windows 10, 8 GB RAM, and a 1.8 GHz i7-8565U CPU.

4.4. Results and Analysis. We compare the forecasting performance of six prediction models, including three single models (RW, LSSVR, and BPNN) and three ensemble models (ICEEMDAN-RW, ICEEMDAN-LSSVR, and ICEEMDAN-BPNN), with that of our proposed MICEEMDAN-WOA-RVFL in terms of MAPE, RMSE, and Dstat. Due to the different horizons, we train different forecasting models separately; that is to say, we use the proposed scheme for different horizons to train different models. Tables 3–5 report the experimental results in terms of each evaluation index with 1-, 3-, and 6-horizon, respectively.

From Table 3, we can see that the proposed MICEEMDAN-WOA-RVFL obtains the lowest (the best) MAPE values with all the horizons in all the datasets. RW obtains the best MAPE values with all the horizons in all the datasets among all the compared single models,


[Figure 4: the WTI crude oil price series (January 2, 1986 to April 15, 2020) and its decomposition by ICEEMDAN into IMF1 through IMF11 plus a residue.]

Figure 4: The WTI crude oil price series and the corresponding decomposed subseries by ICEEMDAN.

Table 1: Samples of economic and financial time series.

Dataset  Type     Subset        Size  Date range
WTI      Daily    Sample set    8641  2 January 1986 – 15 April 2020
                  Training set  6912  2 January 1986 – 24 May 2013
                  Testing set   1729  28 May 2013 – 15 April 2020
USDEUR   Daily    Sample set    5341  4 January 1999 – 10 April 2020
                  Training set  4272  4 January 1999 – 30 December 2015
                  Testing set   1069  31 December 2015 – 10 April 2020
IP       Monthly  Sample set    1215  January 1919 – March 2020
                  Training set  972   January 1919 – December 1999
                  Testing set   243   January 2000 – March 2020
SSEC     Daily    Sample set    7172  19 December 1990 – 21 April 2020
                  Training set  5737  19 December 1990 – 4 June 2014
                  Testing set   1435  5 June 2014 – 21 April 2020


demonstrating that it is better than LSSVR and BPNN for the forecasting of economic and financial time series. Of all the ensemble models, the ICEEMDAN-LSSVR model and the ICEEMDAN-BPNN model obtain close MAPE values, obviously better than the ICEEMDAN-RW.

The RMSE values of the four time series datasets are listed in Table 4. From this table, we can find that the proposed MICEEMDAN-WOA-RVFL outperforms all the single and ensemble models with all the horizons in all the datasets. The statistical model RW obtains the best RMSE values in 10 out of 12 cases, demonstrating that it is more suitable for economic and financial time series forecasting than LSSVR and BPNN among all the single models. As to the ensemble models, the proposed MICEEMDAN-WOA-RVFL obtains lower RMSE values than the compared ensemble models, demonstrating that the former is more effective for economic and financial time series forecasting.

Table 5 shows the directional statistic Dstat values, and we can see that the MICEEMDAN-WOA-RVFL achieves the highest Dstat values in all the 12 cases, indicating that it has better performance of direction forecasting. Amongst the single prediction models, LSSVR and RW obtain the best Dstat values in 5 cases each, better than the BPNN. Similarly, the ICEEMDAN-LSSVR model and the ICEEMDAN-BPNN model obtain close Dstat values, obviously better than the ICEEMDAN-RW model in all the 12 cases.

From all the prediction results, we can find that all the ensemble prediction models except ICEEMDAN-RW greatly outperform the corresponding single prediction models in all the 12 cases, showing that the framework of decomposition and ensemble is an effective tool for improving the forecasting

Table 2: The settings for the parameters.

ICEEMDAN
  Nsd = 0.2        Noise standard deviation
  Nr = 100         Number of realizations
  Maxsi = 5000     Maximum number of sifting iterations

LSSVR
  Rp ∈ {2^-10, 2^-9, ..., 2^11, 2^12}      Regularization parameter
  WidRBF ∈ {2^-10, 2^-9, ..., 2^11, 2^12}  Width of the RBF kernel

BPNN
  Nhe = 10         Number of hidden neurons
  Maxte = 1000     Maximum training epochs
  Lr = 0.0001      Learning rate

WOA
  Pop = 40         Population size
  Maxgen = 100     Maximum generation

MICEEMDAN-WOA-RVFL (search ranges)
  Nsd ∈ [0.01, 0.4]      Noise standard deviation in ICEEMDAN
  Nr ∈ [50, 500]         Number of realizations in ICEEMDAN
  Maxsi ∈ [2000, 8000]   Maximum number of sifting iterations in ICEEMDAN
  Nhe ∈ [5, 30]          Number of hidden neurons in RVFL
  Func ∈ {sigmoid, sine, hardlim, tribas, radbas, sign}   Activation function in RVFL
  Mod ∈ {1: regularized least squares, 2: Moore–Penrose pseudoinverse}   Mode in RVFL
  Lag ∈ [3, 20]          Lag in RVFL
  Bias ∈ {true, false}   Bias in RVFL
  Rand ∈ {1: Gaussian, 2: Uniform}   Random type in RVFL
  Scale ∈ [0.1, 1]       Scale value in RVFL
  ScaleMode ∈ {1: scale the features for all neurons, 2: scale the features for each hidden neuron, 3: scale the range of the randomization for uniform distribution}   Scale mode in RVFL
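One way the randomized ICEEMDAN parameters could be drawn from the search ranges in Table 2; the paper does not state the sampling distribution, so the uniform sampling below is an assumption, and the function name is ours.

```python
import random

def draw_iceemdan_params(rng=random):
    """Draw one random ICEEMDAN configuration from the Table 2 search ranges;
    each of the M decompositions would use its own independent draw."""
    return {
        "Nsd": rng.uniform(0.01, 0.4),      # noise standard deviation
        "Nr": rng.randint(50, 500),         # number of realizations (inclusive bounds)
        "Maxsi": rng.randint(2000, 8000),   # maximum number of sifting iterations
    }

random.seed(0)                              # seeded only to make the sketch repeatable
configs = [draw_iceemdan_params() for _ in range(5)]  # e.g., M = 5 decompositions
```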

Table 3: The mean absolute percent error (MAPE) values of different prediction models. (MWR = MICEEMDAN-WOA-RVFL; I-x = ICEEMDAN-x.)

Dataset  Horizon  MWR     RW      LSSVR   BPNN    I-RW    I-LSSVR  I-BPNN
WTI      1        0.0036  0.0176  0.0177  0.0182  0.0200  0.0044   0.0055
WTI      3        0.0080  0.0307  0.0309  0.0320  0.0331  0.0092   0.0094
WTI      6        0.0113  0.0441  0.0449  0.0457  0.0458  0.1202   0.0130
USDEUR   1        0.0006  0.0034  0.0034  0.0034  0.0044  0.0010   0.0010
USDEUR   3        0.0015  0.0063  0.0063  0.0063  0.0071  0.0020   0.0021
USDEUR   6        0.0022  0.0087  0.0088  0.0087  0.0094  0.0032   0.0031
IP       1        0.0012  0.0049  0.0057  0.0056  0.0053  0.0024   0.0022
IP       3        0.0023  0.0102  0.0140  0.0123  0.0104  0.0033   0.0033
IP       6        0.0032  0.0183  0.0279  0.0235  0.0184  0.0043   0.0049
SSEC     1        0.0020  0.0096  0.0096  0.0097  0.0111  0.0026   0.0026
SSEC     3        0.0044  0.0178  0.0178  0.0179  0.0185  0.0047   0.0051
SSEC     6        0.0065  0.0259  0.0259  0.0269  0.0268  0.0072   0.0074


performance. Of all the compared models, the proposed MICEEMDAN-WOA-RVFL obtains the highest Dstat values and the lowest MAPE and RMSE values in all the time series datasets, showing that it is completely superior to the benchmark prediction models. Furthermore, for each prediction model, the MAPE and RMSE values increase while the Dstat values decrease with the horizon. This demonstrates that it is easier to forecast time series with a short horizon than with a long one. It is worth noting that the proposed MICEEMDAN-WOA-RVFL still achieves relatively good MAPE and RMSE with the increase of horizon among the compared models. For example, when the proposed model obtains 0.7737 Dstat with horizon 6 in the WTI dataset, it achieves relatively low MAPE (0.0113) and RMSE (0.8146), indicating that the prediction values are very close to the real values although the proposed model misses the direction in about 22.63% of cases. In other words, the proposed MICEEMDAN-WOA-RVFL can still achieve satisfactory forecasting accuracy with a long horizon.

Furthermore we can find that the multiple decompo-sition strategy does not improve the forecast for the RWmodel One possible explanation is simply that RW infersthat the past movement or trend of a time series cannot beused to predict its future movement and thus it cannot takeadvantage of the historical data and the diversity of multipledecomposition to make the future prediction )ereforewhen we aggregate the multiple RWmodel predictions of allthe decomposed subseries we just integrate several randompredictions and thus cannot significantly improve theensemble prediction results In contrast the LSSVR and

BPNN, as well as RVFL, can fully use all the historical data and the diversity of multiple decompositions to make the future prediction. Specifically, the multiple decompositions using different parameters generate many groups of different decomposed subseries, and this diversity of decomposition can successfully overcome the randomness of one single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach.

In addition, the Diebold–Mariano (DM) test is utilized to evaluate whether the forecasting accuracy of the proposed MICEEMDAN-WOA-RVFL significantly outperforms those of the other compared models. Table 6 shows the statistics and p values (in brackets).

On one hand, the DM statistical values between the ensemble prediction models and their corresponding single predictors are much lower than zero, and the corresponding p values are almost equal to zero with all the horizons except for the RW model, showing that the architecture of “decomposition and ensemble” contributes to greatly improving prediction accuracy and that the combination of ICEEMDAN and AI predictors is more effective for economic and financial time series forecasting.

On the other hand, the DM test results on the predictions of all the four time series datasets indicate that the MICEEMDAN-WOA-RVFL is significantly better than the single models and the other ensemble models with all the horizons, and the corresponding p values are much lower than 0.01 in all the cases.
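The DM statistic itself can be sketched as follows, assuming squared-error loss and the one-step-ahead (h = 1) case with no autocovariance correction; the paper does not publish its implementation, so this is illustrative only:

```python
import numpy as np
from math import erf, sqrt

def dm_test(actual, pred1, pred2):
    # Diebold-Mariano statistic for h = 1 with squared-error loss.
    # Negative values mean model 1 has the smaller loss than model 2.
    actual = np.asarray(actual, float)
    d = (np.asarray(pred1, float) - actual) ** 2 \
        - (np.asarray(pred2, float) - actual) ** 2   # loss differential series
    t = len(d)
    dbar = d.mean()
    gamma0 = np.mean((d - dbar) ** 2)                # lag-0 autocovariance of d
    stat = dbar / np.sqrt(gamma0 / t)
    # Two-sided p value under the asymptotic standard normal distribution
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(stat) / sqrt(2))))
    return stat, p_value
```

For horizons greater than 1, the variance estimate would additionally include autocovariances of the loss differential up to lag h − 1.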

In summary, the DM test results demonstrate that the combination of multiple ICEEMDANs, RVFL networks, and

Table 4. The root mean squared error (RMSE) values of different prediction models.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN
WTI | 1 | 0.2715 | 1.3196 | 1.3228 | 1.3744 | 1.8134 | 0.3467 | 0.4078
WTI | 3 | 0.5953 | 2.1784 | 2.1929 | 2.2867 | 2.5041 | 0.6754 | 0.7620
WTI | 6 | 0.8146 | 3.0737 | 3.1271 | 3.1845 | 3.2348 | 0.8692 | 0.9462
USDEUR | 1 | 0.0009 | 0.0051 | 0.0052 | 0.0052 | 0.0099 | 0.0015 | 0.0016
USDEUR | 3 | 0.0022 | 0.0091 | 0.0092 | 0.0092 | 0.0129 | 0.0031 | 0.0031
USDEUR | 6 | 0.0033 | 0.0127 | 0.0128 | 0.0127 | 0.0154 | 0.0047 | 0.0046
IP | 1 | 0.2114 | 0.7528 | 0.8441 | 0.8230 | 0.8205 | 0.3861 | 0.4175
IP | 3 | 0.3875 | 1.4059 | 1.8553 | 1.8311 | 1.4401 | 0.5533 | 0.5294
IP | 6 | 0.5340 | 2.4459 | 3.6015 | 2.8116 | 2.4569 | 0.6408 | 0.8264
SSEC | 1 | 10.8115 | 50.0742 | 49.8800 | 50.8431 | 63.3951 | 12.6460 | 13.7928
SSEC | 3 | 22.2983 | 89.1834 | 88.5290 | 89.1781 | 93.2887 | 25.2526 | 29.7784
SSEC | 6 | 35.7201 | 131.6386 | 131.3487 | 133.5647 | 137.2598 | 40.4357 | 49.6341

Table 5. The directional statistic (Dstat) values of different prediction models.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN
WTI | 1 | 0.9381 | 0.4815 | 0.5243 | 0.5191 | 0.4977 | 0.9190 | 0.9097
WTI | 3 | 0.8576 | 0.4873 | 0.5087 | 0.5012 | 0.4907 | 0.8300 | 0.8449
WTI | 6 | 0.7737 | 0.5116 | 0.4902 | 0.4948 | 0.5185 | 0.7714 | 0.7575
USDEUR | 1 | 0.9410 | 0.5019 | 0.4719 | 0.4897 | 0.5056 | 0.8998 | 0.9026
USDEUR | 3 | 0.8502 | 0.4888 | 0.4897 | 0.4925 | 0.4916 | 0.8118 | 0.8024
USDEUR | 6 | 0.7828 | 0.4991 | 0.5318 | 0.5253 | 0.5047 | 0.7023 | 0.7154
IP | 1 | 0.9256 | 0.5579 | 0.5909 | 0.5785 | 0.5620 | 0.8636 | 0.8760
IP | 3 | 0.8802 | 0.6983 | 0.5455 | 0.4876 | 0.6529 | 0.8430 | 0.8058
IP | 6 | 0.8141 | 0.6198 | 0.5537 | 0.5661 | 0.6405 | 0.7355 | 0.7645
SSEC | 1 | 0.9156 | 0.4944 | 0.5049 | 0.5098 | 0.4979 | 0.9024 | 0.9002
SSEC | 3 | 0.8396 | 0.5042 | 0.5021 | 0.5007 | 0.4965 | 0.8222 | 0.8145
SSEC | 6 | 0.7587 | 0.4833 | 0.5063 | 0.4965 | 0.4805 | 0.7448 | 0.7455


Table 6. Results of the Diebold–Mariano (DM) test. Each cell reports the DM statistic of the tested model (row) against the compared model (column), with p values in brackets.

Dataset | Horizon | Tested model | ICEEMDAN-LSSVR | ICEEMDAN-BPNN | ICEEMDAN-RW | LSSVR | BPNN | RW
WTI | 1 | MICEEMDAN-WOA-RVFL | -4.9081 (≤0.0001) | -7.0030 (≤0.0001) | -6.1452 (≤0.0001) | -10.7240 (≤0.0001) | -10.3980 (≤0.0001) | -10.6210 (≤0.0001)
WTI | 1 | ICEEMDAN-LSSVR | | -3.5925 (0.0003) | -6.0620 (≤0.0001) | -10.5640 (≤0.0001) | -10.3680 (≤0.0001) | -10.4660 (≤0.0001)
WTI | 1 | ICEEMDAN-BPNN | | | -5.9718 (≤0.0001) | -10.2070 (≤0.0001) | -9.9529 (≤0.0001) | -10.0960 (≤0.0001)
WTI | 1 | ICEEMDAN-RW | | | | 3.0776 (0.0021) | 2.7682 (0.0057) | 3.0951 (0.0020)
WTI | 1 | LSSVR | | | | | -1.8480 (0.0648) | 1.0058 (0.3147)
WTI | 1 | BPNN | | | | | | 1.9928 (0.0464)
WTI | 3 | MICEEMDAN-WOA-RVFL | -4.9136 (≤0.0001) | -3.6146 (0.0003) | -10.8660 (≤0.0001) | -16.5990 (≤0.0001) | -15.6260 (≤0.0001) | -16.6720 (≤0.0001)
WTI | 3 | ICEEMDAN-LSSVR | | -2.1143 (0.0346) | -10.7560 (≤0.0001) | -16.7020 (≤0.0001) | -15.6780 (≤0.0001) | -16.7760 (≤0.0001)
WTI | 3 | ICEEMDAN-BPNN | | | -10.5250 (≤0.0001) | -16.2160 (≤0.0001) | -15.3500 (≤0.0001) | -16.2430 (≤0.0001)
WTI | 3 | ICEEMDAN-RW | | | | 3.0818 (0.0021) | 2.0869 (0.0370) | 3.2155 (0.0012)
WTI | 3 | LSSVR | | | | | -2.7750 (0.0056) | 2.3836 (0.0173)
WTI | 3 | BPNN | | | | | | 3.0743 (0.0021)
WTI | 6 | MICEEMDAN-WOA-RVFL | -3.9302 (≤0.0001) | -5.1332 (≤0.0001) | -18.9130 (≤0.0001) | -19.0620 (≤0.0001) | -21.9930 (≤0.0001) | -19.5080 (≤0.0001)
WTI | 6 | ICEEMDAN-LSSVR | | -3.4653 (0.0005) | -18.9170 (≤0.0001) | -19.1680 (≤0.0001) | -22.0120 (≤0.0001) | -19.5420 (≤0.0001)
WTI | 6 | ICEEMDAN-BPNN | | | -18.8380 (≤0.0001) | -19.1650 (≤0.0001) | -21.9930 (≤0.0001) | -19.5200 (≤0.0001)
WTI | 6 | ICEEMDAN-RW | | | | 2.6360 (0.0085) | 0.5129 (0.6081) | 4.1421 (≤0.0001)
WTI | 6 | LSSVR | | | | | -2.2124 (0.0271) | 3.9198 (≤0.0001)
WTI | 6 | BPNN | | | | | | 4.3545 (≤0.0001)
USDEUR | 1 | MICEEMDAN-WOA-RVFL | -8.8860 (≤0.0001) | -8.6031 (≤0.0001) | -4.4893 (≤0.0001) | -15.4300 (≤0.0001) | -15.4100 (≤0.0001) | -15.3440 (≤0.0001)
USDEUR | 1 | ICEEMDAN-LSSVR | | -0.8819 (0.3780) | -4.4230 (≤0.0001) | -14.7210 (≤0.0001) | -14.7320 (≤0.0001) | -14.6280 (≤0.0001)
USDEUR | 1 | ICEEMDAN-BPNN | | | -4.4158 (≤0.0001) | -14.6310 (≤0.0001) | -14.6440 (≤0.0001) | -14.5310 (≤0.0001)
USDEUR | 1 | ICEEMDAN-RW | | | | 3.3089 (0.0010) | 3.2912 (0.0010) | 3.3244 (0.0009)
USDEUR | 1 | LSSVR | | | | | -1.8732 (0.0613) | 2.9472 (0.0033)
USDEUR | 1 | BPNN | | | | | | 3.4378 (0.0006)
USDEUR | 3 | MICEEMDAN-WOA-RVFL | -10.1370 (≤0.0001) | -7.9933 (≤0.0001) | -5.7359 (≤0.0001) | -18.3380 (≤0.0001) | -18.6460 (≤0.0001) | -18.2090 (≤0.0001)
USDEUR | 3 | ICEEMDAN-LSSVR | | -1.4797 (0.1392) | -5.5972 (≤0.0001) | -17.9030 (≤0.0001) | -18.1920 (≤0.0001) | -17.7730 (≤0.0001)
USDEUR | 3 | ICEEMDAN-BPNN | | | -5.6039 (≤0.0001) | -17.8370 (≤0.0001) | -18.1350 (≤0.0001) | -17.6910 (≤0.0001)
USDEUR | 3 | ICEEMDAN-RW | | | | 3.0048 (0.0027) | 2.9552 (0.0032) | 3.0206 (0.0026)
USDEUR | 3 | LSSVR | | | | | -1.4077 (0.1595) | 1.5242 (0.1278)
USDEUR | 3 | BPNN | | | | | | 2.1240 (0.0339)
USDEUR | 6 | MICEEMDAN-WOA-RVFL | -11.1070 (≤0.0001) | -10.0570 (≤0.0001) | -7.4594 (≤0.0001) | -18.3010 (≤0.0001) | -18.4640 (≤0.0001) | -18.4940 (≤0.0001)
USDEUR | 6 | ICEEMDAN-LSSVR | | 1.9379 (0.0529) | -7.0858 (≤0.0001) | -17.4490 (≤0.0001) | -17.5970 (≤0.0001) | -17.6370 (≤0.0001)
USDEUR | 6 | ICEEMDAN-BPNN | | | -7.1244 (≤0.0001) | -17.5850 (≤0.0001) | -17.7340 (≤0.0001) | -17.7690 (≤0.0001)
USDEUR | 6 | ICEEMDAN-RW | | | | 2.5399 (0.0112) | 2.6203 (0.0089) | 2.6207 (0.0089)
USDEUR | 6 | LSSVR | | | | | 2.2946 (0.0220) | 2.5913 (0.0097)
USDEUR | 6 | BPNN | | | | | | -0.2090 (0.8345)
IP | 1 | MICEEMDAN-WOA-RVFL | -3.4304 (0.0007) | -2.4253 (0.0098) | -3.4369 (0.0007) | -3.5180 (0.0005) | -4.1486 (≤0.0001) | -3.3215 (0.0015)
IP | 1 | ICEEMDAN-LSSVR | | -0.6714 (0.5026) | -3.2572 (0.0013) | -3.3188 (0.0010) | -4.0082 (≤0.0001) | -2.9573 (0.0034)
IP | 1 | ICEEMDAN-BPNN | | | -3.6265 (0.0004) | -3.7455 (0.0002) | -4.6662 (≤0.0001) | -3.3790 (0.0008)
IP | 1 | ICEEMDAN-RW | | | | -0.7438 (0.4577) | -0.0502 (0.9600) | 2.6089 (0.0096)
IP | 1 | LSSVR | | | | | 0.4094 (0.6826) | 3.4803 (0.0006)
IP | 1 | BPNN | | | | | | 1.6021 (0.1104)
IP | 3 | MICEEMDAN-WOA-RVFL | -4.5687 (≤0.0001) | -3.7062 (0.0003) | -5.9993 (≤0.0001) | -6.8919 (≤0.0001) | -5.5008 (≤0.0001) | -5.8037 (≤0.0001)
IP | 3 | ICEEMDAN-LSSVR | | 1.1969 (0.2325) | -5.5286 (≤0.0001) | -6.6325 (≤0.0001) | -5.2260 (≤0.0001) | -5.3260 (≤0.0001)
IP | 3 | ICEEMDAN-BPNN | | | -5.7199 (≤0.0001) | -6.7722 (≤0.0001) | -5.2955 (≤0.0001) | -5.5099 (≤0.0001)
IP | 3 | ICEEMDAN-RW | | | | -5.6536 (≤0.0001) | -3.6721 (0.0003) | 1.7609 (0.0795)
IP | 3 | LSSVR | | | | | 0.2479 (0.8044) | 6.3126 (≤0.0001)
IP | 3 | BPNN | | | | | | 3.9699 (≤0.0001)
IP | 6 | MICEEMDAN-WOA-RVFL | -5.9152 (≤0.0001) | -3.5717 (0.0004) | -5.9430 (≤0.0001) | -7.5672 (≤0.0001) | -10.8900 (≤0.0001) | -5.8450 (≤0.0001)
IP | 6 | ICEEMDAN-LSSVR | | -2.5163 (0.0125) | -5.8465 (≤0.0001) | -7.5185 (≤0.0001) | -10.7140 (≤0.0001) | -5.7483 (≤0.0001)
IP | 6 | ICEEMDAN-BPNN | | | -5.6135 (≤0.0001) | -7.4370 (≤0.0001) | -10.3470 (≤0.0001) | -5.5254 (≤0.0001)
IP | 6 | ICEEMDAN-RW | | | | -7.6719 (≤0.0001) | -2.2903 (0.0229) | 0.8079 (0.4200)
IP | 6 | LSSVR | | | | | 3.8137 (0.0002) | 7.7861 (≤0.0001)
IP | 6 | BPNN | | | | | | 2.3351 (0.0204)
SSEC | 1 | MICEEMDAN-WOA-RVFL | -4.0255 (≤0.0001) | -3.7480 (0.0004) | -7.9407 (≤0.0001) | -10.8090 (≤0.0001) | -10.3070 (≤0.0001) | -10.7330 (≤0.0001)
SSEC | 1 | ICEEMDAN-LSSVR | | -1.5083 (0.1317) | -7.8406 (≤0.0001) | -10.5410 (≤0.0001) | -10.0710 (≤0.0001) | -10.5020 (≤0.0001)
SSEC | 1 | ICEEMDAN-BPNN | | | -7.7986 (≤0.0001) | -10.5470 (≤0.0001) | -10.0580 (≤0.0001) | -10.5030 (≤0.0001)
SSEC | 1 | ICEEMDAN-RW | | | | 3.5571 (0.0004) | 3.2632 (0.0011) | 3.5150 (0.0005)
SSEC | 1 | LSSVR | | | | | -1.0197 (0.3081) | -0.6740 (0.5004)
SSEC | 1 | BPNN | | | | | | 0.8078 (0.4193)
SSEC | 3 | MICEEMDAN-WOA-RVFL | -3.8915 (0.0001) | -3.7312 (0.0002) | -10.5890 (≤0.0001) | -10.5480 (≤0.0001) | -10.7470 (≤0.0001) | -10.1260 (≤0.0001)
SSEC | 3 | ICEEMDAN-LSSVR | | -2.4145 (0.0159) | -10.5830 (≤0.0001) | -10.5240 (≤0.0001) | -10.7760 (≤0.0001) | -10.1110 (≤0.0001)
SSEC | 3 | ICEEMDAN-BPNN | | | -10.2420 (≤0.0001) | -10.1860 (≤0.0001) | -10.4400 (≤0.0001) | -9.7595 (≤0.0001)
SSEC | 3 | ICEEMDAN-RW | | | | 3.4031 (0.0007) | 2.3198 (0.0205) | 3.1912 (0.0014)
SSEC | 3 | LSSVR | | | | | -0.5931 (0.5532) | -1.2284 (0.2194)
SSEC | 3 | BPNN | | | | | | -0.0042 (0.9966)
SSEC | 6 | MICEEMDAN-WOA-RVFL | -2.9349 (0.0034) | -3.7544 (0.0002) | -10.2390 (≤0.0001) | -10.1970 (≤0.0001) | -11.0150 (≤0.0001) | -10.1630 (≤0.0001)
SSEC | 6 | ICEEMDAN-LSSVR | | -2.5830 (0.0099) | -10.1390 (≤0.0001) | -10.0940 (≤0.0001) | -10.9630 (≤0.0001) | -10.0690 (≤0.0001)
SSEC | 6 | ICEEMDAN-BPNN | | | -9.6418 (≤0.0001) | -9.5562 (≤0.0001) | -10.4090 (≤0.0001) | -9.5122 (≤0.0001)
SSEC | 6 | ICEEMDAN-RW | | | | 2.3167 (0.0207) | 1.3656 (0.1723) | 2.2227 (0.0264)
SSEC | 6 | LSSVR | | | | | -1.9027 (0.0573) | -0.5814 (0.5611)
SSEC | 6 | BPNN | | | | | | 1.5707 (0.1165)


WOA optimization can significantly enhance the prediction accuracy of economic and financial time series forecasting.

5. Discussion

To better investigate the proposed MICEEMDAN-WOA-RVFL, we further discuss the developed prediction model in this section, including the comparison of single decomposition and multiple decompositions, the optimization effectiveness of WOA, and the impact of ensemble size.

5.1. Comparison of Single Decomposition and Multiple Decompositions. One of the main novelties of this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and improve the prediction accuracy and stability of the developed forecasting model. To evaluate the effectiveness of the multiple decomposition strategy, we compare the prediction results of MICEEMDAN-WOA-RVFL and ICEEMDAN-WOA-RVFL. The former ensembles the prediction results of M (M = 100) individual ICEEMDAN decompositions with random parameters, while the latter only employs one ICEEMDAN decomposition. We randomly choose 5 out of these 100 decompositions and execute ICEEMDAN-WOA-RVFL for time series forecasting. Tables 7–9 report the MAPE, RMSE, and Dstat values of the MICEEMDAN-WOA-RVFL and the five single-decomposition ICEEMDAN-WOA-RVFL models, together with the corresponding mean values of these five single-decomposition models.

On one hand, compared with the prediction results of the five single decompositions and the mean prediction results, the proposed MICEEMDAN-WOA-RVFL achieves the lowest MAPE and the highest Dstat values in all the 12 cases and the lowest RMSE values in 11 out of the 12 cases, indicating that the multiple decomposition strategy can successfully overcome the randomness of single decomposition and improve the ensemble prediction accuracy.

On the other hand, we can find that the multiple decomposition strategy can greatly improve the stability of the prediction model. For example, the MAPE values of the five single-decomposition models with horizon 6 in the IP time series dataset range from 0.0034 to 0.1114, indicating that different single decompositions can produce relatively large differences in prediction results. When we employ the multiple decomposition strategy, we can overcome the randomness of single decomposition and thus enhance prediction stability.

In summary, the experimental results suggest that the multiple decomposition strategy and prediction ensemble can effectively enhance prediction accuracy and stability. The main reasons for the prediction improvement lie in three aspects: (1) the multiple decomposition can reduce the

randomness of one single decomposition and simultaneously generate groups of differential subseries; (2) predictions using these groups of differential subseries can achieve diverse prediction results; and (3) the selection and ensemble of these diverse prediction results can ensure both accuracy and diversity and thus improve the final ensemble prediction.
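The multiple decomposition workflow described above can be sketched as follows. This is a schematic only: a toy moving-average split stands in for ICEEMDAN, and a least-squares AR(1) extrapolation stands in for the WOA-tuned RVFL predictors; only the overall structure (random decomposition parameters, per-subseries forecasts summed within each run, runs averaged) mirrors the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def decompose(series, window):
    # Toy stand-in for ICEEMDAN: split the series into a smooth part
    # and a residual. In the paper, each run uses ICEEMDAN with
    # randomly drawn noise/ensemble parameters instead.
    kernel = np.ones(window) / window
    trend = np.convolve(series, kernel, mode="same")
    return [trend, series - trend]

def forecast_subseries(sub):
    # Stand-in predictor: fit y_t ~ a*y_{t-1} + b by least squares,
    # then extrapolate one step ahead.
    x, y = sub[:-1], sub[1:]
    A = np.column_stack([x, np.ones_like(x)])
    a, b = np.linalg.lstsq(A, y, rcond=None)[0]
    return a * sub[-1] + b

def multi_decomposition_forecast(series, n_runs=100):
    per_run = []
    for _ in range(n_runs):
        window = int(rng.integers(2, 8))      # random decomposition parameter
        subs = decompose(series, window)
        # Sum the subseries forecasts to get this run's forecast
        per_run.append(sum(forecast_subseries(s) for s in subs))
    return float(np.mean(per_run))            # ensemble across the runs
```

Because each run decomposes the series differently, the per-run forecasts differ, and averaging them is what damps the randomness of any single decomposition.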

5.2. The Optimization Effectiveness of WOA. When we use RVFL networks to construct predictors, a number of parameters need to be set in advance. In this study, WOA is introduced to search the optimal parameter values for RVFL predictors using its powerful optimization ability. To investigate the optimization effectiveness of WOA for parameter search, we compare the proposed MICEEMDAN-WOA-RVFL with MICEEMDAN-RVFL without WOA optimization. According to the literature [45], we fixed the number of hidden neurons Nhe = 100, the activation function Func = sigmoid, and the random type Rand = Gaussian in MICEEMDAN-RVFL. The MAPE, RMSE, and Dstat values are reported in Tables 10–12, respectively.

In all the four time series datasets, the prediction performance of the proposed MICEEMDAN-WOA-RVFL model is better than or equal to that of the MICEEMDAN-RVFL model without WOA optimization in terms of MAPE and RMSE in all the 12 cases, except for the RMSE value with horizon 1 in the SSEC dataset, as listed in Tables 10 and 11. In addition, the MICEEMDAN-WOA-RVFL obtains higher Dstat values in 10 out of 12 cases, as can be seen in Table 12. All the results indicate that WOA can effectively search the optimal parameter settings for RVFL networks, further improving the overall prediction performance.
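A minimal RVFL predictor with the fixed baseline configuration above (100 sigmoid hidden neurons, Gaussian random input weights and biases, a direct input-output link, and an output layer solved in closed form) can be sketched as follows; the ridge regularization constant is our assumption for numerical stability, not a value from the paper:

```python
import numpy as np

class RVFL:
    def __init__(self, n_hidden=100, ridge=1e-6, seed=0):
        self.n_hidden = n_hidden
        self.ridge = ridge
        self.rng = np.random.default_rng(seed)

    def _features(self, X):
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))  # sigmoid hidden layer
        return np.hstack([X, H])  # direct link: raw inputs + hidden outputs

    def fit(self, X, y):
        n_in = X.shape[1]
        # Random, untrained input weights and biases (Gaussian random type)
        self.W = self.rng.standard_normal((n_in, self.n_hidden))
        self.b = self.rng.standard_normal(self.n_hidden)
        D = self._features(X)
        # Only the output weights are trained, via a closed-form ridge solution
        self.beta = np.linalg.solve(
            D.T @ D + self.ridge * np.eye(D.shape[1]), D.T @ y
        )
        return self

    def predict(self, X):
        return self._features(X) @ self.beta
```

Because only the output layer is solved, training is a single linear solve, which is what makes RVFL cheap enough to retrain once per subseries per decomposition; WOA then only has to tune the discrete configuration (number of neurons, activation, random type) around this core.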

5.3. The Impact of Ensemble Size. Previous research has demonstrated that an ensemble strategy using all individual prediction models is unlikely to work well, and that the selection of individual prediction models contributes to improving the ensemble prediction performance [57]. In this study, we sort all individual prediction models based on their past performance (RMSE values) and then select the top N percent as the ensemble size to construct the ensemble prediction model. To further investigate the impact of ensemble size on ensemble prediction, we use different ensemble sizes (es = 10, 20, …, 100) to select the top N percent of individual forecasting models to develop the ensemble prediction model and conduct the one-step-ahead forecasting experiment on the four time series datasets. The results are demonstrated in Figure 5.
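The top-N-percent selection and averaging step can be sketched as follows; this is our illustration of the idea, and the argument names are hypothetical:

```python
import numpy as np

def select_and_ensemble(predictions, past_rmse, es=30):
    # Average the forecasts of the top `es` percent of models ranked
    # by their historical RMSE (lower RMSE = better past performance).
    # predictions: (n_models, n_steps) array of individual forecasts
    # past_rmse:   (n_models,) array of each model's historical RMSE
    n_models = len(past_rmse)
    n_keep = max(1, int(round(n_models * es / 100)))
    best = np.argsort(past_rmse)[:n_keep]  # indices of the lowest-RMSE models
    return np.asarray(predictions)[best].mean(axis=0)
```

Setting es = 100 recovers a plain average over all individual models, i.e., selection is switched off.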

We can see that the MICEEMDAN-WOA-RVFL obtains the best forecasting performance in the WTI, IP, and SSEC datasets when the ensemble size es is in the range of 20–40, and in the USDEUR dataset when the ensemble size es


Table 7. The mean absolute percent error (MAPE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.0036 | 0.0043 | 0.0045 | 0.0045 | 0.0041 | 0.0040 | 0.0043
WTI | 3 | 0.0080 | 0.0088 | 0.0084 | 0.0084 | 0.0082 | 0.0083 | 0.0084
WTI | 6 | 0.0113 | 0.0132 | 0.0115 | 0.0113 | 0.0118 | 0.0120 | 0.0112
USDEUR | 1 | 0.0006 | 0.0011 | 0.0010 | 0.0008 | 0.0010 | 0.0008 | 0.0009
USDEUR | 3 | 0.0015 | 0.0021 | 0.0021 | 0.0015 | 0.0017 | 0.0015 | 0.0018
USDEUR | 6 | 0.0022 | 0.0033 | 0.0022 | 0.0023 | 0.0023 | 0.0024 | 0.0025
IP | 1 | 0.0012 | 0.0016 | 0.0016 | 0.0017 | 0.0017 | 0.0015 | 0.0016
IP | 3 | 0.0023 | 0.0024 | 0.0026 | 0.0030 | 0.0028 | 0.0024 | 0.0026
IP | 6 | 0.0032 | 0.0036 | 0.0034 | 0.1114 | 0.0036 | 0.0041 | 0.0252
SSEC | 1 | 0.0020 | 0.0024 | 0.0023 | 0.0026 | 0.0023 | 0.0023 | 0.0024
SSEC | 3 | 0.0044 | 0.0045 | 0.0046 | 0.0052 | 0.0045 | 0.0047 | 0.0047
SSEC | 6 | 0.0065 | 0.0067 | 0.0070 | 0.0085 | 0.0068 | 0.0075 | 0.0073

Table 8. The root mean squared error (RMSE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.2715 | 0.3316 | 0.3376 | 0.3352 | 0.3191 | 0.3148 | 0.3277
WTI | 3 | 0.5953 | 0.6557 | 0.6214 | 0.6200 | 0.6058 | 0.6133 | 0.6232
WTI | 6 | 0.8146 | 0.9490 | 0.8239 | 0.8199 | 0.8508 | 0.8597 | 0.8607
USDEUR | 1 | 0.0009 | 0.0017 | 0.0017 | 0.0012 | 0.0015 | 0.0012 | 0.0015
USDEUR | 3 | 0.0022 | 0.0031 | 0.0034 | 0.0023 | 0.0025 | 0.0023 | 0.0027
USDEUR | 6 | 0.0033 | 0.0048 | 0.0033 | 0.0035 | 0.0035 | 0.0036 | 0.0037
IP | 1 | 0.2114 | 0.2894 | 0.3080 | 0.3656 | 0.3220 | 0.2589 | 0.3088
IP | 3 | 0.3875 | 0.4098 | 0.4507 | 0.4809 | 0.4776 | 0.4027 | 0.4443
IP | 6 | 0.5340 | 0.5502 | 0.5563 | 1.0081 | 0.5697 | 0.6466 | 0.6661
SSEC | 1 | 10.8115 | 11.5515 | 13.0728 | 16.7301 | 12.2465 | 14.3671 | 13.5936
SSEC | 3 | 22.2983 | 22.0603 | 22.8140 | 33.9937 | 23.9755 | 25.8004 | 25.7288
SSEC | 6 | 35.7201 | 37.0255 | 41.9033 | 55.5066 | 38.6140 | 48.2131 | 44.2525

Table 9. The directional statistic (Dstat) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.9381 | 0.9201 | 0.9161 | 0.9184 | 0.9323 | 0.9271 | 0.9228
WTI | 3 | 0.8576 | 0.8414 | 0.8548 | 0.8385 | 0.8553 | 0.8513 | 0.8483
WTI | 6 | 0.7737 | 0.7419 | 0.7703 | 0.7668 | 0.7627 | 0.7616 | 0.7607
USDEUR | 1 | 0.9410 | 0.9008 | 0.9270 | 0.9204 | 0.9073 | 0.9157 | 0.9142
USDEUR | 3 | 0.8502 | 0.8052 | 0.8418 | 0.8399 | 0.8296 | 0.8427 | 0.8318
USDEUR | 6 | 0.7828 | 0.6826 | 0.7809 | 0.7819 | 0.7772 | 0.7706 | 0.7586
IP | 1 | 0.9256 | 0.8967 | 0.8926 | 0.9132 | 0.8967 | 0.9174 | 0.9033
IP | 3 | 0.8802 | 0.8760 | 0.8430 | 0.8223 | 0.8471 | 0.8802 | 0.8537
IP | 6 | 0.8141 | 0.8017 | 0.8017 | 0.7066 | 0.7851 | 0.8017 | 0.7794
SSEC | 1 | 0.9156 | 0.9052 | 0.9052 | 0.9052 | 0.9087 | 0.9135 | 0.9076
SSEC | 3 | 0.8396 | 0.8222 | 0.8292 | 0.8208 | 0.8368 | 0.8264 | 0.8271
SSEC | 6 | 0.7587 | 0.7462 | 0.7441 | 0.7134 | 0.7594 | 0.7448 | 0.7416

Table 10. The mean absolute percent error (MAPE) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.0036 | 0.0037
WTI | 3 | 0.0080 | 0.0082
WTI | 6 | 0.0113 | 0.0118
USDEUR | 1 | 0.0006 | 0.0006
USDEUR | 3 | 0.0015 | 0.0015
USDEUR | 6 | 0.0022 | 0.0023
IP | 1 | 0.0012 | 0.0013
IP | 3 | 0.0023 | 0.0023
IP | 6 | 0.0032 | 0.0036
SSEC | 1 | 0.0020 | 0.0020
SSEC | 3 | 0.0044 | 0.0044
SSEC | 6 | 0.0065 | 0.0066


Table 11. The root mean squared error (RMSE) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.2715 | 0.2824
WTI | 3 | 0.5953 | 0.6079
WTI | 6 | 0.8146 | 0.8485
USDEUR | 1 | 0.0009 | 0.0010
USDEUR | 3 | 0.0022 | 0.0022
USDEUR | 6 | 0.0033 | 0.0034
IP | 1 | 0.2114 | 0.2362
IP | 3 | 0.3875 | 0.4033
IP | 6 | 0.5340 | 0.5531
SSEC | 1 | 10.8115 | 10.7616
SSEC | 3 | 22.2983 | 22.7456
SSEC | 6 | 35.7201 | 36.3150

Table 12. The directional statistic (Dstat) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.9381 | 0.9358
WTI | 3 | 0.8576 | 0.8565
WTI | 6 | 0.7737 | 0.7656
USDEUR | 1 | 0.9410 | 0.9317
USDEUR | 3 | 0.8502 | 0.8455
USDEUR | 6 | 0.7828 | 0.7762
IP | 1 | 0.9256 | 0.9174
IP | 3 | 0.8802 | 0.8430
IP | 6 | 0.8141 | 0.7893
SSEC | 1 | 0.9156 | 0.9191
SSEC | 3 | 0.8396 | 0.8351
SSEC | 6 | 0.7587 | 0.7594


is in the range of 10–40, in terms of MAPE, RMSE, and Dstat. When es is greater than 40, the MAPE and RMSE values continue to worsen and become the worst when the ensemble size grows to 100. The experimental results indicate that the ensemble size has an overall significant impact on ensemble prediction and that an ideal range of ensemble size is about 20 to 40.

6. Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. The MICEEMDAN-WOA-RVFL first uses ICEEMDAN to separate the original economic and financial time series into groups of subseries many times, and then RVFL networks are used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve the prediction accuracy. Thirdly, the predictions of the subseries in each decomposition are integrated into the forecasting results of that decomposition using addition. Finally, the prediction results of each decomposition are selected based on RMSE values and combined as the final prediction results.

As far as we know, it is the first time that WOA is employed for the optimal parameter search for RVFL networks and that the multiple decomposition strategy is introduced in time series forecasting. The empirical results indicate that

(1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy in various economic and financial time series forecasting; (2) WOA can effectively search optimal parameters for RVFL networks and improve the prediction performance of economic and financial time series forecasting; and (3) the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and enhance the prediction accuracy and stability of the developed prediction model.

We will extend our study in two aspects in the future: (1) applying the MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble method of individual forecasting models to further enhance the prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and

[Figure 5: The impact of ensemble size with one-step-ahead forecasting. Panels (a)–(l) plot the MAPE, RMSE, and Dstat values against the ensemble size (10%–100%) for the WTI, USDEUR, IP, and SSEC datasets.]


the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365–374, 2017.

[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.

[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831–848, 2005.

[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192–196, IEEE, Warsaw, Poland, September 2005.

[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?" Journal of International Economics, vol. 60, no. 1, pp. 85–107, 2003.

[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7–18, 2014.

[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13–29, 2014.

[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331–350, 2010.

[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201–1208, 2008.

[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111–117, 2010.

[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1–9, 2013.

[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97–111, 2018.

[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28–39, 2014.

[14] X. Li, H. Xie, R. Wang et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67–78, 2016.

[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22–42, 2018.

[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457–480, 2018.

[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27–38, 2014.

[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284–308, 2014.

[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.

[20] S. Lahmiri, "A variational mode decomposition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268–273, 2016.

[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228–1244, 2013.

[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603–3628, 2019.

[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235–6250, 2014.

[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164–178, 2016.

[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754–762, 2016.

[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1–13, 2007.

[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903–995, 1998.

[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1–41, 2009.

[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144–4147, IEEE, Prague, Czech Republic, May 2011.


[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140–165, 2020.

[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882–1905, 2018.

[32] T. Li, M. Zhou, C. Guo et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.

[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19–29, 2014.

[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852–1872, 2020.

[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.

[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205–1225, 2019.

[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1–10, 2020.

[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1–15, 2018.

[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm, Multi-Objective Whale Optimization Algorithm, for wind speed forecasting," Applied Energy, vol. 208, pp. 344–360, 2017.

[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250–260, 2019.

[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163–180, 1994.

[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094–1105, 2016.

[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078–1093, 2016.

[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097–1108, 2018.

[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319–340, 2019.

[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5–24, 2020.

[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.

[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983–994, 2020.

[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49–58, 2019.

[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.

[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165–4172, 2019.

[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319–7327, 2020.

[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525–541, 2018.

[55] Fred Website: https://fred.stlouisfed.org/.

[56] NetEase Website: http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.

[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31–53, 2006.

Complexity 17


Although some recent studies also employ the RVFL network for time series forecasting, they differ from the current study in the decomposition technique and network optimization: (1) they divide the original time series using WD or EMD, and (2) they construct RVFL networks with fixed parameter values. Unlike these studies, our study decomposes economic and financial time series using ICEEMDAN and searches for the optimal parameter values of the RVFL networks with WOA. Furthermore, previous research mainly focuses on dividing the original time series with a single decomposition or a dual decomposition [20, 35, 54]. In dual decomposition, the original signal is first decomposed into several components, and the high-frequency components are then further decomposed using the same or a different decomposition method. Essentially, the dual decomposition process is still one decomposition, just with two stages. Unlike the previous research, one main improvement in this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach. To our knowledge, this is the first time that a multiple decomposition strategy has been developed for forecasting economic and financial time series.
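The multiple decomposition strategy described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: a toy moving-average split stands in for ICEEMDAN, a median-of-recent-values predictor stands in for the WOA-tuned RVFL networks, and the randomly drawn `window` mimics the randomly drawn ICEEMDAN parameters; all helper names are ours.

```python
import numpy as np

def toy_decompose(series, window):
    # Stand-in for ICEEMDAN: split the series into a smooth
    # trend component and the residual (their sum is the series).
    trend = np.convolve(series, np.ones(window) / window, mode="same")
    return [trend, series - trend]

def forecast_subseries(sub):
    # Stand-in predictor: median of the last five observations.
    return float(np.median(sub[-5:]))

def multi_decomposition_forecast(series, m=100, seed=0):
    # Decompose m times with random parameters, forecast each
    # subseries, sum per decomposition, then ensemble the m results.
    rng = np.random.default_rng(seed)
    per_decomposition = []
    for _ in range(m):
        window = int(rng.integers(3, 15))       # random decomposition parameter
        subs = toy_decompose(series, window)
        per_decomposition.append(sum(forecast_subseries(s) for s in subs))
    return float(np.mean(per_decomposition))
```

Each iteration of the loop plays the role of one ICEEMDAN-WOA-RVFL run; the final mean is the multi-decomposition ensemble.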

4 Experimental Results

4.1. Data Description. As we know, economic and financial time series are influenced by various factors, sometimes rising or falling sharply in a short time. The dramatic fluctuations usually lead to significant nonlinearity and nonstationarity of the time series. To comprehensively evaluate the effectiveness of the proposed MICEEMDAN-WOA-RVFL, we choose four different time series, namely, the West Texas Intermediate crude oil spot price (WTI), the US dollar/Euro foreign exchange rate (USD/EUR), US industrial production (IP), and the Shanghai stock exchange composite index (SSEC), as the experimental datasets. The first three datasets can be accessed via the website of St. Louis Fed Research [55], and the last one can be obtained via the website of NetEase [56].

Each time series is separated into two subdatasets: the first 80% for training and the last 20% for testing. Table 1 shows the divided samples of the abovementioned four economic and financial time series.
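The chronological 80%/20% split, combined with the lag features used by the RVFL predictors (the `Lag` parameter in Table 2), can be sketched as below; the exact feature construction is not spelled out in this section, so this sliding-window setup and its helper names are our illustration.

```python
import numpy as np

def make_supervised(series, lag=5, horizon=1):
    # Sliding window: `lag` past values predict the value `horizon` steps ahead.
    s = np.asarray(series, float)
    X = np.array([s[i:i + lag] for i in range(len(s) - lag - horizon + 1)])
    y = s[lag + horizon - 1:]
    return X, y

def split_train_test(X, y, train_frac=0.8):
    # Chronological split: first 80% for training, last 20% for testing.
    n_train = int(len(y) * train_frac)
    return X[:n_train], y[:n_train], X[n_train:], y[n_train:]
```

For example, with `lag=3` and `horizon=1`, the window [s_0, s_1, s_2] is paired with the target s_3, and so on.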

We utilize ICEEMDAN to decompose these time series into groups of relatively simple subseries. Figure 4 offers an example of the decomposition of the WTI dataset using ICEEMDAN.

4.2. Evaluation Indices. In this study, we use four evaluation metrics, namely, the mean absolute percent error (MAPE), the root mean squared error (RMSE), the directional statistic (Dstat), and the Diebold–Mariano (DM) test, to assess the performance of the proposed model. Among them, MAPE and RMSE are used to assess the forecasting error, defined as follows:

$$\mathrm{MAPE} = \frac{1}{N}\sum_{t=1}^{N}\left|\frac{O_t - P_t}{O_t}\right| \times 100, \qquad (8)$$

$$\mathrm{RMSE} = \sqrt{\frac{1}{N}\sum_{t=1}^{N}\left(O_t - P_t\right)^2}, \qquad (9)$$

where N is the size of the evaluated samples, O_t denotes the actual value, and P_t represents the predicted value at time t. The lower the values of RMSE and MAPE, the better the prediction model.

The Dstat indicates the performance of direction prediction, which is formulated as follows:

$$D_{\mathrm{stat}} = \frac{1}{N}\sum_{i=1}^{N} d_i \times 100\%, \qquad (10)$$

where d_i = 1 if (P_{t+1} − O_t)(O_{t+1} − O_t) ≥ 0, and d_i = 0 otherwise. A higher value of Dstat indicates a more accurate direction prediction.
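A direct implementation of Eqs. (8)–(10) might look like this (here the directional hits are evaluated over consecutive test-sample pairs, which is the usual reading of the d_i definition):

```python
import numpy as np

def mape(o, p):
    # Eq. (8): mean absolute percent error (scaled by 100)
    o, p = np.asarray(o, float), np.asarray(p, float)
    return float(np.mean(np.abs((o - p) / o)) * 100)

def rmse(o, p):
    # Eq. (9): root mean squared error
    o, p = np.asarray(o, float), np.asarray(p, float)
    return float(np.sqrt(np.mean((o - p) ** 2)))

def dstat(o, p):
    # Eq. (10): percentage of correctly predicted directions,
    # a hit when (P_{t+1} - O_t)(O_{t+1} - O_t) >= 0
    o, p = np.asarray(o, float), np.asarray(p, float)
    hits = (p[1:] - o[:-1]) * (o[1:] - o[:-1]) >= 0
    return float(hits.mean() * 100)
```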

Furthermore, to test the significance of the differences in prediction performance between pairs of models, we employ the Diebold–Mariano (DM) test in this study.
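A standard DM test with squared-error loss differentials can be sketched as below. The paper does not specify its exact variance estimator, so this sketch uses the usual HAC variance with h−1 autocovariances and a large-sample normal approximation for the two-sided p value.

```python
import numpy as np
from math import erf, sqrt

def dm_test(e1, e2, h=1):
    # Diebold-Mariano test on forecast errors e1, e2 with
    # squared-error loss: d_t = e1_t^2 - e2_t^2.
    d = np.asarray(e1, float) ** 2 - np.asarray(e2, float) ** 2
    n = d.size
    d_bar = float(d.mean())
    # autocovariances of d up to lag h-1 (HAC variance of the mean)
    gamma = [float(np.mean((d[k:] - d_bar) * (d[:n - k] - d_bar))) for k in range(h)]
    var_d = (gamma[0] + 2.0 * sum(gamma[1:])) / n
    dm = d_bar / sqrt(var_d)
    # two-sided p value from the standard normal approximation
    p = 2.0 * (1.0 - 0.5 * (1.0 + erf(abs(dm) / sqrt(2.0))))
    return dm, p
```

A large negative statistic means the first model's losses are significantly smaller, which is the pattern reported in Table 6 for the proposed model.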

4.3. Experimental Settings. In this study, we compare the proposed MICEEMDAN-WOA-RVFL with several state-of-the-art forecasting models, including single models and ensemble models. Among these, the single models include one popular statistical model, RW, and two popular AI models, BPNN and least squares SVR (LSSVR). The ensemble models derive from the combination of the single models and the decomposition method ICEEMDAN.

The detailed parameters of all prediction models, the decomposition approach ICEEMDAN, and the optimization method WOA in the experiments are shown in Table 2. The parameter values of BPNN, LSSVR, RVFL, and ICEEMDAN refer to the previous literature [22, 34, 45].
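For reference, a minimal RVFL network sketch with a random hidden layer, direct input-output links, and a regularized least-squares output solve (one of the two output modes listed in Table 2). The hyperparameter names and defaults here are illustrative, not the paper's settings.

```python
import numpy as np

class RVFL:
    # Minimal random vector functional link sketch: fixed random
    # hidden weights, direct input-output links, and output weights
    # obtained by regularized (ridge) least squares.
    def __init__(self, n_hidden=20, lam=1e-4, seed=0):
        self.n_hidden, self.lam = n_hidden, lam
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        n, d = X.shape
        self.W = self.rng.normal(size=(d, self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        H = np.tanh(X @ self.W + self.b)            # random hidden features
        D = np.hstack([X, H, np.ones((n, 1))])      # direct links + bias
        # ridge solution for the output weights
        self.beta = np.linalg.solve(
            D.T @ D + self.lam * np.eye(D.shape[1]), D.T @ y)
        return self

    def predict(self, X):
        H = np.tanh(X @ self.W + self.b)
        D = np.hstack([X, H, np.ones((X.shape[0], 1))])
        return D @ self.beta
```

Because only `beta` is trained, fitting is a single linear solve, which is what makes per-subseries RVFL predictors cheap enough to retrain inside a WOA search loop.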

All experiments were conducted using Matlab R2019b on a PC with 64-bit Microsoft Windows 10, 8 GB RAM, and a 1.8 GHz i7-8565U CPU.

4.4. Results and Analysis. We compare the forecasting performance of six prediction models, including three single models (RW, LSSVR, and BPNN) and three ensemble models (ICEEMDAN-RW, ICEEMDAN-LSSVR, and ICEEMDAN-BPNN), with that of our proposed MICEEMDAN-WOA-RVFL in terms of MAPE, RMSE, and Dstat. Due to the different horizons, we train different forecasting models separately; that is, we use the proposed scheme for each horizon to train a different model. Tables 3–5 report the experimental results in terms of each evaluation index with 1-, 3-, and 6-step horizons, respectively.

From Table 3, we can see that the proposed MICEEMDAN-WOA-RVFL obtains the lowest (best) MAPE values with all horizons on all datasets. RW obtains the best MAPE values with all horizons on all datasets among the compared single models,


[Figure 4 shows, from top to bottom, the WTI crude oil price series, the decomposed subseries IMF1–IMF11, and the residue, over 2 January 1986 to 15 April 2020.]

Figure 4: The WTI crude oil price series and the corresponding decomposed subseries by ICEEMDAN.

Table 1: Samples of economic and financial time series.

Dataset | Type | Subset | Size | Date range
WTI | Daily | Sample set | 8641 | 2 January 1986 to 15 April 2020
WTI | Daily | Training set | 6912 | 2 January 1986 to 24 May 2013
WTI | Daily | Testing set | 1729 | 28 May 2013 to 15 April 2020
USD/EUR | Daily | Sample set | 5341 | 4 January 1999 to 10 April 2020
USD/EUR | Daily | Training set | 4272 | 4 January 1999 to 30 December 2015
USD/EUR | Daily | Testing set | 1069 | 31 December 2015 to 10 April 2020
IP | Monthly | Sample set | 1215 | January 1919 to March 2020
IP | Monthly | Training set | 972 | January 1919 to December 1999
IP | Monthly | Testing set | 243 | January 2000 to March 2020
SSEC | Daily | Sample set | 7172 | 19 December 1990 to 21 April 2020
SSEC | Daily | Training set | 5737 | 19 December 1990 to 4 June 2014
SSEC | Daily | Testing set | 1435 | 5 June 2014 to 21 April 2020


demonstrating that RW is better than LSSVR and BPNN for forecasting economic and financial time series. Of all the ensemble models, the ICEEMDAN-LSSVR and ICEEMDAN-BPNN models obtain close MAPE values, both obviously better than ICEEMDAN-RW.

The RMSE values of the four time series datasets are listed in Table 4. From this table, we can find that the proposed MICEEMDAN-WOA-RVFL outperforms all the single and ensemble models with all horizons on all datasets. The statistical model RW obtains the best RMSE values among the single models in 10 out of 12 cases, demonstrating that it is more suitable for economic and financial time series forecasting than LSSVR and BPNN. As to the ensemble models, the proposed MICEEMDAN-WOA-RVFL obtains lower RMSE values than the compared ensemble models, demonstrating that it is more effective for economic and financial time series forecasting.

Table 5 shows the directional statistic Dstat. We can see that MICEEMDAN-WOA-RVFL achieves the highest Dstat values in all 12 cases, indicating that it has better direction forecasting performance. Among the single prediction models, LSSVR and RW each obtain the best Dstat values in 5 cases, better than BPNN. Similarly, the ICEEMDAN-LSSVR and ICEEMDAN-BPNN models obtain close Dstat values, obviously better than the ICEEMDAN-RW model in all 12 cases.

From all the prediction results, we can find that all the ensemble prediction models except ICEEMDAN-RW greatly outperform their corresponding single prediction models in all 12 cases, showing that the framework of decomposition and ensemble is an effective tool for improving the forecasting

Table 2: The settings for the parameters.

Method | Parameter | Value | Description
ICEEMDAN | Nsd | 0.2 | Noise standard deviation
ICEEMDAN | Nr | 100 | Number of realizations
ICEEMDAN | Maxsi | 5000 | Maximum number of sifting iterations
LSSVR | Rp | {2^-10, 2^-9, ..., 2^11, 2^12} | Regularization parameter
LSSVR | WidRBF | {2^-10, 2^-9, ..., 2^11, 2^12} | Width of the RBF kernel
BPNN | Nhe | 10 | Number of hidden neurons
BPNN | Maxte | 1000 | Maximum training epochs
BPNN | Lr | 0.0001 | Learning rate
WOA | Pop | 40 | Population size
WOA | Maxgen | 100 | Maximum generation
MICEEMDAN-WOA-RVFL | Nsd | [0.01, 0.4] | Noise standard deviation in ICEEMDAN
MICEEMDAN-WOA-RVFL | Nr | [50, 500] | Number of realizations in ICEEMDAN
MICEEMDAN-WOA-RVFL | Maxsi | [2000, 8000] | Maximum number of sifting iterations in ICEEMDAN
MICEEMDAN-WOA-RVFL | Nhe | [5, 30] | Number of hidden neurons in RVFL
MICEEMDAN-WOA-RVFL | Func | {sigmoid, sine, hardlim, tribas, radbas, sign} | Activation function in RVFL
MICEEMDAN-WOA-RVFL | Mod | {1: regularized least squares, 2: Moore–Penrose pseudoinverse} | Mode in RVFL
MICEEMDAN-WOA-RVFL | Lag | [3, 20] | Lag in RVFL
MICEEMDAN-WOA-RVFL | Bias | {true, false} | Bias in RVFL
MICEEMDAN-WOA-RVFL | Rand | {1: Gaussian, 2: Uniform} | Random type in RVFL
MICEEMDAN-WOA-RVFL | Scale | [0.1, 1] | Scale value in RVFL
MICEEMDAN-WOA-RVFL | ScaleMode | {1: scale the features for all neurons, 2: scale the features for each hidden neuron, 3: scale the range of the randomization for uniform distribution} | Scale mode in RVFL

Table 3: The mean absolute percent error (MAPE) values of different prediction models.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN
WTI | 1 | 0.0036 | 0.0176 | 0.0177 | 0.0182 | 0.0200 | 0.0044 | 0.0055
WTI | 3 | 0.0080 | 0.0307 | 0.0309 | 0.0320 | 0.0331 | 0.0092 | 0.0094
WTI | 6 | 0.0113 | 0.0441 | 0.0449 | 0.0457 | 0.0458 | 0.1202 | 0.0130
USD/EUR | 1 | 0.0006 | 0.0034 | 0.0034 | 0.0034 | 0.0044 | 0.0010 | 0.0010
USD/EUR | 3 | 0.0015 | 0.0063 | 0.0063 | 0.0063 | 0.0071 | 0.0020 | 0.0021
USD/EUR | 6 | 0.0022 | 0.0087 | 0.0088 | 0.0087 | 0.0094 | 0.0032 | 0.0031
IP | 1 | 0.0012 | 0.0049 | 0.0057 | 0.0056 | 0.0053 | 0.0024 | 0.0022
IP | 3 | 0.0023 | 0.0102 | 0.0140 | 0.0123 | 0.0104 | 0.0033 | 0.0033
IP | 6 | 0.0032 | 0.0183 | 0.0279 | 0.0235 | 0.0184 | 0.0043 | 0.0049
SSEC | 1 | 0.0020 | 0.0096 | 0.0096 | 0.0097 | 0.0111 | 0.0026 | 0.0026
SSEC | 3 | 0.0044 | 0.0178 | 0.0178 | 0.0179 | 0.0185 | 0.0047 | 0.0051
SSEC | 6 | 0.0065 | 0.0259 | 0.0259 | 0.0269 | 0.0268 | 0.0072 | 0.0074


performance. Of all the compared models, the proposed MICEEMDAN-WOA-RVFL obtains the highest Dstat values and the lowest MAPE and RMSE values on all the time series datasets, showing that it is clearly superior to the benchmark prediction models. Furthermore, for each prediction model, the MAPE and RMSE values increase while the Dstat values decrease with the horizon. This demonstrates that it is easier to forecast time series with a short horizon than with a long one. It is worth noting that the proposed MICEEMDAN-WOA-RVFL still achieves relatively good MAPE and RMSE values as the horizon increases. For example, when the proposed model obtains a Dstat of 0.7737 with horizon 6 on the WTI dataset, it still achieves a relatively low MAPE (0.0113) and RMSE (0.8146), indicating that the predicted values are very close to the real values although the model misses the direction in about 22.63% of cases. In other words, the proposed MICEEMDAN-WOA-RVFL can still achieve satisfactory forecasting accuracy with a long horizon.

Furthermore, we can find that the multiple decomposition strategy does not improve the forecasts of the RW model. One possible explanation is simply that RW assumes that the past movement or trend of a time series cannot be used to predict its future movement; thus, it cannot take advantage of the historical data or the diversity of multiple decompositions. Therefore, when we aggregate the multiple RW predictions of all the decomposed subseries, we merely integrate several random predictions and thus cannot significantly improve the ensemble prediction results. In contrast, LSSVR and BPNN, as well as RVFL, can fully use all the historical data and the diversity of multiple decompositions to make future predictions. Specifically, the multiple decompositions using different parameters generate many groups of different decomposed subseries, and this diversity can successfully overcome the randomness of a single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach.

In addition, the Diebold–Mariano (DM) test is utilized to evaluate whether the forecasting accuracy of the proposed MICEEMDAN-WOA-RVFL significantly outperforms those of the other compared models. Table 6 shows the statistics and p values (in brackets).

On one hand, the DM statistics between the ensemble prediction models and their corresponding single predictors are much lower than zero, and the corresponding p values are almost equal to zero with all horizons except for the RW model, showing that the architecture of "decomposition and ensemble" contributes greatly to improving prediction accuracy and that the combination of ICEEMDAN and AI predictors is effective for economic and financial time series forecasting.

On the other hand, the DM test results on all four time series datasets indicate that MICEEMDAN-WOA-RVFL is significantly better than the single models and the other ensemble models with all horizons; the corresponding p values are much lower than 0.01 in all cases.

In summary, the DM test results demonstrate that the combination of multiple ICEEMDANs, RVFL networks, and

Table 4: The root mean squared error (RMSE) values of different prediction models.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN
WTI | 1 | 0.2715 | 1.3196 | 1.3228 | 1.3744 | 1.8134 | 0.3467 | 0.4078
WTI | 3 | 0.5953 | 2.1784 | 2.1929 | 2.2867 | 2.5041 | 0.6754 | 0.7620
WTI | 6 | 0.8146 | 3.0737 | 3.1271 | 3.1845 | 3.2348 | 0.8692 | 0.9462
USD/EUR | 1 | 0.0009 | 0.0051 | 0.0052 | 0.0052 | 0.0099 | 0.0015 | 0.0016
USD/EUR | 3 | 0.0022 | 0.0091 | 0.0092 | 0.0092 | 0.0129 | 0.0031 | 0.0031
USD/EUR | 6 | 0.0033 | 0.0127 | 0.0128 | 0.0127 | 0.0154 | 0.0047 | 0.0046
IP | 1 | 0.2114 | 0.7528 | 0.8441 | 0.8230 | 0.8205 | 0.3861 | 0.4175
IP | 3 | 0.3875 | 1.4059 | 1.8553 | 1.8311 | 1.4401 | 0.5533 | 0.5294
IP | 6 | 0.5340 | 2.4459 | 3.6015 | 2.8116 | 2.4569 | 0.6408 | 0.8264
SSEC | 1 | 10.8115 | 50.0742 | 49.8800 | 50.8431 | 63.3951 | 12.6460 | 13.7928
SSEC | 3 | 22.2983 | 89.1834 | 88.5290 | 89.1781 | 93.2887 | 25.2526 | 29.7784
SSEC | 6 | 35.7201 | 131.6386 | 131.3487 | 133.5647 | 137.2598 | 40.4357 | 49.6341

Table 5: The directional statistic (Dstat) values of different prediction models.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN
WTI | 1 | 0.9381 | 0.4815 | 0.5243 | 0.5191 | 0.4977 | 0.9190 | 0.9097
WTI | 3 | 0.8576 | 0.4873 | 0.5087 | 0.5012 | 0.4907 | 0.8300 | 0.8449
WTI | 6 | 0.7737 | 0.5116 | 0.4902 | 0.4948 | 0.5185 | 0.7714 | 0.7575
USD/EUR | 1 | 0.9410 | 0.5019 | 0.4719 | 0.4897 | 0.5056 | 0.8998 | 0.9026
USD/EUR | 3 | 0.8502 | 0.4888 | 0.4897 | 0.4925 | 0.4916 | 0.8118 | 0.8024
USD/EUR | 6 | 0.7828 | 0.4991 | 0.5318 | 0.5253 | 0.5047 | 0.7023 | 0.7154
IP | 1 | 0.9256 | 0.5579 | 0.5909 | 0.5785 | 0.5620 | 0.8636 | 0.8760
IP | 3 | 0.8802 | 0.6983 | 0.5455 | 0.4876 | 0.6529 | 0.8430 | 0.8058
IP | 6 | 0.8141 | 0.6198 | 0.5537 | 0.5661 | 0.6405 | 0.7355 | 0.7645
SSEC | 1 | 0.9156 | 0.4944 | 0.5049 | 0.5098 | 0.4979 | 0.9024 | 0.9002
SSEC | 3 | 0.8396 | 0.5042 | 0.5021 | 0.5007 | 0.4965 | 0.8222 | 0.8145
SSEC | 6 | 0.7587 | 0.4833 | 0.5063 | 0.4965 | 0.4805 | 0.7448 | 0.7455


Table 6: Results of the Diebold–Mariano (DM) test (DM statistics with p values in brackets; each row's tested model is compared against the column models).

Dataset | Horizon | Tested model | ICEEMDAN-LSSVR | ICEEMDAN-BPNN | ICEEMDAN-RW | LSSVR | BPNN | RW
WTI | 1 | MICEEMDAN-WOA-RVFL | -4.9081 (≤0.0001) | -7.0030 (≤0.0001) | -6.1452 (≤0.0001) | -10.7240 (≤0.0001) | -10.3980 (≤0.0001) | -10.6210 (≤0.0001)
WTI | 1 | ICEEMDAN-LSSVR | | -3.5925 (0.0003) | -6.0620 (≤0.0001) | -10.5640 (≤0.0001) | -10.3680 (≤0.0001) | -10.4660 (≤0.0001)
WTI | 1 | ICEEMDAN-BPNN | | | -5.9718 (≤0.0001) | -10.2070 (≤0.0001) | -9.9529 (≤0.0001) | -10.0960 (≤0.0001)
WTI | 1 | ICEEMDAN-RW | | | | 3.0776 (0.0021) | 2.7682 (0.0057) | 3.0951 (0.0020)
WTI | 1 | LSSVR | | | | | -1.8480 (0.0648) | 1.0058 (0.3147)
WTI | 1 | BPNN | | | | | | 1.9928 (0.0464)
WTI | 3 | MICEEMDAN-WOA-RVFL | -4.9136 (≤0.0001) | -3.6146 (0.0003) | -10.8660 (≤0.0001) | -16.5990 (≤0.0001) | -15.6260 (≤0.0001) | -16.6720 (≤0.0001)
WTI | 3 | ICEEMDAN-LSSVR | | -2.1143 (0.0346) | -10.7560 (≤0.0001) | -16.7020 (≤0.0001) | -15.6780 (≤0.0001) | -16.7760 (≤0.0001)
WTI | 3 | ICEEMDAN-BPNN | | | -10.5250 (≤0.0001) | -16.2160 (≤0.0001) | -15.3500 (≤0.0001) | -16.2430 (≤0.0001)
WTI | 3 | ICEEMDAN-RW | | | | 3.0818 (0.0021) | 2.0869 (0.0370) | 3.2155 (0.0012)
WTI | 3 | LSSVR | | | | | -2.7750 (0.0056) | 2.3836 (0.0173)
WTI | 3 | BPNN | | | | | | 3.0743 (0.0021)
WTI | 6 | MICEEMDAN-WOA-RVFL | -3.9302 (≤0.0001) | -5.1332 (≤0.0001) | -18.9130 (≤0.0001) | -19.0620 (≤0.0001) | -21.9930 (≤0.0001) | -19.5080 (≤0.0001)
WTI | 6 | ICEEMDAN-LSSVR | | -3.4653 (0.0005) | -18.9170 (≤0.0001) | -19.1680 (≤0.0001) | -22.0120 (≤0.0001) | -19.5420 (≤0.0001)
WTI | 6 | ICEEMDAN-BPNN | | | -18.8380 (≤0.0001) | -19.1650 (≤0.0001) | -21.9930 (≤0.0001) | -19.5200 (≤0.0001)
WTI | 6 | ICEEMDAN-RW | | | | 2.6360 (0.0085) | 0.5129 (0.6081) | 4.1421 (≤0.0001)
WTI | 6 | LSSVR | | | | | -2.2124 (0.0271) | 3.9198 (≤0.0001)
WTI | 6 | BPNN | | | | | | 4.3545 (≤0.0001)
USD/EUR | 1 | MICEEMDAN-WOA-RVFL | -8.8860 (≤0.0001) | -8.6031 (≤0.0001) | -4.4893 (≤0.0001) | -15.4300 (≤0.0001) | -15.4100 (≤0.0001) | -15.3440 (≤0.0001)
USD/EUR | 1 | ICEEMDAN-LSSVR | | -0.8819 (0.3780) | -4.4230 (≤0.0001) | -14.7210 (≤0.0001) | -14.7320 (≤0.0001) | -14.6280 (≤0.0001)
USD/EUR | 1 | ICEEMDAN-BPNN | | | -4.4158 (≤0.0001) | -14.6310 (≤0.0001) | -14.6440 (≤0.0001) | -14.5310 (≤0.0001)
USD/EUR | 1 | ICEEMDAN-RW | | | | 3.3089 (0.0010) | 3.2912 (0.0010) | 3.3244 (0.0009)
USD/EUR | 1 | LSSVR | | | | | -1.8732 (0.0613) | 2.9472 (0.0033)
USD/EUR | 1 | BPNN | | | | | | 3.4378 (0.0006)
USD/EUR | 3 | MICEEMDAN-WOA-RVFL | -10.1370 (≤0.0001) | -7.9933 (≤0.0001) | -5.7359 (≤0.0001) | -18.3380 (≤0.0001) | -18.6460 (≤0.0001) | -18.2090 (≤0.0001)
USD/EUR | 3 | ICEEMDAN-LSSVR | | -1.4797 (0.1392) | -5.5972 (≤0.0001) | -17.9030 (≤0.0001) | -18.1920 (≤0.0001) | -17.7730 (≤0.0001)
USD/EUR | 3 | ICEEMDAN-BPNN | | | -5.6039 (≤0.0001) | -17.8370 (≤0.0001) | -18.1350 (≤0.0001) | -17.6910 (≤0.0001)
USD/EUR | 3 | ICEEMDAN-RW | | | | 3.0048 (0.0027) | 2.9552 (0.0032) | 3.0206 (0.0026)
USD/EUR | 3 | LSSVR | | | | | -1.4077 (0.1595) | 1.5242 (0.1278)
USD/EUR | 3 | BPNN | | | | | | 2.1240 (0.0339)
USD/EUR | 6 | MICEEMDAN-WOA-RVFL | -11.1070 (≤0.0001) | -10.0570 (≤0.0001) | -7.4594 (≤0.0001) | -18.3010 (≤0.0001) | -18.4640 (≤0.0001) | -18.4940 (≤0.0001)
USD/EUR | 6 | ICEEMDAN-LSSVR | | 1.9379 (0.0529) | -7.0858 (≤0.0001) | -17.4490 (≤0.0001) | -17.5970 (≤0.0001) | -17.6370 (≤0.0001)
USD/EUR | 6 | ICEEMDAN-BPNN | | | -7.1244 (≤0.0001) | -17.5850 (≤0.0001) | -17.7340 (≤0.0001) | -17.7690 (≤0.0001)
USD/EUR | 6 | ICEEMDAN-RW | | | | 2.5399 (0.0112) | 2.6203 (0.0089) | 2.6207 (0.0089)
USD/EUR | 6 | LSSVR | | | | | 2.2946 (0.0220) | 2.5913 (0.0097)
USD/EUR | 6 | BPNN | | | | | | -0.2090 (0.8345)
IP | 1 | MICEEMDAN-WOA-RVFL | -3.4304 (0.0007) | -2.4253 (0.0098) | -3.4369 (0.0007) | -3.5180 (0.0005) | -4.1486 (≤0.0001) | -3.3215 (0.0015)
IP | 1 | ICEEMDAN-LSSVR | | -0.6714 (0.5026) | -3.2572 (0.0013) | -3.3188 (0.0010) | -4.0082 (≤0.0001) | -2.9573 (0.0034)
IP | 1 | ICEEMDAN-BPNN | | | -3.6265 (0.0004) | -3.7455 (0.0002) | -4.6662 (≤0.0001) | -3.3790 (0.0008)
IP | 1 | ICEEMDAN-RW | | | | -0.7438 (0.4577) | -0.0502 (0.9600) | 2.6089 (0.0096)
IP | 1 | LSSVR | | | | | 0.4094 (0.6826) | 3.4803 (0.0006)
IP | 1 | BPNN | | | | | | 1.6021 (0.1104)
IP | 3 | MICEEMDAN-WOA-RVFL | -4.5687 (≤0.0001) | -3.7062 (0.0003) | -5.9993 (≤0.0001) | -6.8919 (≤0.0001) | -5.5008 (≤0.0001) | -5.8037 (≤0.0001)
IP | 3 | ICEEMDAN-LSSVR | | 1.1969 (0.2325) | -5.5286 (≤0.0001) | -6.6325 (≤0.0001) | -5.2260 (≤0.0001) | -5.3260 (≤0.0001)
IP | 3 | ICEEMDAN-BPNN | | | -5.7199 (≤0.0001) | -6.7722 (≤0.0001) | -5.2955 (≤0.0001) | -5.5099 (≤0.0001)
IP | 3 | ICEEMDAN-RW | | | | -5.6536 (≤0.0001) | -3.6721 (0.0003) | 1.7609 (0.0795)
IP | 3 | LSSVR | | | | | 0.2479 (0.8044) | 6.3126 (≤0.0001)
IP | 3 | BPNN | | | | | | 3.9699 (≤0.0001)
IP | 6 | MICEEMDAN-WOA-RVFL | -5.9152 (≤0.0001) | -3.5717 (0.0004) | -5.9430 (≤0.0001) | -7.5672 (≤0.0001) | -10.8900 (≤0.0001) | -5.8450 (≤0.0001)
IP | 6 | ICEEMDAN-LSSVR | | -2.5163 (0.0125) | -5.8465 (≤0.0001) | -7.5185 (≤0.0001) | -10.7140 (≤0.0001) | -5.7483 (≤0.0001)
IP | 6 | ICEEMDAN-BPNN | | | -5.6135 (≤0.0001) | -7.4370 (≤0.0001) | -10.3470 (≤0.0001) | -5.5254 (≤0.0001)
IP | 6 | ICEEMDAN-RW | | | | -7.6719 (≤0.0001) | -2.2903 (0.0229) | 0.8079 (0.4200)
IP | 6 | LSSVR | | | | | 3.8137 (0.0002) | 7.7861 (≤0.0001)
IP | 6 | BPNN | | | | | | 2.3351 (0.0204)
SSEC | 1 | MICEEMDAN-WOA-RVFL | -4.0255 (≤0.0001) | -3.7480 (0.0004) | -7.9407 (≤0.0001) | -10.8090 (≤0.0001) | -10.3070 (≤0.0001) | -10.7330 (≤0.0001)
SSEC | 1 | ICEEMDAN-LSSVR | | -1.5083 (0.1317) | -7.8406 (≤0.0001) | -10.5410 (≤0.0001) | -10.0710 (≤0.0001) | -10.5020 (≤0.0001)
SSEC | 1 | ICEEMDAN-BPNN | | | -7.7986 (≤0.0001) | -10.5470 (≤0.0001) | -10.0580 (≤0.0001) | -10.5030 (≤0.0001)
SSEC | 1 | ICEEMDAN-RW | | | | 3.5571 (0.0004) | 3.2632 (0.0011) | 3.5150 (0.0005)
SSEC | 1 | LSSVR | | | | | -1.0197 (0.3081) | -0.6740 (0.5004)
SSEC | 1 | BPNN | | | | | | 0.8078 (0.4193)
SSEC | 3 | MICEEMDAN-WOA-RVFL | -3.8915 (0.0001) | -3.7312 (0.0002) | -10.5890 (≤0.0001) | -10.5480 (≤0.0001) | -10.7470 (≤0.0001) | -10.1260 (≤0.0001)
SSEC | 3 | ICEEMDAN-LSSVR | | -2.4145 (0.0159) | -10.5830 (≤0.0001) | -10.5240 (≤0.0001) | -10.7760 (≤0.0001) | -10.1110 (≤0.0001)
SSEC | 3 | ICEEMDAN-BPNN | | | -10.2420 (≤0.0001) | -10.1860 (≤0.0001) | -10.4400 (≤0.0001) | -9.7595 (≤0.0001)
SSEC | 3 | ICEEMDAN-RW | | | | 3.4031 (0.0007) | 2.3198 (0.0205) | 3.1912 (0.0014)
SSEC | 3 | LSSVR | | | | | -0.5931 (0.5532) | -1.2284 (0.2194)
SSEC | 3 | BPNN | | | | | | -0.0042 (0.9966)
SSEC | 6 | MICEEMDAN-WOA-RVFL | -2.9349 (0.0034) | -3.7544 (0.0002) | -10.2390 (≤0.0001) | -10.1970 (≤0.0001) | -11.0150 (≤0.0001) | -10.1630 (≤0.0001)
SSEC | 6 | ICEEMDAN-LSSVR | | -2.5830 (0.0099) | -10.1390 (≤0.0001) | -10.0940 (≤0.0001) | -10.9630 (≤0.0001) | -10.0690 (≤0.0001)
SSEC | 6 | ICEEMDAN-BPNN | | | -9.6418 (≤0.0001) | -9.5562 (≤0.0001) | -10.4090 (≤0.0001) | -9.5122 (≤0.0001)
SSEC | 6 | ICEEMDAN-RW | | | | 2.3167 (0.0207) | 1.3656 (0.1723) | 2.2227 (0.0264)
SSEC | 6 | LSSVR | | | | | -1.9027 (0.0573) | -0.5814 (0.5611)
SSEC | 6 | BPNN | | | | | | 1.5707 (0.1165)

WOA optimization can significantly enhance the prediction accuracy of economic and financial time series forecasting.

5 Discussion

To better investigate the proposed MICEEMDAN-WOA-RVFL, we further discuss the developed prediction model in this section, including the comparison of single and multiple decompositions, the optimization effectiveness of WOA, and the impact of ensemble size.

5.1. Comparison of Single Decomposition and Multiple Decompositions. One of the main novelties of this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and improve the prediction accuracy and stability of the developed forecasting model. To evaluate its effectiveness, we compare the prediction results of MICEEMDAN-WOA-RVFL and ICEEMDAN-WOA-RVFL. The former ensembles the prediction results of M (M = 100) individual ICEEMDAN decompositions with random parameters, while the latter employs only one ICEEMDAN decomposition. We randomly choose 5 out of these 100 decompositions and execute ICEEMDAN-WOA-RVFL for time series forecasting. Tables 7–9 report the MAPE, RMSE, and Dstat values of MICEEMDAN-WOA-RVFL and of the five single-decomposition ICEEMDAN-WOA-RVFL models, together with the mean values of these five single-decomposition models.

On one hand, compared with the prediction results of the five single decompositions and their mean, the proposed MICEEMDAN-WOA-RVFL achieves the lowest MAPE and the highest Dstat values in all 12 cases and the lowest RMSE values in 11 out of 12 cases, indicating that the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and improve the ensemble prediction accuracy.

On the other hand, we can find that the multiple decomposition strategy can greatly improve the stability of the prediction model. For example, the MAPE values of the five single-decomposition models with horizon 6 on the IP dataset range from 0.0034 to 0.1114, indicating that different single decompositions can produce considerably different prediction results. When we employ the multiple decomposition strategy, we overcome the randomness of a single decomposition and thus enhance prediction stability.

In summary, the experimental results suggest that the multiple decomposition strategy and prediction ensemble can effectively enhance prediction accuracy and stability. The main reasons for the improvement lie in three aspects: (1) the multiple decompositions reduce the randomness of a single decomposition and simultaneously generate groups of differentiated subseries; (2) predictions based on these groups of differentiated subseries yield diverse prediction results; and (3) the selection and ensemble of these diverse prediction results ensure both accuracy and diversity and thus improve the final ensemble prediction.
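The variance-reduction intuition behind point (3) can be illustrated with a toy simulation, under the simplifying assumption that the per-decomposition errors are roughly independent (in practice they are only partially so):

```python
import numpy as np

rng = np.random.default_rng(42)
truth = 1.0
# one random decomposition per trial vs. an ensemble of M = 100 per trial
single = truth + rng.normal(0.0, 0.1, size=1000)
multi = truth + rng.normal(0.0, 0.1, size=(1000, 100)).mean(axis=1)
# averaging M independent errors shrinks the spread by roughly sqrt(M)
print(round(float(single.std()), 4), round(float(multi.std()), 4))
```

With correlated decomposition errors the shrinkage is weaker than sqrt(M), which is why the selection step in point (3) still matters.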

5.2. The Optimization Effectiveness of WOA. When we use RVFL networks to construct predictors, a number of parameters need to be set in advance. In this study, WOA is introduced to search for the optimal parameter values of the RVFL predictors using its powerful optimization ability. To investigate the optimization effectiveness of WOA for parameter search, we compare the proposed MICEEMDAN-WOA-RVFL with MICEEMDAN-RVFL without WOA optimization. Following the literature [45], we fixed the number of hidden neurons Nhe = 100, the activation function Func = sigmoid, and the random type Rand = Gaussian in MICEEMDAN-RVFL. The MAPE, RMSE, and Dstat values are reported in Tables 10–12, respectively.
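For concreteness, here is a compact WOA sketch with the encircling, spiral bubble-net, and random-search phases. In the paper's setting the fitness would be an RVFL validation error over the parameter ranges in Table 2; we use a simple sphere function below, and the implementation details (greedy best update, boundary clipping) are our simplifications.

```python
import numpy as np

def woa_minimize(f, lb, ub, pop=30, maxgen=100, seed=1):
    # Compact whale optimization algorithm sketch.
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    X = lb + rng.random((pop, dim)) * (ub - lb)
    fits = np.array([f(x) for x in X])
    best, best_f = X[fits.argmin()].copy(), float(fits.min())
    for t in range(maxgen):
        a = 2.0 * (1.0 - t / maxgen)            # decreases linearly from 2 to 0
        for i in range(pop):
            A = 2.0 * a * rng.random() - a
            C = 2.0 * rng.random(dim)
            if rng.random() < 0.5:
                if abs(A) < 1.0:                # exploit: encircle the best whale
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                           # explore: move around a random whale
                    ref = X[rng.integers(pop)]
                    X[i] = ref - A * np.abs(C * ref - X[i])
            else:                               # spiral update around the best whale
                l = rng.uniform(-1.0, 1.0)
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lb, ub)
            fi = float(f(X[i]))
            if fi < best_f:
                best, best_f = X[i].copy(), fi
    return best, best_f
```

Mixed discrete parameters (e.g., the activation function choice in Table 2) can be handled by rounding the corresponding continuous coordinates inside the fitness function.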

On all four time series datasets, the prediction performance of the proposed MICEEMDAN-WOA-RVFL model in terms of MAPE and RMSE is better than or equal to that of the MICEEMDAN-RVFL model without WOA optimization in all 12 cases, except for the RMSE value with horizon 1 on the SSEC dataset, as listed in Tables 10 and 11. In addition, MICEEMDAN-WOA-RVFL obtains higher Dstat values in 10 out of 12 cases, as can be seen in Table 12. All these results indicate that WOA can effectively search for the optimal parameter settings of the RVFL networks, further improving the overall prediction performance.

5.3. The Impact of Ensemble Size. Previous research has demonstrated that an ensemble strategy using all individual prediction models is unlikely to work well and that the selection of individual prediction models contributes to improving ensemble prediction performance [57]. In this study, we sort all individual prediction models based on their past performance (RMSE values) and then select the top N percent as the ensemble to construct the ensemble prediction model. To further investigate the impact of ensemble size, we use different ensemble sizes (es = 10%, 20%, ..., 100%) to select the top N percent of individual forecasting models and conduct the one-step-ahead forecasting experiment on the four time series datasets. The results are shown in Figure 5.
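The select-then-average step can be sketched as follows (the function and argument names are ours):

```python
import numpy as np

def select_and_ensemble(preds, val_rmse, es=0.3):
    # preds: (M, T) array of predictions from M individual models.
    # val_rmse: (M,) past RMSE of each model; keep the top `es` fraction
    # (lowest RMSE = best) and average their predictions.
    preds = np.asarray(preds, float)
    val_rmse = np.asarray(val_rmse, float)
    k = max(1, int(round(len(val_rmse) * es)))
    top = np.argsort(val_rmse)[:k]
    return preds[top].mean(axis=0)
```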

We can see that MICEEMDAN-WOA-RVFL obtains the best forecasting performance on the WTI, IP, and SSEC datasets when the ensemble size es is in the range of 20%–40%, and on the USD/EUR dataset when the ensemble size es


Table 7: The mean absolute percent error (MAPE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.0036 | 0.0043 | 0.0045 | 0.0045 | 0.0041 | 0.0040 | 0.0043
WTI | 3 | 0.0080 | 0.0088 | 0.0084 | 0.0084 | 0.0082 | 0.0083 | 0.0084
WTI | 6 | 0.0113 | 0.0132 | 0.0115 | 0.0113 | 0.0118 | 0.0120 | 0.0112
USD/EUR | 1 | 0.0006 | 0.0011 | 0.0010 | 0.0008 | 0.0010 | 0.0008 | 0.0009
USD/EUR | 3 | 0.0015 | 0.0021 | 0.0021 | 0.0015 | 0.0017 | 0.0015 | 0.0018
USD/EUR | 6 | 0.0022 | 0.0033 | 0.0022 | 0.0023 | 0.0023 | 0.0024 | 0.0025
IP | 1 | 0.0012 | 0.0016 | 0.0016 | 0.0017 | 0.0017 | 0.0015 | 0.0016
IP | 3 | 0.0023 | 0.0024 | 0.0026 | 0.0030 | 0.0028 | 0.0024 | 0.0026
IP | 6 | 0.0032 | 0.0036 | 0.0034 | 0.1114 | 0.0036 | 0.0041 | 0.0252
SSEC | 1 | 0.0020 | 0.0024 | 0.0023 | 0.0026 | 0.0023 | 0.0023 | 0.0024
SSEC | 3 | 0.0044 | 0.0045 | 0.0046 | 0.0052 | 0.0045 | 0.0047 | 0.0047
SSEC | 6 | 0.0065 | 0.0067 | 0.0070 | 0.0085 | 0.0068 | 0.0075 | 0.0073

Table 8: The root mean squared error (RMSE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.2715 | 0.3316 | 0.3376 | 0.3352 | 0.3191 | 0.3148 | 0.3277
WTI | 3 | 0.5953 | 0.6557 | 0.6214 | 0.6200 | 0.6058 | 0.6133 | 0.6232
WTI | 6 | 0.8146 | 0.9490 | 0.8239 | 0.8199 | 0.8508 | 0.8597 | 0.8607
USD/EUR | 1 | 0.0009 | 0.0017 | 0.0017 | 0.0012 | 0.0015 | 0.0012 | 0.0015
USD/EUR | 3 | 0.0022 | 0.0031 | 0.0034 | 0.0023 | 0.0025 | 0.0023 | 0.0027
USD/EUR | 6 | 0.0033 | 0.0048 | 0.0033 | 0.0035 | 0.0035 | 0.0036 | 0.0037
IP | 1 | 0.2114 | 0.2894 | 0.3080 | 0.3656 | 0.3220 | 0.2589 | 0.3088
IP | 3 | 0.3875 | 0.4098 | 0.4507 | 0.4809 | 0.4776 | 0.4027 | 0.4443
IP | 6 | 0.5340 | 0.5502 | 0.5563 | 1.0081 | 0.5697 | 0.6466 | 0.6661
SSEC | 1 | 10.8115 | 11.5515 | 13.0728 | 16.7301 | 12.2465 | 14.3671 | 13.5936
SSEC | 3 | 22.2983 | 22.0603 | 22.8140 | 33.9937 | 23.9755 | 25.8004 | 25.7288
SSEC | 6 | 35.7201 | 37.0255 | 41.9033 | 55.5066 | 38.6140 | 48.2131 | 44.2525

Table 9: The directional statistic (Dstat) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.9381 | 0.9201 | 0.9161 | 0.9184 | 0.9323 | 0.9271 | 0.9228
WTI | 3 | 0.8576 | 0.8414 | 0.8548 | 0.8385 | 0.8553 | 0.8513 | 0.8483
WTI | 6 | 0.7737 | 0.7419 | 0.7703 | 0.7668 | 0.7627 | 0.7616 | 0.7607
USD/EUR | 1 | 0.9410 | 0.9008 | 0.9270 | 0.9204 | 0.9073 | 0.9157 | 0.9142
USD/EUR | 3 | 0.8502 | 0.8052 | 0.8418 | 0.8399 | 0.8296 | 0.8427 | 0.8318
USD/EUR | 6 | 0.7828 | 0.6826 | 0.7809 | 0.7819 | 0.7772 | 0.7706 | 0.7586
IP | 1 | 0.9256 | 0.8967 | 0.8926 | 0.9132 | 0.8967 | 0.9174 | 0.9033
IP | 3 | 0.8802 | 0.8760 | 0.8430 | 0.8223 | 0.8471 | 0.8802 | 0.8537
IP | 6 | 0.8141 | 0.8017 | 0.8017 | 0.7066 | 0.7851 | 0.8017 | 0.7794
SSEC | 1 | 0.9156 | 0.9052 | 0.9052 | 0.9052 | 0.9087 | 0.9135 | 0.9076
SSEC | 3 | 0.8396 | 0.8222 | 0.8292 | 0.8208 | 0.8368 | 0.8264 | 0.8271
SSEC | 6 | 0.7587 | 0.7462 | 0.7441 | 0.7134 | 0.7594 | 0.7448 | 0.7416

Table 10: The mean absolute percent error (MAPE) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.0036 | 0.0037
WTI | 3 | 0.0080 | 0.0082
WTI | 6 | 0.0113 | 0.0118
USD/EUR | 1 | 0.0006 | 0.0006
USD/EUR | 3 | 0.0015 | 0.0015
USD/EUR | 6 | 0.0022 | 0.0023
IP | 1 | 0.0012 | 0.0013
IP | 3 | 0.0023 | 0.0023
IP | 6 | 0.0032 | 0.0036
SSEC | 1 | 0.0020 | 0.0020
SSEC | 3 | 0.0044 | 0.0044
SSEC | 6 | 0.0065 | 0.0066


Table 11: The root mean squared error (RMSE) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.2715 | 0.2824
WTI | 3 | 0.5953 | 0.6079
WTI | 6 | 0.8146 | 0.8485
USD/EUR | 1 | 0.0009 | 0.0010
USD/EUR | 3 | 0.0022 | 0.0022
USD/EUR | 6 | 0.0033 | 0.0034
IP | 1 | 0.2114 | 0.2362
IP | 3 | 0.3875 | 0.4033
IP | 6 | 0.5340 | 0.5531
SSEC | 1 | 10.8115 | 10.7616
SSEC | 3 | 22.2983 | 22.7456
SSEC | 6 | 35.7201 | 36.3150

Table 12: The directional statistic (Dstat) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.9381 | 0.9358
WTI | 3 | 0.8576 | 0.8565
WTI | 6 | 0.7737 | 0.7656
USD/EUR | 1 | 0.9410 | 0.9317
USD/EUR | 3 | 0.8502 | 0.8455
USD/EUR | 6 | 0.7828 | 0.7762
IP | 1 | 0.9256 | 0.9174
IP | 3 | 0.8802 | 0.8430
IP | 6 | 0.8141 | 0.7893
SSEC | 1 | 0.9156 | 0.9191
SSEC | 3 | 0.8396 | 0.8351
SSEC | 6 | 0.7587 | 0.7594

[Figure 5, panels (a)–(d): MAPE values on the WTI, USD/EUR, IP, and SSEC datasets for ensemble sizes from 10% to 100%; panels (e)–(f): RMSE values on the WTI and USD/EUR datasets. Figure 5 continued.]

14 Complexity

is in the range of 10–40 in terms of MAPE, RMSE, and Dstat. When the ensemble size is greater than 40, the MAPE and RMSE values continue to worsen and become the worst when the ensemble size grows to 100. The experimental results indicate that the ensemble size has an overall significant impact on ensemble prediction and that an ideal range of ensemble size is about 20 to 40.

6. Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. The MICEEMDAN-WOA-RVFL first uses ICEEMDAN to separate the original economic and financial time series into groups of subseries multiple times, and then RVFL networks are used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve the prediction accuracy. Thirdly, the predictions of the subseries in each decomposition are integrated into the forecasting results of that decomposition using addition. Finally, the prediction results of each decomposition are selected based on RMSE values and combined as the final prediction results.
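The final selection-and-ensemble step described above can be sketched as follows. This is a minimal illustration, assuming the per-decomposition forecasts are held in a NumPy array; the fraction of decompositions kept (`keep_ratio`) is an illustrative choice, as the paper's exact selection threshold is not stated in this excerpt.

```python
import numpy as np

def ensemble_forecast(decomp_preds, y_true, keep_ratio=0.5):
    """Select the lowest-RMSE decompositions and average their forecasts.

    decomp_preds: array of shape (n_decompositions, n_samples); each row is
    the aggregated forecast of one ICEEMDAN decomposition (the sum of its
    per-subseries RVFL forecasts). `keep_ratio` is a hypothetical parameter.
    """
    preds = np.asarray(decomp_preds, dtype=float)
    rmse = np.sqrt(((preds - y_true) ** 2).mean(axis=1))   # RMSE per decomposition
    n_keep = max(1, int(len(preds) * keep_ratio))
    best = np.argsort(rmse)[:n_keep]                       # lowest-RMSE decompositions
    return preds[best].mean(axis=0)                        # final ensemble forecast
```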

As far as we know, this is the first time that WOA has been employed for the optimal parameter search for RVFL networks and that the multiple decomposition strategy has been introduced in time series forecasting. The empirical results indicate that

(1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy in various economic and financial time series forecasting tasks; (2) WOA can effectively search optimal parameters for RVFL networks and improve the prediction performance of economic and financial time series forecasting; and (3) the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and enhance the prediction accuracy and stability of the developed prediction model.

We will extend our study in two aspects in the future: (1) applying the MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble method of individual forecasting models to further enhance the prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and

[Figure 5, panels (g)-(l): RMSE versus ensemble size (10 to 100) for IP (g) and SSEC (h), and Dstat versus ensemble size for WTI (i), USDEUR (j), IP (k), and SSEC (l).]

Figure 5: The impact of ensemble size with one-step-ahead forecasting.


the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365–374, 2017.

[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.

[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831–848, 2005.

[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192–196, IEEE, Warsaw, Poland, September 2005.

[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?" Journal of International Economics, vol. 60, no. 1, pp. 85–107, 2003.

[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7–18, 2014.

[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13–29, 2014.

[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331–350, 2010.

[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201–1208, 2008.

[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111–117, 2010.

[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1–9, 2013.

[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97–111, 2018.

[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28–39, 2014.

[14] X. Li, H. Xie, R. Wang et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67–78, 2016.

[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22–42, 2018.

[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457–480, 2018.

[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27–38, 2014.

[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284–308, 2014.

[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.

[20] S. Lahmiri, "A variational mode decomposition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268–273, 2016.

[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228–1244, 2013.

[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603–3628, 2019.

[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235–6250, 2014.

[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164–178, 2016.

[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754–762, 2016.

[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1–13, 2007.

[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903–995, 1998.

[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1–41, 2009.

[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144–4147, IEEE, Prague, Czech Republic, May 2011.


[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140–165, 2020.

[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882–1905, 2018.

[32] T. Li, M. Zhou, C. Guo et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.

[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19–29, 2014.

[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852–1872, 2020.

[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.

[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205–1225, 2019.

[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1–10, 2020.

[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1–15, 2018.

[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm, Multi-Objective Whale Optimization Algorithm, for wind speed forecasting," Applied Energy, vol. 208, pp. 344–360, 2017.

[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250–260, 2019.

[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163–180, 1994.

[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094–1105, 2016.

[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078–1093, 2016.

[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097–1108, 2018.

[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319–340, 2019.

[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5–24, 2020.

[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.

[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983–994, 2020.

[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49–58, 2019.

[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.

[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165–4172, 2019.

[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319–7327, 2020.

[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525–541, 2018.

[55] FRED website: https://fred.stlouisfed.org.

[56] NetEase website: http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.

[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31–53, 2006.


[Figure 4: The WTI crude oil price series and the corresponding decomposed subseries by ICEEMDAN (panels, top to bottom: crude oil prices, IMF1-IMF11, and the residue; the horizontal axis spans 2 January 1986 to 15 April 2020).]

Table 1: Samples of economic and financial time series

Dataset  Type     Set           Size  Date range
WTI      Daily    Sample set    8641  2 January 1986 to 15 April 2020
                  Training set  6912  2 January 1986 to 24 May 2013
                  Testing set   1729  28 May 2013 to 15 April 2020
USDEUR   Daily    Sample set    5341  4 January 1999 to 10 April 2020
                  Training set  4272  4 January 1999 to 30 December 2015
                  Testing set   1069  31 December 2015 to 10 April 2020
IP       Monthly  Sample set    1215  January 1919 to March 2020
                  Training set   972  January 1919 to December 1999
                  Testing set    243  January 2000 to March 2020
SSEC     Daily    Sample set    7172  19 December 1990 to 21 April 2020
                  Training set  5737  19 December 1990 to 4 June 2014
                  Testing set   1435  5 June 2014 to 21 April 2020
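The training/testing sizes in Table 1 are consistent with an 80/20 chronological split (e.g., WTI: 6912 of 8641 observations for training). A minimal sketch of such a split is below; the 0.8 fraction is inferred from the table sizes rather than stated explicitly in this excerpt.

```python
def chronological_split(series, train_frac=0.8):
    """Split a time series chronologically into training and testing sets.

    `train_frac` is an inferred (hypothetical) value that reproduces the
    sizes in Table 1 via integer truncation.
    """
    n_train = int(len(series) * train_frac)
    return series[:n_train], series[n_train:]
```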


demonstrating that it is better than LSSVR and BPNN for the forecasting of economic and financial time series. Of all the ensemble models, the ICEEMDAN-LSSVR model and the ICEEMDAN-BPNN model obtain close MAPE values, obviously better than those of the ICEEMDAN-RW.

The RMSE values of the four time series datasets are listed in Table 4. From this table, we can find that the proposed MICEEMDAN-WOA-RVFL outperforms all the single and ensemble models with all the horizons in all the datasets. The statistical model RW obtains the best RMSE values in 10 out of 12 cases, demonstrating that it is more suitable for economic and financial time series forecasting than LSSVR and BPNN among all the single models. As to the ensemble models, the proposed MICEEMDAN-WOA-RVFL obtains lower RMSE values than the compared ensemble models, demonstrating that the former is more effective for economic and financial time series forecasting.

Table 5 shows the directional statistic Dstat, and we can see that the MICEEMDAN-WOA-RVFL achieves the highest Dstat values in all the 12 cases, indicating that it has better performance in direction forecasting. Amongst the single prediction models, LSSVR and RW each obtain the best Dstat values in 5 cases, better than the BPNN. Similarly, the ICEEMDAN-LSSVR model and the ICEEMDAN-BPNN model obtain close Dstat values, obviously better than the ICEEMDAN-RW model in all the 12 cases.

From all the prediction results, we can find that all the ensemble prediction models except ICEEMDAN-RW greatly outperform the corresponding single prediction models in all the 12 cases, showing that the framework of decomposition and ensemble is an effective tool for improving the forecasting

Table 2: The settings for the parameters

ICEEMDAN
  Nsd = 0.2              Noise standard deviation
  Nr = 100               Number of realizations
  Maxsi = 5000           Maximum number of sifting iterations

LSSVR
  Rp ∈ {2^-10, 2^-9, ..., 2^11, 2^12}       Regularization parameter
  WidRBF ∈ {2^-10, 2^-9, ..., 2^11, 2^12}   Width of the RBF kernel

BPNN
  Nhe = 10               Number of hidden neurons
  Maxte = 1000           Maximum training epochs
  Lr = 0.0001            Learning rate

WOA
  Pop = 40               Population size
  Maxgen = 100           Maximum generation

MICEEMDAN-WOA-RVFL
  Nsd ∈ [0.01, 0.4]      Noise standard deviation in ICEEMDAN
  Nr ∈ [50, 500]         Number of realizations in ICEEMDAN
  Maxsi ∈ [2000, 8000]   Maximum number of sifting iterations in ICEEMDAN
  Nhe ∈ [5, 30]          Number of hidden neurons in RVFL
  Func ∈ {sigmoid, sine, hardlim, tribas, radbas, sign}   Activation function in RVFL
  Mod ∈ {1: regularized least squares, 2: Moore-Penrose pseudoinverse}   Mode in RVFL
  Lag ∈ [3, 20]          Lag in RVFL
  Bias ∈ {true, false}   Bias in RVFL
  Rand ∈ {1: Gaussian, 2: Uniform}   Random type in RVFL
  Scale ∈ [0.1, 1]       Scale value in RVFL
  ScaleMode ∈ {1, 2, 3}  Scale mode in RVFL (1: scale the features for all neurons; 2: scale the features for each hidden neuron; 3: scale the range of the randomization for uniform distribution)

Table 3: The mean absolute percent error (MAPE) values of different prediction models

Dataset  Horizon  MICEEMDAN-WOA-RVFL  RW      LSSVR   BPNN    ICEEMDAN-RW  ICEEMDAN-LSSVR  ICEEMDAN-BPNN
WTI      1        0.0036              0.0176  0.0177  0.0182  0.0200       0.0044          0.0055
WTI      3        0.0080              0.0307  0.0309  0.0320  0.0331       0.0092          0.0094
WTI      6        0.0113              0.0441  0.0449  0.0457  0.0458       0.1202          0.0130
USDEUR   1        0.0006              0.0034  0.0034  0.0034  0.0044       0.0010          0.0010
USDEUR   3        0.0015              0.0063  0.0063  0.0063  0.0071       0.0020          0.0021
USDEUR   6        0.0022              0.0087  0.0088  0.0087  0.0094       0.0032          0.0031
IP       1        0.0012              0.0049  0.0057  0.0056  0.0053       0.0024          0.0022
IP       3        0.0023              0.0102  0.0140  0.0123  0.0104       0.0033          0.0033
IP       6        0.0032              0.0183  0.0279  0.0235  0.0184       0.0043          0.0049
SSEC     1        0.0020              0.0096  0.0096  0.0097  0.0111       0.0026          0.0026
SSEC     3        0.0044              0.0178  0.0178  0.0179  0.0185       0.0047          0.0051
SSEC     6        0.0065              0.0259  0.0259  0.0269  0.0268       0.0072          0.0074


performance. Of all the compared models, the proposed MICEEMDAN-WOA-RVFL obtains the highest Dstat values and the lowest MAPE and RMSE values in all the time series datasets, showing that it is completely superior to the benchmark prediction models. Furthermore, for each prediction model, the MAPE and RMSE values increase while the Dstat values decrease with the horizon. This demonstrates that it is easier to forecast time series with a short horizon than with a long one. It is worth noting that the proposed MICEEMDAN-WOA-RVFL still achieves relatively good MAPE and RMSE values among the compared models as the horizon increases. For example, when the proposed model obtains a Dstat of 0.7737 with horizon 6 in the WTI dataset, it achieves relatively low MAPE (0.0113) and RMSE (0.8146) values, indicating that the prediction values are very close to the real values although the proposed model misses the direction in about 22.63% of cases. In other words, the proposed MICEEMDAN-WOA-RVFL can still achieve satisfactory forecasting accuracy with a long horizon.
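For reference, the three evaluation criteria used throughout (MAPE, RMSE, and Dstat) can be computed as sketched below. The exact formulas are not restated in this excerpt, so the directional statistic uses the standard direction-of-change definition as an assumption.

```python
import numpy as np

def mape(y, yhat):
    # mean absolute percent error
    y, yhat = np.asarray(y, float), np.asarray(yhat, float)
    return float(np.mean(np.abs((y - yhat) / y)))

def rmse(y, yhat):
    # root mean squared error
    y, yhat = np.asarray(y, float), np.asarray(yhat, float)
    return float(np.sqrt(np.mean((y - yhat) ** 2)))

def dstat(y, yhat):
    # fraction of steps where the predicted direction of change
    # matches the actual direction (common Dstat convention)
    y, yhat = np.asarray(y, float), np.asarray(yhat, float)
    actual = np.sign(np.diff(y))
    predicted = np.sign(yhat[1:] - y[:-1])
    return float(np.mean(actual == predicted))
```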

Furthermore, we can find that the multiple decomposition strategy does not improve the forecast for the RW model. One possible explanation is simply that RW assumes that the past movement or trend of a time series cannot be used to predict its future movement, and thus it cannot take advantage of the historical data and the diversity of multiple decomposition to make the future prediction. Therefore, when we aggregate the multiple RW model predictions of all the decomposed subseries, we just integrate several random predictions and thus cannot significantly improve the ensemble prediction results. In contrast, the LSSVR and BPNN, as well as RVFL, can fully use all the historical data and the diversity of multiple decomposition to make the future prediction. Specifically, the multiple decomposition using different parameters generates many groups of different decomposed subseries, and this diversity of decomposition can successfully overcome the randomness of one single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach.

In addition, the Diebold–Mariano (DM) test is utilized to evaluate whether the forecasting accuracy of the proposed MICEEMDAN-WOA-RVFL significantly outperforms those of the other compared models. Table 6 shows the statistics and p values (in brackets).

On one hand, the DM statistical values between the ensemble prediction models and their corresponding single predictors are much lower than zero, and the corresponding p values are almost equal to zero with all the horizons, except for the RW model, showing that the architecture of "decomposition and ensemble" contributes to greatly improving prediction accuracy and that the combination of ICEEMDAN and AI predictors is more effective for economic and financial time series forecasting.

On the other hand, the DM test results on the predictions of all the four time series datasets indicate that the MICEEMDAN-WOA-RVFL is significantly better than the single models and the other ensemble models with all the horizons, and the corresponding p values are much lower than 0.01 in all the cases.
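A minimal sketch of the DM test follows. The paper does not state its loss function or long-run variance estimator, so this version assumes squared-error loss, the asymptotic N(0,1) approximation, and rectangular autocovariance weights; negative statistics favour the first model, matching the sign convention of Table 6.

```python
import math
import numpy as np

def dm_test(e1, e2, h=1):
    """Diebold-Mariano test on two forecast-error series.

    Returns (DM statistic, two-sided p value). `h` is the forecast horizon,
    used for the number of autocovariance lags in the long-run variance.
    """
    d = np.asarray(e1, float) ** 2 - np.asarray(e2, float) ** 2  # loss differential
    n = len(d)
    dbar = d.mean()
    # long-run variance of dbar with h-1 autocovariance lags
    gamma = [np.mean((d[k:] - dbar) * (d[:n - k] - dbar)) for k in range(h)]
    var = (gamma[0] + 2.0 * sum(gamma[1:])) / n
    dm = dbar / math.sqrt(var)
    p = math.erfc(abs(dm) / math.sqrt(2.0))  # two-sided normal p value
    return dm, p
```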

In summary, the DM test results demonstrate that the combination of multiple ICEEMDANs, RVFL networks, and

Table 4: The root mean squared error (RMSE) values of different prediction models

Dataset  Horizon  MICEEMDAN-WOA-RVFL  RW        LSSVR     BPNN      ICEEMDAN-RW  ICEEMDAN-LSSVR  ICEEMDAN-BPNN
WTI      1        0.2715              1.3196    1.3228    1.3744    1.8134       0.3467          0.4078
WTI      3        0.5953              2.1784    2.1929    2.2867    2.5041       0.6754          0.7620
WTI      6        0.8146              3.0737    3.1271    3.1845    3.2348       0.8692          0.9462
USDEUR   1        0.0009              0.0051    0.0052    0.0052    0.0099       0.0015          0.0016
USDEUR   3        0.0022              0.0091    0.0092    0.0092    0.0129       0.0031          0.0031
USDEUR   6        0.0033              0.0127    0.0128    0.0127    0.0154       0.0047          0.0046
IP       1        0.2114              0.7528    0.8441    0.8230    0.8205       0.3861          0.4175
IP       3        0.3875              1.4059    1.8553    1.8311    1.4401       0.5533          0.5294
IP       6        0.5340              2.4459    3.6015    2.8116    2.4569       0.6408          0.8264
SSEC     1        10.8115             50.0742   49.8800   50.8431   63.3951      12.6460         13.7928
SSEC     3        22.2983             89.1834   88.5290   89.1781   93.2887      25.2526         29.7784
SSEC     6        35.7201             131.6386  131.3487  133.5647  137.2598     40.4357         49.6341

Table 5: The directional statistic (Dstat) values of different prediction models

Dataset  Horizon  MICEEMDAN-WOA-RVFL  RW      LSSVR   BPNN    ICEEMDAN-RW  ICEEMDAN-LSSVR  ICEEMDAN-BPNN
WTI      1        0.9381              0.4815  0.5243  0.5191  0.4977       0.9190          0.9097
WTI      3        0.8576              0.4873  0.5087  0.5012  0.4907       0.8300          0.8449
WTI      6        0.7737              0.5116  0.4902  0.4948  0.5185       0.7714          0.7575
USDEUR   1        0.9410              0.5019  0.4719  0.4897  0.5056       0.8998          0.9026
USDEUR   3        0.8502              0.4888  0.4897  0.4925  0.4916       0.8118          0.8024
USDEUR   6        0.7828              0.4991  0.5318  0.5253  0.5047       0.7023          0.7154
IP       1        0.9256              0.5579  0.5909  0.5785  0.5620       0.8636          0.8760
IP       3        0.8802              0.6983  0.5455  0.4876  0.6529       0.8430          0.8058
IP       6        0.8141              0.6198  0.5537  0.5661  0.6405       0.7355          0.7645
SSEC     1        0.9156              0.4944  0.5049  0.5098  0.4979       0.9024          0.9002
SSEC     3        0.8396              0.5042  0.5021  0.5007  0.4965       0.8222          0.8145
SSEC     6        0.7587              0.4833  0.5063  0.4965  0.4805       0.7448          0.7455


Table 6: Results of the Diebold–Mariano (DM) test (DM statistics, with p values in parentheses)

WTI, horizon 1:
  MICEEMDAN-WOA-RVFL vs. ICEEMDAN-LSSVR: -4.9081 (≤0.0001); vs. ICEEMDAN-BPNN: -7.0030 (≤0.0001); vs. ICEEMDAN-RW: -6.1452 (≤0.0001); vs. LSSVR: -10.7240 (≤0.0001); vs. BPNN: -10.3980 (≤0.0001); vs. RW: -10.6210 (≤0.0001)
  ICEEMDAN-LSSVR vs. ICEEMDAN-BPNN: -3.5925 (0.0003); vs. ICEEMDAN-RW: -6.0620 (≤0.0001); vs. LSSVR: -10.5640 (≤0.0001); vs. BPNN: -10.3680 (≤0.0001); vs. RW: -10.4660 (≤0.0001)
  ICEEMDAN-BPNN vs. ICEEMDAN-RW: -5.9718 (≤0.0001); vs. LSSVR: -10.2070 (≤0.0001); vs. BPNN: -9.9529 (≤0.0001); vs. RW: -10.0960 (≤0.0001)
  ICEEMDAN-RW vs. LSSVR: 3.0776 (0.0021); vs. BPNN: 2.7682 (0.0057); vs. RW: 3.0951 (0.0020)
  LSSVR vs. BPNN: -1.8480 (0.0648); vs. RW: 1.0058 (0.3147)
  BPNN vs. RW: 1.9928 (0.0464)

WTI, horizon 3:
  MICEEMDAN-WOA-RVFL vs. ICEEMDAN-LSSVR: -4.9136 (≤0.0001); vs. ICEEMDAN-BPNN: -3.6146 (0.0003); vs. ICEEMDAN-RW: -10.8660 (≤0.0001); vs. LSSVR: -16.5990 (≤0.0001); vs. BPNN: -15.6260 (≤0.0001); vs. RW: -16.6720 (≤0.0001)
  ICEEMDAN-LSSVR vs. ICEEMDAN-BPNN: -2.1143 (0.0346); vs. ICEEMDAN-RW: -10.7560 (≤0.0001); vs. LSSVR: -16.7020 (≤0.0001); vs. BPNN: -15.6780 (≤0.0001); vs. RW: -16.7760 (≤0.0001)
  ICEEMDAN-BPNN vs. ICEEMDAN-RW: -10.5250 (≤0.0001); vs. LSSVR: -16.2160 (≤0.0001); vs. BPNN: -15.3500 (≤0.0001); vs. RW: -16.2430 (≤0.0001)
  ICEEMDAN-RW vs. LSSVR: 3.0818 (0.0021); vs. BPNN: 2.0869 (0.0370); vs. RW: 3.2155 (0.0012)
  LSSVR vs. BPNN: -2.7750 (0.0056); vs. RW: 2.3836 (0.0173)
  BPNN vs. RW: 3.0743 (0.0021)

WTI, horizon 6:
  MICEEMDAN-WOA-RVFL vs. ICEEMDAN-LSSVR: -3.9302 (≤0.0001); vs. ICEEMDAN-BPNN: -5.1332 (≤0.0001); vs. ICEEMDAN-RW: -18.9130 (≤0.0001); vs. LSSVR: -19.0620 (≤0.0001); vs. BPNN: -21.9930 (≤0.0001); vs. RW: -19.5080 (≤0.0001)
  ICEEMDAN-LSSVR vs. ICEEMDAN-BPNN: -3.4653 (0.0005); vs. ICEEMDAN-RW: -18.9170 (≤0.0001); vs. LSSVR: -19.1680 (≤0.0001); vs. BPNN: -22.0120 (≤0.0001); vs. RW: -19.5420 (≤0.0001)
  ICEEMDAN-BPNN vs. ICEEMDAN-RW: -18.8380 (≤0.0001); vs. LSSVR: -19.1650 (≤0.0001); vs. BPNN: -21.9930 (≤0.0001); vs. RW: -19.5200 (≤0.0001)
  ICEEMDAN-RW vs. LSSVR: 2.6360 (0.0085); vs. BPNN: 0.5129 (0.6081); vs. RW: 4.1421 (≤0.0001)
  LSSVR vs. BPNN: -2.2124 (0.0271); vs. RW: 3.9198 (≤0.0001)
  BPNN vs. RW: 4.3545 (≤0.0001)

USDEUR, horizon 1:
  MICEEMDAN-WOA-RVFL vs. ICEEMDAN-LSSVR: -8.8860 (≤0.0001); vs. ICEEMDAN-BPNN: -8.6031 (≤0.0001); vs. ICEEMDAN-RW: -4.4893 (≤0.0001); vs. LSSVR: -15.4300 (≤0.0001); vs. BPNN: -15.4100 (≤0.0001); vs. RW: -15.3440 (≤0.0001)
  ICEEMDAN-LSSVR vs. ICEEMDAN-BPNN: -0.8819 (0.3780); vs. ICEEMDAN-RW: -4.4230 (≤0.0001); vs. LSSVR: -14.7210 (≤0.0001); vs. BPNN: -14.7320 (≤0.0001); vs. RW: -14.6280 (≤0.0001)
  ICEEMDAN-BPNN vs. ICEEMDAN-RW: -4.4158 (≤0.0001); vs. LSSVR: -14.6310 (≤0.0001); vs. BPNN: -14.6440 (≤0.0001); vs. RW: -14.5310 (≤0.0001)
  ICEEMDAN-RW vs. LSSVR: 3.3089 (0.0010); vs. BPNN: 3.2912 (0.0010); vs. RW: 3.3244 (0.0009)
  LSSVR vs. BPNN: -1.8732 (0.06132); vs. RW: 2.9472 (0.0033)
  BPNN vs. RW: 3.4378 (0.0006)

USDEUR, horizon 3:
  MICEEMDAN-WOA-RVFL vs. ICEEMDAN-LSSVR: -10.1370 (≤0.0001); vs. ICEEMDAN-BPNN: -7.9933 (≤0.0001); vs. ICEEMDAN-RW: -5.7359 (≤0.0001); vs. LSSVR: -18.3380 (≤0.0001); vs. BPNN: -18.6460 (≤0.0001); vs. RW: -18.2090 (≤0.0001)
  ICEEMDAN-LSSVR vs. ICEEMDAN-BPNN: -1.4797 (0.1392); vs. ICEEMDAN-RW: -5.5972 (≤0.0001); vs. LSSVR: -17.9030 (≤0.0001); vs. BPNN: -18.1920 (≤0.0001); vs. RW: -17.7730 (≤0.0001)
  ICEEMDAN-BPNN vs. ICEEMDAN-RW: -5.6039 (≤0.0001); vs. LSSVR: -17.8370 (≤0.0001); vs. BPNN: -18.1350 (≤0.0001); vs. RW: -17.6910 (≤0.0001)
  ICEEMDAN-RW vs. LSSVR: 3.0048 (0.0027); vs. BPNN: 2.9552 (0.0032); vs. RW: 3.0206 (0.0026)
  LSSVR vs. BPNN: -1.4077 (0.1595); vs. RW: 1.5242 (0.1278)
  BPNN vs. RW: 2.1240 (0.0339)

USDEUR, horizon 6:
  MICEEMDAN-WOA-RVFL vs. ICEEMDAN-LSSVR: -11.1070 (≤0.0001); vs. ICEEMDAN-BPNN: -10.0570 (≤0.0001); vs. ICEEMDAN-RW: -7.4594 (≤0.0001); vs. LSSVR: -18.3010 (≤0.0001); vs. BPNN: -18.4640 (≤0.0001); vs. RW: -18.4940 (≤0.0001)
  ICEEMDAN-LSSVR vs. ICEEMDAN-BPNN: 1.9379 (0.0529); vs. ICEEMDAN-RW: -7.0858 (≤0.0001); vs. LSSVR: -17.4490 (≤0.0001); vs. BPNN: -17.5970 (≤0.0001); vs. RW: -17.6370 (≤0.0001)
  ICEEMDAN-BPNN vs. ICEEMDAN-RW: -7.1244 (≤0.0001); vs. LSSVR: -17.5850 (≤0.0001); vs. BPNN: -17.7340 (≤0.0001); vs. RW: -17.7690 (≤0.0001)
  ICEEMDAN-RW vs. LSSVR: 2.5399 (0.0112); vs. BPNN: 2.6203 (0.0089); vs. RW: 2.6207 (0.0089)
  LSSVR vs. BPNN: 2.2946 (0.0220); vs. RW: 2.5913 (0.0097)
  BPNN vs. RW: -0.2090 (0.8345)

IP, horizon 1:
  MICEEMDAN-WOA-RVFL vs. ICEEMDAN-LSSVR: -3.4304 (0.0007); vs. ICEEMDAN-BPNN: -2.4253 (0.0098); vs. ICEEMDAN-RW: -3.4369 (0.0007); vs. LSSVR: -3.5180 (0.0005); vs. BPNN: -4.1486 (≤0.0001); vs. RW: -3.3215 (0.0015)
  ICEEMDAN-LSSVR vs. ICEEMDAN-BPNN: -0.6714 (0.5026); vs. ICEEMDAN-RW: -3.2572 (0.0013); vs. LSSVR: -3.3188 (0.0010); vs. BPNN: -4.0082 (≤0.0001); vs. RW: -2.9573 (0.0034)
  ICEEMDAN-BPNN vs. ICEEMDAN-RW: -3.6265 (0.0004); vs. LSSVR: -3.7455 (0.0002); vs. BPNN: -4.6662 (≤0.0001); vs. RW: -3.3790 (0.0008)
  ICEEMDAN-RW vs. LSSVR: -0.7438 (0.4577); vs. BPNN: -0.0502 (0.9600); vs. RW: 2.6089 (0.0096)
  LSSVR vs. BPNN: 0.4094 (0.6826); vs. RW: 3.4803 (0.0006)
  BPNN vs. RW: 1.6021 (0.1104)

IP, horizon 3:
  MICEEMDAN-WOA-RVFL vs. ICEEMDAN-LSSVR: -4.5687 (≤0.0001); vs. ICEEMDAN-BPNN: -3.7062 (0.0003); vs. ICEEMDAN-RW: -5.9993 (≤0.0001); vs. LSSVR: -6.8919 (≤0.0001); vs. BPNN: -5.5008 (≤0.0001); vs. RW: -5.8037 (≤0.0001)
  ICEEMDAN-LSSVR vs. ICEEMDAN-BPNN: 1.1969 (0.2325); vs. ICEEMDAN-RW: -5.5286 (≤0.0001); vs. LSSVR: -6.6325 (≤0.0001); vs. BPNN: -5.2260 (≤0.0001); vs. RW: -5.3260 (≤0.0001)
  ICEEMDAN-BPNN vs. ICEEMDAN-RW: -5.7199 (≤0.0001); vs. LSSVR: -6.7722 (≤0.0001); vs. BPNN: -5.2955 (≤0.0001); vs. RW: -5.5099 (≤0.0001)
  ICEEMDAN-RW vs. LSSVR: -5.6536 (≤0.0001); vs. BPNN: -3.6721 (0.0003); vs. RW: 1.7609 (0.0795)
  LSSVR vs. BPNN: 0.2479 (0.8044); vs. RW: 6.3126 (≤0.0001)
  BPNN vs. RW: 3.9699 (≤0.0001)

IP, horizon 6:
  MICEEMDAN-WOA-RVFL vs. ICEEMDAN-LSSVR: -5.9152 (≤0.0001); vs. ICEEMDAN-BPNN: -3.5717 (0.0004); vs. ICEEMDAN-RW: -5.9430 (≤0.0001); vs. LSSVR: -7.5672 (≤0.0001); vs. BPNN: -10.8900 (≤0.0001); vs. RW: -5.8450 (≤0.0001)
  ICEEMDAN-LSSVR vs. ICEEMDAN-BPNN: -2.5163 (0.0125); vs. ICEEMDAN-RW: -5.8465 (≤0.0001); vs. LSSVR: -7.5185 (≤0.0001); vs. BPNN: -10.7140 (≤0.0001); vs. RW: -5.7483 (≤0.0001)
  ICEEMDAN-BPNN vs. ICEEMDAN-RW: -5.6135 (≤0.0001); vs. LSSVR: -7.4370 (≤0.0001); vs. BPNN: -10.3470 (≤0.0001); vs. RW: -5.5254 (≤0.0001)
  ICEEMDAN-RW vs. LSSVR: -7.6719 (≤0.0001); vs. BPNN: -2.2903 (0.02287); vs. RW: 0.8079 (0.4200)
  LSSVR vs. BPNN: 3.8137 (0.0002); vs. RW: 7.7861 (≤0.0001)
  BPNN vs. RW: 2.3351 (0.0204)

SSEC, horizon 1:
  MICEEMDAN-WOA-RVFL vs. ICEEMDAN-LSSVR: -4.0255 (≤0.0001); vs. ICEEMDAN-BPNN: -3.748 (0.0004); vs. ICEEMDAN-RW: -7.9407 (≤0.0001); vs. LSSVR: -10.8090 (≤0.0001); vs. BPNN: -10.3070 (≤0.0001); vs. RW: -10.7330 (≤0.0001)
  ICEEMDAN-LSSVR vs. ICEEMDAN-BPNN: -1.5083 (0.1317); vs. ICEEMDAN-RW: -7.8406 (≤0.0001); vs. LSSVR: -10.5410 (≤0.0001); vs. BPNN: -10.0710 (≤0.0001); vs. RW: -10.5020 (≤0.0001)
  ICEEMDAN-BPNN vs. ICEEMDAN-RW: -7.7986 (≤0.0001); vs. LSSVR: -10.5470 (≤0.0001); vs. BPNN: -10.0580 (≤0.0001); vs. RW: -10.5030 (≤0.0001)
  ICEEMDAN-RW vs. LSSVR: 3.5571 (0.0004); vs. BPNN: 3.2632 (0.0011); vs. RW: 3.5150 (0.0005)
  LSSVR vs. BPNN: -1.0197 (0.3081); vs. RW: -0.6740 (0.5004)
  BPNN vs. RW: 0.8078 (0.4193)

SSEC, horizon 3:
  MICEEMDAN-WOA-RVFL vs. ICEEMDAN-LSSVR: -3.8915 (0.0001); vs. ICEEMDAN-BPNN: -3.7312 (0.0002); vs. ICEEMDAN-RW: -10.5890 (≤0.0001); vs. LSSVR: -10.5480 (≤0.0001); vs. BPNN: -10.7470 (≤0.0001); vs. RW: -10.1260 (≤0.0001)
  ICEEMDAN-LSSVR vs. ICEEMDAN-BPNN: -2.4145 (0.0159); vs. ICEEMDAN-RW: -10.5830 (≤0.0001); vs. LSSVR: -10.5240 (≤0.0001); vs. BPNN: -10.7760 (≤0.0001); vs. RW: -10.1110 (≤0.0001)
  ICEEMDAN-BPNN vs. ICEEMDAN-RW: -10.2420 (≤0.0001); vs. LSSVR: -10.1860 (≤0.0001); vs. BPNN: -10.4400 (≤0.0001); vs. RW: -9.7595 (≤0.0001)
  ICEEMDAN-RW vs. LSSVR: 3.4031 (0.0007); vs. BPNN: 2.3198 (0.0205); vs. RW: 3.1912 (0.0014)
  LSSVR vs. BPNN: -0.5931 (0.5532); vs. RW: -1.2284 (0.2194)
  BPNN vs. RW: -0.0042 (0.9966)

SSEC, horizon 6:
  MICEEMDAN-WOA-RVFL vs. ICEEMDAN-LSSVR: -2.9349 (0.0034); vs. ICEEMDAN-BPNN: -3.7544 (0.0002); vs. ICEEMDAN-RW: -10.2390 (≤0.0001); vs. LSSVR: -10.1970 (≤0.0001); vs. BPNN: -11.0150 (≤0.0001); vs. RW: -10.1630 (≤0.0001)
  ICEEMDAN-LSSVR vs. ICEEMDAN-BPNN: -2.5830 (0.0099); vs. ICEEMDAN-RW: -10.1390 (≤0.0001); vs. LSSVR: -10.0940 (≤0.0001); vs. BPNN: -10.9630 (≤0.0001); vs. RW: -10.0690 (≤0.0001)
  ICEEMDAN-BPNN vs. ICEEMDAN-RW: -9.6418 (≤0.0001); vs. LSSVR: -9.5562 (≤0.0001); vs. BPNN: -10.4090 (≤0.0001); vs. RW: -9.5122 (≤0.0001)
  ICEEMDAN-RW vs. LSSVR: 2.3167 (0.0207); vs. BPNN: 1.3656 (0.1723); vs. RW: 2.2227 (0.02639)
  LSSVR vs. BPNN: -1.9027 (0.05728); vs. RW: -0.5814 (0.5611)
  BPNN vs. RW: 1.5707 (0.1165)

WOA optimization can significantly enhance the prediction accuracy of economic and financial time series forecasting.

5. Discussion

To better investigate the proposed MICEEMDAN-WOA-RVFL, we further discuss the developed prediction model in this section, including the comparison of single decomposition and multiple decompositions, the optimization effectiveness of WOA, and the impact of ensemble size.

5.1. Comparison of Single Decomposition and Multiple Decompositions. One of the main novelties of this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and improve the prediction accuracy and stability of the developed forecasting model. To evaluate the effectiveness of the multiple decomposition strategy, we compare the prediction results of MICEEMDAN-WOA-RVFL and ICEEMDAN-WOA-RVFL. The former ensembles the prediction results of M (M = 100) individual ICEEMDAN decompositions with random parameters, while the latter only employs one ICEEMDAN decomposition. We randomly choose 5 out of these 100 decompositions and execute ICEEMDAN-WOA-RVFL for time series forecasting. Tables 7–9 report the MAPE, RMSE, and Dstat values of the MICEEMDAN-WOA-RVFL and the five ICEEMDAN-WOA-RVFL models using single decomposition, together with the corresponding mean values of these five single-decomposition models.

On one hand, compared with the prediction results of the five single decompositions and the mean prediction results, the proposed MICEEMDAN-WOA-RVFL achieves the lowest MAPE and the highest Dstat values in all the 12 cases, and the lowest RMSE values in 11 out of all the 12 cases, indicating that the multiple decomposition strategy can successfully overcome the randomness of single decomposition and improve the ensemble prediction accuracy.

On the other hand, we can find that the multiple decomposition strategy can greatly improve the stability of the prediction model. For example, the range of MAPE values of the five single decomposition models with horizon 6 in the IP time series dataset is from 0.0034 to 0.1114, indicating that different single decompositions can produce relatively great differences in prediction results. When we employ the multiple decomposition strategy, we can overcome the randomness of single decomposition and thus enhance prediction stability.

In summary, the experimental results suggest that the multiple decomposition strategy and prediction ensemble can effectively enhance prediction accuracy and stability. The main reasons for the prediction improvement lie in three aspects: (1) the multiple decomposition can reduce the randomness of one single decomposition and simultaneously generate groups of differential subseries; (2) predictions using these groups of differential subseries can achieve diverse prediction results; and (3) the selection and ensemble of these diverse prediction results can ensure both accuracy and diversity and thus improve the final ensemble prediction.
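The multiple-decomposition pipeline described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: `decompose` is a stand-in for ICEEMDAN with randomly drawn parameters (a moving-average trend/residual split keeps the sketch self-contained), and `forecast_subseries` is a stand-in for the WOA-tuned RVFL predictors.

```python
import random
import statistics

def decompose(series, window):
    """Stand-in for ICEEMDAN: split the series into a moving-average trend
    and a residual. A real ICEEMDAN would return several IMFs plus a residue."""
    trend = [statistics.mean(series[max(0, i - window + 1):i + 1])
             for i in range(len(series))]
    residual = [x - t for x, t in zip(series, trend)]
    return [trend, residual]

def forecast_subseries(sub):
    """Stand-in predictor (naive last-value forecast); the paper trains one
    WOA-optimized RVFL network per subseries instead."""
    return sub[-1]

def multi_decomposition_forecast(series, m=100, seed=0):
    """Run m decompositions with random parameters, forecast each subseries,
    sum the subseries forecasts within each decomposition (aggregation by
    addition), and average the m per-decomposition results."""
    rng = random.Random(seed)
    per_decomposition = []
    for _ in range(m):
        window = rng.randint(2, 6)  # randomly drawn decomposition parameter
        subs = decompose(series, window)
        per_decomposition.append(sum(forecast_subseries(s) for s in subs))
    return sum(per_decomposition) / m
```

In the full model the per-decomposition results are additionally filtered by past RMSE before averaging, as discussed in Section 5.3.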

5.2. The Optimization Effectiveness of WOA. When we use RVFL networks to construct predictors, a number of parameters need to be set in advance. In this study, WOA is introduced to search the optimal parameter values for RVFL predictors using its powerful optimization ability. To investigate the optimization effectiveness of WOA for parameter search, we compare the proposed MICEEMDAN-WOA-RVFL with MICEEMDAN-RVFL without WOA optimization. According to the literature [45], we fixed the number of hidden neurons Nhe = 100, the activation function Func = sigmoid, and the random type Rand = Gaussian in MICEEMDAN-RVFL. The MAPE, RMSE, and Dstat values are reported in Tables 10–12, respectively.
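The RVFL predictor being tuned can itself be stated compactly: the input-to-hidden weights and biases are generated randomly and never trained, the hidden (enhancement) features are concatenated with the original inputs through direct links, and only the output weights are learned in closed form by regularized least squares (the Mod = 1 option in Table 2). The following is a minimal sketch of that standard formulation [42, 43], not the authors' code; numpy is assumed, and the default hyperparameters shown are illustrative.

```python
import numpy as np

def train_rvfl(X, y, n_hidden=30, ridge=1e-6, seed=0):
    """Fit an RVFL network: hidden-layer weights are random and untrained;
    only the output weights are solved by regularized least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input-to-hidden weights
    b = rng.normal(size=n_hidden)                # random hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))       # sigmoid enhancement features
    D = np.hstack([X, H])                        # direct links: inputs + features
    beta = np.linalg.solve(D.T @ D + ridge * np.eye(D.shape[1]), D.T @ y)
    return W, b, beta

def predict_rvfl(model, X):
    W, b, beta = model
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.hstack([X, H]) @ beta
```

For one-step forecasting of a subseries, each row of X would hold the Lag most recent values and y the next value; the closed-form solve is what makes retraining cheap enough to sit inside a WOA search loop.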

In all the four time series datasets, the prediction performance of the proposed MICEEMDAN-WOA-RVFL model is better than or equal to that of the MICEEMDAN-RVFL model without WOA optimization in terms of MAPE and RMSE in all the 12 cases, except for the RMSE value with horizon 1 in the SSEC dataset, as listed in Tables 10 and 11. In addition, the MICEEMDAN-WOA-RVFL obtains higher Dstat values in 10 out of 12 cases, as can be seen in Table 12. All the results indicate that WOA can effectively search the optimal parameter settings for RVFL networks, further improving the overall prediction performance.
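The WOA search itself [38] is compact enough to sketch. The loop below follows the standard encircling, bubble-net spiral, and search-for-prey updates, with the coefficient a decreasing linearly from 2 to 0; in the paper the objective would be the validation RMSE of an RVFL network trained with the candidate parameter vector, which is stubbed out here with a simple quadratic so the sketch stays self-contained.

```python
import math
import random

def woa_minimize(objective, bounds, pop=40, max_gen=100, seed=0):
    """Whale Optimization Algorithm sketch: encircling prey, bubble-net
    spiral, and random search for prey (Mirjalili and Lewis, 2016)."""
    rng = random.Random(seed)
    dim = len(bounds)
    clip = lambda x: [min(max(v, lo), hi) for v, (lo, hi) in zip(x, bounds)]
    whales = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop)]
    best = min(whales, key=objective)
    for gen in range(max_gen):
        a = 2.0 - 2.0 * gen / max_gen            # decreases linearly 2 -> 0
        for i, x in enumerate(whales):
            A = [2 * a * rng.random() - a for _ in range(dim)]
            C = [2 * rng.random() for _ in range(dim)]
            if rng.random() < 0.5:
                if max(abs(v) for v in A) < 1:   # exploit: encircle best whale
                    ref = best
                else:                            # explore: move toward random whale
                    ref = whales[rng.randrange(pop)]
                x = [r - Ai * abs(Ci * r - xi)
                     for r, Ai, Ci, xi in zip(ref, A, C, x)]
            else:                                # bubble-net spiral around best
                l = rng.uniform(-1, 1)
                x = [abs(r - xi) * math.exp(l) * math.cos(2 * math.pi * l) + r
                     for r, xi in zip(best, x)]
            whales[i] = clip(x)
        cand = min(whales, key=objective)
        if objective(cand) < objective(best):
            best = cand
    return best
```

Plugging in an RVFL-training objective and the parameter bounds of Table 2 recovers the spirit of the MICEEMDAN-WOA-RVFL parameter search; mixed integer/categorical parameters (Func, Rand, Mod) would additionally need rounding or index encoding.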

5.3. The Impact of Ensemble Size. Previous research has demonstrated that the ensemble strategy of using all individual prediction models is unlikely to work well, and the selection of individual prediction models contributes to improving the ensemble prediction performance [57]. In this study, we sort all individual prediction models based on their past performance (RMSE values) and then select the top N percent as the ensemble size to construct the ensemble prediction model. To further investigate the impact of ensemble size on ensemble prediction, we use different ensemble sizes (es = 10, 20, ..., 100) to select the top N percent of individual forecasting models to develop the ensemble prediction model and conduct the one-step-ahead forecasting experiment on the four time series datasets. The results are demonstrated in Figure 5.
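The selection-and-ensemble step described above amounts to ranking the candidate models by past RMSE and averaging only the predictions of the top es percent. A minimal sketch, with predictions as plain lists and hypothetical inputs:

```python
def top_percent_ensemble(predictions, past_rmse, es=30):
    """Keep the top `es` percent of models (lowest past RMSE) and
    average their predictions pointwise."""
    ranked = sorted(range(len(predictions)), key=lambda i: past_rmse[i])
    k = max(1, round(len(predictions) * es / 100))  # ensemble size in models
    chosen = ranked[:k]
    return [sum(predictions[i][t] for i in chosen) / k
            for t in range(len(predictions[0]))]
```

Setting es = 100 reduces this to a plain average of all models, which is exactly the configuration Figure 5 shows to be worst.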

We can see that the MICEEMDAN-WOA-RVFL obtains the best forecasting performance in the WTI, IP, and SSEC datasets when the ensemble size es is in the range of 20–40, and in the USDEUR dataset when the ensemble size es


Table 7. The mean absolute percent error (MAPE) values of single decomposition and multiple decompositions.

Dataset Horizon MICEEMDAN-WOA-RVFL ICEEMDAN-WOA-RVFL1 ICEEMDAN-WOA-RVFL2 ICEEMDAN-WOA-RVFL3 ICEEMDAN-WOA-RVFL4 ICEEMDAN-WOA-RVFL5 Mean
WTI 1 0.0036 0.0043 0.0045 0.0045 0.0041 0.0040 0.0043
WTI 3 0.0080 0.0088 0.0084 0.0084 0.0082 0.0083 0.0084
WTI 6 0.0113 0.0132 0.0115 0.0113 0.0118 0.0120 0.0112
USDEUR 1 0.0006 0.0011 0.0010 0.0008 0.0010 0.0008 0.0009
USDEUR 3 0.0015 0.0021 0.0021 0.0015 0.0017 0.0015 0.0018
USDEUR 6 0.0022 0.0033 0.0022 0.0023 0.0023 0.0024 0.0025
IP 1 0.0012 0.0016 0.0016 0.0017 0.0017 0.0015 0.0016
IP 3 0.0023 0.0024 0.0026 0.0030 0.0028 0.0024 0.0026
IP 6 0.0032 0.0036 0.0034 0.1114 0.0036 0.0041 0.0252
SSEC 1 0.0020 0.0024 0.0023 0.0026 0.0023 0.0023 0.0024
SSEC 3 0.0044 0.0045 0.0046 0.0052 0.0045 0.0047 0.0047
SSEC 6 0.0065 0.0067 0.0070 0.0085 0.0068 0.0075 0.0073

Table 8. The root mean squared error (RMSE) values of single decomposition and multiple decompositions.

Dataset Horizon MICEEMDAN-WOA-RVFL ICEEMDAN-WOA-RVFL1 ICEEMDAN-WOA-RVFL2 ICEEMDAN-WOA-RVFL3 ICEEMDAN-WOA-RVFL4 ICEEMDAN-WOA-RVFL5 Mean
WTI 1 0.2715 0.3316 0.3376 0.3352 0.3191 0.3148 0.3277
WTI 3 0.5953 0.6557 0.6214 0.6200 0.6058 0.6133 0.6232
WTI 6 0.8146 0.9490 0.8239 0.8199 0.8508 0.8597 0.8607
USDEUR 1 0.0009 0.0017 0.0017 0.0012 0.0015 0.0012 0.0015
USDEUR 3 0.0022 0.0031 0.0034 0.0023 0.0025 0.0023 0.0027
USDEUR 6 0.0033 0.0048 0.0033 0.0035 0.0035 0.0036 0.0037
IP 1 0.2114 0.2894 0.3080 0.3656 0.3220 0.2589 0.3088
IP 3 0.3875 0.4098 0.4507 0.4809 0.4776 0.4027 0.4443
IP 6 0.5340 0.5502 0.5563 1.0081 0.5697 0.6466 0.6661
SSEC 1 10.8115 11.5515 13.0728 16.7301 12.2465 14.3671 13.5936
SSEC 3 22.2983 22.0603 22.8140 33.9937 23.9755 25.8004 25.7288
SSEC 6 35.7201 37.0255 41.9033 55.5066 38.6140 48.2131 44.2525

Table 9. The directional statistic (Dstat) values of single decomposition and multiple decompositions.

Dataset Horizon MICEEMDAN-WOA-RVFL ICEEMDAN-WOA-RVFL1 ICEEMDAN-WOA-RVFL2 ICEEMDAN-WOA-RVFL3 ICEEMDAN-WOA-RVFL4 ICEEMDAN-WOA-RVFL5 Mean
WTI 1 0.9381 0.9201 0.9161 0.9184 0.9323 0.9271 0.9228
WTI 3 0.8576 0.8414 0.8548 0.8385 0.8553 0.8513 0.8483
WTI 6 0.7737 0.7419 0.7703 0.7668 0.7627 0.7616 0.7607
USDEUR 1 0.9410 0.9008 0.9270 0.9204 0.9073 0.9157 0.9142
USDEUR 3 0.8502 0.8052 0.8418 0.8399 0.8296 0.8427 0.8318
USDEUR 6 0.7828 0.6826 0.7809 0.7819 0.7772 0.7706 0.7586
IP 1 0.9256 0.8967 0.8926 0.9132 0.8967 0.9174 0.9033
IP 3 0.8802 0.8760 0.8430 0.8223 0.8471 0.8802 0.8537
IP 6 0.8141 0.8017 0.8017 0.7066 0.7851 0.8017 0.7794
SSEC 1 0.9156 0.9052 0.9052 0.9052 0.9087 0.9135 0.9076
SSEC 3 0.8396 0.8222 0.8292 0.8208 0.8368 0.8264 0.8271
SSEC 6 0.7587 0.7462 0.7441 0.7134 0.7594 0.7448 0.7416

Table 10. The mean absolute percent error (MAPE) values with and without WOA optimization.

Dataset Horizon MICEEMDAN-WOA-RVFL MICEEMDAN-RVFL
WTI 1 0.0036 0.0037
WTI 3 0.0080 0.0082
WTI 6 0.0113 0.0118
USDEUR 1 0.0006 0.0006
USDEUR 3 0.0015 0.0015
USDEUR 6 0.0022 0.0023
IP 1 0.0012 0.0013
IP 3 0.0023 0.0023
IP 6 0.0032 0.0036
SSEC 1 0.0020 0.0020
SSEC 3 0.0044 0.0044
SSEC 6 0.0065 0.0066


Table 11. The root mean squared error (RMSE) values with and without WOA optimization.

Dataset Horizon MICEEMDAN-WOA-RVFL MICEEMDAN-RVFL
WTI 1 0.2715 0.2824
WTI 3 0.5953 0.6079
WTI 6 0.8146 0.8485
USDEUR 1 0.0009 0.0010
USDEUR 3 0.0022 0.0022
USDEUR 6 0.0033 0.0034
IP 1 0.2114 0.2362
IP 3 0.3875 0.4033
IP 6 0.5340 0.5531
SSEC 1 10.8115 10.7616
SSEC 3 22.2983 22.7456
SSEC 6 35.7201 36.3150

Table 12. The directional statistic (Dstat) values with and without WOA optimization.

Dataset Horizon MICEEMDAN-WOA-RVFL MICEEMDAN-RVFL
WTI 1 0.9381 0.9358
WTI 3 0.8576 0.8565
WTI 6 0.7737 0.7656
USDEUR 1 0.9410 0.9317
USDEUR 3 0.8502 0.8455
USDEUR 6 0.7828 0.7762
IP 1 0.9256 0.9174
IP 3 0.8802 0.8430
IP 6 0.8141 0.7893
SSEC 1 0.9156 0.9191
SSEC 3 0.8396 0.8351
SSEC 6 0.7587 0.7594

[Figure 5(a)–(f): MAPE versus ensemble size (%) for the WTI, USDEUR, IP, and SSEC datasets, and RMSE versus ensemble size (%) for the WTI and USDEUR datasets; ensemble sizes 10–100.]


is in the range of 10–40, in terms of MAPE, RMSE, and Dstat. When the es is greater than 40, the MAPE and RMSE values continue to worsen and become the worst when the ensemble size grows to 100. The experimental results indicate that the ensemble size has an overall significant impact on ensemble prediction, and an ideal range of ensemble size is about 20 to 40.

6. Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. The MICEEMDAN-WOA-RVFL first uses ICEEMDAN to separate the original economic and financial time series into groups of subseries multiple times, and then RVFL networks are used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve the prediction accuracy. Thirdly, the predictions of the subseries in each decomposition are integrated into the forecasting results of that decomposition using addition. Finally, the prediction results of each decomposition are selected based on RMSE values and are combined as the final prediction results.

As far as we know, it is the first time that WOA is employed for the optimal parameter search for RVFL networks and the multiple decomposition strategy is introduced in time series forecasting. The empirical results indicate that (1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy in various economic and financial time series forecasting; (2) WOA can effectively search optimal parameters for RVFL networks and improve the prediction performance of economic and financial time series forecasting; and (3) the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and enhance the prediction accuracy and stability of the developed prediction model.

We will extend our study in two aspects in the future: (1) applying the MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble method of individual forecasting models to further enhance the prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and

[Figure 5(g)–(l): RMSE versus ensemble size (%) for the IP and SSEC datasets, and Dstat versus ensemble size (%) for the WTI, USDEUR, IP, and SSEC datasets; ensemble sizes 10–100.]

Figure 5: The impact of ensemble size with one-step-ahead forecasting.


the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365–374, 2017.
[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.
[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831–848, 2005.
[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192–196, IEEE, Warsaw, Poland, September 2005.
[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?" Journal of International Economics, vol. 60, no. 1, pp. 85–107, 2003.
[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7–18, 2014.
[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13–29, 2014.
[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331–350, 2010.
[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201–1208, 2008.
[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111–117, 2010.
[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1–9, 2013.
[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97–111, 2018.
[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28–39, 2014.
[14] X. Li, H. Xie, R. Wang et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67–78, 2016.
[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22–42, 2018.
[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457–480, 2018.
[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27–38, 2014.
[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN, and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284–308, 2014.
[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.
[20] S. Lahmiri, "A variational mode decompoisition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268–273, 2016.
[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228–1244, 2013.
[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603–3628, 2019.
[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235–6250, 2014.
[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164–178, 2016.
[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754–762, 2016.
[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1–13, 2007.
[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903–995, 1998.
[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1–41, 2009.
[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144–4147, IEEE, Prague, Czech Republic, May 2011.
[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140–165, 2020.
[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882–1905, 2018.
[32] T. Li, M. Zhou, C. Guo et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.
[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19–29, 2014.
[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852–1872, 2020.
[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.
[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205–1225, 2019.
[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1–10, 2020.
[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.
[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1–15, 2018.
[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm-Multi-Objective Whale Optimization Algorithm for wind speed forecasting," Applied Energy, vol. 208, pp. 344–360, 2017.
[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250–260, 2019.
[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163–180, 1994.
[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094–1105, 2016.
[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078–1093, 2016.
[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097–1108, 2018.
[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319–340, 2019.
[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5–24, 2020.
[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.
[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983–994, 2020.
[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49–58, 2019.
[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.
[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165–4172, 2019.
[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319–7327, 2020.
[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525–541, 2018.
[55] FRED website, https://fred.stlouisfed.org.
[56] NetEase website, http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.
[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31–53, 2006.



demonstrating that it is better than LSSVR and BPNN for the forecasting of economic and financial time series. Of all the ensemble models, the ICEEMDAN-LSSVR model and the ICEEMDAN-BPNN model obtain close MAPE values, obviously better than the ICEEMDAN-RW.

The RMSE values of the four time series datasets are listed in Table 4. From this table, we can find that the proposed MICEEMDAN-WOA-RVFL outperforms all the single and ensemble models with all the horizons in all the datasets. The statistical model RW obtains the best RMSE values in 10 out of 12 cases among all the single models, demonstrating that it is more suitable for economic and financial time series forecasting than LSSVR and BPNN. As to the ensemble models, the proposed MICEEMDAN-WOA-RVFL obtains lower RMSE values than the compared ensemble models, demonstrating that the former is more effective for economic and financial time series forecasting.

Table 5 shows the directional statistic Dstat, and we can see that the MICEEMDAN-WOA-RVFL achieves the highest Dstat values in all the 12 cases, indicating that it has better performance in direction forecasting. Amongst the single prediction models, LSSVR and RW each obtain the best Dstat values in 5 cases, better than the BPNN. Similarly, the ICEEMDAN-LSSVR model and the ICEEMDAN-BPNN model obtain close Dstat values, obviously better than the ICEEMDAN-RW model in all the 12 cases.

From all the prediction results, we can find that all the ensemble prediction models except ICEEMDAN-RW greatly outperform the corresponding single prediction models in all the 12 cases, showing that the framework of decomposition and ensemble is an effective tool for improving the forecasting

Table 2. The settings for the parameters.

ICEEMDAN: Nsd = 0.2 (noise standard deviation); Nr = 100 (number of realizations); Maxsi = 5000 (maximum number of sifting iterations)
LSSVR: Rp ∈ {2^-10, 2^-9, ..., 2^11, 2^12} (regularization parameter); WidRBF ∈ {2^-10, 2^-9, ..., 2^11, 2^12} (width of the RBF kernel)
BPNN: Nhe = 10 (number of hidden neurons); Maxte = 1000 (maximum training epochs); Lr = 0.0001 (learning rate)
WOA: Pop = 40 (population size); Maxgen = 100 (maximum generation)
MICEEMDAN-WOA-RVFL: Nsd ∈ [0.01, 0.4] (noise standard deviation in ICEEMDAN); Nr ∈ [50, 500] (number of realizations in ICEEMDAN); Maxsi ∈ [2000, 8000] (maximum number of sifting iterations in ICEEMDAN); Nhe ∈ [5, 30] (number of hidden neurons in RVFL); Func ∈ {sigmoid, sine, hardlim, tribas, radbas, sign} (activation function in RVFL); Mod ∈ {1: regularized least square, 2: Moore-Penrose pseudoinverse} (mode in RVFL); Lag ∈ [3, 20] (lag in RVFL); Bias ∈ {true, false} (bias in RVFL); Rand ∈ {1: Gaussian, 2: Uniform} (random type in RVFL); Scale ∈ [0.1, 1] (scale value in RVFL); ScaleMode ∈ {1: scale the features for all neurons, 2: scale the features for each hidden neuron, 3: scale the range of the randomization for uniform distribution} (scale mode in RVFL)

Table 3. The mean absolute percent error (MAPE) values of different prediction models.

Dataset Horizon MICEEMDAN-WOA-RVFL RW LSSVR BPNN ICEEMDAN-RW ICEEMDAN-LSSVR ICEEMDAN-BPNN
WTI 1 0.0036 0.0176 0.0177 0.0182 0.0200 0.0044 0.0055
WTI 3 0.0080 0.0307 0.0309 0.0320 0.0331 0.0092 0.0094
WTI 6 0.0113 0.0441 0.0449 0.0457 0.0458 0.1202 0.0130
USDEUR 1 0.0006 0.0034 0.0034 0.0034 0.0044 0.0010 0.0010
USDEUR 3 0.0015 0.0063 0.0063 0.0063 0.0071 0.0020 0.0021
USDEUR 6 0.0022 0.0087 0.0088 0.0087 0.0094 0.0032 0.0031
IP 1 0.0012 0.0049 0.0057 0.0056 0.0053 0.0024 0.0022
IP 3 0.0023 0.0102 0.0140 0.0123 0.0104 0.0033 0.0033
IP 6 0.0032 0.0183 0.0279 0.0235 0.0184 0.0043 0.0049
SSEC 1 0.0020 0.0096 0.0096 0.0097 0.0111 0.0026 0.0026
SSEC 3 0.0044 0.0178 0.0178 0.0179 0.0185 0.0047 0.0051
SSEC 6 0.0065 0.0259 0.0259 0.0269 0.0268 0.0072 0.0074


performance. Of all the compared models, the proposed MICEEMDAN-WOA-RVFL obtains the highest Dstat values and the lowest MAPE and RMSE values in all the time series datasets, showing that it is completely superior to the benchmark prediction models. Furthermore, for each prediction model, the MAPE and RMSE values increase while the Dstat values decrease with the horizon. This demonstrates that it is easier to forecast time series with a short horizon than with a long one. It is worth noting that the proposed MICEEMDAN-WOA-RVFL still achieves relatively good MAPE and RMSE with the increase of horizon among the compared models. For example, when the proposed model obtains a 0.7737 Dstat with horizon 6 in the WTI dataset, it achieves relatively low MAPE (0.0113) and RMSE (0.8146) values, indicating that the prediction values are very close to the real values, although the proposed model misses direction in about 22.63% of cases. In other words, the proposed MICEEMDAN-WOA-RVFL can still achieve satisfactory forecasting accuracy with a long horizon.

Furthermore, we can find that the multiple decomposition strategy does not improve the forecast for the RW model. One possible explanation is simply that RW infers that the past movement or trend of a time series cannot be used to predict its future movement, and thus it cannot take advantage of the historical data and the diversity of multiple decomposition to make the future prediction. Therefore, when we aggregate the multiple RW model predictions of all the decomposed subseries, we just integrate several random predictions and thus cannot significantly improve the ensemble prediction results. In contrast, the LSSVR and BPNN, as well as RVFL, can fully use all the historical data and the diversity of multiple decomposition to make the future prediction. Specifically, the multiple decomposition using different parameters generates many groups of different decomposed subseries, and the diversity of decomposition can successfully overcome the randomness of one single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach.

In addition, the Diebold–Mariano (DM) test is utilized to evaluate whether the forecasting accuracy of the proposed MICEEMDAN-WOA-RVFL significantly outperforms those of the other compared models. Table 6 shows the statistics and p values (in brackets).
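For reference, the DM statistic can be computed in a few lines. The sketch below assumes squared-error loss, a Newey-West long-run variance with h − 1 lags for h-step forecasts, and the large-sample normal approximation for the p value; it illustrates the standard test, not necessarily the exact variant used in the paper.

```python
import math

def dm_test(errors1, errors2, h=1):
    """Diebold-Mariano test on two forecast-error series. A negative
    statistic means model 1 has the lower (better) squared-error loss.
    Returns (DM statistic, two-sided p value, normal approximation)."""
    d = [e1 ** 2 - e2 ** 2 for e1, e2 in zip(errors1, errors2)]  # loss differential
    T = len(d)
    mean_d = sum(d) / T
    # Newey-West long-run variance with h-1 autocovariance lags
    var = sum((x - mean_d) ** 2 for x in d) / T
    for lag in range(1, h):
        cov = sum((d[t] - mean_d) * (d[t - lag] - mean_d)
                  for t in range(lag, T)) / T
        var += 2 * cov
    dm = mean_d / math.sqrt(var / T)
    p = 2 * (1 - 0.5 * (1 + math.erf(abs(dm) / math.sqrt(2))))
    return dm, p
```

The strongly negative statistics in Table 6 for the MICEEMDAN-WOA-RVFL rows correspond to its loss differential being consistently below zero against every benchmark.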

On one hand, the DM statistical values between the ensemble prediction models and their corresponding single predictors are much lower than zero, and the corresponding p values are almost equal to zero with all the horizons except for the RW model, showing that the architecture of "decomposition and ensemble" contributes to greatly improving prediction accuracy and that the combination of ICEEMDAN and AI predictors is more effective for economic and financial time series forecasting.

On the other hand, the DM test results on the predictions of all the four time series datasets indicate that the MICEEMDAN-WOA-RVFL is significantly better than the single models and the other ensemble models with all the horizons, and the corresponding p values are much lower than 0.01 in all the cases.

In summary, the DM test results demonstrate that the combination of multiple ICEEMDANs, RVFL networks, and

Table 4. The root mean squared error (RMSE) values of different prediction models.

Dataset Horizon MICEEMDAN-WOA-RVFL RW LSSVR BPNN ICEEMDAN-RW ICEEMDAN-LSSVR ICEEMDAN-BPNN
WTI 1 0.2715 1.3196 1.3228 1.3744 1.8134 0.3467 0.4078
WTI 3 0.5953 2.1784 2.1929 2.2867 2.5041 0.6754 0.7620
WTI 6 0.8146 3.0737 3.1271 3.1845 3.2348 0.8692 0.9462
USDEUR 1 0.0009 0.0051 0.0052 0.0052 0.0099 0.0015 0.0016
USDEUR 3 0.0022 0.0091 0.0092 0.0092 0.0129 0.0031 0.0031
USDEUR 6 0.0033 0.0127 0.0128 0.0127 0.0154 0.0047 0.0046
IP 1 0.2114 0.7528 0.8441 0.8230 0.8205 0.3861 0.4175
IP 3 0.3875 1.4059 1.8553 1.8311 1.4401 0.5533 0.5294
IP 6 0.5340 2.4459 3.6015 2.8116 2.4569 0.6408 0.8264
SSEC 1 10.8115 50.0742 49.8800 50.8431 63.3951 12.6460 13.7928
SSEC 3 22.2983 89.1834 88.5290 89.1781 93.2887 25.2526 29.7784
SSEC 6 35.7201 131.6386 131.3487 133.5647 137.2598 40.4357 49.6341

Table 5. The directional statistic (Dstat) values of different prediction models.

Dataset Horizon MICEEMDAN-WOA-RVFL RW LSSVR BPNN ICEEMDAN-RW ICEEMDAN-LSSVR ICEEMDAN-BPNN
WTI 1 0.9381 0.4815 0.5243 0.5191 0.4977 0.9190 0.9097
WTI 3 0.8576 0.4873 0.5087 0.5012 0.4907 0.8300 0.8449
WTI 6 0.7737 0.5116 0.4902 0.4948 0.5185 0.7714 0.7575
USDEUR 1 0.9410 0.5019 0.4719 0.4897 0.5056 0.8998 0.9026
USDEUR 3 0.8502 0.4888 0.4897 0.4925 0.4916 0.8118 0.8024
USDEUR 6 0.7828 0.4991 0.5318 0.5253 0.5047 0.7023 0.7154
IP 1 0.9256 0.5579 0.5909 0.5785 0.5620 0.8636 0.8760
IP 3 0.8802 0.6983 0.5455 0.4876 0.6529 0.8430 0.8058
IP 6 0.8141 0.6198 0.5537 0.5661 0.6405 0.7355 0.7645
SSEC 1 0.9156 0.4944 0.5049 0.5098 0.4979 0.9024 0.9002
SSEC 3 0.8396 0.5042 0.5021 0.5007 0.4965 0.8222 0.8145
SSEC 6 0.7587 0.4833 0.5063 0.4965 0.4805 0.7448 0.7455


Table 6: Results of the Diebold–Mariano (DM) test. Each cell reports the DM statistic with the p value in parentheses; "–" marks cells left empty in the lower-triangular comparison layout.

Dataset | Horizon | Tested model | ICEEMDAN-LSSVR | ICEEMDAN-BPNN | ICEEMDAN-RW | LSSVR | BPNN | RW
WTI | 1 | MICEEMDAN-WOA-RVFL | -4.9081 (≤0.0001) | -7.0030 (≤0.0001) | -6.1452 (≤0.0001) | -10.7240 (≤0.0001) | -10.3980 (≤0.0001) | -10.6210 (≤0.0001)
WTI | 1 | ICEEMDAN-LSSVR | – | -3.5925 (0.0003) | -6.0620 (≤0.0001) | -10.5640 (≤0.0001) | -10.3680 (≤0.0001) | -10.4660 (≤0.0001)
WTI | 1 | ICEEMDAN-BPNN | – | – | -5.9718 (≤0.0001) | -10.2070 (≤0.0001) | -9.9529 (≤0.0001) | -10.0960 (≤0.0001)
WTI | 1 | ICEEMDAN-RW | – | – | – | 3.0776 (0.0021) | 2.7682 (0.0057) | 3.0951 (0.0020)
WTI | 1 | LSSVR | – | – | – | – | -1.8480 (0.0648) | 1.0058 (0.3147)
WTI | 1 | BPNN | – | – | – | – | – | 1.9928 (0.0464)
WTI | 3 | MICEEMDAN-WOA-RVFL | -4.9136 (≤0.0001) | -3.6146 (0.0003) | -10.8660 (≤0.0001) | -16.5990 (≤0.0001) | -15.6260 (≤0.0001) | -16.6720 (≤0.0001)
WTI | 3 | ICEEMDAN-LSSVR | – | -2.1143 (0.0346) | -10.7560 (≤0.0001) | -16.7020 (≤0.0001) | -15.6780 (≤0.0001) | -16.7760 (≤0.0001)
WTI | 3 | ICEEMDAN-BPNN | – | – | -10.5250 (≤0.0001) | -16.2160 (≤0.0001) | -15.3500 (≤0.0001) | -16.2430 (≤0.0001)
WTI | 3 | ICEEMDAN-RW | – | – | – | 3.0818 (0.0021) | 2.0869 (0.0370) | 3.2155 (0.0012)
WTI | 3 | LSSVR | – | – | – | – | -2.7750 (0.0056) | 2.3836 (0.0173)
WTI | 3 | BPNN | – | – | – | – | – | 3.0743 (0.0021)
WTI | 6 | MICEEMDAN-WOA-RVFL | -3.9302 (≤0.0001) | -5.1332 (≤0.0001) | -18.9130 (≤0.0001) | -19.0620 (≤0.0001) | -21.9930 (≤0.0001) | -19.5080 (≤0.0001)
WTI | 6 | ICEEMDAN-LSSVR | – | -3.4653 (0.0005) | -18.9170 (≤0.0001) | -19.1680 (≤0.0001) | -22.0120 (≤0.0001) | -19.5420 (≤0.0001)
WTI | 6 | ICEEMDAN-BPNN | – | – | -18.8380 (≤0.0001) | -19.1650 (≤0.0001) | -21.9930 (≤0.0001) | -19.5200 (≤0.0001)
WTI | 6 | ICEEMDAN-RW | – | – | – | 2.6360 (0.0085) | 0.5129 (0.6081) | 4.1421 (≤0.0001)
WTI | 6 | LSSVR | – | – | – | – | -2.2124 (0.0271) | 3.9198 (≤0.0001)
WTI | 6 | BPNN | – | – | – | – | – | 4.3545 (≤0.0001)
USDEUR | 1 | MICEEMDAN-WOA-RVFL | -8.8860 (≤0.0001) | -8.6031 (≤0.0001) | -4.4893 (≤0.0001) | -15.4300 (≤0.0001) | -15.4100 (≤0.0001) | -15.3440 (≤0.0001)
USDEUR | 1 | ICEEMDAN-LSSVR | – | -0.8819 (0.3780) | -4.4230 (≤0.0001) | -14.7210 (≤0.0001) | -14.7320 (≤0.0001) | -14.6280 (≤0.0001)
USDEUR | 1 | ICEEMDAN-BPNN | – | – | -4.4158 (≤0.0001) | -14.6310 (≤0.0001) | -14.6440 (≤0.0001) | -14.5310 (≤0.0001)
USDEUR | 1 | ICEEMDAN-RW | – | – | – | 3.3089 (0.0010) | 3.2912 (0.0010) | 3.3244 (0.0009)
USDEUR | 1 | LSSVR | – | – | – | – | -1.8732 (0.0613) | 2.9472 (0.0033)
USDEUR | 1 | BPNN | – | – | – | – | – | 3.4378 (0.0006)
USDEUR | 3 | MICEEMDAN-WOA-RVFL | -10.1370 (≤0.0001) | -7.9933 (≤0.0001) | -5.7359 (≤0.0001) | -18.3380 (≤0.0001) | -18.6460 (≤0.0001) | -18.2090 (≤0.0001)
USDEUR | 3 | ICEEMDAN-LSSVR | – | -1.4797 (0.1392) | -5.5972 (≤0.0001) | -17.9030 (≤0.0001) | -18.1920 (≤0.0001) | -17.7730 (≤0.0001)
USDEUR | 3 | ICEEMDAN-BPNN | – | – | -5.6039 (≤0.0001) | -17.8370 (≤0.0001) | -18.1350 (≤0.0001) | -17.6910 (≤0.0001)
USDEUR | 3 | ICEEMDAN-RW | – | – | – | 3.0048 (0.0027) | 2.9552 (0.0032) | 3.0206 (0.0026)
USDEUR | 3 | LSSVR | – | – | – | – | -1.4077 (0.1595) | 1.5242 (0.1278)
USDEUR | 3 | BPNN | – | – | – | – | – | 2.1240 (0.0339)
USDEUR | 6 | MICEEMDAN-WOA-RVFL | -11.1070 (≤0.0001) | -10.0570 (≤0.0001) | -7.4594 (≤0.0001) | -18.3010 (≤0.0001) | -18.4640 (≤0.0001) | -18.4940 (≤0.0001)
USDEUR | 6 | ICEEMDAN-LSSVR | – | 1.9379 (0.0529) | -7.0858 (≤0.0001) | -17.4490 (≤0.0001) | -17.5970 (≤0.0001) | -17.6370 (≤0.0001)
USDEUR | 6 | ICEEMDAN-BPNN | – | – | -7.1244 (≤0.0001) | -17.5850 (≤0.0001) | -17.7340 (≤0.0001) | -17.7690 (≤0.0001)
USDEUR | 6 | ICEEMDAN-RW | – | – | – | 2.5399 (0.0112) | 2.6203 (0.0089) | 2.6207 (0.0089)
USDEUR | 6 | LSSVR | – | – | – | – | 2.2946 (0.0220) | 2.5913 (0.0097)
USDEUR | 6 | BPNN | – | – | – | – | – | -0.2090 (0.8345)
IP | 1 | MICEEMDAN-WOA-RVFL | -3.4304 (0.0007) | -2.4253 (0.0098) | -3.4369 (0.0007) | -3.5180 (0.0005) | -4.1486 (≤0.0001) | -3.3215 (0.0009)
IP | 1 | ICEEMDAN-LSSVR | – | -0.6714 (0.5026) | -3.2572 (0.0013) | -3.3188 (0.0010) | -4.0082 (≤0.0001) | -2.9573 (0.0034)
IP | 1 | ICEEMDAN-BPNN | – | – | -3.6265 (0.0004) | -3.7455 (0.0002) | -4.6662 (≤0.0001) | -3.3790 (0.0008)
IP | 1 | ICEEMDAN-RW | – | – | – | -0.7438 (0.4577) | -0.0502 (0.9600) | 2.6089 (0.0096)
IP | 1 | LSSVR | – | – | – | – | 0.4094 (0.6826) | 3.4803 (0.0006)
IP | 1 | BPNN | – | – | – | – | – | 1.6021 (0.1104)
IP | 3 | MICEEMDAN-WOA-RVFL | -4.5687 (≤0.0001) | -3.7062 (0.0003) | -5.9993 (≤0.0001) | -6.8919 (≤0.0001) | -5.5008 (≤0.0001) | -5.8037 (≤0.0001)
IP | 3 | ICEEMDAN-LSSVR | – | 1.1969 (0.2325) | -5.5286 (≤0.0001) | -6.6325 (≤0.0001) | -5.2260 (≤0.0001) | -5.3260 (≤0.0001)
IP | 3 | ICEEMDAN-BPNN | – | – | -5.7199 (≤0.0001) | -6.7722 (≤0.0001) | -5.2955 (≤0.0001) | -5.5099 (≤0.0001)
IP | 3 | ICEEMDAN-RW | – | – | – | -5.6536 (≤0.0001) | -3.6721 (0.0003) | 1.7609 (0.0795)
IP | 3 | LSSVR | – | – | – | – | 0.2479 (0.8044) | 6.3126 (≤0.0001)
IP | 3 | BPNN | – | – | – | – | – | 3.9699 (≤0.0001)
IP | 6 | MICEEMDAN-WOA-RVFL | -5.9152 (≤0.0001) | -3.5717 (0.0004) | -5.9430 (≤0.0001) | -7.5672 (≤0.0001) | -10.8900 (≤0.0001) | -5.8450 (≤0.0001)
IP | 6 | ICEEMDAN-LSSVR | – | -2.5163 (0.0125) | -5.8465 (≤0.0001) | -7.5185 (≤0.0001) | -10.7140 (≤0.0001) | -5.7483 (≤0.0001)
IP | 6 | ICEEMDAN-BPNN | – | – | -5.6135 (≤0.0001) | -7.4370 (≤0.0001) | -10.3470 (≤0.0001) | -5.5254 (≤0.0001)
IP | 6 | ICEEMDAN-RW | – | – | – | -7.6719 (≤0.0001) | -2.2903 (0.0229) | 0.8079 (0.4200)
IP | 6 | LSSVR | – | – | – | – | 3.8137 (0.0002) | 7.7861 (≤0.0001)
IP | 6 | BPNN | – | – | – | – | – | 2.3351 (0.0204)
SSEC | 1 | MICEEMDAN-WOA-RVFL | -4.0255 (≤0.0001) | -3.7480 (0.0004) | -7.9407 (≤0.0001) | -10.8090 (≤0.0001) | -10.3070 (≤0.0001) | -10.7330 (≤0.0001)
SSEC | 1 | ICEEMDAN-LSSVR | – | -1.5083 (0.1317) | -7.8406 (≤0.0001) | -10.5410 (≤0.0001) | -10.0710 (≤0.0001) | -10.5020 (≤0.0001)
SSEC | 1 | ICEEMDAN-BPNN | – | – | -7.7986 (≤0.0001) | -10.5470 (≤0.0001) | -10.0580 (≤0.0001) | -10.5030 (≤0.0001)
SSEC | 1 | ICEEMDAN-RW | – | – | – | 3.5571 (0.0004) | 3.2632 (0.0011) | 3.5150 (0.0005)
SSEC | 1 | LSSVR | – | – | – | – | -1.0197 (0.3081) | -0.6740 (0.5004)
SSEC | 1 | BPNN | – | – | – | – | – | 0.8078 (0.4193)
SSEC | 3 | MICEEMDAN-WOA-RVFL | -3.8915 (0.0001) | -3.7312 (0.0002) | -10.5890 (≤0.0001) | -10.5480 (≤0.0001) | -10.7470 (≤0.0001) | -10.1260 (≤0.0001)
SSEC | 3 | ICEEMDAN-LSSVR | – | -2.4145 (0.0159) | -10.5830 (≤0.0001) | -10.5240 (≤0.0001) | -10.7760 (≤0.0001) | -10.1110 (≤0.0001)
SSEC | 3 | ICEEMDAN-BPNN | – | – | -10.2420 (≤0.0001) | -10.1860 (≤0.0001) | -10.4400 (≤0.0001) | -9.7595 (≤0.0001)
SSEC | 3 | ICEEMDAN-RW | – | – | – | 3.4031 (0.0007) | 2.3198 (0.0205) | 3.1912 (0.0014)
SSEC | 3 | LSSVR | – | – | – | – | -0.5931 (0.5532) | -1.2284 (0.2194)
SSEC | 3 | BPNN | – | – | – | – | – | -0.0042 (0.9966)
SSEC | 6 | MICEEMDAN-WOA-RVFL | -2.9349 (0.0034) | -3.7544 (0.0002) | -10.2390 (≤0.0001) | -10.1970 (≤0.0001) | -11.0150 (≤0.0001) | -10.1630 (≤0.0001)
SSEC | 6 | ICEEMDAN-LSSVR | – | -2.5830 (0.0099) | -10.1390 (≤0.0001) | -10.0940 (≤0.0001) | -10.9630 (≤0.0001) | -10.0690 (≤0.0001)
SSEC | 6 | ICEEMDAN-BPNN | – | – | -9.6418 (≤0.0001) | -9.5562 (≤0.0001) | -10.4090 (≤0.0001) | -9.5122 (≤0.0001)
SSEC | 6 | ICEEMDAN-RW | – | – | – | 2.3167 (0.0207) | 1.3656 (0.1723) | 2.2227 (0.0264)
SSEC | 6 | LSSVR | – | – | – | – | -1.9027 (0.0573) | -0.5814 (0.5611)
SSEC | 6 | BPNN | – | – | – | – | – | 1.5707 (0.1165)


WOA optimization can significantly enhance the prediction accuracy of economic and financial time series forecasting.

5. Discussion

To better investigate the proposed MICEEMDAN-WOA-RVFL, we further discuss the developed prediction model in this section, including the comparison of single decomposition and multiple decompositions, the optimization effectiveness of WOA, and the impact of ensemble size.

5.1. Comparison of Single Decomposition and Multiple Decompositions. One of the main novelties of this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and improve the prediction accuracy and stability of the developed forecasting model. To evaluate the effectiveness of the multiple decomposition strategy, we compare the prediction results of MICEEMDAN-WOA-RVFL and ICEEMDAN-WOA-RVFL. The former ensembles the prediction results of M (M = 100) individual ICEEMDAN decompositions with random parameters, while the latter employs only one ICEEMDAN decomposition. We randomly choose 5 out of these 100 decompositions and execute ICEEMDAN-WOA-RVFL for time series forecasting. Tables 7–9 report the MAPE, RMSE, and Dstat values of MICEEMDAN-WOA-RVFL and of the five single-decomposition ICEEMDAN-WOA-RVFL models, together with the mean values of these five single-decomposition models.

On one hand, compared with the prediction results of the five single decompositions and their mean, the proposed MICEEMDAN-WOA-RVFL achieves the lowest MAPE and the highest Dstat values in all 12 cases, and the lowest RMSE values in 11 out of the 12 cases, indicating that the multiple decomposition strategy can successfully overcome the randomness of single decomposition and improve the ensemble prediction accuracy.

On the other hand, we can find that the multiple decomposition strategy greatly improves the stability of the prediction model. For example, the MAPE values of the five single-decomposition models with horizon 6 on the IP time series dataset range from 0.0034 to 0.1114, indicating that different single decompositions can produce considerably different prediction results. When we employ the multiple decomposition strategy, we overcome the randomness of single decomposition and thus enhance prediction stability.

In summary, the experimental results suggest that the multiple decomposition strategy and prediction ensemble can effectively enhance prediction accuracy and stability. The main reasons for the improvement lie in three aspects: (1) the multiple decomposition reduces the randomness of any single decomposition and simultaneously generates groups of differential subseries; (2) predictions using these groups of differential subseries achieve diverse prediction results; and (3) the selection and ensemble of these diverse prediction results ensure both accuracy and diversity and thus improve the final ensemble prediction.
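The three-step logic above can be made concrete with a small structural sketch. Here `decompose` and `forecast` are deliberately simplified stand-ins (a randomized moving-average split and a naive last-value forecast) for the paper's ICEEMDAN runs and WOA-tuned RVFL predictors; only the multiple-decomposition ensemble structure is the point:

```python
import numpy as np

rng = np.random.default_rng(0)

def decompose(series, noise_std):
    # Stand-in for one ICEEMDAN run with randomized parameters: split the
    # series into a smoothed trend and a residual, with a randomized
    # window playing the role of the random noise amplitude. A real
    # ICEEMDAN would return several IMFs plus a residue.
    k = max(2, int(noise_std * 10))
    trend = np.convolve(series, np.ones(k) / k, mode="same")
    return [trend, series - trend]

def forecast(subseries):
    # Stand-in predictor: naive one-step-ahead forecast (last value).
    return subseries[-1]

def multi_decomposition_forecast(series, m=100):
    # Run m decompositions with random parameters, forecast each subseries
    # individually, sum within a decomposition, then average across the
    # m decompositions.
    preds = []
    for _ in range(m):
        subs = decompose(series, noise_std=rng.uniform(0.1, 0.5))
        preds.append(sum(forecast(s) for s in subs))
    return float(np.mean(preds))

series = np.sin(np.linspace(0.0, 4.0, 50))
print(multi_decomposition_forecast(series))
```

Because the stand-in subseries sum exactly back to the series and the naive forecasts are additive, this toy ensemble reduces to the last observed value; with real ICEEMDAN runs and trained predictors, the m decompositions genuinely differ, and averaging them is what smooths out single-decomposition randomness.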

5.2. The Optimization Effectiveness of WOA. When we use RVFL networks to construct predictors, a number of parameters need to be set in advance. In this study, WOA is introduced to search the optimal parameter values for the RVFL predictors using its powerful optimization ability. To investigate the optimization effectiveness of WOA for parameter search, we compare the proposed MICEEMDAN-WOA-RVFL with MICEEMDAN-RVFL without WOA optimization. Following the literature [45], we fixed the number of hidden neurons Nhe = 100, the activation function Func = sigmoid, and the random type Rand = Gaussian in MICEEMDAN-RVFL. The MAPE, RMSE, and Dstat values are reported in Tables 10–12, respectively.

In terms of MAPE and RMSE, on all four time series datasets the prediction performance of the proposed MICEEMDAN-WOA-RVFL model is better than or equal to that of the MICEEMDAN-RVFL model without WOA optimization in all 12 cases, except for the RMSE value with horizon 1 on the SSEC dataset, as listed in Tables 10 and 11. In addition, MICEEMDAN-WOA-RVFL obtains higher Dstat values in 10 out of 12 cases, as can be seen in Table 12. All these results indicate that WOA can effectively search the optimal parameter settings for RVFL networks, further improving the overall prediction performance.
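As a reference point for what WOA is tuning, a minimal RVFL network can be written in a few lines: random fixed hidden weights, direct input-to-output links, and a closed-form ridge solution for the output weights. This is a sketch with our own parameter names (`n_hidden`, `reg`), not the paper's implementation; in the paper, such hyperparameters are selected by WOA rather than fixed:

```python
import numpy as np

class RVFL:
    """Random vector functional link network (minimal sketch)."""

    def __init__(self, n_hidden=100, reg=1e-3, seed=0):
        self.n_hidden, self.reg = n_hidden, reg
        self.rng = np.random.default_rng(seed)

    def _features(self, X):
        # Sigmoid hidden activations plus direct links to the raw inputs.
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))
        return np.hstack([X, H])

    def fit(self, X, y):
        # Hidden weights are drawn once at random and never trained.
        self.W = self.rng.normal(size=(X.shape[1], self.n_hidden))
        self.b = self.rng.normal(size=self.n_hidden)
        D = self._features(X)
        # Output weights via ridge-regularized least squares (closed form).
        self.beta = np.linalg.solve(D.T @ D + self.reg * np.eye(D.shape[1]), D.T @ y)
        return self

    def predict(self, X):
        return self._features(X) @ self.beta

# One-step-ahead example: predict x[t] from the three previous values.
x = np.sin(np.linspace(0.0, 20.0, 300))
X, y = np.column_stack([x[:-3], x[1:-2], x[2:-1]]), x[3:]
model = RVFL(n_hidden=100, reg=1e-3, seed=0).fit(X, y)
print(model.predict(X[-1:]))
```

Because only the output layer is solved, training is non-iterative, which is what makes it cheap to wrap an outer optimizer such as WOA around the remaining hyperparameters.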

5.3. The Impact of Ensemble Size. Previous research has demonstrated that an ensemble strategy using all individual prediction models is unlikely to work well, and that the selection of individual prediction models contributes to improving the ensemble prediction performance [57]. In this study, we sort all individual prediction models based on their past performance (RMSE values) and then select the top N percent, the ensemble size, to construct the ensemble prediction model. To further investigate the impact of ensemble size on ensemble prediction, we use different ensemble sizes (es = 10, 20, ..., 100) to select the top N percent of individual forecasting models, develop the corresponding ensemble prediction models, and conduct the one-step-ahead forecasting experiment on the four time series datasets. The results are demonstrated in Figure 5.
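The sorting-and-selection rule described above can be sketched in a few lines (function and argument names are ours):

```python
import numpy as np

def select_and_ensemble(preds, val_rmse, es=30):
    # preds: (n_models, horizon) candidate predictions;
    # val_rmse: past RMSE of each model; es: ensemble size in percent.
    # Keep the top `es` percent of models (lowest RMSE) and average them.
    n_keep = max(1, int(round(len(preds) * es / 100.0)))
    best = np.argsort(val_rmse)[:n_keep]
    return np.asarray(preds)[best].mean(axis=0)

preds = np.array([[1.0], [2.0], [3.0], [10.0]])   # candidate forecasts
val_rmse = np.array([0.1, 0.2, 0.3, 5.0])         # past RMSE per model
print(select_and_ensemble(preds, val_rmse, es=50))  # averages the two best models
```

Setting es = 100 recovers the plain average of all models, which is the case the experiments below show to perform worst.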

We can see that MICEEMDAN-WOA-RVFL obtains the best forecasting performance on the WTI, IP, and SSEC datasets when the ensemble size es is in the range of 20–40, and on the USDEUR dataset when the ensemble size es


Table 7: The mean absolute percent error (MAPE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.0036 | 0.0043 | 0.0045 | 0.0045 | 0.0041 | 0.0040 | 0.0043
WTI | 3 | 0.0080 | 0.0088 | 0.0084 | 0.0084 | 0.0082 | 0.0083 | 0.0084
WTI | 6 | 0.0113 | 0.0132 | 0.0115 | 0.0113 | 0.0118 | 0.0120 | 0.0112
USDEUR | 1 | 0.0006 | 0.0011 | 0.0010 | 0.0008 | 0.0010 | 0.0008 | 0.0009
USDEUR | 3 | 0.0015 | 0.0021 | 0.0021 | 0.0015 | 0.0017 | 0.0015 | 0.0018
USDEUR | 6 | 0.0022 | 0.0033 | 0.0022 | 0.0023 | 0.0023 | 0.0024 | 0.0025
IP | 1 | 0.0012 | 0.0016 | 0.0016 | 0.0017 | 0.0017 | 0.0015 | 0.0016
IP | 3 | 0.0023 | 0.0024 | 0.0026 | 0.0030 | 0.0028 | 0.0024 | 0.0026
IP | 6 | 0.0032 | 0.0036 | 0.0034 | 0.1114 | 0.0036 | 0.0041 | 0.0252
SSEC | 1 | 0.0020 | 0.0024 | 0.0023 | 0.0026 | 0.0023 | 0.0023 | 0.0024
SSEC | 3 | 0.0044 | 0.0045 | 0.0046 | 0.0052 | 0.0045 | 0.0047 | 0.0047
SSEC | 6 | 0.0065 | 0.0067 | 0.0070 | 0.0085 | 0.0068 | 0.0075 | 0.0073

Table 8: The root mean squared error (RMSE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.2715 | 0.3316 | 0.3376 | 0.3352 | 0.3191 | 0.3148 | 0.3277
WTI | 3 | 0.5953 | 0.6557 | 0.6214 | 0.6200 | 0.6058 | 0.6133 | 0.6232
WTI | 6 | 0.8146 | 0.9490 | 0.8239 | 0.8199 | 0.8508 | 0.8597 | 0.8607
USDEUR | 1 | 0.0009 | 0.0017 | 0.0017 | 0.0012 | 0.0015 | 0.0012 | 0.0015
USDEUR | 3 | 0.0022 | 0.0031 | 0.0034 | 0.0023 | 0.0025 | 0.0023 | 0.0027
USDEUR | 6 | 0.0033 | 0.0048 | 0.0033 | 0.0035 | 0.0035 | 0.0036 | 0.0037
IP | 1 | 0.2114 | 0.2894 | 0.3080 | 0.3656 | 0.3220 | 0.2589 | 0.3088
IP | 3 | 0.3875 | 0.4098 | 0.4507 | 0.4809 | 0.4776 | 0.4027 | 0.4443
IP | 6 | 0.5340 | 0.5502 | 0.5563 | 1.0081 | 0.5697 | 0.6466 | 0.6661
SSEC | 1 | 10.8115 | 11.5515 | 13.0728 | 16.7301 | 12.2465 | 14.3671 | 13.5936
SSEC | 3 | 22.2983 | 22.0603 | 22.8140 | 33.9937 | 23.9755 | 25.8004 | 25.7288
SSEC | 6 | 35.7201 | 37.0255 | 41.9033 | 55.5066 | 38.6140 | 48.2131 | 44.2525

Table 9: The directional statistic (Dstat) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.9381 | 0.9201 | 0.9161 | 0.9184 | 0.9323 | 0.9271 | 0.9228
WTI | 3 | 0.8576 | 0.8414 | 0.8548 | 0.8385 | 0.8553 | 0.8513 | 0.8483
WTI | 6 | 0.7737 | 0.7419 | 0.7703 | 0.7668 | 0.7627 | 0.7616 | 0.7607
USDEUR | 1 | 0.9410 | 0.9008 | 0.9270 | 0.9204 | 0.9073 | 0.9157 | 0.9142
USDEUR | 3 | 0.8502 | 0.8052 | 0.8418 | 0.8399 | 0.8296 | 0.8427 | 0.8318
USDEUR | 6 | 0.7828 | 0.6826 | 0.7809 | 0.7819 | 0.7772 | 0.7706 | 0.7586
IP | 1 | 0.9256 | 0.8967 | 0.8926 | 0.9132 | 0.8967 | 0.9174 | 0.9033
IP | 3 | 0.8802 | 0.8760 | 0.8430 | 0.8223 | 0.8471 | 0.8802 | 0.8537
IP | 6 | 0.8141 | 0.8017 | 0.8017 | 0.7066 | 0.7851 | 0.8017 | 0.7794
SSEC | 1 | 0.9156 | 0.9052 | 0.9052 | 0.9052 | 0.9087 | 0.9135 | 0.9076
SSEC | 3 | 0.8396 | 0.8222 | 0.8292 | 0.8208 | 0.8368 | 0.8264 | 0.8271
SSEC | 6 | 0.7587 | 0.7462 | 0.7441 | 0.7134 | 0.7594 | 0.7448 | 0.7416

Table 10: The mean absolute percent error (MAPE) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.0036 | 0.0037
WTI | 3 | 0.0080 | 0.0082
WTI | 6 | 0.0113 | 0.0118
USDEUR | 1 | 0.0006 | 0.0006
USDEUR | 3 | 0.0015 | 0.0015
USDEUR | 6 | 0.0022 | 0.0023
IP | 1 | 0.0012 | 0.0013
IP | 3 | 0.0023 | 0.0023
IP | 6 | 0.0032 | 0.0036
SSEC | 1 | 0.0020 | 0.0020
SSEC | 3 | 0.0044 | 0.0044
SSEC | 6 | 0.0065 | 0.0066


Table 11: The root mean squared error (RMSE) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.2715 | 0.2824
WTI | 3 | 0.5953 | 0.6079
WTI | 6 | 0.8146 | 0.8485
USDEUR | 1 | 0.0009 | 0.0010
USDEUR | 3 | 0.0022 | 0.0022
USDEUR | 6 | 0.0033 | 0.0034
IP | 1 | 0.2114 | 0.2362
IP | 3 | 0.3875 | 0.4033
IP | 6 | 0.5340 | 0.5531
SSEC | 1 | 10.8115 | 10.7616
SSEC | 3 | 22.2983 | 22.7456
SSEC | 6 | 35.7201 | 36.3150

Table 12: The directional statistic (Dstat) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.9381 | 0.9358
WTI | 3 | 0.8576 | 0.8565
WTI | 6 | 0.7737 | 0.7656
USDEUR | 1 | 0.9410 | 0.9317
USDEUR | 3 | 0.8502 | 0.8455
USDEUR | 6 | 0.7828 | 0.7762
IP | 1 | 0.9256 | 0.9174
IP | 3 | 0.8802 | 0.8430
IP | 6 | 0.8141 | 0.7893
SSEC | 1 | 0.9156 | 0.9191
SSEC | 3 | 0.8396 | 0.8351
SSEC | 6 | 0.7587 | 0.7594

[Figure 5, panels (a)–(f): MAPE for WTI, USDEUR, IP, and SSEC, and RMSE for WTI and USDEUR, plotted against ensemble size (%) from 10 to 100.]


is in the range of 10–40, in terms of MAPE, RMSE, and Dstat. When es is greater than 40, the MAPE and RMSE values continue to worsen and become the worst when the ensemble size grows to 100. The experimental results indicate that the ensemble size has a significant overall impact on ensemble prediction and that an ideal range of ensemble size is about 20 to 40.

6. Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. MICEEMDAN-WOA-RVFL first uses ICEEMDAN to separate the original economic and financial time series into groups of subseries many times, and then RVFL networks are used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve the prediction accuracy. Thirdly, the predictions of the subseries in each decomposition are integrated into the forecasting results of that decomposition using addition. Finally, the prediction results of the individual decompositions are selected based on RMSE values and combined as the final prediction results.

As far as we know, this is the first time that WOA has been employed for optimal parameter search for RVFL networks and that the multiple decomposition strategy has been introduced into time series forecasting. The empirical results indicate that (1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy in various economic and financial time series forecasting tasks; (2) WOA can effectively search optimal parameters for RVFL networks and improve the prediction performance of economic and financial time series forecasting; and (3) the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and enhance the prediction accuracy and stability of the developed prediction model.

We will extend our study in two directions in the future: (1) applying MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble method of individual forecasting models to further enhance the prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and

[Figure 5, panels (g)–(l): RMSE for IP and SSEC, and Dstat for WTI, USDEUR, IP, and SSEC, plotted against ensemble size (%) from 10 to 100.]

Figure 5: The impact of ensemble size with one-step-ahead forecasting.


the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365–374, 2017.

[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.

[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831–848, 2005.

[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192–196, IEEE, Warsaw, Poland, September 2005.

[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?" Journal of International Economics, vol. 60, no. 1, pp. 85–107, 2003.

[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7–18, 2014.

[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13–29, 2014.

[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331–350, 2010.

[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201–1208, 2008.

[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111–117, 2010.

[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1–9, 2013.

[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97–111, 2018.

[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28–39, 2014.

[14] X. Li, H. Xie, R. Wang et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67–78, 2016.

[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22–42, 2018.

[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457–480, 2018.

[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27–38, 2014.

[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284–308, 2014.

[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.

[20] S. Lahmiri, "A variational mode decomposition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268–273, 2016.

[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228–1244, 2013.

[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603–3628, 2019.

[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235–6250, 2014.

[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164–178, 2016.

[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754–762, 2016.

[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1–13, 2007.

[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903–995, 1998.

[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1–41, 2009.

[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144–4147, IEEE, Prague, Czech Republic, May 2011.

[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140–165, 2020.

[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882–1905, 2018.

[32] T. Li, M. Zhou, C. Guo et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.

[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19–29, 2014.

[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852–1872, 2020.

[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.

[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205–1225, 2019.

[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1–10, 2020.

[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1–15, 2018.

[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm, Multi-Objective Whale Optimization Algorithm, for wind speed forecasting," Applied Energy, vol. 208, pp. 344–360, 2017.

[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250–260, 2019.

[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163–180, 1994.

[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094–1105, 2016.

[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078–1093, 2016.

[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097–1108, 2018.

[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319–340, 2019.

[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5–24, 2020.

[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.

[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983–994, 2020.

[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49–58, 2019.

[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.

[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165–4172, 2019.

[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319–7327, 2020.

[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525–541, 2018.

[55] FRED Website, https://fred.stlouisfed.org.

[56] NetEase Website, http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.

[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31–53, 2006.



performance. Of all the compared models, the proposed MICEEMDAN-WOA-RVFL obtains the highest Dstat values and the lowest MAPE and RMSE values on all the time series datasets, showing that it is completely superior to the benchmark prediction models. Furthermore, for each prediction model, the MAPE and RMSE values increase while the Dstat values decrease with the horizon. This demonstrates that it is easier to forecast time series with a short horizon than with a long one. It is worth noting that, among the compared models, the proposed MICEEMDAN-WOA-RVFL still achieves relatively good MAPE and RMSE values as the horizon increases. For example, when the proposed model obtains a Dstat of 0.7737 with horizon 6 on the WTI dataset, it achieves a relatively low MAPE (0.0113) and RMSE (0.8146), indicating that the predicted values are very close to the real values although the proposed model misses the direction in about 22.63% of cases. In other words, the proposed MICEEMDAN-WOA-RVFL can still achieve satisfactory forecasting accuracy with a long horizon.

Furthermore, we can find that the multiple decomposition strategy does not improve the forecast for the RW model. One possible explanation is simply that RW assumes that the past movement or trend of a time series cannot be used to predict its future movement, and thus it cannot take advantage of the historical data and the diversity of multiple decomposition to make the future prediction. Therefore, when we aggregate the multiple RW model predictions of all the decomposed subseries, we just integrate several random predictions and thus cannot significantly improve the ensemble prediction results. In contrast, the LSSVR and

BPNN, as well as RVFL, can fully use all the historical data and the diversity of multiple decomposition to make the future prediction. Specifically, the multiple decomposition using different parameters generates many groups of different decomposed subseries, and this diversity of decomposition can successfully overcome the randomness of one single decomposition and further improve the prediction accuracy and stability of the developed forecasting approach.

In addition, the Diebold–Mariano (DM) test is utilized to evaluate whether the forecasting accuracy of the proposed MICEEMDAN-WOA-RVFL significantly outperforms those of the other compared models. Table 6 shows the statistics and p values (in brackets).

On one hand, the DM statistics between the ensemble prediction models and their corresponding single predictors are much lower than zero, and the corresponding p values are almost equal to zero at all horizons except for the RW model, showing that the architecture of "decomposition and ensemble" contributes to greatly improving prediction accuracy and that the combination of ICEEMDAN and AI predictors is more effective for economic and financial time series forecasting.

On the other hand, the DM test results on the predictions of all four time series datasets indicate that the MICEEMDAN-WOA-RVFL is significantly better than the single models and the other ensemble models for all the horizons, and the corresponding p values are much lower than 0.01 in all the cases.
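The DM test behind Table 6 can be sketched as follows. This is a minimal squared-error-loss version with a simple autocovariance-based variance estimate; the function name and details are illustrative, not the authors' exact implementation.

```python
import numpy as np
from scipy import stats

def diebold_mariano(e1, e2, h=1):
    """DM test on two forecast-error series under squared-error loss.
    Negative statistic: model 1 has smaller loss than model 2."""
    d = np.asarray(e1, float) ** 2 - np.asarray(e2, float) ** 2
    T = len(d)
    d_bar = d.mean()
    # Autocovariances of the loss differential up to lag h-1 (HAC-style)
    gamma = [np.sum((d[k:] - d_bar) * (d[:T - k] - d_bar)) / T for k in range(h)]
    var_d = (gamma[0] + 2.0 * sum(gamma[1:])) / T
    dm = d_bar / np.sqrt(var_d)
    p = 2.0 * stats.norm.sf(abs(dm))  # two-sided p value (asymptotic normal)
    return dm, p
```

Reading Table 6, a large negative statistic with a near-zero p value means the row model forecasts significantly better than the column model.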

In summary, the DM test results demonstrate that the combination of multiple ICEEMDANs, RVFL networks, and

Table 4: The root mean squared error (RMSE) values of different prediction models.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN
WTI | 1 | 0.2715 | 1.3196 | 1.3228 | 1.3744 | 1.8134 | 0.3467 | 0.4078
WTI | 3 | 0.5953 | 2.1784 | 2.1929 | 2.2867 | 2.5041 | 0.6754 | 0.7620
WTI | 6 | 0.8146 | 3.0737 | 3.1271 | 3.1845 | 3.2348 | 0.8692 | 0.9462
USDEUR | 1 | 0.0009 | 0.0051 | 0.0052 | 0.0052 | 0.0099 | 0.0015 | 0.0016
USDEUR | 3 | 0.0022 | 0.0091 | 0.0092 | 0.0092 | 0.0129 | 0.0031 | 0.0031
USDEUR | 6 | 0.0033 | 0.0127 | 0.0128 | 0.0127 | 0.0154 | 0.0047 | 0.0046
IP | 1 | 0.2114 | 0.7528 | 0.8441 | 0.8230 | 0.8205 | 0.3861 | 0.4175
IP | 3 | 0.3875 | 1.4059 | 1.8553 | 1.8311 | 1.4401 | 0.5533 | 0.5294
IP | 6 | 0.5340 | 2.4459 | 3.6015 | 2.8116 | 2.4569 | 0.6408 | 0.8264
SSEC | 1 | 10.8115 | 50.0742 | 49.8800 | 50.8431 | 63.3951 | 12.6460 | 13.7928
SSEC | 3 | 22.2983 | 89.1834 | 88.5290 | 89.1781 | 93.2887 | 25.2526 | 29.7784
SSEC | 6 | 35.7201 | 131.6386 | 131.3487 | 133.5647 | 137.2598 | 40.4357 | 49.6341

Table 5: The directional statistic (Dstat) values of different prediction models.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | RW | LSSVR | BPNN | ICEEMDAN-RW | ICEEMDAN-LSSVR | ICEEMDAN-BPNN
WTI | 1 | 0.9381 | 0.4815 | 0.5243 | 0.5191 | 0.4977 | 0.9190 | 0.9097
WTI | 3 | 0.8576 | 0.4873 | 0.5087 | 0.5012 | 0.4907 | 0.8300 | 0.8449
WTI | 6 | 0.7737 | 0.5116 | 0.4902 | 0.4948 | 0.5185 | 0.7714 | 0.7575
USDEUR | 1 | 0.9410 | 0.5019 | 0.4719 | 0.4897 | 0.5056 | 0.8998 | 0.9026
USDEUR | 3 | 0.8502 | 0.4888 | 0.4897 | 0.4925 | 0.4916 | 0.8118 | 0.8024
USDEUR | 6 | 0.7828 | 0.4991 | 0.5318 | 0.5253 | 0.5047 | 0.7023 | 0.7154
IP | 1 | 0.9256 | 0.5579 | 0.5909 | 0.5785 | 0.5620 | 0.8636 | 0.8760
IP | 3 | 0.8802 | 0.6983 | 0.5455 | 0.4876 | 0.6529 | 0.8430 | 0.8058
IP | 6 | 0.8141 | 0.6198 | 0.5537 | 0.5661 | 0.6405 | 0.7355 | 0.7645
SSEC | 1 | 0.9156 | 0.4944 | 0.5049 | 0.5098 | 0.4979 | 0.9024 | 0.9002
SSEC | 3 | 0.8396 | 0.5042 | 0.5021 | 0.5007 | 0.4965 | 0.8222 | 0.8145
SSEC | 6 | 0.7587 | 0.4833 | 0.5063 | 0.4965 | 0.4805 | 0.7448 | 0.7455

Complexity 9

Table 6: Results of the Diebold–Mariano (DM) test. Each cell shows the DM statistic with its p value in brackets; columns give the model the row model is tested against.

Tested model | ICEEMDAN-LSSVR | ICEEMDAN-BPNN | ICEEMDAN-RW | LSSVR | BPNN | RW

WTI, horizon 1:
MICEEMDAN-WOA-RVFL | -4.9081 (≤0.0001) | -7.0030 (≤0.0001) | -6.1452 (≤0.0001) | -10.7240 (≤0.0001) | -10.3980 (≤0.0001) | -10.6210 (≤0.0001)
ICEEMDAN-LSSVR | — | -3.5925 (0.0003) | -6.0620 (≤0.0001) | -10.5640 (≤0.0001) | -10.3680 (≤0.0001) | -10.4660 (≤0.0001)
ICEEMDAN-BPNN | — | — | -5.9718 (≤0.0001) | -10.2070 (≤0.0001) | -9.9529 (≤0.0001) | -10.0960 (≤0.0001)
ICEEMDAN-RW | — | — | — | 3.0776 (0.0021) | 2.7682 (0.0057) | 3.0951 (0.0020)
LSSVR | — | — | — | — | -1.8480 (0.0648) | 1.0058 (0.3147)
BPNN | — | — | — | — | — | 1.9928 (0.0464)

WTI, horizon 3:
MICEEMDAN-WOA-RVFL | -4.9136 (≤0.0001) | -3.6146 (0.0003) | -10.8660 (≤0.0001) | -16.5990 (≤0.0001) | -15.6260 (≤0.0001) | -16.6720 (≤0.0001)
ICEEMDAN-LSSVR | — | -2.1143 (0.0346) | -10.7560 (≤0.0001) | -16.7020 (≤0.0001) | -15.6780 (≤0.0001) | -16.7760 (≤0.0001)
ICEEMDAN-BPNN | — | — | -10.5250 (≤0.0001) | -16.2160 (≤0.0001) | -15.3500 (≤0.0001) | -16.2430 (≤0.0001)
ICEEMDAN-RW | — | — | — | 3.0818 (0.0021) | 2.0869 (0.0370) | 3.2155 (0.0012)
LSSVR | — | — | — | — | -2.7750 (0.0056) | 2.3836 (0.0173)
BPNN | — | — | — | — | — | 3.0743 (0.0021)

WTI, horizon 6:
MICEEMDAN-WOA-RVFL | -3.9302 (≤0.0001) | -5.1332 (≤0.0001) | -18.9130 (≤0.0001) | -19.0620 (≤0.0001) | -21.9930 (≤0.0001) | -19.5080 (≤0.0001)
ICEEMDAN-LSSVR | — | -3.4653 (0.0005) | -18.9170 (≤0.0001) | -19.1680 (≤0.0001) | -22.0120 (≤0.0001) | -19.5420 (≤0.0001)
ICEEMDAN-BPNN | — | — | -18.8380 (≤0.0001) | -19.1650 (≤0.0001) | -21.9930 (≤0.0001) | -19.5200 (≤0.0001)
ICEEMDAN-RW | — | — | — | 2.6360 (0.0085) | 0.5129 (0.6081) | 4.1421 (≤0.0001)
LSSVR | — | — | — | — | -2.2124 (0.0271) | 3.9198 (≤0.0001)
BPNN | — | — | — | — | — | 4.3545 (≤0.0001)

USDEUR, horizon 1:
MICEEMDAN-WOA-RVFL | -8.8860 (≤0.0001) | -8.6031 (≤0.0001) | -4.4893 (≤0.0001) | -15.4300 (≤0.0001) | -15.4100 (≤0.0001) | -15.3440 (≤0.0001)
ICEEMDAN-LSSVR | — | -0.8819 (0.3780) | -4.4230 (≤0.0001) | -14.7210 (≤0.0001) | -14.7320 (≤0.0001) | -14.6280 (≤0.0001)
ICEEMDAN-BPNN | — | — | -4.4158 (≤0.0001) | -14.6310 (≤0.0001) | -14.6440 (≤0.0001) | -14.5310 (≤0.0001)
ICEEMDAN-RW | — | — | — | 3.3089 (0.0010) | 3.2912 (0.0010) | 3.3244 (0.0009)
LSSVR | — | — | — | — | -1.8732 (0.0613) | 2.9472 (0.0033)
BPNN | — | — | — | — | — | 3.4378 (0.0006)

USDEUR, horizon 3:
MICEEMDAN-WOA-RVFL | -10.1370 (≤0.0001) | -7.9933 (≤0.0001) | -5.7359 (≤0.0001) | -18.3380 (≤0.0001) | -18.6460 (≤0.0001) | -18.2090 (≤0.0001)
ICEEMDAN-LSSVR | — | -1.4797 (0.1392) | -5.5972 (≤0.0001) | -17.9030 (≤0.0001) | -18.1920 (≤0.0001) | -17.7730 (≤0.0001)
ICEEMDAN-BPNN | — | — | -5.6039 (≤0.0001) | -17.8370 (≤0.0001) | -18.1350 (≤0.0001) | -17.6910 (≤0.0001)
ICEEMDAN-RW | — | — | — | 3.0048 (0.0027) | 2.9552 (0.0032) | 3.0206 (0.0026)
LSSVR | — | — | — | — | -1.4077 (0.1595) | 1.5242 (0.1278)
BPNN | — | — | — | — | — | 2.1240 (0.0339)

USDEUR, horizon 6:
MICEEMDAN-WOA-RVFL | -11.1070 (≤0.0001) | -10.0570 (≤0.0001) | -7.4594 (≤0.0001) | -18.3010 (≤0.0001) | -18.4640 (≤0.0001) | -18.4940 (≤0.0001)
ICEEMDAN-LSSVR | — | 1.9379 (0.0529) | -7.0858 (≤0.0001) | -17.4490 (≤0.0001) | -17.5970 (≤0.0001) | -17.6370 (≤0.0001)
ICEEMDAN-BPNN | — | — | -7.1244 (≤0.0001) | -17.5850 (≤0.0001) | -17.7340 (≤0.0001) | -17.7690 (≤0.0001)
ICEEMDAN-RW | — | — | — | 2.5399 (0.0112) | 2.6203 (0.0089) | 2.6207 (0.0089)
LSSVR | — | — | — | — | 2.2946 (0.0220) | 2.5913 (0.0097)
BPNN | — | — | — | — | — | -0.2090 (0.8345)

IP, horizon 1:
MICEEMDAN-WOA-RVFL | -3.4304 (0.0007) | -2.4253 (0.0098) | -3.4369 (0.0007) | -3.5180 (0.0005) | -4.1486 (≤0.0001) | -3.3215 (0.0015)
ICEEMDAN-LSSVR | — | -0.6714 (0.5026) | -3.2572 (0.0013) | -3.3188 (0.0010) | -4.0082 (≤0.0001) | -2.9573 (0.0034)
ICEEMDAN-BPNN | — | — | -3.6265 (0.0004) | -3.7455 (0.0002) | -4.6662 (≤0.0001) | -3.3790 (0.0008)
ICEEMDAN-RW | — | — | — | -0.7438 (0.4577) | -0.0502 (0.9600) | 2.6089 (0.0096)
LSSVR | — | — | — | — | 0.4094 (0.6826) | 3.4803 (0.0006)
BPNN | — | — | — | — | — | 1.6021 (0.1104)

IP, horizon 3:
MICEEMDAN-WOA-RVFL | -4.5687 (≤0.0001) | -3.7062 (0.0003) | -5.9993 (≤0.0001) | -6.8919 (≤0.0001) | -5.5008 (≤0.0001) | -5.8037 (≤0.0001)
ICEEMDAN-LSSVR | — | 1.1969 (0.2325) | -5.5286 (≤0.0001) | -6.6325 (≤0.0001) | -5.2260 (≤0.0001) | -5.3260 (≤0.0001)
ICEEMDAN-BPNN | — | — | -5.7199 (≤0.0001) | -6.7722 (≤0.0001) | -5.2955 (≤0.0001) | -5.5099 (≤0.0001)
ICEEMDAN-RW | — | — | — | -5.6536 (≤0.0001) | -3.6721 (0.0003) | 1.7609 (0.0795)
LSSVR | — | — | — | — | 0.2479 (0.8044) | 6.3126 (≤0.0001)
BPNN | — | — | — | — | — | 3.9699 (≤0.0001)

IP, horizon 6:
MICEEMDAN-WOA-RVFL | -5.9152 (≤0.0001) | -3.5717 (0.0004) | -5.9430 (≤0.0001) | -7.5672 (≤0.0001) | -10.8900 (≤0.0001) | -5.8450 (≤0.0001)
ICEEMDAN-LSSVR | — | -2.5163 (0.0125) | -5.8465 (≤0.0001) | -7.5185 (≤0.0001) | -10.7140 (≤0.0001) | -5.7483 (≤0.0001)
ICEEMDAN-BPNN | — | — | -5.6135 (≤0.0001) | -7.4370 (≤0.0001) | -10.3470 (≤0.0001) | -5.5254 (≤0.0001)
ICEEMDAN-RW | — | — | — | -7.6719 (≤0.0001) | -2.2903 (0.0229) | 0.8079 (0.4200)
LSSVR | — | — | — | — | 3.8137 (0.0002) | 7.7861 (≤0.0001)
BPNN | — | — | — | — | — | 2.3351 (0.0204)

SSEC, horizon 1:
MICEEMDAN-WOA-RVFL | -4.0255 (≤0.0001) | -3.7480 (0.0004) | -7.9407 (≤0.0001) | -10.8090 (≤0.0001) | -10.3070 (≤0.0001) | -10.7330 (≤0.0001)
ICEEMDAN-LSSVR | — | -1.5083 (0.1317) | -7.8406 (≤0.0001) | -10.5410 (≤0.0001) | -10.0710 (≤0.0001) | -10.5020 (≤0.0001)
ICEEMDAN-BPNN | — | — | -7.7986 (≤0.0001) | -10.5470 (≤0.0001) | -10.0580 (≤0.0001) | -10.5030 (≤0.0001)
ICEEMDAN-RW | — | — | — | 3.5571 (0.0004) | 3.2632 (0.0011) | 3.5150 (0.0005)
LSSVR | — | — | — | — | -1.0197 (0.3081) | -0.6740 (0.5004)
BPNN | — | — | — | — | — | 0.8078 (0.4193)

SSEC, horizon 3:
MICEEMDAN-WOA-RVFL | -3.8915 (0.0001) | -3.7312 (0.0002) | -10.5890 (≤0.0001) | -10.5480 (≤0.0001) | -10.7470 (≤0.0001) | -10.1260 (≤0.0001)
ICEEMDAN-LSSVR | — | -2.4145 (0.0159) | -10.5830 (≤0.0001) | -10.5240 (≤0.0001) | -10.7760 (≤0.0001) | -10.1110 (≤0.0001)
ICEEMDAN-BPNN | — | — | -10.2420 (≤0.0001) | -10.1860 (≤0.0001) | -10.4400 (≤0.0001) | -9.7595 (≤0.0001)
ICEEMDAN-RW | — | — | — | 3.4031 (0.0007) | 2.3198 (0.0205) | 3.1912 (0.0014)
LSSVR | — | — | — | — | -0.5931 (0.5532) | -1.2284 (0.2194)
BPNN | — | — | — | — | — | -0.0042 (0.9966)

SSEC, horizon 6:
MICEEMDAN-WOA-RVFL | -2.9349 (0.0034) | -3.7544 (0.0002) | -10.2390 (≤0.0001) | -10.1970 (≤0.0001) | -11.0150 (≤0.0001) | -10.1630 (≤0.0001)
ICEEMDAN-LSSVR | — | -2.5830 (0.0099) | -10.1390 (≤0.0001) | -10.0940 (≤0.0001) | -10.9630 (≤0.0001) | -10.0690 (≤0.0001)
ICEEMDAN-BPNN | — | — | -9.6418 (≤0.0001) | -9.5562 (≤0.0001) | -10.4090 (≤0.0001) | -9.5122 (≤0.0001)
ICEEMDAN-RW | — | — | — | 2.3167 (0.0207) | 1.3656 (0.1723) | 2.2227 (0.0264)
LSSVR | — | — | — | — | -1.9027 (0.0573) | -0.5814 (0.5611)
BPNN | — | — | — | — | — | 1.5707 (0.1165)

WOA optimization can significantly enhance the prediction accuracy of economic and financial time series forecasting.

5. Discussion

To better investigate the proposed MICEEMDAN-WOA-RVFL, we further discuss the developed prediction model in this section, including the comparison of single decomposition and multiple decompositions, the optimization effectiveness of WOA, and the impact of ensemble size.

5.1. Comparison of Single Decomposition and Multiple Decompositions. One of the main novelties of this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and improve the prediction accuracy and stability of the developed forecasting model. To evaluate the effectiveness of the multiple decomposition strategy, we compare the prediction results of MICEEMDAN-WOA-RVFL and ICEEMDAN-WOA-RVFL. The former ensembles the prediction results of M (M = 100) individual ICEEMDAN decompositions with random parameters, while the latter only employs one ICEEMDAN decomposition. We randomly choose 5 out of these 100 decompositions and execute ICEEMDAN-WOA-RVFL for time series forecasting. Tables 7–9 report the MAPE, RMSE, and Dstat values of the MICEEMDAN-WOA-RVFL and the five single-decomposition ICEEMDAN-WOA-RVFL models, along with the mean values of these five single-decomposition models.

On one hand, compared with the prediction results of the five single decompositions and their mean, the proposed MICEEMDAN-WOA-RVFL achieves the lowest MAPE and the highest Dstat values in all 12 cases and the lowest RMSE values in 11 out of the 12 cases, indicating that the multiple decomposition strategy can successfully overcome the randomness of single decomposition and improve the ensemble prediction accuracy.

On the other hand, we can find that the multiple decomposition strategy can greatly improve the stability of the prediction model. For example, the MAPE values of the five single-decomposition models with horizon 6 on the IP time series dataset range from 0.0034 to 0.1114, indicating that different single decompositions can produce considerably different prediction results. When we employ the multiple decomposition strategy, we can overcome the randomness of single decomposition and thus enhance prediction stability.

In summary, the experimental results suggest that the multiple decomposition strategy and prediction ensemble can effectively enhance prediction accuracy and stability. The main reasons for the improvement lie in three aspects: (1) the multiple decompositions reduce the randomness of one single decomposition and simultaneously generate groups of differentiated subseries; (2) predictions based on these groups of differentiated subseries yield diverse prediction results; and (3) the selection and ensemble of these diverse prediction results ensure both accuracy and diversity and thus improve the final ensemble prediction.
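The multiple decomposition pipeline described above can be sketched as follows. Here `decompose` and `fit_predict` are hypothetical stand-ins for ICEEMDAN with random parameters and for the WOA-tuned RVFL predictor, respectively; the selection step ranks decompositions by past RMSE before averaging.

```python
import numpy as np

def multi_decomposition_forecast(series, decompose, fit_predict,
                                 M=100, top_frac=0.3, rng=None):
    """Run `decompose` M times (e.g., ICEEMDAN with random noise settings),
    forecast every subseries, sum the subseries forecasts within each
    decomposition, then average the decompositions with the lowest RMSE
    against the tail of the observed series."""
    rng = np.random.default_rng(rng)
    preds, scores = [], []
    for _ in range(M):
        subseries = decompose(series, rng)              # 1-D arrays summing to `series`
        pred = sum(fit_predict(s) for s in subseries)   # ensemble by simple addition
        preds.append(pred)
        scores.append(np.sqrt(np.mean((pred - series[-len(pred):]) ** 2)))
    top = np.argsort(scores)[: max(1, int(top_frac * M))]
    return np.mean([preds[i] for i in top], axis=0)
```

The key point is that each pass sees a different random decomposition, so the selected-and-averaged output is less sensitive to any single decomposition's quirks.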

5.2. The Optimization Effectiveness of WOA. When we use RVFL networks to construct predictors, a number of parameters need to be set in advance. In this study, WOA is introduced to search for the optimal parameter values of the RVFL predictors using its powerful optimization ability. To investigate the optimization effectiveness of WOA for parameter search, we compare the proposed MICEEMDAN-WOA-RVFL with MICEEMDAN-RVFL without WOA optimization. Following the literature [45], we fixed the number of hidden neurons Nhe = 100, the activation function Func = sigmoid, and the random type Rand = Gaussian in MICEEMDAN-RVFL. The MAPE, RMSE, and Dstat values are reported in Tables 10–12, respectively.
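A minimal RVFL network of the kind being tuned here can be sketched as follows, assuming the fixed configuration above (sigmoid activation, Gaussian random weights). The random hidden weights and biases stay fixed; only the output weights, over both the direct input link and the hidden features, are solved in closed form by ridge regression.

```python
import numpy as np

def rvfl_fit_predict(X_train, y_train, X_test, n_hidden=100, ridge=1e-6, seed=0):
    """Random vector functional link network: random sigmoid hidden layer
    plus a direct input-output link; output weights solved by ridge regression."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X_train.shape[1], n_hidden))  # fixed Gaussian weights
    b = rng.normal(size=n_hidden)                      # fixed Gaussian biases
    sigmoid = lambda Z: 1.0 / (1.0 + np.exp(-Z))
    # Design matrix = [direct link | hidden features]
    H_train = np.hstack([X_train, sigmoid(X_train @ W + b)])
    H_test = np.hstack([X_test, sigmoid(X_test @ W + b)])
    beta = np.linalg.solve(H_train.T @ H_train + ridge * np.eye(H_train.shape[1]),
                           H_train.T @ y_train)
    return H_test @ beta
```

WOA then only has to choose hyperparameters such as the number of hidden neurons, the activation function, and the weight distribution, since training itself is non-iterative.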

In terms of MAPE and RMSE, the prediction performance of the proposed MICEEMDAN-WOA-RVFL model on the four time series datasets is better than or equal to that of the MICEEMDAN-RVFL model without WOA optimization in all 12 cases, except for the RMSE value with horizon 1 on the SSEC dataset, as listed in Tables 10 and 11. In addition, the MICEEMDAN-WOA-RVFL obtains higher Dstat values in 10 out of 12 cases, as can be seen in Table 12. All these results indicate that WOA can effectively search for the optimal parameter settings of RVFL networks, further improving the overall prediction performance.
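The WOA used for this parameter search can be sketched on a generic objective as follows. This is a compact reading of Mirjalili and Lewis's algorithm (encircling prey, bubble-net spiral, and random search), written for minimizing any loss over box-bounded parameters; the per-dimension exploration test is one common implementation choice, not necessarily the authors' exact variant.

```python
import numpy as np

def woa_minimize(f, lb, ub, n_whales=20, n_iter=100, seed=0):
    """Whale optimization algorithm: minimize f over the box [lb, ub]."""
    rng = np.random.default_rng(seed)
    lb, ub = np.asarray(lb, float), np.asarray(ub, float)
    dim = lb.size
    X = rng.uniform(lb, ub, size=(n_whales, dim))
    best, best_f = X[0].copy(), np.inf
    for x in X:
        if f(x) < best_f:
            best, best_f = x.copy(), f(x)
    for t in range(n_iter):
        a = 2.0 * (1.0 - t / n_iter)                 # decreases linearly 2 -> 0
        for i in range(n_whales):
            r1, r2 = rng.random(dim), rng.random(dim)
            A, C = 2.0 * a * r1 - a, 2.0 * r2
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1.0):          # exploit: encircle the best
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                                # explore: move toward a random whale
                    rand = X[rng.integers(n_whales)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                                    # bubble-net spiral around the best
                l = rng.uniform(-1.0, 1.0)
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2.0 * np.pi * l) + best
            X[i] = np.clip(X[i], lb, ub)
            if f(X[i]) < best_f:
                best, best_f = X[i].copy(), f(X[i])
    return best
```

In the paper's setting, `f` would be a validation RMSE of an RVFL predictor as a function of its encoded hyperparameters.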

5.3. The Impact of Ensemble Size. Previous research has demonstrated that an ensemble strategy using all individual prediction models is unlikely to work well and that the selection of individual prediction models contributes to improving the ensemble prediction performance [57]. In this study, we sort all individual prediction models by their past performance (RMSE values) and then select the top N percent as the ensemble to construct the ensemble prediction model. To further investigate the impact of ensemble size on ensemble prediction, we use different ensemble sizes (es = 10, 20, ..., 100) to select the top N percent of individual forecasting models and conduct the one-step-ahead forecasting experiment on the four time series datasets. The results are shown in Figure 5.
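The top-N-percent selection step just described can be sketched as a few lines; the function name is illustrative.

```python
import numpy as np

def topn_ensemble(preds, past_rmse, es=30):
    """Average the predictions of the top `es` percent of individual models,
    ranked by their past RMSE (lower is better)."""
    preds = np.asarray(preds, float)                 # shape: (n_models, n_steps)
    k = max(1, int(round(len(preds) * es / 100.0)))  # ensemble size in models
    best = np.argsort(past_rmse)[:k]
    return preds[best].mean(axis=0)
```

Varying `es` from 10 to 100 reproduces the sweep whose results Figure 5 summarizes.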

We can see that the MICEEMDAN-WOA-RVFL obtains the best forecasting performance on the WTI, IP, and SSEC datasets when the ensemble size es is in the range of 20–40, and on the USDEUR dataset when the ensemble size es


Table 7: The mean absolute percent error (MAPE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.0036 | 0.0043 | 0.0045 | 0.0045 | 0.0041 | 0.0040 | 0.0043
WTI | 3 | 0.0080 | 0.0088 | 0.0084 | 0.0084 | 0.0082 | 0.0083 | 0.0084
WTI | 6 | 0.0113 | 0.0132 | 0.0115 | 0.0113 | 0.0118 | 0.0120 | 0.0112
USDEUR | 1 | 0.0006 | 0.0011 | 0.0010 | 0.0008 | 0.0010 | 0.0008 | 0.0009
USDEUR | 3 | 0.0015 | 0.0021 | 0.0021 | 0.0015 | 0.0017 | 0.0015 | 0.0018
USDEUR | 6 | 0.0022 | 0.0033 | 0.0022 | 0.0023 | 0.0023 | 0.0024 | 0.0025
IP | 1 | 0.0012 | 0.0016 | 0.0016 | 0.0017 | 0.0017 | 0.0015 | 0.0016
IP | 3 | 0.0023 | 0.0024 | 0.0026 | 0.0030 | 0.0028 | 0.0024 | 0.0026
IP | 6 | 0.0032 | 0.0036 | 0.0034 | 0.1114 | 0.0036 | 0.0041 | 0.0252
SSEC | 1 | 0.0020 | 0.0024 | 0.0023 | 0.0026 | 0.0023 | 0.0023 | 0.0024
SSEC | 3 | 0.0044 | 0.0045 | 0.0046 | 0.0052 | 0.0045 | 0.0047 | 0.0047
SSEC | 6 | 0.0065 | 0.0067 | 0.0070 | 0.0085 | 0.0068 | 0.0075 | 0.0073

Table 8: The root mean squared error (RMSE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.2715 | 0.3316 | 0.3376 | 0.3352 | 0.3191 | 0.3148 | 0.3277
WTI | 3 | 0.5953 | 0.6557 | 0.6214 | 0.6200 | 0.6058 | 0.6133 | 0.6232
WTI | 6 | 0.8146 | 0.9490 | 0.8239 | 0.8199 | 0.8508 | 0.8597 | 0.8607
USDEUR | 1 | 0.0009 | 0.0017 | 0.0017 | 0.0012 | 0.0015 | 0.0012 | 0.0015
USDEUR | 3 | 0.0022 | 0.0031 | 0.0034 | 0.0023 | 0.0025 | 0.0023 | 0.0027
USDEUR | 6 | 0.0033 | 0.0048 | 0.0033 | 0.0035 | 0.0035 | 0.0036 | 0.0037
IP | 1 | 0.2114 | 0.2894 | 0.3080 | 0.3656 | 0.3220 | 0.2589 | 0.3088
IP | 3 | 0.3875 | 0.4098 | 0.4507 | 0.4809 | 0.4776 | 0.4027 | 0.4443
IP | 6 | 0.5340 | 0.5502 | 0.5563 | 1.0081 | 0.5697 | 0.6466 | 0.6661
SSEC | 1 | 10.8115 | 11.5515 | 13.0728 | 16.7301 | 12.2465 | 14.3671 | 13.5936
SSEC | 3 | 22.2983 | 22.0603 | 22.8140 | 33.9937 | 23.9755 | 25.8004 | 25.7288
SSEC | 6 | 35.7201 | 37.0255 | 41.9033 | 55.5066 | 38.6140 | 48.2131 | 44.2525

Table 9: The directional statistic (Dstat) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.9381 | 0.9201 | 0.9161 | 0.9184 | 0.9323 | 0.9271 | 0.9228
WTI | 3 | 0.8576 | 0.8414 | 0.8548 | 0.8385 | 0.8553 | 0.8513 | 0.8483
WTI | 6 | 0.7737 | 0.7419 | 0.7703 | 0.7668 | 0.7627 | 0.7616 | 0.7607
USDEUR | 1 | 0.9410 | 0.9008 | 0.9270 | 0.9204 | 0.9073 | 0.9157 | 0.9142
USDEUR | 3 | 0.8502 | 0.8052 | 0.8418 | 0.8399 | 0.8296 | 0.8427 | 0.8318
USDEUR | 6 | 0.7828 | 0.6826 | 0.7809 | 0.7819 | 0.7772 | 0.7706 | 0.7586
IP | 1 | 0.9256 | 0.8967 | 0.8926 | 0.9132 | 0.8967 | 0.9174 | 0.9033
IP | 3 | 0.8802 | 0.8760 | 0.8430 | 0.8223 | 0.8471 | 0.8802 | 0.8537
IP | 6 | 0.8141 | 0.8017 | 0.8017 | 0.7066 | 0.7851 | 0.8017 | 0.7794
SSEC | 1 | 0.9156 | 0.9052 | 0.9052 | 0.9052 | 0.9087 | 0.9135 | 0.9076
SSEC | 3 | 0.8396 | 0.8222 | 0.8292 | 0.8208 | 0.8368 | 0.8264 | 0.8271
SSEC | 6 | 0.7587 | 0.7462 | 0.7441 | 0.7134 | 0.7594 | 0.7448 | 0.7416

Table 10: The mean absolute percent error (MAPE) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.0036 | 0.0037
WTI | 3 | 0.0080 | 0.0082
WTI | 6 | 0.0113 | 0.0118
USDEUR | 1 | 0.0006 | 0.0006
USDEUR | 3 | 0.0015 | 0.0015
USDEUR | 6 | 0.0022 | 0.0023
IP | 1 | 0.0012 | 0.0013
IP | 3 | 0.0023 | 0.0023
IP | 6 | 0.0032 | 0.0036
SSEC | 1 | 0.0020 | 0.0020
SSEC | 3 | 0.0044 | 0.0044
SSEC | 6 | 0.0065 | 0.0066


Table 11: The root mean squared error (RMSE) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.2715 | 0.2824
WTI | 3 | 0.5953 | 0.6079
WTI | 6 | 0.8146 | 0.8485
USDEUR | 1 | 0.0009 | 0.0010
USDEUR | 3 | 0.0022 | 0.0022
USDEUR | 6 | 0.0033 | 0.0034
IP | 1 | 0.2114 | 0.2362
IP | 3 | 0.3875 | 0.4033
IP | 6 | 0.5340 | 0.5531
SSEC | 1 | 10.8115 | 10.7616
SSEC | 3 | 22.2983 | 22.7456
SSEC | 6 | 35.7201 | 36.3150

Table 12: The directional statistic (Dstat) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.9381 | 0.9358
WTI | 3 | 0.8576 | 0.8565
WTI | 6 | 0.7737 | 0.7656
USDEUR | 1 | 0.9410 | 0.9317
USDEUR | 3 | 0.8502 | 0.8455
USDEUR | 6 | 0.7828 | 0.7762
IP | 1 | 0.9256 | 0.9174
IP | 3 | 0.8802 | 0.8430
IP | 6 | 0.8141 | 0.7893
SSEC | 1 | 0.9156 | 0.9191
SSEC | 3 | 0.8396 | 0.8351
SSEC | 6 | 0.7587 | 0.7594

[Figure 5, panels (a)–(f): MAPE versus ensemble size (%) for the WTI, USDEUR, IP, and SSEC datasets, and RMSE versus ensemble size (%) for the WTI and USDEUR datasets. The figure continues below.]

is in the range of 10–40, in terms of MAPE, RMSE, and Dstat. When es is greater than 40, the MAPE and RMSE values continue to worsen and become the worst when the ensemble size grows to 100. The experimental results indicate that the ensemble size has a significant impact on ensemble prediction overall and that an ideal range of ensemble size is about 20 to 40.

6. Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. The MICEEMDAN-WOA-RVFL first uses ICEEMDAN to separate the original economic and financial time series into groups of subseries multiple times, and then RVFL networks are used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve the prediction accuracy. Third, the predictions of the subseries in each decomposition are integrated into the forecasting results of that decomposition using addition. Finally, the prediction results of the individual decompositions are selected based on their RMSE values and combined into the final prediction results.

As far as we know, this is the first time that WOA has been employed to search for the optimal parameters of RVFL networks and that the multiple decomposition strategy has been introduced into time series forecasting. The empirical results indicate that (1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy in various economic and financial time series forecasting tasks; (2) WOA can effectively search for optimal parameters of RVFL networks and improve the prediction performance of economic and financial time series forecasting; and (3) the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and enhance the prediction accuracy and stability of the developed prediction model.

We will extend our study in two directions in the future: (1) applying the MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble method of individual forecasting models to further enhance the prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and

[Figure 5, panels (g)–(l): RMSE versus ensemble size (%) for the IP and SSEC datasets, and Dstat versus ensemble size (%) for the WTI, USDEUR, IP, and SSEC datasets.]

Figure 5: The impact of ensemble size with one-step-ahead forecasting.

the Scientific Research Fund of the Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365–374, 2017.

[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.

[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831–848, 2005.

[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192–196, IEEE, Warsaw, Poland, September 2005.

[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?," Journal of International Economics, vol. 60, no. 1, pp. 85–107, 2003.

[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7–18, 2014.

[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13–29, 2014.

[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331–350, 2010.

[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201–1208, 2008.

[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111–117, 2010.

[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1–9, 2013.

[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97–111, 2018.

[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28–39, 2014.

[14] X. Li, H. Xie, R. Wang et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67–78, 2016.

[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22–42, 2018.

[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457–480, 2018.

[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27–38, 2014.

[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284–308, 2014.

[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.

[20] S. Lahmiri, "A variational mode decompoisition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268–273, 2016.

[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228–1244, 2013.

[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603–3628, 2019.

[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235–6250, 2014.

[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164–178, 2016.

[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754–762, 2016.

[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1–13, 2007.

[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903–995, 1998.

[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1–41, 2009.

[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144–4147, IEEE, Prague, Czech Republic, May 2011.

[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140–165, 2020.

[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882–1905, 2018.

[32] T. Li, M. Zhou, C. Guo et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.

[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19–29, 2014.

[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852–1872, 2020.

[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.

[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205–1225, 2019.

[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1–10, 2020.

[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1–15, 2018.

[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm-Multi-Objective Whale Optimization Algorithm for wind speed forecasting," Applied Energy, vol. 208, pp. 344–360, 2017.

[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250–260, 2019.

[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163–180, 1994.

[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094–1105, 2016.

[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078–1093, 2016.

[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097–1108, 2018.

[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319–340, 2019.

[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5–24, 2020.

[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.

[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983–994, 2020.

[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49–58, 2019.

[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.

[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165–4172, 2019.

[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319–7327, 2020.

[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525–541, 2018.

[55] FRED website, https://fred.stlouisfed.org.

[56] NetEase website, http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.

[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31–53, 2006.

Complexity 17


Table 6. Results of the Diebold-Mariano (DM) test. Each cell gives the DM statistic with the p value in parentheses; each tested model is compared against the remaining benchmark models in the column order ICEEMDAN-LSSVR, ICEEMDAN-BPNN, ICEEMDAN-RW, LSSVR, BPNN, RW.

WTI, horizon 1:
  MICEEMDAN-WOA-RVFL: -4.9081 (≤0.0001), -7.0030 (≤0.0001), -6.1452 (≤0.0001), -10.7240 (≤0.0001), -10.3980 (≤0.0001), -10.6210 (≤0.0001)
  ICEEMDAN-LSSVR: -3.5925 (0.0003), -6.0620 (≤0.0001), -10.5640 (≤0.0001), -10.3680 (≤0.0001), -10.4660 (≤0.0001)
  ICEEMDAN-BPNN: -5.9718 (≤0.0001), -10.2070 (≤0.0001), -9.9529 (≤0.0001), -10.0960 (≤0.0001)
  ICEEMDAN-RW: 3.0776 (0.0021), 2.7682 (0.0057), 3.0951 (0.0020)
  LSSVR: -1.8480 (0.0648), 1.0058 (0.3147)
  BPNN: 1.9928 (0.0464)

WTI, horizon 3:
  MICEEMDAN-WOA-RVFL: -4.9136 (≤0.0001), -3.6146 (0.0003), -10.8660 (≤0.0001), -16.5990 (≤0.0001), -15.6260 (≤0.0001), -16.6720 (≤0.0001)
  ICEEMDAN-LSSVR: -2.1143 (0.0346), -10.7560 (≤0.0001), -16.7020 (≤0.0001), -15.6780 (≤0.0001), -16.7760 (≤0.0001)
  ICEEMDAN-BPNN: -10.5250 (≤0.0001), -16.2160 (≤0.0001), -15.3500 (≤0.0001), -16.2430 (≤0.0001)
  ICEEMDAN-RW: 3.0818 (0.0021), 2.0869 (0.0370), 3.2155 (0.0012)
  LSSVR: -2.7750 (0.0056), 2.3836 (0.0173)
  BPNN: 3.0743 (0.0021)

WTI, horizon 6:
  MICEEMDAN-WOA-RVFL: -3.9302 (≤0.0001), -5.1332 (≤0.0001), -18.9130 (≤0.0001), -19.0620 (≤0.0001), -21.9930 (≤0.0001), -19.5080 (≤0.0001)
  ICEEMDAN-LSSVR: -3.4653 (0.0005), -18.9170 (≤0.0001), -19.1680 (≤0.0001), -22.0120 (≤0.0001), -19.5420 (≤0.0001)
  ICEEMDAN-BPNN: -18.8380 (≤0.0001), -19.1650 (≤0.0001), -21.9930 (≤0.0001), -19.5200 (≤0.0001)
  ICEEMDAN-RW: 2.6360 (0.0085), 0.5129 (0.6081), 4.1421 (≤0.0001)
  LSSVR: -2.2124 (0.0271), 3.9198 (≤0.0001)
  BPNN: 4.3545 (≤0.0001)

USDEUR, horizon 1:
  MICEEMDAN-WOA-RVFL: -8.8860 (≤0.0001), -8.6031 (≤0.0001), -4.4893 (≤0.0001), -15.4300 (≤0.0001), -15.4100 (≤0.0001), -15.3440 (≤0.0001)
  ICEEMDAN-LSSVR: -0.8819 (0.3780), -4.4230 (≤0.0001), -14.7210 (≤0.0001), -14.7320 (≤0.0001), -14.6280 (≤0.0001)
  ICEEMDAN-BPNN: -4.4158 (≤0.0001), -14.6310 (≤0.0001), -14.6440 (≤0.0001), -14.5310 (≤0.0001)
  ICEEMDAN-RW: 3.3089 (0.0010), 3.2912 (0.0010), 3.3244 (0.0009)
  LSSVR: -1.8732 (0.06132), 2.9472 (0.0033)
  BPNN: 3.4378 (0.0006)

USDEUR, horizon 3:
  MICEEMDAN-WOA-RVFL: -10.1370 (≤0.0001), -7.9933 (≤0.0001), -5.7359 (≤0.0001), -18.3380 (≤0.0001), -18.6460 (≤0.0001), -18.2090 (≤0.0001)
  ICEEMDAN-LSSVR: -1.4797 (0.1392), -5.5972 (≤0.0001), -17.9030 (≤0.0001), -18.1920 (≤0.0001), -17.7730 (≤0.0001)
  ICEEMDAN-BPNN: -5.6039 (≤0.0001), -17.8370 (≤0.0001), -18.1350 (≤0.0001), -17.6910 (≤0.0001)
  ICEEMDAN-RW: 3.0048 (0.0027), 2.9552 (0.0032), 3.0206 (0.0026)
  LSSVR: -1.4077 (0.1595), 1.5242 (0.1278)
  BPNN: 2.1240 (0.0339)

USDEUR, horizon 6:
  MICEEMDAN-WOA-RVFL: -11.1070 (≤0.0001), -10.0570 (≤0.0001), -7.4594 (≤0.0001), -18.3010 (≤0.0001), -18.4640 (≤0.0001), -18.4940 (≤0.0001)
  ICEEMDAN-LSSVR: 1.9379 (0.0529), -7.0858 (≤0.0001), -17.4490 (≤0.0001), -17.5970 (≤0.0001), -17.6370 (≤0.0001)
  ICEEMDAN-BPNN: -7.1244 (≤0.0001), -17.5850 (≤0.0001), -17.7340 (≤0.0001), -17.7690 (≤0.0001)
  ICEEMDAN-RW: 2.5399 (0.0112), 2.6203 (0.0089), 2.6207 (0.0089)
  LSSVR: 2.2946 (0.0220), 2.5913 (0.0097)
  BPNN: -0.2090 (0.8345)


Table 6. Continued (same column order as above: ICEEMDAN-LSSVR, ICEEMDAN-BPNN, ICEEMDAN-RW, LSSVR, BPNN, RW).

IP, horizon 1:
  MICEEMDAN-WOA-RVFL: -3.4304 (0.0007), -2.4253 (0.0098), -3.4369 (0.0007), -3.5180 (0.0005), -4.1486 (≤0.0001), -3.3215 (0.0015)
  ICEEMDAN-LSSVR: -0.6714 (0.5026), -3.2572 (0.0013), -3.3188 (0.0010), -4.0082 (≤0.0001), -2.9573 (0.0034)
  ICEEMDAN-BPNN: -3.6265 (0.0004), -3.7455 (0.0002), -4.6662 (≤0.0001), -3.3790 (0.0008)
  ICEEMDAN-RW: -0.7438 (0.4577), -0.0502 (0.9600), 2.6089 (0.0096)
  LSSVR: 0.4094 (0.6826), 3.4803 (0.0006)
  BPNN: 1.6021 (0.1104)

IP, horizon 3:
  MICEEMDAN-WOA-RVFL: -4.5687 (≤0.0001), -3.7062 (0.0003), -5.9993 (≤0.0001), -6.8919 (≤0.0001), -5.5008 (≤0.0001), -5.8037 (≤0.0001)
  ICEEMDAN-LSSVR: 1.1969 (0.2325), -5.5286 (≤0.0001), -6.6325 (≤0.0001), -5.2260 (≤0.0001), -5.3260 (≤0.0001)
  ICEEMDAN-BPNN: -5.7199 (≤0.0001), -6.7722 (≤0.0001), -5.2955 (≤0.0001), -5.5099 (≤0.0001)
  ICEEMDAN-RW: -5.6536 (≤0.0001), -3.6721 (0.0003), 1.7609 (0.0795)
  LSSVR: 0.2479 (0.8044), 6.3126 (≤0.0001)
  BPNN: 3.9699 (≤0.0001)

IP, horizon 6:
  MICEEMDAN-WOA-RVFL: -5.9152 (≤0.0001), -3.5717 (0.0004), -5.9430 (≤0.0001), -7.5672 (≤0.0001), -10.8900 (≤0.0001), -5.8450 (≤0.0001)
  ICEEMDAN-LSSVR: -2.5163 (0.0125), -5.8465 (≤0.0001), -7.5185 (≤0.0001), -10.7140 (≤0.0001), -5.7483 (≤0.0001)
  ICEEMDAN-BPNN: -5.6135 (≤0.0001), -7.4370 (≤0.0001), -10.3470 (≤0.0001), -5.5254 (≤0.0001)
  ICEEMDAN-RW: -7.6719 (≤0.0001), -2.2903 (0.02287), 0.8079 (0.4200)
  LSSVR: 3.8137 (0.0002), 7.7861 (≤0.0001)
  BPNN: 2.3351 (0.0204)

SSEC, horizon 1:
  MICEEMDAN-WOA-RVFL: -4.0255 (≤0.0001), -3.7480 (0.0004), -7.9407 (≤0.0001), -10.8090 (≤0.0001), -10.3070 (≤0.0001), -10.7330 (≤0.0001)
  ICEEMDAN-LSSVR: -1.5083 (0.1317), -7.8406 (≤0.0001), -10.5410 (≤0.0001), -10.0710 (≤0.0001), -10.5020 (≤0.0001)
  ICEEMDAN-BPNN: -7.7986 (≤0.0001), -10.5470 (≤0.0001), -10.0580 (≤0.0001), -10.5030 (≤0.0001)
  ICEEMDAN-RW: 3.5571 (0.0004), 3.2632 (0.0011), 3.5150 (0.0005)
  LSSVR: -1.0197 (0.3081), -0.6740 (0.5004)
  BPNN: 0.8078 (0.4193)

SSEC, horizon 3:
  MICEEMDAN-WOA-RVFL: -3.8915 (0.0001), -3.7312 (0.0002), -10.5890 (≤0.0001), -10.5480 (≤0.0001), -10.7470 (≤0.0001), -10.1260 (≤0.0001)
  ICEEMDAN-LSSVR: -2.4145 (0.0159), -10.5830 (≤0.0001), -10.5240 (≤0.0001), -10.7760 (≤0.0001), -10.1110 (≤0.0001)
  ICEEMDAN-BPNN: -10.2420 (≤0.0001), -10.1860 (≤0.0001), -10.4400 (≤0.0001), -9.7595 (≤0.0001)
  ICEEMDAN-RW: 3.4031 (0.0007), 2.3198 (0.0205), 3.1912 (0.0014)
  LSSVR: -0.5931 (0.5532), -1.2284 (0.2194)
  BPNN: -0.0042 (0.9966)

SSEC, horizon 6:
  MICEEMDAN-WOA-RVFL: -2.9349 (0.0034), -3.7544 (0.0002), -10.2390 (≤0.0001), -10.1970 (≤0.0001), -11.0150 (≤0.0001), -10.1630 (≤0.0001)
  ICEEMDAN-LSSVR: -2.5830 (0.0099), -10.1390 (≤0.0001), -10.0940 (≤0.0001), -10.9630 (≤0.0001), -10.0690 (≤0.0001)
  ICEEMDAN-BPNN: -9.6418 (≤0.0001), -9.5562 (≤0.0001), -10.4090 (≤0.0001), -9.5122 (≤0.0001)
  ICEEMDAN-RW: 2.3167 (0.0207), 1.3656 (0.1723), 2.2227 (0.02639)
  LSSVR: -1.9027 (0.05728), -0.5814 (0.5611)
  BPNN: 1.5707 (0.1165)


Overall, WOA optimization can significantly enhance the prediction accuracy of economic and financial time series forecasting.

5. Discussion

To better investigate the proposed MICEEMDAN-WOA-RVFL, we further discuss the developed prediction model in this section, including a comparison of single and multiple decompositions, the optimization effectiveness of WOA, and the impact of ensemble size.

5.1. Comparison of Single Decomposition and Multiple Decompositions. One of the main novelties of this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and improve the prediction accuracy and stability of the developed forecasting model. To evaluate the effectiveness of the multiple decomposition strategy, we compare the prediction results of MICEEMDAN-WOA-RVFL and ICEEMDAN-WOA-RVFL. The former ensembles the prediction results of M (M = 100) individual ICEEMDAN decompositions with random parameters, while the latter employs only one ICEEMDAN decomposition. We randomly choose 5 out of these 100 decompositions and execute ICEEMDAN-WOA-RVFL for time series forecasting. Tables 7-9 report the MAPE, RMSE, and Dstat values of MICEEMDAN-WOA-RVFL and of the five single-decomposition ICEEMDAN-WOA-RVFL models, together with the corresponding mean values of these five models.
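The multiple decomposition strategy described above can be sketched as follows. This is a toy illustration rather than the authors' implementation: `decompose` is a hypothetical stand-in for one ICEEMDAN run with random parameters (here a noisy moving-average trend/detail split whose subseries sum back to the original series), and `forecast_subseries` stands in for a trained RVFL predictor (here simple persistence).

```python
import numpy as np

def decompose(series, noise_scale, rng):
    """Stand-in for one ICEEMDAN run with random parameters: splits the
    series into a detail part and a smooth trend; the two subseries sum
    exactly back to the original series, as EMD-style components do."""
    noise = rng.normal(0.0, noise_scale, size=series.shape)
    kernel = np.ones(5) / 5.0
    trend = np.convolve(series + noise, kernel, mode="same")
    detail = series - trend            # residual detail component
    return [detail, trend]             # detail + trend == series

def forecast_subseries(sub):
    """Toy one-step-ahead predictor (persistence), standing in for RVFL."""
    return sub[-1]

def multi_decomposition_forecast(series, n_decompositions=100, seed=0):
    """Run the series through many randomized decompositions, forecast the
    subseries of each, sum them per decomposition, then average across all
    decompositions to obtain the final ensemble forecast."""
    rng = np.random.default_rng(seed)
    per_decomposition = []
    for _ in range(n_decompositions):
        subs = decompose(series, noise_scale=rng.uniform(0.01, 0.05), rng=rng)
        per_decomposition.append(sum(forecast_subseries(s) for s in subs))
    return float(np.mean(per_decomposition))
```

With the persistence stand-in, every decomposition's subseries forecasts add back to the last observation, so the sketch only exercises the decompose-forecast-aggregate pipeline; in the actual model each subseries would be forecast by its own WOA-optimized RVFL network.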

On one hand, compared with the prediction results of the five single decompositions and their mean, the proposed MICEEMDAN-WOA-RVFL achieves the lowest MAPE and the highest Dstat values in all 12 cases and the lowest RMSE values in 11 of the 12 cases, indicating that the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and improve the ensemble prediction accuracy.

On the other hand, we find that the multiple decomposition strategy greatly improves the stability of the prediction model. For example, the MAPE values of the five single-decomposition models with horizon 6 on the IP time series dataset range from 0.0034 to 0.1114, indicating that different single decompositions can produce markedly different prediction results. By employing the multiple decomposition strategy, we overcome the randomness of a single decomposition and thus enhance prediction stability.

In summary, the experimental results suggest that the multiple decomposition strategy and prediction ensemble can effectively enhance prediction accuracy and stability. The main reasons for the improvement lie in three aspects: (1) multiple decomposition reduces the randomness of any single decomposition and simultaneously generates groups of differentiated subseries; (2) predictions based on these groups of differentiated subseries yield diverse prediction results; and (3) the selection and ensemble of these diverse prediction results ensure both accuracy and diversity, thus improving the final ensemble prediction.

5.2. The Optimization Effectiveness of WOA. When we use RVFL networks to construct predictors, a number of parameters need to be set in advance. In this study, WOA is introduced to search for the optimal parameter values of the RVFL predictors, exploiting its powerful optimization ability. To investigate the effectiveness of WOA for parameter search, we compare the proposed MICEEMDAN-WOA-RVFL with MICEEMDAN-RVFL without WOA optimization. Following the literature [45], we fixed the number of hidden neurons Nhe = 100, the activation function Func = sigmoid, and the random type Rand = Gaussian in MICEEMDAN-RVFL. The MAPE, RMSE, and Dstat values are reported in Tables 10-12, respectively.
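A minimal RVFL network of the kind being parameterized here can be sketched as follows, assuming a closed-form ridge solution for the output weights over the concatenation of direct input links and random hidden features. The constructor arguments `n_hidden`, `activation`, and `random_type` mirror the Nhe, Func, and Rand settings above, but the class itself is an illustrative sketch, not the paper's implementation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

class RVFL:
    """Minimal random vector functional link network: the input-to-hidden
    weights are random and never trained; only the output weights are
    solved in closed form (ridge regression) over [inputs, hidden]."""

    def __init__(self, n_hidden=100, activation=sigmoid,
                 random_type="gaussian", reg=1e-6, seed=0):
        self.n_hidden, self.activation, self.reg = n_hidden, activation, reg
        self.random_type = random_type
        self._rng = np.random.default_rng(seed)

    def _features(self, X):
        # direct links (X itself) concatenated with random hidden features
        return np.hstack([X, self.activation(X @ self.W + self.b)])

    def fit(self, X, y):
        n_features = X.shape[1]
        if self.random_type == "gaussian":
            self.W = self._rng.normal(size=(n_features, self.n_hidden))
        else:  # uniform random weights as the alternative random type
            self.W = self._rng.uniform(-1, 1, size=(n_features, self.n_hidden))
        self.b = self._rng.normal(size=self.n_hidden)
        H = self._features(X)
        # ridge-regularized least squares for the output weights
        self.beta = np.linalg.solve(H.T @ H + self.reg * np.eye(H.shape[1]),
                                    H.T @ y)
        return self

    def predict(self, X):
        return self._features(X) @ self.beta
```

Because of the direct input-output links, a linear target is recovered almost exactly even with few hidden neurons, which is one reason RVFL networks train quickly compared with iteratively trained networks.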

In all four time series datasets, the prediction performance of the proposed MICEEMDAN-WOA-RVFL model in terms of MAPE and RMSE is better than or equal to that of the MICEEMDAN-RVFL model without WOA optimization in all 12 cases, except for the RMSE value with horizon 1 on the SSEC dataset, as listed in Tables 10 and 11. In addition, MICEEMDAN-WOA-RVFL obtains higher Dstat values in 10 out of 12 cases, as shown in Table 12. All these results indicate that WOA can effectively search for optimal parameter settings for RVFL networks, further improving the overall prediction performance.
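For reference, the core WOA update rules (shrinking encircling of the best solution, spiral bubble-net attack, and random search for prey, following Mirjalili and Lewis [38]) can be sketched as a bare-bones continuous minimizer. This is an illustrative sketch under those standard update equations, not the exact variant or parameter encoding used in this study.

```python
import numpy as np

def woa_minimize(objective, dim, bounds, n_whales=20, n_iter=200, seed=0):
    """Bare-bones whale optimization algorithm: each whale either encircles
    the best whale found so far, moves toward a random whale (exploration),
    or performs the logarithmic-spiral bubble-net update."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_whales, dim))
    fitness = np.apply_along_axis(objective, 1, X)
    best = X[np.argmin(fitness)].copy()
    best_f = float(fitness.min())
    for t in range(n_iter):
        a = 2.0 * (1 - t / n_iter)               # decreases linearly 2 -> 0
        for i in range(n_whales):
            r = rng.random(dim)
            A, C = 2 * a * r - a, 2 * rng.random(dim)
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1):        # exploit: encircle the best
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                            # explore: follow a random whale
                    rand = X[rng.integers(n_whales)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                                # spiral bubble-net attack
                l = rng.uniform(-1, 1)
                X[i] = np.abs(best - X[i]) * np.exp(l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lo, hi)
            f = objective(X[i])
            if f < best_f:                       # greedy best-so-far tracking
                best, best_f = X[i].copy(), float(f)
    return best, best_f
```

In the hybrid model, `objective` would map a candidate RVFL parameter vector to a validation error, so each fitness evaluation trains and scores one RVFL configuration.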

5.3. The Impact of Ensemble Size. Previous research has demonstrated that an ensemble strategy using all individual prediction models is unlikely to work well and that the selection of individual prediction models contributes to improving ensemble prediction performance [57]. In this study, we sort all individual prediction models by their past performance (RMSE values) and then select the top N percent as the ensemble to construct the ensemble prediction model. To further investigate the impact of ensemble size on ensemble prediction, we use different ensemble sizes (es = 10, 20, ..., 100) to select the top N percent of individual forecasting models, develop the corresponding ensemble prediction models, and conduct one-step-ahead forecasting experiments on the four time series datasets. The results are demonstrated in Figure 5.
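The selection step above can be sketched as follows; `select_and_ensemble` is a hypothetical helper illustrating the idea, with models ranked by past RMSE (lower is better) and the retained forecasts combined by simple averaging.

```python
import numpy as np

def select_and_ensemble(predictions, val_rmse, ensemble_size_pct):
    """Keep the top `ensemble_size_pct` percent of individual models,
    ranked by their historical RMSE, and average their forecasts."""
    predictions = np.asarray(predictions)   # shape: (n_models, horizon)
    order = np.argsort(val_rmse)            # best (lowest RMSE) models first
    k = max(1, int(round(len(order) * ensemble_size_pct / 100.0)))
    return predictions[order[:k]].mean(axis=0)
```

With es = 100 this degenerates to averaging every model, which is exactly the all-model ensemble the experiment below shows to be inferior to a 20-40 percent selection.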

We can see that MICEEMDAN-WOA-RVFL obtains the best forecasting performance on the WTI, IP, and SSEC datasets when the ensemble size es is in the range of 20-40, and on the USDEUR dataset when the ensemble size es


Table 7. The mean absolute percent error (MAPE) values of single decomposition and multiple decompositions. Columns: MICEEMDAN-WOA-RVFL; the five single-decomposition ICEEMDAN-WOA-RVFL models 1-5; mean of the five.

Dataset  Horizon  MICEEMDAN  Single-1  Single-2  Single-3  Single-4  Single-5  Mean
WTI      1        0.0036     0.0043    0.0045    0.0045    0.0041    0.0040    0.0043
WTI      3        0.0080     0.0088    0.0084    0.0084    0.0082    0.0083    0.0084
WTI      6        0.0113     0.0132    0.0115    0.0113    0.0118    0.0120    0.0112
USDEUR   1        0.0006     0.0011    0.0010    0.0008    0.0010    0.0008    0.0009
USDEUR   3        0.0015     0.0021    0.0021    0.0015    0.0017    0.0015    0.0018
USDEUR   6        0.0022     0.0033    0.0022    0.0023    0.0023    0.0024    0.0025
IP       1        0.0012     0.0016    0.0016    0.0017    0.0017    0.0015    0.0016
IP       3        0.0023     0.0024    0.0026    0.0030    0.0028    0.0024    0.0026
IP       6        0.0032     0.0036    0.0034    0.1114    0.0036    0.0041    0.0252
SSEC     1        0.0020     0.0024    0.0023    0.0026    0.0023    0.0023    0.0024
SSEC     3        0.0044     0.0045    0.0046    0.0052    0.0045    0.0047    0.0047
SSEC     6        0.0065     0.0067    0.0070    0.0085    0.0068    0.0075    0.0073

Table 8. The root mean squared error (RMSE) values of single decomposition and multiple decompositions. Columns as in Table 7.

Dataset  Horizon  MICEEMDAN  Single-1  Single-2  Single-3  Single-4  Single-5  Mean
WTI      1        0.2715     0.3316    0.3376    0.3352    0.3191    0.3148    0.3277
WTI      3        0.5953     0.6557    0.6214    0.6200    0.6058    0.6133    0.6232
WTI      6        0.8146     0.9490    0.8239    0.8199    0.8508    0.8597    0.8607
USDEUR   1        0.0009     0.0017    0.0017    0.0012    0.0015    0.0012    0.0015
USDEUR   3        0.0022     0.0031    0.0034    0.0023    0.0025    0.0023    0.0027
USDEUR   6        0.0033     0.0048    0.0033    0.0035    0.0035    0.0036    0.0037
IP       1        0.2114     0.2894    0.3080    0.3656    0.3220    0.2589    0.3088
IP       3        0.3875     0.4098    0.4507    0.4809    0.4776    0.4027    0.4443
IP       6        0.5340     0.5502    0.5563    1.0081    0.5697    0.6466    0.6661
SSEC     1        10.8115    11.5515   13.0728   16.7301   12.2465   14.3671   13.5936
SSEC     3        22.2983    22.0603   22.8140   33.9937   23.9755   25.8004   25.7288
SSEC     6        35.7201    37.0255   41.9033   55.5066   38.6140   48.2131   44.2525

Table 9. The directional statistic (Dstat) values of single decomposition and multiple decompositions. Columns as in Table 7.

Dataset  Horizon  MICEEMDAN  Single-1  Single-2  Single-3  Single-4  Single-5  Mean
WTI      1        0.9381     0.9201    0.9161    0.9184    0.9323    0.9271    0.9228
WTI      3        0.8576     0.8414    0.8548    0.8385    0.8553    0.8513    0.8483
WTI      6        0.7737     0.7419    0.7703    0.7668    0.7627    0.7616    0.7607
USDEUR   1        0.9410     0.9008    0.9270    0.9204    0.9073    0.9157    0.9142
USDEUR   3        0.8502     0.8052    0.8418    0.8399    0.8296    0.8427    0.8318
USDEUR   6        0.7828     0.6826    0.7809    0.7819    0.7772    0.7706    0.7586
IP       1        0.9256     0.8967    0.8926    0.9132    0.8967    0.9174    0.9033
IP       3        0.8802     0.8760    0.8430    0.8223    0.8471    0.8802    0.8537
IP       6        0.8141     0.8017    0.8017    0.7066    0.7851    0.8017    0.7794
SSEC     1        0.9156     0.9052    0.9052    0.9052    0.9087    0.9135    0.9076
SSEC     3        0.8396     0.8222    0.8292    0.8208    0.8368    0.8264    0.8271
SSEC     6        0.7587     0.7462    0.7441    0.7134    0.7594    0.7448    0.7416

Table 10. The mean absolute percent error (MAPE) values with and without WOA optimization.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  MICEEMDAN-RVFL
WTI      1        0.0036              0.0037
WTI      3        0.0080              0.0082
WTI      6        0.0113              0.0118
USDEUR   1        0.0006              0.0006
USDEUR   3        0.0015              0.0015
USDEUR   6        0.0022              0.0023
IP       1        0.0012              0.0013
IP       3        0.0023              0.0023
IP       6        0.0032              0.0036
SSEC     1        0.0020              0.0020
SSEC     3        0.0044              0.0044
SSEC     6        0.0065              0.0066


Table 11. The root mean squared error (RMSE) values with and without WOA optimization.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  MICEEMDAN-RVFL
WTI      1        0.2715              0.2824
WTI      3        0.5953              0.6079
WTI      6        0.8146              0.8485
USDEUR   1        0.0009              0.0010
USDEUR   3        0.0022              0.0022
USDEUR   6        0.0033              0.0034
IP       1        0.2114              0.2362
IP       3        0.3875              0.4033
IP       6        0.5340              0.5531
SSEC     1        10.8115             10.7616
SSEC     3        22.2983             22.7456
SSEC     6        35.7201             36.3150

Table 12. The directional statistic (Dstat) values with and without WOA optimization.

Dataset  Horizon  MICEEMDAN-WOA-RVFL  MICEEMDAN-RVFL
WTI      1        0.9381              0.9358
WTI      3        0.8576              0.8565
WTI      6        0.7737              0.7656
USDEUR   1        0.9410              0.9317
USDEUR   3        0.8502              0.8455
USDEUR   6        0.7828              0.7762
IP       1        0.9256              0.9174
IP       3        0.8802              0.8430
IP       6        0.8141              0.7893
SSEC     1        0.9156              0.9191
SSEC     3        0.8396              0.8351
SSEC     6        0.7587              0.7594


is in the range of 10-40, in terms of MAPE, RMSE, and Dstat. When es is greater than 40, the MAPE and RMSE values continue to worsen and become the worst when the ensemble size grows to 100. The experimental results indicate that the ensemble size has a significant overall impact on ensemble prediction and that an ideal range of ensemble size is about 20 to 40.

6. Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. MICEEMDAN-WOA-RVFL first uses ICEEMDAN with random parameters to separate the original economic and financial time series into groups of subseries multiple times, and RVFL networks are then used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve prediction accuracy. Third, the predictions of the subseries in each decomposition are aggregated by addition into the forecasting result of that decomposition. Finally, the prediction results of the individual decompositions are selected based on their RMSE values and combined into the final prediction results.

As far as we know, this is the first time that WOA has been employed to search for optimal parameters of RVFL networks and that the multiple decomposition strategy has been introduced into time series forecasting. The empirical results indicate that (1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy in various economic and financial time series forecasting tasks; (2) WOA can effectively search for optimal parameters of RVFL networks and improve the prediction performance of economic and financial time series forecasting; and (3) the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and enhance the prediction accuracy and stability of the developed prediction model.

We will extend our study in two directions in the future: (1) applying MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble of individual forecasting models to further enhance prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and

[Figure 5: The impact of ensemble size with one-step-ahead forecasting. Panels (a)-(l) plot MAPE, RMSE, and Dstat against the ensemble size (%) from 10 to 100 for the WTI, USDEUR, IP, and SSEC datasets.]

the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365-374, 2017.

[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441-1449, 2014.

[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831-848, 2005.

[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192-196, IEEE, Warsaw, Poland, September 2005.

[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?" Journal of International Economics, vol. 60, no. 1, pp. 85-107, 2003.

[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7-18, 2014.

[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13-29, 2014.

[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331-350, 2010.

[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201-1208, 2008.

[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111-117, 2010.

[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1-9, 2013.

[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97-111, 2018.

[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28-39, 2014.

[14] X. Li, H. Xie, R. Wang et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67-78, 2016.

[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22-42, 2018.

[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457-480, 2018.

[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27-38, 2014.

[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN, and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284-308, 2014.

[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651-671, 2013.

[20] S. Lahmiri, "A variational mode decompoisition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268-273, 2016.

[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228-1244, 2013.

[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603-3628, 2019.

[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235-6250, 2014.

[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164-178, 2016.

[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754-762, 2016.

[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1-13, 2007.

[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903-995, 1998.

[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1-41, 2009.

[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144-4147, IEEE, Prague, Czech Republic, May 2011.


[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140-165, 2020.

[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882-1905, 2018.

[32] T. Li, M. Zhou, C. Guo et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.

[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19-29, 2014.

[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852-1872, 2020.

[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.

[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205-1225, 2019.

[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1-10, 2020.

[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51-67, 2016.

[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1-15, 2018.

[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm, Multi-Objective Whale Optimization Algorithm, for wind speed forecasting," Applied Energy, vol. 208, pp. 344-360, 2017.

[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250-260, 2019.

[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163-180, 1994.

[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094-1105, 2016.

[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078-1093, 2016.

[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097-1108, 2018.

[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319-340, 2019.

[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5-24, 2020.

[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.

[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983-994, 2020.

[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49-58, 2019.

[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.

[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165-4172, 2019.

[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319-7327, 2020.

[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525-541, 2018.

[55] FRED website, https://fred.stlouisfed.org.

[56] NetEase website, http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.

[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31-53, 2006.



Table 6: Continued. (Test statistics, with p-values in parentheses; each cell compares the row model against the column model.)

Dataset | Horizon | Tested model | ICEEMDAN-LSSVR | ICEEMDAN-BPNN | ICEEMDAN-RW | LSSVR | BPNN | RW
IP | 1 | MICEEMDAN-WOA-RVFL | −3.4304 (0.0007) | −2.4253 (0.0098) | −3.4369 (0.0007) | −3.5180 (0.0005) | −4.1486 (≤0.0001) | −3.3215 (0.0015)
IP | 1 | ICEEMDAN-LSSVR | | −0.6714 (0.5026) | −3.2572 (0.0013) | −3.3188 (0.0010) | −4.0082 (≤0.0001) | −2.9573 (0.0034)
IP | 1 | ICEEMDAN-BPNN | | | −3.6265 (0.0004) | −3.7455 (0.0002) | −4.6662 (≤0.0001) | −3.3790 (0.0008)
IP | 1 | ICEEMDAN-RW | | | | −0.7438 (0.4577) | −0.0502 (0.9600) | 2.6089 (0.0096)
IP | 1 | LSSVR | | | | | 0.4094 (0.6826) | 3.4803 (0.0006)
IP | 1 | BPNN | | | | | | 1.6021 (0.1104)
IP | 3 | MICEEMDAN-WOA-RVFL | −4.5687 (≤0.0001) | −3.7062 (0.0003) | −5.9993 (≤0.0001) | −6.8919 (≤0.0001) | −5.5008 (≤0.0001) | −5.8037 (≤0.0001)
IP | 3 | ICEEMDAN-LSSVR | | 1.1969 (0.2325) | −5.5286 (≤0.0001) | −6.6325 (≤0.0001) | −5.2260 (≤0.0001) | −5.3260 (≤0.0001)
IP | 3 | ICEEMDAN-BPNN | | | −5.7199 (≤0.0001) | −6.7722 (≤0.0001) | −5.2955 (≤0.0001) | −5.5099 (≤0.0001)
IP | 3 | ICEEMDAN-RW | | | | −5.6536 (≤0.0001) | −3.6721 (0.0003) | 1.7609 (0.0795)
IP | 3 | LSSVR | | | | | 0.2479 (0.8044) | 6.3126 (≤0.0001)
IP | 3 | BPNN | | | | | | 3.9699 (≤0.0001)
IP | 6 | MICEEMDAN-WOA-RVFL | −5.9152 (≤0.0001) | −3.5717 (0.0004) | −5.9430 (≤0.0001) | −7.5672 (≤0.0001) | −10.8900 (≤0.0001) | −5.8450 (≤0.0001)
IP | 6 | ICEEMDAN-LSSVR | | −2.5163 (0.0125) | −5.8465 (≤0.0001) | −7.5185 (≤0.0001) | −10.7140 (≤0.0001) | −5.7483 (≤0.0001)
IP | 6 | ICEEMDAN-BPNN | | | −5.6135 (≤0.0001) | −7.4370 (≤0.0001) | −10.3470 (≤0.0001) | −5.5254 (≤0.0001)
IP | 6 | ICEEMDAN-RW | | | | −7.6719 (≤0.0001) | −2.2903 (0.02287) | 0.8079 (0.4200)
IP | 6 | LSSVR | | | | | 3.8137 (0.0002) | 7.7861 (≤0.0001)
IP | 6 | BPNN | | | | | | 2.3351 (0.0204)
SSEC | 1 | MICEEMDAN-WOA-RVFL | −4.0255 (≤0.0001) | −3.748 (0.0004) | −7.9407 (≤0.0001) | −10.8090 (≤0.0001) | −10.3070 (≤0.0001) | −10.7330 (≤0.0001)
SSEC | 1 | ICEEMDAN-LSSVR | | −1.5083 (0.1317) | −7.8406 (≤0.0001) | −10.5410 (≤0.0001) | −10.0710 (≤0.0001) | −10.5020 (≤0.0001)
SSEC | 1 | ICEEMDAN-BPNN | | | −7.7986 (≤0.0001) | −10.5470 (≤0.0001) | −10.0580 (≤0.0001) | −10.5030 (≤0.0001)
SSEC | 1 | ICEEMDAN-RW | | | | 3.5571 (0.0004) | 3.2632 (0.0011) | 3.5150 (0.0005)
SSEC | 1 | LSSVR | | | | | −1.0197 (0.3081) | −0.6740 (0.5004)
SSEC | 1 | BPNN | | | | | | 0.8078 (0.4193)
SSEC | 3 | MICEEMDAN-WOA-RVFL | −3.8915 (0.0001) | −3.7312 (0.0002) | −10.5890 (≤0.0001) | −10.5480 (≤0.0001) | −10.7470 (≤0.0001) | −10.1260 (≤0.0001)
SSEC | 3 | ICEEMDAN-LSSVR | | −2.4145 (0.0159) | −10.5830 (≤0.0001) | −10.5240 (≤0.0001) | −10.7760 (≤0.0001) | −10.1110 (≤0.0001)
SSEC | 3 | ICEEMDAN-BPNN | | | −10.2420 (≤0.0001) | −10.1860 (≤0.0001) | −10.4400 (≤0.0001) | −9.7595 (≤0.0001)
SSEC | 3 | ICEEMDAN-RW | | | | 3.4031 (0.0007) | 2.3198 (0.0205) | 3.1912 (0.0014)
SSEC | 3 | LSSVR | | | | | −0.5931 (0.5532) | −1.2284 (0.2194)
SSEC | 3 | BPNN | | | | | | −0.0042 (0.9966)
SSEC | 6 | MICEEMDAN-WOA-RVFL | −2.9349 (0.0034) | −3.7544 (0.0002) | −10.2390 (≤0.0001) | −10.1970 (≤0.0001) | −11.0150 (≤0.0001) | −10.1630 (≤0.0001)
SSEC | 6 | ICEEMDAN-LSSVR | | −2.5830 (0.0099) | −10.1390 (≤0.0001) | −10.0940 (≤0.0001) | −10.9630 (≤0.0001) | −10.0690 (≤0.0001)
SSEC | 6 | ICEEMDAN-BPNN | | | −9.6418 (≤0.0001) | −9.5562 (≤0.0001) | −10.4090 (≤0.0001) | −9.5122 (≤0.0001)
SSEC | 6 | ICEEMDAN-RW | | | | 2.3167 (0.0207) | 1.3656 (0.1723) | 2.2227 (0.02639)
SSEC | 6 | LSSVR | | | | | −1.9027 (0.05728) | −0.5814 (0.5611)
SSEC | 6 | BPNN | | | | | | 1.5707 (0.1165)
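Table 6 reports pairwise model-comparison statistics with p-values, which in this forecasting literature are typically Diebold-Mariano (DM) tests of equal predictive accuracy. The following is a hedged sketch of how such a statistic can be computed under simplifying assumptions (one-step horizon, squared-error loss, no autocorrelation correction); it is not necessarily the exact test configuration used by the authors.

```python
import numpy as np

def diebold_mariano(e1, e2):
    # DM statistic for equal predictive accuracy under squared-error loss,
    # one-step horizon, no long-run variance correction (a simplification)
    d = e1 ** 2 - e2 ** 2                  # loss differential series
    return d.mean() / np.sqrt(d.var(ddof=1) / len(d))

rng = np.random.default_rng(0)
e1 = rng.normal(scale=0.1, size=500)       # errors of a more accurate model
e2 = rng.normal(scale=0.5, size=500)       # errors of a weaker model
stat = diebold_mariano(e1, e2)
print(round(float(stat), 2))               # strongly negative: model 1 wins
```

A large negative statistic, as for most MICEEMDAN-WOA-RVFL rows in Table 6, indicates significantly lower loss for the row model than for the column model.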


WOA optimization can significantly enhance the prediction accuracy of economic and financial time series forecasting.

5. Discussion

To better investigate the proposed MICEEMDAN-WOA-RVFL, we further discuss the developed prediction model in this section, including the comparison of single decomposition and multiple decompositions, the optimization effectiveness of WOA, and the impact of ensemble size.

5.1. Comparison of Single Decomposition and Multiple Decompositions. One of the main novelties of this study is the multiple decomposition strategy, which can successfully overcome the randomness of a single decomposition and improve the prediction accuracy and stability of the developed forecasting model. To evaluate the effectiveness of the multiple decomposition strategy, we compare the prediction results of MICEEMDAN-WOA-RVFL and ICEEMDAN-WOA-RVFL. The former ensembles the prediction results of M (M = 100) individual ICEEMDAN decompositions with random parameters, while the latter employs only one ICEEMDAN decomposition. We randomly choose 5 out of these 100 decompositions and execute ICEEMDAN-WOA-RVFL for time series forecasting. Tables 7-9 report the MAPE, RMSE, and Dstat values of MICEEMDAN-WOA-RVFL and of the five single-decomposition ICEEMDAN-WOA-RVFL models, together with the corresponding mean values of these five single-decomposition models.
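The three evaluation criteria used throughout (MAPE, RMSE, and the directional statistic Dstat) can be computed as below. The Dstat formula here (share of steps whose predicted direction of change matches the actual direction) follows common usage in this literature and is an assumption about the paper's exact definition.

```python
import numpy as np

def mape(y, yhat):
    # mean absolute percent error
    return np.mean(np.abs((y - yhat) / y))

def rmse(y, yhat):
    # root mean squared error
    return np.sqrt(np.mean((y - yhat) ** 2))

def dstat(y, yhat):
    # directional statistic: share of steps whose predicted direction of
    # change (relative to the previous actual value) matches the real one
    actual = np.sign(np.diff(y))
    predicted = np.sign(yhat[1:] - y[:-1])
    return np.mean(actual == predicted)

y = np.array([100.0, 101.0, 99.5, 100.5])     # toy actual series
yhat = np.array([100.2, 100.8, 99.9, 100.3])  # toy forecasts
print(round(mape(y, yhat), 4), round(rmse(y, yhat), 4), dstat(y, yhat))
# → 0.0025 0.2646 1.0
```

Lower MAPE and RMSE are better; higher Dstat (closer to 1) is better, which is how the columns of Tables 7-9 should be read.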

On one hand, compared with the prediction results of the five single decompositions and their mean prediction results, the proposed MICEEMDAN-WOA-RVFL achieves the lowest MAPE and the highest Dstat values in all 12 cases, and the lowest RMSE values in 11 out of the 12 cases, indicating that the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and improve the ensemble prediction accuracy.

On the other hand, the multiple decomposition strategy greatly improves the stability of the prediction model. For example, the MAPE values of the five single-decomposition models with horizon 6 on the IP time series range from 0.0034 to 0.1114, indicating that different single decompositions can produce markedly different prediction results. When we employ the multiple decomposition strategy, we can overcome the randomness of a single decomposition and thus enhance prediction stability.

In summary, the experimental results suggest that the multiple decomposition strategy and prediction ensemble can effectively enhance prediction accuracy and stability. The main reasons for the prediction improvement lie in three aspects: (1) the multiple decomposition reduces the randomness of any one single decomposition and simultaneously generates groups of differentiated subseries; (2) predictions using these groups of differentiated subseries yield diverse prediction results; and (3) the selection and ensemble of these diverse prediction results ensure both accuracy and diversity and thus improve the final ensemble prediction.
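The M-fold decompose-predict-aggregate loop described above can be sketched as follows. This is a minimal illustration only: ICEEMDAN and the WOA-tuned RVFL predictors are replaced by simple stand-ins (a randomized moving-average split and placeholder per-subseries forecasts), so that just the multiple-decomposition ensemble logic is shown.

```python
import numpy as np

rng = np.random.default_rng(42)

def random_decompose(x, rng):
    # Stand-in for an ICEEMDAN run with random parameters: split the series
    # into a smoothed component and a residual using a random window length.
    w = int(rng.integers(3, 9))
    trend = np.convolve(x, np.ones(w) / w, mode="same")
    return trend, x - trend

# toy series: history plus one value to be predicted
x = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.1 * rng.normal(size=200)
history, target = x[:-1], x[-1]

M = 100                                    # number of random decompositions
per_decomposition = []
for _ in range(M):
    trend, resid = random_decompose(history, rng)
    # placeholder per-subseries predictors: the smooth part persists,
    # the residual is forecast by its recent mean
    per_decomposition.append(trend[-1] + resid[-5:].mean())

prediction = np.mean(per_decomposition)    # ensemble across decompositions
print(round(float(prediction), 3))
```

Because each decomposition draws different parameters, the M per-decomposition forecasts differ, and averaging (or selectively averaging, as in Section 5.3) them is what smooths out the randomness of any single decomposition.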

5.2. The Optimization Effectiveness of WOA. When we use RVFL networks to construct predictors, a number of parameters need to be set in advance. In this study, WOA is introduced to search the optimal parameter values for the RVFL predictors using its powerful optimization ability. To investigate the optimization effectiveness of WOA for parameter search, we compare the proposed MICEEMDAN-WOA-RVFL with MICEEMDAN-RVFL without WOA optimization. Following the literature [45], we fixed the number of hidden neurons Nhe = 100, the activation function Func = sigmoid, and the random type Rand = Gaussian in MICEEMDAN-RVFL. The MAPE, RMSE, and Dstat values are reported in Tables 10-12, respectively.
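The RVFL parameters named above (number of hidden neurons, activation function, random weight type) can be made concrete with a minimal generic RVFL: Gaussian random input-to-hidden weights that are never trained, a sigmoid hidden layer, direct input-output links, and a closed-form ridge readout. This is a textbook RVFL sketch, not necessarily the authors' exact implementation; the lag count and ridge value are illustrative assumptions.

```python
import numpy as np

class RVFL:
    def __init__(self, n_hidden=100, ridge=1e-2, seed=0):
        self.n_hidden = n_hidden
        self.ridge = ridge
        self.rng = np.random.default_rng(seed)

    def _features(self, X):
        H = 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))  # sigmoid hidden layer
        return np.hstack([X, H])                          # direct links + hidden

    def fit(self, X, y):
        d = X.shape[1]
        self.W = self.rng.normal(size=(d, self.n_hidden))  # random, never trained
        self.b = self.rng.normal(size=self.n_hidden)
        D = self._features(X)
        # closed-form ridge solution for the output weights only
        A = D.T @ D + self.ridge * np.eye(D.shape[1])
        self.beta = np.linalg.solve(A, D.T @ y)
        return self

    def predict(self, X):
        return self._features(X) @ self.beta

# usage: one-step-ahead forecasting with 4 lagged values as inputs
rng = np.random.default_rng(1)
series = np.sin(np.linspace(0, 20, 300)) + 0.05 * rng.normal(size=300)
lags = 4
X = np.array([series[i:i + lags] for i in range(len(series) - lags)])
y = series[lags:]
model = RVFL(n_hidden=100).fit(X[:250], y[:250])
err = np.sqrt(np.mean((model.predict(X[250:]) - y[250:]) ** 2))
print(round(float(err), 3))
```

Because only the output weights are solved for, training is non-iterative, which is exactly what makes repeating the fit across many decompositions and WOA candidates affordable.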

In all four time series datasets, the prediction performance of the proposed MICEEMDAN-WOA-RVFL model in terms of MAPE and RMSE is better than or equal to that of the MICEEMDAN-RVFL model without WOA optimization in all 12 cases, except for the RMSE value with horizon 1 on the SSEC dataset, as listed in Tables 10 and 11. In addition, MICEEMDAN-WOA-RVFL obtains higher Dstat values in 10 out of 12 cases, as can be seen in Table 12. All these results indicate that WOA can effectively search the optimal parameter settings for RVFL networks, further improving the overall prediction performance.
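For concreteness, a compact sketch of the WOA search loop of Mirjalili and Lewis [38] is given below, minimizing a toy objective that stands in for the RVFL validation error. The population size, iteration count, and bounds are illustrative assumptions, and the update rules follow the standard encircling, bubble-net spiral, and random-search phases.

```python
import numpy as np

def woa(objective, dim, bounds, n_whales=20, n_iter=200, seed=0):
    # Minimal whale optimization algorithm (Mirjalili & Lewis, 2016).
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_whales, dim))
    fitness = np.apply_along_axis(objective, 1, X)
    best = X[fitness.argmin()].copy()
    for t in range(n_iter):
        a = 2 - 2 * t / n_iter                    # a decreases linearly 2 -> 0
        for i in range(n_whales):
            r1, r2 = rng.random(dim), rng.random(dim)
            A, C = 2 * a * r1 - a, 2 * r2
            if rng.random() < 0.5:
                if np.all(np.abs(A) < 1):         # exploit: encircle the best whale
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                             # explore around a random whale
                    rand = X[rng.integers(n_whales)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                                 # spiral bubble-net update
                l = rng.uniform(-1, 1)
                D = np.abs(best - X[i])
                X[i] = D * np.exp(l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lo, hi)
        fitness = np.apply_along_axis(objective, 1, X)
        if fitness.min() < objective(best):       # keep the best-so-far solution
            best = X[fitness.argmin()].copy()
    return best

# toy objective standing in for RVFL validation error
sphere = lambda x: float(np.sum(x ** 2))
best = woa(sphere, dim=3, bounds=(-5.0, 5.0))
print(round(float(sphere(best)), 6))
```

In the paper's setting, `objective` would evaluate a candidate RVFL configuration (hidden-layer size, activation, random-weight type encoded numerically) by its error on held-out data.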

5.3. The Impact of Ensemble Size. Previous research has demonstrated that an ensemble strategy using all individual prediction models is unlikely to work well, and that the selection of individual prediction models contributes to improving the ensemble prediction performance [57]. In this study, we sort all individual prediction models based on their past performance (RMSE values) and then select the top N percent as the ensemble size to construct the ensemble prediction model. To further investigate the impact of ensemble size on ensemble prediction, we use different ensemble sizes (es = 10, 20, ..., 100) to select the top N percent of individual forecasting models, develop the corresponding ensemble prediction models, and conduct the one-step-ahead forecasting experiment on the four time series datasets. The results are demonstrated in Figure 5.
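The ranking-and-selection step just described can be sketched as below; the function name and the toy numbers are hypothetical, and only the top-N-percent-by-past-RMSE logic is taken from the text.

```python
import numpy as np

def select_and_ensemble(predictions, past_rmse, es=30):
    # keep the top `es` percent of models ranked by past RMSE
    # (lower is better) and average their forecasts
    n = len(past_rmse)
    k = max(1, int(round(n * es / 100)))
    top = np.argsort(past_rmse)[:k]
    return float(np.mean(np.asarray(predictions)[top]))

# five hypothetical models' one-step forecasts and their historical RMSEs
preds = [10.2, 10.5, 9.9, 13.0, 10.0]
rmse_hist = [0.30, 0.25, 0.28, 0.90, 0.27]
print(select_and_ensemble(preds, rmse_hist, es=40))  # → 10.25
```

With es = 40 and five models, the two lowest-RMSE models (indices 1 and 4) are kept and averaged, which is the mechanism whose sensitivity Figure 5 explores.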

We can see that MICEEMDAN-WOA-RVFL obtains the best forecasting performance on the WTI, IP, and SSEC datasets when the ensemble size es is in the range of 20-40, and on the USDEUR dataset when the ensemble size es


Table 7: The mean absolute percent error (MAPE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.0036 | 0.0043 | 0.0045 | 0.0045 | 0.0041 | 0.0040 | 0.0043
WTI | 3 | 0.0080 | 0.0088 | 0.0084 | 0.0084 | 0.0082 | 0.0083 | 0.0084
WTI | 6 | 0.0113 | 0.0132 | 0.0115 | 0.0113 | 0.0118 | 0.0120 | 0.0112
USDEUR | 1 | 0.0006 | 0.0011 | 0.0010 | 0.0008 | 0.0010 | 0.0008 | 0.0009
USDEUR | 3 | 0.0015 | 0.0021 | 0.0021 | 0.0015 | 0.0017 | 0.0015 | 0.0018
USDEUR | 6 | 0.0022 | 0.0033 | 0.0022 | 0.0023 | 0.0023 | 0.0024 | 0.0025
IP | 1 | 0.0012 | 0.0016 | 0.0016 | 0.0017 | 0.0017 | 0.0015 | 0.0016
IP | 3 | 0.0023 | 0.0024 | 0.0026 | 0.0030 | 0.0028 | 0.0024 | 0.0026
IP | 6 | 0.0032 | 0.0036 | 0.0034 | 0.1114 | 0.0036 | 0.0041 | 0.0252
SSEC | 1 | 0.0020 | 0.0024 | 0.0023 | 0.0026 | 0.0023 | 0.0023 | 0.0024
SSEC | 3 | 0.0044 | 0.0045 | 0.0046 | 0.0052 | 0.0045 | 0.0047 | 0.0047
SSEC | 6 | 0.0065 | 0.0067 | 0.0070 | 0.0085 | 0.0068 | 0.0075 | 0.0073

Table 8: The root mean squared error (RMSE) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.2715 | 0.3316 | 0.3376 | 0.3352 | 0.3191 | 0.3148 | 0.3277
WTI | 3 | 0.5953 | 0.6557 | 0.6214 | 0.6200 | 0.6058 | 0.6133 | 0.6232
WTI | 6 | 0.8146 | 0.9490 | 0.8239 | 0.8199 | 0.8508 | 0.8597 | 0.8607
USDEUR | 1 | 0.0009 | 0.0017 | 0.0017 | 0.0012 | 0.0015 | 0.0012 | 0.0015
USDEUR | 3 | 0.0022 | 0.0031 | 0.0034 | 0.0023 | 0.0025 | 0.0023 | 0.0027
USDEUR | 6 | 0.0033 | 0.0048 | 0.0033 | 0.0035 | 0.0035 | 0.0036 | 0.0037
IP | 1 | 0.2114 | 0.2894 | 0.3080 | 0.3656 | 0.3220 | 0.2589 | 0.3088
IP | 3 | 0.3875 | 0.4098 | 0.4507 | 0.4809 | 0.4776 | 0.4027 | 0.4443
IP | 6 | 0.5340 | 0.5502 | 0.5563 | 1.0081 | 0.5697 | 0.6466 | 0.6661
SSEC | 1 | 10.8115 | 11.5515 | 13.0728 | 16.7301 | 12.2465 | 14.3671 | 13.5936
SSEC | 3 | 22.2983 | 22.0603 | 22.8140 | 33.9937 | 23.9755 | 25.8004 | 25.7288
SSEC | 6 | 35.7201 | 37.0255 | 41.9033 | 55.5066 | 38.6140 | 48.2131 | 44.2525

Table 9: The directional statistic (Dstat) values of single decomposition and multiple decompositions.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean
WTI | 1 | 0.9381 | 0.9201 | 0.9161 | 0.9184 | 0.9323 | 0.9271 | 0.9228
WTI | 3 | 0.8576 | 0.8414 | 0.8548 | 0.8385 | 0.8553 | 0.8513 | 0.8483
WTI | 6 | 0.7737 | 0.7419 | 0.7703 | 0.7668 | 0.7627 | 0.7616 | 0.7607
USDEUR | 1 | 0.941 | 0.9008 | 0.9270 | 0.9204 | 0.9073 | 0.9157 | 0.9142
USDEUR | 3 | 0.8502 | 0.8052 | 0.8418 | 0.8399 | 0.8296 | 0.8427 | 0.8318
USDEUR | 6 | 0.7828 | 0.6826 | 0.7809 | 0.7819 | 0.7772 | 0.7706 | 0.7586
IP | 1 | 0.9256 | 0.8967 | 0.8926 | 0.9132 | 0.8967 | 0.9174 | 0.9033
IP | 3 | 0.8802 | 0.8760 | 0.8430 | 0.8223 | 0.8471 | 0.8802 | 0.8537
IP | 6 | 0.8141 | 0.8017 | 0.8017 | 0.7066 | 0.7851 | 0.8017 | 0.7794
SSEC | 1 | 0.9156 | 0.9052 | 0.9052 | 0.9052 | 0.9087 | 0.9135 | 0.9076
SSEC | 3 | 0.8396 | 0.8222 | 0.8292 | 0.8208 | 0.8368 | 0.8264 | 0.8271
SSEC | 6 | 0.7587 | 0.7462 | 0.7441 | 0.7134 | 0.7594 | 0.7448 | 0.7416

Table 10: The mean absolute percent error (MAPE) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.0036 | 0.0037
WTI | 3 | 0.0080 | 0.0082
WTI | 6 | 0.0113 | 0.0118
USDEUR | 1 | 0.0006 | 0.0006
USDEUR | 3 | 0.0015 | 0.0015
USDEUR | 6 | 0.0022 | 0.0023
IP | 1 | 0.0012 | 0.0013
IP | 3 | 0.0023 | 0.0023
IP | 6 | 0.0032 | 0.0036
SSEC | 1 | 0.0020 | 0.0020
SSEC | 3 | 0.0044 | 0.0044
SSEC | 6 | 0.0065 | 0.0066


Table 11: The root mean squared error (RMSE) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.2715 | 0.2824
WTI | 3 | 0.5953 | 0.6079
WTI | 6 | 0.8146 | 0.8485
USDEUR | 1 | 0.0009 | 0.0010
USDEUR | 3 | 0.0022 | 0.0022
USDEUR | 6 | 0.0033 | 0.0034
IP | 1 | 0.2114 | 0.2362
IP | 3 | 0.3875 | 0.4033
IP | 6 | 0.5340 | 0.5531
SSEC | 1 | 10.8115 | 10.7616
SSEC | 3 | 22.2983 | 22.7456
SSEC | 6 | 35.7201 | 36.3150

Table 12: The directional statistic (Dstat) values with and without WOA optimization.

Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL
WTI | 1 | 0.9381 | 0.9358
WTI | 3 | 0.8576 | 0.8565
WTI | 6 | 0.7737 | 0.7656
USDEUR | 1 | 0.941 | 0.9317
USDEUR | 3 | 0.8502 | 0.8455
USDEUR | 6 | 0.7828 | 0.7762
IP | 1 | 0.9256 | 0.9174
IP | 3 | 0.8802 | 0.8430
IP | 6 | 0.8141 | 0.7893
SSEC | 1 | 0.9156 | 0.9191
SSEC | 3 | 0.8396 | 0.8351
SSEC | 6 | 0.7587 | 0.7594

[Figure 5, panels (a)-(f): MAPE curves for the WTI, USDEUR, IP, and SSEC datasets and RMSE curves for WTI and USDEUR, plotted against ensemble size (10-100%); figure continued.]


is in the range of 10-40, in terms of MAPE, RMSE, and Dstat. When es is greater than 40, the MAPE and RMSE values continue to worsen, becoming the worst when the ensemble size grows to 100. The experimental results indicate that the ensemble size has a significant overall impact on ensemble prediction and that an ideal range of ensemble size is about 20 to 40.

6. Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. MICEEMDAN-WOA-RVFL first uses ICEEMDAN to separate the original economic and financial time series into groups of subseries many times, and RVFL networks are then used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve the prediction accuracy. Third, the predictions of the subseries in each decomposition are integrated into the forecasting results of that decomposition using addition. Finally, the prediction results of each decomposition are selected based on RMSE values and combined as the final prediction results.

As far as we know, this is the first time that WOA has been employed for the optimal parameter search of RVFL networks and that the multiple decomposition strategy has been introduced in time series forecasting. The empirical results indicate that (1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy in various economic and financial time series forecasting tasks; (2) WOA can effectively search optimal parameters for RVFL networks and improve the prediction performance of economic and financial time series forecasting; and (3) the multiple decomposition strategy can successfully overcome the randomness of a single decomposition and enhance the prediction accuracy and stability of the developed prediction model.

We will extend our study in two directions in the future: (1) applying MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble method of individual forecasting models to further enhance the prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and

[Figure 5, panels (g)-(l): RMSE curves for the IP and SSEC datasets and Dstat curves for the WTI, USDEUR, IP, and SSEC datasets, plotted against ensemble size (10-100%).]

Figure 5: The impact of ensemble size with one-step-ahead forecasting.


the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365-374, 2017.
[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441-1449, 2014.
[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831-848, 2005.
[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192-196, IEEE, Warsaw, Poland, September 2005.
[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?" Journal of International Economics, vol. 60, no. 1, pp. 85-107, 2003.
[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7-18, 2014.
[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13-29, 2014.
[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331-350, 2010.
[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201-1208, 2008.
[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111-117, 2010.
[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1-9, 2013.
[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97-111, 2018.
[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28-39, 2014.
[14] X. Li, H. Xie, R. Wang, et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67-78, 2016.
[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22-42, 2018.
[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457-480, 2018.
[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27-38, 2014.
[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN, and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284-308, 2014.
[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651-671, 2013.
[20] S. Lahmiri, "A variational mode decomposition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268-273, 2016.
[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228-1244, 2013.
[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603-3628, 2019.
[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235-6250, 2014.
[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164-178, 2016.
[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754-762, 2016.
[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1-13, 2007.
[27] N. E. Huang, Z. Shen, S. R. Long, et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903-995, 1998.
[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1-41, 2009.
[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144-4147, IEEE, Prague, Czech Republic, May 2011.
[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140-165, 2020.
[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882-1905, 2018.
[32] T. Li, M. Zhou, C. Guo, et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.
[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19-29, 2014.
[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852-1872, 2020.
[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.
[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205-1225, 2019.
[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1-10, 2020.
[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51-67, 2016.
[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1-15, 2018.
[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm, Multi-Objective Whale Optimization Algorithm, for wind speed forecasting," Applied Energy, vol. 208, pp. 344-360, 2017.
[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250-260, 2019.
[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163-180, 1994.
[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094-1105, 2016.
[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078-1093, 2016.
[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097-1108, 2018.
[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319-340, 2019.
[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5-24, 2020.
[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.
[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983-994, 2020.
[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49-58, 2019.
[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.
[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165-4172, 2019.
[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319-7327, 2020.
[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525-541, 2018.
[55] FRED website, https://fred.stlouisfed.org/.
[56] NetEase website, http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.
[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31-53, 2006.


Page 12: AHybridApproachIntegratingMultipleICEEMDANs,WOA,and ... · 2020. 10. 22. · Aff477”53Q I jAA Aff477”53Q m Z–3 Z–3 A A m ff”G I n j I 3 +A m Z–3jYG

WOA optimization can significantly enhance the predictionaccuracy of economic and financial time series forecasting

5 Discussion

To better investigate the proposed MICEEMDAN-WOA-RVFL we further discuss the developed prediction modelincluding the comparison of single decomposition andmultiple decompositions the optimization effectiveness ofWOA and the impact of ensemble size in this subsection

51 Comparison of Single Decomposition and MultipleDecompositions One of the main novelties of this study isthe multiple decomposition strategy which can successfullyovercome the randomness of a single decomposition andimprove the prediction accuracy and stability of the de-veloped forecasting model To evaluate the effectiveness ofthe multiple decomposition strategy we compare the pre-diction results of MICEEMDAN-WOA-RVFL and ICE-EMDAN-WOA-RVFL )e former ensembles theprediction results of M(M 100) individual ICEEMDANdecompositions with random parameters while the latteronly employs one ICEEMDAN decomposition We ran-domly choose 5 out of these 100 decompositions and executeICEEMDAN-WOA-RVFL for time series forecastingTables 7ndash9 report the MAPE RMSE and Dstat values of theMICEEMDAN-WOA-RVFL and the five ICEEMDAN-WOA-RVFL models using single decomposition and thecorresponding mean values of these five models using singledecomposition

On one hand compared with the prediction results ofthe five single decompositions and the mean predictionresults the proposed MICEEMDAN-WOA-RVFL achievesthe lowest MAPE and the highest Dstat values in all the 12cases and the lowest RMSE values in 11 out of all the 12cases indicating that the multiple decomposition strategycan successfully overcome the randomness of single de-composition and improve the ensemble prediction accuracy

On the other hand we can find that the multiple de-composition strategy can greatly improve the stability of theprediction model For example the range of MAPE values ofthe five single decomposition models with horizon 6 in theIP time series dataset is from 00034 to 01114 indicatingthat different single decomposition can produce relativelygreat difference in prediction results When we employ themultiple decomposition strategy we can overcome therandomness of single decomposition and thus enhanceprediction stability

In summary the experimental results suggest that themultiple decomposition strategy and prediction ensemblecan effectively enhance prediction accuracy and stability)e main reasons for the prediction improvement lie inthree aspects (1) the multiple decomposition can reduce the

randomness of one single decomposition and simulta-neously generate groups of differential subseries (2) pre-dictions using these groups of differential subseries canachieve diverse prediction results and (3) the selection andensemble of these diverse prediction results can ensure bothaccuracy and diversity and thus improve the final ensembleprediction

52 9e Optimization Effectiveness of WOA When we useRVFL networks to construct predictors a number of pa-rameters need to be set in advance In this study WOA isintroduced to search the optimal parameter values forRVFL predictors using its powerful optimization ability Toinvestigate the optimization effectiveness of WOA forparameter search we compare the proposed MICEEM-DAN-WOA-RVFL with MICEEMDAN-RVFL withoutWOA optimization According to the literature [45] wefixed the number of hidden neurons Nhe 100 activationfunction Func sigmoid and random type Rand Gaussianin MICEEMDAN-RVFL )e MAPE RMSE and Dstatvalues are reported in Tables 10ndash12 respectively

In all the four time series datasets the prediction per-formance of the proposed MICEEMDAN-WOA-RVFLmodel is better than or equal to that of the MICEEMDAN-RVFL model without WOA optimization in all the 12 casesexcept for the RMSE value with horizon 1 in the SSEC datasetin terms of MAPE and RMSE as listed in Tables 10 and 11 Inaddition the MICEEMDAN-WOA-RVFL obtains the higherDstat values in 10 out of 12 cases which can be seen in Ta-ble 12)e all results indicate thatWOA can effectively searchthe optimal parameter settings for RVFL networks furtherimproving the overall prediction performance

53 9e Impact of Ensemble Size )e previous research hasdemonstrated that the ensemble strategy of using all indi-vidual prediction models is unlikely to work well and theselection of individual prediction models contributes toimproving the ensemble prediction performance [57] Inthis study we sort all individual prediction models based ontheir past performance (RMSE values) and then select thetopN percent as the ensemble size to construct the ensembleprediction model To further investigate the impact of en-semble size on ensemble prediction we use different en-semble sizes (es 10 20 100) to select the top Npercent of individual forecasting models to develop theensemble prediction model and conduct the one-step-aheadforecasting experiment on the four time series datasets )eresults are demonstrated in Figure 5

We can see that the MICEEMDAN-WOA-RVFL obtainsthe best forecasting performance in the WTI IP and SSECdatasets when the ensemble size es is in the range of 20ndash40 and in the USDEUR dataset when the ensemble size es

12 Complexity

Table 7 )e mean absolute percent error (MAPE) values of single decomposition and multiple decompositions

Dataset Horizon MICEEMDAN-WOA-RVFL

ICEEMDAN-WOA-RVFL1

ICEEMDAN-WOA-RVFL2

ICEEMDAN-WOA-RVFL3

ICEEMDAN-WOA-RVFL4

ICEEMDAN-WOA-RVFL5 Mean

WTI1 00036 00043 00045 00045 00041 00040 000433 00080 00088 00084 00084 00082 00083 000846 00113 00132 00115 00113 00118 00120 00112

USDEUR

1 00006 00011 00010 00008 00010 00008 000093 00015 00021 00021 00015 00017 00015 000186 00022 00033 00022 00023 00023 00024 00025

IP1 00012 00016 00016 00017 00017 00015 000163 00023 00024 00026 00030 00028 00024 000266 00032 00036 00034 01114 00036 00041 00252

SSEC1 00020 00024 00023 00026 00023 00023 000243 00044 00045 00046 00052 00045 00047 000476 00065 00067 00070 00085 00068 00075 00073

Table 8: The root mean squared error (RMSE) values of single decomposition and multiple decompositions.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean |
|---|---|---|---|---|---|---|---|---|
| WTI | 1 | 0.2715 | 0.3316 | 0.3376 | 0.3352 | 0.3191 | 0.3148 | 0.3277 |
| WTI | 3 | 0.5953 | 0.6557 | 0.6214 | 0.6200 | 0.6058 | 0.6133 | 0.6232 |
| WTI | 6 | 0.8146 | 0.9490 | 0.8239 | 0.8199 | 0.8508 | 0.8597 | 0.8607 |
| USDEUR | 1 | 0.0009 | 0.0017 | 0.0017 | 0.0012 | 0.0015 | 0.0012 | 0.0015 |
| USDEUR | 3 | 0.0022 | 0.0031 | 0.0034 | 0.0023 | 0.0025 | 0.0023 | 0.0027 |
| USDEUR | 6 | 0.0033 | 0.0048 | 0.0033 | 0.0035 | 0.0035 | 0.0036 | 0.0037 |
| IP | 1 | 0.2114 | 0.2894 | 0.3080 | 0.3656 | 0.3220 | 0.2589 | 0.3088 |
| IP | 3 | 0.3875 | 0.4098 | 0.4507 | 0.4809 | 0.4776 | 0.4027 | 0.4443 |
| IP | 6 | 0.5340 | 0.5502 | 0.5563 | 1.0081 | 0.5697 | 0.6466 | 0.6661 |
| SSEC | 1 | 10.8115 | 11.5515 | 13.0728 | 16.7301 | 12.2465 | 14.3671 | 13.5936 |
| SSEC | 3 | 22.2983 | 22.0603 | 22.8140 | 33.9937 | 23.9755 | 25.8004 | 25.7288 |
| SSEC | 6 | 35.7201 | 37.0255 | 41.9033 | 55.5066 | 38.6140 | 48.2131 | 44.2525 |

Table 9: The directional statistic (Dstat) values of single decomposition and multiple decompositions.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | ICEEMDAN-WOA-RVFL1 | ICEEMDAN-WOA-RVFL2 | ICEEMDAN-WOA-RVFL3 | ICEEMDAN-WOA-RVFL4 | ICEEMDAN-WOA-RVFL5 | Mean |
|---|---|---|---|---|---|---|---|---|
| WTI | 1 | 0.9381 | 0.9201 | 0.9161 | 0.9184 | 0.9323 | 0.9271 | 0.9228 |
| WTI | 3 | 0.8576 | 0.8414 | 0.8548 | 0.8385 | 0.8553 | 0.8513 | 0.8483 |
| WTI | 6 | 0.7737 | 0.7419 | 0.7703 | 0.7668 | 0.7627 | 0.7616 | 0.7607 |
| USDEUR | 1 | 0.941 | 0.9008 | 0.9270 | 0.9204 | 0.9073 | 0.9157 | 0.9142 |
| USDEUR | 3 | 0.8502 | 0.8052 | 0.8418 | 0.8399 | 0.8296 | 0.8427 | 0.8318 |
| USDEUR | 6 | 0.7828 | 0.6826 | 0.7809 | 0.7819 | 0.7772 | 0.7706 | 0.7586 |
| IP | 1 | 0.9256 | 0.8967 | 0.8926 | 0.9132 | 0.8967 | 0.9174 | 0.9033 |
| IP | 3 | 0.8802 | 0.8760 | 0.8430 | 0.8223 | 0.8471 | 0.8802 | 0.8537 |
| IP | 6 | 0.8141 | 0.8017 | 0.8017 | 0.7066 | 0.7851 | 0.8017 | 0.7794 |
| SSEC | 1 | 0.9156 | 0.9052 | 0.9052 | 0.9052 | 0.9087 | 0.9135 | 0.9076 |
| SSEC | 3 | 0.8396 | 0.8222 | 0.8292 | 0.8208 | 0.8368 | 0.8264 | 0.8271 |
| SSEC | 6 | 0.7587 | 0.7462 | 0.7441 | 0.7134 | 0.7594 | 0.7448 | 0.7416 |

Table 10: The mean absolute percent error (MAPE) values with and without WOA optimization.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL |
|---|---|---|---|
| WTI | 1 | 0.0036 | 0.0037 |
| WTI | 3 | 0.0080 | 0.0082 |
| WTI | 6 | 0.0113 | 0.0118 |
| USDEUR | 1 | 0.0006 | 0.0006 |
| USDEUR | 3 | 0.0015 | 0.0015 |
| USDEUR | 6 | 0.0022 | 0.0023 |
| IP | 1 | 0.0012 | 0.0013 |
| IP | 3 | 0.0023 | 0.0023 |
| IP | 6 | 0.0032 | 0.0036 |
| SSEC | 1 | 0.0020 | 0.0020 |
| SSEC | 3 | 0.0044 | 0.0044 |
| SSEC | 6 | 0.0065 | 0.0066 |


Table 11: The root mean squared error (RMSE) values with and without WOA optimization.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL |
|---|---|---|---|
| WTI | 1 | 0.2715 | 0.2824 |
| WTI | 3 | 0.5953 | 0.6079 |
| WTI | 6 | 0.8146 | 0.8485 |
| USDEUR | 1 | 0.0009 | 0.0010 |
| USDEUR | 3 | 0.0022 | 0.0022 |
| USDEUR | 6 | 0.0033 | 0.0034 |
| IP | 1 | 0.2114 | 0.2362 |
| IP | 3 | 0.3875 | 0.4033 |
| IP | 6 | 0.5340 | 0.5531 |
| SSEC | 1 | 10.8115 | 10.7616 |
| SSEC | 3 | 22.2983 | 22.7456 |
| SSEC | 6 | 35.7201 | 36.3150 |

Table 12: The directional statistic (Dstat) values with and without WOA optimization.

| Dataset | Horizon | MICEEMDAN-WOA-RVFL | MICEEMDAN-RVFL |
|---|---|---|---|
| WTI | 1 | 0.9381 | 0.9358 |
| WTI | 3 | 0.8576 | 0.8565 |
| WTI | 6 | 0.7737 | 0.7656 |
| USDEUR | 1 | 0.941 | 0.9317 |
| USDEUR | 3 | 0.8502 | 0.8455 |
| USDEUR | 6 | 0.7828 | 0.7762 |
| IP | 1 | 0.9256 | 0.9174 |
| IP | 3 | 0.8802 | 0.8430 |
| IP | 6 | 0.8141 | 0.7893 |
| SSEC | 1 | 0.9156 | 0.9191 |
| SSEC | 3 | 0.8396 | 0.8351 |
| SSEC | 6 | 0.7587 | 0.7594 |

[Figure 5, panels (a)-(f): MAPE curves for the WTI, USDEUR, IP, and SSEC datasets and RMSE curves for the WTI and USDEUR datasets, plotted against ensemble size (10% to 100%).]


is in the range of 10–40, in terms of MAPE, RMSE, and Dstat. When es is greater than 40, the MAPE and RMSE values continue to worsen and become the worst when the ensemble size grows to 100. The experimental results indicate that the ensemble size has a significant overall impact on ensemble prediction and that an ideal range of ensemble size is about 20 to 40.

6. Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. MICEEMDAN-WOA-RVFL first uses ICEEMDAN to separate the original economic and financial time series into groups of subseries multiple times, and RVFL networks are then used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve the prediction accuracy. Thirdly, the predictions of the subseries in each decomposition are aggregated by addition into the forecasting result of that decomposition. Finally, the prediction results of the individual decompositions are selected based on their RMSE values and combined into the final prediction results.
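As a rough illustration of the base learner in this pipeline, a minimal RVFL regressor can be sketched as below: random hidden (enhancement) weights, direct input-output links, and a closed-form ridge solution for the output weights. The hyperparameter names and the tanh activation are our assumptions, not necessarily the exact configuration used in the paper; in the proposed model, such settings are the quantities tuned by WOA.

```python
import numpy as np

class RVFL:
    """Minimal random vector functional link regressor (a sketch):
    random hidden layer plus direct input-output links, with output
    weights obtained by a closed-form ridge (regularized least squares) fit."""
    def __init__(self, n_hidden=50, reg=1e-6, seed=0):
        self.n_hidden, self.reg = n_hidden, reg
        self.rng = np.random.default_rng(seed)

    def fit(self, X, y):
        d = X.shape[1]
        # Hidden weights and biases are drawn at random and never trained.
        self.W = self.rng.uniform(-1.0, 1.0, (d, self.n_hidden))
        self.b = self.rng.uniform(-1.0, 1.0, self.n_hidden)
        H = np.tanh(X @ self.W + self.b)
        D = np.hstack([X, H])                       # direct links + enhancement nodes
        A = D.T @ D + self.reg * np.eye(D.shape[1])
        self.beta = np.linalg.solve(A, D.T @ y)     # ridge solution for output weights
        return self

    def predict(self, X):
        D = np.hstack([X, np.tanh(X @ self.W + self.b)])
        return D @ self.beta
```

Only the output weights `beta` are solved for, which is why RVFL training is fast enough to sit inside a WOA search loop that repeatedly retrains the network with different candidate hyperparameters.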

As far as we know, this is the first time that WOA has been employed for the optimal parameter search of RVFL networks and that the multiple-decomposition strategy has been introduced into time series forecasting. The empirical results indicate that:

(1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy on various economic and financial time series; (2) WOA can effectively search for optimal parameters for RVFL networks and improve the prediction performance of economic and financial time series forecasting; and (3) the multiple-decomposition strategy can successfully overcome the randomness of a single decomposition and enhance the prediction accuracy and stability of the developed prediction model.

We will extend our study in two aspects in the future: (1) applying MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble methods for individual forecasting models to further enhance prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and

[Figure 5, panels (g)-(l): RMSE curves for the IP and SSEC datasets and Dstat curves for the WTI, USDEUR, IP, and SSEC datasets, plotted against ensemble size (10% to 100%).]

Figure 5: The impact of ensemble size with one-step-ahead forecasting.


the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365–374, 2017.
[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.
[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831–848, 2005.
[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192–196, IEEE, Warsaw, Poland, September 2005.
[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?" Journal of International Economics, vol. 60, no. 1, pp. 85–107, 2003.
[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7–18, 2014.
[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13–29, 2014.
[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331–350, 2010.
[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201–1208, 2008.
[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111–117, 2010.
[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1–9, 2013.
[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97–111, 2018.
[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28–39, 2014.
[14] X. Li, H. Xie, R. Wang et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67–78, 2016.
[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22–42, 2018.
[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457–480, 2018.
[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27–38, 2014.
[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284–308, 2014.
[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.
[20] S. Lahmiri, "A variational mode decompoisition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268–273, 2016.
[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228–1244, 2013.
[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603–3628, 2019.
[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235–6250, 2014.
[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164–178, 2016.
[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754–762, 2016.
[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1–13, 2007.
[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903–995, 1998.
[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1–41, 2009.
[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144–4147, IEEE, Prague, Czech Republic, May 2011.
[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140–165, 2020.
[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882–1905, 2018.
[32] T. Li, M. Zhou, C. Guo et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.
[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19–29, 2014.
[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852–1872, 2020.
[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.
[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205–1225, 2019.
[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1–10, 2020.
[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.
[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1–15, 2018.
[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm-Multi-Objective Whale Optimization Algorithm for wind speed forecasting," Applied Energy, vol. 208, pp. 344–360, 2017.
[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250–260, 2019.
[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163–180, 1994.
[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094–1105, 2016.
[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078–1093, 2016.
[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097–1108, 2018.
[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319–340, 2019.
[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5–24, 2020.
[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.
[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983–994, 2020.
[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49–58, 2019.
[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.
[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165–4172, 2019.
[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319–7327, 2020.
[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525–541, 2018.
[55] FRED Website, https://fred.stlouisfed.org.
[56] NetEase Website, http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.
[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31–53, 2006.


[47] J Wu J Shi and T Li ldquoA novel image encryption approachbased on a hyperchaotic system pixel-level filtering withvariable kernels and DNA-level diffusionrdquo Entropy vol 22pp 5ndash24 2020

[48] T Li M Yang J Wu and X Jing ldquoA novel image encryptionalgorithm based on a fractional-order hyperchaotic systemand DNA computingrdquo Complexity vol 2017 Article ID9010251 13 pages 2017

[49] H Zhao J Zheng W Deng and Y Song ldquoSemi-supervisedbroad learning system based on manifold regularization andbroad networkrdquo IEEE Transactions on Circuits and Systems IRegular Papers vol 67 no 3 pp 983ndash994 2020

[50] S Sun S Wang and Y Wei ldquoA new multiscale decompo-sition ensemble approach for forecasting exchange ratesrdquoEconomic Modelling vol 81 pp 49ndash58 2019

[51] T Li and M Zhou ldquoECG classification using wavelet packetentropy and random forestsrdquo Entropy vol 18 no 8 p 2852016

[52] H Zhao H Liu J Xu andW Deng ldquoPerformance predictionusing high-order differential mathematical morphology gra-dient spectrum entropy and extreme learning machinerdquo IEEETransactions on Instrumentation and Measurement vol 69pp 4165ndash4172 2019

[53] W Deng H Liu J Xu H Zhao and Y Song ldquoAn improvedquantum-inspired differential evolution algorithm for deepbelief networkrdquo IEEE Transactions on Instrumentation andMeasurement vol 69 no 10 pp 7319ndash7327 2020

[54] H Liu Z Duan F-Z Han and Y-F Li ldquoBig multi-step windspeed forecasting model based on secondary decompositionensemble method and error correction algorithmrdquo EnergyConversion and Management vol 156 pp 525ndash541 2018

[55] Fred Website httpsfredstlouisfedorg[56] NetEase Website httpquotesmoney163comservice

chddatahtmlcode=0000001ampstart=19901219[57] M Aiolfi and A Timmermann ldquoPersistence in forecasting

performance and conditional combination strategiesrdquo Journalof Econometrics vol 135 no 1-2 pp 31ndash53 2006

Complexity 17

Page 14: AHybridApproachIntegratingMultipleICEEMDANs,WOA,and ... · 2020. 10. 22. · Aff477”53Q I jAA Aff477”53Q m Z–3 Z–3 A A m ff”G I n j I 3 +A m Z–3jYG

Table 11: The root mean squared error (RMSE) values with and without WOA optimization.

Dataset   Horizon   MICEEMDAN-WOA-RVFL   MICEEMDAN-RVFL
WTI       1         0.2715               0.2824
WTI       3         0.5953               0.6079
WTI       6         0.8146               0.8485
USDEUR    1         0.0009               0.0010
USDEUR    3         0.0022               0.0022
USDEUR    6         0.0033               0.0034
IP        1         0.2114               0.2362
IP        3         0.3875               0.4033
IP        6         0.5340               0.5531
SSEC      1         10.8115              10.7616
SSEC      3         22.2983              22.7456
SSEC      6         35.7201              36.3150

Table 12: The directional statistic (Dstat) values with and without WOA optimization.

Dataset   Horizon   MICEEMDAN-WOA-RVFL   MICEEMDAN-RVFL
WTI       1         0.9381               0.9358
WTI       3         0.8576               0.8565
WTI       6         0.7737               0.7656
USDEUR    1         0.9410               0.9317
USDEUR    3         0.8502               0.8455
USDEUR    6         0.7828               0.7762
IP        1         0.9256               0.9174
IP        3         0.8802               0.8430
IP        6         0.8141               0.7893
SSEC      1         0.9156               0.9191
SSEC      3         0.8396               0.8351
SSEC      6         0.7587               0.7594
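The two measures reported in Tables 11 and 12 can be computed as follows; a minimal sketch, where the Dstat definition (a hit is counted when the forecast moves in the same direction as the actual series) is the common one and is assumed rather than taken verbatim from the paper:

```python
import numpy as np

def rmse(y_true, y_pred):
    # Root mean squared error (Table 11): penalizes large level errors.
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    return float(np.sqrt(np.mean((y_true - y_pred) ** 2)))

def dstat(y_true, y_pred):
    # Directional statistic (Table 12): fraction of steps where the
    # predicted move from the previous actual value has the same sign
    # as the actual move.
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    actual_move = np.diff(y_true)
    predicted_move = y_pred[1:] - y_true[:-1]
    return float(np.mean(actual_move * predicted_move > 0))
```

Lower RMSE and higher Dstat are better, which is the direction of improvement the WOA-optimized columns show on almost every dataset and horizon.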

[Figure 5, panels (a)-(f): MAPE for WTI, USDEUR, IP, and SSEC and RMSE for WTI and USDEUR at ensemble sizes of 10-100.]

is in the range of 10–40 in terms of MAPE, RMSE, and Dstat. When the ensemble size is greater than 40, the MAPE and RMSE values continue to worsen, becoming the worst when the ensemble size grows to 100. The experimental results indicate that the ensemble size has a significant overall impact on ensemble prediction and that an ideal ensemble size lies roughly between 20 and 40.
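The ensemble-size experiment can be read as follows: from M independent ICEEMDAN decomposition runs, keep the es runs with the lowest RMSE and average their forecasts. A minimal sketch; the use of a held-out validation set for the selection RMSE is an assumption, since the paper only states that results are selected by RMSE:

```python
import numpy as np

def ensemble_of_size(test_preds, valid_preds, y_valid, es):
    # test_preds: (M, T) test-set forecasts from M decomposition runs.
    # valid_preds: (M, V) the same runs' forecasts on a validation set.
    # Keep the es runs with the lowest validation RMSE and average
    # their test forecasts.
    errs = np.sqrt(np.mean((valid_preds - y_valid) ** 2, axis=1))
    keep = np.argsort(errs)[:es]
    return test_preds[keep].mean(axis=0)
```

Sweeping `es` over 10, 20, ..., 100 with this routine reproduces the shape of the experiment behind Figure 5.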

6 Conclusions

To better forecast economic and financial time series, we propose a novel multidecomposition and self-optimizing ensemble prediction model, MICEEMDAN-WOA-RVFL, combining multiple ICEEMDANs, WOA, and RVFL networks. The MICEEMDAN-WOA-RVFL first uses ICEEMDAN to separate the original economic and financial time series into groups of subseries multiple times, and RVFL networks are then used to individually forecast the decomposed subseries in each decomposition. Simultaneously, WOA is introduced to optimize the RVFL networks to further improve the prediction accuracy. Thirdly, the predictions of the subseries in each decomposition are aggregated by addition into the forecasting result of that decomposition. Finally, the prediction results of the decompositions are selected based on their RMSE values and combined as the final prediction results.
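The base learner in this pipeline, the RVFL network, trains quickly because only its output weights are learned. A minimal sketch, assuming tanh enhancement nodes and a closed-form ridge solution; the hyperparameter names are illustrative, not taken from the paper:

```python
import numpy as np

class RVFL:
    """Minimal random vector functional link network (illustrative sketch)."""

    def __init__(self, n_enhancement=20, reg=1e-6, seed=0):
        self.n_enh = n_enhancement      # number of random enhancement nodes
        self.reg = reg                  # ridge regularization strength
        self.rng = np.random.default_rng(seed)

    def _features(self, X):
        # Direct input links + random nonlinear enhancement nodes + bias.
        H = np.tanh(X @ self.W + self.b)
        return np.hstack([X, H, np.ones((X.shape[0], 1))])

    def fit(self, X, y):
        d = X.shape[1]
        # Input-to-enhancement weights are random and never trained.
        self.W = self.rng.normal(size=(d, self.n_enh))
        self.b = self.rng.normal(size=self.n_enh)
        D = self._features(X)
        # Only the output weights are learned, in closed form (ridge).
        self.beta = np.linalg.solve(D.T @ D + self.reg * np.eye(D.shape[1]),
                                    D.T @ y)
        return self

    def predict(self, X):
        return self._features(X) @ self.beta
```

In the decomposition-ensemble setting, one such network would be fitted on lagged values of each subseries produced by each ICEEMDAN run.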

As far as we know, this is the first time that WOA has been employed to search for the optimal parameters of RVFL networks and that the multiple-decomposition strategy has been introduced into time series forecasting. The empirical results indicate that (1) the proposed MICEEMDAN-WOA-RVFL significantly improves prediction accuracy on various economic and financial time series; (2) WOA can effectively search for optimal RVFL parameters and thereby improve prediction performance; and (3) the multiple-decomposition strategy successfully overcomes the randomness of a single decomposition and enhances the accuracy and stability of the developed prediction model.
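The WOA search credited above can be sketched with the three update rules of Mirjalili and Lewis (shrinking encirclement, random exploration, and the logarithmic spiral). In the paper's setting the fitness would wrap RVFL training and return a validation error; a toy quadratic stands in here:

```python
import numpy as np

def woa_minimize(fitness, dim, lo, hi, n_whales=20, n_iter=200, seed=1):
    # Whale optimization algorithm (Mirjalili and Lewis, 2016), minimal form.
    rng = np.random.default_rng(seed)
    X = rng.uniform(lo, hi, size=(n_whales, dim))
    fits = np.array([fitness(x) for x in X])
    best, best_fit = X[fits.argmin()].copy(), float(fits.min())
    for t in range(n_iter):
        a = 2.0 * (1 - t / n_iter)              # decreases linearly 2 -> 0
        for i in range(n_whales):
            A = 2 * a * rng.random() - a        # exploration/exploitation mix
            C = 2 * rng.random(dim)
            if rng.random() < 0.5:
                if abs(A) < 1:                  # encircle the best solution
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                           # search around a random whale
                    rand = X[rng.integers(n_whales)]
                    X[i] = rand - A * np.abs(C * rand - X[i])
            else:                               # spiral update toward the best
                l = rng.uniform(-1, 1)
                X[i] = (np.abs(best - X[i]) * np.exp(l)
                        * np.cos(2 * np.pi * l) + best)
            X[i] = np.clip(X[i], lo, hi)
            f = fitness(X[i])
            if f < best_fit:                    # keep the best ever seen
                best, best_fit = X[i].copy(), float(f)
    return best, best_fit
```

To tune an RVFL network, `fitness` would decode each position vector into hyperparameters (for example, the number of enhancement nodes and the regularization strength) and return the resulting validation RMSE.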

We will extend our study in two directions in the future: (1) applying MICEEMDAN-WOA-RVFL to forecast more economic and financial time series and (2) improving the selection and ensemble method for the individual forecasting models to further enhance prediction performance.

Data Availability

The data used to support the findings of this study are included within the article.

Conflicts of Interest

The authors declare that they have no conflicts of interest.

Acknowledgments

This work was supported by the Fundamental Research Funds for the Central Universities (Grant no. JBK2003001), the Ministry of Education of Humanities and Social Science Project (Grant nos. 19YJAZH047 and 16XJAZH002), and the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

[Figure 5, panels (g)-(l): RMSE for IP and SSEC and Dstat for WTI, USDEUR, IP, and SSEC at ensemble sizes of 10-100.]
Figure 5: The impact of ensemble size with one-step-ahead forecasting.

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365–374, 2017.
[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.
[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831–848, 2005.
[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192–196, IEEE, Warsaw, Poland, September 2005.
[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?" Journal of International Economics, vol. 60, no. 1, pp. 85–107, 2003.
[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7–18, 2014.
[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13–29, 2014.
[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331–350, 2010.
[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201–1208, 2008.
[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111–117, 2010.
[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1–9, 2013.
[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97–111, 2018.
[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28–39, 2014.
[14] X. Li, H. Xie, R. Wang et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67–78, 2016.
[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22–42, 2018.
[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457–480, 2018.
[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27–38, 2014.
[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN, and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284–308, 2014.
[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.
[20] S. Lahmiri, "A variational mode decompoisition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268–273, 2016.
[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228–1244, 2013.
[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603–3628, 2019.
[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235–6250, 2014.
[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164–178, 2016.
[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754–762, 2016.
[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1–13, 2007.
[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903–995, 1998.
[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1–41, 2009.
[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144–4147, IEEE, Prague, Czech Republic, May 2011.
[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140–165, 2020.
[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882–1905, 2018.
[32] T. Li, M. Zhou, C. Guo et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.
[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19–29, 2014.
[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852–1872, 2020.
[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.
[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205–1225, 2019.
[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1–10, 2020.
[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.
[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1–15, 2018.
[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm-Multi-Objective Whale Optimization Algorithm for wind speed forecasting," Applied Energy, vol. 208, pp. 344–360, 2017.
[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250–260, 2019.
[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163–180, 1994.
[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094–1105, 2016.
[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078–1093, 2016.
[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097–1108, 2018.
[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319–340, 2019.
[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5–24, 2020.
[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.
[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983–994, 2020.
[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49–58, 2019.
[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.
[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165–4172, 2019.
[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319–7327, 2020.
[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525–541, 2018.
[55] FRED website, https://fred.stlouisfed.org.
[56] NetEase website, http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.
[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31–53, 2006.


[55] Fred Website httpsfredstlouisfedorg[56] NetEase Website httpquotesmoney163comservice

chddatahtmlcode=0000001ampstart=19901219[57] M Aiolfi and A Timmermann ldquoPersistence in forecasting

performance and conditional combination strategiesrdquo Journalof Econometrics vol 135 no 1-2 pp 31ndash53 2006

Complexity 17

Page 16: AHybridApproachIntegratingMultipleICEEMDANs,WOA,and ... · 2020. 10. 22. · Aff477”53Q I jAA Aff477”53Q m Z–3 Z–3 A A m ff”G I n j I 3 +A m Z–3jYG

the Scientific Research Fund of Sichuan Provincial Education Department (Grant no. 17ZB0433).

References

[1] Y. Hui, W.-K. Wong, Z. Bai, and Z.-Z. Zhu, "A new nonlinearity test to circumvent the limitation of Volterra expansion with application," Journal of the Korean Statistical Society, vol. 46, no. 3, pp. 365–374, 2017.

[2] R. Adhikari and R. K. Agrawal, "A combination of artificial neural network and random walk models for financial time series forecasting," Neural Computing and Applications, vol. 24, no. 6, pp. 1441–1449, 2014.

[3] A. Lanza, M. Manera, and M. Giovannini, "Modeling and forecasting cointegrated relationships among heavy oil and product prices," Energy Economics, vol. 27, no. 6, pp. 831–848, 2005.

[4] M. R. Hassan and B. Nath, "Stock market forecasting using hidden Markov model: a new approach," in Proceedings of the 5th International Conference on Intelligent Systems Design and Applications (ISDA'05), pp. 192–196, IEEE, Warsaw, Poland, September 2005.

[5] L. Kilian and M. P. Taylor, "Why is it so difficult to beat the random walk forecast of exchange rates?" Journal of International Economics, vol. 60, no. 1, pp. 85–107, 2003.

[6] M. Rout, B. Majhi, R. Majhi, and G. Panda, "Forecasting of currency exchange rates using an adaptive ARMA model with differential evolution based training," Journal of King Saud University - Computer and Information Sciences, vol. 26, no. 1, pp. 7–18, 2014.

[7] P. Mondal, L. Shit, and S. Goswami, "Study of effectiveness of time series modeling (ARIMA) in forecasting stock prices," International Journal of Computer Science, Engineering and Applications, vol. 4, no. 2, pp. 13–29, 2014.

[8] A. A. Drakos, G. P. Kouretas, and L. P. Zarangas, "Forecasting financial volatility of the Athens stock exchange daily returns: an application of the asymmetric normal mixture GARCH model," International Journal of Finance & Economics, vol. 15, pp. 331–350, 2010.

[9] D. Alberg, H. Shalit, and R. Yosef, "Estimating stock market volatility using asymmetric GARCH models," Applied Financial Economics, vol. 18, no. 15, pp. 1201–1208, 2008.

[10] R. P. Pradhan and R. Kumar, "Forecasting exchange rate in India: an application of artificial neural network model," Journal of Mathematics Research, vol. 2, pp. 111–117, 2010.

[11] S. T. A. Niaki and S. Hoseinzade, "Forecasting S&P 500 index using artificial neural networks and design of experiments," Journal of Industrial Engineering International, vol. 9, pp. 1–9, 2013.

[12] S. P. Das and S. Padhy, "A novel hybrid model using teaching-learning-based optimization and a support vector machine for commodity futures index forecasting," International Journal of Machine Learning and Cybernetics, vol. 9, no. 1, pp. 97–111, 2018.

[13] M. K. Okasha, "Using support vector machines in financial time series forecasting," International Journal of Statistics and Applications, vol. 4, pp. 28–39, 2014.

[14] X. Li, H. Xie, R. Wang et al., "Empirical analysis: stock market prediction via extreme learning machine," Neural Computing and Applications, vol. 27, no. 1, pp. 67–78, 2016.

[15] T. Moudiki, F. Planchet, and A. Cousin, "Multiple time series forecasting using quasi-randomized functional link neural networks," Risks, vol. 6, no. 1, pp. 22–42, 2018.

[16] Y. Baek and H. Y. Kim, "ModAugNet: a new forecasting framework for stock market index value with an overfitting prevention LSTM module and a prediction LSTM module," Expert Systems with Applications, vol. 113, pp. 457–480, 2018.

[17] C. N. Babu and B. E. Reddy, "A moving-average filter based hybrid ARIMA-ANN model for forecasting time series data," Applied Soft Computing, vol. 23, pp. 27–38, 2014.

[18] M. Kumar and M. Thenmozhi, "Forecasting stock index returns using ARIMA-SVM, ARIMA-ANN, and ARIMA-random forest hybrid models," International Journal of Banking, Accounting and Finance, vol. 5, no. 3, pp. 284–308, 2014.

[19] C.-M. Hsu, "A hybrid procedure with feature selection for resolving stock/futures price forecasting problems," Neural Computing and Applications, vol. 22, no. 3-4, pp. 651–671, 2013.

[20] S. Lahmiri, "A variational mode decomposition approach for analysis and forecasting of economic and financial time series," Expert Systems with Applications, vol. 55, pp. 268–273, 2016.

[21] L.-J. Kao, C.-C. Chiu, C.-J. Lu, and C.-H. Chang, "A hybrid approach by integrating wavelet-based feature extraction with MARS and SVR for stock index forecasting," Decision Support Systems, vol. 54, no. 3, pp. 1228–1244, 2013.

[22] T. Li, Y. Zhou, X. Li, J. Wu, and T. He, "Forecasting daily crude oil prices using improved CEEMDAN and ridge regression-based predictors," Energies, vol. 12, no. 19, pp. 3603–3628, 2019.

[23] A. Bagheri, H. Mohammadi Peyhani, and M. Akbari, "Financial forecasting using ANFIS networks with quantum-behaved particle swarm optimization," Expert Systems with Applications, vol. 41, no. 14, pp. 6235–6250, 2014.

[24] J. Wang, R. Hou, C. Wang, and L. Shen, "Improved v-Support vector regression model based on variable selection and brain storm optimization for stock price forecasting," Applied Soft Computing, vol. 49, pp. 164–178, 2016.

[25] G. Claeskens, J. R. Magnus, A. L. Vasnev, and W. Wang, "The forecast combination puzzle: a simple theoretical explanation," International Journal of Forecasting, vol. 32, no. 3, pp. 754–762, 2016.

[26] S. G. Hall and J. Mitchell, "Combining density forecasts," International Journal of Forecasting, vol. 23, no. 1, pp. 1–13, 2007.

[27] N. E. Huang, Z. Shen, S. R. Long et al., "The empirical mode decomposition and the Hilbert spectrum for nonlinear and non-stationary time series analysis," Proceedings of the Royal Society of London. Series A: Mathematical, Physical and Engineering Sciences, vol. 454, no. 1971, pp. 903–995, 1998.

[28] Z. Wu and N. E. Huang, "Ensemble empirical mode decomposition: a noise-assisted data analysis method," Advances in Adaptive Data Analysis, vol. 1, no. 1, pp. 1–41, 2009.

[29] M. E. Torres, M. A. Colominas, G. Schlotthauer, and P. Flandrin, "A complete ensemble empirical mode decomposition with adaptive noise," in Proceedings of the 2011 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), pp. 4144–4147, IEEE, Prague, Czech Republic, May 2011.

16 Complexity

[30] J. Wu, T. Zhou, and T. Li, "Detecting epileptic seizures in EEG signals with complementary ensemble empirical mode decomposition and extreme gradient boosting," Entropy, vol. 22, no. 2, pp. 140–165, 2020.

[31] T. Li, Z. Hu, Y. Jia, J. Wu, and Y. Zhou, "Forecasting crude oil prices using ensemble empirical mode decomposition and sparse Bayesian learning," Energies, vol. 11, no. 7, pp. 1882–1905, 2018.

[32] T. Li, M. Zhou, C. Guo et al., "Forecasting crude oil price using EEMD and RVM with adaptive PSO-based kernels," Energies, vol. 9, no. 12, p. 1014, 2016.

[33] M. A. Colominas, G. Schlotthauer, and M. E. Torres, "Improved complete ensemble EMD: a suitable tool for biomedical signal processing," Biomedical Signal Processing and Control, vol. 14, pp. 19–29, 2014.

[34] J. Wu, F. Miu, and T. Li, "Daily crude oil price forecasting based on improved CEEMDAN, SCA, and RVFL: a case study in WTI oil market," Energies, vol. 13, no. 7, pp. 1852–1872, 2020.

[35] T. Li, Z. Qian, and T. He, "Short-term load forecasting with improved CEEMDAN and GWO-based multiple kernel ELM," Complexity, vol. 2020, Article ID 1209547, 20 pages, 2020.

[36] W. Yang, J. Wang, T. Niu, and P. Du, "A hybrid forecasting system based on a dual decomposition strategy and multi-objective optimization for electricity price forecasting," Applied Energy, vol. 235, pp. 1205–1225, 2019.

[37] W. Deng, J. Xu, Y. Song, and H. Zhao, "An effective improved co-evolution ant colony optimization algorithm with multi-strategies and its application," International Journal of Bio-Inspired Computation, vol. 16, pp. 1–10, 2020.

[38] S. Mirjalili and A. Lewis, "The whale optimization algorithm," Advances in Engineering Software, vol. 95, pp. 51–67, 2016.

[39] I. Aljarah, H. Faris, and S. Mirjalili, "Optimizing connection weights in neural networks using the whale optimization algorithm," Soft Computing, vol. 22, no. 1, pp. 1–15, 2018.

[40] J. Wang, P. Du, T. Niu, and W. Yang, "A novel hybrid system based on a new proposed algorithm, Multi-Objective Whale Optimization Algorithm, for wind speed forecasting," Applied Energy, vol. 208, pp. 344–360, 2017.

[41] Z. Alameer, M. A. Elaziz, A. A. Ewees, H. Ye, and Z. Jianhua, "Forecasting gold price fluctuations using improved multilayer perceptron neural network and whale optimization algorithm," Resources Policy, vol. 61, pp. 250–260, 2019.

[42] Y.-H. Pao, G.-H. Park, and D. J. Sobajic, "Learning and generalization characteristics of the random vector functional-link net," Neurocomputing, vol. 6, no. 2, pp. 163–180, 1994.

[43] L. Zhang and P. N. Suganthan, "A comprehensive evaluation of random vector functional link networks," Information Sciences, vol. 367-368, pp. 1094–1105, 2016.

[44] Y. Ren, P. N. Suganthan, N. Srikanth, and G. Amaratunga, "Random vector functional link network for short-term electricity load demand forecasting," Information Sciences, vol. 367-368, pp. 1078–1093, 2016.

[45] L. Tang, Y. Wu, and L. Yu, "A non-iterative decomposition-ensemble learning paradigm using RVFL network for crude oil price forecasting," Applied Soft Computing, vol. 70, pp. 1097–1108, 2018.

[46] T. Li, J. Shi, X. Li, J. Wu, and F. Pan, "Image encryption based on pixel-level diffusion with dynamic filtering and DNA-level permutation with 3D Latin cubes," Entropy, vol. 21, no. 3, pp. 319–340, 2019.

[47] J. Wu, J. Shi, and T. Li, "A novel image encryption approach based on a hyperchaotic system, pixel-level filtering with variable kernels, and DNA-level diffusion," Entropy, vol. 22, pp. 5–24, 2020.

[48] T. Li, M. Yang, J. Wu, and X. Jing, "A novel image encryption algorithm based on a fractional-order hyperchaotic system and DNA computing," Complexity, vol. 2017, Article ID 9010251, 13 pages, 2017.

[49] H. Zhao, J. Zheng, W. Deng, and Y. Song, "Semi-supervised broad learning system based on manifold regularization and broad network," IEEE Transactions on Circuits and Systems I: Regular Papers, vol. 67, no. 3, pp. 983–994, 2020.

[50] S. Sun, S. Wang, and Y. Wei, "A new multiscale decomposition ensemble approach for forecasting exchange rates," Economic Modelling, vol. 81, pp. 49–58, 2019.

[51] T. Li and M. Zhou, "ECG classification using wavelet packet entropy and random forests," Entropy, vol. 18, no. 8, p. 285, 2016.

[52] H. Zhao, H. Liu, J. Xu, and W. Deng, "Performance prediction using high-order differential mathematical morphology gradient spectrum entropy and extreme learning machine," IEEE Transactions on Instrumentation and Measurement, vol. 69, pp. 4165–4172, 2019.

[53] W. Deng, H. Liu, J. Xu, H. Zhao, and Y. Song, "An improved quantum-inspired differential evolution algorithm for deep belief network," IEEE Transactions on Instrumentation and Measurement, vol. 69, no. 10, pp. 7319–7327, 2020.

[54] H. Liu, Z. Duan, F.-Z. Han, and Y.-F. Li, "Big multi-step wind speed forecasting model based on secondary decomposition, ensemble method and error correction algorithm," Energy Conversion and Management, vol. 156, pp. 525–541, 2018.

[55] FRED Website, https://fred.stlouisfed.org/.

[56] NetEase Website, http://quotes.money.163.com/service/chddata.html?code=0000001&start=19901219.

[57] M. Aiolfi and A. Timmermann, "Persistence in forecasting performance and conditional combination strategies," Journal of Econometrics, vol. 135, no. 1-2, pp. 31–53, 2006.

