
Page 1: [IEEE 2010 International Conference on Computer Information Systems and Industrial Management Applications (CISIM) - Krackow, Poland (2010.10.8-2010.10.10)] 2010 International Conference

Predicting Trend in the Next-Day Market by Hierarchical Hidden Markov Model

Luigi Troiano
Department of Engineering
University of Sannio, Benevento, Italy
[email protected]

Pravesh Kriplani
Computational Intelligent System Engineering Lab (CISELab)
University of Sannio, Benevento, Italy
[email protected]

Abstract

The prediction of stock market movements represents a key component in developing winning trading strategies. Forecasting can have different time horizons. In this paper we focus our attention on the very short term, and we develop a model able to predict market trends with a horizon of a few days ahead. Based on a Hierarchical Hidden Markov Model, our approach has been tested against the Indian stock market index (Nifty), and experimental results are presented and discussed.

1. Introduction

Designing a trading strategy able to beat the market means, to some extent, predicting future market movements. Trading strategies are characterized by the time horizon they consider. In general, the shorter the horizon, the more profitable and risky the trades.

Whatever the time horizon, trades have in common uncertainty related to future market movements. Although this characteristic is highly undesirable for the investor, it is also unavoidable whenever the stock market is chosen as an investment tool. In addition, automated trading requires instruments able to manage and reduce uncertainty on a quantitative basis. Therefore, prediction (forecasting) of future market movements is one step in the process of designing a reliable automated trading strategy.

Among the different styles, day trading offers extremely profitable (or extremely unprofitable) investment opportunities. In day trading, positions are rarely (or never) held when the market is closed or overnight. The Efficient Market Hypothesis (EMH) [1] assumes that all new information is instantly absorbed by the market, and that prices change accordingly. Therefore, there is no room for prediction, and prices will change only when new information reaches the market. However, since day trading deals with high-frequency data, the absorption of new information is no longer instantaneous. This provides new opportunities to capture signals able to drive an automated day trading strategy.

In this context, forecasting the market becomes an interesting and challenging task. A valuable piece of information in deploying day trading strategies is the trend forecast at the market opening, especially for trend and counter-trend followers. Several techniques have been investigated for predicting market movements, and trends in particular. Among them, probabilistic reasoning offers a theoretically robust and reliable framework in which to develop machine learning techniques able to learn, from historical data, patterns that predict market trends in the short term.

In this paper we propose a three-level hierarchical hidden Markov model (HHMM) which describes the market trend in terms of states whose transition/permanence probabilities depend on past states and on the underlying price dynamics, within a given time window. The remainder of this paper is organized as follows: in Section 2, we briefly provide a general overview of trend forecasting techniques investigated and in use; Section 3 offers preliminaries on HHMMs for readers new to this technique; Section 4 is devoted to describing our approach in detail, and Section 5 to the discussion of experimental results; Section 6 outlines conclusions and future work.

2. Trend Prediction

In the literature, a number of different methods have been applied to predict stock market movements and trends. Providing an extensive overview of the plethora of available studies, methods and techniques would go beyond the scope of this paper. We limit ourselves to a broad overview, able to outline different approaches, their pros and cons. In general, methods can be classified in four major categories: (i) Technical Analysis, (ii) Fundamental Analysis, (iii) Traditional Time Series Forecasting and (iv) Machine Learning Methods.

Technical Analysis is probably the most common approach to trend forecasting. A large literature is available (e.g. see reference [2]). Technical analysis makes use of composite functions, such as indicators and oscillators, derived from time series, and of heuristic rules able to reveal signals of change in market trends. Popular examples are the Moving Average Convergence Divergence (MACD), the Relative Strength Index (RSI), and the Stochastic oscillator. This approach relies on the belief that markets are

199 | 978-1-4244-7818-7/10/$26.00 ©2010 IEEE


mostly driven by psychology more than by economics. Therefore, trading opportunities can be discovered by carefully analyzing the behavior of other investors, which is reflected in price movements. Indeed, detected trends are assumed to be based on supply and demand issues, which often have cyclical or noticeable patterns.

Although this approach is very popular among practitioners, it has received several criticisms, especially from academia. The major sources of criticism are that the rules used to identify trend signals often rely on visual patterns on charts, that indicators and oscillators depend on a large number of parameters, and that no theoretical framework explains why the approach works or how to choose and tune the different tools; together, these make this class of techniques largely subjective. In addition, being mostly based on human judgment, technical analysis is not appropriate for algorithmic trading. However, recent studies provide support to Technical Analysis as useful for predicting market trends [3].

Fundamental Analysis assumes that market trends are driven by the economic context and by the financial figures of the companies traded. It aims at estimating the intrinsic value of a stock, so that if the current value is lower than the intrinsic value, additional investments are expected; otherwise, disinvestment will occur. Although this approach relies on economic fundamentals [4] and can lead to profitable trading strategies [5], it is more appropriate for long-term strategies than for near-term strategies, such as those employed in day trading.

Both Technical and Fundamental Analysis do not perform any quantitative analysis of time series. Traditional Time Series Forecasting relies on linear models able to translate the body of knowledge of stochastic signal processing into a quantitative approach to Finance [6]. For a dated, but still valid, overview of statistical forecasting see reference [7].

Trend forecasting can be regarded as a problem of pattern matching or approximation, so that methods studied in the areas of Machine Learning, Soft Computing and Computational Intelligence have been experimented with for this task. These methods use a set of samples to generate an approximation of the underlying function and of the relationships between data. They share the aim of drawing predictions when unseen data are presented to a model. There is a rich literature related to forecasting the market on a daily basis. Among the different models, Artificial Neural Networks are probably the most prominent example. For instance, Saad, Prokhorov and Wunsch [8] compare three neural network architectures, namely time delay (TDNN), recurrent (RNN) and probabilistic (PNN) neural networks, for stock trend prediction. They argue that short-term trends are particularly attractive for neural network analysis and can be exploited profitably. However, false trading signals can lead to wrong decisions and losses. They advocate that neural networks are able to filter out false trading signals, if properly trained.

Other forecasting models are based on Evolutionary Computing, such as Genetic Algorithms (e.g. [9]) and Genetic Programming (e.g. [10]). However, if we assume that the stock market follows a random walk, methods based on Probabilistic Reasoning are probably the most appropriate to deal with this problem. Indeed, the forecasting problem entails finding the most probable value of the trend in the future. Hidden Markov models represent an initial solution, due to their ability in processing, classifying and forecasting sequences of data. This approach has been investigated by some researchers (e.g. [?]). The common idea is that the trend is a hidden random state, whose realization depends on the observable series of (e.g. close) prices. The main limitations of this approach are that HMM states are discrete, and that a direct (linear) link between prices and trend is established. Dynamic Bayesian networks (DBN) provide a generalization of HMMs, able to better model the complexity of the relationships between trend and prices. Hierarchical hidden Markov models (HHMM) have also been investigated for trend prediction. For instance, Jangmin et al. [11] propose a three-level hierarchical hidden Markov model. Similarly to our approach, they consider a link to prices made by a Gaussian mixture. To train the HHMM, they manually label the trend states of the first layer, and they provide mixture parameters by clustering training sequences. In our case, given the sequence of trends and price variations, the model is left to find the parameters that best fit the observed data by Expectation Maximization.

3. Hierarchical Hidden Markov Models

The hierarchical hidden Markov model (HHMM) [12] is an extension of the hidden Markov model in which states are organized in a hierarchy. In other terms, HHMMs are structured multi-level stochastic processes. They generalize ordinary HMMs by making states probabilistic models in their own right. Therefore, HHMMs are recursively defined, so that each state at level l relies on an HHMM at level l+1. When a state in an HHMM is activated, its own probabilistic model also becomes active, and one of the states of the underlying HHMM is activated recursively. This process is repeated until a production state, i.e. a state that emits a single observation symbol, is activated. States that do not directly emit observation symbols are called internal states. Production states do not hold a sub-model, and they are not able to transit to other states. So, when a production state is reached, a symbol of the sequence is produced, and control goes back to the calling state, which in turn gives control to another state at the same level. When a terminal state is reached, control is moved to the upper level.
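The activation process described above can be sketched as a small recursive sampler. The nested-dict representation and the state names below are our own illustration, not the paper's notation:

```python
import random

def sample(state, model, out):
    """Recursively activate `state`, appending emitted symbols to `out`.

    `model` maps each internal state to its sub-model: an activation
    distribution "pi" over sub-states, and per-state transition rows "A".
    A state absent from `model` is a production state and emits itself;
    reaching the terminal state "end" returns control to the upper level.
    """
    if state not in model:                     # production state: emit a symbol
        out.append(state)
        return
    sub = model[state]
    # vertical transition: activate one sub-state according to "pi"
    s = random.choices(list(sub["pi"]), weights=list(sub["pi"].values()))[0]
    while s != "end":
        sample(s, model, out)                  # recurse into lower levels
        row = sub["A"][s]                      # horizontal transition at this level
        s = random.choices(list(row), weights=list(row.values()))[0]
```

For instance, a one-level toy model whose root always activates a production state "x" and then moves to "y" emits the sequence ["x", "y"].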

An example of HHMM structure is given in Fig.1. Statesare represented by circles, and transitions between states by



Figure 1. Hierarchical Hidden Markov Model

arrows. In particular, black solid lines show intra-level transitions. For convenience, we only depict transitions with non-zero probability, but it is possible to transit from any state to any other within a level. Transitions from an upper to a lower level are denoted by fine-dashed lines. Transitions back to upper levels are denoted by dashed lines. Bold circles are the production states, the only ones allowed to produce a symbol σk ∈ Σ, where Σ is the alphabet of emitted symbols observed in sequences. Gray circles are terminal. The other circles are internal.

Formally, an HHMM is defined as a pair ⟨G, λ⟩, where G is a D-layer directed graph and λ ≡ {λ^d}, d = 1..D, is a collection of parameters specific to each layer d. The graph G is made of internal, production and terminal states. An internal state at level d is denoted by q^d, while q^d_E is the terminal state and o^d_k is a production state emitting the symbol σ_k ∈ Σ. We collect the internal states and the terminal state at level d in Q^d ≡ {q^d_i, q^d_E}. An internal state is not required to have the same number of sub-states, although any HHMM can be transformed into a model with an equal number of sub-states for each internal state. For each internal state q^d there is a transition probability matrix A^{q^d} = {a^{q^d}_ij}, where a^{q^d}_ij = Pr(q^{d+1}_j | q^{d+1}_i) is the probability of moving from the i-th to the j-th sub-state of q^d. Similarly, π^{q^d}_i = Pr(q^{d+1}_i | q^d) is the initial probability assigned to the sub-states by q^d. It can be regarded as the probability of performing a vertical transition, that is, the probability by which the state q^d activates the sub-state q^{d+1}_i. We denote the activation probability distribution by Π^{q^d} = {π^{q^d}_i}. Finally, B^{q^d} = {b^{q^d}(o_k)}, with b^{q^d}(o_k) = Pr(o^{d+1}_k | q^d), is the probability that the internal state q^d activates the production state o^{d+1}_k, which in turn emits the symbol σ_k. Therefore b^{q^d}(o_k) is also the probability that the symbol σ_k ∈ Σ is produced when the state q^d is activated. In summary, λ^d = {A^{q^d}, Π^{q^d}, B^{q^d}} for q^d ∈ Q^d. States and observations can be either discrete or continuous.

Similarly to ordinary hidden Markov models, HHMMs are useful in applications dealing with sequences of symbols, such as signal identification and classification, behavior recognition, handwritten character recognition, text analysis, and other pattern recognition problems. Indeed, it can be proven that each HHMM can be transformed into an equivalent HMM [12]. However, the layered structure of HHMMs can be exploited in order to adopt more efficient inference algorithms and more robust learning algorithms.

More specifically, HHMMs can be employed to (i) calculate the likelihood of a sequence, that is the probability Pr(O|λ) of a sequence O being generated by the model λ; (ii) find the most probable state sequence S* = argmax_S Pr(O|λ, S), given a sequence O and the model λ; (iii) find the most probable states and observations given a partial subsequence of symbols. Our problem belongs to the latter class.
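Task (i), on the flattened (ordinary-HMM) form of the model, reduces to the standard forward algorithm. A minimal sketch for the discrete case (parameter shapes are our assumption):

```python
import numpy as np

def forward_likelihood(pi, A, B, obs):
    """Forward algorithm: Pr(O | lambda) for a discrete HMM.

    pi  -- initial state distribution, shape (N,)
    A   -- state transition matrix, shape (N, N)
    B   -- emission probability matrix, shape (N, M)
    obs -- observed symbol indices, length T
    """
    alpha = pi * B[:, obs[0]]          # alpha_1(i) = pi_i * b_i(o_1)
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]  # induction over time steps
    return float(alpha.sum())          # Pr(O | lambda) = sum_i alpha_T(i)
```

In practice the recursion is rescaled at each step to avoid numerical underflow on long sequences.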

As with other graphical models, an HHMM can be trained using a set of past observations. In other terms, given a structure G and a set of sequences {O_t}, the model parameters λ can be found as the most probable λ* = argmax_λ Pr({O_t}|λ). Expectation Maximization is notably able to solve this problem effectively.

Since HHMMs can be reduced to equivalent HMMs, they can also be regarded as special cases of dynamic Bayesian networks. For example, the network in Fig.2 is equivalent to the model depicted in Fig.1. In particular, we report the canonical 2-slice temporal Bayesian network (2TBN), which provides all the information regarding intra- and inter-slice dependencies required to extend the structure to a time window of any size.

Figure 2. Dynamic Bayesian Network (2TBN)

Each sub-model in the HHMM can be regarded as a random variable, whose states are those considered by the model. Variables can be discrete or continuous, according to the nature of the states they represent. We notice that, although there is no transition in the HHMM between states belonging to non-contiguous levels, since sub-model parameters depend on the activation state, a link between DBN variables is required in order to model such a dependency.

4. Approach

Our approach is based on an HHMM structured as illustrated in Fig.3. It is a 3-layer model. It can be used for forecasting trends of either a market index or traded securities. Squares denote discrete variables, circles continuous ones. Observable variables are denoted by bold borders.

Figure 3. Trend HHMM (DBN) model

The first layer deals with the states of the trend. We divide the trend into three classes, namely "up", "stable", and "down". Our choice is due to the need of keeping the model output stable and simple, in order to use it to drive trading strategies. This layer is made observable. The trend is assigned according to a precise quantitative criterion. We chose the following:

1) Compute a (simple) moving average of the index, or price, with a given lag.

2) Calculate the mean and standard deviation of the moving average at the times in the lag.

3) If the moving average at the current time is more than a given factor of standard deviations above (below) the mean, then the trend is "up" ("down"). Otherwise it is "stable".

In other terms, we compute the mean deviation rate as

   md(t) = (ma(t) − mean(ma)) / sd(ma)    (1)

where ma(t) is the moving average at time t, and mean(ma) and sd(ma) are respectively the moving average's mean and standard deviation within the given lag, e.g. 20 days.
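As a sketch, the labeling criterion and Eq. (1) can be coded as follows. The choice of computing the mean and standard deviation over the last `lag` values of the moving average is our reading of the text; edge handling is not specified in the paper:

```python
import numpy as np

def label_trend(close, lag=20, factor=1.0):
    """Label each day 'up' / 'stable' / 'down' per the trend criterion.

    `close` is a 1-D array of closing prices. The window used for the
    mean and sd of the moving average is taken equal to the lag, e.g.
    20 days, as suggested in the text (an assumption on our part).
    """
    close = np.asarray(close, dtype=float)
    # step 1: simple moving average with the given lag
    ma = np.convolve(close, np.ones(lag) / lag, mode="valid")
    labels = []
    for t in range(lag - 1, len(ma)):
        window = ma[t - lag + 1 : t + 1]             # MA values within the lag
        md = (ma[t] - window.mean()) / window.std()  # Eq. (1): mean deviation rate
        if md > factor:
            labels.append("up")
        elif md < -factor:
            labels.append("down")
        else:
            labels.append("stable")
    return labels
```

On a steadily rising (falling) price series this labels the latest day "up" ("down"), as expected.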

The second layer provides a Gaussian mixture of the values observed at the third layer. This layer serves to decouple the trend layer from the variation layer. Although the sequence of index (price) variations depends on the trend, this layer makes it possible to better adapt negative variations to positive trends, and positive variations to negative trends. In our approach we assumed a four-class mixture, but different choices are possible.
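A hedged sketch of the second-layer emission: given one trend state, an index variation is drawn from a four-component Gaussian mixture. The parameter values below are invented for illustration; in the model they are fitted by Expectation Maximization, per trend state:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_variation(weights, means, sds):
    """Draw one index (price) variation from a Gaussian mixture.

    weights, means, sds: per-component parameters (4 components, as in
    the paper); these would be learned per trend state, not hand-set.
    """
    k = rng.choice(len(weights), p=weights)     # pick a mixture component
    return float(rng.normal(means[k], sds[k]))  # emit a continuous variation
```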

The third layer refers to index (price) variations. These are continuous: they are the realization of the trend states passing through the Gaussian mixture. Variations are computed as

   V(t) = (close(t) − close(t−1)) / close(t−1)    (2)
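Eq. (2) is simply the one-day relative return; in code, with hypothetical closing prices:

```python
import numpy as np

# hypothetical closing prices; the experiments use actual Nifty data
close = np.array([5000.0, 5050.0, 4999.5])
# Eq. (2): V(t) = (close(t) - close(t-1)) / close(t-1)
V = np.diff(close) / close[:-1]
```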

The model can have different memory, that is, the number of slices replicated in time. Looking at a sequence of the latest k observations regarding index (price) variations, and looking ahead to h predictions, we need k + h slices, as depicted in Fig.4. Rhombi denote observations made available to the model, i.e. evidence. We are interested in predicting the most probable trend symbol for the next day or the following days. This can drive decisions regarding trades in the short term. In addition, a forecast of the index (price) variation can help to set up orders placed overnight.
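On the flattened-HMM view, predicting the trend h slices ahead amounts to forward filtering over the k observed slices and propagating the belief forward. A sketch for the discrete case (shapes and names are our assumption):

```python
import numpy as np

def predict_trend(pi, A, B, obs, h=1):
    """Most probable hidden (trend) state h steps ahead, by forward filtering.

    pi, A, B as in an ordinary HMM over the flattened model; `obs` holds
    the k most recent observed symbol indices (the model's "memory").
    """
    alpha = pi * B[:, obs[0]]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
    belief = alpha / alpha.sum()                  # filtered state at time k
    pred = belief @ np.linalg.matrix_power(A, h)  # look h slices ahead
    return int(np.argmax(pred))
```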

Figure 4. Model in use: predicting trend and variation at the end of the day.

At market opening, new information becomes available. The variation between the day's open and the previous day's close can be used to refine the trend prediction for the day just started. This makes it possible to correct and fine-tune the trading strategy. In this case we consider a model such as that outlined in Fig.5.

5. Experimental Results

As experimental setting, we consider the S&P CNX Nifty index within the period January 1st 2008 - June 30th 2010. The S&P CNX Nifty is the headline index on the National



Figure 5. Model in use: predicting trend and variation at the beginning of the day.

Stock Exchange of India Ltd. (NSE). The index tracks the behavior of a portfolio of blue-chip companies, the largest and most liquid Indian securities. It includes 50 of the approximately 1781 companies listed on the NSE, covers 22 sectors of the Indian economy, captures approximately 60% of its equity market capitalization, and is a true reflection of the Indian stock market.

[Line plot: the Nifty index and its 20-day moving average (MA20), from JAN 1st 2008 to JUN 30th 2010; the TRAINING segment runs to DEC 31st 2009 and the TESTING segment from JAN 4th 2010.]

Figure 6. Nifty and MA20

The period of interest (612 data points) is split in two segments. The first (489 data points, 79.9%), from January 1st 2008 to December 31st 2009, provides the data for learning the model parameters, whilst the second (123 data points, 20.1%), from January 4th 2010 to June 30th 2010, serves as a testing benchmark for the model. The time series and the related 20-day moving average are outlined in Fig.6.

If we assume ±1.0 as the mean deviation rate threshold for assigning the trend class, in the training period we have 193 (39.47%) "down", 113 (23.11%) "stable" and 183 (37.42%) "up". We used this distribution as the prior over initial states. In the testing period, instead, we have 46 (37.40%) "down", 21 (17.07%) "stable" and 56 (45.53%) "up". This is consistent with the net profit being -943.30 in the training period and 80.30 in the testing period.

As a first test, we aim at assessing the trend forecast accuracy in the testing period. Results are reported in Tab.1.

Table 1. Prediction accuracy

          Down     Stable   Up       Overall
Cases     46       21       56       123
Correct   44       17       54       115
Rate      95.65%   80.95%   96.43%   93.50%

The confusion matrix in Tab.2 highlights that the model never inverted the signals "up" and "down", which would have caused a wrong trading decision.

Table 2. Confusion matrix

                     Actual
Predicted   Down   Stable   Up
Down        44     2        0
Stable      2      17       2
Up          0      2        54

Obviously, this result tends to weaken as we extend the time horizon we look ahead, as depicted in Tab.3. Prediction is also affected by the model memory.

Table 3. Accuracy at different look ahead

Look Ahead   1        2        3        4        5
Cases        123      122      121      120      119
Correct      115      106      97       88       81
Rate         93.50%   86.89%   80.17%   73.33%   68.06%

Another factor influencing the behavior of our model is the specification of the moving average used to classify the trend. The first aspect to consider is the lag: shorter moving averages are more reactive to price changes, longer ones less so. Accuracy at different moving average lags is reported in Tab.4.

Table 4. Accuracy with different moving averages

MA Lag   Down   Stable   Up   Overall   Accuracy
10       42     11       59   112       91.06%
15       41     16       56   113       91.87%
20       44     17       54   115       93.50%
30       41     33       42   116       94.31%
40       37     32       47   116       94.31%

Extending the lag entails a smoother moving average; therefore the number of "stable" signals increases (see Tab.5), as it becomes more difficult to exceed the limit imposed on the standard deviation for accessing the trend classes "up" and "down". This means that if on one side we have a more stable sequence of signals, on the other we are not able to catch profitable opportunities. For this reason we consider the 20-day moving average a good trade-off.

Table 5. Trend signals by different moving averages.

MA Lag   15   10   20   30   40
Down     45   44   46   45   36
Stable   16   16   21   29   39
Up       62   58   56   49   48

The other element in the definition of the trend able to affect the classification of signals in our model is the deviation factor. Indeed, a larger factor has the effect of enlarging the central band labeled as "stable", as illustrated in Tab.6.

Table 6. Trend signals as provided by the mean deviation rate.

SD Factor   0.5   1.0   1.5   2.0
Down        57    46    33    12
Stable      7     21    51    97
Up          63    56    39    14

Furthermore, we verified whether the opening price has an effect on model accuracy. In this case we use the model according to the scheme outlined in Fig.5. Experimental results did not show any evidence of improvement when the open is considered for determining the trend of the current day. The model is influenced largely by the recent observations on the trend, more than by index or price variations. This is due mostly to the inertia of the moving average in changing the trend signal.

In the testing period there are 71 days providing a positive return. Among them, 20 days recorded a return over 1%. The trend model labeled the 71 days as "up" in 31 cases, "stable" in 27, and "down" in 13 cases, resulting in a negative-match rate of 18.31%. With respect to the days with over 1% profit, the model labeled 6 days as "up", 3 as "stable" and 11 as "down". The model is thus not able to capture large increments that generally follow a sequence of price decrements. Similar conclusions can be drawn for the remaining 52 negative days: 25 were labeled as "up", 8 as "stable" and 19 as "down".

In summary, although the model provides robust predictions about the trend in the short term, the inertia of the trend could lead to missing profitable trading opportunities. This problem seems to be structural rather than due to a specific definition of trend: changing the criteria by which the trend is determined would not necessarily lead to a more reactive, yet still accurate, prediction model. This suggests considering history in a different way.

6. Conclusions and future work

We proposed an HHMM organized in 3 layers, able to take into account the dependency of the value percentage variation on the trend history within a finite time window. Experimental results are encouraging. However, a major limitation comes from the inertia of the trend. This makes the model less reactive in capturing movements, which are stronger especially at the beginning of a new trend.

Predicting the near future is one aspect of designing profitable day trading strategies. In the future, we aim at experimenting with HHMMs as a tool to link trading strategies operating at high frequency to the market trend.

References

[1] E. Fama, "Efficient capital markets: A review of theory and empirical work," Journal of Finance, vol. 25, pp. 383-417, 1970.

[2] R. D. Edwards, J. Magee, and W. H. C. Bassetti, Technical Analysis of Stock Trends. CRC Press, 2007.

[3] A. W. Lo, H. Mamaysky, and J. Wang, "Foundations of technical analysis: Computational algorithms, statistical inference, and empirical implementation," Journal of Finance, vol. 55, pp. 1705-1765, 2000.

[4] J. S. Abarbanell and B. J. Bushee, "Fundamental analysis, future earnings, and stock prices," Journal of Accounting Research, vol. 35, no. 1, 1997.

[5] ——, "Abnormal returns to a fundamental analysis strategy," The Accounting Review, vol. 73, no. 1, pp. 19-45, 1998.

[6] P. Wilmott, Paul Wilmott on Quantitative Finance, 3 Volume Set, 2nd ed. Wiley, March 2006.

[7] W. Gilchrist, "Statistical forecasting - the state of the art," Omega, vol. 2, no. 6, pp. 733-750, 1974.

[8] E. Saad, D. Prokhorov, and D. Wunsch, "Comparative study of stock trend prediction using time delay, recurrent and probabilistic neural networks," IEEE Transactions on Neural Networks, vol. 9, no. 6, pp. 1456-1470, 1998.

[9] W. Leigh, R. L. Purvis, and J. M. Ragusa, "Forecasting the NYSE composite index with technical analysis, pattern recognizer, neural network, and genetic algorithm: a case study in romantic decision support," Decision Support Systems, vol. 32, no. 4, pp. 361-377, 2002.

[10] M. A. Kaboudan, "Genetic programming prediction of stock prices," Computational Economics, vol. 16, pp. 207-236, 2000.

[11] O. Jangmin, J. W. Lee, S.-B. Park, and B.-T. Zhang, "Stock trading by modelling price trend with dynamic Bayesian networks," in IDEAL, ser. Lecture Notes in Computer Science, Z. R. Yang, R. M. Everson, and H. Yin, Eds., vol. 3177. Springer, 2004, pp. 794-799.

[12] S. Fine, Y. Singer, and N. Tishby, "The hierarchical hidden Markov model: Analysis and applications," Machine Learning, vol. 32, pp. 41-62, 1998.
