2009 IEEE International Conference on Industrial Technology (ICIT), Churchill, Victoria, Australia, 10-13 February 2009

Intelligent Transport Navigation System using LookAhead Continuous KNN

Geng Zhao, Clayton School of IT, Monash University, Australia
Kefeng Xuan, Clayton School of IT, Monash University, Australia
David Taniar, Clayton School of IT, Monash University, Australia
Wenny Rahayu, Dept of Computer Science, La Trobe University, Australia
Bala Srinivasan, Clayton School of IT, Monash University, Australia

Contact emails: [email protected], [email protected]

Abstract—One of the most popular queries in vehicle navigation, the continuous k nearest neighbor query, has been widely addressed. However, no existing work focuses on the continuous LookAhead k nearest neighbor. Hence, in this paper we propose a new approach called Continuous LookAhead K Nearest Neighbor (CLKNN). A CLKNN query differs from the traditional continuous k nearest neighbor query in that mobile users are concerned only with the interest points in the forward space of the query point, according to a predefined moving direction. Interest points behind the moving query point are no longer of interest. We propose algorithms for both LookAhead KNN and Continuous LookAhead KNN: the former serves static query points, whereas the latter serves moving query points. Our experiments verify the applicability of the proposed approach to queries that involve LookAhead k nearest neighbors continuously.

I. INTRODUCTION

Intelligent transport systems have been used to manage vehicles, routes, and loads in order to save time and fuel, and even to improve safety. For navigation in road networks, most research in mobile databases focuses on queries such as range search [1] and KNN [1, 2, 3, 4, 5]. Range search is used when a user wants all objects within a fixed distance from the query point, whereas KNN search is used when a user wants the k objects closest to the query point. In KNN, k is pre-defined and the directions from the query point to the interest points are not critical.

However, users may want to find, for example, the 3 nearest neighbors along the path from A to B, where only 'future' interest points are required. In other words, interest points that we have left behind are no longer of interest. In this case, the path direction from A to B and the current position of the query point are critical. This is a new kind of mobile query that has not been addressed in any previous work. In our case, we are interested only in interest points that lie in the forward direction of the path from the current user location (or query point), which is moving along the predefined path.

To address this, we define two novel mobile navigation queries: LookAhead k nearest neighbor (LKNN) for static query points, and LookAhead "continuous" k nearest neighbor (CLKNN) for dynamic/moving query points. The LookAhead k nearest neighbor query is defined as follows: given a start point and a destination point, for any query point, find the k nearest neighbors that lie ahead of it in the moving direction. The LookAhead "continuous" k nearest neighbor query aims at finding the split points along the path where the static LookAhead KNN result changes.

II. RELATED WORK

In this section, we give an overview of previous work related to KNN (e.g. VN3 [3], PINE [5]), and CKNN (e.g. DAR, eDAR [6] and Intersection Examination (IE) [7]).

A. KNN algorithms

There are a number of important existing works on k nearest neighbor (KNN) search based on network distance. The first two are Incremental Euclidean Restriction (IER) and Incremental Network Expansion (INE), proposed by Tao et al. in 2004 [3]. IER restricts the candidate interest points using the kth node's Euclidean distance and checks all interest points within this range. The INE algorithm expands outward from the query point and locates interest objects by an expansion similar to Dijkstra's algorithm [7]. These two approaches are inapplicable to us, because they retrieve all closer interest points, so points that we have already passed may also be included.

Two further works, VN3 and PINE, are both based on the Voronoi diagram. VN3, proposed by Kolahdouzan et al. [4], keeps the result in ascending order, adopts a filter/refinement step to generate and prune candidate results, and uses localized pre-computed network distances to save response time.

The other approach is PINE, proposed by Safar [5]. PINE first divides the large network into small regions by generating a network Voronoi diagram from the interest points [5]. Each interest point is a cell's centre, and every point on the border between two adjacent cells is equidistant from the two adjacent interest points. One advantage of PINE is that it pre-computes distances across the regions to save disk accesses. PINE also uses region expansion instead of network expansion and ignores the inner network within each region.

All of these approaches are concerned merely with the distance from an interest point to the query point; they ignore the path direction to the interest points. Additionally, these approaches are not relevant to CKNN because they do not consider the moving direction. In contrast, our proposed LookAhead KNN is


tightly related to the moving direction, and all LookAhead KNN results lie ahead of the query point. This is why the previous approaches cannot be used.

B. CKNN algorithms

There are two important existing works on Continuous KNN (CKNN). The first is DAR/eDAR, proposed by Safar et al. [6].

These algorithms are based on PINE, which uses road networks as the underlying map. They start by dividing the path into segments at every network intersection node. They then find the KNN tables for two adjacent nodes, compare the two tables, and swap positions to make the two tables identical. Every swap yields a split point, and when the two tables are exactly the same, all split points have been found.

The other approach is Intersection Examination (IE) [8], which is based on VN3. IE separates the pre-defined moving path into segments, as PINE does. IE then tries to find the split points by defining the trend of each interest point in the current KNN result list. Whenever the position of an interest point changes, a split point is found.

All of these works are considered traditional CKNN because they retrieve any objects close to the query point, regardless of whether those objects lie at points we have already passed and will never return to.

In contrast, our proposed LookAhead CKNN is concerned only with forward interest points: all nearest interest points that are ahead of the current query point. Hence, the existing approaches are not applicable in our case.

III. THE PROPOSED ALGORITHM

LookAhead KNN (LKNN) finds the KNN that lies in the forward direction of the moving path, and CLKNN, which we propose in this paper, finds the split points where the LKNN result changes. In this section, we describe LKNN first, and then CLKNN.

A. LookAhead KNN (LKNN)

LKNN is a conditional KNN in which all interest points lie in the forward direction of the moving path. So first, we define the direction line. Second, we define the space division line for a given query point and separate the space into forward and backward half-spaces. Finally, we find the LKNN by expanding the path, restricted to the forward half-space.

Definition 1: The direction line SE is the straight line from the starting point S to the destination E.

Definition 2: The space division line lq of query point q is, among the perpendicular lines to the direction line SE drawn through every point of the path traversed up to q, the one closest to E.

Definition 3: The forward half-space SFq of query point q is the half-space divided off by lq that includes the destination point E, together with the space division line lq itself. The backward half-space SBq of query point q is the half-space divided off by lq that includes the start point S.

Definition 4: For a given segment MN, if every point of MN is in SFq, we define MN as a forward segment for q, expressed as Spaceq(MN) = Forward; otherwise, if any point of MN falls into SBq, we define MN as a backward segment for q, expressed as Spaceq(MN) = Backward.

Definition 5: For a given query point q and a given interest point P, let PT be a path that links q and P, and let PTbranch be the part of PT that does not overlap the moving path. If every point of PTbranch is in SFq, then P is defined as a forward interest object for q with regard to PT, expressed as Spaceq,PT(P) = Forward. Otherwise, if any point of PTbranch falls into SBq, P is defined as a backward interest object for q with regard to PT, expressed as Spaceq,PT(P) = Backward.

Definition 6: For an interest point P and its path PT, if Spaceq,PT(P) = Forward, the main path is the part of PT that overlaps the given moving path, expressed as Mainq,PT(P), whereas the branch path is the rest of PT, expressed as Branchq,PT(P).

Property 1: If Spaceq,PT(P) = Forward, any part of PT may touch lq, but no node may cross lq into SBq.

Property 2: If P is in SBq, for any PT, Spaceq,PT (P)=Backward.

Property 3: For an interest point P, if PT has more than one intersection node with lq, we cannot determine whether Spaceq,PT(P) = Forward or Backward.

Property 4: There is the chance that for the same interest point P, Spaceq,PT1(P)=Forward and Spaceq,PT2(P)=Backward.
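As a concrete illustration of Definitions 1-3, the forward/backward test reduces to a projection onto the direction line. The sketch below is hypothetical (the paper gives no code); it assumes 2D coordinates and that q is the farthest-advanced point of the path so far, so that lq is simply the perpendicular to SE through q.

```python
def half_space(S, E, q, P):
    """Classify P against query point q's space division line l_q.

    Assumes l_q is the perpendicular to the direction line SE through q
    (i.e. q is the path point whose perpendicular is closest to E so far).
    P is in the forward half-space S_Fq, which includes l_q itself, when
    its projection onto SE is at least as far along as q's.
    """
    dx, dy = E[0] - S[0], E[1] - S[1]               # direction of line SE
    proj = (P[0] - q[0]) * dx + (P[1] - q[1]) * dy  # signed advance of P past l_q
    return "Forward" if proj >= 0 else "Backward"
```

A point lying exactly on lq is classified as Forward, since Definition 3 includes lq in SFq.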

The proposed LKNN algorithm is given in Figure 1.

Algorithm LKNN (q, k, start point S, end point E)
Input: number of objects k, query point q, start point S, end point E
Output: LKNN result
1. Initialize lq: from q, draw a perpendicular line to SE.
2. ninj = the segment that covers q. // ni is an intersection node or an interest object
3. dmax = ∞. // dmax is the kth interest point's distance to q
4. If Spaceq,qni(j)(ni(j)) = Forward,
   IPS = <(ni(j), dist(q, ni(j)), ni(j)'s arriving path)>
   // add ni, nj, or both into IPS if Spaceq,qni(j)(ni(j)) = Forward
   // ni can be an intersection node of the path or an interest object
   // IPS is a set of candidates in ascending order of dist(q, n)
5. Check each node n in IPS; if it is an interest point, put it into RS.
   RS = <(n, dist(q, n), n's arriving path)>
6. If the number of nodes in RS is k, let the kth node be nk and set dmax = dist(q, nk).
7. Dequeue the top node n in IPS with the smallest dist(q, n).
8. While (dist(q, n) < dmax)
9.   For each adjacent node nx of n where the link nnx is in SFq
10.    Enqueue (nx, dist(q, nx), n's arriving path + nnx) into IPS
11.    If nx is an object node, check whether it is already in RS.
12.      If not, put (nx, dist(q, nx), n's arriving path + nnx) into RS
13.      Else update nx in RS with the smaller distance.
14.    If the number of nodes in RS is k, let the kth node be nk and set dmax = dist(q, nk).
15. End while.
16. Go to step 7 until IPS is empty.
End LKNN

Figure 1. The proposed LKNN algorithm

Now, the process to find the LKNN begins.

First, define an empty Interest Point Set (IPS) to collect a set of candidate interest points.

Second, locate the segment that covers q, and put its start node ni, its end node nj, or both into IPS if Spaceq,qni(j)(ni(j)) = Forward.


Third, check every interest object on segment ninj. For each object P on this segment, check whether Spaceq,qP(P) = Forward. If so, put (P, dist(q, P), qP) into the Result Set (RS), which stores the candidate interest objects in ascending order of their distance to q.

Fourth, expand the node n on top of IPS. Dequeue n and add n's adjacent nodes nx (including both network nodes and interest objects) into IPS only when Spaceq(nnx) = Forward. If an interest point is added into IPS, check whether it is already in RS. If it is, compare the new distance to q with the old one; keep the smaller one and update the link field as well as the distance field.

Finally, continue to expand the path until the number of interest points in RS reaches k and all nodes in IPS are farther away than the kth node in RS. Another condition that may terminate the algorithm is that IPS becomes empty, which means that no more forward interest points for q can be retrieved.
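The expansion just described can be sketched as a best-first (Dijkstra-style) search restricted to forward segments. This is a minimal illustration rather than the authors' implementation; the adjacency-list graph model and the is_forward callback standing in for the Spaceq test are assumptions.

```python
import heapq

def lknn(graph, objects, q, k, is_forward):
    """Best-first LKNN sketch: expand only segments that is_forward accepts.

    graph   : {node: [(neighbor, edge_length), ...]} adjacency list
    objects : set of nodes that are interest points
    is_forward(n, nx): stands in for the Space_q(n nx) = Forward test
    Returns up to k (distance, point, arriving path) triples, nearest first.
    """
    ips = [(0.0, q, (q,))]   # IPS: candidates ordered by distance to q
    best = {q: 0.0}          # best known network distance per node
    rs = {}                  # RS: interest point -> (distance, arriving path)
    while ips:
        d, n, path = heapq.heappop(ips)
        if d > best.get(n, float("inf")):
            continue                         # stale queue entry
        if n in objects and (n not in rs or d < rs[n][0]):
            rs[n] = (d, path)                # keep the shorter arriving path
        if len(rs) >= k:
            dmax = sorted(v[0] for v in rs.values())[k - 1]
            if d > dmax:                     # kth result found, rest are farther
                break
        for nx, w in graph.get(n, []):
            if not is_forward(n, nx):
                continue                     # skip backward segments
            if d + w < best.get(nx, float("inf")):
                best[nx] = d + w
                heapq.heappush(ips, (d + w, nx, path + (nx,)))
    return sorted((d, p, pt) for p, (d, pt) in rs.items())[:k]
```

On an input shaped like the Figure 2 example, the search pops nodes in order of distance to q and terminates once the kth interest point is confirmed or the queue empties.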

To make our algorithm clearer, let us look at Figure 2. In this example, we want to find the LookAhead 3NN for query point q when travelling from S to E.

Figure 2. The example of LookAhead KNN algorithm

• Step 1: SE is the direction of the path; dmax = ∞; lq is q's space division line. Initialize IPS to store all expansion routes and RS to store the result.

• Step 2: Find AB, the segment that covers q. Because Spaceq,qA(A) = Backward and Spaceq,qB(B) = Forward, add B into IPS = <(B, 1, q→B)>.

• Step 3: Expand node B, which is on top of IPS. C and P6 are B's adjacent nodes. Because Spaceq(BC) = Forward and Spaceq(BP6) = Forward, C and P6 are added into IPS: IPS = <(C, 2, q→B→C), (P6, 4, q→B→P6)>. Add P6 (an interest point) into RS = <(P6, 4, q→B→P6)>.

• Step 4: Expand C, which is on top of IPS. D and P4 are C's adjacent nodes. Because Spaceq(CD) = Forward and Spaceq(CP4) = Forward, D and P4 are added into IPS: IPS = <(P6, 4, q→B→P6), (P4, 4, q→B→C→P4), (D, 4, q→B→C→D)>, and RS = <(P6, 4, q→B→P6), (P4, 4, q→B→C→P4)>.

• Step 5: Expand P6. F is P6's adjacent node. Because Spaceq(P6F) = Forward, we add F into IPS: IPS = <(P4, 4, q→B→C→P4), (D, 4, q→B→C→D), (F, 6, q→B→P6→F)>.

• Step 6: Expand P4. H is P4's adjacent node. Because Spaceq(P4H) = Forward, we add H into IPS: IPS = <(D, 4, q→B→C→D), (H, 5, q→B→C→P4→H), (F, 6, q→B→P6→F)>.

• Step 7: Expand D. E is D's adjacent node. Because Spaceq(DE) = Forward, we add E into IPS: IPS = <(H, 5, q→B→C→P4→H), (E, 5, q→B→C→D→E), (F, 6, q→B→P6→F)>.

• Step 8: Expand H. P2 is H's adjacent node. Because Spaceq(HP2) = Backward, we simply remove H from IPS: IPS = <(E, 5, q→B→C→D→E), (F, 6, q→B→P6→F)>.

• Step 9: Expand E. P5 and P7 are E's adjacent nodes. Because Spaceq(EP5) = Forward and Spaceq(EP7) = Forward, P5 and P7 are added into IPS: IPS = <(F, 6, q→B→P6→F), (P5, 8, q→B→C→D→E→P5), (P7, 11, q→B→C→D→E→P7)>, and P5 and P7 are added into RS: RS = <(P6, 4, q→B→P6), (P4, 4, q→B→C→P4), (P5, 8, q→B→C→D→E→P5)>. As k = 3, ignore P7. The 3rd LKNN has been found, so dmax = 8.

• Step 10: Expand F. P5 is F's adjacent node. Because Spaceq(FP5) = Forward, we add P5 into IPS. As P5 is already in IPS, we update it with the smaller distance to q: IPS = <(P5, 7, q→B→P6→F→P5), (P7, 11, q→B→C→D→E→P7)>. Because P5 is an interest point already in RS, we update it with the nearer distance and set dmax = 7: RS = <(P6, 4, q→B→P6), (P4, 4, q→B→C→P4), (P5, 7, q→B→P6→F→P5)>.

• Step 11: The 3 LKNN have been found and all nodes in IPS are farther from q than dmax, so the algorithm terminates. The final results are P6, P4 and P5.

B. Continuous LookAhead KNN (CLKNN)

The previous approach gives the solution for obtaining the LKNN of a given point; how to find the CLKNN then becomes crucial, because it means the query point is moving. It is impractical to compute the LKNN result for every point on the path, because of slow response time and huge overhead. Split points have been used both in continuous range search and in continuous KNN, and the same idea is used in our proposed CLKNN algorithm.

CLKNN aims at finding the split points where the LKNN result changes. Another advantage of the CLKNN proposal is that it does not divide the moving path into small segments as both DAR and IE do. It ignores all interest points on the path and derives the split points from the conditions we define below. Consequently, our approach saves time.

The following paragraphs illustrate the conditions under which the LKNN result changes, in other words, where a split point exists. In summary, while the query point is moving, the space division line is also moving. There will be a point where the previous LKNN result is no longer the LKNN for the current query point any


more. This location is a split point. We will show different scenarios using examples.

1) No position change in the LKNN results

Lemma 1: In CLKNN, for a query point q, let its LKNN result be <(P1, dist(P1, q), PT1), (P2, dist(P2, q), PT2), ..., (Pk, dist(Pk, q), PTk)>. While q is moving, the result P1, P2, ..., Pk will not change until, for at least one interest point Pi along its original path PTi, Spaceq,PTi(Pi) = Backward.

Pre-assumptions:
1) The 1st query point location is q1 and the 2nd is q2.
2) For every interest point P, Spaceq,PT(P) = Forward for any point q between q1 and q2, where PT is the original path in the result list.
3) At q1, the result is <(P1, dist(P1, q1), Mainq1,PT1(P1) + Branchq1,PT1(P1)), ..., (Pk, dist(Pk, q1), Mainq1,PTk(Pk) + Branchq1,PTk(Pk))>.

Conclusion to be proved: The result for q2 will be <(P1, dist(P1, q2), Mainq2,PT1(P1) + Branchq2,PT1(P1)), ..., (Pk, dist(Pk, q2), Mainq2,PTk(Pk) + Branchq2,PTk(Pk))>. Not only do the interest points not change; even their sequence and the paths to them remain the same.

Proof: From pre-assumption (2) and the definition of a forward interest point, we can draw the following conclusions:

1. For every q between q1 and q2, Branchq,PT1(P1), ..., Branchq,PTk(Pk) all lie in SFq.

2. Segment q1q2 is part of each of Mainq1,PT1(P1), ..., Mainq1,PTk(Pk), because if some Mainq1,PTi(Pi) did not contain q1q2, there would be a point qi not included in Mainq1,PTi(Pi); in other words, at point qi, Pi would no longer be valid. This contradicts our pre-assumption.

For any P in {P1, P2, ..., Pk}, the result at query point q2 is (P, dist(P, q1) − dist(q1, q2), Mainq1,PT(P) − q1q2 + Branchq1,PT(P)): that is, dist(P, q2) = dist(P, q1) − dist(q1, q2), the main path at q2 equals the main path at q1 minus the q1q2 part, and the branch path stays the same. In summary, since every distance decreases by the same amount, equal to the distance the query point has moved, as long as all the interest points remain valid while the query point moves, their positions in the result list do not change. □
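The consequence of Lemma 1 is purely arithmetic: between split points, the result at q2 is the result at q1 with every distance reduced by dist(q1, q2), so no re-query is needed. A tiny hypothetical sketch (the tuple layout is an assumption):

```python
def advance_result(result, moved):
    """Shift an LKNN result list from q1 to q2 without re-querying.

    result: [(point, distance_from_q1, path), ...] as produced at q1
    moved : dist(q1, q2), the distance the query point has travelled
    Per Lemma 1, while every point stays forward, only distances change:
    dist(P, q2) = dist(P, q1) - dist(q1, q2); order and paths are kept.
    """
    return [(p, d - moved, path) for (p, d, path) in result]
```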

Figure 3. Example of no split point for changing LKNN result position

Figure 3 illustrates Lemma 1 in detail. q's 2-LNN result is 2NN = <(P4, 4, q→B→C→P4), (P5, 8, q→D→E→P5)>. When the query point moves from q to B, the distance from the query point to each of P4 and P5 decreases by 1, which is the length of qB. According to Lemma 1, from q to C (C being the point where, along PT (q→B→C→P4), P4 moves into C's backward space), the sequence of interest points and the paths for reaching them remain the same.

2) Split point existence condition

Lemma 2: A split point occurs only when the following two conditions are satisfied: 1) in the current LKNN result list, one or more interest points' branch paths intersect the space division line; and 2) the space division line moves along with the query point.

Pre-assumption: Suppose there is an interest point Pi in the current LKNN result list whose branch path is Branchq,PTi(Pi). At the current query point location q, Branchq,PTi(Pi) has one or more intersection nodes N with the current query point's lq.

Conclusions to be proved:
1. If lq moves along with q, there will be a split point.
2. If lq stays static while q is moving, this is not a split point.

Proof: Conclusion (1) is proven easily: if lq moves forward while some interest point in the LKNN result list has a branch path intersecting it, part of that branch will definitely fall into q's backward space; in other words, a split point occurs because the LKNN result changes.

Conclusion (2) means that if lq is static for a while, Pi remains valid during that period. As long as Pi is valid, the point cannot be a split point. The actual split point occurs where lq begins to move again. □

Figure 4. Example of lemma 2

We will use Figure 4 as an example of conclusions 1 and 2 above. Suppose that at q1 the LookAhead 3NN is 3NN = <(P1, dist(P1, q1), q1→A→P1), (P2, dist(P2, q1), q1→C→P2), (P3, dist(P3, q1), q1→B→P3)>.

When q moves from q1 to q2, Branchq2,AP1(P1) has an intersection node with lq2, and lq2 keeps moving forward after q2. As q2 satisfies conditions (1) and (2), we conclude that q2 is a split point.

Then q moves from q2 to q3. Branchq3,CP2(P2) has an intersection node with lq3, which satisfies condition (1) of Lemma 2. But from the definition of the space division line, when q moves from q3 to q4, lq remains static. So q3 is a false split point, because it does not satisfy the second condition of Lemma 2, and q4 is a real split point. In the same way, q5 is a false split point and q6 is a real split point.
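The two conditions of Lemma 2 amount to a simple filter over candidate positions. The sketch below is hypothetical: it assumes each candidate is pre-annotated with whether some branch path intersects lq there and whether lq is moving at that position, as in the Figure 4 walk-through.

```python
def real_split_points(candidates):
    """Keep only candidates satisfying both Lemma 2 conditions.

    candidates: [(position, branch_intersects_lq, lq_moving), ...]
    A position is a real split point only if (1) some interest point's
    branch path intersects the space division line l_q there, and
    (2) l_q moves forward with the query point at that position.
    """
    return [pos for pos, intersects, moving in candidates
            if intersects and moving]
```

For a scenario shaped like Figure 4, q2, q4 and q6 pass both tests while q3 and q5 fail condition (2), matching the discussion above.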

3) Supplemental split point condition

There may be a coincidence that at some query point, the result, and even the result sequence, remain the same while only the path to some interest point changes; we still consider such a point a split point.


Figure 5 shows an example of this coincidence. When q is moving from q1 to q2, the result of LookAhead 3NN is 3NN = <(P4, 4, q2→C→G→P4), (P5, 7, q2→C→H→D→E→P5), (P7, 10, q2→C→H→D→E→P7)>. Branchq2,q2CP4(P4) = CP4 has an intersection node with lq2, and lq2 is moving forward, so q2 is theoretically a split point. Indeed, when we compute the LookAhead 3NN at q2, the result becomes 3NN = <(P4, 5, q2→C→H→P4), (P5, 7, q2→C→H→D→E→P5), (P7, 10, q2→C→H→D→E→P7)>, which means that P4, P5, P7 and even their sequence remain the same, but the path to P4 changes from q2→C→G→P4 to q2→C→H→P4.

Figure 5. Example of scenario 3

4) Split point at a branch path entry point

Lemma 3: If an interest point's branch path does not intersect lq, there will be a split point when the query point reaches the branch path's entry point, provided the space division line moves along with the query point.

Proof: This condition is similar to scenario (2), the split point existence condition, if we treat the entry point dividing the main path from the branch path as part of the branch path. So it is self-evident. □

Figure 6 gives an example of this condition. We can conclude that C is indubitably a split point, because when we reach C, P4 drops out of the LKNN result: P4's branch path is about to intersect the space division line, provided the space division line moves forward.

Figure 6. Example of a split point at branching path entry point

In summary, every point where one or more interest points' branch paths, or branch path entry points, have an intersection node with the current query point's space division line becomes a split point if the space division line moves along with the query point past that point. This is the only condition under which a split point occurs. One thing worth mentioning is that at a split point, even when the interest points and their positions are exactly the same as before, it is still a split point, because some interest point's path to the query point must have changed.

Let us go further and show LookAhead CKNN in Figure 7. From the earlier LKNN example, the LookAhead 3NN of q is P6, P4 and P5.

According to Lemma 2, q will be a split point, because P6 drops out as soon as we move a little forward. We then recompute the LookAhead 3NN for the node after the move; the LookAhead 3NN becomes P4, P5 and P7.

Then, when we move to C, according to Lemma 3, C will be the 2nd split point and the LookAhead 3NN result changes.

Moving on until I, P5 drops out, so I is the 3rd split point. The process continues until all split points along the road are found.

Suppose P8 is one of the KNN results at the node before G. When we move to G, P8's branch path intersects the space division line, but G is not the split point while J is, because the space division line remains static from G onward and only starts moving forward again after J.

Figure 7. Example of CLKNN algorithm

IV. PERFORMANCE EVALUATION

In the experiments, we use the Melbourne city map and the Gippsland map from the Whereis website [10]. All interest points, network links and intersection nodes are real-world data. In our experiments, we choose 100 interest points and 5 different query positions on 20 different 1 km moving paths to illustrate how the expansion steps, runtime and number of split points vary with interest point density, moving path and k.

A. LKNN Performance Evaluation

For LKNN, expansion steps and runtime vary between low-density and high-density interest points. We used 15 interest points to represent a low-density sample and 100 interest points to represent a high-density one. The average runtime and expansion steps for each k from 1 to 5 are computed over 5 different query points on each moving path. From the evaluation we can conclude that, for the same map, if the interest point density is low, the runtime increases, because more time is needed to expand the segments. Likewise, if k increases, the runtime increases, because more time is spent finding more interest points. Figures 8 and 9 give the experimental results for LKNN performance.


Figure 8. Expansion steps between different density interest points

Figure 9. Runtime between different density interest points

B. CLKNN Performance Evaluation

For CLKNN, the number of split points is the most important criterion. Suppose k is 3 and the number of interest points is 20. We test 20 moving paths (about 10 km each) containing different numbers of times (from 0 to 7) that the query point moves backward along the path. In general, the number of split points increases when the moving path has more U-turns. The experiments also show that there is little relationship between k and the number of split points under different density conditions, as shown in Figure 10.

Figure 10. Split points in different number of query points moving back

V. CONCLUSION

In this paper, we have presented an approach to LookAhead k nearest neighbor (LKNN) and LookAhead "Continuous" k nearest neighbor (CLKNN) queries based on network distance on a road network.

The key point of this approach is how to define the space division line and the space to which a given interest point belongs. The definition of the space division line is based on most users' notion of forward space: although a user may temporarily go backward, the space division line does not move backward with the user, because the general direction of travel is forward. The same idea underlies the definition of the moving direction line. We do not let the moving direction line change along the path, because even if the user makes a U-turn on the path, we cannot say the moving direction has reversed. All other ideas are based on these two definitions. Finally, the proposed LKNN approach can give LKNN results for any query point.
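To illustrate the fixed forward-direction idea described above, the following sketch (our own simplification, not the paper's algorithm: it uses Euclidean rather than network distance, and the function name and point representation are hypothetical) keeps only the interest points whose displacement from the query point has a positive projection onto the fixed moving-direction vector, and returns the k nearest among them:

```python
import heapq
import math

def lookahead_knn(query, direction, points, k):
    """Return the k interest points nearest to `query` that lie in the
    forward half-space defined by the fixed `direction` vector.
    Euclidean distance is used here for simplicity; the paper's
    approach works on network distance over the road network."""
    dx, dy = direction
    forward = []
    for p in points:
        # A point is "ahead" if its displacement from the query point
        # has a positive projection onto the fixed moving direction.
        px, py = p[0] - query[0], p[1] - query[1]
        if px * dx + py * dy > 0:
            forward.append(p)
    # k nearest among the forward points only.
    return heapq.nsmallest(
        k, forward,
        key=lambda p: math.hypot(p[0] - query[0], p[1] - query[1]))

# The point (-1, 0) lies behind the query point, so it is excluded
# even though it is the closest point overall.
print(lookahead_knn((0, 0), (1, 0), [(-1, 0), (2, 0), (3, 1), (5, 5)], 2))
# → [(2, 0), (3, 1)]
```

Because the direction vector stays fixed, a temporary backward movement of the query point does not flip the half-space, matching the intuition that interest points already passed are no longer of interest.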

CLKNN can compute split points without dividing the moving path into small segments, unlike the existing work by other researchers. Compared with other continuous nearest neighbor search methods, our method is a significant improvement in both response time and overhead. In addition, we performed several experiments to measure the performance of the algorithms under different network conditions.

This paper adds a most useful condition, LookAhead, to KNN. This condition makes the algorithm more practical, as most drivers are not concerned with interest points they have already passed. This approach enriches our intelligent transport system and gives more benefits to mobile users.

In the future, incorporating intelligent techniques [11] into mobile query processing, and addressing various indexing and broadcasting schemes [12, 13, 14, 15] for mobile query processing, will be important for addressing performance issues.

REFERENCES

[1] D. Papadias, J. Zhang, N. Mamoulis, and Y. Tao, “Query Processing in Spatial Network Databases”, Proc. of VLDB, Morgan Kaufmann, pp. 802-813, 2003.

[2] M. Kolahdouzan, and C. Shahabi, “Voronoi-Based K Nearest Neighbor Search for Spatial Network Databases”, Proc. of VLDB, Morgan Kaufmann, pp. 840-851, 2004.

[3] N. Roussopoulos et al., “Nearest Neighbor Queries”, Proc. of SIGMOD, ACM Press, pp. 71-79, 1995.

[4] A. Okabe, B. Boots, K. Sugihara, and S. Nok Chiu, Spatial Tessellations: Concepts and Applications of Voronoi Diagrams, John Wiley and Sons Ltd., 2nd edition, 2000.

[5] M. Safar, “K nearest neighbor search in navigation systems”, Journal of Mobile Information Systems, 1(3):1-18, 2005.

[6] M. Safar, and D. Ebrahimi, “eDAR Algorithm for Continuous KNN Queries Based on Pine”, Intl. J. of Information Technology and Web Engineering, 1(4):1-21, 2006.

[7] E. W. Dijkstra, “A Note on Two Problems in Connexion with Graphs”, Numerische Mathematik, 1:269-271, 1959.

[8] M. Kolahdouzan, and C. Shahabi, “Alternative Solutions for Continuous K nearest neighbour Queries in Spatial Network Databases”, Geoinformatica, 9(4):321-341, 2005.

[9] Y. Tao, D. Papadias, and Q. Shen, “Continuous Nearest Neighbor Search”, Proc. of VLDB, pp. 507-518, 2002.

[10] Telstra Corporation, 2008, whereis, Melbourne, viewed 10 June, 2008, http://www.whereis.com

[11] J. Goh, and D. Taniar, “Mining Frequency Pattern from Mobile Users”, Proc. of the 8th Intl. Conference on Knowledge-Based Intelligent Information and Engineering Systems, Part III, LNCS 3215, Springer, pp. 795-801, 2004.

[12] D. Taniar, and J.W. Rahayu, “A Taxonomy of Indexing Schemes for Parallel Database Systems”, Distributed and Parallel Databases, 12(1):73-106, 2002.

[13] D. Taniar, and J.W. Rahayu, “Global Parallel Index for Multi-Processors Database Systems”, Information Sciences, Elsevier, 165(1-2):103-127, 2004.

[14] A.B. Waluyo, B. Srinivasan, and D. Taniar, “Optimal Broadcast Channel for Data Dissemination in Mobile Database Environment”, Proc. of the 5th Intl. Workshop on Advanced Parallel Programming Technologies (APPT), LNCS 2834, Springer, pp. 655-664, 2003.

[15] A.B. Waluyo, B. Srinivasan, and D. Taniar, “A Taxonomy of Broadcast Indexing Schemes for Multi Channel Data Dissemination in Mobile Database”, Proc. of the 18th Intl. Conference on Advanced Information Networking and Applications (AINA 2004), IEEE Computer Society, pp. 213-218, 2004.