An Augmented Reality Software
Architecture for Information
Retrieval
Zhe Li
U5832581 14 June 2019
This report is submitted for partial credit for the course Comp4560
“Advanced Computing Project”
Supervised by Henry Gardner
Name (ID card Number): 5832581
Supervisor: Henry Gardner
DECLARATION
I have read and understood the rules on cheating, plagiarism and appropriate
referencing as outlined in my handbook and I declare that the work contained in this
assignment is my own, unless otherwise acknowledged.
No substantial part of the work submitted here has also been submitted by me in other
assessments for this or previous degree courses, and I acknowledge that if this has
been done an appropriate reduction in the mark I might otherwise have received will
be made.
Signed candidate___Zhe Li_____________________________
Acknowledgement
First of all, I would like to express my heartiest thanks to my supervisor, Mr. Gardner,
who provided me so many valuable suggestions and a meticulous guidance regarding
the project over this period. This project could not be finished without each of his
prompt responses and inspirations.
I am also greatly thankful to my dearest friends and roommates for their continuous
encouragement.
Finally, I would like to express my heartfelt thanks to my parents for both the
financial and mental support they provided for me.
Table of Contents

CHAPTER 1: INTRODUCTION
1.1 Augmented Reality
1.2 Motivation
1.3 Contributions
CHAPTER 2: BACKGROUND
2.1 Development Tools
2.2 Other Applications
CHAPTER 3: DESIGN AND IMPLEMENTATION
3.1 User Requirements
3.1.1 User Survey
3.1.2 Use Case Activity Diagrams
3.2 Database
3.2.1 Vuforia Database
3.2.2 Online Database
3.3 Application Architecture
3.3.1 Client Server Architecture
3.3.2 System Flowchart
3.4 User Interface
3.4.1 User Interface Design Principles
3.4.2 Scene Design
3.4.3 Scene Switching
3.4.4 Information Board Function
CHAPTER 4: EVALUATION
4.1 Cognitive Walkthrough
4.2 Results
CHAPTER 5: DISCUSSION
CHAPTER 6: CONCLUSION
REFERENCES
APPENDIX
List of Figures

Figure 1. An example of AR technology adopted on iPhone
Figure 2. Yelp Monocle AR view
Figure 3. When are search failures most likely to occur?
Figure 4. Have you ever used AR products or not?
Figure 5. Use case activity diagrams for the designed application
Figure 6. The Vuforia processed image target with extracted features marked in yellow
Figure 7. Using phpMyAdmin to create and manage the MySQL database
Figure 8. The C# script and PHP script for querying the online database
Figure 9. System flowchart for explorers
Figure 10. System flowchart for contributors
Figure 11. The connectivity of the three scenes in this application
Figure 12. The welcome scene built in the Unity development environment
Figure 13. The inspector panel of the button and the code for scene switching
Figure 14. The translucent information board located at the top of the screen
Figure 15. The action sequence of the tested task
Figure 16. Screenshot of the "Initial searching interface" step
Figure 17. Screenshot of the "Search potential target" step
Figure 18. Screenshot of the "Request information about a target" step
Figure 19. Screenshot of the "Close information" step
Chapter 1
INTRODUCTION
1.1 Augmented reality
Augmented Reality (AR) enables views of real objects to be overlaid with computer-generated virtual models (as computer graphics) [1]. While Virtual Reality (VR) aims to create an immersive virtual environment that allows users to interact with a fully graphical world [2], AR focuses on juxtaposing relevant information with real items. Augmented reality (using 2D graphics) is already familiar to smartphone users since it is commonly used in camera functions, where symbols or slide bars are sometimes overlaid on a camera's image of a scene to adjust the focus or light intensity, as shown in Figure 1.
Figure 1. An example of AR technology adopted on iPhone
(https://www.imore.com/how-take-great-portrait-lighting-selfies-iphone-x)
With the growing mobile user base, the expectation and requirements of the mobile
applications are increasing. Since AR enables users to perceive objects in an interactive way, it could improve application performance in several areas such as industry, education and public services [3]. Also, the rapid development of hardware for mobile phones and tablet computers enables AR applications to be well designed and adopted on mobile platforms. With increasing computing power, AR applications will be able to become even more widespread and ambitious in the future. For instance, AR software might be able to detect and recognize targets in the camera frame by visual detection or GPS location and present useful, relevant information to users in real time.
In the future, AR applications will be required to handle problems in the real world
with much larger-scale data than they presently handle. Because of this, their software
architectures will need to incorporate databases that can be searched based on keys
that are discoverable in the world. The aim of this project is to build such software
architectures and to complete a simple application as a case study for illustrating them.
This simple case study is an app for discovering information about restaurants that
you find as you are walking around – information such as which restaurants come
highly recommended.
1.2 Motivation
Nowadays, traditional information retrieval methods are being challenged to satisfy
daily demands. Traditional information retrieval requires a user to type a name or
question into a search engine to get relevant results. However, in some cases this
requirement is hard to meet. For instance, users might be unable to type an object's
name correctly, particularly if the name is written in a foreign language. In addition,
users might only have a vague idea of the name to type in. Thus, information
searching methods need to be modified and improved.
Consider a situation where a user is facing an object in the real world and wants to
find out more information about that object. In this case a combination of AR and
computer vision techniques could provide an elegant solution: presenting a query to a
database and returning the results of that search to the user. Instead of searching for
information based on a textual description, such a method would be based on
recognition of a target part of an image captured by the mobile device's camera. The
related information would be overlaid on the detected target in the image.
To fulfill this purpose, a software architecture that properly connects a database with
images and returns information as AR graphics is needed. Also, a user interface is
needed to support and facilitate such an information search. The user interface needs
to be designed in a user-friendly style so that users can receive and understand
information in a pleasurable and understandable fashion. Such an experience of
pleasure, delight and playfulness is a major advantage of using augmented reality.
1.3 Contributions
In this project, a prototype Restaurant Recommender application has been completed
on the iOS platform and tested on an iPhone X. The main program is coded in Unity.
The online database and main framework are set up and hosted to store information
and answer requests for information from a cloud server. The Vuforia package is
enabled in the application to handle object detection and recognition.
This application enables users to receive useful information by aiming their mobile
phone camera at an object. Once the target is detected, different aspects of knowledge
about it are offered for users to select. Thus, users can gain the desired information
without typing text into a traditional search engine.
Chapter 2
BACKGROUND
2.1 Development Tools
One of the reasons that AR applications are becoming more widespread is that mature
and functionally rich software development environments are now available. Unity is
one such platform. As a well-known cross-platform game engine, it supports 3D and
2D development with a range of artist-friendly design tools. It also includes a
powerful built-in UI design system which enables developers to build up a user
interface easily. Unity also interfaces with several packages that extend its
functionality, including Vuforia, which provides powerful two-dimensional visual
recognition and makes Unity a proper tool for building an AR application. Thus, the
combination of Vuforia and Unity has become many software engineers' first choice
for developing AR software.
2.2 Other Applications
Several existing applications are designed to be "real-life search" engines like the
Restaurant Recommender. They aim to present information about the places or
locations around users [4]. Two of the most famous products with this feature are the
Wikitude SDK and Yelp; both pre-process geo-referenced data to determine the target,
which means the sensing of these AR applications is based on GPS rather than image
recognition [5]. This requires the developer to place a position-based marker so that
the target can be detected when the distance between the marker and the user is close
enough. This leads to the difficulty of defining the boundary of when to trigger the
AR function, because the density of detectable targets can differ significantly between
situations.
For example, shops and restaurants are concentrated in the city center while only a
few stores are scattered across rural areas. If the trigger range is too large and the
density of detectable targets is high, a large crowd of AR models will appear on
screen at the same time in the city center, which causes a terrible user experience as
the image of the real world is totally covered up by the computer graphics, as shown
in Figure 2.
Yelp launched an AR function named "Monocle" within their application in 2009. It
enabled users to find out about restaurants near them by aiming their mobile phone in
a certain direction [6]. Yelp later removed this function, possibly because of the
problem mentioned above.
Figure 2. Yelp - Monocle AR view
(https://www.manifest-tech.com/ce_gallery/portable_gallery_apps.htm)
Chapter 3
DESIGN AND IMPLEMENTATION
3.1 User requirements
3.1.1 User survey
A survey was designed and conducted by the author in order to understand users'
experience of search engines. It was conducted online with a sample size of 50. The
complete questionnaire is given in the appendix.
The result of Question 1 demonstrates that Google and Baidu are the two search
engines that people most commonly use, both of which adopt traditional search
patterns.
According to the outcome of Question 2, among a variety of new-style search engines,
those based on image and audio recognition have the largest numbers of users. The
reason could be that some software companies have started to apply audio and image
detection and recognition techniques in their products. For example, Shazam, a music
recognition application, was acquired by Apple in 2018. The responses to this
question indicate that people have a strong demand for new styles of search engine.
With regard to Questions 3 and 4, most of the interviewees chose "Yes" as the author
expected, showing that traditional search engines like Google and Baidu cannot fully
meet users' needs. A large number of users have experienced search failure because
of the dull way of typing text in order to search.
As shown below, the result of the next question goes on to show that the search
failures mentioned above mostly occur when working or traveling: 36% and 44%,
respectively.
Figure 3. When are search failures most likely to occur?
The survey also asked respondents whether they had ever used AR products. It can be
seen from the graph that 66% of respondents said no and 22% of them were not sure.
Only a few had used AR products before.
Figure 4. Have you ever used AR products or not?
The final two questions asked about users’ expectations of new types of search
engines from two aspects: input mode and output mode. Most users expressed that
recognizing and inputting items using a camera would bring a lot of convenience in
certain situations. In terms of output mode, displaying information in diversified
formats such as audio and 3D models is more attractive than traditional text.
Two overall observations can be drawn from the results of this survey. The first is
that traditional search engines might fail to satisfy users' expectations in some
circumstances, especially when people are travelling or working. The reason could be
that individuals easily encounter unknown items while travelling overseas or when
working with an unfamiliar item for the first time. For example, when visiting
another country people might be interested in experiencing local specialty food. They
walk down the street and find that the restaurants around them are mainly named in a
foreign language. In this situation, they might not even be able to type in a search
query if the foreign script is unrecognizable to them.
The second observation from the survey is that, although AR has now been developed
for several decades, users are still unfamiliar with using AR techniques in practical
settings. Indeed, although people are now becoming familiar with graphical overlays
on top of video images when using cameras on modern phones, many do not even
realize that this is an AR function and are unaware of where this technology might
lead.
3.1.2 Use Case Activity Diagrams
Figure 5. Use cases activity diagrams for designed application
3.2 Database
3.2.1 Vuforia Database
Vuforia provides a web-based tool named the Vuforia Target Manager for developers
to process target images and store them in a Vuforia database [16]. Of the three types
of Vuforia image database provided, Device Databases and Cloud Databases were
adopted to store the target images. A Cloud Database enables the application to
access images stored on a cloud server, which can significantly reduce the application
size. However, since Vuforia is a commercial product, use of the cloud service is
strictly limited for free accounts. Thus, the intended Vuforia Cloud Database was not
implemented during development, although the databases are designed for the cloud.
This could easily be achieved by purchasing the Vuforia cloud service when the
application is ready to be published.
For the development and testing phase, a Device Database was chosen to support the
Vuforia function in the Unity environment. It requires the developer to upload images
to the
Vuforia Target Manager, where the images are processed. The image detection and
recognition technology of Vuforia is based on feature extraction, which means that a
feature representation model of each image is created for local recognition [17].
Figure 6 shows the processed target image of a comic book with the extracted
features marked in yellow. In the information given at the right of Figure 6, the
Target ID is a unique, automatically generated identifier, and the Augmentable rating
indicates the expected detection performance [18]. To ensure detection accuracy,
usually only images with 4 or 5 stars are accepted; the Augmentable rating depends
heavily on the quality of the image itself.
Figure 6. The Vuforia processed image target with extracted features marked in yellow
After the image targets are generated, the Device Database can be exported in its
entirety from the Vuforia Target Manager as a Unity package. Finally, the package is
imported into Unity, which enables the application to access the image database
locally. In future work it could be replaced with a Cloud Database, as discussed in
the next section.
3.2.2 Online Database
MySQL was chosen to build the online database that stores the items' related
information, including their ID, English name, rating, etc. It is one of the most
popular free open-source relational databases; it accepts Structured Query Language
(SQL) for querying and supports multi-threaded communication [19]. Since it is
reliable, high-performing and easy to implement, it is widely used by a large number
of software companies [20].
In this application, a simple table was created with four columns, Name, Category,
Rating and Comment, which are considered the most desirable information that users
want to receive. A sample record was inserted to test the information communication.
At this stage, phpMyAdmin was introduced to implement and manage the MySQL
database. It is a free PHP-based tool that enables users to handle the administration of
a relational database and complete basic modification and query tasks [21].
Figure 7. Using phpMyAdmin to create and manage the MySQL database
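As a minimal sketch, the table described above could be created and populated as follows. The column types, lengths, and the table and record names are assumptions for illustration; the report does not specify them.

```sql
-- Hypothetical schema for the items table (types and lengths are assumed)
CREATE TABLE items (
    Name     VARCHAR(100) NOT NULL,
    Category VARCHAR(50),
    Rating   DECIMAL(2,1),
    Comment  TEXT
);

-- A sample record for testing the information flow (values are placeholders)
INSERT INTO items (Name, Category, Rating, Comment)
VALUES ('Sample Restaurant', 'Restaurant', 4.5, 'A placeholder entry for testing.');
```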
Next, the database needs to be hosted online to provide available remote access for
each device, which will be discussed in section in detailed.
3.3 Application Architecture
3.3.1 Client server architecture
Client-server architecture describes a computer system in which a powerful central
server controls and transmits most of the information to the clients. With MySQL, a
client-server architecture allows multi-user, multi-access operation with high
efficiency [22]. In this project, a hosting service provider named Hostinger was
selected, since it supports using MySQL to manage the database and provides a
convenient way to upload and handle files in the cloud. Also, a domain named
rr824.cf was registered to serve the hosting purpose.
Besides hosting the database online, the files stored online play an important role in
establishing the connection and determining how the database is used. As shown
below, a C# script is created to request access to a web page using the WWW method
[23]. This script invokes a function coded in PHP, named itemSelect, which is stored
on the cloud server. The main content of this function is a query statement that selects
the desired item from the database. Thus, a local device with an Internet connection
can easily retrieve information from the online database.
Figure 8. The C# script and PHP script for querying from online database
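A minimal sketch of the client side is given below. Only the itemSelect script, the WWW method and the rr824.cf domain come from this report; the full URL, the query parameter and the response format are assumptions for illustration (the actual scripts are shown in Figure 8).

```csharp
using System.Collections;
using UnityEngine;

// Queries the online database through a PHP script hosted on the cloud server.
public class ItemQuery : MonoBehaviour
{
    // Hypothetical endpoint: the PHP script runs the SQL SELECT and echoes the result.
    private const string QueryUrl = "http://rr824.cf/itemSelect.php";

    IEnumerator Start()
    {
        // Ask the server for one item by name (the parameter name is illustrative).
        WWW request = new WWW(QueryUrl + "?name=Sample%20Restaurant");
        yield return request;   // wait for the HTTP response

        if (string.IsNullOrEmpty(request.error))
            Debug.Log("Query result: " + request.text);
        else
            Debug.Log("Query failed: " + request.error);
    }
}
```

Because the request runs as a coroutine, the UI stays responsive while the device waits for the server's reply.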
3.3.2 System Flowchart
In this part, the system flowchart is defined and drew by the author. The user groups
could be divided into two different kinds of typical players. Some are the image-based
search engine target users who prefer to explore the unknown item. They are the main
custom to experience the function. The figure illustrates the interaction between their
local client and the cloud server.
Figure 9. System flowchart for explorer
The other group of users are defined as contributors, who enrich the database by
interacting with the Vuforia database and the online information database. In this
case, restaurant owners or foodies might be interested in sharing information in a new
way. Figure 10 illustrates how they can fill the database.
Figure 10. System flowchart for contributor
3.4 User Interface
3.4.1 User Interface Design Principles
The main purpose of AR user interface (UI) design is to provide a natural interaction
method and a proper way to display information, which requires a balanced
combination of the 3D environment and users' current knowledge about that
environment [7]. However, as AR is an emerging technology, its user interface design
also raises the challenge that there are no specifically defined design guidelines [8].
One common problem is that an AR application might force users to switch their
attention between items in the real world and the device's screen, since the digital
information is overlaid on the original camera image [9]. In our case, to ensure that
users gain a positive experience of reading the retrieved information and can view
items in the camera frame without the disturbance of extraneous virtual UI objects, a
minimalist style was adopted during the UI design phase. Thus, only the
indispensable UI functions were built, to avoid redundancy.
3.4.2 Scene Design
In Unity, a scene is defined as a unique designable space that includes the
environment and game objects [10], and Unity supports switching between different
scenes. Three scenes were created in this application, each with its own purpose. The
connectivity of the three scenes is illustrated in Figure 11.
Figure 11. The connectivity of three scenes in this application
The Welcome Scene has two pressable buttons (Figure 12). Instead of pushing users
directly into the mixed reality environment, this scene is designed to provide a buffer
space before users enable the searching function, by offering two options. To
encourage users to try this unfamiliar search method, the upper button, named "Let's
Explore", allows users to initialize a new scene and activate the search engine, while
the lower button links to a scene with a Help document in case users prefer guidance
before use. Also, two buttons are created in the Help Scene for
the convenience of users, allowing them to jump into the Search Scene or go back to
the beginning scene.
Figure 12. The welcome scene built in the Unity development environment
3.4.3 Scene Switching
As mentioned in 3.2.1.1, the buttons should be implemented in the development
environment to achieve the connectivity between scenes. In Unity development, a
defined UnityEvent will be triggered when the UI button is clicked by users with a
built-in On Click () function, which is usually used for confirming an action. [11] In
our case, the button is used for sending a request to change to another scene as the
consequence of pressing it. The On Click () provides a channel to listen to the input of
click without default response event, which means the default button could not invoke
any event. Thus, a script is coded to determine the expected response of each button.
Unity provide a sample method named SceneManager.LoadScene() to load the certain
scene with the scene name as argument.
Figure 13. The inspector panel of the button and the code for scene switch
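The scene-switching script can be as small as the sketch below; the class name and the example scene name are assumptions, since the report does not give the actual identifiers (its code is shown in Figure 13).

```csharp
using UnityEngine;
using UnityEngine.SceneManagement;

// Attached to a button and wired to its On Click () event in the Inspector.
public class SceneSwitcher : MonoBehaviour
{
    // Loads the scene whose name is passed as the argument,
    // e.g. SwitchTo("SearchScene") from the "Let's Explore" button.
    public void SwitchTo(string sceneName)
    {
        SceneManager.LoadScene(sceneName);
    }
}
```

Passing the scene name through the Inspector keeps one script reusable for every button in the Welcome and Help scenes.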
3.4.4 Information Board Function
In this application, Information Board Function refers to the method of display the
searched information, performed in Search Scene. It basically adopts UI text to
present the textual material.
Figure 14. The translucent information board located at the top of the screen
To avoid the digital content covering up the items in the camera image, as mentioned
before, two possible solutions were proposed. The first is to make the information
board a child of the image target, which means it automatically appears on the screen
when the image target is detected and recognized. A potential problem with this
solution is that the information board keeps changing as users sweep their camera
over several items, leading to the disturbance of continually showing information
about items of no interest. Thus, a second solution was adopted in this case: using a
button to control the display of the information board. The button is created as a child
of the image target so that it is only presented above the item when the target is found
by the application. UnityEvent.AddListener is added in its script to monitor the click
event, so that the information board switches its visibility each time the button is
pressed.
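A sketch of this second solution is given below, assuming the information board is a GameObject child of the image target; the class and field names are illustrative, not taken from the report.

```csharp
using UnityEngine;
using UnityEngine.UI;

// Toggles the information board each time the button (a child of the
// image target) is pressed.
public class InfoBoardToggle : MonoBehaviour
{
    public Button toggleButton;   // the button shown above the detected item
    public GameObject infoBoard;  // the translucent information board

    void Start()
    {
        // Listen for clicks and flip the board's visibility on each press.
        toggleButton.onClick.AddListener(
            () => infoBoard.SetActive(!infoBoard.activeSelf));
    }
}
```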
Chapter 4
Evaluation
4.1 Cognitive Walkthrough
The goal of usability inspection is to identify design problems by having an evaluator
inspect the application interface [7]. Among several user-based usability inspection
methods, the Cognitive Walkthrough was chosen in our case. It requires the evaluator
to follow specific, particular instructions that simulate the environment, identifying
points where a user might fail to get the desired result even when operating correctly
at each stage [8]. At each stage, evaluators need to consider potential users' behavior
using the four questions listed below [9]:
1. Will the user try to achieve the right effect?
2. Will the user know that the correct action is available?
3. Will the user associate the correct action with the effect one is trying to achieve?
4. If the correct action is performed, will the user see that progress is being made?
The assumptions about basic usage requirements need to be clearly defined, as
follows:
i. Define Device:
a) Mobile phone model: iPhone X
b) Screen type: Full touch screen
c) Operating system: iOS 12.3.1
ii. Assumption:
a) The mobile phone works well.
b) The battery is sufficiently charged and in good condition.
c) Network is stable for data transmission.
d) The tested application is properly installed.
iii. Task:
Searching for items' information
iv. Operation steps:
The operation steps for the evaluated task are given in Figure 15.
Figure 15. The action sequence of tested task
4.2 Results
To perform the Cognitive Walkthrough, a usability expert (my supervisor, Henry
Gardner) was invited to evaluate the task mentioned above. Specific instructions were
defined for the evaluator to follow.
In presenting the results of the cognitive walkthrough below, a screenshot is provided
to illustrate the actions and problems found by Henry at each stage of the walkthrough.
Note that I have used a dummy picture of a comic book rather than a real picture of a
restaurant in the following example.
The success story and potential usability errors are described after each step:
Step 1: Initial searching interface
Press the button named "Let's Explore!" (Figure 16a)
Figure 16. The screenshot of Initial searching interface step
Response:
- The pressed button's color goes dark (Figure 16b).
- The camera is activated, and the display jumps to the camera's dynamic image
(Figure 16c).
Usability issues:
- The darkened button is not dark enough to show that it has been selected.
- The initialization of Vuforia might take a long time (some seconds) before jumping
to the other scene. Without any message or loading hint, users might think the
button does not work and lose patience waiting for a response.
- The Help button is blocked while waiting for the Vuforia scene to be initialized.
Step 2: Search potential target
Aim the camera at the item.
Figure 17. The screenshot of step Search potential target
Response:
- The images of items in the real world are captured by the camera.
- A blue icon appears in the center of the screen.
Success story: The frame is smooth while the user moves their device.
Usability issue: An iOS button for "disability assistance" is displayed over the video
image. First-time users might be confused to see this icon; it has nothing to do with
the application.
Step 3: Request information about a target
Open information board
Press the icon in the center of the screen (Figure 18a).
Figure 18. The screenshot of the request information about a target step
Response: The information board with related information appears at the left corner
of the screen (Figure 18b).
Failure story: The icon in the center of the screen does not suggest that it can be
pressed. Without reading the instructions, users may not realize the icon can be
pressed to present the information.
Step 4: Close information
Press the icon in the center of the screen again (Figure 19a).
Figure 19. The screenshot of close information step
Response: The information board disappears (Figure 19b).
Failure story: Without a clear indication, users do not know how to close the
information board. Once they lose the target in the frame, the icon disappears, which
means the information board can no longer be closed.
Chapter 5
Discussion
This prototype serves the purpose of searching for and presenting item information to
users by combining object detection and AR techniques. It also shows the potential to
deal with different kinds of objects, since it achieves high detection accuracy and
speed. However, some disadvantages were found during development.
One disadvantage is that, even within the usage limits of a free account, a small
number of tests quickly showed that the Vuforia cloud server slows down the
recognition speed. The reason is clearly that transmitting high-quality images to a
remote server relies heavily on the Internet connection. Possible solutions are to
improve the image feature extraction method so as to reduce the image size, or to rely
on next-generation communications to speed up large file transmission.
Other problems were mainly found during the evaluation, where some UI issues were
exposed and assessed. To schedule a working plan for fixing these issues in the future,
each problem needs to be measured by difficulty and severity. For example, the
initialization of Vuforia takes a long time and forces users to wait without any
response. Its severity could be rated medium, since there are alternative ways to cover
this disadvantage, such as showing a loading hint. However, fixing the problem itself
is highly difficult, since it would require unpicking the underlying Vuforia code. By
balancing these two factors, this application can maintain a stable development
life cycle.
For future work, one potential problem is that, presently, the categories of returned
information are decided by the developer, which could be misleading, since users
may be interested in different aspects of different objects. Thus, machine learning
could be introduced to analyze the aspects of each object that are most likely to
interest users.
Chapter 6
Conclusion
This project provides a simple and efficient AR architecture for implementing an image-based search engine, taking advantage of free and open-source software. It indicates a possible way to complement traditional search engines by hosting the database online, which keeps the application itself lightweight while maintaining high performance. The obvious disadvantage is that this approach requires a high-quality network connection. In the future, with next-generation networks, lighter image targets, and AI, AR could present information in a more natural and enjoyable way.
References
1. Tang, Q., Chen, Y., Schaefer, G. and Gale, A. G. (2018). The development of an augmented reality (AR) approach to mammographic training: overcoming some real world challenges. Image-guided Procedures, Robotic Interventions, and Modeling.
2. Jiang, Y., O'Neal, E., Yon, J., Franzen, L., Rahimian, P., Plumert, J. and Kearney, J. (2018). Acting Together. ACM Transactions on Applied Perception, 15(2), pp.1-13.
3. Lu, G., Xue, G. and Chen, Z. (2011). Design and Implementation of Virtual Interactive Scene Based on Unity 3D. Advanced Materials Research, 317-319, pp.2162-2167.
4. Nielsen, J. and Mack, R. (1994). Usability Inspection Methods. New York: John Wiley & Sons.
5. Wikitude. (2019). Location-based AR to specific locations in the real-world. [online] Available at: https://www.wikitude.com/geo-augmented-reality/
6. Ebling, M. and Cáceres, R. (2010). Gaming and Augmented Reality Come to Location-Based Services. IEEE Pervasive Computing, 9(1), pp.5-6.
7. Livingston, M., Ai, Z., Karsch, K. and Gibson, G. (2010). User interface design for military AR applications. Virtual Reality, 15(2-3), pp.175-184.
8. Gabbard, J. and Swan, J. (2008). Usability Engineering for Augmented Reality: Employing User-Based Studies to Inform Design. IEEE Transactions on Visualization and Computer Graphics, 14(3), pp.513-525.
9. Siriborvornratanakul, T. (2018). Enhancing User Experiences of Mobile-Based Augmented Reality via Spatial Augmented Reality: Designs and Architectures of Projector-Camera Devices. Advances in Multimedia, 2018, pp.1-17.
10. Technologies, U. (2019). Unity - Manual: Scenes. [online] Docs.unity3d.com. Available at: https://docs.unity3d.com/Manual/CreatingScenes.html [Accessed 14 Jun. 2019].
11. Technologies, U. (2019). Unity - Scripting API: UI.Button.onClick. [online] Docs.unity3d.com. Available at: https://docs.unity3d.com/ScriptReference/UI.Button-onClick.html
16. Library.vuforia.com. (2019). Vuforia Target Manager. [online] Available at: https://library.vuforia.com/content/vuforia-library/en/articles/Training/Getting-Started-with-the-Vuforia-Target-Manager.html
17. Peng, F. and Zhai, J. (2017). A mobile augmented reality system for exhibition hall based on Vuforia. 2017 2nd International Conference on Image, Vision and Computing (ICIVC). IEEE.
18. Vuforia. (2019). Image Targets. [online] Available at: https://library.vuforia.com/content/vuforia-library/en/articles/Training/Image-Target-Guide.html
19. DuBois, P. (2013). MySQL, 5th Edition. UK: Addison-Wesley Professional.
20. Tasić, M. B., Stanimirović, P. S. and Pepić, S. H. (2011). Computation of generalized inverses using PHP/MySQL environment. International Journal of Computer Mathematics, 88(11), pp.2429-2446.
21. Docs.phpmyadmin.net. (2019). Introduction — phpMyAdmin 5.0.0-dev documentation. [online] Available at: https://docs.phpmyadmin.net/en/latest/intro.html [Accessed 14 Jun. 2019].
22. Bertocco, M., Ferraris, F., Offelli, C. and Parvis, M. (1998). A client-server architecture for distributed measurement systems. IEEE Transactions on Instrumentation and Measurement, 47(5), pp.1143-1148.
23. Docs.phpmyadmin.net. (2019). Introduction — phpMyAdmin 5.0.0-dev documentation. [online] Available at: https://docs.phpmyadmin.net/en/latest/intro.html [Accessed 14 Jun. 2019].
Appendix
User Experience of Search Engines
1. Which is your most commonly used search engine?
   A. Google  B. Yahoo  C. Baidu  D. Bing  E. If other, please specify
2. Have you ever used the following types of search engines?
   A. Based on image recognition  B. Based on geographic coordinates  C. Based on audio files  D. None of the above  E. If other, please specify
3. Have you ever encountered a situation where a search failed because the subject of the problem was in a foreign language and could not be entered into the search engine accurately?
   A. Yes  B. No
4. Have you ever encountered a situation where a search failed because the subject of the problem could not be identified and described accurately?
   A. Yes  B. No
5. If the problems mentioned in Q3 and Q4 occur, in which areas do you think they are more likely to occur?
   A. Study  B. Work  C. Daily life  D. Travel  E. If other, please specify
6. Have you ever used AR products?
   A. Yes  B. No  C. Not sure
7. Do you think it is more convenient if a search engine can recognize various input formats, such as voice and images, rather than only the traditional text format?
   A. Yes  B. No
8. Do you think it is more attractive to present search results in a variety of formats, such as audio and 3D images, rather than the traditional text format?
   A. Yes  B. No