DEGREE PROJECT IN THE FIELD OF TECHNOLOGY MEDIA TECHNOLOGY AND THE MAIN FIELD OF STUDY COMPUTER SCIENCE AND ENGINEERING, SECOND CYCLE, 30 CREDITS
STOCKHOLM, SWEDEN 2020

A usability evaluation of TRIO’s e-learning modules enhancing the communication between cancer patients, clinicians and carers

MELANIE BONNAUDET

KTH ROYAL INSTITUTE OF TECHNOLOGY
SCHOOL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCE


Abstract
The involvement of carers in oncology is important for the health of people diagnosed with cancer as well as for the carers themselves. To improve this involvement, three groups (patients, their carers, and clinicians) need to maintain good communication. The e-learning interface eTRIO has a learning module for each of these three groups, and its design is based on research by psycho-oncologists. This study aims to answer the question: What are the strengths and weaknesses of the eTRIO interfaces for clinicians, carers, and patients in terms of their usability? A heuristic evaluation and think-alouds were conducted to answer this. The results show that interactive activities and neatly presented content engage the user, and that buttons and content should have clear purposes. With good usability, eTRIO can enhance carers’ involvement and make it easy for users to retain important information. Strengths and areas for improvement are presented in this study.


Sammanfattning
Att inkludera cancerpatienters närstående i onkologin är viktigt för både cancerpatienterna och deras närstående. För att förbättra de närståendes inkludering måste tre grupper (patienter, deras närstående och läkare) ha god kommunikation med varandra. E-lärandeplattformen eTRIO har en modul för varje ovannämnd grupp. Designen av eTRIO är baserad på forskning av psyko-onkologer. Denna studie har som syfte att besvara frågan: Vilka styrkor och svagheter har eTRIOs gränssnitt för läkare, cancerpatienter och närstående med avseende på användarvänlighet? En heuristisk utvärdering och think-alouds har genomförts för att besvara frågan. Resultaten visar att interaktiva aktiviteter och visuellt tilltalande innehåll engagerar användarna samt att knappar och innehåll behöver tydliga syften. Med god användarvänlighet kan eTRIOs gränssnitt förbättra de närståendes inkludering och göra det lätt för användarna att komma ihåg viktig information. Styrkor och förbättringsområden presenteras i denna studie.


A usability evaluation of TRIO’s e-learning modules enhancing the communication between cancer patients, clinicians and carers

Melanie Bonnaudet
The University of Sydney
Sydney, Australia
[email protected]

ABSTRACT
The involvement of carers in oncology is important for the health of people diagnosed with cancer as well as for the carers themselves. To improve this involvement, three groups (patients, their carers, and clinicians) need to maintain good communication. The e-learning interface eTRIO has a learning module for each of these three groups, and its design is based on research by psycho-oncologists. This study aims to answer the question: What are the strengths and weaknesses of the eTRIO interfaces for clinicians, carers, and patients in terms of their usability? A heuristic evaluation and think-alouds were conducted to answer this. The results show that interactive activities and neatly presented content engage the user, and that buttons and content should have clear purposes. With good usability, eTRIO can enhance carers’ involvement and make it easy for users to retain important information. Strengths and areas for improvement are presented in this study.

Author Keywords
Usability; user experience; think-alouds; heuristic evaluation; e-learning; medical teaching.

1. INTRODUCTION
Cancer incidence is rising and the disease is becoming more common overall, according to Bray et al.’s global cancer statistics 2018 [3]. As cancer affects millions of people across the world, it is important to provide accurate cancer care.

A cancer patient’s medical situation and decisions have an impact on their relatives’ lives and health [1]. Therefore, it is important that relatives can participate in the patient’s medical consultations and treatment decision-making. Carers of patients have expressed a need for medical and behavioural information [14], and oncologists have revealed a need to learn how to provide information adapted for carers [21]. In general, patients value the help and support of carers [12]: they need support outside the medical system, from family or friends, both in the challenging cancer experience and in their interactions with the medical system. Thus, for ideal cancer care, patients, carers and oncologists need to enhance their communication and behavioural skills for medical situations.

A project with this aim is "TRIO, Clinician-patient-family working together for quality care". It is carried out by members of the Psycho-oncology Co-operative Research Group (PoCoG) at the University of Sydney and is grant-funded by Cancer Australia and Cancer Council NSW. The TRIO Framework, introduced by Laidsaar-Powell et al. [13], has three main roles:

• The cancer patient: a cognitively competent adult cancer patient.
• The main clinician: an oncology physician.
• The main caregiver: someone related to the patient biologically, legally or emotionally, who accompanies the patient to medical consultations and assists in the patient’s care.

To enhance the communication between cancer patients, their caregivers and oncologists, the TRIO project has created an online e-learning platform, called eTRIO, with content based on the TRIO guidelines [9, 10]. Each group in the TRIO Framework has its own learning module, with sections covering different topics. eTRIO is a multimedia e-learning interface containing text, videos and different interactive activities, for example yes-no questions, sorting cards according to their importance, and surveys about the user’s emotional profile (see Appendix A for an eTRIO walk-through). This type of content makes learning an active and engaging process. However, an inadequate user experience can affect the learning outcome negatively [8]. Therefore, this study contributes to the TRIO project by evaluating the usability of eTRIO. It investigates the user experience of the TRIO e-learning interfaces to determine what interactions and presentations of content are needed to successfully meet the users’ communication and information needs.

2. BACKGROUND

2.1. The TRIO Framework and their needs
Laidsaar-Powell et al. [13] introduced the notion of a TRIO Framework, also called the TRIO Triangle. It comprises a cancer patient, a main clinician and a main caregiver. The purpose was to study why it is important to understand the involvement of family caregivers in cancer care, and their study shows how much caregivers are involved in treatment decisions in cancer consultations. Results have shown that caregivers’ involvement depends on several factors and varies from person to person.


These factors can be demographic, psychological, relational, cultural and medical. Three example cases are given, with different degrees of influence on decision-making from the caregiver.

In the first case, a patient and oncologist discuss whether or not to undergo chemotherapy. The caregiver states they will support whatever decision the patient makes, so their influence on this decision is very small. As shown in Figure 1, the decision is focused on the patient and the oncologist.

Figure 1: TRIO Framework/Triangle with focus between the patient (left) and the clinician (top). Figure from [13].

In the second case, a patient has to choose whether to delay chemotherapy in order to undergo fertility treatment. The patient discusses this with her husband, and he states he would prefer the fertility treatment. As shown in Figure 2, the decision-making focus is mainly on the patient, with influence from her husband.

Figure 2: TRIO Framework/Triangle with focus between the patient (left) and the caregiver (right). Figure from [13].

In the third case, a patient with limited English proficiency has cancer, and his son, who is fluent in English, exchanges all information with the oncologist. The son directs the conversation with the consent of the patient, and the treatment decision-making focuses on the son, the caregiver.

How much each party is involved is important information for the TRIO e-learning website, as all three are target groups for eTRIO. Triadic communication has been shown to be helpful in medical encounters but can also be challenging [11]. To facilitate this communication, psychologists, clinicians and academics have, through years of research, determined the TRIO guidelines, which have been used to design the TRIO e-learning modules. The guidelines cover how to facilitate collaborative caregiver involvement in oncology encounters as well as how to handle challenging interactions with caregivers [9, 10].

Lamore et al. [14] have studied what needs family members have in treatment decision-making for people with chronic diseases. Family members need to be provided with medical knowledge and often want to participate in decision-making discussions. In these discussions, Lamore et al. show that family members need to adopt helpful behaviours, such as not dominating, providing information and supporting the patient. The patient must also be allowed to decide when they need a private conversation with the physician, without their family member. Lamore’s study mentions that all three parties (caregivers, patients and physicians) are positive towards the involvement of caregivers in medical consultations and decisions. However, for the involvement to work well, caregivers need both medical and behavioural information. The TRIO e-learning aims to provide this type of information, which is why it is important to study the interface before release, to ensure accurate learning and caregiver involvement that fulfils all three parties’ needs.

Bracher [2] has studied how the partner’s involvement relates to decision-making in triadic cancer consultations. In these consultations, partners had different roles, behaviours and relationships with the patient and clinician. Some were dominant, interruptive or reluctant to cede the floor, while others offered emotional support and helpful contributions such as self-initiated questions. Spouses and children of the patient are more likely to engage than other relatives and friends. Some physicians interacted more with the patient than with the partner and often shifted the conversation back to the patient. This shows how everyone involved in triadic consultations needs a better understanding of their role and of how to communicate with each other.

A literature review about the health of patients and their caregivers by Hodges et al. [7] showed an association between their well-being: the caregivers’ psychological health and stress level are strongly related to the patients’ health and stress. Patients are also very likely to become distressed when their caregiver does; they both reach a similar level of distress. Bevans et al. [1] have studied how a caregiver’s life and health are influenced by the caregiving responsibilities for a cancer patient. These responsibilities bring stress and burdens that affect the caregiver negatively. Stressful moments are inevitable, but they can be eased. Bevans concludes that avoiding barriers between caregivers and physicians, by letting caregivers participate in the medical proceedings, is good prevention. McCarthy [15] has studied carers’ information needs: they need medical information and often want to hear it from the physicians, yet they recurrently have to seek it out actively and can feel ignored. Soothill et al.’s study [20] shows that caregivers have more unmet needs than the patients. This shows how important communication is for everyone’s well-being. The TRIO e-learning aims to target these problems and educate all three user groups to avoid unmet needs.

2.2. E-learning and Usability of e-learning
Ruiz et al. [18] have conducted a literature study on e-learning’s effectiveness and how it can be applied in the medical world. They state that e-learning enhances individual learning and can be integrated into education as well as used during duty hours, which is convenient for the medical world. Ruiz’s study also points out important aspects to consider when evaluating an e-learning platform, for example ease of use, navigation, the material and the interaction. Without investigating these aspects, e-learning loses its effectiveness. Therefore, eTRIO can be suitable for its user groups only if its usability is adequate.

Stuij et al. [21] have studied the first steps in developing e-learning for oncologists to improve their information-giving skills. They identified the oncologists’ learning needs regarding how to provide information to patients, as well as their training preferences within their profession. Oncologists want to be able to adapt information to the patient, structure the information and deal with patients’ emotions. Focus groups and interviews revealed that the preferred learning method is a digital platform with multimedia content such as videos. Feedback from peers, experts and patients would also be appreciated. They want to be able to adapt their learning to their own personal needs and have it easily accessible and simple to use. eTRIO aims to fulfil these needs by letting users choose which module they want to do, without having to follow a particular order, and complete it whenever they want.

Huang [8] shows that designing an interactive multimedia learning tool with dynamic content makes learning an active and engaging process. Examples of features that enhance this are being able to immediately test one’s knowledge, easily visualising information, and having content in different forms such as animations and videos. Huang recommends several steps to create such a tool. Firstly, it is necessary to understand the learning goal and the user needs, and then to design the content and utilise adequate technology. Multimedia materials, i.e. content in different forms (text, video, images, etc.), are recommended and should be implemented in an e-learning platform. The platform should consider web standards and human factors (e.g. how humans behave with the platform) to achieve good usability. When the module is built, user tests and heuristic evaluation are needed to evaluate and improve the design and to know how the modules perform, especially since technical problems can interfere with the intended learning outcomes. Knowledge of science, education and technology must be combined to create accurate educational media. The current state of eTRIO is that it needs to be evaluated to ensure a good user experience; user experience evaluations, in the form of think-alouds and a heuristic evaluation, are therefore conducted in this study.

2.3. Purpose and research question
The TRIO e-learning aims to enhance the communication between all parts of the TRIO Framework. This is an important objective for the well-being of patients, carers and clinicians. To achieve this, the usability of the e-learning must be adequate. Therefore, this study aims to answer the following research question: What are the strengths and weaknesses of the TRIO interfaces for clinicians, carers and patients in terms of usability?

3. METHOD
This section describes the scientific methodology used to answer the research question. The method includes a heuristic evaluation, conducting and analysing think-alouds, and a System Usability Scale questionnaire.

Prior to the think-alouds, a heuristic evaluation was conducted to discover major usability flaws; this method does not involve any users. In this evaluation, the eTRIO interfaces were examined to identify problems that did not comply with recognised usability principles, namely the heuristics determined by Nielsen and Molich [17]. The problems found were ranked according to their severity, prevalence and ease to solve. Severity is ranked from 1 to 5, where 1 is the least severe and 5 the most severe in terms of usability and of stopping the user from using functions properly. Prevalence is ranked from 1 to 5, where 1 is an infrequently occurring problem and 5 a frequently occurring one. Ease to solve is ranked from 1 to 5, where 1 is a hard and time-consuming problem to fix and 5 an easy one. The higher a problem’s rankings in the three categories, the higher its priority to fix.
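As a minimal illustration of this ranking scheme (not part of the original study), each problem can be represented as a record and ordered by a combined rank; the Problem class and the summing of the three ratings into a single priority are assumptions made here for illustration only.

```python
from dataclasses import dataclass

@dataclass
class Problem:
    description: str
    severity: int    # 1 (least severe) to 5 (most severe)
    prevalence: int  # 1 (infrequent) to 5 (frequent)
    ease: int        # 1 (hard to fix) to 5 (easy to fix)

    def priority(self) -> int:
        # Higher rankings in all three categories mean a higher priority to fix;
        # summing them is one simple way to combine the three rankings.
        return self.severity + self.prevalence + self.ease

# Two entries mirroring rows of Table 1 in this paper.
problems = [
    Problem("Nothing happens when clicking on the print button", 5, 5, 3),
    Problem("Green colour on slider for the 'bad' part", 4, 1, 4),
]
for p in sorted(problems, key=Problem.priority, reverse=True):
    print(p.priority(), p.description)
```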

To determine eTRIO’s effectiveness, think-alouds were conducted. They consisted of having participants use the interface and express their thoughts out loud as they went through it. Think-alouds have several advantages, as described by Nielsen [16], such as being simple to learn. This was beneficial for the collaboration with the TRIO team and for the participants, who were not familiar with the method. The think-alouds were conducted by the author of this paper and members of the TRIO team. The first think-alouds were conducted in person, but the rest were conducted remotely with the video-conferencing tool Zoom (https://zoom.us/). This allowed recording the users’ interactions on their screen and hearing their comments, which benefits the analysis [6]. The think-aloud method measured the effectiveness and user satisfaction of the TRIO e-learning interfaces, two important aspects that Brooke [5] also mentions for analysing the usability of a website.

Eleven think-alouds were conducted, with one participant at a time. The inclusion criteria for participation were being a doctor or nurse within oncology, a patient diagnosed with cancer at least six months ago, or a carer of such a patient. Five think-alouds were conducted with clinicians, three with carers, and three with patients. Nine females and two males participated, aged between 35 and 77. Patients were on average 65 years old, carers also 65, and clinicians 47. The participants were recruited by the TRIO team.


First, each participant filled out a questionnaire about their background to collect demographics. The participants then had around an hour to perform tasks that consisted of going through several sections of the module, which included pages with text, videos (Figure 4), and various interactions such as sliders (Figure 5), bubbles (Figure 6) and scenarios (Figure 7). They began at the dashboard (Figure 3) and were free to choose which sections they wanted to go through, because eTRIO is designed without any obligation to complete the sections in a specific order. It was ensured that every section had been gone through by at least one participant, so that all potential usability issues were covered; several sections also contain similar layouts and interactions (see Appendix E for user progress during the think-alouds). The layouts and interactions shown in Figures 3-7 are referred to in the results.

Figure 3: eTRIO clinician dashboard with its sections. Green: the section has been completed. Light blue: the section has not been started. Dark blue: the section has been started but not completed. In the corner of each section is the estimated time to complete it.

Figure 4: Video activity with a button to click, within a section.

The activity in Figure 4 is explained in the blue information box above the video. It consists of clicking a button (below the video, not visible here) whenever the doctor builds rapport with the carer. On the next page, the user can see how well they did on the activity.

Figure 5: Slider interaction for the clinicians within a section.

The white circle of each slider in Figure 5 should be moved to one of the white lines, according to how much the user agrees with each statement. At the bottom of the page, a text appears describing the user’s attitude towards carers, depending on their answers.

Figure 6: Bubbles interaction for the clinicians.

Clicking on the initially blue bubbles in Figure 6 reveals more information and makes them turn purple. In this case, the blue bubbles show things carers and patients might say, and the purple side of each bubble gives an example answer.

Figure 7: Scenario activity for clinicians, with a text box.

The green box in Figure 7 gives a scenario about a patient and a carer. In the free-text box underneath, the clinicians can reflect on whether they would encourage the carer to come. On the next page, examples of what can be done in this scenario are given. The clinician, carer and patient modules have the same layouts, interactions and activities, but with different content.

To determine user satisfaction, the participants were invited to give free comments about their experience of interacting with the e-learning module. Lastly, the users filled out a System Usability Scale (SUS) questionnaire [5]. The SUS questionnaire has ten statements, which the participants answered on a Likert scale from strongly disagree to strongly agree (see Appendix B for the detailed statements). A version with only positive statements was used, as research after Brooke has shown that there is little evidence that the advantages of alternating positive and negative statements outweigh the disadvantages [19]. This version should lead to fewer mistakes in the participants’ answers. The creator of SUS, Brooke [4], also later approved this version. The SUS score, obtained through a simple algorithm, indicates whether the website is usable or not. The algorithm consists of subtracting 1 from each answer, adding up the total, and multiplying by 2.5 to get a score out of 100 [19].
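As a minimal sketch of this scoring (assuming, as in this study, the all-positive SUS variant in which every item is coded 1 to 5 and contributes its answer minus 1), the calculation can be written as follows; the function name and the example responses are illustrative only.

```python
def sus_score(responses):
    """Score an all-positive SUS questionnaire: ten answers, each coded 1-5."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    # Subtract 1 from each answer, sum the contributions, and scale to 0-100.
    return sum(r - 1 for r in responses) * 2.5

# Example: a participant answering 4 ("agree") on every statement scores 75.
print(sus_score([4] * 10))  # 75.0
```

Note that the standard SUS with alternating statements would instead reverse-code the negatively worded items before scoring.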

To analyse the collected data, the think-alouds were transcribed and important usability aspects were highlighted, which allowed interesting quotes to be gathered; these are presented in the results. For an overview of the results, a table (Appendix D) of the mentioned strengths and weaknesses was made. These were then classified into categories, together with how many users had mentioned each strength and weakness. The results of the heuristic evaluation and the think-alouds were then compared.
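A minimal sketch of this tallying step is given below; the coded comments and category labels are hypothetical examples, not the study’s actual coding.

```python
from collections import Counter

# Hypothetical coded comments: (participant, category, "+" strength / "-" weakness).
coded_comments = [
    ("Ca1", "Quotes", "+"),
    ("Cl2", "Quotes", "+"),
    ("P1", "Busy slides", "-"),
]

# Count how many distinct participants mentioned each category as a strength or weakness.
tally = Counter()
for participant, category, polarity in set(coded_comments):
    tally[(category, polarity)] += 1

for (category, polarity), count in tally.most_common():
    print(f"{category} ({polarity}): {count} participant(s)")
```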

4. RESULTS

4.1. Heuristic evaluation
Using the heuristic evaluation, 44 problems were found. The areas of the identified problems were:

• Inconsistency of icons and redundancy in buttons.
• Buttons and interactions not working.
• Responsive design problems in the layout.
• Presentation of content.

Thirty-seven of the problems were classified into these categories (see Appendix C); less severe content problems, such as spelling mistakes or inconsistencies in the use of terms, are not included in this classification. Twelve of the 37 problems were rated as easy to solve (rating 5), seventeen were very prevalent (rating 5), and twelve were classified as severe (rating 5).

Following this evaluation, eight severe problems were fixed before conducting the think-alouds. As shown in Table 1, all of these problems were ranked as very severe (4 or 5), and six of them also had a high prevalence (4 or 5).

Problem | S | P | E
Nothing happens when clicking on the print button | 5 | 5 | 3
Clicking on complete on the last page of any part makes the whole dashboard go green although several parts are not completed | 5 | 5 | 3
Nothing happens in the card activity | 5 | 1 | 3
What is written in the text box does not get saved | 5 | 4 | 3
"Save and exit" button is only on the first page of each part | 5 | 5 | 4
Progress bar is not showing the correct progress | 5 | 5 | 2
Asterisk after names leads to nowhere | 5 | 5 | 5
Green colour on slider for the "bad" part | 4 | 1 | 4

Table 1: Problems fixed after the heuristic evaluation. S: severity (1 = not severe, 5 = severe), P: prevalence (1 = not often, 5 = often), E: ease to solve (1 = hard to solve, 5 = easy to solve).

Fourteen problems identified with the heuristic evaluation were also mentioned in the think-alouds. Table 2 below presents the problems that the heuristic evaluation and think-alouds have in common. The problems’ severity rankings vary from 2 to 5, prevalence from 1 to 5, and ease to solve from 3 to 5.

Problem | S | P | E
Arrows and back/next buttons have the same function | 4 | 1 | 5
No link on guidelines that are referred to | 2 | 2 | 5
Can be unclear what the button "save and exit" means | 4 | 5 | 4
Not possible to save on what page in a part the user stopped | 5 | 5 | 3
No margin on the left side of the text on pages, text is almost cut on bigger screens | 4 | 5 | 3
Text is not aligned with the text box/bar, text is overlapping when opening several information boxes | 5 | 4 | 3
Some boxes get thinner than others when opened | 4 | 4 | 3
Star when bookmarked goes from green to purple, but does not go back to green when unbookmarked | 2 | 5 | 3
Fading in of text | 3 | 5 | 5
No hover highlight feedback on bubbles to click on | 3 | 2 | 4
Clicking on the button of the video activity gives no feedback | 3 | 4 | 4
Spelling mistakes | 3 | 3 | 5
Unclear what you get for completing the training | 3 | 1 | 5

Table 2: Problems that the heuristic evaluation and think-alouds have in common. S: severity (1 = not severe, 5 = severe), P: prevalence (1 = not often, 5 = often), E: ease to solve (1 = hard to solve, 5 = easy to solve).

Further results of the think-alouds will be presented in the next section.

4.2. Think-alouds
Eleven think-alouds were conducted: three with patients, three with carers and five with clinicians. The participants are kept anonymous and are referred to as carers (Ca1-3), patients (P1-3) and clinicians (Cl1-5).

The frequency of the strengths and weaknesses mentioned by the participants in the think-alouds has been analysed (see Appendix D for the full table). Table 3 presents the frequently mentioned comments, i.e. those mentioned by at least four of the eleven participants (36% of the participants), ordered from strengths to weaknesses.

Comments about | T | + [P,Ca,Cl] | - [P,Ca,Cl]
Content the user can relate to | 10 | 10 [2,3,5] | 0
Content emphasising the importance of the carers | 8 | 8 [3,3,2] | 0
Quotes | 8 | 8 [2,3,3] | 0
Short videos | 8 | 6 [0,2,4] | 2 [2,0,0]
Easy navigation | 7 | 6 [1,2,3] | 1 [1,0,0]
Popping bubbles to reveal more information | 6 | 4 [1,1,2] | 2 [1,0,1]
Estimated time on each section | 5 | 4 [1,2,1] | 1 [0,0,1]
References | 5 | 4 [1,1,2] | 1 [0,0,1]
Pictures | 5 | 4 [1,1,2] | 1 [0,0,1]
Colours used throughout the e-learning | 4 | 3 [1,1,0] | 2 [0,1,1]
Busy slides | 6 | 0 | 6 [1,2,3]
Text size in general | 7 | 2 [1,1,0] | 5 [1,2,2]

Table 3: Frequency of comments mentioned by participants in the think-alouds. T: total number of participants mentioning the comment, +: number of participants mentioning it as a strength, -: number of participants mentioning it as something to improve. [P,Ca,Cl]: number of [patients, carers, clinicians] mentioning the comment as a strength or weakness.

The next sections present more details about the frequently mentioned comments, with testimonies, and finish with other notable comments made by participants. The quotes show the views of the participants in their own words, to provide a richer picture of the results summarised in the tables above.

4.2.1. Comments about content

4.2.1.1. Content the user can relate to
Ten participants agree with the content, especially when they can relate to it.

“I think this is a very useful slide. When we went in to our first meeting we were just there me and my son did this.” - Carer 1

“It is definitely something that we deal with a lot day to day. And I think it’s taken my attention because I think understanding these things more definitely has a direct impact day to day on what we do.” - Clinician 2

One participant mentions they cannot relate to some content but still think it’s useful information and good advice.

“It says group text messages or emails summarising the consultation, that never happened with us, although I agree it should happen.” - Carer 3

4.2.1.2. Content emphasising the importance of carers
Eight participants believe it is essential to emphasise how important the carer is. Here are statements from one participant of each user group:

“I get your message about including the family in consultations and a couple of tips and stuff like that. They are the gems.” - Patient 1

“The importance of carers like their role, I mean some people may not know what they should and shouldn’t do in order to help the person, is good first knowledge. And then I think that would make the other parts very productive for them.” - Carer 2

“Family need to be involved and there is a lot of stress involved so that was good it was pointed out.” - Clinician 3

4.2.1.3. Quotes
Eight participants mention enjoying the quotes included in the e-learning. For example, a clinician mentions liking them for the evidence they give.

“I like the use of the quotes. It gives a bit of a supportive evidence to it so I like that.” - Clinician 5

A patient mentions the quotes make the e-learning more personal and likes the reality of them. Another patient mentions they give a good break between heavier pages.

“Lovely, really nice. So breaks it up with little quotation.” - Patient 1

4.2.1.4. Short videos
Six participants mention liking the videos.

“I think really the use of videos is so important now. People are expecting to be able to click on stuff so yeah I think that’s great.” - Carer 2

One of the clinicians likes that they give some variety to the e-learning.

“Oh a video that’s interesting, it’s sort of mixing it up, it’s nice to have the different things.” - Clinician 2

The clinician also mentions that the content of the videos and the actors are good for the learning.

“I think that’s good. I think listening to her language, I think that’s what I struggle with. A lot of the struggle is how to choose the right words to say what she said. So she’s done that very eloquently and it’s good to hear that point.” - Clinician 2

Two carers mention they like that the videos are not too long. Meanwhile, two patients do not like videos for learning.

“I’m not going to play that. I’m not a very video type people.” - Patient 1

4.2.1.5. References
Four participants like the references. One clinician likes being able to see the references by hovering over the reference number.

“I quite like the way the little references pop up, that’s quite easy to use.” - Clinician 2

One carer likes having a document with all the references.

“I’m just gonna click on the references and yeah that’s good you’ve got those.” - Carer 2

One clinician does not immediately understand how to see the references.


4.2.1.6. Pictures
Four participants like having pictures throughout the module; a patient explains the pictures’ benefits.

“So instantly more engaging because there is a photo, visually appealing.” - Patient 1

However, one clinician and one carer who like pictures do not understand what some of them represent and why they have been chosen.

“What’s going on with that picture, why have you chosen that?” - Carer 2

4.2.1.7. Busy slides
Six participants mention disliking busy and text-heavy slides.

“I think you are making this page very busy with text and it’s a bit confrontational.” - Carer 2

One patient would rather have more visually appealing content.

“It is pretty text-heavy and I guess that I am more of a visual learner so it might be nice to have some more pictures, icons, to make it a little bit more visually appealing.” - Patient 1

Two clinicians mention they prefer pages where they can quickly get the message of the text.

“When I read something apart from patients notes, I skim it. So it’s gotta be something that I can get the message with a glance.” - Clinician 1

4.2.2. Comments about usability

4.2.2.1. Easy navigation
Six participants found the navigation through the website easy and straightforward, as one carer mentions here:

“I think it’s pretty easy and straightforward. I think anybody who’s used to doing online training, modules and so on will probably find it really easy.” - Carer 1

On the other hand, one patient had trouble with the navigation, finding it overwhelming, as mentioned below.

“Overwhelming, once you get into the thing, I have no idea what I have to do.” - Patient 2

4.2.2.2. Popping bubbles to reveal more information
Four participants like the interaction of clicking on bubbles to reveal more information. Its interactivity is appreciated, as one clinician mentions below.

“See this makes me wanna click on it which is what you want us to do yeah. I like the bubbles! Yeah I quite like that.” - Clinician 5

One clinician does not understand the purpose of the bubbles and one patient does not understand that they should click on them.

“I’m not sure why there are 3 bubbles here.” - Clinician 1

4.2.2.3. Estimated time on each module
Four participants mention liking being able to see how much time each section will take. This helps them plan how to go through the module, as a carer states below (participants refer to a section as a “part”).

“Looking at this dashboard I like it that it tells you how long each part is going to take just so you know in advance. You’re busy and maybe you just have time to do half of it and then you can sort of plan how you’re going to tackle it.” - Carer 1

One clinician likes this feature but found it odd that the estimated time is so precise, for example an estimated time of 4 minutes instead of 5 minutes.

“But the time in the top right corner here, is that like how long it should take you to complete that module? It’s very precise, isn’t it?” - Clinician 5

4.2.2.4. Colours used throughout the website
Three participants mention positive aspects of the colours and three mention colour features to improve. The colours mentioned concern both background colours and text colours. One carer mentions they like the colour theme of the interface.

“I think blue is a good colour, the purple in the back is pleasant” - Carer 2

Another carer is unsure whether the colours of the interface, purple and turquoise, work well together. However, both these carers like the colours of the dashboard and the feature of colours changing when a section is completed.

“I’m really impressed by the dashboard, I like the colours and it looks clear to me where you start and where you finish.” - Carer 2

“I like the colours changing as you’ve completed it” - Carer 1

One patient likes when important text is written in a different colour.

“Even with heavy text, the fact that you have highlighted them in different colours immediately makes them more accessible to focus.” - Patient 1

A carer, on the other hand, does not like text in blue; they would prefer more consistency in the choice of text colour, bolding and italics.

“It’s just in blue and in italics, it’s not bolded or yeah there’s one bolded word. I think you can probably present that better.” - Carer 2

One clinician prefers text written on a coloured background, as it feels less harsh than black on white.

4.2.2.5. Text size in general
Two participants found the text size alright. Five other participants mention problems with the text size. One patient found the text too big.

“Fonts are a tiny bit big so it comes across as a header rather than a textbox.” - Patient 1

In contrast, two clinicians and two carers found the text too small and wonder if they can change it. Two carers mention specifically where it is too small, as this carer does:

“The text might be a little bit small in these little boxes. I know you are trying to create subtext, but maybe if you just use the same font and size but just indent it.” - Carer 2

4.2.2.6. Other notable comments
Comments with positive tendencies
Overall, all participants have positive comments about the interactive layout and activities. For example, one clinician mentions they are more engaged with text when it is interactive.

“On that slide you get the principles you just go yeah yeah. This is the kind of slides you pass over a lot more quickly than the interactiveness that draws you to engage much more with what’s being written.” - Clinician 2

Three participants mention it is practical to complete the module at their own pace and in the order they prefer. Three also mention they like the dashboard layout. Two clinicians also appreciate having a variety of different activities:

“The activities were good and it was good to have like a mixture of different activities in there as well. You definitely engage like a thousand percent more like I said with the activities” - Clinician 2

“I think it’s nice to have a few interactive things throughout, it breaks up it when you are just reading and reading things all the time.” - Clinician 5

Three participants enjoy being able to build and download a list of questions to ask clinicians. The participants believe it is a very important thing to be able to do.

“You can click on whatever ones you want, so this is really good. Then you download the checklist. I like that a lot.” - Patient 1

The patient mentions this question list tool is great, as well as being able to build a list of their carers and their roles.

“Some of the activities like the questions, I really liked. The ones where you wrote down what you thought the carer might do for you if you then use it as a communication tool, really good as well.” - Patient 1

Five participants have shared opinions on the different video activities. Clicking on a button when something specific happens in a video is enjoyed by two participants, although its use has also proved confusing for two of them. It is also an activity they have not seen before, which they enjoy.

“Oh it’s pretty good. So that comes up on every time that I’ve clicked on it. I like this section it’s really good. I like that activity. I’ve never done one of those before that’s really good.” - Clinician 2

“Did I press enough for this one? Oh it’s every time I pressed. I only got 6 out of 12. Sometimes he was just going on with the same thing, so I didn’t want to overpress. How do I feel about pressing every time, I’m not sure about that. I wasn’t that comfortable with that.” - Clinician 3

One clinician mentions the positive aspects of having text boxes to fill in with free text.

“I think that’s quite good because at least you are giving people a little bit more of themselves, they can write free text instead of just clicking on things. That you ask people to actually write something is good.” - Clinician 4

One carer did notice the possibility to bookmark pages and liked the feature.

“Oh! So I can bookmark my pages. So this means even when I’ve done the module, if I can’t remember everything I can read it again at a later stage so that might be handy.” - Carer 1

Three participants mentioned the content has simple language without being too basic. However, one participant did find that it goes from simple to complex on some pages.

“I find it easy and I’m impressed with the content. I’m glad it’s not too basic, you know it’s got useful information in there.” - Carer 1

Two participants mentioned that eTRIO’s content covers important topics.

“From what I’ve seen it looks like you’ve put together a very useful package.” - Carer 3

In total there are 19 main strengths: the nine mentioned here and the ten among the frequently mentioned comments.

Comments with negative tendencies
Sliders do not work as three participants would have expected. It is not possible to slide to wherever the user wants, which has proved not to be intuitive.

“Let’s see if you can just click and drag hopefully. Ah so just on the lines ok. I think it would be nicer if you could move it wherever you wanted.” - Clinician 2

Three participants have mentioned the redundancy of the arrows and next/back buttons for navigating through the pages. They mention that the downside of having arrows in the middle of the page is that content further down the page can easily be skipped.

“I could easily just press next and I would then miss the most important thing.” - Carer 1

Boxes that show more content when clicked appeared to have glitches: text overlaps and the boxes change size for no reason. This was identified in the heuristic evaluation as well as mentioned by two participants. It is not appealing for the user, as a carer states below.

“This oncologist one has overlapped with the surgeon one. And why is his box so big? Was it like this when the page first came up, I can’t remember?” - Carer 1

The meaning of the "Save and exit" button proved unclear to two participants.

“How do I get back to the other screen? Save and exit. Oh really?” - Clinician 5

There are six main weaknesses: the four mentioned here (sliders, button redundancy, bugs, and the meaning of the "Save and exit" button) on top of the two among the frequently mentioned comments (busy slides and text size).

4.3. System Usability Scale questionnaire
All participants answered the ten statements of the System Usability Scale questionnaire. Table 4 shows how a SUS score can be translated into an adjective average, going from awful to excellent.

Awful | Poor | Okay | Good | Excellent
< 51 | 51–68 | 68 | 68–80.3 | > 80.3

Table 4: Translation of System Usability Scale scores into an adjective average.

Table 5 shows that four out of five clinicians give the page a good adjective average and one an okay adjective average. One patient gives the page a good adjective average, one an okay adjective average and one a poor adjective average. Two out of three carers give the page an excellent adjective average and one a good adjective average. On average, the interface has a System Usability Scale score of 76.8, which is a good adjective average.

Participant | Cl1 | Cl2 | Cl3 | Cl4 | Cl5 | P1 | P2 | P3 | Ca1 | Ca2 | Ca3 | Total average
Score | 80 | 80 | 67.5 | 75 | 72.5 | 67.5 | 77.5 | 55 | 97.5 | 95 | 77.5 | 76.8

Table 5: System Usability Scale scores of all participants. Cl1-Cl5: clinicians 1-5, P1-P3: patients 1-3, Ca1-Ca3: carers 1-3.
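As a quick arithmetic check of Table 5 against the adjective ranges in Table 4 (a sketch for illustration only; the bucketing function below is an assumption based on Table 4’s boundaries):

```python
# SUS scores from Table 5.
scores = {
    "Cl1": 80, "Cl2": 80, "Cl3": 67.5, "Cl4": 75, "Cl5": 72.5,
    "P1": 67.5, "P2": 77.5, "P3": 55,
    "Ca1": 97.5, "Ca2": 95, "Ca3": 77.5,
}

def adjective(score: float) -> str:
    """Map a SUS score onto the adjective ranges of Table 4."""
    if score < 51:
        return "awful"
    if score < 68:
        return "poor"
    if score == 68:
        return "okay"
    if score <= 80.3:
        return "good"
    return "excellent"

average = sum(scores.values()) / len(scores)
print(round(average, 1), adjective(average))  # 76.8 good
```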

5. DISCUSSION
In general, all participants showed appreciation for the interface, especially its important aim of teaching how to include carers in the challenging cancer experience.

Users enjoyed visual representations of the text in the form of pictures and icons, as long as they matched the topic of the page. Quotes were also appreciated by many of the participants, as they give some reality to the content. A layout where a page contains only a quote gave the participants a break from pages with more content. The participants found the activities very engaging and appreciated them, as long as they knew how to interact with them. This corresponds well with Huang’s finding [8] that interactive and multimedia content makes the user feel more engaged, as long as the usability is adequate. The think-alouds also showed that seeing references was important to several participants, especially clinicians; only one clinician did not understand at first how the references were presented. Seeing how much time a section takes was appreciated by many, as it lets them plan how to fulfil their learning.

Things to improve content-wise are the text-heavy slides and the consistency in the use of terms. Within usability, consistency is also needed between buttons and their icons. It needs to be clear what is clickable or not, as well as what the buttons and interactions do, as some participants had trouble with that. There were, of course, content and activities that users had different opinions on. For example, almost half of the users mentioned liking the short videos, whilst two users would skip them as it is not their type of learning. Bubbles to click on and reveal more content were appreciated by many users; only two were unsure about them and how to interact with them. Therefore, with a mix of different activities, everyone could find something they like. Several participants wanted bigger text, although one participant mentioned some text was too big and could come across as a header. A participant also mentioned it would be great to be able to choose the text size. The activities of building a question list and a carer team were shown not only to be engaging but also to be practical tools for participants, which they highly enjoyed.

Several problems within the interfaces were identified in both the heuristic evaluation and the think-alouds (see Appendix F for a full comparison). These problems were mostly within usability, such as redundant and unclear buttons or missing user feedback when clicking on some buttons. The activity of clicking on a button during a video was appreciated for its originality. As noted in the heuristic evaluation, the button does not give any feedback when clicked; in the think-alouds, some participants did mention that they were unsure whether they had clicked and whether they had clicked enough times. The heuristic evaluation also noted that the progress bar’s placement at the bottom of each page did not work in the system’s favour, while in the think-alouds no participant acknowledged this bar.

Most participants started with the introduction and went through the parts in order, but they also liked the freedom of being able to choose the order in which to complete the module. This way they could first go through the topics that are most interesting to them. The participants went through the sections but did not open any hamburger menus, use the bookmark function or explore their profile page.

With heuristic evaluations, some problems might not be mentioned by users, and all problems found by users might not be found in the heuristic evaluation [17]. In this study, problems only identified in the heuristic evaluation were mostly about pages the users did not visit or problems that had already been fixed. On the other hand, problems mentioned by users but not identified in the heuristic evaluation were mostly content problems, which were not the main focus of the heuristic evaluation. This shows good use of heuristic evaluation, which allowed major flaws of the program to be identified without having to recruit any users. Heuristic evaluation is a good method to use before think-alouds, but it is also important to consider additional usability testing methods to reinforce the results’ credibility. Exploring these methods made it possible to discover which problems each of them reveals. The heuristic evaluation was done first, while the think-alouds were being organised and participants recruited.

The System Usability Scale score that classifies the eTRIO interfaces as good confirmed the overall feeling revealed by the users. The interfaces have good usability and content but undoubtedly have room for improvement. The scores were noticeably lower for the patients, especially one of them (P3). This could be explained by their higher age and lower computer literacy compared with the other users. The patient (P3) with the lowest score had trouble starting the e-learning and sharing their screen via Zoom; therefore, in this particular case only, they used the e-learning through the screen sharing of the interviewer. This made the e-learning quite slow to react and could have affected the user’s overall impression of it. Carers had overall high scores. Two of them were participants who mentioned many things (see Appendix D for how much each participant mentioned); they were very talkative and found most of the content and activities very useful. Their positive attitude is reflected in their high System Usability Scale scores. The third carer did not mention as many things as the other two. This carer was not very computer literate but still mentioned they found the navigation straightforward. Although most of their comments were not negative, this carer’s score was lower. Clinicians are used to completing other e-learning courses regularly in their profession and could therefore compare this one to what they regularly do. Some of the clinicians were in general more critical than the other participants, thus giving lower scores than the carers.

A patient with lower computer literacy gave a lower SUS score, while a carer who also had low computer literacy found the navigation of eTRIO particularly easy. This shows that computer literacy can, but does not have to, affect the users’ experience with eTRIO. The effect of different levels of computer literacy could be further researched.

5.1. Method criticism
In this study, the think-alouds started out being conducted in person and were then pursued online through Zoom. Using Zoom had both benefits and disadvantages. With Zoom, more people from areas further away could participate in the study. It also enabled recording the participants’ actions on their screen together with the audio of their comments. The use of Zoom was new for both the interviewers and the participants. This resulted in technological difficulties, such as not finding how to share the screen, which eventually was solved by launching the interface on the interviewer’s computer.

As mentioned earlier, interesting observations were made during the think-alouds: users did not explore the hamburger menus or the pages the menu leads to. A flaw of think-alouds is that users might not interact with or navigate the interface as they would on their own, outside the think-aloud setting. Having someone watch what you are doing, and talking while using the e-learning, can be an unnatural environment for the participants. Using it on their own could have resulted in different amounts of time spent on each page, the use of more functions such as bookmarking, and perhaps the exploration of more pages, such as their profile and what they have bookmarked.

The think-alouds took 40 minutes to one hour, depending on the participant. The participants went through different sections, some more than others (see Appendix E). Because the participants did not go through exactly the same sections, some strengths and weaknesses may not have been mentioned by as many participants as if they had all gone through the same things.

Other methods for evaluating the interfaces are, for example, logging the users’ actions and the time spent on each page. This could have addressed the disadvantages of think-alouds and could have given different results. Nonetheless, this study has used a combination of several usability testing methods to reinforce the results’ credibility; using only one of them, fewer problems would have been found, and the identified problems would have been less well justified. Each interface could also have been evaluated with more people in each group, although big differences between the groups have not been found. Instead of evaluating the whole e-learning, the study could have focused on some key interactions. These approaches would have given different, perhaps more detailed, results.

The System Usability Scale questionnaire originally alternates positive and negative statements. In this study, a SUS questionnaire with positive statements only was used. Alternating positive and negative statements could have given less favourable results, as the alternation is meant to reduce bias towards purely positive answers [5]. Both the arrangement and the polarity of the statements can alter answers; however, later research has shown that the benefits of the alternation might not outweigh the disadvantages of misinterpreting statements or forgetting to reverse an answer by mistake [19, 4].

5.2. Future research
Future research could focus on evaluating mobile medical e-learning, which the users of this study have shown an interest in. Another direction could be to investigate the activities and interactions more specifically. As this study can be of interest to developers of e-learning and its content, research could also examine to what extent the findings can be applied to e-learning outside the medical system.

6. CONCLUSION
The outcome has provided criteria for improving the e-learning’s usability and content. Strengths of the website are the overall easy navigation and the several different interactive activities and layouts. Weaknesses are text-heavy slides, unclear instructions or buttons, and unclear purposes of some content and activities. Users want valuable and useful content that is easy to read, visually appealing and helpful in their everyday life of the cancer experience.

ACKNOWLEDGMENTS
I would like to express my sincere gratitude to my supervisor Judy Kay at the University of Sydney for giving me the opportunity to work on this project and for her continuous and valuable support and feedback. A special thank you to the TRIO team for the insightful collaboration, especially Rebekah Laidsaar-Powell for conducting most of the think-alouds and providing support during the entire project. I would also like to thank my institution KTH and my supervisor there, Kjetil Falkenberg, for their help. Last but not least, a big thank you to all the participants for sharing their valuable thoughts.


REFERENCES

[1] Margaret Bevans and Esther M. Sternberg. 2012. Caregiving burden, stress, and health effects among family caregivers of adult cancer patients. JAMA - Journal of the American Medical Association (2012). DOI: http://dx.doi.org/10.1001/jama.2012.29

[2] Mike Bracher, Simon Stewart, Claire Reidy, Chris Allen, Kay Townsend, and Lucy Brindle. 2019. Partner involvement in treatment-related decision making in triadic clinical consultations – A systematic review of qualitative and quantitative studies. (2019). DOI: http://dx.doi.org/10.1016/j.pec.2019.08.031

[3] Freddie Bray, Jacques Ferlay, Isabelle Soerjomataram, Rebecca L. Siegel, Lindsey A. Torre, and Ahmedin Jemal. 2018. Global cancer statistics 2018: GLOBOCAN estimates of incidence and mortality worldwide for 36 cancers in 185 countries. CA: A Cancer Journal for Clinicians (2018). DOI: http://dx.doi.org/10.3322/caac.21492

[4] John Brooke. 2013. SUS: a retrospective. Journal of Usability Studies 8, 2 (2013), 29–40.

[5] John Brooke and others. 1996. SUS - A quick and dirty usability scale. Usability Evaluation in Industry 189, 194 (1996), 4–7.

[6] Monty Hammontree, Paul Weiler, and Nandini Nayak. 1994. Remote usability testing. Interactions 1, 3 (1994), 21–25.

[7] Laura J. Hodges, Gerry M. Humphris, and Gary Macfarlane. 2005. A meta-analytic investigation of the relationship between the psychological distress of cancer patients and their carers. Social Science and Medicine (2005). DOI: http://dx.doi.org/10.1016/j.socscimed.2004.04.018

[8] Camillan Huang. 2005. Designing high-quality interactive multimedia learning modules. Computerized Medical Imaging and Graphics (2005). DOI: http://dx.doi.org/10.1016/j.compmedimag.2004.09.017

[9] Rebekah Laidsaar-Powell, Phyllis Butow, Frances Boyle, and Ilona Juraskova. 2018a. Facilitating collaborative and effective family involvement in the cancer setting: Guidelines for clinicians (TRIO Guidelines-1). (2018). DOI: http://dx.doi.org/10.1016/j.pec.2018.01.019

[10] Rebekah Laidsaar-Powell, Phyllis Butow, Frances Boyle, and Ilona Juraskova. 2018b. Managing challenging interactions with family caregivers in the cancer setting: Guidelines for clinicians (TRIO Guidelines-2). (2018). DOI: http://dx.doi.org/10.1016/j.pec.2018.01.020

[11] Rebekah Laidsaar-Powell, Phyllis Butow, Stella Bu, Cathy Charles, Amiram Gafni, Wendy Wing Tak Lam, Jesse Jansen, Kirsten Jo McCaffery, Heather Shepherd, Martin Tattersall, and Ilona Juraskova. 2013. Physician-patient-companion communication and decision-making: A systematic review of triadic medical consultations. (2013). DOI: http://dx.doi.org/10.1016/j.pec.2012.11.007

[12] Rebekah Laidsaar-Powell, Phyllis Butow, Stella Bu, Alana Fisher, and Ilona Juraskova. 2016. Attitudes and experiences of family involvement in cancer consultations: a qualitative exploration of patient and family member perspectives. Supportive Care in Cancer 24, 10 (2016), 4131–4140.

[13] Rebekah Laidsaar-Powell, Phyllis Butow, Cathy Charles, Amiram Gafni, Vikki Entwistle, Ronald Epstein, and Ilona Juraskova. 2017. The TRIO Framework: Conceptual insights into family caregiver involvement and influence throughout cancer treatment decision-making. (2017). DOI: http://dx.doi.org/10.1016/j.pec.2017.05.014

[14] Kristopher Lamore, Lucile Montalescot, and Aurélie Untas. 2017. Treatment decision-making in chronic diseases: What are the family members' roles, needs and attitudes? A systematic review. (2017). DOI: http://dx.doi.org/10.1016/j.pec.2017.08.003

[15] Bridie McCarthy. 2011. Family members of patients with cancer: What they know, how they know and what they want to know. (2011). DOI: http://dx.doi.org/10.1016/j.ejon.2010.10.009

[16] Jakob Nielsen. 2012. Thinking aloud: The #1 usability tool. Nielsen Norman Group 16 (2012).

[17] Jakob Nielsen and Rolf Molich. 1990. Heuristic evaluation of user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 249–256.

[18] Jorge G. Ruiz, Michael J. Mintzer, and Rosanne M. Leipzig. 2006. The impact of e-learning in medical education. (2006). DOI: http://dx.doi.org/10.1097/00001888-200603000-00002

[19] Jeff Sauro and James R. Lewis. 2011. When designing usability questionnaires, does it hurt to be positive? In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems. 2215–2224.

[20] Keith Soothill, Sara M. Morris, Carol Thomas, Juliet C. Harman, Brian Francis, and Malcolm B. McIllmurray. 2003. The universal, situational, and personal needs of cancer patients and their main carers. European Journal of Oncology Nursing 7, 1 (2003), 5–13.

[21] Sebastiaan M. Stuij, Nanon H.M. Labrie, Sandra van Dulmen, Marie José Kersten, Noor Christoph, Robert L. Hulsman, Ellen Smets, Stans Drossaert, Hanneke de Haes, Arwen Pieterse, and Julia van Weert. 2018. Developing a digital communication training tool on information-provision in oncology: Uncovering learning needs and training preferences. BMC Medical Education (2018). DOI: http://dx.doi.org/10.1186/s12909-018-1308-x


Appendix A eTRIO walkthrough

Appendix A shows the eTRIO interface, its layout, features, interactions and activities.

eTRIO interface walkthrough

The clinician module is presented here. The modules for patients and carers have the same layout and activities but with different content.

Clinician module

Figure 1: Clinician dashboard

In figure 1, the dashboard with its different sections is shown. In the clinician module, they are called Guidelines.

• Light blue sections have not been started. (This section is compulsory / not compulsory.)
• Dark blue sections have been started but not completed. (This section is compulsory / not compulsory.)
• Green sections have been completed. (This section is compulsory.)
• The estimated time to complete the section is shown in the top right corner.


• At the bottom of each page is a progression bar, with one little bar per section.
  o Grey: not started
  o Green: completed
  o Blue: started but not completed
• EST~: estimated time to complete the section.
• An introduction page with explanations of the different buttons.
• Arrows and the "Back" and "Next" buttons have the same function: navigating through the pages.
• Stars for bookmarking.
  o Green: not bookmarked
  o Purple: bookmarked
• Print button to print the page.
• "Save and exit" to save progress and get back to the Dashboard.
• Left hamburger menu: navigate to different pages of the section.
• Right hamburger menu.


Layout

• Introductory info boxes in light blue.
• Guiding principles in boxes.
• Introductory navigation instructions in light blue.
• Icons for different sections within the page, with text appearing underneath.
• A page with bullet points giving an overview of the program.
• Important information highlighted by bolding text.


• Layout of the summary page.
• Possibility to download the summary as a PDF.
• There is a summary on the last page of each section.
• Quote from academic papers.
• Logo of the journal where the quote comes from.
• Quotes have their own page.
• Boxes with information.
• Clicking on the logo situated on the right side of the box reveals more information.
• On the top, the box is opened; on the bottom, the box is closed.


• A map with the different states of Australia (the e-learning is developed for cancer care in Australia).
• Clicking on each state gives more information about the procedures in that state.
• A page with a picture illustrating the text.
• When starting the e-learning, clinicians get a message about how they can complete the sections and what is mandatory.
• When the compulsory parts of the e-learning are completed, clinicians get a message that also encourages them to continue with the non-mandatory sections.


Activities

• Questions with true or false answers.
• Sliders to respond with agreement or disagreement to statements.
• At the end, a short summary of your attitude towards family members, based on your answers, is shown.
• Bubbles with information on them.
• Clicking on the bubbles makes them turn purple and reveals more information.


• Cards to select the ones with the most important information needs of family carers.
• Blue contour: card selected.
• Blue card: card not selected.
• Feedback on the card activity.
• Pink: wrong answer.
• Green: right answer.
• White: answer was not selected but should have been.
• Clicking on numbers located on the picture reveals more information, shown in a box as in the picture.
• Clicking on "+" located on the picture reveals more information, shown in a box as in the previous picture.


• In the green box, a scenario is described.
• In the textbox, the user can write how they would respond to this scenario.
• On the next page, feedback about the activity is given.
• Two important points that should have been mentioned in the answer.
• Two example answers.
• In this video activity, the user should click the pink button, located under the video, every time they see the doctor build rapport with a carer.


• Feedback on the video-with-button activity.
• Green tick: the button was clicked at the right time.
• Pink cross: a moment when the button should have been clicked but was not.
• Another video activity.
• The user watches the video and will be asked on the next page how they would handle the situation presented in the video.
• The user gets suggestions on how to handle the situation and should choose the most accurate one.
• The user gets feedback on the answers.


• Finally, the user gets to see a video of how the situation can be handled.
• Another video activity.
• The user gets to watch a video and will be asked how they would handle the situation.
• The user gets to reflect, in open text, on how they would handle the situation.
• Finally, the user gets to see a video of how the situation can be handled.


Appendix B System Usability Scale questionnaire

Appendix B shows the detailed results of the SUS questionnaire.

Scores are listed per statement in the order Ca1, Ca2, Ca3*, Pa1*, Pa2*, Pa3, Cl1*, Cl2*, Cl3*, Cl4, Cl5 (Ca = carer, Pa = patient, Cl = clinician, * = audio recorded). Responses were given on a 1-5 scale (1: strongly disagree, 5: strongly agree) and converted to 0-4.

1. I think that I would like to use the eTRIO website frequently: 3, 2, 0, 0, 4, 2, 1, 3, 3, 3, 2
2. I found the eTRIO website to be simple: 4, 4, 4, 3, 3, 3, 2, 3, 3, 3, 3
3. I thought the eTRIO website was easy to use: 4, 4, 3, 2, 3, 3, 3, 3, 3, 3, 2
4. I think that I could use the eTRIO website without the support of a technical person: 4, 4, 4, 4, 3, 1, 4, 4, 3, 4, 4
5. I found the various functions in the eTRIO website were well integrated: 4, 4, 4, 2, 3, 2, 3, 3, 2, 2, 3
6. I thought there was a lot of consistency in the eTRIO website: 4, 4, 3, 1, 3, 3, 3, 3, 3, 3, 3
7. I would imagine that most people would learn to use the eTRIO website very quickly: 4, 4, 4, 3, 2, 1, 4, 4, 3, 3, 2
8. I found the eTRIO website very intuitive: 4, 4, 2, 4, 4, 3, 4, 2, 2, 3, 3
9. I felt very confident using the eTRIO website: 4, 4, 3, 4, 3, 2, 4, 3, 2, 3, 3
10. I could use the eTRIO website without having to learn anything new: 4, 4, 4, 4, 3, 2, 4, 4, 3, 3, 4

Sum: 39, 38, 31, 27, 31, 22, 32, 32, 27, 30, 29
SUS score: 97.5, 95, 77.5, 67.5, 77.5, 55, 80, 80, 67.5, 75, 72.5
SUS score rounded: 98, 95, 78, 68, 78, 55, 80, 80, 68, 75, 73
Average SUS score: 77

Interpretation of SUS scores: above 80.3 Excellent, between 68 and 80.3 Good, 68 OK, between 51 and 67 Poor, below 51 Awful.
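A minimal sketch of the scoring used above: item scores are converted from the 1-5 scale to 0-4 (with the all-positive variant used here, presumably the response minus one for every item), summed per participant, and multiplied by 2.5. The three participants below are copied from the table as a check on the arithmetic.

```python
# Minimal sketch of the SUS scoring in the table above: item scores already
# converted from 1-5 to 0-4, summed per participant, and multiplied by 2.5.
scores_0_to_4 = {  # three participants, item scores copied from the table
    "Ca1": [3, 4, 4, 4, 4, 4, 4, 4, 4, 4],
    "Pa3": [2, 3, 3, 1, 2, 3, 1, 3, 2, 2],
    "Cl1": [1, 2, 3, 4, 3, 3, 4, 4, 4, 4],
}

sus = {participant: sum(items) * 2.5 for participant, items in scores_0_to_4.items()}
print(sus)  # {'Ca1': 97.5, 'Pa3': 55.0, 'Cl1': 80.0}
```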


Appendix C Heuristic Evaluation

Appendix C shows problems identified in the heuristic evaluation, with the problems' severity (S), prevalence (P) and ease to fix (E).

Rating scales: S = severity (1: not severe, 5: severe), P = prevalence (1: not often, 5: often), E = ease to fix (1: hard to fix, 5: easy to fix). Problems marked with an asterisk were fixed before the think-alouds (see Appendix F).

Inconsistency of icons
• Inconsistent use of icons, a star and a book to bookmark (S=4, P=5, E=5)
• Search magnifying glass icon to view content (S=4, P=1, E=5)
• Can view content through clicking on the title and on the search icon (S=2, P=1, E=5)

Redundancy in buttons
• Arrows and back/next buttons have the same function (S=4, P=1, E=5)
• When clicking on the small play button of videos, the big play button stays on screen (S=5, P=5, E=3)

Buttons and interactions not working
• Star goes from green to purple when bookmarked but does not go back to green when unbookmarked (S=2, P=5, E=3)
• "Delete content" is misleading, as you want to delete the page from bookmarks and not delete the content (S=3, P=1, E=5)
• Hover text says "sort" but does not explain that you should click and drag to place the item where you want it; "sort" misleads users into thinking they can sort by different categories rather than manually (S=3, P=1, E=4)
• Top bar with "Title" should not have dots, as you cannot move that part (S=3, P=1, E=5)
• Not possible to click on the whole bar to open it, and no highlighted hover feedback when the mouse is over the health professional button (S=3, P=3, E=3)
• Nothing happens when clicking on the print button* (S=5, P=5, E=3)
• Clicking on complete on the last page of any part makes the whole dashboard go green although several parts are not completed (but still say "Begin")* (S=5, P=5, E=3)
• What is written in the textbox does not get saved* (S=5, P=4, E=3)
• No error message beyond an exclamation mark when textboxes are not filled in (S=4, P=5, E=5)
• Unclear what the button "Save and exit" means (S=4, P=5, E=4)
• Not possible to save on what page in a part the user stopped (S=5, P=5, E=3)
• Activity progress is lost when clicking on "Save and exit" or when reloading the page (S=4, P=4, E=3)
• Clicking to continue a part on the Dashboard does not lead to the last visited page in that part (S=5, P=5, E=3)
• Clicking on the button of the video activity gives no feedback (S=3, P=4, E=4)
• Nothing happens in the card activity* (S=5, P=1, E=3)
• Text in textboxes is not saved when going back (S=4, P=2, E=3)
• No highlighted hover feedback on bubbles (S=3, P=2, E=4)

Responsive design problems in layout
• Constant hamburger menus on the left and right side, even on big screens (S=3, P=5, E=3)
• Text is not aligned with the textbox/bar; text is overlapping when opening several information boxes (S=5, P=4, E=3)
• "Save and exit" button is only on the first page of each part* (S=5, P=5, E=4)
• Guidelines on the Dashboard look divided into two on smaller screens (S=4, P=5, E=3)
• No margin on the left side of the text on pages; text is almost cut off on bigger screens (S=4, P=5, E=3)
• No link on guidelines that are referred to (S=2, P=2, E=5)

Presentation of content
• Unnecessary fading in of text (S=3, P=5, E=5)
• Progress bar at the bottom of the page is easily missed, and it is not clear what each section represents (S=2, P=5, E=4)
• "EST ~5min" on the progress bar: the message might not be understood (S=2, P=5, E=4)
• Progress bar is not showing the correct progress* (S=5, P=5, E=2)
• Some boxes get thinner than others when opened (S=4, P=4, E=3)
• No information about the arrow buttons (S=5, P=1, E=5)
• Asterisk after names leads to nowhere* (S=5, P=5, E=5)
• Choice of green colour on the slider for the "bad" part* (S=4, P=1, E=4)
• Unclear what you get for completing the training (S=3, P=1, E=5)
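One possible way to work with these ratings, not part of the original analysis, is to triage the list programmatically, for example sorting by severity, then prevalence, then ease to fix. A minimal sketch with a few rows from the table:

```python
# Minimal triage sketch (not part of the original analysis): sort a few of the
# problems above by severity, then prevalence, then ease to fix, all descending.
problems = [  # (description, severity, prevalence, ease_to_fix) from the table
    ("Inconsistent use of icons, a star and a book to bookmark", 4, 5, 5),
    ("Big play button stays on screen after clicking the small play button", 5, 5, 3),
    ("Progress bar is not showing the correct progress", 5, 5, 2),
    ("Search magnifying glass icon to view content", 4, 1, 5),
]

for desc, s, p, e in sorted(problems, key=lambda r: (-r[1], -r[2], -r[3])):
    print(f"S={s} P={p} E={e}  {desc}")
```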


Appendix D Frequency of strengths and weaknesses

Appendix D shows how frequently each comment was mentioned in the think-alouds, as a strength or as something to improve, and how much each participant commented during their think-aloud.

Legend: Ca = carer, Pa = patient, Cl = clinician; * = audio recorded.

Total number of mentions per comment across the eleven think-alouds:

Buttons and functions
• Bookmark pages: 1
• Easy navigation: 7
• Being able to complete the module at your own pace and in your wished order: 3
• Video activity with button: 3
• Video activity with free text: 1
• Activity of sorting answers: 2
• Redundancy in arrows and back/next buttons: 3
• Back button on the first page of each part: 1
• Meaning of the "Save and exit" button: 2
• Star to bookmark not working: 1
• Sliders: 3

Layout and design
• Dashboard layout: 3
• Bolding important words or parts of phrases: 1
• Colours used throughout the website: 4
• Different text sizes in bullet points: 2
• Text size in general: 7
• Text font: 1
• Opening a definition of the roles of healthcare teams does not show up properly; text is overlapping and the sizes of the boxes change: 2
• Tick in continue mode: 1
• Text not centered, cropped on the side: 1

Animation
• Fading in of text: 1

Interaction
• Popping bubbles to reveal more information: 6
• Build and download your own question list: 2
• Create your caregiving team: 3

Content
• Questions to ask health professionals: 3
• Page explaining the buttons of the page: 1
• Estimated time on each module: 5
• Short videos: 8
• Content emphasising the importance of the carer's role: 8
• Content about the healthcare system: 2
• Content in general: 1
• Quotes: 8
• Links to recommended podcast, app and websites: 2
• Simple language without being too basic: 3
• Content the user can relate to: 10
• Truth about myths: 1
• References: 5
• Term "carer": 3
• Pictures: 5
• No explanation of the name eTRIO: 1
• No link to see videos of other pages when on a page with a video: 1
• Inconsistency in referring to a specific state although the e-learning is used throughout the whole country: 1
• Spelling mistakes: 3
• Inconsistent use of terms: 2
• Terms with connotations: 2
• Busy slides: 6
• Download summary: 2
• Unclear what you get for completing the training: 1

Comments per participant (strengths / to improve / total):
• Pa1*: 13 / 4 / 17
• Pa2*: 4 / 3 / 7
• Pa3: 4 / 2 / 6
• Ca1: 16 / 9 / 25
• Ca2: 13 / 10 / 23
• Ca3*: 7 / 1 / 8
• Cl1*: 8 / 10 / 18
• Cl2*: 7 / 4 / 11
• Cl3*: 4 / 2 / 6
• Cl4: 7 / 1 / 8
• Cl5: 8 / 8 / 16


Appendix E User progress in think-alouds

Appendix E shows which sections the participants went through during the think-aloud and how much time the think-aloud session took. An "x" means the participant went through that section.

Progress of carers in their module

Progress of clinicians in their module

Progress of patients in their module


Appendix F Heuristic evaluation and think-alouds comparison

Appendix F compares the problems found in the heuristic evaluation with the problems mentioned in the think-alouds. It shows which problems were identified by both methods, which problems were only identified by one of them, and which problems were fixed after the heuristic evaluation but before the think-alouds. The ranking of the problems given in the heuristic evaluation is also presented here.

Buttons and functions
• Arrows and back/next buttons have the same function (TA, HE; S=4, P=1, E=5)
• No link on guidelines that are referred to (TA, HE; S=2, P=2, E=5)
• Can be unclear what the button "Save and exit" means (TA, HE; S=4, P=5, E=4)
• Not possible to save on what page in a part the user stopped (TA, HE; S=5, P=5, E=3)
• Clicking on complete on the last page of any part makes the whole dashboard go green although several parts are not completed (but still say "Begin") (HE; S=5, P=5, E=3; fixed)
• Nothing happens when clicking on the print button (HE; S=5, P=5, E=3; fixed)
• Nothing happens in the card activity (HE; S=5, P=1, E=3; fixed)
• Progress bar is not showing the correct progress (HE; S=5, P=5, E=2; fixed)
• Can view content through clicking on the title and on the search icon (HE; S=2, P=1, E=5)
• "Delete content" is misleading, as you want to delete the page from bookmarks and not delete the content (HE; S=3, P=1, E=5)
• When clicking on the small play button of videos, the big play button stays on screen (HE; S=5, P=5, E=3)
• Hover text says "sort" but does not explain that you should click and drag to place the item where you want it; "sort" misleads users into thinking they can sort by different categories rather than manually (HE; S=3, P=1, E=4)
• Not possible to click on the whole bar to open it, and no highlighted hover feedback when the mouse is over the health professional button (HE; S=3, P=3, E=3)
• No error message beyond an exclamation mark when textboxes are not filled in (HE; S=4, P=5, E=5)
• Activity progress is lost when clicking on "Save and exit" or when reloading the page (HE; S=4, P=4, E=3)
• Text in textboxes is not saved when going back (HE; S=4, P=2, E=3)
• Back button on the first page of each part (TA)

Layout and design
• Opening a definition of the roles of healthcare teams does not show up properly; text is overlapping and the sizes of the boxes change (TA, HE; S=4, P=4, E=3)
• Text is not aligned with the textbox/bar; text is overlapping when opening several information boxes (TA, HE; S=5, P=4, E=3)
• Some boxes get thinner than others when opened (TA, HE; S=4, P=4, E=3)
• No margin on the left side of the text on pages; text is almost cut off on bigger screens (TA, HE; S=4, P=5, E=3)
• Star goes from green to purple when bookmarked but does not go back to green when unbookmarked (TA, HE; S=2, P=5, E=3)
• "Save and exit" button is only on the first page of each part (HE; S=5, P=5, E=4; fixed)
• Choice of green colour on the slider for the "bad" part (HE; S=4, P=1, E=4; fixed)
• Inconsistent use of icons, a star and a book to bookmark (HE; S=4, P=5, E=5)
• Search magnifying glass icon to view content (HE; S=4, P=1, E=5)
• Top bar with "Title" should not have dots, as you cannot move that part (HE; S=3, P=1, E=5)
• Constant hamburger menus on the left and right side, even on big screens (HE; S=3, P=5, E=3)
• Progress bar at the bottom of the page is easily missed, and it is not clear what each section represents (HE; S=2, P=5, E=4)
• "EST ~5min" on the progress bar: the message might not be understood (HE; S=2, P=5, E=4)
• Guidelines on the Dashboard look divided into two on smaller screens (HE; S=4, P=5, E=3)
• Different text sizes in bullet points (TA)
• Tick in continue mode (TA)

Animation
• Fading in of text (TA, HE; S=3, P=5, E=5)

Interaction
• Popping bubbles to reveal more information (TA, HE; S=3, P=2, E=4)
• No highlighted hover feedback on bubbles to click on (TA, HE; S=3, P=2, E=4)
• Clicking on the button of the video activity gives no feedback (TA, HE; S=3, P=4, E=4)
• What is written in the textbox does not get saved (HE; S=5, P=4, E=3; fixed)

Content
• Spelling mistakes (TA, HE)
• Unclear what you get for completing the training (TA, HE; S=3, P=1, E=5)
• Asterisk after names leads to nowhere (HE; S=5, P=5, E=5; fixed)
• No information about the arrow buttons (HE; S=5, P=1, E=5)
• No explanation of the name eTRIO (TA)
• No link to see videos of other pages when on a page with a video (TA)
• Inconsistency in referring to a specific state although the e-learning is used throughout the whole country (TA)
• Inconsistent use of terms (TA)
• Terms with religious connotations (TA)
• Busy slides (TA)
• Term "carer" (TA)
• Pictures (TA)

TA = mentioned in the think-alouds; HE = identified in the heuristic evaluation; S = severity; P = prevalence; E = ease to fix; fixed = problem fixed after the heuristic evaluation but before the think-alouds.


www.kth.se

TRITA-EECS-EX-2020:436