Advances in Human-Computer Interaction
Volume 2018, Article ID 1518682, 13 pages
https://doi.org/10.1155/2018/1518682
Research Article

Heuristic Evaluation: Comparing Generic and Specific Usability Heuristics for Identification of Usability Problems in a Living Museum Mobile Guide App

Faculty of Cognitive Sciences and Human Development, Universiti Malaysia Sarawak, Kota Samarahan, Sarawak, 94300, Malaysia

Correspondence should be addressed to Mohd Kamal Othman; omkamal@unimas.my

Received 5 April 2018; Revised 23 July 2018; Accepted 31 July 2018; Published 2 September 2018

Academic Editor: Vesna Popovic

Copyright © 2018 Mohd Kamal Othman et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

This paper reports on an empirical study that compares two sets of heuristics, Nielsen’s heuristics and the SMART heuristics, in the identification of usability problems in a mobile guide smartphone app for a living museum. Five experts identified usability issues and rated their severity against the two sets of usability heuristics using severity rating scales. The study found that Nielsen’s heuristics, which address a wide range of interactive systems, are too general to detect usability problems in a mobile application compared with the SMART heuristics, which focus on smartphone applications throughout the product development lifecycle. The study highlights the importance of utilizing domain specific usability heuristics in the evaluation process. This ensures that relevant usability issues are successfully identified and can be given immediate attention to ensure an optimal user experience.

1. Introduction

Cultural and heritage sites have a long history of adopting mobile technologies as visitors’ guides. According to Tallon [1], mobile guide technology was first used at the Stedelijk Museum in Amsterdam in 1952. Changes made throughout the years ranged from the digitization of objects to the use of emerging technologies. The evolution of mobile guide technologies in cultural heritage sites has transformed visitors’ experiences at such venues. Kenteris, Gavalas, and Economou [2] classified mobile guides used in museums into four different groups: (1) mobile guide applications, (2) web-to-mobile applications, (3) mobile phone navigational assistants, and (4) mobile web-based applications.

The use of smartphone technologies, particularly apps, to replace other mobile guide technologies at cultural and heritage sites could eliminate some issues faced by visitors. For example, it reduces the learning curve, as visitors do not need to learn how to operate the technology and can focus on the content in the mobile guide. Jaén, Mocholí, Esteve, Bosch, and Canós [3] highlighted this as an important criterion in designing multimedia content browsers on mobile guides. In addition, the use of different types of mobile guides in cultural heritage sites also enables visits to become more visitor-oriented and not fully controlled by the curator, particularly through the personalization of information in accordance with visitors’ needs [4–6]. A recent study by Pallud [7] on the use of interactive technologies in a French museum to engage the audience and promote a positive learning experience suggested that the ease of use and interactivity features of the technologies provided could influence the emotional process (authenticity and cognitive engagement), which in turn could influence learning. Prior research by Othman et al. [8] also suggested that visitors who use a multimedia guide during their visit to a cultural heritage site are significantly more engaged in the experience compared with those who do not use any multimedia guide.

Usability and user experience (UX) have always been predominant concerns of software products [9]. Helyar [10] highlights that mobile apps suffer from usability issues such as inept content and interface design. This has resulted in a lack of user acceptance and in applications being rejected within months of launch [11]. Zhang and Adipat [12] discussed the challenges, methodologies, and issues in the usability testing of mobile applications. For example, the unique features of mobile devices pose challenges in usability testing, such as the mobile context of use, connectivity (usually related to bandwidth or network), small screen sizes, different display resolutions, limited or different processing capabilities, and data entry methods.

Gomez et al. [13] suggested that usability evaluation techniques can be carried out during the implementation of a particular system to make sure the system enables users to achieve their goals efficiently. Examples include the usability aspects of interacting with smartphone apps, presenting information on the screen, learning about the apps’ functionality, and controlling the devices. Prior research has shown that inspection methods such as heuristic evaluation (HE) are effective in detecting usability problems in an interface compared with other methods [14, 15]. The issues identified by experts during the heuristic evaluation will be rectified before conducting usability testing with actual visitors.

Most of the time, specific heuristics will work better because they are complemented by specific usability checklists. However, a previous study by Law and Hvannberg [16] also highlights that it is difficult to map a problem to the matching heuristic. Hvannberg, Law, and Lárusdóttir [17] suggested a framework for conducting heuristic evaluation by comparing two sets of heuristics, along with the process, procedure, and support needed to conduct a usability evaluation and ensure its effectiveness. Joyce and Lilley [18] questioned the effectiveness of previously developed specific heuristics, such as Inostroza’s TMD heuristics, due to their similarity with Nielsen’s traditional heuristics. They further mentioned that experts may find the heuristics ambiguous, as the heuristic titles remain the same although the definitions are different. The SMASH heuristics suffer from the same issue of sharing names with Nielsen’s traditional heuristics, as only one heuristic was modified and one new heuristic added. It is important to highlight that difficulty in mapping usability issues to the correct heuristic could lead to confusion in construing the issues. This can be an obstacle to designing the best solution to the problem. Thus, the objective of this study is to examine the importance of integrating domain specific heuristic evaluation in the design and development of a smartphone app for a living museum.

1.1. Living Museum Mobile Guide Application at Sarawak Cultural Village (SCV)

The smartphone app in this study was developed as a mobile guide for a living museum, namely, the Sarawak Cultural Village (SCV). SCV was chosen for this study because it has been set up to preserve and showcase the finest of Sarawak’s cultural heritage for the past 25 years. SCV also serves as one of the remaining sources of Sarawak’s cultures and ethnic groups [19]. In addition, there are only a handful of living museums in Malaysia (a living museum is best described by Anderson [20] as a way of simulating life in another time, particularly the past, presented as a living history filled with activities that could have an impact on learning experiences), and SCV is considered one of the best places to visit. Furthermore, SCV acts as a medium to preserve the cultural heritage (i.e., architecture, artefacts, costumes, etc.) of the major ethnic groups in Sarawak. Dellios [21] discussed the development of such themed museums and argued that these cultural villages could be considered on par with the authentic traditional villages on the outskirts of civilization. She further acknowledged that this is possibly a solution to various issues in cultural tourism in Sarawak. Although tourists are eager to visit the homes and experience the lives of different cultures, previous studies raised concerns about disturbing the real lives of residents at traditional villages [19] and the possibility of endangerment or trouble during the visit [22, 23].

Hitchcock, Stanley, and Siu [24] described the living museum as a venue that combines tangible and intangible cultural heritage, influenced by the open-air folk museums of North America and Europe. They further discussed two living museums in Southeast Asia, the China Folk Culture Village in China and the Taman Mini museum in Indonesia, while Anderson [20] discussed the various types of living museums in Europe and North America.

Anderson [20] discussed three main reasons why living museums were built: to effectively interpret tangible and intangible culture; to provide a place for research, such as testing archaeological theses or conducting ethnographic studies; and to provide a place for visitors to actively participate in the activities offered as part of their learning experiences. Being a living museum, SCV has a different setting compared with a conventional museum and fulfils two of the main reasons described by Anderson [20]. In SCV, groups of people (staff) showcase the lives, daily activities, and artefacts of the various ethnic communities in Sarawak. The staff are the source of information for visitors because minimal or no written information is provided, owing to the natural setting of SCV. When staff are unavailable, such as during lunch breaks, the overall visitor experience suffers because visitors cannot get sufficient information when needed. Previous studies highlight the importance of different attributes, particularly the facilities and services provided to visitors at SCV, in improving their experiences [25, 26].

The introduction of a smartphone mobile guide app for SCV could have a positive impact on visitors’ experience because it offers information on the go that can be accessed before, during, and after each visit. However, delivering a mobile guide app with unresolved usability issues would jeopardize this positive impact. It is therefore important to address the usability issues in mobile guide applications before they are deployed for use by visitors at cultural and heritage sites.

1.2. Designing for User Experience (UX)

Designing for user experience (UX) is not trivial, particularly for mobile applications that require users to interact seamlessly with the application and their environment. Charland and Leroux [27] discussed the importance of UX in both native and web mobile application development to ensure adoption. They further added that mobile UX can be divided into two main categories: (1) the context, the elements that must be understood but cannot be changed, such as hardware affordances and UI conventions, and (2) the implementation, the elements that can be changed, such as the design and features. Previous studies have shown that the focus of user experience (UX) studies has shifted to the UX of technologies rather than the usability of devices, thus making user studies more complicated [28–32]. Hassenzahl and Tractinsky [28] stated that UX comprises three different perspectives: emotion and affect, the experiential, and beyond the instrumental, as illustrated in Figure 1.

Figure 1: Facets of UX ([28], p. 95).

The emotion and affect perspective focuses on users’ emotions and how they are influenced by the affective computing concept. The experiential perspective focuses on users’ overall experience with the technology, whether situated or temporally bounded. The idea behind the experiential perspective in UX is the duration of users’ interaction with the product. This in turn affects the UX, whether the interaction between user and product lasts only a short time or is prolonged (e.g., [33–35]). Temporally bounded experience, or temporality in UX, has been discussed by various researchers in the past (i.e., [33, 36–41]). The third perspective, beyond the instrumental, aims to create more holistic computing systems that take human needs into consideration. For example, different technologies have been created to support different types of users. In the museum context, the Tate Modern provides handheld devices that play videos with sign language for hearing impaired visitors. British Sign Language (BSL) was used and first piloted in 2003. Evaluation of the guides showed that they significantly improved users’ visits, with 79% of visitors agreeing that they were highly satisfied with the BSL guide during their visit [42].

1.3. HE and Usability Evaluation of Smartphone Apps

Although Agarwal and Venkatesh [43] pointed out that usability is not fundamentally objective but relative to the evaluator’s personal interpretation during the evaluation, Yáñez Gómez, Caballero, and Sevillano [44] suggested that the evaluation can be designed to balance out personal interpretation. The HE technique and usability testing are the mainstays of modern practice among usability professionals [45]. HE is a method for finding usability problems in a specific user interface design by having a small set of evaluators evaluate the interface and judge its compliance with recognized usability principles, such as Nielsen’s heuristics [46]. HE was originally developed as a usability engineering method for evaluators who had some knowledge of usability principles.

Heuristics are well-established sets of principles used to measure the usability of an interface. There are two main alternatives when performing an HE: using general heuristics or using specific heuristics [47]. In the age of touchscreen-based mobile devices (e.g., mobile standard personal computers, mobile internet devices, handhelds or PDAs, and smartphones), researchers have proposed new sets of heuristics specific to such devices. In addition, previous studies have found it beneficial to have a specific set of heuristics for smartphone apps [48, 49].

Fung, Chiu, Ko, Ho, and Lo [50] conducted a heuristic usability evaluation of the University of Hong Kong Libraries website and its mobile version using Nielsen’s heuristics and discovered five different usability issues with the mobile website. This poses the question of whether usability heuristics sets are applicable across domains or whether they should be domain specific for a more definite output of usability evaluations [48, 49].

Inostroza, Rusu, Roncagliolo, Jiménez, and Rusu [47] stated that heuristic evaluation is easy to apply, but it is important to have domain specific heuristics to ensure all usability issues are covered. Specific heuristics can detect usability problems related to the application domain, but they may be hard to understand and difficult to apply. General heuristics, on the other hand, are easy to understand and apply; however, with them it is also easy to miss domain specific usability problems.

Indeed, there have been several studies developing specific usability heuristics to fit particular application contexts. Mankoff et al. [51] evaluated an ambient display using two different sets of usability heuristics, Nielsen’s original heuristics and a version of the heuristics modified for ambient displays. The application experts in the ambient heuristics group found significantly more usability issues than the Nielsen’s heuristics group did, suggesting the modified set is more effective than Nielsen’s heuristics.

Silva, Holden, and Jordan [52] described the use of specific usability heuristics for older adults. The work was based on several previous studies of older adults, such as Silva, Holden, and Nii [53], Chisnell, Redish, and Lee [54], and Kurniawan and Zaphiris [55]. Diaz, Rusu, and Collazos [56] also held the view that using appropriate heuristics is highly relevant; they developed and validated a usability heuristics set specific to e-commerce websites.

A specific set of heuristics known as the Touchscreen Mobile Devices (TMD) heuristics was proposed by Inostroza et al. [47]. They found that 40% of usability problems were identified using TMD, while only 26% were identified using Nielsen’s heuristics, indicating the effectiveness of specific heuristics compared with Nielsen’s. Recently, Inostroza et al. [57] validated their TMD heuristics and renamed them the SMASH heuristics.

These studies support domain specific usability heuristics as crucial for determining domain specific usability issues. Other than the SMASH heuristics, various heuristics for smartphone apps are available for HE, for example, the SMART heuristics [18], mobile usability heuristics [58], MATcH [59], and many others described by Inostroza et al. [57] and Salgado and Freire [60]. We decided to use the SMART and Nielsen heuristics in this study. Details of the heuristics used can be found in Tables 1 and 2.

Table 1: Nielsen’s Heuristics.
Table 2: Smartphone mobile application heuristics (SMART).

2. Methodology

2.1. Design and Development of Smartphone Apps

This project employed the Mobile Application Development Life Cycle (MADLC), which is specifically designed for the development of smartphone apps. MADLC is commonly used in the design of Android mobile applications and was first introduced by Vithani and Kumar [62]. This system development lifecycle consists of seven stages: (1) identification, (2) design, (3) development, (4) prototyping, (5) testing, (6) deployment, and (7) maintenance.

In the identification stage, researchers analyzed existing mobile guide apps for museums and other cultural sites (i.e., [63–66]); this collected information was used as a point of reference in designing the mobile guide app for SCV.

In addition, the researchers made several visits to SCV and conducted interviews with 10 staff members from different backgrounds, for example, the general manager of SCV, to gather insights into SCV and its future plans and to ensure that the proposed mobile guide would be fully utilized. An interview with an assistant manager who oversees the operation of SCV also took place, to understand its operation and to ensure the proposed guide would not come between the visitors and the artefacts or diminish their user experience.

An open-ended interview with staff living at SCV was conducted over a two-week period to gather information and materials about the ethnic houses, such as information on and photographs of the architecture and the artefacts in the houses, as illustrated in Figure 2. They were asked about the activities they performed at SCV, the various artefacts of different ethnic groups on display, and their significance, for example, the human skulls located above the fireplace in the Baruk (warrior house), as illustrated in Figure 3.

Figure 2: Various artefacts and activities at one of the ethnic houses.
Figure 3: Human skulls placed above the fireplace in the Baruk (warrior house).

Furthermore, a short interview with five visitors was conducted during this stage to gather their insights and overall experiences at SCV. It is important to get users’ input from the beginning to ensure that the product will be acceptable. In addition, several visits were made to the library and to the ethnic foundations to gather related information about the artefacts, lifestyles, cultures, and architecture of the main ethnic groups in this study, and related information was also gathered online. In the design stage, the information from the identification stage was transformed into an initial design of the mobile guide app. Storyboards were sketched out to visualize the flow and interaction within the app. Figure 4 shows a sketch of an interface for the app.

Figure 4: Mobile guide app interface sketch.

In the development stage, the storyboard sketches from the design stage were transformed into a functional system using the Corona SDK (software development kit) and Notepad++. A prototype, a fully functional implementation of the basic concepts of the application, was built. Prototypes are used in testing to uncover any bugs or errors in the design of the system.

The testing stage is a critical aspect of system development. Usability testing is usually carried out during this stage with actual users. However, this study employed heuristic evaluation before user testing was conducted.

Prior to the evaluation stage, the researchers had developed a mobile guide in the form of a smartphone app aimed at guiding visitors at a living museum (Sarawak Cultural Village). The mobile guide is a native application that can be uploaded to app stores (Google Play, the Apple App Store, etc.), from which users can conveniently download it for free to their mobile devices.

2.2. Materials
2.2.1. SCV Smartphone App

When the users open the app, a splash screen (see Figure 5) will appear. Subsequently, users will be directed to the next screen, the map of the SCV, as illustrated in Figure 6.

Figure 5: Mobile guide splash screen.
Figure 6: Map of SCV (location of each ethnic house).

When visitors tap on the “Bidayuh house” icon, they will be directed to the menu section, as illustrated in Figure 7, which consists of the three different parts of the Bidayuh ethnic house: the Baruk, the Panggah, and the Longhouse. Visitors will be provided with information about the house they selected from the menu, as shown in Figure 8.

Figure 7: Main menu of the Bidayuh House.
Figure 8: Information about the Bidayuh Longhouse.

Subsequently, visitors will be directed to the floor plan of the house, which shows the location of each artefact available in each section of the house, as illustrated in Figure 9. Information about each selected artefact is provided on the next screen (for example, see Figure 10).

Figure 9: Floor plan.
Figure 10: The information about the selected artefact.
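The screen-to-screen flow described above can be summarized as a small navigation graph. The sketch below is illustrative only: the screen names are assumptions drawn from Figures 5–10, not identifiers from the app’s actual implementation (which was built with the Corona SDK).

```python
# Navigation graph for the SCV mobile guide, as described in Section 2.2.1.
# Screen names are hypothetical labels based on Figures 5-10.
FLOW = {
    "splash": ["map"],                # splash screen leads to the SCV map
    "map": ["bidayuh_menu"],          # tapping an ethnic-house icon
    "bidayuh_menu": ["house_info"],   # Baruk, Panggah, or Longhouse
    "house_info": ["floor_plan"],     # house description, then floor plan
    "floor_plan": ["artefact_info"],  # tapping an artefact location
    "artefact_info": [],              # leaf screen
}

def path(start, goal, flow=FLOW, seen=()):
    """Depth-first search for a navigation path between two screens."""
    if start == goal:
        return [start]
    for nxt in flow.get(start, []):
        if nxt not in seen:
            sub = path(nxt, goal, flow, seen + (start,))
            if sub:
                return [start] + sub
    return None

print(path("splash", "artefact_info"))
# ['splash', 'map', 'bidayuh_menu', 'house_info', 'floor_plan', 'artefact_info']
```

A graph like this makes the depth of each screen explicit, which is useful when checking, for instance, how many taps separate the splash screen from any artefact description.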
2.2.2. Heuristic Evaluation

To date, several types of heuristics are available for HE focused on smartphones, for example, the touchscreen-based mobile device (TMD) heuristics [47], SMASH heuristics [57], SMART heuristics [18], mobile usability heuristics [58], a mobile interface checklist [44], MATcH [59], and many other heuristics described by Inostroza et al. [57] and Salgado and Freire [60].

Two sets of heuristic principles were used for comparison: Nielsen’s heuristics [67] and the SMART heuristics [18]. Nielsen’s set is a well-established instrument that has been widely used for various types of interface design. The SMART heuristics, on the other hand, were developed specifically to cover all aspects of a smartphone application interface. Though the mobile usability heuristics by Bertini, Gabrielli, and Kimani [58] were considered, they were not selected because they do not address the usability of the application icon.

During this stage, five experts were recruited to evaluate the interface and the content of the application. They were instructed to identify usability issues and rate each issue using the severity rating scale [61]. The severity ratings are as follows: 4 (catastrophic problem): users will not be able to continue to their goal; must be fixed; 3 (major problem): users will be frustrated or have difficulty continuing to their goal; should be fixed with high priority; 2 (minor problem): users will experience some difficulty; fixing is given lower priority; 1 (cosmetic problem): minor issues that can be easily fixed; and 0: no usability issue at all. The experts were also instructed to map the usability issues to both Nielsen’s heuristics and the SMART heuristics.
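As a minimal illustration of how the severity scale can be applied in practice, the following Python sketch encodes the five ratings and orders a list of issues by severity so the most serious problems are addressed first. The two example issues are hypothetical placeholders, not items from this study’s Table 3.

```python
# Severity scale after Nielsen [61], as used in this study's evaluation.
SEVERITY = {
    4: "Catastrophic: users cannot reach their goal; must be fixed",
    3: "Major: users are frustrated; fix with high priority",
    2: "Minor: users experience some difficulty; lower priority",
    1: "Cosmetic: fix only if time permits",
    0: "Not a usability problem",
}

def triage(issues):
    """Return issues ordered by severity, most severe first."""
    return sorted(issues, key=lambda i: i["severity"], reverse=True)

# Hypothetical issues for illustration (not from Table 3).
issues = [
    {"id": 1, "severity": 2, "description": "Inconsistent button placement"},
    {"id": 2, "severity": 4, "description": "Back navigation dead-ends"},
]
for issue in triage(issues):
    print(issue["id"], SEVERITY[issue["severity"]])
```

Ordering issues this way mirrors how catastrophic and major problems were singled out for immediate attention in the Results section.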

2.3. Method

Below is the procedure of the study:
(1) Briefing session: when the participants arrived, they were asked to take a seat. Participants were briefed on the purpose of the study by the researcher.
(2) Interaction with the SCV mobile guide tour (smartphone app): participants were handed a Samsung mini tablet with the SCV mobile guide tour app installed. They were briefed about the purpose of the app and how it will be used at Sarawak Cultural Village. This helped the participants visualize how the app will be used and critically analyze it for any issues they might encounter, whether minor or major.
(3) Evaluation session: participants were asked to browse the mobile guide app screen by screen and document any usability issues they found. Participants identified the usability issues, assigned severity ratings, and mapped the issues to both Nielsen’s heuristics and the SMART heuristics.
(4) Debriefing session: researchers thanked the participants for their contributions and answered their questions.

3. Results

The output of the heuristic evaluations was a list of usability problems with severity ratings, provided in Table 3. A total of 31 usability issues were identified. Of these, six were classified as catastrophic and needed immediate action, while eight were categorized as major and needed to be rectified.

Table 3: Usability issues, average severity, and its mapping with Nielsen’s Heuristic and SMART heuristics.

The usability issues listed in Table 3 were matched against the 10 Nielsen heuristics and the SMART heuristics. Table 3 also summarizes the mapping of each usability issue to Nielsen’s and the SMART heuristics. It is important to highlight that two usability issues (issues 15 and 17) were not mapped to any of Nielsen’s heuristics because they were too specific to be mapped.

4. Discussion

Heuristic evaluation is an essential activity for securing highly usable smartphone apps and should not be disregarded in the mobile app development life cycle. This study presented an analysis of 31 usability issues. Six issues (19.35%) were classified as catastrophic, while eight (25.8%) were categorized as major. The identification of these two categories of usability problems shows that a heuristic evaluation was needed so that the issues could be addressed before the app reached users. In addition, heuristic evaluations are low-cost tools that can be easily executed to assess and improve a system’s identified usability issues during the development phase [68].
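The percentages above follow directly from the issue counts in the Results section; as a quick arithmetic check (a sketch, not part of the study’s analysis):

```python
# Reproduce the reported proportions from the issue counts in Section 3.
total_issues = 31
catastrophic, major = 6, 8

def pct(n, total=total_issues):
    """Share of the total issue count, as a percentage rounded to 2 dp."""
    return round(100 * n / total, 2)

print(pct(catastrophic))  # 19.35
print(pct(major))         # 25.81 (reported as 25.8 in the text)
```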

In this study, more aesthetic problems were identified using the SMART heuristics. The SMART heuristics have two principles dealing with aesthetics (SMART6 and SMART13), covering overall interface aesthetics and icon aesthetics, respectively. By contrast, although Nielsen’s aesthetic and minimalist design heuristic mentions aesthetics, its description does not elaborate beyond minimal design. Two usability issues (issues 15 and 17, both related to icons) were not easily matched against Nielsen’s traditional heuristics because they are too general, and the researchers decided to classify these issues under aesthetic and minimalist design. In the SMART heuristics, by contrast, these two issues were clearly categorized under SMART13 (create an aesthetic and identifiable icon), even though another category, SMART6 (design a visually pleasing interface), is analogous to the traditional aesthetic and minimalist design heuristic. The aesthetics of the visual design of a mobile guide interface is crucial, as it can enhance visitors’ engagement [69]. The SMART heuristics present more detailed principles for evaluating aesthetics.

Usability issues related to consistency were the main problems found by the evaluators, and these can be mapped to both Nielsen’s heuristics (#4) and SMART principle #2. However, it is interesting to note that although both heuristics concern consistency and standards, the number of issues mapped to the Nielsen heuristic and the SMART principle is 8 and 11, respectively.
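Counts like these can be tallied directly from a table of issue-to-heuristic mappings. The sketch below uses hypothetical mappings (placeholder data, not the actual Table 3 contents) to illustrate the tallying step:

```python
from collections import Counter

# Hypothetical issue-id -> heuristic mappings for illustration only.
nielsen_map = {1: "Consistency and standards", 2: "Consistency and standards",
               3: "Visibility of system status"}
smart_map = {1: "SMART2", 2: "SMART2", 3: "SMART2"}

def tally(mapping):
    """Count how many issues were mapped to each heuristic."""
    return Counter(mapping.values())

print(tally(nielsen_map).most_common(1))  # [('Consistency and standards', 2)]
print(tally(smart_map)["SMART2"])         # 3
```

The same issue set can produce different per-heuristic counts under the two instruments, which is exactly the pattern observed for the consistency heuristics above.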

This study clearly showed the need for domain specific heuristic evaluation, as opposed to general usability heuristics, because there were difficulties in determining which heuristic was most appropriate for the issues identified, especially those involving features available only on smartphones. Joyce and Lilley [18] explained that these difficulties arise because Nielsen’s heuristics are too general to detect usability problems in current systems, which are more interactive, complex, and diverse. In addition, a study by Alsumait and Al-Osaimi [70] concluded that Nielsen’s heuristics are too general, insufficiently detailed, and unsuitable for domain specific applications. A similar finding by Pinelle, Wong, and Stach [71] explained that more usability issues were found when using a newly developed set of heuristics, owing to the complexity of the application. Other previous studies also support the need for domain specific usability heuristics (e.g., [58, 68]). Furthermore, Petrie and Power [72] highlight that it is inappropriate to compare problems faced by users in past decades with problems faced by users in the 2010s. Hence, categorization using the SMART principles is more apt because they were developed recently and take current technologies, i.e., smartphones and their applications, into consideration. Nielsen’s principles, on the other hand, were developed before the advent of smartphone technologies; hence their categorizations are more general.

5. Conclusions

This study compared two sets of heuristics, traditional and domain specific, for a mobile application for cultural heritage sites. It shows that domain specific measurement is more comprehensive and yields a more definite identification of usability issues. These issues can then be rectified so that the mobile guide application evolves into a more usable application before the system is deployed to the target users. This will help improve the acceptance of the application once deployed to users at Sarawak Cultural Village, through the development of a highly usable mobile guide application that enhances the user experience (UX) at such sites. Future work can focus on comparing domain specific heuristics in order to define heuristics for mobile guide applications for cultural heritage sites.

Data Availability

The data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare that there are no conflicts of interest regarding the publication of this paper.

Acknowledgments

The authors gratefully acknowledge the grant from Universiti Malaysia Sarawak (UNIMAS) (F04 (S148)/1128/2014(13)) and financial/nonfinancial support given by Sarawak Cultural Village (SCV).

References

  1. L. Tallon, “Introduction: Mobile, Digital and Personal,” in Digital Technologies and the Museum Experience: Handheld Guides and Other Media, L. Tallon and K. Walker, Eds., AltaMira Press, Lanham, MD, USA, 2008.
  2. M. Kenteris, D. Gavalas, and D. Economou, “Electronic mobile guides: a survey,” Personal and Ubiquitous Computing, vol. 15, no. 1, pp. 97–111, 2011.
  3. J. Jaén, J. Mocholí, J. M. Esteve et al., “MoMo: enabling social technology in hybrid museums,” in Proceedings of the International Workshop Re-Thinking Technology in Museums: Towards a New Understanding of People’s Experience in Museums, L. Ciolfi, M. Cooke, T. Hall, L. J. Bannon, and S. Olivia, Eds., pp. 245–251, University of Limerick, Limerick, Republic of Ireland, 2005.
  4. S. Filippini-Fantoni, “Personalization through IT in museums,” in Proceedings of the International Cultural Heritage Informatics Meeting (ICHIM 03), Museum Informatics Europe, Paris, France, 2003.
  5. S. Filippini-Fantoni, J. P. Bowen, and T. Numerico, “Personalization issues for science museum web sites and e-learning,” in E-Learning Methodologies and Computer Applications in Archaeology, pp. 371–387, 2008.
  6. S. Filippini-Fantoni and J. P. Bowen, “Mobile multimedia: reflections from ten years of practice,” in Digital Technologies and the Museum Experience: Handheld Guides and Other Media, L. Tallon and K. Walker, Eds., AltaMira Press, Lanham, MD, USA, 2008.
  7. J. Pallud, “Impact of interactive technologies on stimulating learning experiences in a museum,” Information and Management, vol. 54, no. 4, pp. 465–478, 2017.
  8. M. K. Othman, H. Petrie, and C. Power, “Engaging visitors in museums with technology: scales for the measurement of visitor and multimedia guide experience,” Lecture Notes in Computer Science, vol. 6949, pp. 92–99, 2011.
  9. M. F. Mazlan, A. Sivaji, S. S. Tzuaan, and A. M. Lokman, “Enhancing the heuristic evaluation (HE) by development and validation of a collaborative design measurement system (CDMS),” in Proceedings of the IEEE Colloquium on Humanities, Science and Engineering Research (CHUSER 2012), pp. 473–478, Kota Kinabalu, Sabah, Malaysia, 2012.
  10. V. Helyar, Usability Issues and User Perceptions of a 1st Generation WAP Service, 2000, http://experiencelab.typepad.com/files/wap-paper-2.pdf.
  11. K. Wac, S. Ickin, J.-H. Hong, L. Janowski, M. Fiedler, and A. K. Dey, “Studying the experience of mobile applications used in different contexts of daily life,” in Proceedings of the 1st ACM SIGCOMM Workshop on Measurements Up the Stack (W-MUST ’11), pp. 7–12, Canada, August 2011.
  12. D. Zhang and B. Adipat, “Challenges, methodologies, and issues in the usability testing of mobile applications,” International Journal of Human-Computer Interaction, vol. 18, pp. 293–308, 2005.
  13. R. Y. Gómez, D. C. Caballero, and J.-L. Sevillano, “Heuristic evaluation on mobile interfaces: a new checklist,” The Scientific World Journal, vol. 2014, Article ID 434326, 19 pages, 2014.
  14. Z. Tang, T. R. Johnson, R. D. Tindall, and J. Zhang, “Applying heuristic evaluation to improve the usability of a telemedicine system,” Telemedicine and e-Health, vol. 12, no. 1, pp. 24–34, 2006.
  15. F. Paz, F. A. Paz, D. Villanueva, and J. A. Pow-Sang, “Heuristic evaluation as a complement to usability testing: a case study in web domain,” in Proceedings of the 12th International Conference on Information Technology: New Generations (ITNG 2015), pp. 546–551, USA, April 2015.
  16. E. L. C. Law and E. T. Hvannberg, “Analysis of strategies for improving and estimating the effectiveness of heuristic evaluation,” in Proceedings of the Third Nordic Conference on Human-Computer Interaction, pp. 241–250, 2004.
  17. E. T. Hvannberg, E. L. C. Law, and M. K. Lárusdóttir, “Heuristic evaluation: comparing ways of finding and reporting usability problems,” Interacting with Computers, vol. 19, no. 2, pp. 225–240, 2006.
  18. G. Joyce and M. Lilley, “Towards the development of usability heuristics for native smartphone mobile applications,” in Design, User Experience, and Usability: Theories, Methods, and Tools for Designing the User Experience, pp. 465–474, Springer International Publishing, Switzerland, 2014.
  19. H. Muzaini, “Informal heritage-making at the Sarawak Cultural Village, East Malaysia,” Tourism Geographies, vol. 19, no. 2, pp. 244–264, 2017.
  20. J. Anderson, “Living history: simulating everyday life in living museums,” American Quarterly, vol. 34, no. 3, p. 290, 1982.
  21. P. Dellios, “The museumification of the village: cultural subversion in the 21st century,” Culture Mandala: The Bulletin of the Centre for East-West Cultural and Economic Studies, vol. 5, no. 1, 2002.
  22. S. C. Heilman and D. MacCannell, “The Tourist: A New Theory of the Leisure Class,” Social Forces, vol. 55, no. 4, p. 1104, 1977.
  23. C. Latrell, “Performance and place-making at Sarawak Cultural Village,” Journal of Ideas and Culture, vol. 10, no. 3, pp. 127–142, 2006.
  24. M. Hitchcock, N. Stanley, and K. C. Siu, “The South-east Asian ‘living museum’ and its antecedents,” in Heritage, Museums and Galleries: An Introductory Reader, vol. 291, 2005.
  25. J. Abi, “Visitors evaluation on facilities and services using importance-performance analysis at Sarawak Cultural Village,” in Proceedings of the 2nd Regional Conference on Tourism Research, p. 16, 2011.
  26. J. Abi, M. Mariapan, and A. Aziz, “Visitor’s evaluation on facilities and services using importance performance analysis at Sarawak Cultural Village,” IOSR Journal of Environmental Science, Toxicology and Food Technology, vol. 9, no. 12, pp. 16–24, 2015, http://www.iosrjournals.org/iosr-jestft/papers/vol9-issue12/Version-1/D091211624.pdf.
  27. A. Charland and B. Leroux, “Mobile application development: web vs. native,” Communications of the ACM, vol. 54, no. 5, pp. 49–53, 2011.
  28. M. Hassenzahl and N. Tractinsky, “User experience—a research agenda,” Behaviour & Information Technology, vol. 25, no. 2, pp. 91–97, 2006.
  29. A. Civan, W. Jones, P. Klasnja, and H. Bruce, Better to Organize Personal Information by Folders or by Tags? The Devil Is in the Details, vol. 45, American Society for Information Science and Technology, Silver Spring, MD, USA, 2009.
  30. H. L. O’Brien and E. G. Toms, “What is user engagement? A conceptual framework for defining user engagement with technology,” Journal of the Association for Information Science and Technology, vol. 59, no. 6, pp. 938–955, 2008.
  31. M. Hassenzahl, “User experience (UX): towards an experiential perspective on product quality,” in Proceedings of the 20th Conference on l’Interaction Homme-Machine (IHM ’08), pp. 11–15, ACM, Metz, France, September 2008.
  32. E. L.-C. Law, V. Roto, M. Hassenzahl, A. P. O. S. Vermeeren, and J. Kort, “Understanding, scoping and defining user experience: a survey approach,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’09), Boston, MA, USA, 2009.
  33. S. Kujala, V. Roto, K. Väänänen-Vainio-Mattila, E. Karapanos, and A. Sinnelä, “UX Curve: a method for evaluating long-term user experience,” Interacting with Computers, vol. 23, no. 5, pp. 473–483, 2011.
  34. E. Karapanos, J. Zimmerman, J. Forlizzi, and J.-B. Martens, “User experience over time: an initial framework,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Boston, MA, USA, 2009.
  35. E. Karapanos, J.-B. Martens, and M. Hassenzahl, “Reconstructing experiences with iScale,” International Journal of Human-Computer Studies, vol. 70, no. 11, pp. 849–865, 2012.
  36. I. Pettersson, The Temporality of In-Vehicle User Experience: Exploring User Experiences from Past to Future, Chalmers University of Technology, 2016.
  37. G. Joyce, M. Lilley, T. Barker, and A. Jefferies, “Evaluating the impact of changing contexts on mobile application usability within agile environments,” in Proceedings of the 2016 Future Technologies Conference (FTC 2016), pp. 476–480, USA, December 2016.
  38. C. C. Huang, Describing and Analyzing Interactive Experience over Time [Ph.D. thesis], Indiana University, 2015.
  39. C. C. Huang and E. Stolterman, “Temporal anchors in user experience research,” in Proceedings of the 2014 Conference on Designing Interactive Systems, pp. 271–274, New York, NY, USA, 2014.
  40. S. Kujala and T. Miron-Shatz, “The evolving role of expectations in long-term user experience,” in Proceedings of the 19th International Academic MindTrek Conference, pp. 167–174, New York, NY, USA, 2015.
  41. J. Varsaluoma and F. Sahar, “Usefulness of long-term user experience evaluation to product development: practitioners’ views from three case studies,” in Proceedings of the 8th Nordic Conference on Human-Computer Interaction (NordiCHI 2014), pp. 79–88, Finland, October 2014.
  42. N. Proctor, “Providing deaf and hard-of-hearing visitors with on-demand independent access to museum information and interpretation through handheld computers,” in Museums and the Web, J. Trant and D. Bearman, Eds., Archives & Museum Informatics, Toronto, Canada, 2004.
  43. R. Agarwal and V. Venkatesh, “Assessing a firm’s web presence: a heuristic evaluation procedure for the measurement of usability,” Information Systems Research, vol. 13, no. 2, pp. 168–186, 2002.
  44. R. Y. Gómez, D. C. Caballero, and J.-L. Sevillano, “Heuristic evaluation on mobile interfaces: a new checklist,” The Scientific World Journal, vol. 2014, Article ID 434326, 2014.
  45. L. A. Lockwood and L. L. Constantine, “Usability by inspection: collaborative techniques for software and web applications,” in Proceedings of forUSE 2003, pp. 253–282, 2003.
  46. J. Nielsen, “Reliability of severity estimates for usability problems found by heuristic evaluation,” in Posters and Short Talks of the 1992 SIGCHI Conference, p. 129, Monterey, CA, USA, May 1992.
  47. R. Inostroza, C. Rusu, S. Roncagliolo, and V. Rusu, “Usability heuristics for touchscreen-based mobile devices: update,” in Proceedings of the 2013 Chilean Conference on Human-Computer Interaction (ChileCHI 2013), pp. 24–29, Chile, November 2013.
  48. G. Joyce, M. Lilley, T. Barker, and A. Jefferies, “Smartphone app usability evaluation: the applicability of traditional heuristics,” in Proceedings of the International Conference on Design, User Experience and Usability (DUXU 2015), pp. 541–550, Los Angeles, CA, USA, 2015.
  49. G. Reynaga, S. Chiasson, and P. C. Van Oorschot, “Heuristics for the evaluation of captchas on smartphones,” in Proceedings of the British HCI Conference (British HCI 2015), pp. 126–135, UK, July 2015.
  50. R. H. Y. Fung, D. K. W. Chiu, E. H. T. Ko, K. K. W. Ho, and P. Lo, “Heuristic usability evaluation of University of Hong Kong Libraries’ mobile website,” Journal of Academic Librarianship, vol. 42, no. 5, pp. 581–594, 2016.
  51. J. Mankoff, A. K. Dey, G. Hsieh, J. Kientz, S. Lederer, and M. Ames, “Heuristic evaluation of ambient displays,” in Proceedings of the CHI 2003 Conference on Human Factors in Computing Systems, pp. 169–176, USA, April 2003.
  52. P. A. Silva, P. Jordan, and K. Holden, “Something old, something new, something borrowed: gathering experts’ feedback while performing heuristic evaluation with a list of heuristics targeted at older adults,” in Proceedings of the 11th Advances in Computer Entertainment Technology Conference (ACE 2014 Workshops), Portugal, November 2014.
  53. P. A. Silva, K. Holden, and A. Nii, “Smartphones, smart seniors, but not-so-smart apps: a heuristic evaluation of fitness apps,” in Proceedings of the International Conference on Augmented Cognition, pp. 347–358, Springer, Cham, Switzerland, 2014.
  54. D. E. Chisnell, J. C. Redish, and A. Lee, “New heuristics for understanding older adults as web users,” Technical Communication, vol. 53, no. 1, pp. 39–59, 2006.
  55. S. Kurniawan and P. Zaphiris, “Research-derived web design guidelines for older people,” in Proceedings of the 7th International ACM SIGACCESS Conference on Computers and Accessibility (ASSETS 2005), pp. 129–135, USA, October 2005.
  56. J. Díaz, C. Rusu, and C. A. Collazos, “Experimental validation of a set of cultural-oriented usability heuristics: e-commerce websites evaluation,” Computer Standards & Interfaces, vol. 50, pp. 160–178, 2017.
  57. R. Inostroza, C. Rusu, S. Roncagliolo, V. Rusu, and C. A. Collazos, “Developing SMASH: a set of SMArtphone’s uSability Heuristics,” Computer Standards & Interfaces, vol. 43, pp. 40–52, 2016.
  58. E. Bertini, S. Gabrielli, and S. Kimani, “Appropriating and assessing heuristics for mobile computing,” in Proceedings of the Working Conference on Advanced Visual Interfaces, pp. 119–126, 2006.
  59. L. H. Salazar, T. Lacerda, J. V. Nunes, and C. Gresse von Wangenheim, “A systematic literature review on usability heuristics for mobile phones,” International Journal of Mobile Human Computer Interaction, vol. 5, no. 2, pp. 50–61, 2013.
  60. A. Salgado and A. Freire, “Heuristic evaluation of mobile usability: a mapping study,” in Human-Computer Interaction: Applications and Services, vol. 8512 of Lecture Notes in Computer Science, pp. 178–188, 2014.
  61. J. Nielsen and R. L. Mack, Usability Inspection Methods, John Wiley & Sons, New York, NY, USA, 1994.
  62. T. Vithani and A. Kumar, “Modeling the mobile application development lifecycle,” in Proceedings of the International MultiConference of Engineers and Computer Scientists, Hong Kong, 2014.
  63. M. K. Othman, Measuring Visitors Experiences with Mobile Guide Technology in Cultural [Ph.D. thesis], University of York, 2012.
  64. S. Gray, C. Ross, A. Hudson-Smith, M. Terras, and C. Warwick, “Enhancing museum narratives with the QRator Project: a Tasmanian devil, a platypus and a dead man in a box,” in Proceedings of Museums and the Web 2012, Archives & Museum Informatics, Toronto, Canada, 2012.
  65. C. Bailey-Ross, S. Gray, J. Ashby, M. Terras, A. Hudson-Smith, and C. Warwick, “Engaging the museum space: mobilizing visitor engagement with digital content creation,” in Digital Scholarship in the Humanities, 2016.
  66. S. Boiano, J. P. Bowen, and G. Gaia, “Usability, design and content issues of mobile apps for cultural heritage promotion: the Malta culture guide experience,” https://arxiv.org/abs/1207.3422.
  67. J. Nielsen, “Heuristic evaluation,” in Readings in Human-Computer Interaction, vol. 17, Elsevier, 1994.
  68. L. Kuparinen, J. Silvennoinen, and H. Isomäki, “Introducing usability heuristics for mobile map applications,” in Proceedings of the 26th International Cartographic Conference (ICC 2013), Dresden, Germany, 2013.
  69. M. K. Othman, K. I. Idris, S. Aman, and P. Talwar, “An empirical study of visitors’ experience at Kuching Orchid Garden with mobile guide application,” Advances in Human-Computer Interaction, vol. 2018, Article ID 5740520, 14 pages, 2018.
  70. A. Alsumait and A. Al-Osaimi, “Usability heuristics evaluation for child e-learning applications,” Journal of Software, vol. 5, no. 6, pp. 654–661, 2010.
  71. D. Pinelle, N. Wong, and T. Stach, “Heuristic evaluation for games: usability principles for video game design,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (CHI ’08), pp. 1453–1462, April 2008.
  72. H. Petrie and C. Power, “What do users really care about? A comparison of usability problems found by users and experts on highly interactive websites,” in Proceedings of the 30th ACM Conference on Human Factors in Computing Systems (CHI 2012), pp. 2107–2116, USA, May 2012.