Journal of Computer Networks and Communications
Volume 2013 (2013), Article ID 165146, 28 pages
http://dx.doi.org/10.1155/2013/165146
Review Article

Survey and Challenges of QoE Management Issues in Wireless Networks

1Ministry of Security of Bosnia and Herzegovina, Trg BiH 1, 71000 Sarajevo, Bosnia and Herzegovina
2Faculty of Electrical Engineering and Computing, University of Zagreb, Unska 3, 10000 Zagreb, Croatia

Received 7 September 2012; Revised 8 December 2012; Accepted 12 December 2012

Academic Editor: Raimund Schatz

Copyright © 2013 Sabina Baraković and Lea Skorin-Kapov. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

With the move towards converged all-IP wireless network environments, managing end-user Quality of Experience (QoE) poses a challenging task, aimed at meeting high user expectations and requirements regarding reliable and cost-effective communication, access to any service, anytime and anywhere, and across multiple operator domains. In this paper, we give a survey of state-of-the-art research activities addressing the field of QoE management, focusing in particular on the domain of wireless networks and addressing three management aspects: QoE modeling, monitoring and measurement, and adaptation and optimization. Furthermore, we identify and discuss the key aspects and challenges that need to be considered when conducting research in this area.

1. Introduction

Wireless mobile communications have experienced phenomenal growth throughout the last decades, going from support for circuit-switched voice services and messaging services to IP-based mobile broadband services using High Speed Packet Access (HSPA), Worldwide Interoperability for Microwave Access (WiMAX), and Long-Term Evolution (LTE) Radio Access networks [1]. Increasingly, mobile applications and services are being used in daily life activities in order to support the needs for information, communication, or leisure [2]. Mobile users are requiring access to a wide spectrum of various multimedia applications/services without being limited by constraints such as time, location, technology, device, and mobility restrictions. This represents the outcome of the currently leading trend and future aim in the telecommunications domain: the convergence between fixed and mobile networks, and the integration of existing and new wireless technologies. Such integrations aim to satisfy mobile users’ requirements in terms of providing access to any service, along with reliable and cost-effective communication, anytime and anywhere, over any medium and networking technology, and across multiple operator domains [3].

The ITU has specified the Next Generation Network (NGN) as a generic framework for enabling network convergence and realizing the aforementioned requirements [4]. The NGN concept is centered around a heterogeneous infrastructure of various access, transport, control, and service solutions, merged into a single multimedia-rich service provisioning environment. Today, an increasing number of mobile operators are migrating their networks in line with the 3GPP-specified Evolved Packet System (EPS), consisting of a multiaccess IP-based core network referred to as the Evolved Packet Core (EPC), and a new LTE 4G radio access network based on Orthogonal Frequency Division Multiplexing (OFDM) [1, 5, 6]. While the network-controlled and class-based Quality of Service (QoS) concept of the EPC is based on the 3GPP Policy and Charging Control (PCC) framework [7–9] (discussed further in Section 4), intense recent research in the area of Quality of Experience (QoE) has shown that such QoS mechanisms may need to be complemented with more user-centric approaches in order to truly meet end-user requirements and expectations.

Today, humans are quality meters, and their expectations, perceptions, and needs with respect to a particular product, service, or application carry a great value [10]. While the ITU-T has defined QoE as the “overall acceptability of an application or service, as perceived subjectively by the end user” [11], ETSI defines QoE as “a measure of user performance based on both objective and subjective psychological measures of using an ICT service or product” [12] and extends QoE beyond subjective to include objective psychological measures.

QoE is therefore considered to be a multidimensional construct, encompassing both objective (e.g., performance-related) and subjective (e.g., user-related) aspects [13]. As such, QoE has been considered in relation both to QoS, which is primarily a technical, objective, and technology-oriented concept, and to User Experience (UX) [14], which is generally considered a more user-oriented concept. The former focuses on the impact of network and application performance on user quality perception, while the latter primarily deals with individual users’ experiences derived from encounters with systems, impacted by expectations, prior experiences, feelings, thoughts, context, and so forth.

Various approaches such as [15–19] provide definitions of QoE that are closely tied to technology-centric logic, failing to account for the subjective character of human experience and lacking consideration of a broader definition of QoE [20]. Such definitions stem from the assumption that optimizing QoS-related parameters will automatically increase overall QoE, leading to swift adoption of products and services on the consumption side. However, QoS is only a subset of the overall QoE scope. Higher QoS will in many cases likely result in higher QoE, but fulfilling all traffic-related QoS requirements will not necessarily guarantee high user QoE. Moreover, products and services that meet users’ requirements and expectations, and that allow them to achieve high QoE in their personal context, will probably be more successful than products and services that offer higher QoS but fail to meet users’ high demands and expectations [21].

A recent definition that has emerged from the EU Qualinet community (COST Action IC1003: “European Network on Quality of Experience in Multimedia Systems and Services”) encompasses the discussed aspects and defines QoE as “the degree of delight or annoyance of the user of an application or service. It results from the fulfillment of his or her expectations with respect to the utility and/or enjoyment of the application or service in the light of the user’s personality and current state. In the context of communication services, QoE is influenced by service, content, device, application, and context of use” [22].

In the context of converged all-IP wireless networks, an important consideration is the impact of mobility on user QoE, further related to mechanisms for assuring session and connection continuity. Session establishment delays are impacted by authorization and authentication procedures, as well as session establishment signaling. Handover may impose additional delays and packet losses, resulting in potential loss of session-related content and disrupted communication. Hence, mechanisms are necessary for achieving seamless session continuity and minimizing disruption time [21]. Besides the challenges posed by various types of mobility (terminal, session, user, service) and by seamless handover between networks that use the same technology (horizontal handover) or networks that use different technologies (vertical handover), the rapid development of new and complex mobile multimedia services deliverable via a variety of new mobile devices (smartphones, tablets, etc.) poses an additional challenge in the QoE provisioning process [3]. In the context of wireless systems, limitations arising from both device and transmission channel characteristics have a clear impact on user quality perception [23].

The overall goal of QoE management may be related to optimizing end-user QoE (end-user perspective), while making efficient (current and future) use of network resources and maintaining a satisfied customer base (provider perspective). In order to successfully manage QoE for a specific application, it is necessary to understand and identify multiple factors affecting it (subjective and objective) from the point of view of various actors in the service provisioning chain, and how they impact QoE. Resulting QoE models dictate the parameters to be monitored and measured, with the ultimate goal being effective QoE optimization strategies. Therefore, the overall process of QoE management may be broken down into three general steps: (1) QoE modeling, (2) QoE monitoring and measurements, and (3) QoE optimization and control [24].

With the implementation of successful QoE management, users will benefit with satisfied requirements/expectations and may be further inclined to adopt new complex services and support further technology development. Furthermore, QoE management is very important for all actors and stakeholders involved in the service provisioning chain: device manufacturers, network providers, service providers, content providers, cloud providers, and so forth. In today’s highly competitive environment, where providers’ price levels are decreasing and pricing schemes are becoming more similar [25], it is not enough to simply make services available to users, who further have the option of choosing from a spectrum of various providers. Actors involved in the process of service provisioning have identified the need to work towards meeting users’ requirements and expectations by maximizing users’ satisfaction with the overall perceived service quality, while at the same time minimizing their costs. Understanding and managing QoE is needed in order to react quickly to quality problems (preferably) before customers perceive them. Hence, successful QoE management offers stakeholders a competitive advantage in the fight to prevent customer churn and attract new customers.

Based on the previous discussion, it may be concluded that QoE is a fast emerging multidisciplinary field based on social psychology, cognitive science, economics, and engineering science, focused on understanding overall human quality requirements [10]. Consequently, the management of QoE, as a highly complex issue, requires an interdisciplinary view spanning user, technology, context, and business aspects, with flexible cooperation between all players and stakeholders involved in the service provisioning chain.

In this paper we give a survey of approaches and solutions related to QoE management, focusing in particular on wireless network environments. It is important to note that different QoE models and assessment methodologies are applicable for different types of services (e.g., conversational voice services, streaming audio-visual services, interactive data services, collaborative services). We do not focus on a particular service but rather give a general survey of approaches. The paper is organized as follows. Section 2 surveys the modeling of QoE by discussing the classification of a wide range of QoE influence factors into certain dimensions and describing existing general QoE models. The monitoring and measurement of QoE is described in Section 3, while Section 4 discusses the topic of QoE optimization and control. Finally, Section 5 concludes the paper by pointing out the challenges and open research issues in the field of QoE management.

2. QoE Modeling

As a prerequisite to successful QoE management, there is a need for a deep and comprehensive understanding of the influencing factors (IFs) and multiple dimensions of human quality perception. QoE modeling aims to model the relationship between different measurable QoE IFs and quantifiable QoE dimensions (or features) for a given service scenario. Such models serve the purpose of making QoE estimations, given a set of conditions, corresponding as closely as possible to the QoE as perceived by end users. Based on a given QoE model specifying a weighted combination of QoE dimensions and a further mapping to IFs, a QoE management approach will then aim to derive Key Quality Indicators (KQIs) and their relation to measurable parameters, along with quality thresholds, for the purpose of fulfilling a set optimization goal (e.g., maximizing QoE to maximize profit, maximizing the number of “satisfied” customers). An important issue to note is that different actors involved in the service provisioning chain will use a QoE model in different ways, focusing on those parameters over which a given actor has control (e.g., a network provider will consider how QoS-related performance parameters impact QoE, while a content or service provider will be interested in how service design or usability impacts QoE).

In this section we first discuss QoE IFs in general, and give an overview and comparison of general QoE modeling approaches, discussing in turn their applicability with respect to QoE management strategies. We then further consider more concretely QoE models targeted specifically towards wireless networks, highlighting the differences with respect to fixed networks. We end the section with a summary of QoE modeling challenges.

2.1. QoE Influence Factors

A QoE IF has been defined as “any characteristic of a user, system, service, application, or context whose actual state or setting may have influence on the Quality of Experience for the user” [22]. Figure 1 illustrates a multitude of different factors which may be considered in relation to QoE, making it clear that their grouping into categories aids in identifying such factors in a systematic way. Several existing approaches have addressed this issue and proposed classifications of QoE IFs into multiple dimensions. It should be noted that specific IFs are relevant for different types of services and applications.

Figure 1: Different factors to be considered in relation to QoE.

Stankiewicz and Jajszczyk [3] have classified the technology-oriented factors that impact QoE into three groups: QoS factors, Grade of Service (GoS) factors, and Quality of Resilience (QoR) factors, arguing that provisioning these at an appropriate level is crucial for achieving high QoE. They also take into consideration a number of additional, mostly nontechnology-related, factors such as emotions, user profile, pricing policy, application-specific features, terminals, codecs, type of content, and environmental, psychological, and sociological aspects, but do not group them further. On the other hand, Baraković et al. [21] have categorized QoE influence factors into five dimensions: (1) technology performance on four levels: application/service, server, network, and device; (2) usability, referring to users’ behavior when using the technology; (3) subjective evaluation; (4) expectations; and (5) context. Recently, Skorin-Kapov and Varela [26] have proposed the ARCU model that groups QoE factors into four multidimensional IF spaces: Application (application configuration-related factors), Resource (network/system-related factors), Context, and User spaces.

Finally, a recent classification that has emerged from the EU Qualinet community in the form of a White Paper [22] groups QoE IFs into the following three categories (each additionally divided into several subcategories, as described in the referenced white paper).
(i) “Human IFs present any variant or invariant property or characteristic of a human user. The characteristic can describe the demographic and socioeconomic background, the physical and mental constitution, or the user’s emotional state” (e.g., user’s visual and auditory acuity, gender, age, motivation, education background, emotions).
(ii) “System IFs refer to properties and characteristics that determine the technically produced quality of an application or service. They are related to media capture, coding, transmission, storage, rendering, and reproduction/display, as well as to the communication of information itself from content production to user” (e.g., bandwidth, delay, jitter, loss, throughput, security, display size, resolution).
(iii) “Context IFs are defined as factors that embrace any situational property to describe the user’s environment in terms of physical, temporal, social, economic, task, and technical characteristics” (e.g., location, movements, time of day, costs, subscription type, privacy).

2.1.1. Relationships between QoS and QoE

Among the wide scope of discussed IFs, a great deal of research has focused in particular on identifying the relationships between QoS parameters and QoE, whereby in many cases a user’s perceived quality has been argued to depend mostly on QoS [27–29]. In studies focused on the mathematical interdependency of QoE and QoS, Reichl et al. [28, 30] have identified a logarithmic relationship between QoE and QoS. They argue that it can be explained by means of the Weber-Fechner Law (WFL) [31], which quantitatively studies the perceptive abilities of the human sensory system in response to a physical stimulus, and which states that the just noticeable difference between two levels of a certain stimulus is proportional to the magnitude of the stimuli. A logarithmic relationship thus expresses the sensitivity of QoE to changes in QoS as a reciprocal function of the QoS level.
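Such a logarithmic law can be sketched as follows; the parameter values below are hypothetical placeholders for illustration, not fit values from the cited studies:

```python
import math

def qoe_logarithmic(qos_impairment, alpha=4.5, beta=1.2):
    """Weber-Fechner-style mapping: QoE (on a 1-5 MOS-like scale)
    decreases logarithmically as a QoS impairment (e.g., waiting
    time in seconds) grows. alpha and beta are hypothetical fit
    parameters that would, in practice, come from subjective tests."""
    return min(5.0, max(1.0, alpha - beta * math.log(qos_impairment)))
```

Because the derivative of the logarithm is 1/x, equal relative (not absolute) increases in the impairment produce equal drops in estimated QoE, which is exactly the WFL intuition.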

On the other hand, the IQX hypothesis presented and evaluated by Fiedler et al. in [29] formulates the sensitivity of QoE as an exponential function of a single QoS impairment factor. The underlying assumption is that the change of QoE depends on the actual level of QoE. Understanding the relationship between network-based QoS parameters and user-perceived QoE provides important input for the QoE management process, in particular to network providers with control over network resource planning and provisioning mechanisms.
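The IQX hypothesis can likewise be sketched in a few lines; again, the parameter values are illustrative assumptions rather than values from [29]:

```python
import math

def qoe_iqx(impairment, alpha=3.0, beta=0.4, gamma=1.0):
    """IQX hypothesis: QoE decays exponentially with a single QoS
    impairment (e.g., packet loss ratio in percent). Equivalently,
    dQoE/dx = -beta * (QoE - gamma), i.e., the change in QoE depends
    on the current QoE level. alpha, beta, and gamma are hypothetical
    fit parameters."""
    return alpha * math.exp(-beta * impairment) + gamma
```

Note the qualitative difference from the logarithmic model: here QoE is most sensitive when it is still high (small impairments), and flattens toward the floor value gamma as the impairment grows.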

2.2. From Subjective Quality Assessment to Objective Quality Estimation Models

When building a QoE model, quality assessment methodologies must be employed. While actual “ground-truth” user perceived quality may be obtained only via subjective assessment methodologies, the goal is to use subjective tests as a basis for building objective QoE models capable of estimating QoE based solely on objective quality measurements. Hence, we shortly describe subjective quality assessment methodologies and different types of objective quality assessment methods and models.

2.2.1. Subjective QoE Assessment

Subjective quality assessments are based on psychoacoustic/visual experiments, which represent the fundamental and most reliable way to assess users’ QoE, although they are complex and costly. These methods have been investigated for many years and have enabled researchers to gain a deeper understanding of the subjective dimensions of QoE. Most commonly, the outcomes of a subjective experiment are quality ratings from users obtained during use of the service (in-service) or after service use (out-of-service), which are then averaged into Mean Opinion Scores (MOSs). This approach is specified in ITU-T Recommendation P.800.1 [32] and expresses the average quality rating of a group of users by means of standardized scales. The use of MOS has been criticized for a number of reasons [33], and it has been complemented by other ITU-recommended subjective assessment procedures, classified by type of application and media [34]. The interested reader is referred to the large number of standards referenced in [34].
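As a minimal illustration of how such ratings are aggregated, the sketch below averages per-user ratings given on the five-point absolute category rating scale into a MOS; the example ratings in the usage note are made-up data:

```python
def mean_opinion_score(ratings):
    """Average individual quality ratings, given on the ITU-T
    five-point scale (1 = bad ... 5 = excellent), into a single
    Mean Opinion Score for one test condition."""
    if not ratings:
        raise ValueError("at least one rating is required")
    if not all(1 <= r <= 5 for r in ratings):
        raise ValueError("ratings must lie on the 1-5 scale")
    return sum(ratings) / len(ratings)
```

For example, ratings of [4, 4, 3, 5, 4] from five subjects yield a MOS of 4.0. The criticism noted above stems partly from exactly this averaging, which discards the distribution of opinions across users.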

In addition to standardized subjective QoE assessment methods, additional (sometimes complementary) relevant methods used for long-term user experience assessment have been used. In their studies involving QoE evaluations of mobile applications, Wac et al. [35] have collected users’ QoE ratings on their mobile phones via an Experience Sampling Method (ESM) [36] several times per day, while a Day Reconstruction Method (DRM) [37] has been used to interview users on a weekly basis regarding their usage patterns and experiences towards the mobile applications. These methods have served to analyze possible relations and causalities between QoE ratings, QoS, and context.

With regard to data collection and the running of QoE experiments, assessments may be conducted in a laboratory setting [38], in a living lab environment [20], or in an actual real-world environment [27, 35]. Typically, some performance criteria are varied over a given range in a controlled fashion, and users’ opinions regarding the service performance are subsequently quantified. As an emerging and promising approach to obtaining a large number of ratings in a real-world environment, crowdsourcing methodology [39, 40] has been studied and utilized.

2.2.2. Objective QoE Models

Objective QoE models are defined as the means for estimating subjective quality solely from objective quality measurements or indices [41]. In other words, these models are expected to provide an indication approximating the rating that would be obtained from subjective assessment methods. Different types of objective quality estimation and prediction models have been developed, each with its proper domain of application and the range of system or service conditions it was designed for. Since no universal objective quality assessment approach exists, proposed ones can be categorized by various criteria in order to determine their application area [34]: (1) application scope; (2) quality features being predicted; (3) considered network components and configurations; (4) model input parameters; (5) measurement approach; (6) level of service interactivity that can be assessed; and (7) level to which psychophysical knowledge or empirical data have been incorporated.

According to the level at which the input information is extracted, there are five types of objective QoE models [34]: (1) media-layer; (2) packet-layer; (3) bitstream-layer; (4) hybrid; and (5) planning models. Media-layer models [42–45] estimate audio/video QoE by using the actual media signals as their input. Depending on the extent to which the source signal is used, three different approaches exist [46, 47]: (1) no-reference (NR) models; (2) reduced-reference (RR) models; and (3) full-reference (FR) models. Packet-layer models [48], on the other hand, utilize only packet header information for QoE estimation, which makes them suitable for in-service, nonintrusive quality monitoring. A bitstream-layer model combines the two previously mentioned models, since it utilizes bitstream information as well as packet header information. Similarly, hybrid models [49–51] are conceived as a combination of the previously described three model types. Finally, planning models do not acquire their input information from an existing service, but estimate it from service information available during the planning phase.
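To make the full-reference idea concrete, the sketch below computes PSNR, one of the simplest FR media-layer metrics, by comparing a degraded signal sample by sample against the pristine reference. Standardized media-layer models are considerably more elaborate; this is only an illustration of the FR principle:

```python
import math

def psnr(reference, degraded, peak=255.0):
    """Full-reference (FR) comparison: requires the original signal,
    so it cannot be used for in-service, nonintrusive monitoring
    (where packet-layer or no-reference models apply instead)."""
    if len(reference) != len(degraded):
        raise ValueError("signals must have equal length")
    mse = sum((r - d) ** 2 for r, d in zip(reference, degraded)) / len(reference)
    if mse == 0:
        return float("inf")  # identical signals: no distortion
    return 10.0 * math.log10(peak ** 2 / mse)
```

RR models would replace the full `reference` argument with a small set of features extracted from it, while NR models would drop the reference entirely and judge the degraded signal alone.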

In addition to ITU-T standards, ETSI gives a comprehensive guide with generic definitions and test methods for most of the key telecommunication services [52]. There are a number of other standardization bodies that deal with QoE assessment, including VQEG, MPEG, and JPEG.

While most of the current literature considers objective measures in relation to technology oriented collections of data, it is important to note that objective measurements may also refer to objective estimations of user’s behavior (e.g., task duration, number of mouse clicks) which is commonly considered only as subjective [53].

2.3. A Survey of General QoE Modeling Approaches

Having briefly described in the previous subsection the subjective assessment methods that may contribute to building objective QoE models, we now survey a number of general QoE modeling approaches in order to obtain a broader picture of this topic. This subsection therefore provides an analysis of several general QoE models that were validated with various types of services. While the listed models are not all strictly limited to wireless environments, they aim to identify numerous QoE IFs and provide mechanisms for relating them to QoE.

Table 1 summarizes these QoE models based on the following comparison parameters: IFs according to categorization in [22], type of service, consideration of wireless aspects, provisioning of the concrete QoE model, and applicability with respect to QoE management. This comparison provides an extension and modification of the analysis given by Laghari et al. in [10]. We consider more concretely QoE models targeted specifically for wireless environments in the following subsection.

Table 1: Comparison of QoE modeling approaches.

Perkis et al. in [54] present a model for measuring the QoE of multimedia services, distinguishing between measurable and nonmeasurable parameters. In other words, the approach does not provide any QoE metric relationship formulae but rather addresses the factors that influence users’ QoE. The measurable model parameters are closely related to the technology aspects of the terminal and service, while the nonmeasurable entities are closely related to the user’s perception of a service, his/her expectations, and behavior. Additionally, a framework for quantifying the model parameters is described and validated with Video on Demand (VoD) and mobile TV services in a mobile 3G environment. Although the authors categorize the parameters by accounting for QoS, QoE, and business aspects, and thereby encompass human and system parameters, the model does not fully include the context dimension. However, the proposed modeling framework provides the input information for the measurement process and thereby contributes to overall QoE management by aiding the various parties in improving their performance.

A model that does not clearly encompass all QoE dimensions, but rather considers them in a limited manner, is introduced by Kim et al. [55]. The In-service Feedback QoE Framework (IFQF) is a user-triggered scheme aimed at investigating the main reasons for quality deterioration, thereby contributing to the overall QoE management process. The architecture consists of four agents (server, network, user, and management) that gather information and form a feedback loop to determine the reason for and location of faults, and thereby to minimize the difference between the QoE value estimated by operators and the real QoE (as subjectively perceived by the user).

In [56], Kilkki proposed a framework that identifies the relationship between QoS and QoE, but does not explicitly consider QoE components in detail. The framework connects different research communities, including engineers, economists, and behavioral scientists. The author makes a strong case for a holistic approach to QoE and suggests the establishment of a multidisciplinary research group which would address the complexity of QoE. Additionally, key terms in the communication ecosystem are stated, but no classification of QoE factors or any details on the taxonomy are provided. However, the framework introduces new concepts such as Quality of User Experience (QoUE) and Quality of Customer Experience (QoCE).

In contrast to the previous approaches, which considered various QoE factors only partially or in a more abstract fashion, Möller et al. [57] have developed a detailed taxonomy of the most relevant QoS and QoE aspects, focusing on multimodal human-machine interaction and the factors influencing its QoS. The taxonomy consists of three layers: (1) the QoS-influencing factors related to the user, the system, and the context of usage; (2) the QoS interaction performance aspects describing user and system behavior and performance; and (3) the QoE aspects related to the quality perception and judgment processes taking place within the user. Like the previously described approaches, this one does not provide any concrete formulation of relationships among QoE metrics, but it recognizes the need for one, with corresponding weights assigned to QoE IFs, in order to contribute to target-oriented design and QoE optimization in future systems. The detailed taxonomy developed in this work is expected to aid in producing such a concrete formulation.

As in [56], Geerts et al. [58] have taken a multidisciplinary approach and included researchers from backgrounds such as sociology, communication science, psychology, software development, and computer science in order to create a comprehensive framework. The proposed model consists of four components: user, ICT product, use process, and context. Each component is divided into several subcategories, which then encompass all three aforementioned QoE IF categories. Although the proposed approach does not introduce a weighted QoE formulae, it aims to provide a detailed look at the different components of QoE offering concrete information on how they can be measured.

Another approach accounting for all QoE IF categories is proposed by Laghari et al. [10]. The authors have proposed a high-level QoE model that can be adapted to many specific contexts. It consists of four domains, that is, sets of knowledge, activity, or influence in the proposed model: human, context, technology, and business. The model therefore addresses QoE from multiple aspects, though it can be noted that it is more subjectively oriented towards the human domain. The framework also defines the main interactions between the domains: human-context, human-technology, human-business, technology-business, and context-techno-business, and presents causal relationships between domain characteristics. In other words, the presented formulation relates QoE (a set of outcome factors) through a “cause-effect” relationship directly affected by prediction factors (e.g., technological, business, or contextual characteristics) and indirectly by mediating factors (e.g., associations between the aforementioned factors). Additionally, by providing a well-structured, detailed taxonomy of QoE-relevant variables and formulating the causal relationships between them, this approach aids various interested parties in comprehending and managing QoE in a broader manner.

Volk et al. [59] present a novel approach to QoE modeling and assurance in an NGN Service Delivery Environment (SDE). The proposed model is context aware and comprises a comprehensive set of quality-related parameters available throughout various information factories of the NGN and accessible by employing standardized procedures within the NGN SDE. QoS and various human perception components are addressed. Furthermore, parameter selection and mapping definitions are established vertically from the transport layer through the application layer to the end-user layer, and horizontally with concatenation of point-to-point QoS and end-to-end QoE.

De Moor et al. [20] propose a framework that enables the evaluation of multidimensional QoE in a mobile testbed-oriented living lab setting. The model consists of a distributed architecture for monitoring the network QoS, context information, and subjective user experience based on the functional requirement related to real-time experience measurements in real-life settings. The architecture allows the study and understanding of cross-contextual effects, the assessment of the relative importance of parameters, and the development of a basic algorithmic QoE model.

Although Song et al. [60] have not proposed exact formulae for QoE calculation, they have organized QoE IFs into three components (user, system, and context) and mapped their impacts onto four elements of the mobile video delivery framework, namely, mobile user, mobile device, mobile network, and mobile video service, since the model is created for a mobile video environment. User-centered design of mobile video may benefit from this model, as may mobile video vendors seeking to develop effective strategies to improve users’ experience.

Finally, the framework discussed by Reichl et al. [28] is aimed at improved modeling, measurement, and management of QoE for mobile broadband services (e.g., mobile Web browsing, file download). The authors develop a model for predicting the QoE of network services based on a layered approach distinguishing between the network, application, and user layers. The layered approach derives the most relevant performance indicators (e.g., network performance indicators, user-experience characteristics, and specific application/service-related performance indicators) and aims to build accurate QoE models by combining user studies providing direct ratings with logged data at the application layer and traffic measurements at the network layer. The employed laboratory setup includes the user's device and two network emulators which model the behavior of a variable UMTS/HSPA network. Participants' network traffic is captured using the METAWIN [61] passive monitoring system, which monitors traffic on all interfaces of the packet-switched core network. Passive network measurements combined with obtained subjective user ratings serve to build reliable QoE models for future QoE estimation. The proposed model provides an interdisciplinary perspective including aspects such as device and application usability, usage context, user personality, emotional issues, and user roles. Thereby, it can be used not only for QoE prediction and management, but also for uncovering functional dependencies between causally relevant performance indicators and the resulting perceived quality.

Based on the comparison given in Table 1, it may be summarized that the majority of the discussed approaches [10, 20, 28, 57–60], while differing in the considered type of service and wireless aspects, address human IFs at the level of both low-level processing (i.e., physical, emotional, and mental constitution of the user) and high-level processing (i.e., cognitive ability, interpretation, or judgment). These models also consider system IFs classified into content, media, network, and device factors, as well as context IFs, while others, such as [55, 56], do not explicitly address IFs in that fashion. However, although the analyzed approaches address different IFs, most provide a QoE modeling framework, while only a few provide a concrete model [10, 28, 59].

In the context of QoE management, all of the addressed QoE modeling approaches, each in its own way, contribute to this process and may be applied in various contexts, such as system or service/application optimization, as well as improved network resource allocation. Thus, several of them have been built to aid the monitoring, measurement, and estimation of QoE [20, 54, 58, 59], while others contribute to QoE improvement by diagnosing the main reasons for quality deterioration [55], enabling the development of strategies [60], providing a detailed taxonomy [56, 57] and causal relationships [10], or offering QoE prediction mechanisms [28].

2.4. QoE Modeling in the Context of Wireless Networks

QoE modeling becomes even more challenging in the context of wireless and mobile networks due to the additional issues posed by this variable environment. The previously analyzed QoE modeling approaches address points common to both fixed and wireless-mobile environments in terms of system, user, and context IFs. However, in order to gain a better understanding of QoE modeling in wireless environments, the additional aspects that need to be considered beyond the common ones are listed in Table 2, classified according to the categorization in [22].

tab2
Table 2: Aspects to be considered and stressed when modeling QoE in a wireless context.

Beginning with the environment itself, we address reliability and variability. Wireless channels are more prone to errors than fixed networks because of exposure to various physical phenomena such as noise, fading, or interference. This leads to packet losses, as well as to excessive and variable delays, which consequently affect metrics such as Round Trip Time (RTT), Server Response Time (SRT), and throughput, and degrade the integrity and fluency of transmitted data. The wireless infrastructure has been identified as a bottleneck in data transmission between the user device and the gateway due to several other features: wireless capacity in terms of speed, coverage radius, or limited bandwidth; channel sharing with other users; and signal strength, which may be affected by temperature, humidity, distance from the antenna, and so forth. Also, regarding the reliability of wireless channels, one must consider security issues and interception. In addition, the multitude of wireless access technologies in use, differing in their characteristics (e.g., bandwidth, capacity and coverage constraints, congestion mechanisms), also influences wireless network performance in many ways and thereby the QoE as well.

In contrast to a fixed environment, mobility (horizontal handover), as well as the freedom of switching between various available wireless access technologies, that is, migration of communication from one network to another (vertical handover), introduces another factor affecting QoE: session establishment delay. Namely, during session establishment, a mobile user passes through several steps. First, the user has to wait for the security procedures to be performed in order to be granted access to the network. The user then waits while the signaling procedures are completed in order to establish the session. These procedures have to be performed both when initially establishing a session and when reestablishing one after an interruption caused by handover [21]. With a high user mobility rate, signaling procedures are performed more frequently, increasing the amount of signaling traffic. This affects the overall usage of wireless resources and increases the session establishment delay, leading to a negative impact on QoE.

However, the increased amount of signaling traffic exchanged in session setup, modification, or teardown procedures affects not only radio and signaling resources, but also device performance. For example, modern mobile phones have an impressive repertoire of functions and features and support applications that require a constant connection with the network [62]. Connectivity is maintained by frequently exchanging signaling traffic, which may dominate over data traffic. The consequence is an overload of computational resources on the mobile device and faster battery drain. These factors are relevant neither for laptops, where signaling traffic is generated less frequently and larger batteries allow longer connection maintenance, nor for QoE modeling in fixed environments, where battery consumption is not an issue. Additionally, battery consumption is linked not only to mobile-device interaction with the network, but also to user-device interaction. Therefore, one may conclude that battery lifetime and consumption are major factors that need to be considered when modeling QoE in the wireless context.

In addition to battery consumption, a number of mobile device features impact QoE. The size of the mobile device screen, as well as the position and location of keys on the screen, may cause difficulties with resizing or scrolling. A small keyboard can impact overall usability and lead to aggravation when typing. Furthermore, as stated in [2, 35], end-user perceived quality may be affected by a lack of "features," such as a Flash player, a personalized alarm clock, privacy settings, a Global Positioning System (GPS) receiver, or a built-in dictionary.

In the context of achieving high QoE, mobile application developers should consider all usage scenarios and address various challenges. Application adaptation capabilities (dynamic or static) are important in terms of adapting service/application content to fit device and access network capabilities. Besides usability, which is considered in most QoE models, applications should be adjusted to the device's computational power [63]. Various means of data access, security issues, and offline capabilities, as well as transparent synchronization with backend systems, must also be addressed when considering mobile applications.

Finally, user behavior in the wireless context differs from that in fixed environments. Users are able to access services via various available wireless technologies and different mobile devices, which exposes them to dynamic environments. Although addressed in most existing QoE modeling approaches, it is particularly important to consider the various usage contexts in wireless environments, since they greatly change the users' perceived quality. The authors in [2] recognize these important user- and context-related aspects in mobile environments and summarize them as the user's routine and lifestyle.

2.5. Summary of QoE Modeling Challenges

There are a number of challenges related to QoE modeling. First, the various factors affecting QoE for a given type of service need to be identified. Second, well-planned, extensive subjective studies involving human quality perception (including both cognitive and behavioral modeling) need to be conducted in order to model the relationship between the identified IFs and (the multiple dimensions of) QoE. The main aspects to be considered when planning subjective tests include the specification of the methodology to be used, the identification of the dependent and independent variables, user test subjects, testing scenarios, the testing environment, and rating scales. Analysis of the test results identifies the IFs with the most significant impact on QoE and enables the derivation of key QoE IFs. The identification of key QoE IFs and their quality thresholds provides input for relevant QoE optimization strategies. Third, QoE models should be generic and designed in an elastic way so as to account for fast technology and service advances in converged wireless networks.

While standards specify subjective testing methodologies for multimedia services such as audio, video, and audiovisual services [49, 50, 64, 65], new methodologies are currently being studied for emerging services such as Web and cloud-based services [66–70]. In addition, this section has contributed by summarizing the QoE modeling challenges in the wireless context.

3. QoE Monitoring and Measurement

As previously stated, QoE is a multidimensional concept which is difficult not only to define in a simple and unified manner, but also to monitor and measure, given the large number of QoE IFs involved. In order to provide accurate QoE assessment, considering only one or two QoE IFs is generally not sufficient. On the contrary, QoE should be considered in all its dimensions, taking into account as many (relevant) IFs as possible. Knowledge of the key IFs related to a given type of service, drawn from QoE models, provides input for QoE monitoring purposes.

The QoE monitoring and measurement process encompasses the acquisition of data related to the network environment and conditions, terminal capabilities, user, context, and application/service specific information, and its quantification [24]. The parameters can be gathered via probes at different points in the communication system, at different moments, and by various methods. The diversity of QoE monitoring and measurement points, moments, and methods, together with the selection of the key QoE IFs for a given service, further increases the complexity of this process.

In order to be able to manage and optimize QoE, knowledge regarding the root cause of unsatisfactory QoE levels or QoE degradations is necessary. As noted by Batteram et al. [71] and also by Reichl et al. [30], a layered approach relates network-level Key Performance Indicators (KPIs, e.g., delay, loss, throughput) with user-level, application-specific Key Quality Indicators (KQIs, e.g., service availability, usability, reliability), which then provide input for a QoE estimation model. Additional input to a QoE estimation model may be provided by user-, context-, and device-related IFs. Knowledge of this mapping between KPIs and KQIs (or what we have referred to as quality dimensions) provides valuable input for analyzing the root causes of QoE degradation. Hence, monitoring probes inserted at different points along the service delivery chain to collect data regarding relevant KPIs are necessary.
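As an illustration of such a layered mapping, the following sketch derives two application-level KQIs from network-level KPIs and combines them into a MOS-like estimate. All mapping functions, thresholds, and weights are hypothetical illustrations, not values taken from the cited works:

```python
# Hedged sketch: map network-level KPIs to application-level KQIs,
# then combine the KQIs into a simple MOS-like QoE estimate (1-5).
# The mapping functions and weights below are illustrative assumptions.

def kqi_reliability(packet_loss: float) -> float:
    """Map a packet loss ratio (0..1) to a 0..1 reliability KQI."""
    return max(0.0, 1.0 - 10.0 * packet_loss)  # assumed linear penalty

def kqi_responsiveness(rtt_ms: float) -> float:
    """Map round-trip time in milliseconds to a 0..1 responsiveness KQI."""
    return max(0.0, 1.0 - rtt_ms / 1000.0)  # assumed: 1 s RTT -> 0

def qoe_estimate(packet_loss: float, rtt_ms: float) -> float:
    """Weighted KQI combination rescaled to a 1..5 MOS-like score."""
    kqis = {"reliability": kqi_reliability(packet_loss),
            "responsiveness": kqi_responsiveness(rtt_ms)}
    weights = {"reliability": 0.6, "responsiveness": 0.4}  # assumed
    score01 = sum(weights[k] * v for k, v in kqis.items())
    return 1.0 + 4.0 * score01

print(round(qoe_estimate(packet_loss=0.01, rtt_ms=200), 2))  # -> 4.44
```

In a real deployment each KQI mapping would be calibrated against subjective ratings, which is precisely the role of the user studies discussed above.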

When discussing monitoring points, we may roughly distinguish between network-based probes and client side probes (note that measurements in both cases may be conducted at different layers of the protocol stack). At the client side, we may further distinguish between probes that collect end-user-related data (e.g., objective measures such as user mouse clicks, or data such as user demographics, user motivation, etc.), context data, device-related data, application data, and network traffic data. While monitoring at the client side provides the best insight into the service quality that users actually perceive, a challenge lies in providing QoE feedback to the network, service/application, content, or cloud provider to adapt, control, and optimize the QoE. As noted by Hoßfeld et al. [24], client side monitoring poses issues of user privacy, trust, and integrity, since users may cheat in order to receive better performance. Consequently, collecting data from within the network without conducting client side monitoring (in either an objective or a subjective manner), and vice versa, will generally not provide sufficient insight into QoE. Hence, accurate monitoring of QoE needs to employ both monitoring from within the network and monitoring at the client side.

Soldani et al. [72] reach the same conclusion for the monitoring and measurement of QoE specifically in mobile networks, that is, the need for complementary application of QoE monitoring and measurement methods. Two approaches are proposed: (1) a service-level approach using statistical samples and (2) a network management system approach using QoS parameters. The former uses application-level performance indicators and captures actual user opinion of the service used, while the latter maps QoS performance metrics from various parts of the network onto user-perceptible QoE performance targets. Several similar QoE measurement approaches have been standardized by 3GPP, in particular for Real-time Transport Protocol (RTP) based streaming, Hypertext Transfer Protocol (HTTP) streaming [73], Dynamic Adaptive Streaming over HTTP (DASH), progressive download [74], and Multimedia Telephony (MMTel) [75] for 3GPP devices. Namely, the quality of mobile media is usually degraded due to issues that arise in the last wireless hop, and, consequently, network-oriented QoE assessment may not be very reliable. Therefore, it has been reported that the best way to obtain an accurate QoE assessment is to monitor and measure it on the mobile device and report it back to the system [76]. The reported QoE data is combined with other measurements collected in the network and facilitates the identification of the root causes of quality degradation.

Regarding timing, QoE measurements may be conducted: (1) before the service is developed, which includes the consideration of individual quality factors as well as quality planning; (2) after the service is developed, but before it is delivered; and (3) during/after service delivery, which comprises quality monitoring within the network and at the end user side during/after service usage. It has been noted that in the context of closed-loop adaptation, there is a growing demand for suitable objective (rather than subjective) QoE evaluation techniques to facilitate optimal use of available wireless resources [23].

As discussed in [71], three techniques are prevalent in the market today for measuring performance: (1) using test packets; (2) using probes in network elements and user equipment; and (3) combining measurements from several network elements. Various approaches involving passive measurements have been reported, based on analyzing the correlations between traffic characteristics and performance criteria [28, 77]. Conducting passive measurements is often cheap and may be used for the evaluation of new applications. However, using network QoS measures for QoE estimation generally implies discerning individual media streams, thus placing an additional burden on the monitoring process (involving packet filtering and stream reconstruction) [78].

Finding a consensus regarding QoE measurement practices remains a great challenge. On the one hand, QoE has mainly been measured in terms of technical metrics, since it is often interpreted in terms of QoS. This measurement and assessment approach has been criticized for neglecting the multidimensional character of QoE [20, 54, 56, 79–81]. On the other hand, measuring the subjective dimensions of the experience is often skipped or neglected because of shorter product/service life cycles, time pressure, or budgetary reasons.

3.1. A Survey of QoE Monitoring and Measurement Approaches in Wireless Networks

Figure 2 illustrates possible QoE/QoS monitoring and adaptation points in a wireless network environment in the context of the 3GPP EPS. The EPS supports multiple access networks and mobility between them via a converged all-IP core network, the previously mentioned EPC [1]. The figure portrays a simplified architecture combining 2G/3G access networks, non-3GPP radio access networks, and the 3GPP LTE access network [6, 82, 83]. As shown, QoE/QoS-related data may be collected from within the network, at the client side, or both. QoE monitoring and measurement within the network may include data collection at different points, such as the base stations within the various access networks, the gateways or routers within the core network, or the servers in the service/application, content, or cloud domains. The acquired parameters may be derived from the application level (e.g., content resolution, frame rate, codec type, media type), the network level (e.g., packet loss, delay, jitter, throughput), or a combination thereof, that is, in a cross-layer fashion. Enabling QoS at different layers is required because existing QoS support in wireless access technologies (e.g., WiMAX or LTE) focuses only on the access network [84]. Collecting traffic closer to the end user provides input for a more accurate estimate of QoE, as discussed also in [85]. Furthermore, the amount of data to process is greatly reduced as compared to data collected in the core network. While QoS solutions commonly use network egress routers for conducting traffic analysis, the resulting computational load needs to be considered.

165146.fig.002
Figure 2: QoE/QoS monitoring and adaptation points in a wireless network environment (see Abbreviations).

The remainder of this subsection provides a discussion of several QoE measurement and monitoring studies that have been conducted with various types of services. The approaches are categorized as focusing on client side or network measurements, or their combination. A summary given in Table 3 compares these approaches based on the QoE monitoring point, QoE estimation point, method of conducting the QoE measurement, metrics (subjective or objective), type of service, and QoE management applicability, as well as deployment challenges in the context of QoE management. We note that while subjective measurements are generally more applicable in the context of building QoE models (or validating monitoring approaches), objective measurements are generally employed for QoE estimation and subsequently optimization purposes.

tab3
Table 3: Comparison of QoE monitoring and measurement approaches.
3.1.1. QoE Monitoring at the Client Side

While 3GPP policy and QoS mechanisms are based on centralized control, there have been complementary efforts to move certain intelligence from the network to the client. In the context of DASH (Dynamic Adaptive Streaming over HTTP) services, the 3GPP and MPEG standardization bodies have standardized mechanisms for activating QoE measurements at the client device, as well as the protocols and formats for the delivery of QoE reports to network servers [74]. It is important to note that HTTP adaptive streaming in general gives the client the ability to fully control the streaming session. This methodology is well suited to mobile wireless environments and is proposed as an optional feature on client devices. The QoE monitoring and reporting framework as standardized by 3GPP is composed of the following phases: (1) a server activates QoE reporting, requests a set of QoE metrics to be reported, and configures the QoE reporting framework; (2) the client monitors or measures the requested QoE metrics according to the QoE configuration; and (3) the client sends the QoE report to the network server in Extensible Markup Language (XML) format using HTTP [86]. In the context of 3GPP LTE systems, it is important to devise and adopt new QoS delivery and service adaptation methods targeting DASH services, since they are beneficial for the optimal management of limited network resources and improved QoE provisioning to the end user [87]. The benefits of adaptive streaming have in particular been recognized in the case of high-bandwidth-consuming mobile video communications. Figure 3 depicts a possible example of a PCC architecture performing end-to-end QoS/QoE delivery for DASH services. As noted in [87], the current 3GPP PCC architecture supports QoS delivery and service adaptation only for RTSP-based adaptive streaming services, so new methods are needed for HTTP adaptive streaming services.

165146.fig.003
Figure 3: Example of possible PCC architecture performing end-to-end QoS/QoE delivery for DASH services (see Abbreviations).
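Phases (2) and (3) of the reporting framework described above can be sketched as follows. The element and attribute names are simplified illustrations; the actual 3GPP report schema differs in detail:

```python
# Hedged sketch of phases (2)-(3): a client assembles measured QoE
# metrics into an XML report for HTTP delivery to a reporting server.
# Element and attribute names are simplified illustrations, not the
# exact 3GPP schema.
import xml.etree.ElementTree as ET

def build_qoe_report(session_id: str, metrics: dict) -> bytes:
    """Serialize measured QoE metrics as a small XML report body."""
    report = ET.Element("QoeReport", attrib={"sessionId": session_id})
    for name, value in metrics.items():
        ET.SubElement(report, "Metric",
                      attrib={"name": name, "value": str(value)})
    return ET.tostring(report, encoding="utf-8")

# Metrics of the kind a server might request during activation (phase 1):
xml_body = build_qoe_report("sess-42", {
    "bufferLevelMs": 3500,          # current playout buffer
    "initialPlayoutDelayMs": 820,   # startup delay
    "numStalls": 1,                 # playback interruptions
})
print(xml_body.decode())
# In a deployment this body would be delivered with an HTTP POST,
# e.g. via urllib.request, to the configured reporting server URL.
```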

Other work has addressed concrete cases of collecting QoE relevant metrics at the client side, albeit primarily for QoE modeling purposes and not considering QoE reporting mechanisms providing feedback to the network. In their studies of mobile video streaming, Ketykó et al. [88] have introduced an implementation of a QoE measurement approach on the Android platform based on the collection of both objective and subjective parameters. The observed objective parameters are logged by a QoS and context monitor component deployed on an Android device and include audio and video jitter and packet loss rate, as well as the percentage of time connected to a specific data network type relative to the total duration of a video watching session. In addition, subjective end-user parameters are logged by an Experience Monitor component, also deployed on the Android device, and include test users' ratings of content, picture and sound quality, fluidness, matching to interests, and loading speed. Similar to the procedures in [88], Verdejo et al. [89] discuss an Android-based QoE measurement framework applied in the context of playing a mobile location-based real-time massively multiplayer online role-playing game (MMORPG). Users' evaluations regarding the feelings of amusement, absorption, or engagement experienced while playing the game are taken into account and related to a set of objective QoS-related parameters, contextual data, and physiological data obtained from an on-body sensor.

The previously described measurement approaches may contribute to the overall QoE management process by improving application design, that is, through a better understanding of content and physical effects. However, if applying such measurement techniques, for example, for optimizing network performance, the challenge lies in reporting the QoE feedback obtained at the client side back to the network. Other potential deployment challenges are user related. As previously mentioned, if users provide QoE-related feedback, they may cheat to improve their performance. Finally, user privacy may be an issue when it comes to behavioral monitoring.

Besides the previously described subjective data collection methods ESM and DRM, Wac et al. [35] have also addressed technical aspects of QoE in their measurement approach involving a real-life, four-week-long study. They developed an Android Context Sensing Software (CSS) application that unobtrusively collects context and QoS data from users' Android phones. The gathered context data includes the current time, the user's geographical location, the wireless access network technology, the cell ID or access point name, the Received Signal Strength Indication (RSSI), and the currently used applications with their total throughput, while the measured network parameter is the Round Trip Time (RTT) of an application-level control message sent every minute from the mobile device through the available wireless access network to a dedicated server. As a result of the study, the authors identified a number of QoE IFs for mobile applications, such as the user's routine, prior experience, and the possibility to choose between a PC and a mobile device. This approach contributes to the QoE management process in the context of improving mobile application design but may face the same deployment challenges as the two previously described approaches.
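The application-level RTT probing used in such studies can be sketched as follows; the probe target is abstracted as a callable so the example stays self-contained, and the echo-style exchange is an assumption for illustration:

```python
# Hedged sketch of application-level RTT measurement as performed by
# context-sensing tools such as CSS: send a small control message to a
# dedicated server and time the response. Here the network exchange is
# abstracted as a callable (a real client would send a UDP/TCP message
# and await the echo), and the sampling interval is illustrative; the
# CSS study sampled once per minute.
import time

def measure_rtt_ms(probe) -> float:
    """Time a single request/response exchange in milliseconds."""
    start = time.perf_counter()
    probe()  # in a real client: send the control message, await reply
    return (time.perf_counter() - start) * 1000.0

def collect_samples(probe, n: int, interval_s: float = 0.0) -> list:
    """Collect n RTT samples, sleeping interval_s between probes."""
    samples = []
    for _ in range(n):
        samples.append(measure_rtt_ms(probe))
        time.sleep(interval_s)
    return samples

# Simulate a ~10 ms round trip with a sleeping stand-in probe:
samples = collect_samples(lambda: time.sleep(0.01), n=3)
print(f"mean RTT: {sum(samples) / len(samples):.1f} ms")
```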

With regard to collecting user feedback, a framework proposed by Chen et al. [90] quantifies users' quality perception by having them click a dedicated button whenever they feel dissatisfied with the quality of the application in use. Hence, the framework is called OneClick; it has been demonstrated through user evaluations of different multimedia content under variable network conditions.
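The core idea behind OneClick, correlating timestamped dissatisfaction clicks with the prevailing network conditions, can be sketched as follows (the data structures and condition labels are illustrative, not taken from the cited framework):

```python
# Hedged sketch of the OneClick idea: log timestamped "dissatisfied"
# clicks and count how many fall into each network-condition interval,
# so that click rate can be correlated with conditions. The condition
# labels and interval layout are illustrative assumptions.

def clicks_per_condition(clicks, intervals):
    """clicks: list of click timestamps (seconds); intervals: list of
    (start, end, condition_label) tuples. Returns {label: count}."""
    counts = {label: 0 for _, _, label in intervals}
    for t in clicks:
        for start, end, label in intervals:
            if start <= t < end:
                counts[label] += 1
                break
    return counts

# Example: clicks cluster in the interval with 5% packet loss.
clicks = [2.0, 3.1, 3.5, 8.7]
intervals = [(0, 5, "loss=5%"), (5, 10, "loss=0%")]
print(clicks_per_condition(clicks, intervals))
# -> {'loss=5%': 3, 'loss=0%': 1}
```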

Finally, we mention the applicability of client side monitoring in the context of QoE-driven mobility management. Focusing on voice services, Varela and Laulajainen [91] describe QoE estimation for VoIP aimed at improving existing network-level IP mobility management solutions. The proposed solution performs QoE estimation by passive network QoS monitoring of VoIP traffic, feeding the network's QoS information to a Pseudo-Subjective Quality Assessment (PSQA) tool [92]. The aim is to aid in making access network handover decisions. A prototype implementation is tested in scenarios representing real VoIP service usage. We note that VoIP QoE estimation and prediction based on passive probing mechanisms and integrated directly into a mobility management protocol is further addressed by Mitra et al. [93]. Furthermore, a user-centric approach to seamless mobility management solutions was one of the key focus areas of the EU FP7 PERIMETER project [94] (available deliverables provide detailed insight into user-centric mobility design).

3.1.2. QoE Monitoring in the Network

Volk et al. [59], whose approach is described in Section 2 in the context of QoE modeling, focus on an NGN and IP Multimedia Subsystem (IMS) [95] based environment and propose a solution for QoE assurance which employs an automated, proactive, in-service algorithm for estimating the user's QoE, rather than relying on regular QoS techniques or acquiring subjective end-user feedback. The QoE estimation algorithm is based on context-aware, objective, end-to-end QoE modeling and is run by a dedicated application server implemented as a value-added service enabler. The authors argue that the reasons for conducting centralized QoE estimation in the network are the availability of and access to a wide range of quality-related information, the possibility of nonintrusive in-service QoE estimation, and the potential for proactive in-service quality assurance functionality (e.g., the application server may invoke adaptation/modification of certain quality-affecting parameters). While the authors argue that this approach ensures fair interpretation of subjectively perceived quality towards any end user or service, as well as operational efficiency in terms of no end-user involvement (guaranteeing universality of the QoE estimation), the notion of user singularity may be considered neglected.

Hoßfeld et al. [96] compare two YouTube QoE monitoring approaches, operating at the end user level (described under the following heading) and within the network. A novel YouTube in-network (YiN) monitoring tool is proposed as a passive network monitoring tool. This approach aims at detecting and measuring stalling of the video playback by approximating the video buffer status, comparing the playback times of video frames with the time stamps of received packets. The challenges of this approach relate to the accurate reconstruction of the stalling events as they occur at the application layer, which incurs additional cost and limits scalability in terms of the number of YouTube video streaming flows that can actually be monitored by a probe.

Menkovski et al. [38] present a method for assessing QoE and have developed a platform for conducting QoE estimation for a service provider. The platform estimates the QoE of mobile TV services based on existing QoS monitoring data together with QoE prediction models. The prediction models are built using Machine Learning (ML) techniques from subjective data acquired in limited initial subjective user measurements. Thereby, the complexities associated with subjective measurement are minimized, while the accuracy of the method is maintained. The platform is currently used in mobile TV systems, where it estimates the QoE of streaming media content, which is further used to manage services and resources. However, the deployed system cannot give any information as to how the service is actually perceived by the end user.

3.1.3. QoE Monitoring Combining Client Side and Network Measurements

Further focusing on YouTube as one of the most common and traffic-intensive mobile applications, Staehle et al. [97] have proposed the previously mentioned YouTube Application Comfort (AC) monitoring tool, YoMo, which monitors QoE at the client side. The tool detects the YouTube video and determines its buffered playtime. Thereby, YoMo is able to detect an imminent QoE degradation, that is, stalling of the video. The interruption of video playback is the only considered factor influencing YouTube QoE, as it has been argued to be the key IF in the given case. Additionally, the tool communicates the stalling information to a network advisor and raises an alarm if the AC becomes bad. What makes YoMo particularly suitable for QoE monitoring and measurement is its ability to predict the time of stalling in advance. Thus, it allows the network operator to react prior to the QoE degradation and avoid unsatisfied customers.
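The buffer-based stall prediction underlying tools such as YoMo can be sketched as follows. The rates, threshold, and alarm rule are hypothetical illustrations rather than YoMo's actual algorithm:

```python
# Hedged sketch of buffer-based stall prediction: given the currently
# buffered playtime and the rates at which the buffer fills (download)
# and drains (playback), estimate the time until the buffer empties and
# raise an alarm if a stall is imminent. All values are illustrative.

def time_to_stall(buffered_s: float, download_bps: float,
                  video_bitrate_bps: float) -> float:
    """Seconds until a stall; inf if the buffer keeps growing."""
    fill_rate = download_bps / video_bitrate_bps  # buffered s per real s
    drain_rate = 1.0                              # playback uses 1 s/s
    net_drain = drain_rate - fill_rate
    if net_drain <= 0:
        return float("inf")  # downloading at least as fast as playing
    return buffered_s / net_drain

def raise_alarm(buffered_s: float, download_bps: float,
                video_bitrate_bps: float,
                warn_threshold_s: float = 10.0) -> bool:
    """Alert the network advisor if a stall is predicted soon."""
    return time_to_stall(buffered_s, download_bps,
                         video_bitrate_bps) < warn_threshold_s

# 8 s buffered, downloading at half the video bitrate:
print(time_to_stall(8.0, 0.5e6, 1.0e6))   # -> 16.0 (buffer drains at 0.5 s/s)
print(raise_alarm(4.0, 0.5e6, 1.0e6))     # -> True (stall in 8 s < 10 s)
```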

Ketykó et al. [98] introduce a concept for measuring QoE related to mobile video streaming in a 3G network environment and a semi-real-life context. The data collection, based on the Experience Sampling Method, combines objective and subjective data for evaluating user experiences. The observed technical-quality-related QoE parameters, audio and video packet loss and jitter, are obtained at the server side from Real-time Transport Control Protocol (RTCP) Receiver Reports (RR), while the observed RSSI parameter is obtained from the MyExperience in situ measurement tool used at the client side. The subjective assessments were conducted in two phases: (1) a preusage questionnaire (capturing users' prior experience with mobile applications) and (2) a usage phase, in which users were asked to use the mobile application in six different usage contexts: indoor and outdoor, at home, at work, and on a train/bus. The study shows that the QoE of mobile video streaming is influenced by both the QoS and the context. Additionally, the authors propose linear functions for modeling the technical-quality-related QoE aspects and argue that spatial quality and emotional satisfaction are the most relevant QoE aspects for the tested users.

Having surveyed several chosen QoE measurement approaches, we have classified them into ones that aim to perform QoE monitoring by acquiring data only at the client side [35, 74, 88–90], only within the network [28, 38, 59, 91, 96], or by collecting data both at the client side and within the network [97, 98]. As stated previously, in order to assure accurate QoE estimation and identification of the causes of QoE degradation, measurements collected along the end-to-end service delivery path are needed. The majority of approaches comprise both subjective and objective parameters, with end users estimating QoE. While the collection of subjective assessments is generally conducted in the scope of empirical QoE studies targeted towards building accurate QoE models, objective measurements provide input for QoE prediction mechanisms and are commonly employed for QoE optimization and control purposes. Furthermore, in terms of the measurement environments that have been discussed, several approaches have been illustrated in a laboratory testbed [28, 38, 91, 96, 97], while others were demonstrated in real-life [35, 59, 74, 90] or semi-real-life environments [88, 89, 98].

As can be observed from the previous analysis, QoE monitoring approaches in the wireless context often measure parameters such as packet loss rate, bandwidth, throughput, delay, and jitter, and in general do not differ from those addressing a fixed environment. However, in order to gain a deeper understanding of QoE IFs in wireless and mobile environments, there is a need to monitor parameters characteristic of such environments (Table 2). For example, RSSI measurements can be used to address wireless factors such as the channel's exposure to physical phenomena, its capacity and sharing among users, signal strength, and terminal antenna gain. Additionally, to improve accuracy regarding the wireless channel, this measurement can be combined with measurements of base station and terminal antenna gain, distance from the antenna, temperature, and so forth. The information obtained by combining the aforementioned measurements can give a clearer picture of which factors impact QoE and to what extent. Another example is measuring the radio cell reselection frequency or signaling update frequency, which may reveal how these wireless- and mobile-specific issues can be optimized. Additionally, although not considered in the analysis, it is recommended to measure the mobile device's power consumption when running different applications, its processor capability, and its storage capacity.
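
For instance, RSSI measurements can be cross-checked against antenna gains and distance using a standard log-distance path-loss model. In the sketch below, the reference loss and path-loss exponent are illustrative assumptions; in practice they would be calibrated for the deployment environment.

```python
import math

def expected_rssi_dbm(tx_power_dbm, tx_gain_dbi, rx_gain_dbi,
                      distance_m, ref_loss_db=40.0, path_loss_exp=3.0):
    """Expected RSSI from a log-distance path-loss model:
    PL(d) = PL(d0) + 10 * n * log10(d / d0), with d0 = 1 m.

    Comparing this estimate against the measured RSSI helps separate
    distance/antenna effects from other wireless impairments.
    """
    path_loss = ref_loss_db + 10 * path_loss_exp * math.log10(distance_m)
    return tx_power_dbm + tx_gain_dbi + rx_gain_dbi - path_loss
```

A measured RSSI far below this expectation at a known distance points to channel impairments (fading, interference) rather than mere attenuation, which is exactly the kind of combined insight discussed above.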

3.2. Summary of QoE Monitoring and Measurement Challenges

The previous discussion has shown that the QoE monitoring and measurement process is complex due to the diversity of factors affecting QoE, of data acquisition points and timings, and of data collection methods, as well as the lack of consensus regarding these issues. The main challenge in this process is to answer the following four questions: (1) What to collect? (2) Where to collect? (3) When to collect? and (4) How to collect?

Firstly, one needs to determine which data to acquire. The what/which clause is specified by the selection of QoE metrics, which depends on the service type and context. Deciding which data to acquire from the wide spectrum of QoE IFs is challenging, but it is the prerequisite for any QoE monitoring and measurement approach. Secondly, choosing where to collect data, that is, determining the location of monitoring probes, is another critical issue in the QoE assessment process. As previously mentioned, data can be collected within the network, at the client side, or both (depending also on whether measurements are conducted for QoE modeling or for QoE control purposes). QoE monitoring and measurement within the network may include data collection at different points, such as the base stations within the various access networks, the gateways or routers within the core network, or the servers in the service/application, content, or cloud domains. Additionally, the acquired parameters may be derived from the application level, the network level, or a combination thereof. Each acquisition location raises the specific challenges discussed previously. Furthermore, when performing in-service QoE management (e.g., QoE-driven dynamic (re)allocation of network resources), collected data generally needs to be communicated to an entity making QoE optimization decisions; hence, the passing of data to a control entity needs to be addressed. Thirdly, one should determine when to collect data: (1) before the service is developed; (2) after the service is developed, but not delivered; and (3) after the service is delivered. Additionally, how often data should be monitored and measured needs to be considered. Finally, how to perform the data acquisition is determined by the where and when clauses. The QoE monitoring process implies computational operations; hence, the computational complexity and the battery life of mobile devices need to be considered.

It may be concluded, as in the QoE modeling process, that different actors involved in the service provisioning chain will monitor and measure QoE in different ways, focusing on those parameters over which a given actor has control (e.g., a network provider will monitor how QoS-related performance parameters will impact QoE, a device manufacturer will monitor device-related performance issues, while application developers will be interested in how the service design or usability will affect QoE).

Having chosen the proper QoE metrics and monitoring and measurements approach, it is important to provide mechanisms utilizing this information for improving service performance, network planning, optimization of network resources, specification of service level agreements (SLAs) among operators, and so forth. Such issues are addressed as the “final step” in the QoE management process, discussed in the following section.

4. QoE Optimization and Control

Following QoE modeling, monitoring, and measurements, the ultimate goal of QoE management is to control QoE via QoE optimization and control mechanisms. Such mechanisms yield optimized service delivery with (potentially) continuous and dynamic delivery control in order to maximize the end-user’s satisfaction and optimally utilize limited system resources. From an operator point of view, the goal would be to maintain satisfied end users (in terms of their achieved QoE) in order to limit customer churn, while efficiently allocating available wireless network resources. QoE optimization as such may be considered a very challenging task due to a number of issues characteristic of converged all-IP wireless environments, including limited bandwidth and its variability, the growth of mobile data, the heterogeneity of mobile devices and services, the diversity of usage contexts, and demanding user requirements and expectations, as well as the drive to achieve cost efficiency.

4.1. An Overview of QoE Optimization Approaches in Wireless Environment

The strategies proposed for optimizing QoE in a wireless environment differ in the applied approach (network/user oriented), the parameters chosen to be adjusted, the control location(s) and timing(s), and so forth. Therefore, in this section, Table 4 provides an overview of the state-of-the-art approaches in terms of QoE optimization point, optimization strategy, considered wireless technologies, and deployment challenges.

Table 4: An overview of QoE optimization approaches in wireless environments.

Since QoS (and ultimately QoE) provisioning is a key issue in the context of wireless networks, 3GPP has proposed a set of comprehensive QoS concepts and architectures for UMTS [7, 8]. The Policy and Charging Rules Function (PCRF), included in the EPS as part of the 3GPP PCC architecture [9] shown in Figure 4, impacts end-user QoE for a particular subscription and service type by providing service-aware network-based QoS. Service requirements may be extracted from application-level signaling (e.g., based on the Session Initiation Protocol (SIP)) and passed down to the PCRF, which is responsible for executing policy rules. The execution of policy rules and their enforcement at the network level serve to manage network congestion, provide differentiated service quality based on heterogeneous service requirements, and create a framework for new business models. The bearer- and class-based QoS concept introduces a QoS Class Identifier (QCI), which specifies standardized packet forwarding treatment for a given traffic flow. The standardized QCI characteristics are specified in terms of bearer type (Guaranteed Bit-Rate (GBR) or non-GBR), priority, Packet Delay Budget (PDB), and Packet Error Loss Rate (PELR) [99]. Apart from the nine QCIs, the QoS parameters defined in the EPS include Allocation and Retention Priority (ARP), Maximum Bit-Rate (MBR), and GBR.
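
As an illustration, the standardized QCI characteristics can be represented as a simple lookup structure. The three entries below follow the Release 8 values of [99] (the full table defines nine QCIs; consult the specification for the complete set), and `meets_requirements` is a hypothetical helper, not part of any 3GPP interface.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class QciProfile:
    gbr: bool       # Guaranteed Bit-Rate bearer?
    priority: int   # lower value = higher priority
    pdb_ms: int     # Packet Delay Budget in milliseconds
    pelr: float     # Packet Error Loss Rate

# Illustrative subset of the standardized QCI table (3GPP TS 23.203, Rel-8).
QCI_TABLE = {
    1: QciProfile(gbr=True,  priority=2, pdb_ms=100, pelr=1e-2),  # conversational voice
    5: QciProfile(gbr=False, priority=1, pdb_ms=100, pelr=1e-6),  # IMS signalling
    9: QciProfile(gbr=False, priority=9, pdb_ms=300, pelr=1e-6),  # default bearer
}

def meets_requirements(qci, max_delay_ms, max_loss):
    """Hypothetical helper: does a QCI satisfy a service's delay/loss needs?"""
    profile = QCI_TABLE[qci]
    return profile.pdb_ms <= max_delay_ms and profile.pelr <= max_loss
```

Such a mapping is how service-aware policy logic can translate application requirements into a bearer with standardized forwarding treatment.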

Figure 4: 3GPP PCC architecture for EPS (see Abbreviations).

Skorin-Kapov and Matijašević [100] have proposed QoE-driven service adaptation and optimized network resource allocation mechanisms in the context of the 3GPP IMS and PCC architectures (Figure 5(a)). Service requirements and user preferences are signaled in the form of utility functions and serve as input for an optimization process (conducted by a proposed QoS Matching and Optimization Function) aimed at calculating the optimal service configuration and network resource allocation, given network resource, service, and operator policy constraints. The calculation results are passed to the PCRF node and serve as input for resource allocation mechanisms. Ivešić et al. [101] have further built on this approach by focusing on QoE-driven domain-wide optimal resource allocation among multiple sessions. The resource allocation has been formulated as a multiobjective optimization problem with the objectives of maximizing the total utility of all active sessions along with operator profit in the context of the 3GPP EPS.

Figure 5: QoE control mechanisms: (a) Skorin-Kapov and Matijašević [100] and Ivešić et al. [101]; (b) Sterle et al. [102] (see Abbreviations).

Further considering a 3GPP environment, an in-service QoE control mechanism has been proposed by Volk et al. [59] and further studied by Sterle et al. [102] (Figure 5(b)). The proposed application-level QoE estimation function running at the application server in the NGN service stratum is based on collection of a comprehensive set of QoE IFs. The authors attempt to maximize QoE by making the adjustments to identified quality performance indicators. As previously discussed in terms of QoE monitoring and measurement, this approach’s benefits include the wide range of quality-related information sources available in the network and non-intrusive in-service quality assurance and control.

Network-, that is, operator-driven, QoE optimization approaches are primarily concerned with the optimal utilization of available network resources, and in order to maximize QoE, they propose various network resource management mechanisms which rely on information obtained from the monitoring and measurement process. For example, Thakolsri et al. [103] (Figure 6(a)) have applied utility maximization in the context of QoE-driven resource allocation across multiple users accessing different video contents in a wireless network. The proposed scheme allocates network resources and performs rate adaptation such that quality fluctuations remain imperceptible to the user. Also, the Aquarema concept proposed by Staehle et al. [97] (Figure 6(b)) enables application-specific network resource management and thereby improves user QoE in all kinds of networks for all kinds of applications. The authors achieve the improvement through the interaction of the previously described application comfort monitoring tool, YoMo, running at the client side, and a network advisor which may trigger different resource management tools. The tool quantifies how well an application is running and enables prediction of the user experience, thereby allowing the network advisor to act upon an imminent QoE degradation. The principles of these approaches, both of which place the resource allocation mechanisms in the core network, may be combined. For example, the former, which manages network resources by prioritizing users with better channel conditions and thereby indirectly aims to improve overall QoE, could be supplemented with client-side information obtained from the monitoring tool, gaining more information for fair prioritization, that is, network resource allocation, and for improving individual user QoE.
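
Utility-based allocation of this kind is often approximated greedily: capacity is granted in small steps to whichever user currently has the largest marginal utility gain. The sketch below assumes concave logarithmic utilities, an illustrative choice rather than the model used in [103].

```python
import math

def allocate(weights, capacity, step=1.0):
    """Greedy sketch of utility-maximizing rate allocation.

    Each user i is assumed to have a concave utility u_i(r) = w_i * log(1 + r);
    each `step` of capacity goes to the user with the largest marginal gain.
    """
    alloc = {user: 0.0 for user in weights}
    remaining = capacity
    while remaining >= step:
        def marginal(user):
            # Utility gained if this user receives the next step of capacity.
            return weights[user] * (math.log1p(alloc[user] + step)
                                    - math.log1p(alloc[user]))
        best = max(alloc, key=marginal)
        alloc[best] += step
        remaining -= step
    return alloc
```

Because the utilities are concave, the greedy rule naturally spreads capacity across users: equal weights yield an even split, while higher-weight (e.g., more quality-sensitive) users receive proportionally more.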

Figure 6: QoE optimization mechanisms: (a) Thakolsri et al. [103]; (b) Staehle et al. [97]; (c) Latré et al. [104].

Additionally, Latré et al. [104] (Figure 6(c)) have defined an autonomic management architecture to optimize QoE in multimedia access networks using a three-plane approach consisting of (1) a Monitor Plane, which monitors the network and builds up knowledge about it; (2) a Knowledge Plane, which analyzes the knowledge and determines the ideal QoE actions; and (3) an Action Plane, which enforces these actions in the network. The authors have focused on the “smart” Knowledge Plane, which consists of two reasoners: an analytical one based on a set of equations and another based on neural networks. These reasoners can optimize the QoE of video services through two optimizing actions: applying Forward Error Correction (FEC) to reduce the packet loss caused by errors on a link, and switching to a different video bit rate to avoid congestion or to obtain better video quality.
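
The distinction between the two optimizing actions can be illustrated with a toy decision rule (not the authors' analytical or neural-network reasoners): losses on an uncongested link suggest random link errors and call for FEC, while high utilization suggests congestion and calls for a lower bit rate. The threshold values are illustrative assumptions.

```python
def choose_action(loss_rate, link_utilization,
                  loss_threshold=0.01, congestion_threshold=0.9):
    """Toy Knowledge-Plane-style decision between FEC and bit-rate switching."""
    if link_utilization >= congestion_threshold:
        return "decrease_bitrate"  # congestion losses: reduce offered load
    if loss_rate > loss_threshold:
        return "apply_fec"         # losses without congestion: likely link errors
    return "increase_bitrate"      # clean and uncongested: improve quality
```

A real Knowledge Plane would of course reason over richer monitored state, but the example shows why identifying the *cause* of packet loss matters: applying FEC to congestion losses would only add load and worsen the problem.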

In order to use limited wireless resources efficiently and distribute them among users whose perceived quality should be maximized, QoE-driven resource allocation and scheduling mechanisms should incorporate the sensitivity of human perceived quality [105]. This requires a strategy for mapping users’ opinions into resource allocation and scheduling algorithms in various wireless access technologies such as Code Division Multiple Access (CDMA), LTE, UMTS, WiMAX, and Wireless Local Area Network (WLAN). For example, Hassan et al. [106] model the QoE of mobile VoIP services and use this as input for a QoE management scheme employing a game-theoretic approach to resource management.

In this context, it is challenging for network elements responsible for resource management to adapt the constrained uplink and downlink wireless resources by assigning or periodically reassigning them to different service providers and users such that all resource competitors are satisfied. QoE-driven resource management resulting in higher user satisfaction may also be performed by implementing QoE-aware routing and packet controllers which give preferential treatment to certain types of packets, according to priority-based policies that may differ depending on the operator’s interests. However, in resource-variable and resource-constrained systems, such as wireless networks, priority in obtaining resources is primarily given to users having good channel conditions and accessing low-demand applications, whose satisfaction can thus be achieved with a small share of the limited resources [103]. Furthermore, one has to account for users who may be given priority because they pay more, although they may not enjoy the above-mentioned communication conditions. QoE data may be utilized in order to aid and improve decision making in the context of resource allocation and scheduling [103, 107], mobility management [91], and so forth.

Further focusing on QoE-driven resource allocation, it can be noted that this process may either be adapted to meet different service requirements (e.g., via 3GPP QoS provisioning mechanisms), or services may be adapted to meet dynamic network resource availability (e.g., adaptive streaming based on MPEG DASH). Additionally, it can be considered in terms of QoE optimization for a single user [91] or multiple users [103], differing in whether QoE is maximized for a given user or the total/average QoE across users, as well as for a single media flow [108] or multiple media flows [100, 101], with different utility functions corresponding to each media component.

However, comprehensive consideration of the various QoE influence factors and their correlations is needed in order to improve and optimize QoE. Depending on which QoE IFs need to be adjusted for a given service, QoE optimization may be performed at different locations, as depicted in Figure 2. Thus, as summarized in Table 4, the optimization can be conducted by applying various control mechanisms at the base stations within various access networks [109–111], by applying policy management rules at the gateways or routers within the core network [97, 103, 112, 113], by conducting adjustments at the servers in the service/application [104, 113, 114], content, or cloud domains, or a combination thereof [107], as well as on the end-user device [115]. Since QoE control relies on QoE monitoring and measurement information, the control locations are usually the same as the monitoring and measurement points. This does not necessarily mean that QoE optimization is conducted at the location where the data for its activation is collected, since data gathered at one point in the system may trigger optimization at another point. Additionally, the optimization may be performed at levels ranging from the link layer to the application layer [59, 102], as well as in a cross-layer fashion, which is most common [97, 103, 108, 112, 113, 116].

The addressed approaches propose various QoE optimization strategies aimed at optimal network resource allocation [100, 101, 112, 113]; optimal radio resource allocation [97, 103, 107, 109]; service optimization [97, 100, 102, 107, 110, 114]; optimized handover decisions [113]; access network selection [111]; or battery consumption [115]. However, while overall automatic optimization strategies may successfully manage the finite network resources and fulfill general user requirements, they may not always be optimal in terms of an individual user’s QoE. It has been argued that automated mechanisms may benefit from the user’s manual adjustment of service settings on his/her own device, although this would affect network resources. For example, Aristomenopoulos et al. [109] have proposed a dynamic utility adaptation framework suitable for real-time multimedia services in a wireless environment, which allows users to express their (dis)satisfaction with the service quality and adjust it at their device. This framework provides seamless integration of users’ subjectivity into network utility-based Radio Resource Management (RRM) mechanisms, enabling cross-layering from the application layer to the Media Access Control (MAC) layer.

With regard to the timing of QoE optimization, one may distinguish between in-service and out-of-service approaches. In-service QoE optimization is conducted while the system is in operation; in other words, it performs on-line quality prediction/control during service execution. The user-oriented approaches based on quality-related feedback are in-service, since the events that trigger QoE adaptation come from the individual user while he/she is using the service [109, 111]. The network-oriented approaches may also be in-service, since the resource management functions can be triggered by events occurring during the service (e.g., detection of network congestion, operator policy) [59, 97, 102, 103, 110, 113–115]. On the other hand, out-of-service QoE adaptation is performed in an off-line fashion by implementing control mechanisms for network planning, load balancing, network congestion detection, and so forth [9, 116].

It may be concluded that in most situations the user perceived QoE will depend on the underlying network performance. However, network-oriented QoE optimization processes would clearly benefit from perceived quality feedback data collected at the user’s side, since QoE is inherently user-centric. In addition, as previously mentioned, the network decision making process may benefit from the user’s adaptations in terms of more efficient resource usage. Therefore, in order to truly optimize and control QoE, both network- and user-oriented issues should be encompassed.

4.2. Summary of QoE Optimization Challenges

As previously discussed, QoE optimization and control is a challenging task considering the numerous constraints involved. Similarly to the QoE monitoring and measurement processes, the main challenges that arise with regard to QoE control may be summarized in the answers to the following four questions: (1) What to control? (2) Where to control? (3) When to control? and (4) How to control?

Answering the question of what parameters to optimize relates to the issue of determining the key factors whose adjustment would result in improved QoE. Additionally, the impact of those optimizations on other parameters must be considered, since in certain cases improving one set of parameters may degrade others (e.g., in web browsing, high-quality media content may prolong the web page response time). The what clause determines another critical issue in the QoE optimization process: the location where the optimization will be performed. Thus, QoE may be optimized at various locations, as previously discussed. Furthermore, one needs to determine when to perform the QoE optimization: during the service, that is, on-line, or in an off-line fashion. Additionally, how often to optimize QoE needs to be considered. Finally, how to optimize QoE is determined by all three previously mentioned clauses.

Basically, there are many strategies for QoE optimization and control, and they can be described as being closer to the network or closer to the user. A promising approach appears to be combining the network- and user-oriented approaches, compensating for the drawbacks of one with the strengths of the other. By combining different approaches to QoE optimization, the multiple stakeholders and players involved in the service delivery process would benefit, since the various additional challenges that each of them poses would be addressed. Additionally, the characteristics of wireless and mobile environments discussed in Table 2 (e.g., the constrained and shared network resources, the variable and unstable nature of wireless channels, and device diversity and capabilities) pose additional challenges to cope with when allocating limited resources among users.

Therefore, open research issues include applicability across different wireless access technologies (e.g., LTE, WiMAX, WLAN) and implementation on heterogeneous devices; computational issues, in terms of the complexity of the mechanism or the limited computational capacity of a device, which may increase battery consumption; signaling overhead, due to the increased amount of signaling traffic exchanged between devices and wireless access points (if the optimization mechanism is distributed and combined), and the resulting latency; scalability, in terms of the time and cost incurred when the solution is applied to a large number of users; and so forth. Since a goal of future research is to consider the user’s subjectivity in the QoE optimization procedure, user-related issues such as fairness, trust, and privacy must be addressed appropriately.

5. Conclusion

Satisfying user service quality expectations and requirements in today’s user-centric and upcoming converged all-IP wireless environment implies the challenge of performing successful QoE management. Requirements put forth by standards bodies including 3GPP, ETSI, and ITU include providing support for access to services anywhere and anytime, over any medium and network technology, and across multiple operator domains. In the context of the highly competitive telecom market, the fast development of new and complex mobile multimedia services delivered via various new mobile devices offers a wide scope of choice for the end user, hence increasingly driving operators to focus on the QoE domain. Studies have shown that the management of QoE as a highly complex issue requires an interdisciplinary view spanning user, technology, context, and business aspects, and flexible cooperation between all players and stakeholders involved in the service provisioning chain.

This paper gives a survey of the state of the art and current research activities focusing on the steps comprising the QoE management process: QoE modeling, QoE monitoring and measurement, and QoE adaptation and optimization. Based on the overview, we have identified and discussed the key aspects and challenges that need to be considered when conducting research in the area of QoE management, particularly related to the domain of wireless networks. Challenges related to QoE modeling include the consideration of various QoE IFs and identification of key ones, as well as the mapping of key IFs to QoE dimensions. The main challenges related to QoE monitoring and measurement can be summarized in the questions: what to collect, and where, when, and how to collect data? Similarly, the questions of what to improve, and where, when, and how to improve QoE, summarize the challenges of the QoE optimization process. In the context of wireless networks, QoE management has mostly been considered in terms of resource scheduling, whereby resource allocation decisions are driven to optimize end-user QoE. QoE-driven resource management has become an important issue for mobile network operators as a result of the continuously increasing demand for complex and faster multimedia applications that are adaptable to various devices and require increased network resources (e.g., more bandwidth, less delay, higher link quality), as well as the need to satisfy high user expectations towards continuous communication, mobility, and so forth. Therefore, QoE-driven mobility management solutions have also been presented, as well as QoE-driven service adaptation solutions.

Thus, by approaching various QoE aspects from a wide interdisciplinary perspective, this paper aims to provide a better understanding of QoE and the process of its management in converged wireless environments.

Abbreviations

AAA: Authentication, authorization, and accounting
A-GW: Access gateway
AS: Application server
BSC: Base station controller
BTS: Base transceiver station
DASH: Dynamic adaptive streaming over HTTP
E Node B: Evolved node B
EPC: Evolved packet core
ePDG: Enhanced packet data gateway
EPS: Evolved packet system
E-UTRAN: Evolved UMTS terrestrial radio access network
GGSN: Gateway GPRS support node
GPRS: General packet radio service
GSM: Global system for mobile communications
HSPA: High speed packet access
HSS: Home subscriber server
HTTP: Hypertext transfer protocol
I-CSCF: Interrogating call session control function
IMS: IP multimedia subsystem
IP: Internet protocol
LAN: Local area network
LTE: Long-term evolution
MME: Mobility management entity
PCC: Policy and charging control
PCEF: Policy and charging enforcement function
PCRF: Policy and charging rules function
P-CSCF: Proxy call session control function
P-GW: Packet data network gateway
PS: Policy server
Q-MOF: QoS matching and optimization function
QoE: Quality of experience
QoS: Quality of service
RM: Resource manager
RNC: Radio network controller
SCF: Session control function
SGSN: Serving GPRS support node
S-GW: Serving gateway
SIP AS: Session initiation protocol application server
SPR: Subscription profile repository
S-CSCF: Serving call session control function
UPR: User profile repository
UPSF: User profile server.

Acknowledgments

The authors would like to thank the anonymous reviewers for their valuable comments and suggestions. L. Skorin-Kapov’s work was in part supported by the Ministry of Science, Education and Sports of the Republic of Croatia research projects nos. 036-0362027-1639 and 071-0362027-2329.

References

  1. M. Olsson, S. Sultana, S. Rommer, L. Frid, and C. Mulligan, SAE and the Evolved Packet Core: Driving the Mobile Broadband Revolution, Elsevier, New York, NY, USA, 2009.
  2. S. Ickin, K. Wac, M. Fiedler, L. Jankowski, J. H. Hong, and A. K. Dey, “Factors influencing quality of experience of commonly used mobile applications,” IEEE Communication Magazine, vol. 50, no. 4, pp. 48–56, 2012.
  3. R. Stankiewicz and A. Jajszczyk, “A survey of QoE assurance in converged networks,” Computer Networks, vol. 55, no. 7, pp. 1459–1473, 2011.
  4. ITU-T Recommendation Y.2012, “Functional Requirements and Architecture of the NGN Release 1,” September 2006.
  5. K. Bogenine, R. Ludwig, P. Mogensen et al., “LTE part II: Radio access,” IEEE Communications Magazine, vol. 47, no. 4, pp. 40–42, 2009.
  6. 3GPP Technical Report TR 23.882 V8.0.0, “3GPP System Architecture Evolution: Report on Technical Options and Conclusions,” December 2008.
  7. 3GPP Technical Specification TS 23.107 V11.0.0, “Quality of Service (QoS) Concept and Architecture,” June 2012.
  8. 3GPP Technical Specification TS 23.207 V10.0.0, “End-to-end Quality of Service (QoS) Concept and Architecture,” March 2011.
  9. 3GPP Technical Specification TS 23.203 V11.6.0, “Policy and Charging Control Architecture,” June 2012.
  10. K. U. R. Laghari and K. Connelly, “Toward total quality of experience: a QoE model in a communication ecosystem,” IEEE Communication Magazine, vol. 50, no. 4, pp. 58–65, 2012.
  11. ITU-T Recommendation P.10/G.100, “Vocabulary for performance and quality of service. Amendment 2: New definitions for inclusion in Recommendation ITU-T P.10/G.100,” July 2008.
  12. ETSI Technical Report 102 643 V1.0.2, “Human Factors (HF); Quality of Experience (QoE) requirements for real-time communication services,” October 2010.
  13. K. De Moor, W. Joseph, I. Ketykó et al., “Linking users' subjective QoE evaluation to signal strength in an IEEE 802.11b/g wireless LAN environment,” EURASIP Journal on Wireless Communications and Networking, vol. 2010, Article ID 541568, 2010.
  14. V. Roto, E. Law, A. Vermeeren, and J. Hoonhout, Eds., “User Experience White Paper. Bringing Clarity to the Concept of User Experience,” February 2011, http://www.allaboutux.org/files/UX-WhitePaper.pdf.
  15. A. Van Ewijk, J. De Vriendt, and L. Finizola, Quality of Service for IMS on Fixed Networks. Business Models and Drivers for Next-Generation IMS Services, International Engineering Consortium, USA, 2007.
  16. T. M. O'Neill, “Quality of Experience and Quality of Service for IP video conferencing,” Polycom; 2002.
  17. M. Siller and J. C. Woods, “QoS arbitration for improving the QoE in multimedia transmission,” in Proceedings of the International Conference on Visual Information Engineering, pp. 238–241, July 2003.
  18. Empirix, “Assuring QoE on Next Generation Networks,” Whitepaper, 2001, http://www.whitepapers.org/docs/show/113.
  19. D. Soldani, “Means and methods for collecting and analyzing QoE measurements in wireless networks,” in Proceedings of International Symposium on a World of Wireless, Mobile and Multimedia Networks (WoWMoM '06), pp. 531–535, Niagara Falls, Buffalo, NY, USA, June 2006.
  20. K. De Moor, I. Ketykó, W. Joseph et al., “Proposed framework for evaluating quality of experience in a mobile, testbed-oriented living lab setting,” Mobile Networks and Applications, vol. 15, no. 3, pp. 378–391, 2010.
  21. S. Baraković, J. Baraković, and H. Bajrić, “QoE dimensions and QoE measurement of NGN services,” in Proceedings of the 18th Telecommunications Forum (TELFOR '10), Belgrade, Serbia, November 2010.
  22. P. Le Callet, S. Moller, and A. Perkis, Eds., “Qualinet White paper on Definitions of Quality of Experience (QoE),” May 2012, http://www.qualinet.eu/.
23. M. G. Martini, C. W. Chen, Z. Chen, T. Dagiuklas, L. Sun, and X. Zhu, “Guest editorial QoE-aware wireless multimedia systems,” IEEE Journal on Selected Areas in Communications, vol. 30, no. 7, pp. 1153–1156, 2012.
24. T. Hoßfeld, R. Schatz, M. Varela, and C. Timmerer, “Challenges of QoE management for cloud applications,” IEEE Communications Magazine, vol. 50, no. 4, pp. 28–36, 2012.
25. D. Durkee, “Why cloud computing will never be free,” Communications of the ACM, vol. 53, no. 5, pp. 62–69, 2010.
  26. L. Skorin-Kapov and M. Varela, “A multi-dimensional view of QoE: the ARCU model,” in Proceedings of the 35th Jubilee International Convention on Information and Communication Technology, Electronics and Microelectronics (MIPRO '12), Opatija, Croatia, May 2012.
27. J. Shaikh, M. Fiedler, and D. Collange, “Quality of experience from user and network perspectives,” Annales des Télécommunications/Annals of Telecommunications, vol. 65, no. 1-2, pp. 47–57, 2010.
  28. P. Reichl, B. Tuffin, and R. Schatz, “Logarithmic laws in service quality perception: where microeconomics meets psychophysics and quality of experience,” Telecommunication Systems Journal, vol. 55, no. 1, pp. 1–14, 2011.
29. M. Fiedler, T. Hoßfeld, and P. Tran-Gia, “A generic quantitative relationship between quality of experience and quality of service,” IEEE Network, vol. 24, no. 2, pp. 36–41, 2010.
30. P. Reichl, S. Egger, R. Schatz, and A. D'Alconzo, “The logarithmic nature of QoE and the role of the Weber-Fechner law in QoE assessment,” in Proceedings of the IEEE International Conference on Communications (ICC '10), Cape Town, South Africa, May 2010.
  31. E. H. Weber, Annotationes Anatomicae et Physiologicae: Programmata Collecta: Fasciculi Tres. de Pulsu, Resorptione, Auditu et Tactu, Koehler, 1834.
  32. ITU-T Recommendation P.800.1, “Mean Opinion Score (MOS) Terminology,” July 2006.
33. T. Hoßfeld, R. Schatz, and S. Egger, “SOS: the MOS is not enough!,” in Proceedings of the 3rd International Workshop on Quality of Multimedia Experience (QoMEX '11), Mechelen, Belgium, September 2011.
  34. ITU-T Recommendation G.1011, “Reference Guide to Quality of Experience Assessment Methodologies,” June 2010.
  35. K. Wac, S. Ickin, J. H. Hong, L. Janowski, M. Fiedler, and A. K. Dey, “Studying the experience of mobile applications used in different contexts of daily life,” in Proceedings of the 1st ACM SIGCOMM Workshop on Measurements up the Stack (W-MUST '11), Toronto, Ontario, Canada, August 2011.
  36. J. M. Hektner, J. A. Schmidt, and M. Csikszentmihalyi, Experience Sampling Method: Measuring the Quality of Everyday Life, Sage Publications, 2006.
37. P. A. Dinda, G. Memik, R. P. Dick et al., “The user in experimental computer systems research,” in Proceedings of the Workshop on Experimental Computer Science, San Diego, Calif, USA, June 2007.
  38. V. Menkovski, G. Exarchakos, A. Liotta, and A. Cuadra-Sanchez, “Managing quality of experience on a commercial mobile TV platform,” International Journal on Advances in Telecommunications, vol. 4, no. 1-2, pp. 72–81, 2011.
  39. T. Hoßfeld, M. Seufert, M. Hirth, T. Zinner, P. Tran-Gia, and R. Schatz, “Quantification of YouTube QoE via Crowdsourcing,” in Proceedings of the IEEE International Symposium on Multimedia (ISM '11), December 2011.
40. K. T. Chen, C. J. Chang, C. C. Wu, Y. C. Chang, and C. L. Lei, “Quadrant of euphoria: a crowdsourcing platform for QoE assessment,” IEEE Network, vol. 24, no. 2, pp. 28–35, 2010.
41. A. Takahashi, “Framework and standardization of quality of experience (QoE) design and management for audiovisual communication services,” NTT Technical Review, vol. 7, no. 4, pp. 1–5, 2009.
  42. ITU-T Recommendation P.862, “Perceptual Evaluation of Speech Quality (PESQ): An Objective Method for End-to-End Speech Quality Assessment of Narrow-Band Telephone Networks and Speech Codecs,” February 2001.
  43. ITU-T Recommendation P.863, “Perceptual Objective Listening Quality Assessment,” January 2011.
  44. ITU-R Recommendation BS.1387, “Method for Objective Measurements of Perceived Audio Quality,” November 2001.
  45. ITU-T Recommendation P.563, “Single-ended Method for Objective Speech Quality Assessment in Narrow-Band Telephony Applications,” May 2004.
46. F. Kuipers, R. Kooij, D. De Vleeschauwer, and K. Brunnstrom, “Techniques for measuring quality of experience,” in Wired/Wireless Internet Communications, pp. 216–227, 2010.
  47. A. Takahashi, D. Hands, and V. Barriac, “Standardization activities in the ITU for a QoE assessment of IPTV,” IEEE Communications Magazine, vol. 46, no. 2, pp. 78–84, 2008.
  48. ITU-T Recommendation P.564, “Conformance Testing for Voice Over IP Transmission Quality Assessment Models,” November 2007.
  49. ITU-T Recommendation G.107, “The E-model: A Computational Model for Use in Transmission Planning,” December 2011.
  50. ITU-T Recommendation G.1070, “Opinion Model for Video-Telephony Applications,” April 2007.
  51. ITU-T Recommendation G.1030, “Estimating End-to-End Performance in IP Networks for Data Applications,” November 2005.
  52. ETSI Guide 202 843 V1.1.2, “Definitions and Methods for Assessing the QoS Parameters of the Customer Relationship Stages Other than Utilization,” July 2011.
53. P. Brooks and B. Hestnes, “User measures of quality of experience: why being objective and quantitative is important,” IEEE Network, vol. 24, no. 2, pp. 8–13, 2010.
54. A. Perkis, S. Munkeby, and O. I. Hillestad, “A model for measuring quality of experience,” in Proceedings of the 7th Nordic Signal Processing Symposium (NORSIG '06), pp. 198–201, June 2006.
55. H. J. Kim, K. H. Lee, and J. Zhang, “In-service feedback QoE framework,” in Proceedings of the 3rd International Conference on Communication Theory, Reliability, and Quality of Service (CTRQ '10), pp. 135–138, Athens, Greece, June 2010.
56. K. Kilkki, “Quality of experience in communications ecosystem,” Journal of Universal Computer Science, vol. 14, no. 5, pp. 615–624, 2008.
57. S. Möller, K. P. Engelbrecht, C. Kühnel, I. Wechsung, and B. Weiss, “A taxonomy of quality of service and quality of experience of multimodal human-machine interaction,” in Proceedings of the International Workshop on Quality of Multimedia Experience (QoMEx '09), pp. 7–12, July 2009.
58. D. Geerts, K. De Moor, I. Ketykó et al., “Linking an integrated framework with appropriate methods for measuring QoE,” in Proceedings of the 2nd International Workshop on Quality of Multimedia Experience (QoMEX '10), pp. 158–163, Trondheim, Norway, June 2010.
59. M. Volk, J. Sterle, U. Sedlar, and A. Kos, “An approach to modeling and control of QoE in next generation networks,” IEEE Communications Magazine, vol. 48, no. 8, pp. 126–135, 2010.
  60. W. Song, D. Tjondronegoro, and M. Docherty, “Understanding user experience of mobile video: framework, measurement, and optimization,” in Mobile Multimedia—User and Technology Perspectives, INTECH Open Access, 2012.
61. F. Ricciato, “Traffic monitoring and analysis for the optimization of a 3G network,” IEEE Wireless Communications, vol. 13, no. 6, pp. 42–49, 2006.
  62. D. Kataria, “Mitigating strategies for smart phone signaling overload,” December 2011, http://www.ecnmag.com/articles/2011/10/mitigating-strategies-smart-phone-signaling-overload.
63. V. Menkovski and A. Liotta, “QoE for mobile streaming,” in Mobile Multimedia—User and Technology Perspectives, InTech, 2012.
  64. ITU-T Recommendation J.144, “Objective Perceptual Video Quality Measurement Techniques for Digital Cable Television in the Presence of a Full Reference,” March 2004.
65. A. F. Wattimena, R. E. Kooij, J. M. Van Vugt, and O. K. Ahmed, “Predicting the perceived quality of a first person shooter: the Quake IV G-model,” in Proceedings of the 5th ACM SIGCOMM Workshop on Network and System Support for Games (NetGames '06), October 2006.
66. T. Hoßfeld, S. Biedermann, R. Schatz, A. Platzer, S. Egger, and M. Fiedler, “The memory effect and its implications on web QoE modeling,” in Proceedings of the 23rd International Teletraffic Congress (ITC '11), San Francisco, Calif, USA, September 2011.
67. S. Egger, P. Reichl, T. Hoßfeld, and R. Schatz, “‘Time is bandwidth’? Narrowing the gap between subjective time perception and quality of experience,” in Proceedings of the IEEE International Conference on Communications (ICC '12), Ottawa, Canada, June 2012.
68. T. Ciszkowski, W. Mazurczyk, Z. Kotulski, T. Hoßfeld, M. Fiedler, and D. Collange, “Towards quality of experience-based reputation models for future web service provisioning,” Telecommunication Systems, vol. 52, no. 2, pp. 1–13, 2013.
  69. M. Jarschel, M. Schlosser, S. Scheuring, and T. Hoßfeld, “An evaluation of QoE in cloud gaming based on subjective tests,” in Proceedings of the 5th International Conference on Innovative Mobile and Internet Services in Ubiquitous Computing (IMIS '11), Seoul, Korea, July 2011.
  70. S. Egger, T. Hoßfeld, R. Schatz, and M. Fiedler, “Waiting times in quality of experience for web based services,” in Proceedings of the 4th International Workshop on Quality of Multimedia Experience (QoMEX '12), Yarra Valley, Australia, July 2012.
  71. H. Batteram, G. Damm, A. Mukhopadhyay, L. Philippart, R. Odysseos, and C. Urrutia-Valdés, “Delivering quality of experience in multimedia networks,” Bell Labs Technical Journal, vol. 15, no. 1, pp. 175–194, 2010. View at Publisher · View at Google Scholar · View at Scopus
  72. D. Soldani, M. Li, and R. Cuny, QoS and QoE Management in UMTS Cellular Systems, John Wiley & Sons, USA, 2006.
  73. 3GPP Technical Specification TS 26.234 V11.0.0, “Transparent end-to-end Packet-switched Streaming Service (PSS); Protocols and Codecs,” March 2012.
  74. 3GPP Technical Specification TS 26.247 V10.2.0, “Transparent End-to-End Packet-Switched Streaming Service (PSS); Progressive Download and Dynamic Adaptive Streaming Over HTTP (3GP-DASH),” June 2012.
  75. 3GPP Technical Specification TS 26.114 V11.4.0, “IP Multimedia Subsystem (IMS); Multimedia Telephony; Media Handling and Interaction,” June 2012.
  76. A. Raake, J. Gustafsson, S. Argyropoulos et al., “IP-based mobile and fixed network audiovisual media services,” IEEE Signal Processing Magazine, vol. 28, no. 6, pp. 68–79, 2011.
77. D. Collange and J. L. Costeux, “Passive estimation of quality of experience,” Journal of Universal Computer Science, vol. 14, no. 5, pp. 625–641, 2008.
  78. D. Collange, M. Hajji, J. Shaikh, M. Fiedler, and P. Arlos, “User impatience and network performance,” in Proceedings of the 8th Euro-NF Conference on Next Generation Internet (NGI '12), Karlskrona, Sweden, June 2012.
  79. K. De Moor and L. De Marez, “The challenge of user- and QoE-centric research and product development in today’s ICT environment,” in Innovating for and by Users, pp. 77–90, Office for Official Publications of the European Communities, Luxembourg, UK edition, 2008.
80. B. Fehnert and A. Kosagowsky, “Measuring user experience: complementing qualitative and quantitative assessment,” in Proceedings of the 10th International Conference on Human-Computer Interaction with Mobile Devices and Services (MobileHCI '08), pp. 383–386, September 2008.
81. M. Andrews, J. Cao, and J. McGowan, “Measuring human satisfaction in data networks,” in Proceedings of the 25th IEEE International Conference on Computer Communications (INFOCOM '06), Barcelona, Spain, April 2006.
  82. 3GPP Technical Specification TS 23.402 V11.3.0, “Architecture Enhancements for non-3GPP Accesses,” June 2012.
  83. 3GPP Technical Specification TS 22.278 V12.1.0, “Service Requirements for the Evolved Packet System,” June 2012.
  84. P. Rengaraju, C. H. Lung, and F. R. Yu, “On QoE monitoring and E2E service assurance in 4G wireless networks,” IEEE Wireless Communications, vol. 19, no. 4, pp. 89–96, 2012.
  85. R. Serral-Gracià, E. Cerqueira, M. Curado, M. Yannuzzi, E. Monteiro, and X. Masip-Bruin, “An overview of quality of experience measurement challenges for video applications in IP networks,” in Proceedings of the 8th International Conference on Wired/Wireless Internet Communications (WWIC '10), pp. 252–263, Lulea, Sweden, June 2010.
86. R. Fielding, J. Gettys, J. Mogul et al., “Hypertext Transfer Protocol - HTTP/1.1,” IETF RFC 2616, June 1999.
  87. O. Oyman and S. Singh, “Quality of experience for HTTP adaptive streaming services,” IEEE Communications Magazine, vol. 50, no. 4, pp. 20–27, 2012.
88. I. Ketykó, K. De Moor, T. De Pessemier et al., “QoE measurement of mobile YouTube video streaming,” in Proceedings of the 3rd Workshop on Mobile Video Delivery (MoViD '10), pp. 27–32, October 2010.
89. A. J. Verdejo, K. De Moor, I. Ketyko et al., “QoE estimation of a location-based mobile game using on-body sensors and QoS-related data,” in Proceedings of the IFIP Wireless Days Conference (WD '10), October 2010.
90. K. T. Chen, C. C. Tu, and W. C. Xiao, “OneClick: a framework for measuring network quality of experience,” in Proceedings of the 28th Conference on Computer Communications (INFOCOM '09), pp. 702–710, April 2009.
91. M. Varela and J. P. Laulajainen, “QoE-driven mobility management—integrating the users' quality perception into network-level decision making,” in Proceedings of the 3rd International Workshop on Quality of Multimedia Experience (QoMEX '11), Mechelen, Belgium, September 2011.
92. M. Varela, Pseudo-subjective quality assessment of multimedia streams and its applications in control [Ph.D. thesis], INRIA/IRISA, University Rennes I, Rennes, France, November 2005.
  93. K. Mitra, C. Ahlund, and A. Zaslavsky, “QoE estimation and prediction using hidden Markov models in heterogeneous access networks,” in Australasian Telecommunication Networks and Applications Conference, Brisbane, Australia, 2012.
94. “EU FP7 PERIMETER project,” http://www.ict-perimeter.eu/.
  95. 3GPP Technical Specification TS 23.228 V11.4.0, “IP Multimedia Subsystem (IMS),” March 2012.
96. T. Hoßfeld, F. Liers, R. Schatz et al., “Quality of Experience Management for YouTube: Clouds, FoG and the AquareYoum,” Praxis der Informationsverarbeitung und Kommunikation (PIK), 2012.
97. B. Staehle, M. Hirth, R. Pries, F. Wamser, and D. Staehle, “Aquarema in action: improving the YouTube QoE in wireless mesh networks,” in Proceedings of the Baltic Congress on Future Internet and Communications (BCFIC Riga '11), pp. 33–40, February 2011.
98. I. Ketykó, K. De Moor, W. Joseph, L. Martens, and L. De Marez, “Performing QoE-measurements in an actual 3G network,” in Proceedings of the IEEE International Symposium on Broadband Multimedia Systems and Broadcasting (BMSB '10), March 2010.
  99. S. M. Chadchan and C. B. Akki, “3GPP LTE/SAE: an overview,” International Journal of Computer and Electrical Engineering, vol. 2, no. 5, pp. 806–814, 2010.
  100. L. Skorin-Kapov and M. Matijasevic, “Modeling of a QoS matching and optimization function for multimedia services in the NGN,” in Proceedings of the 12th IFIP/IEEE International Conference on Management of Multimedia and Mobile Networks and Services: Wired-Wireless Multimedia Networks and Services Management (MMNS '09), vol. 5842 of Lecture Notes in Computer Science, pp. 54–68, October 2009.
  101. K. Ivešić, M. Matijašević, and L. Skorin-Kapov, “Simulation based evaluation of dynamic resource allocation for adaptive multimedia services,” in Proceedings of the 7th International Conference on Network and Service Management (CNSM '11), pp. 1–8, October 2011.
102. J. Sterle, M. Volk, U. Sedlar, J. Bester, and A. Kos, “Application-based NGN QoE controller,” IEEE Communications Magazine, vol. 49, no. 1, pp. 92–101, 2011.
103. S. Thakolsri, W. Kellerer, and E. Steinbach, “QoE-based cross-layer optimization of wireless video with unperceivable temporal video quality fluctuation,” in Proceedings of the IEEE International Conference on Communication (ICC '11), Kyoto, Japan, June 2011.
104. S. Latré, P. Simoens, B. De Vleeschauwer et al., “An autonomic architecture for optimizing QoE in multimedia access networks,” Computer Networks, vol. 53, no. 10, pp. 1587–1602, 2009.
105. P. Ameigeiras, J. J. Ramos-Munoz, J. Navarro-Ortiz, P. Mogensen, and J. M. Lopez-Soler, “QoE oriented cross-layer design of a resource allocation algorithm in beyond 3G systems,” Computer Communications, vol. 33, no. 5, pp. 571–582, 2010.
106. J. Hassan, M. Hassan, S. K. Das, and A. Ramer, “Managing quality of experience for wireless VoIP using noncooperative games,” IEEE Journal on Selected Areas in Communications, vol. 30, no. 7, pp. 1193–1204, 2012.
  107. A. El Essaili, L. Zhou, D. Schroeder, E. Steinbach, and W. Kellerer, “QoE-driven live and on-demand LTE uplink video transmission,” in Proceedings of the 13th International Workshop on Multimedia Signal Processing (MMSP '11), Hangzhou, China, October 2011.
108. S. Khan, S. Duhovnikov, E. Steinbach, and W. Kellerer, “MOS-based multiuser multiapplication cross-layer optimization for mobile multimedia communication,” Advances in Multimedia, vol. 2007, Article ID 94918, 11 pages, 2007.
109. G. Aristomenopoulos, T. Kastrinogiannis, S. Papavassiliou, V. Kaldanis, and G. Karantonis, “A novel framework for dynamic utility-based QoE provisioning in wireless networks,” in Proceedings of the 53rd IEEE Global Communications Conference (GLOBECOM '10), December 2010.
  110. F. Wamser, D. Staehle, J. Prokopec, A. Maeder, and P. Tran-Gia, “Utilizing buffered YouTube playtime for QoE-oriented scheduling in OFDMA networks,” in Proceedings of the 24th International Teletraffic Congress (ITC '12), Krakow, Poland, October 2012.
111. K. Piamrat, A. Ksentini, C. Viho, and J. M. Bonnin, “QoE-based network selection for multimedia users in IEEE 802.11 wireless networks,” in Proceedings of the 33rd IEEE Conference on Local Computer Networks (LCN '08), pp. 388–394, Montreal, Canada, October 2008.
112. M. Shehada, S. Thakolsri, Z. Despotovic, and W. Kellerer, “QoE-based cross-layer optimization for video delivery in long term evolution mobile networks,” in Proceedings of the 14th International Symposium on Wireless Personal Multimedia Communications (WPMC '11), Le Quartz, Brest, France, October 2011.
  113. N. Amram, B. Fu, G. Kunzmann et al., “QoE-based transport optimization for video delivery over next generation cellular networks,” in Proceedings of the IEEE Symposium on Computers and Communications (ISCC '11), Kerkyra, Greece, July 2011.
  114. A. Khan, I. Mkwawa, L. Sun, and E. Ifeachor, “QoE-driven sender bitrate adaptation scheme for video applications over IP multimedia subsystem,” in Proceedings of the IEEE International Conference on Communication (ICC '11), Kyoto, Japan, June 2011.
115. M. Csernai and A. Gulyas, “Wireless adapter sleep scheduling based on video QoE: how to improve battery life when watching streaming video?” in Proceedings of the 20th International Conference on Computer Communications and Networks (ICCCN '11), Maui, Hawaii, USA, August 2011.
116. A. A. Khalek, C. Caramanis, and R. W. Heath Jr., “A cross-layer design for perceptual optimization of H.264/SVC with unequal error protection,” IEEE Journal on Selected Areas in Communications, vol. 30, no. 7, pp. 1157–1171, 2012.