Abstract
With the rapid development of computer network technology in recent years, increasing demands have been placed on the functionality and attributes of the user interface. In many computer projects, the variability and flexibility of user interface requirements have greatly increased the complexity of program development. In addition, the poor reusability of page access control code has created a pressing need for a highly standardized and flexible way of developing software. The development and design of user interfaces for application software systems therefore occupy an important position and have long been a hot topic in the field of human-computer interaction. Traditional methods of describing user interaction, such as state transition diagrams and data flow diagrams, do not give designers a global and intuitive picture of the interface. Moreover, they offer little support for the design of user interface interaction behavior, so the user interface is often ignored at design time and left for implementers to work out during coding. How to integrate these traditional methods with intuitive descriptions from the user's perspective into a new interface development model and methodology is therefore an issue that needs to be addressed. This research builds a user interface framework based on interaction behavior from the user's perspective. Furthermore, UML extension mechanisms are used so that the framework can better support UML-based modelling environments. The UML is also structured and extended with structural elements that support interface generation, and a structured use case model is proposed that drives the analysis and design of the individual submodels. The extracted abstract interface elements and their mapping to concrete interface elements are documented, and the generation of different target languages on different platforms is explored. This study incorporates user requirements and provides a scientific reference for the development and design of user interfaces.
1. Introduction
With the rapid development of computer technology and information technology in recent years, mankind has entered a new digital era. Owing to the booming digital industry, computer technology plays an increasingly important role in the economy, politics, culture, and people's daily lives, gradually becoming an indispensable part of them [1]. In the usage environment of digital products, the user interface, as the core of the product, plays an irreplaceable role in improving the attractiveness of the product, strengthening the brand image, and enhancing the user experience [2]. As a result, the development and design of user interfaces is attracting more and more attention from users and developers, and the study of user interface design has become one of the most active research directions in the fields of design and computing in recent years. The user interface is the core of human-computer interaction and is a two-way channel for information exchange between the user and the computer hardware and software [3]. The understanding of the user interface has kept changing along with the development of human-computer interaction systems. As computer technology has evolved, human-computer interaction technology has also undergone significant changes [4]. In line with this trend, the user interface has undergone a dramatic transformation from command language user interfaces to virtual reality user interfaces. The recent commercial success of speech recognition and computer handwriting recognition has opened up broad prospects for natural human-computer interaction [5]. At the same time, with the further development of computer and information technology [6], new interaction technologies and forms of user interface will continue to emerge, such as multitouch interfaces [7] and intelligent spatial interaction interfaces [8]. In a sense, the interface represents the entire software system for the user, and the quality and efficiency of its development have become an important factor in the quality of the whole software product. Furthermore, in order to provide effective support for user tasks, an increasing proportion of user interaction takes place within application software systems. From the user's point of view, interaction with the application software system is mainly reflected in the operation of the user interface controls.
For the user, in the analysis and design of the entire software system, the description and design of the user interface is the key to reflecting the user's ideas, meeting the user's requirements, and understanding and using the system. Hence, the intuitiveness and readability of the user interface have a direct impact on the user's understanding of the entire digital product [9]. It is important to be able to guide developers easily, quickly, and effectively through the design of the user interface in response to complete user requirements, to shorten the software development cycle, and to ensure that the interface is usable and accurate. However, user requirements change all the time, and it is difficult for designers or users to identify specific requirements. In addition, traditional approaches to describing user interaction rely on state transitions [10] or data flow diagrams [11], often with the help of pseudo-code and natural language. It is difficult for designers and users to form a global and intuitive picture from such descriptions, and they offer little support for user interface interaction design, making user interfaces difficult to design. How to integrate these traditional methods with intuitive descriptions from the user's point of view into a new interface development model and methodology is an issue to be addressed. Therefore, in order to support the design of the user interface and its interaction behavior, it is necessary to describe not only the layout and style of the user interface but also its dynamic interaction behavior. This description helps to characterize the whole system and facilitates the maintenance of the model and its engineering implementation [12].
In recent years, the separation of user interface design from system function design has become a trend in software development, and the automatic generation of interface code from interface description models has become a goal of developers. Various ideas have been proposed for the automatic generation of interfaces, including specification language-based user interface generation [13], data structure-based user interface generation [14], model-based user interface generation [15], and some approaches using machine learning [16]. Among them, model-based interface generation has received a lot of attention because it is easy to understand. With the gradual proliferation of interface development tools and their programming languages, the limitations of developing a user interface on one particular platform are becoming more apparent. User interfaces may provide similar or common services, yet their technical implementations differ because of the different platforms and development languages on which they are built, resulting in duplicated effort. This has led designers to focus on conceptual models of user interaction interfaces. The significance of a conceptual model is that the user interface can be described in detail at a higher, more abstract level, enabling rapid development and reuse of the user interface across different programming languages and system contexts [17]. The development of a model-based interface system is a process of creating and refining the user interface model. The primary feature of this model is the depth of its semantic hierarchy, which eliminates the need for early interface detailing, and its reusable approach to interface development facilitates the maintenance of the system at a later stage [18]. The main advantage of using models to support the development of user interfaces is that they can be built at different levels of abstraction, thus supporting systematic design and implementation [19]. They can also provide infrastructure models that support the automatic generation of user interfaces [20]. Moreover, model-based interface generation raises the level of abstraction of the interface description, so that the interface is designed in a loosely coupled form independent of the design, development, and runtime platform. However, the current model-based approach to user interface development cannot be widely applied because it lacks effective reusability mechanisms. Therefore, in order to facilitate the reuse of user interfaces, developers have introduced the concept of patterns into user interfaces as a way to speed up user interface development and improve its efficiency. Interface design patterns have become a new research hotspot in the design field, focusing on the creation of reusable standard solutions that help developers solve common problems.
The Unified Modelling Language (UML) is the industry standard for object-oriented software design and has played a significant role in many successful projects. However, UML offers only limited support for the design of graphical user interfaces, which are difficult to describe directly and accurately in the modelling process [21]. In the majority of interface designs, UML also neglects the important design description of interaction behavior [22]. It is therefore necessary to extend UML so that it better supports user interface and interaction design. As the user interface is the most changeable part of a software system, more and more software systems require not only the ability to quickly develop a quality user interface but also extensibility of the user interface, i.e., end-user modifiability. End-user modifiability allows developers and end users to extend the user interface at little cost even after the software system has been released [23]. Traditional development approaches based on RAD tools often hard-code the user interface in the application, so the software system must be redistributed whenever the user interface changes, which greatly limits its extensibility [24]. The Extensible Markup Language (XML) provides a reference for solving this problem. Owing to its extensibility, flexibility, and self-descriptiveness, XML has been widely used in areas such as e-commerce [25], services [26], electronic health record systems [27], and web development [28, 29]. As a result, XML has become the standard for data exchange in the software industry.
To address the above issues, this paper focuses on using an extended model based on the user's point of view to describe user interaction behavior in modelling applications. Design based on user interaction behavior attempts to highlight the characteristics of the user and the nature of the interaction task by analyzing and modelling the tasks of the application, so as to meet the usability requirements of the user interface. Extended to application engineering, this approach reduces the time and development costs of user interface design, thus reflecting a user-centered design philosophy.
2. Model-Based User Interface Development
The conceptual model of the user interface uses three basic models to support the automatic generation of the interface, namely, the application model, the dialogue model, and the representation model. Typical representatives of this class of models are the Seeheim model, the PAC model, and the MVC model.
2.1. Seeheim Model
The Seeheim model was the first user interface model to be proposed. As shown in Figure 1, it divides the user interface into three components. The representation component handles the external representation of the interface; the rest of the interface cannot communicate directly with the outside. The dialogue control component specifies the structure of the dialogue between the user and the system. The application interface component establishes the communication links with the application semantics, describes the data structures accessible to the interface, and is responsible for calling the corresponding procedures. Logically, these three components are independent of each other and communicate by exchanging tokens.

The Seeheim model is a language-based model: the three components correspond to the lexical, syntactic, and semantic levels. A distinctive feature of the model is the emphasis on the role of the dialogue control part. However, in a direct manipulation dialogue, the user interacts with the graphical representations of individual application semantic objects rather than with the application as a whole. This means that the syntax associated with individual objects should be contained within their graphical representations rather than in a unified, separate part. In addition, semantic feedback is important for increasing user involvement; it is sometimes required even for operations that are considered lexical. For instance, dragging a graphical object is a lexical operation, but user engagement is greatly increased if feedback is given on the potential semantic effect of the action. This requires the semantics to be more closely related to the representation part. Clearly, the model does not support the requirements of direct manipulation for syntax and semantics. Although it deals with the logic of conversational interaction in a linear way, it provides a theoretical basis for later models.
2.2. PAC Model
From a system perspective, the PAC model divides the system into an abstraction layer, a control layer, and a presentation layer (Figure 2). The abstraction layer is responsible for the interaction with the functional kernel. The control layer receives control from the outside, passes information back to the outside, and interacts directly with the abstraction layer. The presentation layer is responsible for direct interaction with the user, including input and output, and interacts directly with the control layer. In the PAC model, the presentation layer communicates with the abstraction layer only via the control layer.
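To make these communication paths concrete, the following minimal Java sketch shows a single PAC agent, assuming simple string-valued state; the class and method names (Abstraction, Control, Presentation, handleInput, render) are illustrative assumptions rather than part of any particular PAC implementation.

```java
// Minimal sketch of one PAC agent (illustrative only).
class Abstraction {
    private String value = "";
    void update(String newValue) { value = newValue; } // interaction with the functional kernel
    String read() { return value; }
}

class Presentation {
    private final Control control;
    Presentation(Control control) { this.control = control; }
    // The presentation never touches the abstraction directly;
    // all user input is forwarded to the control layer.
    void onUserInput(String input) { control.handleInput(input); }
    void render(String value) { System.out.println("display: " + value); }
}

class Control {
    private final Abstraction abstraction = new Abstraction();
    private Presentation presentation;
    void attach(Presentation p) { this.presentation = p; }
    // The control layer mediates between presentation and abstraction.
    void handleInput(String input) {
        abstraction.update(input);
        presentation.render(abstraction.read());
    }
}
```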

2.3. MVC Model
Figure 3 shows the basic framework of the MVC model. Based on the MVC concept and its adaptability to C/S and B/S platforms, the structure of the user interface code consists of three layers: static presentation, logical support, and backend resources. The aim is to obtain loose coupling between layers and strong aggregation within each layer. The static presentation layer is the interface presentation layer and thus the data input and output layer; it consists of input and output controls and visible components. Code that performs local constraints and checks on inputs also belongs to this layer. The logical support layer is the service layer for external service calls, event responses to the view layer, and view refresh processing; it is loosely coupled to the interface presentation layer and the interface functionality. The backend resource layer provides external data and component services. The various system and component services are the main constituents of this layer, most of which are existing code or prebuilt components provided by the system.
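As one possible illustration of this loose coupling, the Java sketch below separates the three layers; the layer classes and method names are assumptions introduced only for this example and do not correspond to a specific system.

```java
// Minimal sketch of the three-layer structure described above.
class BackendResourceLayer {                     // backend resources: data and component services
    String loadRecord(String id) { return "record-" + id; }
}

class LogicalSupportLayer {                      // logical support: service calls and view refresh
    private final BackendResourceLayer backend = new BackendResourceLayer();
    String onQueryRequested(String id) { return backend.loadRecord(id); }
}

class StaticPresentationLayer {                  // static presentation: input/output controls
    private final LogicalSupportLayer logic = new LogicalSupportLayer();
    void onQueryButtonClicked(String id) {
        // local constraint checking belongs to the presentation layer
        if (id == null || id.isEmpty()) { System.out.println("invalid input"); return; }
        System.out.println(logic.onQueryRequested(id));   // refresh the view with the result
    }
}
```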

The PAC and MVC models are object-oriented multiagent models whose distinguishing features are modularity, parallelism, and distributed processing, and they have become the conceptual basis for interface control and implementation. However, these conceptual models lack a design-oriented representation of the engineering implementation; their application depends on the designer's own awareness and experience, and they cannot support full-process development.
3. User Interface Model Based on Interaction Behavior
The ultimate goal of user interface design is to satisfy the needs of the user as far as possible, and these needs are mainly met through interaction behavior in the system. By describing the interaction between the user and the system, it is possible to effectively define the data and information that the system operates on and the functional behavior that the system provides to the outside.
3.1. User Interface Requirement
User interface requirements are the initial definition of the characteristics or features of the overall system interface required by the user. Therefore, when analyzing user interface requirements, it is necessary to identify the origin and characteristics of the users of the whole system. At the same time, user interface requirements should be clearly articulated and carefully analyzed with regard to the user's specific tasks, so that the strategies and responses to these tasks can be aligned with the characteristics of the system's users. On the other hand, user interface styles vary widely and are closely related to the functionality of the system and the data it needs to process. Hence, it is necessary to describe the tasks of the system users and the important data relevant to the creation and completion of the functions in the system and the user interface.
In the initial stages of user interface development, it is difficult to obtain the exact and specific software requirements of the user. When the user is an enterprise, the acquisition of user requirements is usually not a problem that can be handled by a computer alone. When the user is an ordinary user, the designer must provide a convenient and intelligent interaction tool to capture and analyze the user's specific needs. This interaction tool should also automatically analyze and standardize the various types of user requirements and work out what the user really means. Moreover, user requirements are becoming increasingly complex, and attempting to understand them all at once is clearly no longer sufficient for the systematic development of software. Therefore, describing user interface requirements in detail at the beginning of requirements analysis, through to the batch generation of the entire system interface, has become an urgent issue throughout the software design and implementation process.
3.2. Framework for User Interface Implementation
The user interface model based on interaction behavior should have the following characteristics. Firstly, the user interface should support user needs and interaction behavior as a whole. The user requirements of a software system are mainly reflected in the set of actions that users perform on the application system interface, and user interaction is mainly reflected in the user's manipulation of the various controls in the interface. This user interface model complements existing approaches to interface design and, on this basis, proposes user-centered features; specifically, it is a design process that parses user requirements and provides feasible solutions. Secondly, the model should support the description of user requirements at an abstract level. To be specific, user requirements need to be abstracted after they have been captured and described as specific abstract tasks in order to build the user interface requirements model.
Based on these two characteristics, a framework for generating user interface models based on interaction behavior can be developed, as shown in Figure 4. The input model of the framework must follow the definition and structure of the structured use case model, or be transformable into an instance conforming to the structured use case model by means of a designed parser. The transformation of the platform-related models is then achieved by following the relevant rules.

Furthermore, based on the description of the interface requirements in XML, the interface template information is queried using markup language query techniques. The interface template information obtained during the development of earlier systems can be stored as XML documents to create a general repository of interface templates. The templates can be queried by document name, by the description of the interface information, and so on. The retrieved templates can then be added to the current project with simple modifications and extensions, which greatly improves development efficiency. The reuse relationship is shown in Figure 5.
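As a rough illustration of how such a template repository might be queried, the following Java sketch uses the standard DOM and XPath APIs; the file name templates.xml and the template element with name and description attributes are assumptions, not part of the repository format described here.

```java
import java.io.File;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathConstants;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;
import org.w3c.dom.NodeList;

// Sketch of querying an XML template repository by description keywords.
public class TemplateRepository {
    public static void main(String[] args) throws Exception {
        Document repo = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File("templates.xml"));          // hypothetical repository file
        XPath xpath = XPathFactory.newInstance().newXPath();
        // Find all templates whose description mentions "login".
        NodeList hits = (NodeList) xpath.evaluate(
                "//template[contains(@description,'login')]",
                repo, XPathConstants.NODESET);
        for (int i = 0; i < hits.getLength(); i++) {
            System.out.println(hits.item(i).getAttributes()
                    .getNamedItem("name").getNodeValue());   // report matching template names
        }
    }
}
```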

3.3. Principle of User Interface Generation
The overall structure of user interface generation is shown in Figure 6; technically, it combines semantic technology with automatic interface generation technology and further refines these ideas.

The detailed steps of user interface generation are as follows (Figure 7); a minimal sketch of steps (4) and (5) is given after the list.
(1) Generate the interface description file based on the requirements for the interface provided by the user.
(2) Initialize the interface data based on the data from the front page and assign initial values to each corresponding component.
(3) Initialize the interface generation engine.
(4) Read and parse the interface description file using the interface generation engine to obtain the corresponding property information.
(5) Based on the interface description information, construct the interface component elements using the interface automation engine and arrange them for layout and interface display.
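The following Java sketch illustrates steps (4) and (5) under simple assumptions: the file name interface-description.xml, the widget element, and its type, text, and value attributes are hypothetical, and Swing is used only as one possible target toolkit.

```java
import java.io.File;
import javax.swing.JButton;
import javax.swing.JFrame;
import javax.swing.JLabel;
import javax.swing.JTextField;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

// Sketch of an interface generation engine for steps (4) and (5).
public class InterfaceGenerationEngine {
    public static void main(String[] args) throws Exception {
        Document desc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new File("interface-description.xml"));   // step (4): read and parse
        JFrame frame = new JFrame(desc.getDocumentElement().getAttribute("title"));
        frame.setLayout(new java.awt.FlowLayout());
        NodeList widgets = desc.getElementsByTagName("widget");
        for (int i = 0; i < widgets.getLength(); i++) {           // step (5): construct components
            Element w = (Element) widgets.item(i);
            switch (w.getAttribute("type")) {
                case "label":  frame.add(new JLabel(w.getAttribute("text")));          break;
                case "input":  frame.add(new JTextField(w.getAttribute("value"), 10)); break;
                case "button": frame.add(new JButton(w.getAttribute("text")));         break;
            }
        }
        frame.pack();
        frame.setVisible(true);                                   // layout and interface display
    }
}
```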

3.4. Structured Use Case Model Design
Use case descriptions are abstract and unstructured, while user interface elements are concrete and structured. In order to transform the abstract use case model into a user interface, this study adds structural elements to the traditional, unstructured use case design, defines them in a structured way, and builds a structured use case model to support the organization of the user interface and the acquisition of interface elements. In this way, the use case model is greatly enriched and made ready for conversion into a user interface.
The structured use case model consists of a basic event stream and one or more alternative event streams. An event stream includes multiple use case events and the relationships between them. Data and control constraints can be bound to the use case events; the data can be the operation objects, input and output objects, and action calls of a use case event. If a use case event has multiple postconditions, the transition from that event is controlled by branching on these conditions. When describing user interface requirements with a structured use case model, the tasks that may involve the user interface should be broken out and refined as much as possible. A use case can correspond to one or more views of the user interface, where a view is an area of the user interface that performs a particular task. This research applies a structured use case diagram to describe the user requirements for the interface, as shown in Figure 8.
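A minimal sketch of how the constituents of the structured use case model could be represented as data structures is given below; the class and field names are assumptions chosen to mirror the description above, not a normative metamodel.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of the structured use case model's constituents.
class UseCaseEvent {
    String name;
    List<String> operationObjects = new ArrayList<>();    // data bound to the event
    List<String> inputOutputObjects = new ArrayList<>();
    List<String> actionCalls = new ArrayList<>();
    // When an event has several postconditions, each one selects the next event.
    Map<String, UseCaseEvent> postconditionBranches = new LinkedHashMap<>();
}

class EventStream {
    String name;                        // "basic" or the name of an alternative stream
    List<UseCaseEvent> events = new ArrayList<>();
}

class StructuredUseCase {
    String name;
    EventStream basicStream = new EventStream();
    List<EventStream> alternativeStreams = new ArrayList<>();
    List<String> views = new ArrayList<>();   // user interface views this use case maps to
}
```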

Use cases are designed for requirements analysis and do not provide information about the control flow associated with a task. Use cases can be considered high-level tasks, which can then be analyzed into the tasks needed to achieve the use case objectives. Based on the analysis of structured use cases and tasks, the mapping relationship between them is shown in Figure 9.

On the whole, the mapping transformation relationship contains two aspects. On the one hand, XML documents contain different types of model information, such as presentation, interaction, and object information, so it is important to clarify the mapping between the various models. On the other hand, there is the mapping between the target language and the constituent elements of the models. The architecture model contains controls and the relationships between them, their global variables, and shared functions. The object model contains classes, objects, views, and other elements, together with their generalization and association relationships. The interaction model reflects the interface objects and their static relationships and expresses the dynamic interaction between them, including the invocation of program methods between objects, use cases, roles, and collections, and the relationship with interface navigation.
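To illustrate the second aspect, the following Java sketch maps two abstract interface element types to concrete controls for two hypothetical target platforms; the platform names and control choices are assumptions for illustration only.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of mapping abstract interface elements to platform-specific controls.
public class ElementMapping {
    static final Map<String, Map<String, String>> MAPPING = new HashMap<>();
    static {
        Map<String, String> swing = new HashMap<>();
        swing.put("text-input", "javax.swing.JTextField");
        swing.put("action",     "javax.swing.JButton");
        Map<String, String> html = new HashMap<>();
        html.put("text-input", "<input type=\"text\">");
        html.put("action",     "<button>");
        MAPPING.put("swing", swing);
        MAPPING.put("html",  html);
    }
    // Resolve an abstract element to a concrete control for the chosen platform.
    static String resolve(String platform, String abstractElement) {
        return MAPPING.get(platform).get(abstractElement);
    }
    public static void main(String[] args) {
        System.out.println(resolve("html", "text-input"));   // prints <input type="text">
    }
}
```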
4. Conclusion
This paper analyses the relationship between user interaction behavior and the user interface and focuses on how to use user interaction behavior as the basis for user interface modelling. Firstly, from the user's point of view, an interface generation framework based on user interaction behavior is proposed by extending and improving the concept of packages in the UML library. Next, by structuring and extending traditional UML use cases, a structured use case model is proposed and used as the main thread through the analysis and design of each submodel, so as to better realize the mapping transformation between the submodels. After that, structured use case diagrams are used to describe user requirements for the interface, the mapping of structured use cases to interaction behavior tasks is investigated, and activity diagrams are used to design the interaction tasks. Finally, a method for deriving a presentation model from the interaction behavior model is investigated to document the mapping of abstract interface elements to concrete interface elements.
However, there is still much room for further research and refinement of user interface generation models based on interaction behavior. In the future, contextual descriptions such as user preferences could be added and learning mechanisms could be introduced to enrich the user experience. In addition, the adaptability of the interface generation framework needs to be demonstrated further, along with the implementation of a model processor for conversion to different platforms.
Data Availability
The labeled data set used to support the findings of this study is available from the corresponding author upon request.
Conflicts of Interest
The author declares that there are no conflicts of interest.
Acknowledgments
This work was supported by the Shijiazhuang University of Applied Technology.