Fabrication of Nāmya: A Bend and Touch-Sensitive Flexible Smartphone-Sized Prototype
In human-computer interaction research, prototypes allow for communicating design ideas and conducting early user studies to understand user experience without developing the actual product. For investigating deformation-based interaction, functional prototyping becomes challenging due to the unavailability of commercial platforms and the marginal availability of flexible electronic components. The time and cost incurred during functional prototyping are essential factors that further depend on the ease of stiffness customization, reproduction, and upgrade. To offer these advantages, this work presents the fabrication workflow of Nāmya, a smartphone-sized flexible prototype that can detect bend gestures and touch-based inputs using off-the-shelf sensors and flexible materials. This do-it-yourself (DIY) approach to fabricating deformable prototypes focuses on addressing the challenges of selecting the flexible material, the type of sensor, and the sensor positions. We also demonstrate that the proposed use of a flexible three-dimensional- (3D-) printed internal structure with sensor pockets and a one-part silicone cast allows the development of robust deformable prototypes. This fabrication process offers the opportunity to easily customize device stiffness, reproduce prototypes with similar physical properties, and upgrade existing prototypes.
In human-computer interaction (HCI) research, prototyping is an integral part of the design process. It enables cost- and time-efficient implementation, evaluation, refinement, and validation of design concepts without developing the actual product. Prototyping plays an essential role in communicating design ideas and conducting research on future technology devices. One such research area in HCI is deformable user interfaces (DUIs), which investigates new interaction techniques for future flexible digital devices [1, 2]. Due to the lack of commercially available deformable devices, researchers often use nonfunctional or functional prototypes. However, functional prototyping becomes challenging due to the marginal availability of flexible electronic components. Researchers have used different deformable materials, including flexible plexiglass [3, 4], paper [5, 6], flexible plastic [7, 8], ethylene-vinyl acetate (EVA) foam [8–11], polyvinyl chloride (PVC) [12, 13], polycarbonate, and silicone [11, 14–16] to develop the prototype’s body. Resistive bend (or flex) sensors with different lengths and directional capabilities [2–4, 11, 14, 16–18], optical bend sensors [8, 19], piezoelectric sensors [20, 21], and conductive foam-based sensors are used to detect the initiation, extension, and direction of deformation. The literature also offers different feedback techniques, such as visual feedback with flexible displays [6, 17], rigid displays [4, 23–25], and projected displays [7, 26, 27], as well as audio [13, 16] and vibrotactile [13, 28] feedback.
While developing a functional prototype, selecting a flexible material can be challenging considering the time and cost of the prototyping phase and the ease of stiffness customization and upgrade. Particularly at the initial stages of research, deciding on the flexible material, sensor, and sensor positions becomes even more challenging and often requires developing the prototype multiple times. In addition, based on the type of input interaction, the sensors and their placement often need to be changed to achieve better results. Some resources get damaged during this repeated modification process and cannot be reused, which may indirectly increase the time and cost of fabrication. New and independent researchers often suffer from these difficulties as they usually start from scratch, which can become time-consuming and may even prevent prototyping from yielding novel results. It also becomes difficult to compare findings from two studies conducted with two different prototypes, as results may vary with the prototypes’ stiffness and sensor positions. In addition, the use of printed circuits limits the ease of further circuit modification and the scope of introducing additional sensors and actuators without replacing the existing printed circuit. During initial prototyping, such flexible printed circuits may increase the cost of meeting the requirements of a particular design with the available resources. Moreover, there is limited existing research that reports do-it-yourself (DIY) approaches to fabricating low-cost flexible yet robust prototypes with off-the-shelf materials and sensors that allow ease of stiffness customization, reproduction, and upgrade.
Therefore, we present the fabrication workflow of Nāmya (Figure 1(a)), a smartphone-sized flexible prototype that enables users to interact with the system through bend gestures and touch-based input. We selected a smartphone-sized prototype as smartphones are commonly used digital devices among both sighted and visually impaired users and an increasingly used mobile technology for learning and teaching purposes [30, 31]. Before developing Nāmya, we studied a set of deformable handheld prototypes reported in the literature [3, 4, 6, 14, 16–18, 20, 32, 33]. With this inspiration, we explored the use of several flexible materials and combinations of flexible materials, along with the type and placement of commercially available bend or flex sensors, to develop a robust deformable smartphone-sized prototype. After finalizing the flexible materials, type of sensor, and sensor positions, we first developed Nāmya V1 (with one bend sensor) and later upgraded it to Nāmya V2 (with four bend sensors). Finally, we developed Nāmya V3, which can detect bend gestures at six locations and in two directions, with touch input at thirty-two touch points on the device surface. This fabrication process is reported in this work, including failures and challenges that new researchers can address to reduce the time and cost of prototyping. Nāmya V3 is made of low-cost one-part silicone, a flexible 3D-printed internal structure, and conductive fabric. The proposed use of an internal structure along with one-part silicone offers ease of stiffness customization, reproduction, and upgrade. The internal structure’s sensor pockets with additional sliding space lengthen the bend sensors’ life, making the entire prototype more robust. The reported fabrication process does not include the fabrication of bend and touch sensors. Instead, we utilized off-the-shelf flexible materials and sensors, which are readily available.
For instance, we used commercially available bend sensors to detect bend gestures and capacitive touch sensor controllers to detect the activation of conductive fabric-based touch points. We believe that the reported DIY approach of fabrication and the shared 3D models will help interaction designers and researchers of DUIs to develop deformable prototypes.
2. Literature Review
Our work is informed by existing literature on deformable handheld devices. Gummi is one of the initial functional deformable handheld prototypes, attached to a rigid display. Its authors studied physical deformation as input and found that bend gestures are feasible, effective, and enjoyable. Deformation-based gestures have been studied in the literature with different device sizes, flexible materials, and stiffness levels [36–38]. Warren et al. proposed a classification of bend gestures on a letter-sized flexible prototype with three pairs of bend sensors. PaperPhone reported a deformable prototype that includes a flexible E-Ink display. The Kinetic device reported the use of a flexible display on a deformable smartphone-sized prototype. Ahmaniemi et al. reported a functional, high-fidelity deformable prototype equipped with a high-resolution flexible display. Later, Twisting Touch reported the use of a multifinger capacitive touch panel made of thin, flexible material that allows the prototype to detect both deformation and touch input.
Although several deformable prototypes have been reported in the literature (Table 1), Fabricating Bendy reported a detailed process for fabricating a silicone cast flexible prototype with bend sensors (attached to a flexible printed circuit) embedded in it and a plastic substrate (polycarbonate) layer on top. It also discussed three types of visual displays for deformable prototypes: flexible, rigid, and projected displays. This fabrication technique has also been used to fabricate other flexible prototypes used for bend passwords, mobile games, and bend gesture-based interaction by users with visual impairment. Ernst et al. reported the development of Typhlex, a flexible prototype for users with visual impairment. They reported the fabrication of silicone cast flexible prototypes and evaluated the performance and usability of several prototypes with different groove locations, widths, and depths. However, both Bendy and Typhlex achieved device stiffness customization only through the use of different flexible materials in combination or the same material of different shore hardness. The existing literature has provided several prototyping techniques with various flexible materials, sensors, and sensor positions. However, early prototyping remains challenging due to the unavailability of commercial flexible electronic components such as flexible bend and touch-sensitive panels and displays. In addition, there is limited existing research reporting DIY approaches to fabricating a flexible yet robust prototype using off-the-shelf materials and sensors for early research on deformable smartphones that facilitates ease of stiffness customization, reproduction, and upgrade.
In this work, we report a DIY approach to fabricating a silicone cast bend and touch-sensitive smartphone-sized prototype using commercially available bend sensors and touch sensor controllers with a 3D-printed internal structure to offer ease of stiffness customization, reproduction, and upgrade.
3. Fabrication Workflow of Nāmya
Nāmya is a smartphone-sized flexible prototype (Figure 1(a)) of dimension . The word Nāmya is Sanskrit for bendable, pliant, pliable, or flexible [41–43]. The second letter, ā (Latin A with a macron), denotes a long vowel, so the word is pronounced Naamya. This work presents a DIY approach to fabricating deformable prototypes using silicone and flexible 3D-printed internal structures. It also demonstrates the process of producing new deformable prototypes (Nāmya V1 and V3) and upgrading an existing prototype (Nāmya V1 to V2).
3.1. Selection of Flexible Materials
According to the literature, different materials have been used for developing deformable prototypes. A few of these materials are very distinct (such as cloth and paper), where the studies’ objectives involve exploring the material’s potential [5, 6]. We observed that different materials offer different degrees of flexibility and ease of development and upgrade during prototyping. In addition, the ease of performing deformation gestures varies with the material’s stiffness.
Out of the different flexible materials we explored, the six materials mentioned below (Figure 2) have certain advantages in their basic form (not in combination with other materials). Canvas fabric, paper, PVC, and EVA foam sheets are readily available in different thicknesses and are easy to cut to the required dimensions. Canvas fabric and paper are suitable for projected display due to their white colour. However, overused and moist canvas fabric and paper have poor shape retention capability. The major advantage of PVC sheets is their shape retention capability after the deformation is released. However, PVC sheets are not suitable for large angles of bend or fold gestures. Compared to the above three materials, EVA foam is more suitable for flexible prototypes as it is not affected by repeated use, moisture, or large angles of bend or fold gestures. However, it has less shape retention capability. This indicates that a combination of EVA foam and PVC sheets can offer more benefits. The major advantage of the remaining two materials (Figure 2), 3D-printed Thermoplastic Polyurethane (TPU) and silicone cast, is the ease and freedom of fabricating complex custom-shaped prototypes. However, they require more fabrication time. Since flexible prototypes of different stiffness are needed for various use cases, such as handheld and wearable devices, ease of stiffness customization and replication is important during flexible device prototyping. A 3D-printed flexible prototype offers freedom of overall stiffness customization by changing the thickness of the 3D-printed part or using a printing material of different shore hardness. For two-part silicone, thoroughly mixing the two parts in a predefined mixing ratio is required to produce the expected physical properties of the cured silicone cast. Such mixing of two parts often requires degassing of the final mixture.
Unlike two-part silicone, one-part room-temperature-vulcanizing (RTV) silicone involves neither precise mixing nor degassing. The use of self-leveling (flowable or low-viscosity) one-part RTV silicone also reduces the challenge of manually leveling the silicone cast’s top surface. For both one- and two-part silicones, the overall prototype’s stiffness can be changed by changing the thickness of the silicone cast or using a silicone of different shore hardness. This indicates the potential advantage of using 3D-printed TPU with silicone to develop flexible device prototypes. Nevertheless, whichever type of liquid silicone is used, it is crucial to handle the uncured silicone with caution, such as wearing protective goggles, a respirator mask covering the nose and mouth, and gloves to protect the skin.
3.1.1. Prototypes Made of EVA Foam and PVC Sheet
We developed the initial nonfunctional prototype combining two flexible materials. It contains a flexible PVC sheet of thickness 1.5 mm sandwiched between two EVA foam sheets of thickness 3 mm (Figure 3(a)). We used this initial nonfunctional prototype in a gesture identification study. Later, we added a thin layer of white fabric on the topmost EVA foam layer to utilize it as a projected display (an alternative to white EVA foam sheets). A nonfunctional prototype developed using this technique was used in a gesture action mapping study with projected content. We found that these materials offer ease of manipulating the height and width of the nonfunctional prototype. However, manipulating the thickness to achieve the desired stiffness along with shape retention capability becomes difficult. Developing a functional prototype with these materials becomes even more challenging due to the embedded sensors. Keeping the sensors and circuits attached to the EVA foam sheet is one of the challenges. After a few repeated deformations of the prototype, the sensors and circuits usually get displaced, which leads to detection errors. In contrast, permanently attaching the sensors to the layers can damage them as they do not have enough freedom to slide during deformation. To solve the issue of unwanted movement of sensors and circuits during deformation, we cut grooves on the surface of a thick EVA foam sheet to place the sensors and covered them with a thin EVA foam sheet. We found that manual grooving may lead to nonuniform grooves, making the sensors difficult to slide during deformation, which may diminish their sensitivity and even damage the sensors. We believe prototypes made of EVA foam and PVC sheets could be helpful in nonfunctional prototyping and in deciding sensor positions and orientations during the initial stages of functional prototyping.
Later, we developed a flexible 3D-printed layer made of TPU (Thermoplastic Polyurethane of shore hardness 95A) with grooves to hold the sensors and wires in place. We used this layer as an alternative to the PVC sheet shown in Figure 3(a). Although adding this 3D-printed layer with grooves solves the issue of unwanted movement of sensors and circuits during bend gestures, we found that after a few gestures, the joints between the 3D-printed layer and the EVA foam sheets tear apart. We found that silicone-based glue serves better for all these joints when applied to clean 3D-printed layers.
3.1.2. Prototypes Made of 3D-Printed TPU and One-Part Silicone
We developed another prototype made of silicone cast (shore hardness 25A) sandwiched between two 3D-printed TPU (shore hardness 95A) layers (Figure 3(b)). Of these three layers, the two outer 3D-printed TPU layers hold the sensors, and the middle silicone layer serves as an insulator and glue that holds all the layers together. However, keeping the layers attached to each other over a longer period of repeated deformation remained challenging. On the other hand, we found that a flexible 3D-printed layer is most suitable for holding the sensors and wires in place. Later, to develop a functional flexible prototype that is robust to repeated deformation, we developed a silicone cast prototype (Figure 3(c)) with a 3D-printed TPU layer (as an internal structure to hold the sensors and wires) embedded in the cast. Considering the higher cost of two-part silicone and the degassing required before curing, we decided to use a low-cost one-part RTV silicone sealant to prepare the silicone cast (Figure 3(c)). Although it needs more curing time, it is flexible (shore hardness 25A), is resistant to moisture, and does not involve mixing two parts. According to the literature, a flexible printed circuit (FPC) that physically holds the sensors could solve the issue of unwanted movement of sensors during deformation. However, custom-made FPCs are relatively costly, and any modification or damage to the FPC requires complete replacement, which may further increase the overall fabrication cost. In contrast, we decided to use an easily customizable flexible 3D-printed TPU layer with silicone-coated wires as a cost-efficient alternative. We used this 3D-printed TPU layer as an internal structure to hold the sensors in place and multistranded 28 American Wire Gauge (AWG) silicone-coated wires for the circuit. We found that the flexible 3D-printed layer also contributes to the overall stiffness of the prototype.
We explored flexible 3D-printed layers of different thicknesses, which resulted in different rotational stiffnesses of the prototypes. This indicates that changing the thickness of the 3D-printed internal structure allows easy customization and reproduction of device stiffness without changing the thickness of the silicone cast, producing prototypes of different stiffness without changing the overall device thickness. Moreover, silicone sealant can be applied on a clean cured layer of silicone itself. As a result, replacing and upgrading the sensors and debugging the circuit do not require creating a new prototype from scratch.
3.2. Purpose-Driven Use and Placement of Sensors
Resistive bend (or flex) sensors are the most commonly used commercially available sensors to detect deformation on a flexible prototype [2–4, 11, 14, 16–18]. These sensors come with different lengths and directional capabilities (uni- and bidirectional). The purpose of bidirectional sensors can be achieved by placing two unidirectional sensors together at the cost of using more space and wires. Optical sensors are also explored in the literature to detect device deformation [8, 19]. Rendl et al. [20, 21] proposed printed piezoelectric sensors that can detect complex deformations. Chien et al.  proposed a shape-sensing flexible sensor strip composed of an array of strain gauges. Researchers have also proposed soft multipoint sensors to sense through structure , deformable textile sensors to sense surface and deformation gestures , and soft multilayer sensors to sense contact localization and the type and magnitude of deformation . Teyssier et al.  proposed the fabrication of an interface that can reproduce the sensing capabilities of human skin. Shahmiri and Dietz  proposed a geometric technique that measures relative shifting among multiple layers of the sensor to detect curves with multiple bends. Watanabe et al.  reported the use of a novel conductive foam-based sensor with one wire to detect multimodal input.
However, for developing Nāmya, we used commercially available bend sensors to avoid the challenges of fabricating the sensing units from scratch. Since the cost of bidirectional bend sensors is higher than that of unidirectional bend sensors, we started testing different sensor positions with 2.2 long unidirectional sensors (Figures 4(a) and 4(b)). In Figure 4(a), we placed the unidirectional bend sensors only at the middle and right side of the device as we developed this prototype to detect right-handed interaction in landscape mode in only the upward (bending the device towards the user) direction. Here, bend gestures at the right corners and the right side activate the corner sensors, while a center bend gesture activates the middle sensor. In Figure 4(b), we placed two unidirectional bend sensors at the same location (top-right corner) as we developed this prototype to detect two magnitude levels of size-based gestures during right-handed landscape-mode interaction in the upward direction. Here, a small-area bend activates one sensor, and a large-area bend activates two sensors. Later, in the final prototype, we used the 3D-printed layer (Figure 4(c)) as an internal structure with four 2 long bidirectional bend sensors placed along each corner, since we needed to recognize deformation gestures in both upward (towards the user) and downward (away from the user) directions.
The type and placement of sensors depend on the intended gesture set that needs to be recognized. Deciding these factors depends on the descriptors of deformation gestures (commonly the location, direction, size, and angle of deformation), since correctly recognizing a gesture requires distinguishable activation of the appropriate sensor or sensors. The prototype Nāmya (Figure 1(a)) was developed for both visually impaired and sighted users. As visually impaired users use the four corners and sides of a smartphone as spatial references, we selected limited bend gestures at the four corners, top side, and bottom side (in portrait mode). Therefore, instead of placing additional sensors along the top and bottom sides (in portrait mode), we utilized the input from two corner sensors to detect side bend gestures (Figure 4(c)). For example, in portrait mode, activation of both top sensors at the same moment is recognized as a top side bend, and activation of both bottom sensors at the same moment is recognized as a bottom side bend.
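The corner-sensor fusion described above can be sketched as a simple decision rule. The following is a minimal illustration in plain C++, not the prototype's actual firmware; it assumes each sensor reading has already been thresholded into a boolean activation, and the function name and gesture labels are hypothetical:

```cpp
#include <string>

// Map four corner-sensor activations (portrait mode) to a gesture label.
// Simultaneous activation of both top (or both bottom) sensors is
// interpreted as a side bend rather than two separate corner bends.
std::string classifyBend(bool topLeft, bool topRight,
                         bool bottomLeft, bool bottomRight) {
    if (topLeft && topRight)       return "top-side bend";
    if (bottomLeft && bottomRight) return "bottom-side bend";
    if (topLeft)     return "top-left corner bend";
    if (topRight)    return "top-right corner bend";
    if (bottomLeft)  return "bottom-left corner bend";
    if (bottomRight) return "bottom-right corner bend";
    return "none";
}
```

In practice the two side-bend checks must come first, since a side bend also satisfies the individual corner conditions.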
3.3. Fabrication of Nāmya V1 and Its Upgrade to Nāmya V2
After selecting the flexible materials (for casting and internal structure) and the sensor type and positions, we developed the first functional prototype (Nāmya V1 in Figure 5(d)) to recognize a bend gesture at the top-right corner in the upward direction in portrait mode. Although we used one 2.2 long unidirectional sensor for this prototype, the flexible 3D-printed internal structure (Figure 6(b)) was designed to accommodate four bend sensors. First, we developed a 3D-printed mold to cast the prototype, where the casting was done in two phases. We used TPU to print the mold to make it flexible enough for easy removal of the cast. We poured the silicone sealant to cast half of the prototype in the first phase (Figure 5(a)), and within a few minutes, we placed the 3D-printed internal structure (along with one sensor) on top (Figure 5(b)). Then, we poured the silicone sealant to cast the remaining half of the prototype (Figure 5(c)) before the previous layer of silicone formed a skin. Nāmya V1, the first functional prototype (Figure 5(d)), contains a 3D-printed internal structure (Figure 6(b)) of thickness 1 mm and shore hardness 95A. The shore hardness of the cured silicone is 25A. The thickness of the final silicone cast prototype is 6 mm. The overall stiffness of the silicone cast prototype can be increased by increasing the thickness of the 3D-printed internal structure or the silicone cast. Although silicone sealant cures and bonds in 24 hours at room temperature, we decided to keep it for another 24 hours to avoid any damage while removing the mold. After the completely cured prototype was removed from its mold and cleaned with a damp cloth, we tested the bend sensor at the top-right corner by performing repeated bends. We found that the analog pin of the Arduino board connected to the bend sensor showed approximately similar readings for repeated bend gestures of similar angles.
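Resistive bend sensors are typically read on an Arduino analog pin through a voltage divider, so the ADC reading can be mapped back to the sensor's resistance. The sketch below is a hedged illustration, not our exact circuit: the 10 kΩ fixed resistor and the 10-bit ADC resolution are assumptions for demonstration.

```cpp
// Convert a 10-bit Arduino ADC reading into the bend sensor's resistance,
// assuming the sensor is in series with a fixed resistor R_FIXED and the
// analog pin measures the voltage across R_FIXED (values illustrative).
const float R_FIXED = 10000.0f;  // 10 kOhm fixed resistor (assumed)
const float ADC_MAX = 1023.0f;   // 10-bit ADC on the Arduino Mega 2560

float sensorResistance(int adcReading) {
    if (adcReading <= 0) return -1.0f;  // guard against divide-by-zero
    // V_out = V_cc * R_fixed / (R_sensor + R_fixed)
    // => R_sensor = R_fixed * (ADC_MAX / reading - 1)
    return R_FIXED * (ADC_MAX / adcReading - 1.0f);
}
```

As the sensor bends and its resistance rises, the ADC reading falls, which is why repeated bends of similar angle should yield similar readings.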
We also tested one touch point using a Proximity Capacitive Touch Sensor Controller (MPR121) for Arduino. Both the bend and touch sensors were connected to an Arduino Mega 2560 microcontroller board. We selected the Arduino Mega 2560 to take advantage of its larger memory and higher number of digital and analog pins for connecting multiple bend sensors and touch sensor controllers. We used the open-source integrated development environment (IDE) of Arduino to write the code and upload it to the Arduino board. Later, for Nāmya V3, we also used the Processing (version 3.5.4) IDE with its graphics library to graphically represent the touch and bend gesture-based input (Figure 1(b)).
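The MPR121 reports its twelve electrodes as bits of a touch-status word read over I2C (bit i set means electrode i is touched). The helper below is a hypothetical illustration, independent of any particular driver library, of how such a status word can be decoded into the list of active touch points:

```cpp
#include <vector>

// Decode an MPR121 touch-status word into the indices of the active
// electrodes. The MPR121 exposes 12 electrodes, so only the lowest
// 12 bits of the status word are meaningful.
std::vector<int> activeElectrodes(unsigned int touchStatus) {
    std::vector<int> active;
    for (int i = 0; i < 12; ++i) {
        if (touchStatus & (1u << i)) {
            active.push_back(i);  // electrode i is currently touched
        }
    }
    return active;
}
```

With thirty-two touch points and twelve electrodes per controller, Nāmya V3 requires multiple MPR121 controllers, each decoded in the same way.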
In the next phase, we developed Nāmya V2, the second functional prototype, which contains the same 3D-printed internal structure as Nāmya V1 but is equipped with four 2.2 long unidirectional bend sensors (Figure 5(h)). However, this time, we did not repeat the entire casting process from scratch. Instead, we modified the existing prototype Nāmya V1 to develop the second prototype Nāmya V2. The surface of Nāmya V1 was dissected without damaging any wires or the internal structure to remove the internal structure with the sensor, as shown in Figure 5(e). The old silicone covering the internal structure was removed (Figure 5(f)). Later, the internal structure was equipped with four 2.2 long unidirectional bend sensors. Finally, this internal structure (with four sensors) was placed back in its original position, and the empty space around and above it was filled with the same silicone sealant (Figure 5(g)). This indicates the ease of upgrade due to the use of a flexible 3D-printed internal structure and silicone sealant. After this prototype (Nāmya V2 in Figure 5(h)) was allowed to cure for 48 hours, we again tested all the bend sensors at the four corners and found them to be working properly during repeated bends.
3.4. Fabrication of Nāmya V3
Nāmya V3, the third functional prototype, contains four 2 long bidirectional bend sensors and thirty-two touch points on the device’s front surface (Figure 1(a)). Fabrication of Nāmya V3 followed the same steps as Nāmya V1 (Figures 5(a)–5(d)). However, this time, a separate mold (Figure 6(a)) was used to create thirty-two thumb-sized rectangular grooves of depth 1 mm on the prototype’s surface. Later, these grooves were filled with conductive fabric glued with the same silicone sealant to serve as touch points. The rectangular touch points towards the center of the device’s surface are intentionally kept wider than those near the edges to offer comparable traversal time as the user moves from the edge. Before using conductive fabric as touch points, we first tried to fill the grooves with a mixture of silicone sealant and carbon-based conductive ink. However, this mixture failed to detect touch-based input uniformly over a surface area. Such mixing of silicone sealant and carbon-based conductive ink requires further investigation. Later, as an alternative, we decided to use conductive fabric, which is flexible and can be easily glued to a silicone surface. Conductive threads insulated with silicone (as an alternative to thin silicone-coated flexible wire) were used to connect the conductive fabric-based touch points with the MPR121. These conductive threads were placed on the mold before pouring the first silicone layer. Unlike the previous internal structure (Figure 6(b)), a new internal structure (Figure 6(c)) was used for Nāmya V3, where the sensor pockets were intentionally kept longer than the sensors and equipped with covers to create free space around the sensors during casting.
These sensor pockets with additional sliding space lengthen the bend sensors’ life compared to the previous internal structure (Figure 6(b)) by reducing potential damage during deformation. The new internal structure also offered a more uniform perception of device stiffness across the corners and sides of the flexible prototype due to its wider design (Figure 6(c)). After the prototype was allowed to cure for 48 hours, we tested all the bend sensors and touch points and found them to be working correctly during repeated bend and touch-based input. This indicates the ease of reproduction by following the proposed DIY fabrication approach with one-part silicone sealant and flexible 3D-printed internal structures.
We used this final functional prototype (Nāmya V3) in a user study conducted to identify bend gesture completion strategies. We used this prototype during both training and exploration of gesture completion strategies to record and monitor the sensor data. Throughout all the participants’ repeated deformations of the prototype, both the bend sensors’ and touch points’ activations and changes in readings were recorded correctly by the Arduino board. We also noticed that the conductive fabric-based touch points perform well in detecting touch input even in the presence of moisture. Overall, from the viewpoint of prototyping, we found that silicone cast prototyping with a flexible 3D-printed internal structure is more robust than the other materials explored in this work.
Since the flexible 3D-printed internal structure also contributes to the prototype’s overall stiffness, modifying its thickness or design eases the prototype’s stiffness customization without changing the device thickness. Such a modification can easily be replicated, unlike changing the prototype’s stiffness by changing the material itself, which requires developing the prototype from scratch. Unlike a flexible printed circuit of uniform and standard thickness, the proposed 3D-printed internal structure can be designed to provide uniform or nonuniform stiffness across the device surface. This freedom to manipulate the stiffness at different locations of the same device can be utilized for affordance and ease of performing bend gestures, in addition to achieving the same effect with external grooves on the device surface. In addition to reducing the cost of prototyping, one-part silicone sealant also enables upgrading a prototype without fabricating it from scratch. The process of developing Nāmya V2 from Nāmya V1 indicates that this fabrication approach allows ease of upgrade. The use of 3D-printed internal structures also reduces the fabrication cost (provided 3D printing facilities are readily available) and helps upgrade an existing prototype easily. Although this internal structure is used as an alternative to an FPC, considering the advantages of sensor pockets with additional sliding space, this structure can also be used along with an FPC to create more robust prototypes. Moreover, with the help of a 3D modeling program, the shared 3D models (https://drive.google.com/drive/folders/1KmyoMf_p8yw23PCuVGmmX6NDcybspFFG) of the molds and the internal structures could easily be modified to support more sensors or to prototype flexible devices of various types (handheld and wearable, to name a few) and dimensions.
Overall, the use of the 3D-printed internal structure with one-part silicone allows ease of stiffness customization, reproduction, and upgrade. In comparison with existing literature reported in Table 1, the proposed DIY fabrication approach enables the development of deformable prototypes with off-the-shelf materials to perform multimodal interaction. We believe this prototyping process could also benefit classroom training as it opens up several new opportunities for prototype and process customization using different molds and internal structures.
Nāmya V3 is a functional prototype for bend gesture and touch-based interaction. The four bidirectional bend sensors placed at the corners can detect bend gestures at the four corners and along the top and bottom sides, in both upward and downward directions. It can also detect compound bend gestures performed at the above-mentioned locations and directions. Additional bend sensors can enable the recognition of bend gestures at other locations, along with the recognition of deformation gestures other than bending and folding the device [35, 39, 51]. Although threshold-based activation of the bend sensors could be used to recognize these bend gestures, applying a machine learning algorithm in this context could deliver promising results with a limited number of bend sensors.
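As a concrete illustration of the threshold-based approach, the following Python sketch classifies the bend direction at each corner from a bidirectional bend sensor reading and combines simultaneous activations into a compound gesture. The resting ADC value, threshold margin, and corner names are hypothetical placeholders, not calibration values from the prototype.

```python
# Hypothetical threshold-based bend gesture detection for four
# bidirectional bend sensors; resting value and margin are assumptions.
REST = 512       # assumed 10-bit ADC reading of an unbent sensor
THRESHOLD = 60   # assumed activation margin around the resting value

CORNERS = ("top-left", "top-right", "bottom-left", "bottom-right")

def classify(reading):
    """Map one sensor reading to a bend direction, or None if inactive."""
    if reading > REST + THRESHOLD:
        return "up"
    if reading < REST - THRESHOLD:
        return "down"
    return None

def detect_gesture(readings):
    """Return the active (corner, direction) pairs; two or more active
    corners constitute a compound bend gesture."""
    return [(c, d) for c, r in zip(CORNERS, readings)
            if (d := classify(r)) is not None]

# Example: top-right corner bent upward, bottom-left bent downward
print(detect_gesture([512, 600, 430, 515]))
# → [('top-right', 'up'), ('bottom-left', 'down')]
```

In practice, the resting value and margin would be calibrated per sensor, since mounting a sensor inside a pocket shifts its unbent resistance; a learned classifier could replace `classify` where thresholds alone are too coarse.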
We believe this prototyping technique, with flexible 3D-printed internal structures embedded in a silicone cast, can be applied to fabricate other flexible digital devices, provided the prototype thickness offers enough space to embed the internal structure equipped with sensors, actuators, and circuits. Although the prototypes reported in this work are limited to smartphone-sized handheld devices, deformable user interfaces can have various dimensions and shapes. To fabricate other functional deformable prototypes, the form factor of the silicone cast, the design of the internal structure, and the type and position of the sensors can be decided based on the usage scenarios of the deformable device and the gesture space determined for deformation-based interaction. EVA foam-based nonfunctional prototypes can be helpful during the initial exploration of device form factors and of sensor types and positions before developing silicone cast prototypes. However, one major drawback of silicone cast prototypes is that silicone shrinks after curing, so they may not be suitable if an experiment demands precise prototype dimensions. Another limitation of the current prototype is the metallic colour of the conductive fabric used as touch points, which prevents the prototype's surface from being used for projected display. Using a lighter-coloured conductive fabric (containing cotton and silver yarn) and mixing a similar colour into the one-part silicone, or using a white one-part silicone sealant during fabrication, can address this issue.
In this work, we presented a DIY fabrication approach for Nāmya, a flexible, silicone cast, smartphone-sized prototype that allows multimodal interaction through bend gestures and touch-based input. We reported our exploration of several flexible materials and material combinations to develop a flexible yet robust prototype. We also investigated different types and placements of sensors to study the features of flexible smartphone-sized devices. We found that one-part silicone sealant, a 3D-printed internal structure, and conductive fabric allow robust prototyping of flexible devices for multimodal interaction using commercially available bend sensors and capacitive touch sensor controllers. Moreover, the proposed fabrication technique, using one-part silicone with flexible 3D-printed internal structures, offers ease of stiffness customization, reproduction, and upgrade. We believe this work and the shared 3D models of the molds and internal structures will help designers and researchers of deformable devices.
Data Availability
The data used to support the findings of this study are included within the article.
Conflicts of Interest
The authors declare that they have no conflicts of interest.
References
[1] A. Boem and G. M. Troiano, "Non-rigid HCI: a review of deformable interfaces and input," in Proceedings of the 2019 on Designing Interactive Systems Conference, pp. 885–906, New York, June 2019.
[2] A. Girouard and A. K. Eady, "Deformable user interfaces: using flexible electronics for human computer interaction," in 2018 International Flexible Electronics Technology Conference (IFETC), pp. 1–3, Ottawa, Canada, August 2018.
[3] C. Schwesig, I. Poupyrev, and E. Mori, "Gummi: user interface for deformable computers," in CHI'03 Extended Abstracts on Human Factors in Computing Systems, pp. 954–955, United States, April 2003.
[4] C. Schwesig, I. Poupyrev, and E. Mori, "Gummi: a bendable computer," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 263–270, New York, United States, April 2004.
[5] D. Holman, R. Vertegaal, M. Altosaar, N. Troje, and D. Johns, "Paper windows: interaction techniques for digital paper," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 591–599, United States, April 2005.
[6] B. Lahey, A. Girouard, W. Burleson, and R. Vertegaal, "PaperPhone: understanding the use of bend gestures in mobile devices with flexible electronic paper displays," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 1303–1312, New York, United States, May 2011.
[7] F. Daliri and A. Girouard, "Visual feedforward guides for performing bend gestures on deformable prototypes," Graphics Interface, pp. 209–216, 2016.
[8] G. Herkenrath, T. Karrer, and J. Borchers, "Twend: twisting and bending as new interaction gesture in mobile devices," in CHI'08 Extended Abstracts on Human Factors in Computing Systems, pp. 3819–3824, New York, United States, April 2008.
[9] P. P. Borah, J. Seth, and K. Sorathia, "Gesture action mapping for deformation-based one-handed landscape mode interaction with flexible smartphone-sized devices," in India HCI, pp. 1–10, New York, United States, February 2021.
[10] P. P. Borah and K. Sorathia, "Natural and intuitive deformation gestures for one-handed landscape mode interaction," in Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction, pp. 229–236, New York, United States, March 2019.
[11] D. B. Faustino, S. Nabil, and A. Girouard, "Bend or PIN: studying bend password authentication with people with vision impairment," 2019.
[12] S. Maqsood, "Shoulder surfing susceptibility of bend passwords," in CHI'14 Extended Abstracts on Human Factors in Computing Systems, pp. 915–920, New York, United States, April 2014.
[13] S. Maqsood, S. Chiasson, and A. Girouard, "Bend Passwords: using gestures to authenticate on flexible devices," Personal and Ubiquitous Computing, vol. 20, no. 4, pp. 573–600, 2016.
[14] J. Lo and A. Girouard, "Fabricating bendy: design and development of deformable prototypes," IEEE Pervasive Computing, vol. 13, no. 3, pp. 40–46, 2014.
[15] P. P. Borah, K. Sorathia, and S. Sarcar, "User-defined bend gesture completion strategies for discrete and continuous inputs," in IFIP Conference on Human-Computer Interaction, pp. 192–202, Springer, Cham, 2021.
[16] M. Ernst, T. Swan, V. Cheung, and A. Girouard, "Typhlex: exploring deformable input for blind users controlling a mobile screen reader," IEEE Pervasive Computing, vol. 16, no. 4, pp. 28–35, 2017.
[17] J. Burstyn, A. Banerjee, and R. Vertegaal, "FlexView: an evaluation of depth navigation on deformable mobile devices," in Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction, pp. 193–200, New York, United States, February 2013.
[18] A. Girouard, M. Jessica Lo, F. D. Riyadh, A. K. Eady, and J. Pasquero, "One-handed bend interactions with deformable smartphones," in Proceedings of the 33rd Annual ACM Conference on Human Factors in Computing Systems, pp. 1509–1518, New York, United States, April 2015.
[19] K. S. Kuang, W. J. Cantwell, and P. J. Scully, "An evaluation of a novel plastic optical fibre sensor for axial strain and bend measurements," Measurement Science and Technology, vol. 13, no. 10, article 1523, 2002.
[20] C. Rendl, D. Kim, S. Fanello et al., "FlexSense: a transparent self-sensing deformable surface," in Proceedings of the 27th Annual ACM Symposium on User Interface Software and Technology, pp. 129–138, New York, United States, October 2014.
[21] C. Rendl, D. Kim, P. Parzer et al., "Flexcase: enhancing mobile interaction with a flexible sensing and display cover," in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, pp. 5138–5150, New York, United States, May 2016.
[22] K. Watanabe, R. Yamamura, and Y. Kakehi, "foamin: a deformable sensor for multimodal inputs based on conductive foam with a single wire," in Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1–4, New York, United States, May 2021.
[23] A. K. Eady, "Deformable interactions to improve the usability of handheld mobile devices," in Proceedings of the 21st International Conference on Human-Computer Interaction with Mobile Devices and Services, pp. 1–5, New York, United States, October 2019.
[24] T. Tajika, T. Yonezawa, and N. Mitsunaga, "Intuitive page-turning interface of e-books on flexible e-paper based on user studies," in Proceedings of the 16th ACM International Conference on Multimedia, pp. 793–796, New York, United States, October 2008.
[25] D. Wightman, T. Ginn, and R. Vertegaal, "Bendflip: examining input techniques for electronic book readers with flexible form factors," in IFIP Conference on Human-Computer Interaction, pp. 117–133, Springer, Berlin, Heidelberg, 2011.
[26] K. Sorathia, A. Singh, and M. Chhabra, "BendSwipe: one handed target zooming for flexible handheld display," in IFIP Conference on Human-Computer Interaction, pp. 431–435, Springer, Cham, 2017.
[27] J. Steimle, A. Jordt, and P. Maes, "Flexpad: highly flexible bending interactions for projected handheld displays," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 237–246, New York, United States, April 2013.
[28] M. Ernst and A. Girouard, "Exploring haptics for learning bend gestures for the blind," in Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, pp. 2097–2104, New York, United States, May 2016.
[29] S. K. Kane, C. Jayant, J. O. Wobbrock, and R. E. Ladner, "Freedom to roam: a study of mobile device adoption and accessibility for people with visual and motor disabilities," in Proceedings of the 11th International ACM SIGACCESS Conference on Computers and Accessibility, pp. 115–122, New York, United States, October 2009.
[30] A. Oliveira, R. Feyzi Behnagh, L. Ni, A. A. Mohsinah, K. J. Burgess, and L. Guo, "Emerging technologies as pedagogical tools for teaching and learning science: a literature review," Human Behavior and Emerging Technologies, vol. 1, no. 2, pp. 149–160, 2019.
[31] M. Raento, A. Oulasvirta, and N. Eagle, "Smartphones: an emerging tool for social scientists," Sociological Methods & Research, vol. 37, no. 3, pp. 426–454, 2009.
[32] J. Kildal, A. Lucero, and M. Boberg, "Twisting touch: combining deformation and touch as input within the same interaction cycle on handheld devices," in Proceedings of the 15th International Conference on Human-Computer Interaction with Mobile Devices and Services, pp. 237–246, New York, United States, August 2013.
[33] K. Warren, J. Lo, V. Vadgama, and A. Girouard, "Bending the rules: bend gesture classification for flexible displays," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 607–610, New York, United States, April 2013.
[34] S.-S. Lee, Y.-K. Lim, and K.-P. Lee, "Exploring the effects of size on deformable user interfaces," in Proceedings of the 14th International Conference on Human-Computer Interaction with Mobile Devices and Services Companion, pp. 89–94, New York, United States, September 2012.
[35] S.-S. Lee, S. Kim, B. Jin et al., "How users manipulate deformable displays as input devices," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 1647–1656, New York, United States, April 2010.
[36] J. Kildal, "Interacting with deformable user interfaces: effect of material stiffness and type of deformation gesture," in International Conference on Haptic and Audio Interaction Design, pp. 71–80, Springer, Berlin, Heidelberg, 2012.
[37] J. Kildal and G. Wilson, "Feeling it: the roles of stiffness, deformation range and feedback in the control of deformable UI," in Proceedings of the 14th ACM International Conference on Multimodal Interaction, pp. 393–400, New York, United States, October 2012.
[38] Y. Nakagawa, A. Kamimura, and Y. Kawaguchi, "MimicTile: a variable stiffness deformable user interface for mobile devices," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 745–748, New York, United States, May 2012.
[39] J. Kildal, S. Paasovaara, and V. Aaltonen, "Kinetic device: designing interactions with a deformable mobile interface," in CHI'12 Extended Abstracts on Human Factors in Computing Systems, pp. 1871–1876, New York, United States, May 2012.
[40] T. T. Ahmaniemi, J. Kildal, and M. Haveri, "What is a device bend gesture really good for?" in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 3503–3512, New York, United States, April 2014.
[41] V. S. Apte, The Practical Sanskrit-English Dictionary: Containing Appendices on Sanskrit Prosody and Important Literary and Geographical Names of Ancient India, Motilal Banarsidass Publications, India, 1965.
[42] A. A. Macdonell, A Practical Sanskrit Dictionary with Transliteration, Accentuation, and Etymological Analysis Throughout, Motilal Banarsidass Publications, New Delhi, India, 2004.
[43] M. Monier-Williams, "Sanskrit-English Dictionary," 1964.
[44] C.-Y. Chien, R.-H. Liang, L.-F. Lin, L. Chan, and B.-Y. Chen, "Flexibend: enabling interactivity of multi-part, deformable fabrications using single shape-sensing strip," in Proceedings of the 28th Annual ACM Symposium on User Interface Software & Technology, pp. 659–663, New York, United States, November 2015.
[45] R. Slyper, I. Poupyrev, and J. Hodgins, "Sensing through structure: designing soft silicone sensors," in Proceedings of the Fifth International Conference on Tangible, Embedded, and Embodied Interaction, pp. 213–220, New York, United States, January 2010.
[46] P. Parzer, A. Sharma, A. Vogl, J. Steimle, A. Olwal, and M. Haller, "SmartSleeve: real-time sensing of surface and deformation gestures on flexible, interactive textiles, using a hybrid gesture detection pipeline," in Proceedings of the 30th Annual ACM Symposium on User Interface Software and Technology, pp. 565–577, New York, United States, October 2017.
[47] S. H. Yoon, L. Paredes, K. Huo, and K. Ramani, "MultiSoft: soft sensor enabling real-time multimodal sensing with contact localization and deformation classification," Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 2, no. 3, pp. 1–21, 2018.
[48] M. Teyssier, G. Bailly, C. Pelachaud, E. Lecolinet, A. Conn, and A. Roudaut, "Skin-on interfaces: a bio-driven approach for artificial skin design to cover interactive devices," in Proceedings of the 32nd Annual ACM Symposium on User Interface Software and Technology, pp. 307–322, New York, United States, October 2019.
[49] F. Shahmiri and P. H. Dietz, "Sharc: a geometric technique for multi-bend/shape sensing," in Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, pp. 1–12, New York, United States, April 2020.
[50] T. Guerreiro, K. Montague, J. Guerreiro, R. Nunes, H. Nicolau, and D. J. V. Gonçalves, "Blind people interacting with large touch surfaces: strategies for one-handed and two-handed exploration," in Proceedings of the 2015 International Conference on Interactive Tabletops & Surfaces, pp. 25–34, New York, United States, November 2015.
[51] P. Shorey and A. Girouard, "Bendtroller: an exploration of in-game action mappings with a deformable game controller," in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, pp. 1447–1458, New York, United States, May 2017.