
Adnan Mehmood, Han He, Xiaochen Chen, Aleksi Vianto, Ville Vianto, Oğuz ‘Oz’ Buruk, Johanna Virkki, "ClothFace: A Passive RFID-Based Human-Technology Interface on a Shirtsleeve", Advances in Human-Computer Interaction, vol. 2020, Article ID 8854042, 8 pages, 2020. https://doi.org/10.1155/2020/8854042

ClothFace: A Passive RFID-Based Human-Technology Interface on a Shirtsleeve

Academic Editor: Francesco Bellotti
Received: 13 Mar 2020
Revised: 03 Jul 2020
Accepted: 18 Jul 2020
Published: 05 Aug 2020

Abstract

This paper introduces ClothFace, a shirtsleeve-integrated human-technology interface platform, which comprises two wrist antennas and three radio frequency identification (RFID) integrated circuits (ICs), each with a unique ID. The platform prototype, which is created on a shirtsleeve by cutting the antennas and antenna-IC interconnections from copper tape, can be used for push button and swipe controlling. Each IC can be activated, i.e., electrically connected to the two antennas, by touching the IC. These ICs can act as wireless input buttons to the technology around us. Due to the passive ultrahigh-frequency (UHF) RFID technology used, there is no need for clothing-integrated energy sources: the interface platform receives all the energy it needs from an external RFID reader. The platform prototype was found to be readable with an external RFID reader from all directions at distances of 70–80 cm. Further, seven people tested the prototype sleeve, both on a table and on the body, giving altogether 1400 inputs. In these first tests, success rates of 96–100% (table) and 92–100% (on-body) were achieved in a gamelike testing setup. Further, the platform was shown to be readable with an off-the-shelf handheld RFID reader from a distance of 40 cm. Based on these initial results, this implementation holds the potential to be used as a touch interface blended into daily clothing, as well as a modular touch-based interaction platform that can be integrated into the surfaces of electronic devices, such as home appliances.

1. Introduction

Technology today allows us to communicate and participate, as well as play games, organize our lives, and learn new things. Digital devices are also rapidly taking a larger role in daily health monitoring, along with their increasing use for supporting exercising, activation, and rehabilitation. However, the current handheld, screen-based, and touch-operated devices are not ideal for all consumers and all use situations [1–3]. For example, lowered cognitive skills, dry fingertips, bad eyesight, and decreased motor skills can prevent the efficient use of digital devices. The available alternatives are usually based on voice or body movement control. Voice-controlled interfaces, such as Apple's Siri, Amazon's Alexa, Microsoft's Cortana, and Google Assistant, while popular and developing quickly, have their own challenges, such as limited linguistic coverage and difficulties in noisy, as well as strictly noiseless, environments. Similarly, there are commercial interfaces that use human body movements, such as the Nintendo Wii Remote, Sony PlayStation Move, Jacquard by Google, Kinect from Microsoft, the Myo armband from Thalmic Labs, and the e-skin sensor shirt from Xenoma. Further, different technologies have been suggested for detecting hand movement on the human body, such as various sensor technologies [4–9] and interactive textiles [10–13]. All the available solutions, however, require either a line of sight to work, i.e., the gesture maker needs to be directly seen, or an on-board power source, which increases their cost and limits their practical daily use. To overcome these challenges, Wi-Fi signals have been used for gesture monitoring [14–16] but, despite the promising results, these solutions struggle in multiuser environments and are only functional in a specific use environment [17].

Passive RFID (radio frequency identification) technology, especially in the UHF (ultrahigh frequency) range, has the potential to overcome the problems mentioned above. Passive RFID uses battery-free, remotely addressable electronic tags, composed only of an antenna and a small RFID IC (integrated circuit) component with a unique ID. A passive RFID tag gets all its energy from an RFID reader and responds by backscattering. Variations in the backscattered signal strengths and phases of body-attached passive RFID tags have been shown to provide information about body positions and movements [18–24]. Further, when a user touches an RFID tag with a finger, the touch manifests as a change in the phase of the tag's backscattered signal [25, 26]. Moreover, versatile sensing options have been integrated into passive RFID tags [27–32], and by tracking changes in the tags' backscattered signals, passive UHF RFID tags have been used, for example, as autonomous strain [33–37] and moisture [38–42] sensors. Despite these promising results, however, the backscattered signals of passive RFID tags have proven noisy, unstable, and strongly affected by the environment. Thus, a new type of approach is needed in order to fully benefit from passive UHF RFID in human-technology interaction.

To satisfy this need, we have developed our ClothFace solution, in which several combined passive RFID IC components are simply “switched on and off” by touch-created electrical interconnections to RFID antennas. Due to the unique ID of each IC, these components can then be used as specific input buttons. Our first passive RFID-based human-technology interaction solution was based on a shirt-integrated antenna, which could be used to activate single RFID IC components placed around the user [43]. Next, we further developed this concept of integrating RFID ICs into the environment around us by installing a passive RFID-based platform, consisting of three RFID ICs, on the surface of a wooden table [44]. Now, in this study, we introduce a new ClothFace concept: a shirtsleeve-integrated touchpad solution, which comprises two wrist antennas and three RFID ICs (each with a unique ID). The interface prototype, which is integrated on a shirtsleeve by cutting the antennas and antenna-IC interconnections from copper tape, can be used for push button and swipe controlling. Each IC can be activated, i.e., electrically connected to the two wrist antennas, by touching it with a finger. An external RFID reader antenna provides all the energy needed by the wireless system, which is what makes our solution unique. Our shirtsleeve-integrated interface is fully passive and maintenance-free, with no on-cloth energy sources, which makes fabrication directly into clothing simple and provides great mobility for practical use. The cost of a basic passive RFID IC is only a few cents, which makes our solution cost-effective and convenient for use in daily clothing. Further, by applying a coating, the platform can be made waterproof and fully washable.

2. Platform Design and Fabrication

The design and dimensions of the platform prototype are shown in Figure 1. Our sleeve interface includes two wrist antennas and three RFID ICs, which act as input buttons. The antenna design is based on our previous paper [43], in which we introduced a wrist antenna with antenna bands going around the wrist. The idea of these bands is that the human wrist does not fully cover the antenna, which provides better wireless performance and a longer read range. We now combined two such wrist antennas and connected the RFID IC components to both. This new, simple design allows the three ICs to be readable even when they are not directly facing the RFID reader.

Each RFID IC has a unique ID due to its unique electronic product code (EPC). Based on their EPCs, the ICs (input buttons) in the sleeve interface are numbered 1–3 from left to right. The prototype antennas and interconnections are manufactured from copper tape, which has glue on the backside and can thus be easily integrated into the shirtsleeve. The ICs used (shown in Figure 2) are NXP UCODE G2iL RFID microchips (with a wake-up power of −18 dBm), which the manufacturer has embedded into a plastic film strap structure. Each IC strap has two 3 × 3 mm² copper pads, which are used to attach the component to one of the two copper tape conductors. As presented in Figure 1, in our platform, these two copper tape conductors connect the two copper tape wrist antennas.
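Because each IC's EPC serves as its button identity, the reader side only needs a lookup from the reported EPC to a button number. As a minimal sketch (in Python for illustration, rather than the C# used in the actual testing software; the hex EPC values below are made up, since real EPCs are factory-programmed and would be read out once during setup), the mapping could look like:

```python
# Hypothetical EPC-to-button mapping; the EPC strings are illustrative only.
EPC_TO_BUTTON = {
    "E280116060000001": 1,  # leftmost IC on the sleeve
    "E280116060000002": 2,  # middle IC
    "E280116060000003": 3,  # rightmost IC
}

def button_for(epc):
    """Map a reported EPC to its button number; None for foreign tags."""
    return EPC_TO_BUTTON.get(epc)
```

Returning `None` for unknown EPCs is one simple way to ignore surrounding RFID tags that are not part of the interface.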

In this first prototype of the sleeve interface, the ICs are activated using an “input finger”, which is a piece of textile (a finger cut from a glove) coated with a copper tape material. Alternatively, the ICs are activated by touching them with a bare fingertip. Thus, we are now testing two different methods for further development. When a specific IC strap is touched, the copper tape in the input finger, or the bare finger itself, creates the needed electrical interconnection from the IC strap pad to the second copper tape line and makes that specific IC readable to the RFID reader.

3. Preliminary Tests

As a preliminary test, the sleeve interface is first tested with a ThingMagic M6 RFID reader on an office table. The reader operates in the European standard frequency range (865.6–867.6 MHz) and the transmit power used is 28 dBm. The maximum read range of the sleeve interface (the distance from the RFID reader antenna at which the interface still works flawlessly) is measured from different directions, as presented in Figure 3. Further, the backscattered power of each IC on the sleeve is recorded at the maximum distance. As can be seen from Table 1, the sleeve works from all directions at distances of 70–80 cm, which is a good starting point.


Table 1: Maximum read ranges and backscattered powers of the three ICs, measured from four directions.

Direction 1 (D1): read range 75 cm; backscattered power −51 dBm (IC1), −50 dBm (IC2), −52 dBm (IC3)
Direction 2 (D2): read range 75 cm; backscattered power −43 dBm (IC1), −43 dBm (IC2), −48 dBm (IC3)
Direction 3 (D3): read range 80 cm; backscattered power −55 dBm (IC1), −56 dBm (IC2), −55 dBm (IC3)
Direction 4 (D4): read range 70 cm; backscattered power −46 dBm (IC1), −46 dBm (IC2), −47 dBm (IC3)

Next, the on-body performance of the sleeve interface is tested in an office environment with a handheld mobile RFID reader. In order to evaluate the practical potential of the sleeve, it is tested on a male subject (as presented in Figure 3) and the maximum read range is measured. The reader (Nordic ID Medea) operates at 866 MHz. The three ICs on the sleeve operate efficiently at 40 cm from the mobile reader.

4. Practical Test

As presented in Figure 4, the sleeve interface is tested while placed on a wooden table (meaning without the effects of the human body) and while worn on body (where the lossy human body as well as body movement may affect the antenna performance). The sleeve interface is tested by seven users (three females and four males), who are each asked to give 200 random inputs by our gamelike testing software. Four of the users test the interface with the input finger, while three of them test the interface with a bare finger. By using both methods, only a gentle touch is needed to activate the ICs.

Figure 5 shows the measurement setup, which includes the interface platform, integrated into a sleeve of a cotton shirt; one circularly polarized reader antenna, attached to the ThingMagic M6 RFID reader through a connecting cable; and our testing software user interface. The practical testing of our sleeve interface is done in a normal office environment with wooden and metallic furniture, people, and computers. The wireless environment in the office is also normal, meaning that mobile phones are in use and there is an indoor Wi-Fi signal. As shown in Figure 4, the reader antenna was placed directly opposite the wrist antennas on the table, while it faced the wrist antennas at a 90-degree angle when the interface was tested on the body. The goal was to get a better understanding of the practical use possibilities.

Figure 6 shows an example of a situation where the software asks the user to touch button 3 and the user is controlling the platform with a bare finger. Further, in Figure 7, the software asks the user to swipe right, and the user is controlling the platform with the input finger. The testing software is developed on the .NET Framework in C# as a Windows Forms application. It uses the ThingMagic Mercury API to control the M6 reader and to filter the received RFID tag IDs, focusing only on the ICs under test (so that it is not disturbed by any surrounding RFID tags). The ThingMagic Mercury API supports continuous reading, so it was chosen to retrieve RFID tags from the M6 reader. The testing software asks the users to perform the following actions in random order: swipe left (i.e., touch buttons 3–1), swipe right (i.e., touch buttons 1–3), touch button 1, touch button 2, and touch button 3.
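The swipe and button actions above can be viewed as a classification over the time-ordered stream of activated button numbers. The following Python sketch illustrates the idea (the actual software is C#; collapsing consecutive duplicate reads, which the reader produces while a finger stays on an IC, is our assumption about how a held touch is handled):

```python
def classify(events):
    """Classify a time-ordered list of activated button numbers (1-3).

    Consecutive duplicates are collapsed first, since the reader keeps
    reporting a touched IC for as long as the finger stays on it.
    """
    seq = []
    for b in events:
        if not seq or seq[-1] != b:
            seq.append(b)
    if seq == [1, 2, 3]:
        return "swipe right"        # buttons touched 1-3
    if seq == [3, 2, 1]:
        return "swipe left"         # buttons touched 3-1
    if len(seq) == 1:
        return "touch button %d" % seq[0]
    return "unrecognized"
```

For example, the read stream `[3, 3, 2, 1, 1]` collapses to `[3, 2, 1]` and is classified as a left swipe.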

During testing, the system stores the asked input, the given input, and whether the given input matched the asked input. As shown in Figure 6, the testing software screen is initially blue. When the correct input is detected, the screen turns green and the software stores the data as “1”. If there is no input for 5 seconds or a wrong input is given, the screen turns red and “0” is saved.
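The trial logic described above (random prompt, 5-second window, storing “1” or “0”) can be sketched as follows. This is an illustrative Python outline, not the actual C# implementation; `read_input` is a hypothetical stand-in for the RFID-driven input detection, expected to return the detected action or `None` on timeout:

```python
import random

ACTIONS = ["swipe left", "swipe right",
           "touch button 1", "touch button 2", "touch button 3"]

def run_trial(read_input, asked=None, timeout=5.0):
    """One gamelike trial: prompt a random action, wait up to `timeout`
    seconds for an input, and record 1 (correct) or 0 (wrong or none)."""
    if asked is None:
        asked = random.choice(ACTIONS)
    given = read_input(timeout)        # hypothetical: blocks until input or timeout
    correct = 1 if given == asked else 0
    return asked, given, correct
```

A full 200-input session would simply call `run_trial` in a loop and append the stored triples to a log.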

Table 2 shows the results from the four testers who used the interface with the input finger, while Table 3 presents the results from the three users who used the platform with their bare fingers. The overall success rates for the table and arm setups are 96–100% and 92–100%, respectively. These preliminary results show that this sleeve-integrated platform can attain high input accuracy in normal office conditions, despite the challenging wireless environment. The performance is equally good on a table and on the body, which supports our novel wearable platform design with two wrist antennas attached to the ICs. There is no significant difference between the results achieved with the input finger and a bare finger. The table results with a bare finger and with the input finger are 96–99% and 96–100%, respectively, while the on-body results are 92–97% and 97–100%, respectively. Based on these results, it is possible to control the ClothFace interface with a bare finger, which removes the need for a specific input finger and provides more flexibility for practical use. Thus, the development of the next prototype will focus on using the platform with a bare finger.
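The per-subject percentages in Tables 2 and 3 follow directly from the 200 stored 1/0 flags of each session; for example:

```python
def success_rate(flags):
    """Percentage of correct inputs from a session's stored 1/0 flags."""
    return 100.0 * sum(flags) / len(flags)

# 192 correct answers out of 200 trials correspond to a 96% success rate.
```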


Table 2: Success and error rates (%) with the input finger.

Female 1: table 99 success / 1 error; arm 99 success / 1 error
Female 2: table 100 success / 0 error; arm 98 success / 2 error
Male 1: table 96 success / 4 error; arm 97 success / 3 error
Male 2: table 98 success / 2 error; arm 100 success / 0 error


Table 3: Success and error rates (%) with a bare finger.

Female 3: table 96 success / 4 error; arm 97 success / 3 error
Male 3: table 98 success / 2 error; arm 92 success / 8 error
Male 4: table 99 success / 1 error; arm 95 success / 5 error

As can be seen from Figures 6 and 7, the sleeve interface may get crumpled when used. This may result in unwanted inputs, or in wanted inputs being ignored by the system. This did not occur during any of the tests but needs to be considered in the future. Thus, our plan is to fabricate the next prototype from electrotextiles, which blend into clothing better than copper tape. Further, we expect that a protective coating will also help with the abovementioned challenge.

Previous projects, such as Google Jacquard [10], provide similar interaction modalities; however, due to their complex structures and components, they need specifically designed clothes, which limits variety in terms of aesthetics and functionality. In contrast, our implementation promises versatility in terms of visual aesthetics, interaction resolution and area, and broad opportunities for customization to different uses and contexts. The fundamental strength of the user interface implemented here lies in its passive nature, cost-effectiveness, and simple implementation into clothes.

It is not hard to imagine rapidly deploying ClothFace to different types of clothing, such as pants, gloves, or hats. This versatility suggests using our system as a platform for implementing and testing many speculative on-body gesture design studies, such as [45, 46]. Moreover, although the antennas and the circuit can be concealed beneath the cloth, the form factor of the current implementation also invites fashion design studies that could help create a new visual language for RFID-based smart clothing, leaving the copper parts visible through diverse visual interpretations. The results achieved with the handheld mobile reader support our future goal of studying such interfaces with mobile phone-integrated UHF RFID readers. If mobile phones widely adopt UHF RFID readers, clothes can be designed for different functions in various forms. For example, we could produce a diverse array of gaming t-shirts representing distinct game characters. The easy deployment of our system can facilitate producing different types of input commands on various parts of the body for controlling games. This would create unique player experiences for different characters according to their special abilities in a game and would be a worthy contribution to the emerging gaming wearable area [47, 48]. This versatility can also be extended to daily life tasks, such as kitchen aprons interacting with the house while preparing a meal, or pajamas giving access to the controls of the TV and the music system. Our solution can give special groups, such as disabled people with different limitations and elderly people with impaired eyesight and memory, more possibilities for independent living. Further and more detailed implementations of this system will also make it possible to explore body-related interactions rapidly, which can yield a platform for design research in areas such as Somaesthetic Design [49]. These speculative ideas demonstrate the versatility of our implementation, and our further work encapsulates exploring these distinct dimensions.

In this direction, our next step as further work is to create a wearable prototype, which can be comfortably used for daily life actions. The sleeve platform will be embedded between two layers of cotton textiles. The copper tape lines and the RFID IC strap pads inside will be separated by a thin textile net, which means that they will form a contact when the surface of the top layer textile is touched. This will remove the need for a specific input finger and integrate the platform seamlessly into the shirtsleeve.

5. Conclusions

We introduced the ClothFace sleeve, a passive UHF RFID-based human-technology interface integrated into the sleeve of a cotton shirt. The created textile touchpad enables push button and swipe controlling without on-cloth energy sources. During practical testing, 96–100% and 92–100% success rates were achieved on a table and on the body, respectively. These results are very encouraging, especially considering that the sleeve interface, being cost-effective and flexible, promises versatile and practical application areas in an extensive number of different contexts. As these preliminary results suggest that it is possible to control the platform with a bare finger, the next prototype will be developed to be controlled without an input finger, which will make it more flexible and more suitable for practical use. Moreover, the prototype will be further developed by seamlessly integrating it into clothing. The next goals are to make the platform smaller in size, optimize the antenna design for longer read ranges, and test different coating materials to achieve washability. Due to the promising results achieved with the handheld mobile reader, our plan is to start tests with mobile phone-integrated UHF RFID readers, which can be held in a pocket for a truly mobile system. We imagine that this type of clothing-integrated user interface opens possibilities especially for special needs groups, such as people using alternative and assistive communication technologies. By using individually tailored mobile phone applications, these people could benefit from versatile “communication clothes” in their daily lives.

Data Availability

The measurement data used to support the findings of this study are available from the corresponding author upon request.

Conflicts of Interest

The authors declare no conflicts of interest.

Acknowledgments

This work has been funded by The Finnish Cultural Foundation, Jane and Aatos Erkko Foundation, the Academy of Finland (decision number 294534), and the European Union’s Horizon 2020 Research and Innovation Programme under the Marie Sklodowska-Curie grant agreement no. 833731-WEARTUAL.

References

  1. T. L. Baldi, G. Spagnoletti, M. Dragusanu, and D. Prattichizzo, “Design of a wearable interface for lightweight robotic arm for people with mobility impairments,” in Proceedings of the IEEE International Conference on Rehabilitation Robotics, London, UK, July 2017. View at: Google Scholar
  2. R. Y. Y. Chan, J. Ding, L. W. Kong et al., “Making telecommunications services accessible to people with severe communication disabilities,” in Proceedings of the IEEE Global Humanitarian Technology Conference, Seattle, WA, USA, October 2016. View at: Google Scholar
  3. H. Inoue, H. Nishino, and T. Kagawa, “Foot-controlled interaction assistant based on visual tracking,” in Proceedings of the IEEE International Conference on Consumer Electronics, Taipei, Taiwan, January 2015. View at: Google Scholar
  4. C. Harrison, D. Tan, and D. Morris, “Skinput: appropriating the body as an input surface,” in Proceedings of the ACM Conference on Human Factors in Computing Systems, Atlanta, GA, USA, April 2010. View at: Google Scholar
  5. G. Laput, R. Xiao, X. A. Chen, S. E. Hudson, and C. Harrison, “Skin buttons: cheap, small, low-powered and clickable fixed-icon laser projectors,” in Proceedings of the ACM Symposium on User Interface Software and Technology, Honolulu, HI, USA, October 2014. View at: Google Scholar
  6. S. Y. Lin, C. H. Su, K. Y. Cheng, R. H. Liang, T. H. Kuo, and B. Y. Chen, “Pub-point upon body: exploring eyes-free interaction and methods on an arm,” in Proceedings of the ACM Symposium on User Interface Software and Technology, Santa Barbara, CA, USA, October 2011. View at: Google Scholar
  7. R. Lissermann, J. Huber, A. Hadjakos, S. Nanayakkara, and M. Mühlhäuser, “EarPut: augmenting ear-worn devices for ear-based interaction,” in Proceedings of the ACM Computer-Human Interaction Conference on Designing Futures, Sydney, Australia, December 2014. View at: Google Scholar
  8. M. Weigel, T. Lu, G. Bailly, A. Oulasvirta, C. Majidi, and J. Steimle, “iSkin: flexible, stretchable and visually customizable on-body touch sensors for mobile computing,” in Proceedings of the ACM Conference on Human Factors in Computing Systems, Seoul, Republic of Korea, April 2015. View at: Google Scholar
  9. N. Hamdan, R. K. Kosuru, C. Corsten, and J. Borchers, “Run&Tap: investigation of on-body tapping for runners,” in Proceedings of the ACM International Conference on Interactive Surfaces and Spaces, Brighton, UK, October 2017. View at: Google Scholar
  10. I. Poupyrev, N. Gong, S. Fukuhara, M. Karagozler, C. Schwesig, and K. Robinson, “Project jacquard: interactive digital textiles at scale,” in Proceedings of the ACM Conference on Human Factors in Computing Systems, San Jose, CA, USA, May 2016. View at: Google Scholar
  11. N. Hamdan, J. R. Blum, F. Heller, R. K. Kosuru, and J. Borchers, “Grabbing at an angle: menu selection for fabric interfaces,” in Proceedings of the ACM International Symposium on Wearable Computers, Heidelberg, Germany, September 2016. View at: Google Scholar
  12. T. Karrer, M. Wittenhagen, L. Lichtschlag, F. Heller, and J. Borchers, “Pinstripe: eyes-free continuous input on interactive clothing,” in Proceedings of the ACM Conference on Human Factors in Computing Systems, Vancouver, Canada, May 2011. View at: Google Scholar
  13. P. Parzer, A. Sharma, A. Vogl, J. Steimle, A. Olwal, and M. Haller, “SmartSleeve: real-time sensing of surface and deformation gestures on flexible, interactive textiles, using a hybrid gesture detection pipeline,” in Proceedings of the ACM Symposium on User Interface Software and Technology, Québec City, Canada, October 2017. View at: Google Scholar
  14. Q. Pu, S. Gupta, S. Gollakota, and S. Patel, “Whole-home gesture recognition using wireless signals,” in Proceedings of the ACM International Conference on Mobile Computing & Networking, Miami, FL, USA, 2013. View at: Google Scholar
  15. H. Abdelnasser, M. Youssef, and K. A. Harras, “WiGest: a ubiquitous WiFi-based gesture recognition system,” in Proceedings of the Conference on Computer Communications, Kowloon, Hong Kong, October 2015. View at: Google Scholar
  16. W. Wang, A. X. Liu, M. Shahzad, K. Ling, and S. Lu, “Understanding and modeling of WiFi signal based human activity recognition,” in Proceedings of the International Conference on Mobile Computing and Networking, Paris, France, September 2015. View at: Google Scholar
  17. H. Jiang, C. Cai, X. Ma, Y. Yang, and J. Liu, “Smart home based on WiFi sensing: a survey,” IEEE Access, vol. 6, pp. 13317–13325, 2018. View at: Publisher Site | Google Scholar
  18. S. Manzari, C. Occhiuzzi, and G. Marrocco, “Feasibility of body-centric systems using passive textile RFID tags,” IEEE Antennas and Propagation Magazine, vol. 54, no. 4, pp. 49–62, 2012. View at: Publisher Site | Google Scholar
  19. R. Krigslund, S. Dosen, P. Popovski, J. L. Dideriksen, G. F. Pedersen, and D. Farina, “A novel technology for motion capture using passive UHF RFID tags,” IEEE Transactions on Biomedical Engineering, vol. 60, no. 5, pp. 1453–1457, 2013. View at: Publisher Site | Google Scholar
  20. R. Krigslund, P. Popovski, and G. F. Pedersen, “3D gesture recognition using passive RFID tags,” in Proceedings of the IEEE Antennas and Propagation Society International Symposium, Orlando, FL, USA, July 2013. View at: Google Scholar
  21. S. Amendola, L. Bianchi, and G. Marrocco, “Movement detection of human body segments: passive radio-frequency identification and machine-learning technologies,” IEEE Antennas and Propagation Magazine, vol. 57, no. 3, pp. 23–37, 2015. View at: Publisher Site | Google Scholar
  22. H. Ding, J. Han, L. Shangguan et al., “A platform for free-weight exercise monitoring with passive tags,” IEEE Transactions on Mobile Computing, vol. 16, no. 12, pp. 3279–3293, 2017. View at: Publisher Site | Google Scholar
  23. J. Wang, D. Vasisht, and D. Katabi, “RF-IDraw: virtual touch screen in the air using RF signals,” in Proceedings of the ACM Computer Communication Review, Chicago, IL, USA, September 2014. View at: Google Scholar
  24. H. Jin, Z. Yang, S. Kumar, and J. I. Hong, “Towards wearable everyday body-frame tracking using passive RFIDs,” Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, vol. 1, no. 4, 2018. View at: Publisher Site | Google Scholar
  25. S. Pradhan, E. Chai, K. Sundaresan, L. Qiu, M. A. Khojastepour, and S. Rangarajan, “RIO: a pervasive RFID-based touch gesture interface,” in Proceedings of the ACM International Conference on Mobile Computing and Networking, Sandy, UT, USA, October 2017. View at: Google Scholar
  26. H. Li, E. Brockmeyer, E. J. Carter, J. Fromm, S. E. Hudson, and S. N. Patel, “PaperID: a technique for drawing functional battery-free wireless interfaces on paper,” in Proceedings of the ACM Conference on Human Factors in Computing Systems, San Jose, CA, USA, May 2016. View at: Google Scholar
  27. S. Caizzone, E. DiGiampaolo, and G. Marrocco, “Wireless crack monitoring by stationary phase measurements from coupled RFID tags,” IEEE Transactions on Antennas and Propagation, vol. 62, no. 12, pp. 6412–6419, 2014. View at: Publisher Site | Google Scholar
  28. S. Lemey, F. Declercq, and H. Rogier, “Textile antennas as hybrid energy-harvesting platforms,” Proceedings of the IEEE, vol. 102, no. 11, pp. 1833–1857, 2014. View at: Publisher Site | Google Scholar
  29. C. Occhiuzzi, S. Cippitelli, and G. Marrocco, “Modeling, design and experimentation of wearable RFID sensor tag,” IEEE Transactions on Antennas and Propagation, vol. 58, no. 8, pp. 2490–2498, 2010. View at: Publisher Site | Google Scholar
  30. C. Occhiuzzi, C. Vallese, S. Amendola, S. Manzari, and G. Marrocco, “NIGHT-care: a passive RFID system for Remote monitoring and control of overnight living environment,” Procedia Computer Science, vol. 32, pp. 190–197, 2014. View at: Publisher Site | Google Scholar
  31. T. Kaufmann, D. C. Ranasinghe, M. Zhou, and C. Fumeaux, “Wearable quarter-wave folded microstrip antenna for passive UHF RFID applications,” International Journal of Antennas and Propagation, vol. 2013, Article ID 129839, 11 pages, 2013. View at: Publisher Site | Google Scholar
  32. O. O. Rakibet, C. V. Rumens, J. C. Batchelor, and S. J. Holder, “Epidermal passive RFID strain sensor for assisted technologies,” IEEE Antennas and Wireless Propagation Letters, vol. 13, pp. 814–817, 2014. View at: Publisher Site | Google Scholar
  33. C. Occhiuzzi, C. Paggi, and G. Marrocco, “Passive RFID strain-sensor based on meander-line antennas,” IEEE Transactions on Antennas and Propagation, vol. 59, no. 12, pp. 4836–4840, 2011. View at: Publisher Site | Google Scholar
  34. F. Long, X. Zhang, T. Björninen et al., “Implementation and wireless readout of passive UHF RFID strain sensor tags based on electro-textile antennas,” in Proceedings of the European Conference on Antennas and Propagation, Lisbon, Portugal, 2015. View at: Google Scholar
  35. S. Merilampi, T. Björninen, L. Ukkonen, P. Ruuskanen, and L. Sydänheimo, “Embedded wireless strain sensors based on printed RFID tag,” Sensor Review, vol. 31, no. 1, pp. 32–40, 2011. View at: Publisher Site | Google Scholar
  36. S. Merilampi, T. Björninen, L. Sydänheimo, and L. Ukkonen, “Passive UHF RFID strain sensor tag for detecting limb movement,” International Journal on Smart Sensing Intelligent Systems, vol. 5, no. 2, 2012. View at: Publisher Site | Google Scholar
  37. X. Chen, L. Ukkonen, and T. Björninen, “Passive E-textile UHF RFID-based wireless strain sensors with integrated references,” IEEE Sensors Journal, vol. 16, no. 22, pp. 7835-7836, 2016. View at: Publisher Site | Google Scholar
  38. J. Siden, X. Zeng, T. Unander, A. Koptyug, and H. Nilsson, “Remote moisture sensing utilizing ordinary RFID tags,” in Proceedings of the IEEE Sensors, Atlanta, GA, USA, October 2007. View at: Google Scholar
  39. S. Kim, T. Le, A. Harrabi, A. Collado, and A. Georgiadis, “An RFID-enabled inkjet-printed soil moisture sensor on paper for “smart” agricultural applications,” in Proceedings of the IEEE Sensors, Valencia, Spain, November 2014. View at: Google Scholar
  40. S. Sajal, Y. Atanasov, B. D. Braaten, V. Marinov, and O. Swenson, “A low cost flexible passive UHF RFID tag for sensing moisture based on antenna polarization,” in Proceedings of the IEEE International Conference on Electro/Information Technology, Milwaukee, WI, USA, June 2014. View at: Google Scholar
  41. D. Shuaib, S. Merilampi, L. Ukkonen, and J. Virkki, “The possibilities of embroidered passive UHF RFID textile tags as wearable moisture sensors,” in Proceedings of the International Conference on Serious Games and Applications for Health, Perth, Australia, April 2017. View at: Google Scholar
  42. E. Sipilä, J. Virkki, L. Sydänheimo, and L. Ukkonen, “Experimental study on brush-painted passive RFID-based humidity sensors embedded into plywood structures,” International Journal of Antennas and Propagation, vol. 2016, Article ID 1203673, 8 pages, 2016. View at: Publisher Site | Google Scholar
  43. A. Mehmood, S. Qureshi, H. He et al., “Clothing-integrated RFID-based interface for human-technology interaction,” in Proceedings of the International Conference on Serious Games and Applications for Health, Kyoto, Japan, April 2019. View at: Google Scholar
  44. A. Mehmood, V. Vianto, H. He et al., “Passive UHF RFID-based user interface on a wooden surface,” in Proceedings of the Progress in Electromagnetics Research Symposium, Xiamen, China, December 2019. View at: Google Scholar
  45. I. Bostan, O. T. Buruk, M. Canat et al., “Hands as a controller: user preferences for hand specific on-skin gestures,” in Proceedings of the 2017 Conference on Designing Interactive Systems, Edinburgh, UK, June 2017. View at: Google Scholar
  46. M. Weigel, V. Mehta, and J. Steimle, “More than Touch : understanding how people use skin as an input surface for mobile computing,” in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, Toronto, Canada, April 2014. View at: Google Scholar
  47. O. T. Buruk and O. Özcan, “Extracting design guidelines for wearables and movement in tabletop role-playing games via a research through design process,” in Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems, Montréal, Canada, April 2018. View at: Google Scholar
  48. O. T. Buruk, K. Isbister, and T. Tanenbaum, “A design framework for playful wearables,” in Proceedings of the 14th International Conference on the Foundations of Digital Games, San Luis Obispo, CA, USA, August 2019. View at: Google Scholar
  49. K. Höök, M. P. Jonsson, A. Ståhl, and J. Mercurio, “Somaesthetic appreciation design,” in Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems, San Jose, CA, USA, May 2016. View at: Google Scholar

Copyright © 2020 Adnan Mehmood et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

