Concept: Wearable computer
The development of bendable, stretchable, and transparent touch sensors is an emerging technological goal in a variety of fields, including electronic skin, wearables, and flexible handheld devices. Although transparent tactile sensors based on metal mesh, carbon nanotubes, and silver nanowires demonstrate operation in bent configurations, we present a technology that extends the operation modes to the sensing of finger proximity, including light touch, during active bending and even stretching. This is accomplished using stretchable and ionically conductive hydrogel electrodes, which project an electric field above the sensor to couple with and sense a finger. The polyacrylamide electrodes are embedded in silicone. These two widely available, low-cost, transparent materials are combined in a three-step manufacturing technique that is amenable to large-area fabrication. The approach is demonstrated using a proof-of-concept 4 × 4 cross-grid sensor array with a 5-mm pitch. The approach of a finger hovering a few centimeters above the array is readily detectable. Light touch produces a localized decrease in capacitance of 15%. The movement of a finger can be followed across the array, and the location of multiple fingers can be detected. Touch is detectable during bending and stretch, an important feature of any wearable device. The capacitive sensor design can be made more or less sensitive to bending by shifting it relative to the neutral axis. Ultimately, the approach is adaptable to the detection of proximity, touch, pressure, and even the conformation of the sensor surface.
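The touch-localization idea described above can be sketched in a few lines: scan the capacitance grid and flag cells whose value has dropped sharply from baseline. This is an illustrative sketch, not the authors' code; the 15% threshold follows the figure reported in the abstract, while the baseline values and units are assumptions.

```python
# Locating a light touch on a 4 x 4 mutual-capacitance grid from a
# localized drop in capacitance relative to an idle baseline.

def find_touches(baseline, measured, drop_threshold=0.15):
    """Return (row, col) cells whose capacitance fell by >= drop_threshold."""
    touches = []
    for r, (b_row, m_row) in enumerate(zip(baseline, measured)):
        for c, (b, m) in enumerate(zip(b_row, m_row)):
            if (b - m) / b >= drop_threshold:
                touches.append((r, c))
    return touches

baseline = [[10.0] * 4 for _ in range(4)]  # pF per cell, idle (assumed values)
measured = [row[:] for row in baseline]
measured[1][2] = 8.3                       # ~17% drop under a fingertip
print(find_touches(baseline, measured))    # -> [(1, 2)]
```

Tracking a moving finger or multiple fingers, as in the abstract, amounts to running the same scan on each frame and following the flagged cells over time.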
Approximately one-half of American adults exhibit low health literacy and thus struggle to find and use health information. Low health literacy is associated with negative outcomes including overall poorer health. Health information technology (HIT) makes health information available directly to patients through electronic tools including patient portals, wearable technology, and mobile apps. The direct availability of this information to patients, however, may be complicated by misunderstanding of HIT privacy and information sharing.
Retinal prostheses, which restore partial vision to patients blinded by outer retinal degeneration, are currently in clinical trial. The Argus II retinal prosthesis system was recently awarded CE approval for commercial use in Europe. While retinal prosthesis users have achieved remarkable visual improvement to the point of reading letters and short sentences, the reading process is still fairly cumbersome. This study investigates the possibility of using an epiretinal prosthesis to stimulate visual braille as a sensory substitution for reading written letters and words. The Argus II retinal prosthesis system, used in this study, includes a 10 × 6 electrode array implanted epiretinally, a tiny video camera mounted on a pair of glasses, and a wearable computer that processes the video and determines the stimulation current of each electrode in real time. In the braille reading system, individual letters are created by a subset of dots from a 3 × 2 array of six dots. For the visual braille experiment, a grid of six electrodes was chosen out of the 10 × 6 Argus II array. Groups of these electrodes were then directly stimulated (bypassing the camera) to create visual percepts of individual braille letters. Experiments were performed in a single subject. Single letters were stimulated in an alternative forced-choice (AFC) paradigm, and short 2-4-letter words were stimulated (one letter at a time) in an open-choice reading paradigm. The subject correctly identified 89% of single letters, 80% of 2-letter, 60% of 3-letter, and 70% of 4-letter words. This work suggests that text can successfully be stimulated and read as visual braille in retinal prosthesis patients.
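The letter-to-electrode mapping described above can be sketched as follows. The braille dot patterns are the standard alphabet (dots 1-3 down the left column, 4-6 down the right); the specific electrode coordinates assigned to each dot are invented placeholders, since the abstract does not say which six electrodes of the 10 × 6 array were used.

```python
# Standard braille dot sets for the letters a-j (decade 1 of the alphabet).
BRAILLE_DOTS = {
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
    "f": {1, 2, 4}, "g": {1, 2, 4, 5}, "h": {1, 2, 5}, "i": {2, 4},
    "j": {2, 4, 5},
}

# Hypothetical assignment of the six dots to (row, col) electrodes
# on the implanted array; the real choice is not given in the abstract.
DOT_TO_ELECTRODE = {1: (2, 1), 2: (3, 1), 3: (4, 1),
                    4: (2, 2), 5: (3, 2), 6: (4, 2)}

def electrodes_for_word(word):
    """Yield, one letter at a time, the electrode subset to stimulate."""
    for letter in word:
        yield sorted(DOT_TO_ELECTRODE[d] for d in BRAILLE_DOTS[letter])

for subset in electrodes_for_word("cab"):
    print(subset)
```

Presenting a word one letter at a time, as in the open-choice reading paradigm, then reduces to stimulating each yielded electrode subset in sequence.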
Using computer, mobile and wearable technology enhanced interventions to reduce sedentary behaviour: a systematic review and meta-analysis
- The international journal of behavioral nutrition and physical activity
- Published about 2 years ago
High levels of sedentary behaviour (SB) are associated with negative health consequences. Technology-enhanced solutions such as mobile applications, activity monitors, prompting software, texts, emails and websites are being harnessed to reduce SB. The aim of this paper is to evaluate the effectiveness of such technology-enhanced interventions aimed at reducing SB in healthy adults and to examine the behaviour change techniques (BCTs) used.
- Proceedings of the National Academy of Sciences of the United States of America
- Published over 2 years ago
From the desktop to the laptop to the mobile device, personal computing platforms evolve over time. Moving forward, wearable computing is widely expected to be integral to consumer electronics and beyond. The primary interface between a wearable computer and a user is often a near-eye display. However, current generation near-eye displays suffer from multiple limitations: they are unable to provide fully natural visual cues and comfortable viewing experiences for all users. At their core, many of the issues with near-eye displays are caused by limitations in conventional optics. Current displays cannot reproduce the changes in focus that accompany natural vision, and they cannot support users with uncorrected refractive errors. With two prototype near-eye displays, we show how these issues can be overcome using display modes that adapt to the user via computational optics. By using focus-tunable lenses, mechanically actuated displays, and mobile gaze-tracking technology, these displays can be tailored to correct common refractive errors and provide natural focus cues by dynamically updating the system based on where a user looks in a virtual scene. Indeed, the opportunities afforded by recent advances in computational optics open up the possibility of creating a computing platform in which some users may experience better quality vision in the virtual world than in the real one.
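The adaptive-focus idea above can be illustrated with back-of-the-envelope thin-lens arithmetic: set the focus-tunable lens demand from the gaze-estimated fixation distance, folding in the user's spherical refractive error. This simple model and its parameter names are assumptions for illustration, not the prototype's actual control code.

```python
# Thin-lens sketch: the optical demand to focus at a fixated depth of
# d metres is roughly 1/d dioptres; a user's spherical refractive error
# (negative for myopia) shifts the lens power needed to compensate.

def lens_power_diopters(fixation_distance_m, refractive_error_d=0.0):
    """Lens demand in dioptres for the fixated depth, corrected for the user."""
    vergence_demand = 1.0 / fixation_distance_m  # e.g. 0.5 m -> 2 D
    return vergence_demand + refractive_error_d

# A -2 D myope looking at a virtual object 0.5 m away needs no net
# additional lens power under this simple model:
print(lens_power_diopters(0.5, refractive_error_d=-2.0))  # -> 0.0
```

In the prototypes described, the gaze tracker supplies the fixation distance dynamically, so this demand is recomputed as the user looks around the virtual scene.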
Wearable technology is an exciting new field in humans and animals. In dogs, activity monitors have helped to provide objective measurement tools where pet owner observation had been the only source of information. Previous research has focused on measuring overall activity versus rest. This has been relatively useful in determining changes in activity in orthopedic disease or post-surgical cases [Malek et al., BMC Vet Res 8:185, 2012, Yashari et al., BMC Vet Res 11:146, 2015]. Assessment of pruritus via changes in activity, however, requires an assumption that increased activity is due to scratching or other pruritic behaviors. This is an inaccurate method with obvious flaws, as other behaviors may also register as greater activity. The objective of this study was to validate the ability of a multidimensional high-frequency sensor and advanced computer analysis system (Vetrax®, AgLogica Holdings, Inc., Norcross, GA, USA) to specifically identify pruritic behaviors (scratching and head shaking). To establish differences between behaviors, sensor and time-stamped video data were collected from 361 normal and pruritic dogs. Video annotations were made by two observers independently, while blinded to sensor data, and then evaluated for agreement. Annotations that agreed between the two were used for further analysis. The annotations specified behaviors at specific times in order to compare with sensor data. A computer algorithm was developed to interpret and differentiate between these behaviors. Test-subject data were then used to test and score the system's ability to accurately predict behaviors.
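The annotation-agreement step described above can be sketched simply: keep only the time-stamped behavior labels on which both blinded observers agree, so the classifier is evaluated against unambiguous ground truth. The label names and timestamps below are invented for illustration; the study's actual annotation schema is not given in the abstract.

```python
# Intersect two observers' {timestamp: behavior} annotations, keeping
# only entries where both assigned the same label at the same time.

def agreed_annotations(obs_a, obs_b):
    """Return the annotations on which both observers agree."""
    return {t: label for t, label in obs_a.items()
            if obs_b.get(t) == label}

obs_a = {0: "scratching", 5: "head_shaking", 10: "walking"}
obs_b = {0: "scratching", 5: "walking",      10: "walking"}
print(agreed_annotations(obs_a, obs_b))  # -> {0: 'scratching', 10: 'walking'}
```

The retained annotations then serve as the reference against which the sensor-based algorithm's predicted behaviors are scored.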
New technologies are emerging that may help individuals engage in healthier eating behaviors. One paradigm to test the efficacy of a technology is to determine its effect relative to environmental cues that are known to cause individuals to overeat.
Information and communication technology (ICT) has transformed the health care field worldwide. One of the main drivers of this change is the electronic health record (EHR). However, there are still open issues and challenges because the EHR usually reflects the partial view of a health care provider without the ability for patients to control or interact with their data. Furthermore, with the growth of mobile and ubiquitous computing, the number of records regarding personal health is increasing exponentially. This movement has been characterized as the Internet of Things (IoT), including the widespread development of wearable computing technology and assorted types of health-related sensors. This leads to the need for an integrated method of storing health-related data, defined as the personal health record (PHR), which could be used by health care providers and patients. This approach could combine EHRs with data gathered from sensors or other wearable computing devices. This unified view of patients' health could be shared with providers, who may not only use previous health-related records but also expand them with data resulting from their interactions. Another PHR advantage is that patients can interact with their health data, making decisions that may positively affect their health.
Physical activity is an important outcome in oncology trials. Physical activity is commonly assessed using self-reported questionnaires, which are limited by recall and response biases. Recent advancements in wearable technology have provided oncologists with new opportunities to obtain real-time, objective physical activity data. The purpose of this review was to describe current uses of wearable activity monitors in oncology trials.