Search Results (2)

Search Parameters:
Keywords = touch-reading compliance

14 pages, 4741 KB  
Article
3D Printed Tablets (Printlets) with Braille and Moon Patterns for Visually Impaired Patients
by Atheer Awad, Aliya Yao, Sarah J. Trenfield, Alvaro Goyanes, Simon Gaisford and Abdul W. Basit
Pharmaceutics 2020, 12(2), 172; https://doi.org/10.3390/pharmaceutics12020172 - 19 Feb 2020
Cited by 139 | Viewed by 13519
Abstract
Visual impairment and blindness affect 285 million people worldwide, resulting in a high public health burden. This study reports, for the first time, the use of three-dimensional (3D) printing to create orally disintegrating printlets (ODPs) suited for patients with visual impairment. Printlets were designed with Braille and Moon patterns on their surface, enabling patients to identify medications once taken out of their original packaging. Printlets with different shapes were fabricated to convey additional information, such as the medication's indication or its dosing regimen. Despite the presence of the patterns, the printlets retained their original mechanical properties and dissolution characteristics, and all the printlets disintegrated within ~5 s, avoiding the need for water and facilitating self-administration of medications. Moreover, the readability of the printlets was verified by a blind person. Overall, this novel and practical approach should reduce medication errors and improve medication adherence in patients with visual impairment.

25 pages, 1242 KB  
Article
Design and Implementation of a Smart LED Lighting System Using a Self Adaptive Weighted Data Fusion Algorithm
by Wen-Tsai Sung and Jia-Syun Lin
Sensors 2013, 13(12), 16915-16939; https://doi.org/10.3390/s131216915 - 6 Dec 2013
Cited by 42 | Viewed by 13473
Abstract
This work develops a smart LED lighting system that is remotely controlled by Android apps on handheld devices such as smartphones and tablets. The status of energy use is reflected by readings displayed on the handheld device and is treated as a criterion in the system's lighting mode design. A multimeter, a wireless light dimmer, and an IR learning remote module are connected to a server by means of RS-232/485 and a human-computer interface on a touch screen. Wireless data communication complies with the ZigBee standard, and the sensed data are processed with a self-adaptive weighted data fusion algorithm; low variation and high stability of the fused data are demonstrated experimentally. The wireless light dimmer and the IR learning remote module can be instructed directly through commands on the human-computer interface, and multimeter readings can be displayed there via the server. The proposed smart LED lighting system can be remotely controlled, and its self-learning mode enabled, from a single handheld device over Wi-Fi. The proposal is thus validated as an approach to power monitoring for home appliances and demonstrated as a digital home network designed with energy efficiency in mind.
(This article belongs to the Section Sensor Networks)
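
The abstract does not reproduce the fusion step itself, but self-adaptive weighted fusion of redundant sensor readings is commonly realized as inverse-variance weighting: each sensor's weight is recomputed from its recent sample variance, so noisier sensors contribute less to the fused estimate. A minimal Python sketch under that assumption (the function name and sample values are illustrative, not taken from the paper):

```python
import numpy as np

def adaptive_weighted_fusion(readings):
    """Fuse redundant sensor readings, weighting each sensor by the
    inverse of its estimated variance (noisier sensors count less).
    This is an assumed, generic scheme, not the paper's exact code."""
    readings = np.asarray(readings, dtype=float)       # shape (n_sensors, n_samples)
    variances = readings.var(axis=1, ddof=1) + 1e-12   # guard against zero variance
    inv = 1.0 / variances
    weights = inv / inv.sum()                          # normalize weights to sum to 1
    fused = np.dot(weights, readings.mean(axis=1))     # weighted mean of per-sensor means
    return fused, weights

# Example: three light sensors sampling the same illuminance (lux, made-up values)
sensors = [
    [498.2, 501.1, 499.7, 500.4],   # stable sensor -> large weight
    [510.3, 492.8, 505.6, 488.9],   # noisy sensor  -> small weight
    [499.0, 500.8, 500.1, 499.5],
]
fused, weights = adaptive_weighted_fusion(sensors)
print(f"fused: {fused:.2f} lux  weights: {np.round(weights, 3)}")
```

When sensor errors are independent, inverse-variance weights minimize the variance of the fused estimate, which is consistent with the low variation and high stability the abstract reports.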