
Digital health is becoming essential infrastructure for modern care. As health care systems confront clinician shortages, rising chronic illness and growing demand for personalized treatment, digital technologies are offering new solutions — from earlier diagnosis and remote monitoring to data-driven clinical decision-making.
Experts at Rice University are at the forefront of this shift. Under its new strategic plan, Rice has made leading innovations in health a central priority. Rice researchers, working directly across the street from the Texas Medical Center, the world’s largest medical complex, are deepening partnerships across institutions to accelerate the translation of engineering and computing breakthroughs into real-world health impact.
Rice has a critical mass of experts working in this cross-disciplinary space, coalesced around a long-standing university-wide collaboration network, the Digital Health Initiative. Spearheaded by Ashutosh Sabharwal, the initiative has inspired a range of efforts to advance the field, including seed grant programs and collaborative events. These efforts culminated in the establishment of the Digital Health Institute, a collaboration with Houston Methodist that focuses on advancing next-generation technologies for early detection, remote care and personalized health delivery.
Below is a list of experts available to comment on recent advances, research directions and the evolving role of digital tools — such as artificial intelligence (AI), wearable and ingestible devices, imaging and robotics — in health care delivery.
Raudel Avila develops wearable and implantable devices for wireless health applications, including bioresorbable pacemakers, neonatal sensors and miniaturized pressure sensors for prosthetics. He can discuss modeling, materials and wireless design challenges and opportunities in clinical bioelectronics.
Vivek Boominathan’s research lies at the intersection of computer vision, machine learning, applied optics and nanofabrication with applications in robotics, wearables, medical sensing, autonomous systems and more. Boominathan uses machine learning and imaging innovation to improve performance while reducing size, weight, power and cost in microscopy and medical imaging more broadly.
Bishal Lamichhane integrates wearables and mobile sensing with AI to model and monitor the interplay between biological, psychological and behavioral factors for advanced personalized health applications, including mental health. His experience includes developing machine learning and AI solutions for patient monitoring and consumer health applications in industry settings.
Lei Li pioneers photoacoustic imaging — an emerging biomedical technology that combines light, sound and AI to visualize structures and functions deep within the human body. Li can speak to how noninvasive AI-powered photoacoustic imaging is enabling whole-body scans, cancer diagnosis, brain function mapping, wearable health monitoring and smart microrobot navigation for surgery and drug delivery.
Marcia O’Malley is an expert on haptics and robotic rehabilitation, including exoskeletons, neuroprosthetics and wearable haptic devices. O’Malley can address how human-robot physical interaction and adaptive control systems can accelerate recovery, enhance motor learning and safely integrate robotics into virtual reality or remote surgery.
Daniel Preston’s work bridges soft robotics, fluid mechanics and energy systems. His lab develops novel wearables and soft robotic materials with embedded intelligence, power and responsiveness, enabling applications in rehabilitation, prosthetics and human-machine interfaces. Preston can discuss advancements in assistive devices and the engineering of adaptive systems for wearable health tech.
Akane Sano develops tools and systems that use wearable, mobile and social sensing and clinical data to monitor and predict mental and physical health, sleep and performance. Her work combines signal processing and machine learning to create personalized, real-world interventions that support health and well-being. Sano can speak to how multimodal data and AI models can be used for early detection and prevention in areas like depression, burnout, sleep disruption, cardiovascular disease and epilepsy.
Vaibhav Unhelkar develops AI and robotic systems to enhance human capabilities, including in health care settings. His recent work uses digital twins to model surgical teams, enabling real-time coaching and AI-assisted clinical training. He can speak to how AI-powered simulations are transforming medical collaboration, training and decision-making.
Momona Yamagami investigates and engineers human-AI digital twin models to support at-home rehabilitation and technology accessibility for individuals with motor disabilities. She can speak to how human-centered engineering and wearable sensors can improve function and quality of life for people with motor impairments.
To schedule an interview with Rice’s experts, contact media relations specialists Alex Becker at alex.becker@rice.edu or Silvia Cernea Clark at silviacc@rice.edu.