The consumer wearable market is mature: nearly everyone who is going to wear a computer on their wrist probably already has one. We’ve gone from the ‘smartwatch nod’—that furtive moment of recognition between early smartwatch adopters—to ‘meh’ in a few short years. Although wrist wearables such as smartwatches and fitness bands continue to improve functionality, driving upgrade cycles, the explosive growth we saw in years past is likely tapering off. According to the International Data Corporation (IDC), worldwide shipments of wearable devices are forecast to reach 504.1 million units in 2023 and 629.4 million units in 2027, a CAGR of 5%.
This isn’t to say the wearable market is anywhere near saturation. Consumer smartwatches and fitness bands may be the earliest, and probably most obvious, application of the enormous embedded compute power we take for granted these days, but other applications driven by our innate desire to measure and improve our lives are waiting in the wings. Smart rings, smart hearables, smart clothes, smart glasses, and medical-grade patient monitoring devices are already on the market and entering growth mode. These devices are designed to be unobtrusive—there will likely be no ‘smart-ring nod,’ and that is a good thing.
It is useful to think of these devices as very smart sensors. Indeed, the features a wearable device offers are usually a function of the types of sensors it relies on. Common types of sensors include inertial sensors to measure movement, sophisticated optical biosensors to measure pulse and blood oxygenation, volatile organic compound sensors to ‘smell’ chemicals, audio sensors to enable speech-driven user interfaces and to listen for health and safety problems, and more. The real magic—the thing that adds value to the device and joy to the user experience—is what can be achieved with all the sensor data collected from the devices and processed on them.
Fitness analytics
Fitness analytics is a must-have feature for almost every wearable. It is a well-understood application that currently relies primarily on inertial sensors (or “IMUs,” which have accelerometers, gyroscopes, and even magnetometers) to figure out what the user is up to. Using AI, a device can interpret this data to determine if the wearer is walking, running, going upstairs, and many other activities (some wearables can reliably detect over 50 distinct movements). The challenge is that wearers have all kinds of different body shapes and movement patterns; this high variability makes fitness activity detection a deceptively difficult task, and this is where AI comes in. Artificial intelligence can be thought of as ‘the easiest way to do hard things’ and—in this case—it clearly is. Instead of creating thousands of rules for calculating something as simple as a universally accurate step count, we train an AI model on the step patterns of hundreds or thousands of walkers, and we count on the AI training process to find all those rules for us.
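To make the “let training find the rules” idea concrete, here is a minimal Python/NumPy sketch. Everything in it is an illustrative assumption—synthetic accelerometer windows and a tiny hand-rolled logistic regression—but it shows the shape of the approach: the training loop discovers the decision rule separating ‘walking’ from ‘at rest’, instead of us writing thresholds by hand.

```python
import numpy as np

rng = np.random.default_rng(0)

def synth_window(active, n=128, fs=50):
    """Synthetic one-axis accelerometer window (hypothetical data)."""
    t = np.arange(n) / fs
    if active:  # walking: ~2 Hz step oscillation plus sensor noise
        return 1.0 + 0.5 * np.sin(2 * np.pi * 2.0 * t) + rng.normal(0, 0.1, n)
    return 1.0 + rng.normal(0, 0.1, n)  # at rest: gravity plus noise

def features(w):
    # Two simple hand-crafted features; a deep model would learn its own.
    return np.array([w.std(), np.abs(np.diff(w)).mean()])

X = np.array([features(synth_window(i % 2 == 0)) for i in range(200)])
y = np.array([1.0 if i % 2 == 0 else 0.0 for i in range(200)])

# Tiny logistic-regression "model" trained by gradient descent: the
# training loop finds the separating rule from examples.
w, b = np.zeros(2), 0.0
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    g = p - y
    w -= 0.1 * (X.T @ g) / len(y)
    b -= 0.1 * g.mean()

probe = np.array([features(synth_window(True)), features(synth_window(False))])
pred = 1.0 / (1.0 + np.exp(-(probe @ w + b))) > 0.5
print(pred)  # → [ True False]
```

A production model would of course be a neural network trained on real recordings from many wearers, but the division of labor is the same: data in, rules out.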
Fitness devices can be as simple as a fitness band, or as complicated as a body-wide sensor network. The most sophisticated fitness measurement systems rely on inertial sensors on the wrist, feet, and back to obtain precise measurements. Going even further, manufacturers are increasingly instrumenting the fitness garments themselves to estimate body position and dynamics. AI is essential to fusing all these sensors into a coherent picture of what a human body is doing.
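One of the simplest examples of sensor fusion is the classic complementary filter, which blends a gyroscope’s fast-but-drifting angle estimate with an accelerometer’s noisy-but-stable tilt reading. The sketch below is Python/NumPy; the sample interval and blend factor are illustrative assumptions, not values tuned for real hardware.

```python
import numpy as np

def complementary_filter(gyro_rate, accel_angle, dt=0.02, alpha=0.98):
    """Fuse gyro rate (deg/s) and accel-derived tilt (deg) into one angle.
    alpha weights the integrated gyro; (1 - alpha) trusts the accel reading.
    """
    angle = accel_angle[0]
    out = []
    for g, a in zip(gyro_rate, accel_angle):
        angle = alpha * (angle + g * dt) + (1 - alpha) * a
        out.append(angle)
    return np.array(out)

# Device held still at a 10-degree tilt: gyro reads zero rate,
# accelerometer reads a steady 10-degree angle.
est = complementary_filter(np.zeros(100), np.full(100, 10.0))
print(est[-1])  # → 10.0
```

Fusing a full body-wide sensor network is far harder than this two-sensor case, which is why learned models have largely taken over, but the principle—combining sensors so each covers the others’ weaknesses—is the same.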
Health analytics
Digital health devices, whether consumer or medical grade, rely on a set of sensors to determine wearers’ health metrics. Optical sensors can see our pulse through our skin (PPG) or measure blood oxygenation (SpO2). With the data collected, AI can extract the usual metrics, such as heart rate and heart rate variability. Surprisingly, there are more hidden ‘signals’ in this optical data that sophisticated AI models can use to estimate blood pressure, breathing rates, and even detect some forms of arrhythmia.
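As a deliberately naive illustration of the first step—heart rate from PPG—the Python sketch below counts waveform peaks in a clean synthetic trace. Real devices add filtering, motion-artifact rejection, and learned models on top of this idea; the sampling rate and signal here are assumptions for the example.

```python
import numpy as np

def heart_rate_bpm(ppg, fs=50):
    """Estimate heart rate by counting peaks in a PPG trace (sketch only)."""
    x = ppg - ppg.mean()
    # Interior local maxima above the mean count as beats.
    peaks = np.flatnonzero(
        (x[1:-1] > x[:-2]) & (x[1:-1] > x[2:]) & (x[1:-1] > 0)
    ) + 1
    duration_s = len(ppg) / fs
    return 60.0 * len(peaks) / duration_s

# Synthetic 10-second PPG at 1.2 Hz, i.e. 72 beats per minute.
fs = 50
t = np.arange(10 * fs) / fs
ppg = np.sin(2 * np.pi * 1.2 * t)
print(round(heart_rate_bpm(ppg, fs)))  # → 72
```

Heart rate variability falls out of the same peak train—instead of counting peaks, you analyze the spacing between them.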
With the addition of electrocardiogram (ECG) sensors, AI can do a lot more, including the identification of heartbeat patterns and more accurate abnormal rhythm detection. It turns out that AI can detect many other health conditions based on additional sensor types. For example, recent AI models have shown that they can detect abnormal walking patterns (such as Parkinson’s Gait), leg or back injuries, and falls. Other models can listen to a user’s voice (without understanding the words) and detect indications of dementia.
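A classical precursor to those learned rhythm detectors is simply measuring how irregular the beat-to-beat (RR) intervals are. The Python sketch below flags high RR variability; the coefficient-of-variation threshold and interval values are illustrative assumptions, not clinical criteria.

```python
import numpy as np

def irregular_rhythm(rr_intervals_ms, cv_threshold=0.15):
    """Flag possible rhythm irregularity from beat-to-beat (RR) intervals.
    The threshold is illustrative; real detectors use trained models,
    not a single summary statistic."""
    rr = np.asarray(rr_intervals_ms, dtype=float)
    return rr.std() / rr.mean() > cv_threshold

regular = [800, 810, 795, 805, 800]     # steady ~75 bpm
erratic = [600, 950, 700, 1100, 650]    # highly variable intervals
print(irregular_rhythm(regular), irregular_rhythm(erratic))  # → False True
```

The appeal of a learned model over a statistic like this is exactly the variability problem from the fitness case: healthy rhythms vary too, and a trained model can learn which kinds of variation matter.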
Where things get really interesting is when devices start fusing the data from all these sensors together and pairing the consolidated data with sophisticated AI models. For example, by fusing audio, inertial, and optical sensor data, a model can not only distinguish sleeping stages such as REM and deep sleep but also detect sleep apnea and snoring. The limiting factor is the availability of data—while the healthcare system generates copious amounts of data, much of it is unlabeled and, of course, highly sensitive. Even so, the research community regularly releases public and anonymized healthcare datasets to help with machine learning.
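A toy illustration of this kind of fusion, with made-up features and hand-set thresholds (a shipping device would use a trained model, and these stage boundaries are not physiologically calibrated):

```python
def sleep_stage(motion_rms, hr_bpm, snore_db):
    """Toy rule-based fusion of inertial, optical, and audio features.
    All thresholds are illustrative assumptions."""
    if motion_rms > 0.2:   # inertial: too much movement to be asleep
        return "awake"
    if snore_db > 40:      # audio: loud periodic breathing noise
        return "snoring"
    # optical: low resting heart rate suggests deeper sleep
    return "deep" if hr_bpm < 55 else "rem"

print(sleep_stage(0.05, 50, 20))  # still, quiet, low heart rate → deep
```

Even this toy shows why fusion helps: no single sensor can distinguish these states alone, but the combination narrows the answer quickly.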
A nurse on every wrist
Traditionally, the types of AI described above required too much computation to run on a battery-powered wearable. Although modern embedded CPUs are capable of sophisticated computation, the power drain was too high to be practical, forcing device architects to choose between short battery life, less sophisticated models, or relying on the cloud. Each of these compromises impacted the end user’s experience and reduced the product’s real value. For example, a device might use a less computationally intense approach, such as classical machine learning instead of deep learning, trading accuracy for battery life.
With the advent of extremely power-efficient embedded processors such as Ambiq’s Apollo4 Plus, it has become possible to increase the sophistication of the analytics running on the device. Wearable devices can finally use deep learning to detect heart conditions, fitness activities, and health conditions while maintaining practical battery life, which is the very definition of endpoint AI. Endpoint AI also preserves privacy by keeping data off the cloud, and improves the user experience by responding nearly instantaneously, with no external devices required.