Device sensors provide real-time insight into what people are doing (walking, running, driving, etc.).
Knowing a user's current activity allows an app, for instance, to interact with them accordingly.
In this regard, you can apply machine learning to detect activities by reading and processing sensor data.
Human activity recognition (HAR) aims to classify a person's actions from a series of measurements captured by sensors.
Nowadays, collecting this type of data is not an arduous task. With the growth of the Internet of Things, almost everyone has some gadget that monitors their movements. It can be a smartwatch, a heart rate monitor, or even a smartphone.
Feature extraction from these sensor signals is usually performed with a fixed-length sliding window approach, which requires fixing two parameters: the size of the window and the shift between consecutive windows.
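As a sketch of how windowing works, the snippet below segments a one-dimensional sensor signal into fixed-length windows. The concrete numbers (50 Hz sampling, 128-sample windows, 50% overlap) are illustrative assumptions, not values prescribed by any particular dataset:

```python
import numpy as np

def sliding_windows(signal, window_size, shift):
    """Segment a 1-D signal into fixed-length, possibly overlapping windows."""
    windows = []
    for start in range(0, len(signal) - window_size + 1, shift):
        windows.append(signal[start:start + window_size])
    return np.array(windows)

# Illustrative example: 10 s of accelerometer samples at 50 Hz (500 samples),
# segmented into 128-sample windows with 50% overlap (shift of 64 samples).
signal = np.random.randn(500)
windows = sliding_windows(signal, window_size=128, shift=64)
print(windows.shape)  # (6, 128)
```

A larger window captures more context per example but delays recognition; a smaller shift produces more (overlapping) training examples at the cost of correlated windows.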
These are some of the data sources you could use.
The machine learning model used for activity recognition depends on the sensors available on the device.
However, analyzing this data can be a big challenge due to the complexity of human activities and the differences between individuals.
Activity recognition is the basis for many potential applications in health, wellness, and sports.
Neural networks are well suited to determining a person's physical activity thanks to their ability to recognize the patterns behind the data.
The following graph illustrates a neural network that classifies among different activities using smartphone data.
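To make the idea concrete, here is a minimal sketch of such a classifier: a single hidden layer followed by a softmax over activity classes. Everything here is assumed for illustration: the activity labels, the input size (a flattened window of tri-axial accelerometer samples), and the random weights, which in a real system would come from training on labeled sensor data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: a window of 128 tri-axial accelerometer samples flattened
# to 384 features, scored against 4 hypothetical activity classes.
ACTIVITIES = ["walking", "running", "sitting", "driving"]
n_features, n_hidden, n_classes = 384, 32, len(ACTIVITIES)

# Random weights stand in for a trained model.
W1 = rng.normal(0, 0.1, (n_features, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_classes))
b2 = np.zeros(n_classes)

def predict(window):
    """One forward pass: ReLU hidden layer, then softmax over activities."""
    h = np.maximum(0, window @ W1 + b1)      # hidden layer with ReLU
    logits = h @ W2 + b2
    probs = np.exp(logits - logits.max())    # numerically stable softmax
    probs /= probs.sum()
    return ACTIVITIES[int(np.argmax(probs))], probs

window = rng.normal(size=n_features)  # stand-in for a real sensor window
label, probs = predict(window)
```

In practice you would train this network (for example with a deep learning framework) so that the output probabilities reflect the true activity, but the forward pass has exactly this shape: sensor window in, probability per activity out.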
Human activity recognition has a wide range of uses because of its impact on wellbeing.
It is becoming a fundamental tool in healthcare solutions such as preventing obesity or caring for the elderly.