Device sensors provide real-time insight into what people are doing (walking, running, driving, etc.).
Knowing a user's current activity makes it possible, for instance, to interact with them through an app.
To detect those activities, you can apply machine learning to the raw sensor readings.
Human activity recognition (HAR) aims to classify a person's actions from a series of measurements captured by sensors.
Nowadays, collecting this type of data is not an arduous task. With the growth of the Internet of Things, almost everyone carries some gadget that monitors their movements: a smartwatch, a heart-rate monitor, or simply a smartphone.
Feature extraction from these sensor streams is usually performed with a fixed-length sliding-window approach. Two parameters need to be fixed here: the window size and the shift (how far the window moves between consecutive segments).
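As a minimal sketch of this idea, the function below splits a 1-D sensor signal into fixed-length, overlapping windows. The function name and the example values (window size 4, shift 2) are illustrative choices, not part of any specific library:

```python
import numpy as np

def sliding_windows(signal, window_size, shift):
    """Split a 1-D signal into fixed-length windows, moving the
    start by `shift` samples each time. Trailing samples that do
    not fill a complete window are dropped."""
    windows = []
    for start in range(0, len(signal) - window_size + 1, shift):
        windows.append(signal[start:start + window_size])
    return np.array(windows)

# Example: 10 samples, window of 4, shift of 2 -> 50% overlap
# between consecutive windows, yielding 4 windows.
signal = np.arange(10)
print(sliding_windows(signal, window_size=4, shift=2))
```

A shift smaller than the window size produces overlapping windows, which gives the classifier more training examples at the cost of correlated segments; a shift equal to the window size gives disjoint windows.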
These are some of the data sources you could use:
The machine learning model used for activity recognition depends on which sensors the device makes available.
However, analyzing this data can be a big challenge: human activities are complex, and the same activity can look quite different from one individual to another.
Activity recognition is the basis for many potential applications in health, wellness, and sports:
Neural networks are well suited to determining a person's physical activity, thanks to their ability to learn the patterns hidden in the data.
The following graph illustrates a neural network that classifies different activities using smartphone data.
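To make the structure of such a network concrete, here is a minimal sketch of a forward pass through a one-hidden-layer classifier. The shapes are assumptions for illustration: windows of 128 accelerometer samples across 3 axes (384 features) and 6 activity classes; the weights are random here, whereas in practice they would be learned from labeled sensor data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: 128 samples x 3 axes per window, 6 activities
# (e.g. walking, running, sitting, ...). Purely illustrative.
n_features, n_hidden, n_classes = 128 * 3, 64, 6

W1 = rng.normal(0, 0.1, (n_features, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.1, (n_hidden, n_classes))
b2 = np.zeros(n_classes)

def predict(x):
    """Forward pass: ReLU hidden layer, then softmax output."""
    h = np.maximum(0, x @ W1 + b1)       # hidden activations
    logits = h @ W2 + b2
    exp = np.exp(logits - logits.max())  # numerically stable softmax
    return exp / exp.sum()               # probability per activity

window = rng.normal(size=n_features)     # one fake sensor window
probs = predict(window)
print(probs.argmax())                    # index of most likely activity
```

A real system would train this network (or a convolutional/recurrent variant) on a labeled dataset and feed it the features extracted from each sliding window.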
Human activity recognition has a wide range of uses because of its impact on well-being.
It is becoming a fundamental tool in healthcare solutions such as obesity prevention and care for elderly people.