Device sensors provide real-time insight into what people are doing (walking, running, driving...).
Knowing a user's activity makes it possible, for instance, to interact with them through an app.
Machine learning can be used to detect activities by reading and processing sensor data automatically.
Human activity recognition (HAR) aims to classify a person's actions from a series of measurements captured by sensors.
Nowadays, collecting this type of data is not hard. With the growth of the Internet of Things, almost everyone carries some gadget that monitors their movements: a smartwatch, a heart-rate monitor, or even a smartphone.
Feature extraction is usually performed with a fixed-length sliding-window approach, where two parameters have to be chosen: the size of the window and the shift (the step between consecutive windows).
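As a minimal sketch of this idea (the function name and parameters are illustrative, not from the original), a sliding window can be cut out of a raw sensor trace like this:

```python
import numpy as np

def sliding_windows(signal, size, shift):
    """Split a 1-D sensor signal into fixed-length, possibly overlapping windows.

    `size` and `shift` are the two parameters the sliding-window
    approach requires: the window length and the step between windows.
    """
    starts = range(0, len(signal) - size + 1, shift)
    return np.array([signal[s:s + size] for s in starts])

# Toy accelerometer trace of 10 samples.
acc = np.arange(10.0)
windows = sliding_windows(acc, size=4, shift=2)
print(windows.shape)  # (4, 4): windows start at samples 0, 2, 4, 6
```

Each row of `windows` would then be summarised into features (mean, variance, spectral energy, etc.) before being fed to a classifier; with `shift < size` consecutive windows overlap, which is a common choice in HAR pipelines.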
Several kinds of sensor readings can be used, such as the accelerometer, the gyroscope, or GPS.
The machine learning model used for activity recognition is built on top of the devices' available sensors.
However, analysing these data can be a big challenge because of the complexity of human activities and the differences between individuals.
Activity recognition is the basis for many potential applications in health, wellness, and sports.
Neural networks are well suited to determining a person's physical activity because they can generalize from the data an individual provides while also learning that individual's particularities.
The following graph illustrates a neural network that classifies among different activities using smartphone data.
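To make the classification step concrete, here is a small sketch of such a network as a plain NumPy forward pass. The activity labels, layer sizes, and random weights are all illustrative assumptions standing in for a trained model, not the network from the graph:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup: each input is a vector of features extracted from
# one window of smartphone sensor data; the labels are assumptions.
ACTIVITIES = ["walking", "running", "driving"]
n_features, n_hidden = 8, 16

# Randomly initialised weights stand in for a trained model.
W1 = rng.normal(size=(n_features, n_hidden))
b1 = np.zeros(n_hidden)
W2 = rng.normal(size=(n_hidden, len(ACTIVITIES)))
b2 = np.zeros(len(ACTIVITIES))

def predict(x):
    """Forward pass: one ReLU hidden layer, then a softmax over activities."""
    h = np.maximum(0.0, x @ W1 + b1)       # hidden layer with ReLU
    logits = h @ W2 + b2
    p = np.exp(logits - logits.max())      # numerically stable softmax
    return p / p.sum()

probs = predict(rng.normal(size=n_features))
print(ACTIVITIES[int(np.argmax(probs))])   # most likely activity
```

In practice the weights would be learned from labelled sensor windows (e.g. with a deep-learning framework), but the structure is the same: feature vector in, probability per activity out.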
Human activity recognition has a wide range of uses because of its impact on wellbeing.
Nowadays, it is becoming a fundamental tool in preventing obesity and in the care of older adults.