Abstract
Fabric-embedded sensors are of growing interest to studies requiring the measurement and analysis of human movement outside the laboratory
environment. These small-scale, minimally invasive sensors can be
used for medical applications such as clinical diagnostics and long-term
rehabilitation studies, or other areas which require motion measurements,
such as sports analysis or human-computer interaction. However,
a major issue limiting their usage is the undesired effect of fabric
motion artefacts corrupting movement signals.
In this thesis, the role of motion artefacts in these types of sensors is
explored, with the aim of creating strategies to overcome these artefacts,
and allow for accurate, long-term human motion sensing systems.
To solve this problem, this thesis proposes treating motion artefacts
as stochastic perturbations to the sensed motion, and utilising statistical
learning approaches to develop artefact elimination and motion
classification strategies. Treating motion artefacts in this way provides
many benefits over analytical fabric modelling, as it removes the need
to estimate physical quantities of the fabric. This thesis investigates
the relationship between learning approaches and the unique problems
posed by fabric motion artefacts. Methods that explicitly account for
stochastic perturbations in the sensed signals are investigated, including
supervised errors-in-variables regression and unsupervised latent space
learning. In addition, the role of information contained within motion
artefacts is investigated in relation to distance-based classifiers.
In experiments, these methods are evaluated in a number of human
motion tasks, including pose estimation and gait analysis. It is shown
that these methods demonstrate improved prediction accuracy over
learning approaches that do not account for the unique problems of
fabric motion artefacts, and are of a suitable computational complexity
to allow them to be implemented in small-scale embedded fabric-sensing
systems.
| Date of Award | 1 Jul 2018 |
| --- | --- |
| Original language | English |
| Awarding Institution | |
| Supervisor | Matthew Howard (Supervisor) |