December 27, 2019
by Tina Arnoldi
A recent study in JMIR found that data from Fitbit and smartphone devices could help identify college students experiencing loneliness. The results suggest “fine-grained behavioral features” from mobile and wearable devices can distinguish between high and low levels of loneliness. Students with high levels of loneliness spend less time off campus and socialize less than those with low levels of loneliness.
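To make the idea concrete: classifying high versus low loneliness from behavioral features is, at its core, a standard supervised-learning problem. The sketch below is purely illustrative — the feature names, toy data, and choice of logistic regression are my own assumptions, not the study's actual methodology:

```python
# Hypothetical sketch of classifying loneliness from behavioral features.
# Feature names and toy data are invented for illustration; this is not
# the JMIR study's actual pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row: [hours_off_campus_per_day, social_interactions_per_day]
X = np.array([
    [5.0, 8], [4.5, 6], [6.0, 9], [5.5, 7],   # low loneliness
    [1.0, 2], [0.5, 1], [1.5, 3], [2.0, 2],   # high loneliness
])
y = np.array([0, 0, 0, 0, 1, 1, 1, 1])  # 1 = high loneliness

model = LogisticRegression().fit(X, y)

# A student who rarely leaves campus and socializes little should fall
# on the high-loneliness side of this toy model's decision boundary.
print(model.predict([[0.8, 1]]))
```

In practice, of course, the real features would be far more fine-grained (GPS traces, screen-time logs, sleep data), and the labels would come from validated loneliness questionnaires rather than simple categories.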
Technology around mental health is clearly advancing, with Amazon working to read human emotions and Google acquiring Fitbit, but there are legitimate concerns about its use. And with over 100 million devices sold, Fitbit holds a significant amount of wellness data.
I invited experts to share their opinions on how accurately data obtained this way can predict loneliness or mental health, and on concerns about how the data might be used.
Erica Wiles, LPC, notes that loneliness or isolation can lead to symptoms of depression, which do manifest physically. While loneliness is a subjective concept and open to some level of personal interpretation, she believes that “depression, a resulting by-product of loneliness is clearly defined in the DSM 5 and has physically measurable markers that a Fitbit could detect.” Ray Walsh, a digital privacy expert, agrees: “It is probable that devices can tell with a great deal of certainty whether an individual is active, sedentary, or even lonely.”
While the technology exists to capture data on activity level, sleep patterns, heart rate, and weight changes, putting it to use requires motivation from users. Wiles notes, “a person would have to actively sync the Fitbit to record data on a regular basis. There would also have to be a commitment made to wear the device at night. A person struggling with loneliness and/or depression may not follow with this, making it hard for mental health professionals to accurately track data.”
Walsh’s biggest concern with this passive data collection is privacy. He noted, “the potential for this kind of data dissemination is at the heart of the need for strong privacy laws that protect consumers. Consumers need to understand how data can be used against them, and legislators must place limitations on businesses to stop them from profiting from data that is of a sensitive nature.”
People purchase a Fitbit primarily to track their activity level. The question is whether the device would start offering data about loneliness without the user’s consent. “For a device to be used to ascertain whether somebody is lonely, permission for that device to do so needs to be a legal requirement,” says Walsh. “Individuals must agree to receive help and having their data processed; it should not be forced on them. And singling out an individual for help (whether that be interference by the state or the medical profession) without first receiving consent from that individual to process their data for these secondary purposes is a breach of their privacy and human rights.”
The motives of vendors who may use this data are uncertain. Will they use it for the benefit of the users or for their own purposes? Walsh believes it is “extremely likely that device-level surveillance of this nature will be used to allow firms to tailor adverts rather than provide help.”
He adds, “the potential for abuse is massive, and restrictions must be put in place to ensure that this kind of ‘lonely data’ is not used to either discriminate against or profit from individuals.”
Even with good objective data and privacy protections in place, diagnosing mental health conditions is complex. Although tools may help, Wiles cautions that “there appears to be room for error when not in a controlled study. Insurance agencies and mental health professionals would need to keep this in mind when utilizing this tool.”