Physiological responses to negative emotions, the activities we regularly engage in, mobile phone use, and other passive signals reveal a great deal about a person. With advances in technology, including wearable devices and smartphones, medical professionals can collect passive signals that may indicate a particular state of mind.
Wearable devices that capture passive data can identify risk factors for poor mental and physical health before individuals recognize those risks themselves. Fitbit users, for example, are already alerted when their activity level is low. This type of information can be used by individuals on their own or shared with a provider to improve overall health.
But is generating even more data that mental health professionals can use in treatment a positive development? Or is it a potential invasion of privacy if passive signals are available to providers? We already know that Facebook uses user-generated content to predict suicide risk; is user-generated activity and sensor data the next evolution?
MiCoachee, a wellness app, uses physiological data from almost any wearable device or app, such as Google Fit. Tom Mann, who works with MiCoachee, said that the data, along with client feedback, is used "to create a digital coaching and behavioral modification content such as hypnotherapy, CBT, NLP or mindfulness that is specific to each person and able to help them overcome health and weight loss struggles by tackling the thoughts, triggers, and obstacles sabotaging their success. They love it and a study with the YMCA proves it works."
Recently, MiCoachee conducted research with its data science team at UMASS and published a dashboard in its data analytics retention-and-profitability series. Mann said, "The report is called MAD or Monthly Attendance Determiner. Using client data, we are confident we can provide a list of specific members that have an 85% chance of not going to the gym in the next 30 days, a great tool for management to know who to focus staff resources on to keep engagement up. We know a great deal about MiCoachee customers and can predict gym attendance with over 99% accuracy, but that is a relatively small part of the entire membership base. Using our infrastructure that scales into the millions, we have the horsepower to provide valuable management insight on all gym members regardless of MiCoachee participation."
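The kind of attendance scoring Mann describes can be imagined as a model that converts recent behavior into a probability of showing up, then flags members below a cutoff. The sketch below is purely illustrative — the weights, feature names, and threshold are invented for demonstration and are not MiCoachee's actual model:

```python
import math

def attendance_probability(visits_last_30d, days_since_last_visit):
    """Toy logistic model: estimated probability a member visits the gym
    in the next 30 days. Weights are illustrative, not MiCoachee's."""
    score = 0.35 * visits_last_30d - 0.10 * days_since_last_visit
    return 1 / (1 + math.exp(-score))

def flag_at_risk(members, threshold=0.15):
    """Return IDs of members whose predicted attendance probability falls
    below the threshold (i.e., roughly an 85%+ chance of not attending)."""
    return [
        m["id"]
        for m in members
        if attendance_probability(m["visits_last_30d"],
                                  m["days_since_last_visit"]) < threshold
    ]

members = [
    {"id": "A", "visits_last_30d": 12, "days_since_last_visit": 1},
    {"id": "B", "visits_last_30d": 0, "days_since_last_visit": 25},
]
print(flag_at_risk(members))  # member B is the lapsed one
```

A production system would fit such weights from historical member data rather than hard-code them, but the output — a ranked list of at-risk members for staff to contact — matches the use case described above.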
If clients opt in and want reminders about their gym membership, this could be a tool to encourage and motivate them to take care of their physical health, which in turn affects mental health.
Kai Fei, a founder of WeAchieve, a data-driven goal planning and achievement app dedicated to helping individuals leverage their own data to improve their lives, believes data collection can be a good thing.
Fei sees value in individuals' use of the data rather than its transmission to providers. "Our perspective on these passive datasets coming from wearables and smartphones is that there is a big opportunity for individuals to use this data themselves. It doesn't necessarily have to be sent to medical professionals – and I can imagine that many people probably don't actually want to share this information with others, including their doctors. Instead, just having the transparency of this data (it's easy to lie to myself and say that I don't spend much time on my phone – but it's harder to say that when my screen time data shows an average screen time of 4 hours a day…) can be extremely beneficial for individuals."
Although healthy individuals may be motivated to use data to improve their wellbeing, those who suffer from significant mental health symptoms may lack the insight to see their need for help. Passive data signals, whether from sensors or mobile phone use, could serve as a self-report tool, although privacy is a concern given the amount of data available today. One worry is whether the day will come when sharing this information is a requirement for getting health insurance or a new job. People already self-sabotage when they choose to share information they should not; what about the information they choose not to share?