In today’s digital world, people turn daily routines into data sources for action and change through technologies like sleep trackers and exercise counters, made popular by the Fitbit and the Apple Watch. Here, Melanie Brener reflects on the benefits and costs of these self-tracking habits and technologies.
Students get meta reflecting on tech in London’s Virtual Classroom
The contemporary information environment means that we learn, work, shop, play, and fall in love digitally. The consequence of our online participation in social and political life through likes, clicks, shares, comments, and views is a deep wealth of behavioural data — data that marketers, tech giants, businesses, the entertainment industry, the medical industry, politicians, and the government can use to target policies and predict trends. London’s Data in Society course helps students think critically and ethically about the role of data in everyday life.
Student essays examine how individuals, groups, and society create — and are created by — digital data and algorithms. The shift to remote work, distance learning, and social distancing in the midst of COVID-19 only amplified the social, political, legal, and professional issues considered in this class, giving Spring 2020 students an active laboratory for reflection.
Self-Tracking’s Relationship with Social Norms and Scientific Expertise
by Melanie Brener
Self-tracking is a social technology focused on measuring human activity. It is not new, and it has not always been digital. People have long kept logs of their activity in order to understand how they function as individuals in comparison to the society around them. Although this technology has always been social, the digitalization of self-tracking creates many unknowns around data privacy and data use. Beyond questions of data, digital self-tracking has made the technology even more socially focused: self-tracking devices can act as a social force that decides what healthy behavior is and shames those whose behavior does not meet that norm. As self-tracking technology develops, users gain more power to collect their individual data, but this power of collection does not always give them the ability to make sense of that data. This raises the question: how can self-tracking be used so that individual users both control their data and can ask the right questions of it, and get the right answers from it?
Social norms play a large role in how people use self-tracking and share personal data. This becomes a problem when users begin to rely on self-tracking technology to tell them what counts as normal, and therefore self-track in order to fit in with society. There will always be users who fall outside the norm because of limitations they cannot control, and when they cannot meet the perceived social norm, they may feel alienated from the technology. Alternatively, users who fall outside the norm may record inaccurate data in order to appear to fit in. If inaccurate data is then used to decide what is or is not normal, unrealistic expectations become the reference point for everyone else. This is a serious problem because the “user may come to see himself instead, through the data and the systems designed to encourage behavior change” (Neff and Nafus, 41). When this occurs, it is not the user who has failed but the technology. Self-tracking is not going to go away, since the impulse to track is embedded in human nature, but the technology can be developed to teach each user what behavior is healthy for them as an individual.
Earlier in the course we discussed the pitfalls of social media (see Bobbi Whitney’s essay). Some of those pitfalls apply equally to self-tracking technology because of its social nature, and data privacy and data use are among the main issues that need to be addressed. Users have become accustomed to sharing personal information to increase engagement and feelings of community. Self-tracking technology is distinctive because it appeals to a user’s desire for empowerment over their own data. A user will likely feel empowered as they count their steps, watch their weight, or track their health and behaviors in other ways, but after the tracking step they usually have little control over where their data goes. If users cannot make sense of their personal data themselves, they have to turn, knowingly or unknowingly, to the people who can. It is unethical for users not to know where their data is going and what it is being used for, which comes back to the idea of educating users about what they agree to when they use technology. As self-tracking technology develops and becomes more widely used, users must be given transparent information about their data.

Health insurance offers a clear example. Health and fitness trackers have become very popular in recent years, and as users self-track they may be required to input information on their age, weight, height, calorie intake, and exercise. The user enters this information so that their experience can be enhanced and supposedly individualized, and as they use the technology their personal information continues to be tracked. If the user later applied for health insurance and discovered that their fitness tracker had been sending data to insurers, making it harder to get a desired plan or raising their premiums, then the data they once found empowering to enter would have been used against them without their knowledge. It is unethical to encourage users to share personal information when they do not know how it will be used.
The ethics of data use tie into how self-tracking technology can be integrated with scientific expertise to promote genuinely healthy behavior. As mentioned above, encouraging users to share personal data without transparency about how that data will be used is unethical. Even so, self-tracking technology can be a valuable tool for teaching users to understand their own data. Self-tracking or monitoring can help users set goals and nudge them toward healthier behaviors, and it can help people “who are not professional researchers to collect data and ask questions about it” so that they gain a better understanding of their personal data (Neff and Nafus, 44). The technology should be developed in two main ways. The first is transparency: because the point of self-tracking is to help users follow their own behavior over time, the technology should clearly state how it uses data and whether that data is compiled by outside companies. The second is the ‘intranet’ approach to self-tracking: the technology should give users a choice about what happens to their data. For example, a user who tracks their blood pressure should be able to keep that information to themselves or share it with a scientific expert if they choose. To make this approach successful, the technology can also build in answers to the questions an individual’s data may raise, as sketched below. If a user finds that their blood pressure is unusually low or high, the device can be designed to provide information on what they can do to bring it back to a normal range.
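To make the opt-in idea concrete, here is a minimal sketch in Python of what such a design could look like. The class and function names, and the reference ranges, are entirely hypothetical and for illustration only: readings stay on the user’s device, basic feedback is generated locally, and nothing is exported to an outside expert unless the user has explicitly consented.

```python
from dataclasses import dataclass, field

# Hypothetical reference ranges for illustration only; real guidance
# would come from a clinician or published clinical guidelines.
NORMAL_SYSTOLIC = (90, 120)
NORMAL_DIASTOLIC = (60, 80)

@dataclass
class BloodPressureReading:
    systolic: int
    diastolic: int

@dataclass
class SelfTracker:
    """Keeps readings on the user's device; nothing leaves without consent."""
    readings: list = field(default_factory=list)
    share_with_expert: bool = False  # off by default: data stays local

    def record(self, reading: BloodPressureReading) -> str:
        self.readings.append(reading)
        return self.local_feedback(reading)

    def local_feedback(self, r: BloodPressureReading) -> str:
        # Answer the user's likely question on-device, without sharing data.
        if r.systolic > NORMAL_SYSTOLIC[1] or r.diastolic > NORMAL_DIASTOLIC[1]:
            return "Reading is above the typical range; consider consulting a clinician."
        if r.systolic < NORMAL_SYSTOLIC[0] or r.diastolic < NORMAL_DIASTOLIC[0]:
            return "Reading is below the typical range; consider consulting a clinician."
        return "Reading is within the typical range."

    def export_for_expert(self) -> list:
        # Data only leaves the device if the user has explicitly opted in.
        if not self.share_with_expert:
            raise PermissionError("User has not consented to sharing these readings.")
        return list(self.readings)

tracker = SelfTracker()
print(tracker.record(BloodPressureReading(systolic=135, diastolic=85)))
```

The key design choice in this sketch is that sharing is a deliberate act the user turns on, rather than a default the user must discover and turn off.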
Self-tracking has many benefits for individuals and society, but like any technology it has its downfalls. It should be developed so that users are not focused on achieving a social norm but on individual development. In other words, self-tracking technology should center the individual rather than prescriptive beliefs about how people should behave in society.