The Ethics of Apps

Dylan Roskams-Edris warns about the habit-forming and addictive capacities of smartphone apps and explains the need for an ethics of apps.


B.F. Skinner, an American psychologist, did not believe in free will. He believed that all behaviour is controlled by environmental factors and that specific behaviours can be strengthened or weakened through conditioning. Behaviours followed by positive rewards, such as praise, are reinforced and therefore likely to reoccur. The developers of smartphone apps understand this all too well. Examples of reinforcement tools include “likes,” “shares,” and “views,” all of which work tremendously well on a brain evolved to find signs of social acceptance rewarding.

Tristan Harris, a former design ethicist and product philosopher at Google, addresses this issue in his effort to “Break the Binge” of app usage. He maintains that the prime directive in Silicon Valley is to “hook” users into spending time on apps. The more time spent on apps, the more ad revenue for the developers. From a commercial perspective, it is simple. From an ethics perspective, not so much. Smartphone apps manipulate our behaviour for profit, which is, to say the least, ethically troubling.

The danger lies in the combination of smartphone technology aimed at reinforcing app usage, connected to our hips and ears by vibrations and tones, and a business model based on attracting and keeping our attention. This combination co-opts our time and freedom and directs us towards information and advertisements which often add little value to our lives.

Image Description: Cell phone screen home page filled with apps.

By applying what Skinner called the “technology of behaviour,” any animal can be trained to respond in predictable ways. Humans are no exception. The training required to capture and keep our attention is shockingly easy: simply pair a tone with a reward, whether social acceptance or gameplay achievement, and wait as users train themselves to reach ever more readily for their devices in response. Perhaps it is the ease with which we can be drawn to our screens that has prevented us from acknowledging how powerful a training tool we carry in our pockets. We simply refuse to believe that our will could be so easily co-opted by tones and vibrations. In reality, we are animals equipped with brains specialized to recognize when a behaviour leads to activation of our reward system, and to turn those behaviours into unconscious habits.

App developers and their consultants know this and exploit our tendencies toward habit formation and addiction. Many apps appear to be little more than conditioning vehicles that capitalize on our reward learning system. A sufficient number of pairings between the neutral stimulus of a vibration and a rewarding indicator of social acceptance (for example, a “like”) forges a deepening connection within our brains. Anyone who has found themselves looking at the Facebook app with no idea of how the smartphone came to be in their hand knows the effects of being conditioned. Eventually our rational mind is bypassed altogether, leaving us to rationalize a reason for our distracting habit.

App and smartphone technology, like other tools, are not inherently bad or harmful. Rather, it’s the way we use the tools that can be problematic. The question is not, therefore, whether we should develop and use these technologies, but how we should use them.

Harris’s recommendation is to create a kind of certification for apps that meet ethical guidelines, and to forge partnerships between his organization – “Time Well Spent” – and app developers to encourage ethical development. While the effort to figure out how to develop apps ethically is still a nascent project, there are at least three complementary ways it can progress.

First, apps could be designed to record and display information about usage (for example, the number of times the app is used per day and the total time spent using it). People can then use this information to put their screen time into perspective and think about whether the reward is worth the price in time and distraction.
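To make this first suggestion concrete, here is a minimal sketch of what such in-app usage tracking might look like. The `UsageTracker` class and its method names are hypothetical illustrations, not part of any real app framework; the point is simply that counting opens and accumulating session time per day requires very little machinery.

```python
import time
from collections import defaultdict


class UsageTracker:
    """Hypothetical in-app tracker: counts opens and total time per day."""

    def __init__(self):
        self.opens = defaultdict(int)            # date -> number of opens
        self.total_seconds = defaultdict(float)  # date -> cumulative seconds
        self._opened_at = None

    def _today(self):
        return time.strftime("%Y-%m-%d")

    def app_opened(self):
        # Record one more session and start the timer.
        self._opened_at = time.monotonic()
        self.opens[self._today()] += 1

    def app_closed(self):
        # Stop the timer and add the elapsed time to today's total.
        if self._opened_at is not None:
            self.total_seconds[self._today()] += time.monotonic() - self._opened_at
            self._opened_at = None

    def daily_summary(self):
        # The figures a user would see when reflecting on their screen time.
        day = self._today()
        return {"opens": self.opens[day],
                "minutes": round(self.total_seconds[day] / 60, 1)}
```

Displaying `daily_summary()` prominently, rather than burying it in a settings menu, is what would let users weigh the reward against the price in time and distraction.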

Second, apps could be designed to limit their habit-forming properties, for example by controlling the number or timing of notifications. There is a substantial body of research from neuroscientists and psychologists detailing how the timing between stimulus and reward makes training mammalian brains more or less effective. Only allowing pings to distract us during predetermined portions of the day, for example, prevents users from training themselves to be, at all times, prepared to attend to their devices.
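The batching idea above can be sketched as follows. The `BatchedNotifier` class and the particular delivery windows are invented for illustration; the design decision it demonstrates is that notifications are held in a queue and released only during fixed windows, breaking the unpredictable ping-reward pairing that drives conditioning.

```python
from datetime import datetime, time as dtime

# Hypothetical delivery windows: notifications are held back and released
# only during these predetermined portions of the day.
DELIVERY_WINDOWS = [(dtime(12, 0), dtime(12, 30)),
                    (dtime(18, 0), dtime(18, 30))]


class BatchedNotifier:
    """Sketch of a notifier that batches pings into fixed daily windows."""

    def __init__(self, windows=DELIVERY_WINDOWS):
        self.windows = windows
        self.queue = []

    def push(self, message):
        # Hold the notification instead of pinging the user immediately.
        self.queue.append(message)

    def in_window(self, now):
        t = now.time()
        return any(start <= t <= end for start, end in self.windows)

    def flush(self, now=None):
        # Deliver everything queued, but only inside a delivery window.
        now = now or datetime.now()
        if not self.in_window(now):
            return []
        delivered, self.queue = self.queue, []
        return delivered
```

A real implementation would sit in the platform's notification layer, but even this sketch shows that the change is small: the reward arrives on the user's schedule, not the app's.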

Third, we could remove the temptation altogether by turning off notifications for the apps we enjoy, and perhaps by deleting apps that are designed merely as convenient vehicles for distraction.

We need an ethics of apps and a responsible consumerism in order to realize the utility of instantaneous communication without allowing ourselves to be distracted, for little reason, by the digital sound and fury, signifying nothing.


Dylan Roskams-Edris is a 3L law student at the Schulich School of Law at Dalhousie University. @DylanWRE