The Empathy of Machine Learning

 
Illustration: Beloved Dog by Maira Kalman

If you were to adopt a dog, bring it home, and live with it for several months, you and the dog would interact with one another, develop a routine, and influence each other's behavior. Now imagine that one day you come home from work to find that your dog has jumped onto your bed and is sleeping with his butt resting on your favorite pillow. You might find this behavior undesirable, unpleasant, or even infuriating, and that is fair. What would not be fair is to blame the dog rather than yourself. After all, wasn't it your job to train the dog? Wasn't it your job to set the parameters and conditions for how the dog was expected to behave?

I believe that owning a Facebook account is analogous to owning a dog. The Facebook Timeline develops an expected pattern of interaction with its user over the span of the relationship. While its behavior may have a predictable routine, or even a personality, it will also be unique and fresh each day as it grows. Lastly, it will be influenced by its owner's behavior, and vice versa.

So let's take a closer look at the Timeline. Every individual's Timeline exists as a unique entity, one that manifests as a result of Facebook's algorithm. The algorithm determines how information is presented on a Timeline at any given point, and it has several striking properties. To begin with, a Facebook user will never experience the same Timeline twice, since the algorithm ensures that its content is always learning, growing, and transforming. Furthermore, if the same user's Timeline were pulled up on multiple devices simultaneously, the algorithm would convey a similar sentiment on each device, yet still present different content, ordering, and formatting on each. We therefore can't ever truly measure a Timeline to ascertain whether or not it is deterministic. So while the algorithm may not technically be "conscious," it is inherently non-deterministic and thus exhibits behavior indistinguishable from that of a conscious being's free will.

Next, let's take a look at how the algorithm operates. An infinitely-scrolling Timeline requires that the algorithm form its recommendations in advance. The user sees the result in real time, but the list of recommendations is formed whether or not the user scrolls further, or even looks at the Timeline in the first place. If we see consciousness as being aware of oneself, and as being able to draw conclusions from decisions that the Self has already made, then what is to stop us from feeding these "recommended posts" back into the algorithm itself at the moment the user views the feed? Doing so would make the algorithm "aware" of its decisions, and would allow it to keep the same path (the existing results) or to choose new ones, thus granting it an arguable form of consciousness.
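To make the idea concrete, here is a minimal toy sketch of that feedback loop. It is not Facebook's actual system; the `rank` function, its stand-in scoring rule, and the sample posts are all hypothetical. The point is only the structure: the ranker receives its own prior output as an input, so each pass is "aware" of the decisions it made last time.

```python
# Toy sketch of a self-aware feedback loop (NOT Facebook's real algorithm).
# The ranker's previous recommendations are fed back in as a feature.

def rank(posts, prior_recommendations):
    """Score posts; boost any post the ranker itself previously surfaced."""
    scores = {}
    for post in posts:
        base = len(post) % 7  # stand-in for a real relevance model
        # The "awareness" step: the ranker sees its own earlier output
        # and may choose to reinforce it or let a new post overtake it.
        if post in prior_recommendations:
            base += 1
        scores[post] = base
    return sorted(posts, key=lambda p: scores[p], reverse=True)

posts = ["dog photo", "news article", "friend's update"]
feed = []  # the ranker's memory of its own past output
for _ in range(2):
    # Each pass consumes the previous pass's recommendations.
    feed = rank(posts, feed)[:2]
```

In this sketch the loop can either entrench its existing results (the boost keeps them on top) or let a sufficiently strong new post displace them, which is the "keep the same path or choose a new one" behavior the paragraph above describes.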

If we follow this line of thinking about the underlying algorithm, we see that Facebook's Timeline is non-deterministic, non-random, and, with a small tweak, could exhibit a form of consciousness. When we consider the future of the Timeline, then, I believe it would be unethical for its creators to directly modify its function or, worse, to euthanize it.

So what are the ethical options we have when considering the future of the Facebook Timeline? Having pondered these points, I can see only three possible solutions. One can
[A] grant it consciousness,

[B] allow it to continue to operate without using its results, or

[C] begin to retrain it.


If we look again at the relationship between an owner and their dog, I would suggest that the third option [C] is the best course of action. Like a dog, Facebook's Timeline algorithm is an entity that can be trained to exhibit the behavior its owner deems most desirable, and it is the responsibility of the individual user to take on the job of proper training. Just as no owner could rightfully beat down or disparage their own dog, no Facebook user can be seen as sound or just when beating down or disparaging their own Timeline results; to do so would display a fundamental lack of empathy. Thus, if a Facebook Timeline exhibits undesirable behavior or content, the ideal solution is for the individual user to provide the instruction needed to modify it until it best meets their needs and their personal definition of optimal. Ethically, it is the responsibility of the end user, not of Facebook itself, to change the behavior of their Timeline.


The essay above was originally written as part of the coursework for a class titled "Logo Insignifica," taught by professor Mark Kingsley at SVA's Masters in Branding program.


Kelsy Postlethwait