There is always a tension between wanting to stand apart from others, asserting our individual identity, and desiring to belong to a collective, be it an organization or a culture. This tension is more evident, and more acceptable, in some societies than in others. In Chinese culture, for instance, the desire to be more individualistic is certainly not encouraged and is often actively suppressed, whereas in the States the coexistence of individualism and belonging seems ubiquitous. Perhaps it is inevitable that the need for individual expression and our collective appetite for the conveniences of high tech would clash from time to time. In computer software especially, where algorithms are all about finding and building on commonalities, how does individualism come through?
While most of us welcome the conveniences technology has afforded us, both at work and in our social world, some of us also yearn for a few truly smart technologies that would offer more individual design…something that's uniquely "me." Yet however fast the ever-improving technologies are, or however "smart" the devices, they still cannot satisfy some of our basic emotional needs. Sometimes these supposedly smart things are downright annoying. Nothing new here. So now a trend is developing in "positive computing" to address individual needs. Rafael Calvo and Dorian Peters of the University of Sydney propose a higher calling for technology: "to support well-being, wisdom and human potential." A few universities and Google have begun to take on the challenge.
"Positive computing" responds to the harassed feelings most of us seem to have acquired toward ever-expanding technologies. I think part of that stress comes from the constant demands arriving from every direction with little regard for who we are individually. Since technologies seem to make everything happen instantaneously, we want not just something now, but the thing each of us needs or wants now. In this cacophony of "I want," "Listen to me," "Now," "Where is mine?" we paradoxically feel lost as individuals. It's not blatant; it's subtle, but the loss is palpable. I mentioned that computing algorithms are built on finding and tapping into the commonalities among us. They may give us some individual control in a few areas, but they require us to behave in consistently the same ways in order for the software to work.
When Facebook provided the "Year in Review" at the end of 2014, the majority of its users probably either welcomed it or shrugged it off as another marketing ploy. In some unfortunate, exceptional cases, however, this little "innocent" app could crush souls. It is hardly unexpected that people whose year was marred by serious illness, injury, or death would not want this particular feature. Would you like this app to hit you with the smiling face of someone who had been yanked out of your existence? One web designer, Eric Meyer, lost his young daughter that year. Her picture, displayed on Mr. Meyer's FB page under the banner "Year in Review," did not exactly lead to happy feelings.
Mr. Meyer penned "Inadvertent Algorithmic Cruelty" on his blog. As a web designer himself, he did not lambast FB's people as others might have; he understood their business priorities. However, his own grief gave him insight that obviously escaped most software designers. He offered two measures to correct the assumption that everyone wants to share their year of pictures on FB. First, "don't pre-fill a picture until you're sure the user wants to see the pictures from their year." Second, offer people the option of declining the app, honor it, and don't pester them at intervals. Mr. Meyer further proposed that designers should perhaps use worst-case scenarios as the baseline instead of assuming the best case for everyone. Not knowing much about computer programming, I cannot say. Knowing human nature and logic, though, I think the fundamental problem is that the algorithm leaves little room for individual discretion, best or worst scenario.
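For readers who do know a little programming, Meyer's two measures can be sketched in a few lines of code. This is purely a hypothetical illustration of the design principle, not Facebook's actual implementation; every name here (User, year_in_review) is invented for the example. The key move is the worst-case default: consent starts out unknown, and nothing is pre-filled until the user explicitly opts in.

```python
# A minimal sketch of "worst-case baseline" consent, as Meyer suggests.
# Hypothetical names throughout; not any real platform's code.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class User:
    name: str
    photos: list = field(default_factory=list)
    # Worst-case default: None means "never asked" — treated as "no."
    wants_year_in_review: Optional[bool] = None

def year_in_review(user: User) -> Optional[list]:
    """Return the user's photo review only with explicit consent."""
    if user.wants_year_in_review is not True:
        # Don't pre-fill a picture; don't pester users who declined.
        return None
    return user.photos

# A user who never opted in sees nothing pre-filled.
grieving_user = User(name="a", photos=["daughter.jpg"])
assert year_in_review(grieving_user) is None

# Only an explicit opt-in produces the review.
opted_in = User(name="b", photos=["beach.jpg"], wants_year_in_review=True)
assert year_in_review(opted_in) == ["beach.jpg"]
```

The design choice is simply which state the system assumes when it knows nothing: here, absence of consent is treated the same as refusal, which is the inverse of the "everyone had a great year" assumption Meyer criticized.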
What's most heartening about Eric Meyer's story is the follow-up, a somewhat unexpected turn of events. His original post created a firestorm for FB, especially for the "Year in Review" team, which was never Mr. Meyer's intention. The FB "Year in Review" product manager personally apologized to him, and in turn Mr. Meyer was even more humble in his own apology. He had not meant to dump on FB; the problem with an algorithm is the "thoughtless" nature that defines the industry. (Sidenote: would your case or mine elicit a personal apology from the product manager?) Mr. Meyer even took time and space to defend FB against some very nasty comments on his first post. The irony he pointed out was this: in attacking FB for insensitivity and blind imposition, many commenters "inadvertently" made assumptions about FB's programmers that might or might not be true. How was that different from the algorithm's blind assumptions?
Indeed, are we doomed to want to impose on others standards from which we ourselves want to be excluded?
The central feature of all this back and forth, between Meyer's blog posts, his readers' comments, FB's responses, and our desire to keep our unique features in the universe of algorithms, is Eric Meyer's humility. In his grief and sorrow, he still found sympathy for the programmers and offered lessons for all to contemplate. I think that's the ultimate challenge for "positive computing/technology": how to capture that uniquely humane spirit in a world of sameness? Personally, humility and creativity rank the highest for me. What would be yours?
Till next time,
Staying Sane and Charging Ahead.
Direct Contact: email@example.com