<?xml version="1.0" encoding="UTF-8"?><feed
	xmlns="http://www.w3.org/2005/Atom"
	xmlns:thr="http://purl.org/syndication/thread/1.0"
	xml:lang="en-US"
	>
	<title type="text">Irina Raicu | Vox</title>
	<subtitle type="text">Our world has too much noise and too little context. Vox helps you understand what matters.</subtitle>

	<updated>2019-03-06T11:22:54+00:00</updated>

	<link rel="alternate" type="text/html" href="https://www.vox.com/author/irina-raicu" />
	<id>https://www.vox.com/authors/irina-raicu/rss</id>
	<link rel="self" type="application/atom+xml" href="https://www.vox.com/authors/irina-raicu/rss" />

	<icon>https://platform.vox.com/wp-content/uploads/sites/2/2024/08/vox_logo_rss_light_mode.png?w=150&amp;h=100&amp;crop=1</icon>
		<entry>
			
			<author>
				<name>Irina Raicu</name>
			</author>
			
			<title type="html"><![CDATA[Artificial intelligence is forcing us to work harder to define human intelligence — and to fight to defend it]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2017/12/19/16792566/artificial-intelligence-ai-human-ethics-john-steinbeck-brainco-brainwaves" />
			<id>https://www.vox.com/2017/12/19/16792566/artificial-intelligence-ai-human-ethics-john-steinbeck-brainco-brainwaves</id>
			<updated>2017-12-19T14:11:29-05:00</updated>
			<published>2017-12-19T08:00:04-05:00</published>
			<category scheme="https://www.vox.com" term="Artificial Intelligence" /><category scheme="https://www.vox.com" term="Big Data" /><category scheme="https://www.vox.com" term="Innovation" /><category scheme="https://www.vox.com" term="Privacy &amp; Security" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[This is a contributed article by Irina Raicu, the director of the Internet Ethics program at the Markkula Center for Applied Ethics. &#8220;Sometimes a type of glory lights up the mind of a man,&#8221; writes John Steinbeck in his novel &#8220;East of Eden,&#8221; which is set in a California valley &#8212; Salinas, though, not Silicon. [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="BrainCo CEO Bicheng Han wears the BrainCo Focus 1 to use biofeedback to change the color of a lamp by concentrating or relaxing during a press conference at the CES 2016 Consumer Electronics Show on January 7, 2016, in Las Vegas, Nevada. | David McNew / AFP / Getty Images" data-portal-copyright="David McNew / AFP / Getty Images" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/9892599/Brainco.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	BrainCo CEO Bicheng Han wears the BrainCo Focus 1 to use biofeedback to change the color of a lamp by concentrating or relaxing during a press conference at the CES 2016 Consumer Electronics Show on January 7, 2016, in Las Vegas, Nevada. | David McNew / AFP / Getty Images	</figcaption>
</figure>
<p><em>This is a contributed article by </em><a href="https://www.linkedin.com/in/irina-raicu-b65a707/"><em>Irina Raicu</em></a><em>, the director of the Internet Ethics program at the </em><a href="http://scu.edu/ethics/"><em>Markkula Center for Applied Ethics</em></a><em>.</em></p>

<p>&ldquo;Sometimes a type of glory lights up the mind of a man,&rdquo; writes John Steinbeck in his novel &ldquo;East of Eden,&rdquo; which is set in a California valley &mdash; Salinas, though, not Silicon. &ldquo;It happens to nearly everyone. You can feel it growing or preparing like a fuse burning toward dynamite. &#8230; It is the mother of all creativeness, and it sets each man separate from all other men.&rdquo;</p>

<p>Okay, but what does that have to do with artificial intelligence?</p>

<p>In the novel, published in 1952, Steinbeck continues:</p>
<blockquote class="wp-block-quote has-text-align-none is-layout-flow wp-block-quote-is-layout-flow">
<p>I don&rsquo;t know how it will be in the years to come. There are monstrous changes taking place in the world, forces shaping a future whose face we do not know. Some of these forces seem evil to us, perhaps not in themselves but because their tendency is to eliminate other things we hold good.</p>
</blockquote>
<p>That line finds an echo in our times. Various ethicists are writing, these days, about the concern that AI might eliminate some things &ldquo;we hold good&rdquo; &mdash; and not just jobs. They write, for example, about the threat of <a href="https://link.springer.com/article/10.1007/s13347-014-0156-9">&ldquo;moral de-skilling&rdquo; in the age of algorithmic decision-making</a>. About what might be lost or diminished by <a href="https://www.forbes.com/sites/privacynotice/2014/07/17/why-we-should-be-careful-about-adopting-social-robots/#6847a19371ef">the advent of robot caretakers</a>. About what role humans will play, in general, in an age when machine learning and neural networks make so many of the decisions that shape human lives.</p>

<p>&ldquo;It is true,&rdquo; Steinbeck writes,</p>
<blockquote class="wp-block-quote has-text-align-none is-layout-flow wp-block-quote-is-layout-flow">
<p>that two men can lift a bigger stone than one man. A group can build automobiles quicker and better than one man, and bread from a huge factory is cheaper and more uniform. When our food and clothing and housing all are born in the complication of mass production, mass method is bound to get into our thinking and to eliminate all other thinking.</p>
</blockquote>
<p>We are in the process of shifting from the kind of mass production that Steinbeck talked about to a kind of mass production that requires much less human involvement. If &ldquo;mass method&rdquo; was bound to get into our thinking back then, how is it shaping our thinking now? Is this what the current focus on data collection and analysis of patterns is about?</p>

<p>&ldquo;In our time,&rdquo; adds Steinbeck,</p>
<blockquote class="wp-block-quote has-text-align-none is-layout-flow wp-block-quote-is-layout-flow">
<p>mass or collective production has entered our economics, our politics, and even our religion, so that some nations have substituted the idea collective for the idea of God. This in my time is the danger. There is great tension in the world, tension toward a breaking point, and men are unhappy and confused.</p>
</blockquote>
<p>In our own time, AI is spreading into all the various spheres of our lives, and there is tension and great concern about its impact. We are confused by dueling claims that AI will eliminate jobs or create new ones; that it will eliminate bias or perpetuate it and make it harder to identify; that it will lead us to longer, happier lives &mdash; or to extinction.</p>

<p>&ldquo;At such a time,&rdquo; writes Steinbeck&rsquo;s narrator, &ldquo;it seems natural and good to me to ask myself these questions. What do I believe in? What must I fight for and what must I fight against?&rdquo;</p>

<p>Good questions for us, too.</p>

<p>&ldquo;Our species is the only creative species,&rdquo; writes Steinbeck, &ldquo;and it has only one creative instrument, the individual mind and spirit of a man.&rdquo; He goes on to knock down the notion of collaborative creativity, and you can certainly disagree with that, but keep in mind that in our time the trajectory seems to be toward handing over creativity, too, to algorithms, leaving aside the human mind (whether individual or collective).</p>

<p>Of course, it is still the human mind &mdash; individual or collective &mdash; that decides what data to collect for algorithms to analyze, what factors to incorporate into algorithms, what weight to give distinct factors and what data to use in training those algorithms; but those decisions are camouflaged by the often-accepted myth that &ldquo;data-driven&rdquo; or &ldquo;data-based&rdquo; algorithmic processes are objective, or neutral (unlike other human decision-making processes).</p>

<p>&ldquo;And this I believe,&rdquo; continues Steinbeck:</p>
<blockquote class="wp-block-quote has-text-align-none is-layout-flow wp-block-quote-is-layout-flow">
<p>that the free, exploring mind of the individual human is the most valuable thing in the world. And this I would fight for: the freedom of the mind to take any direction it wishes, undirected. And this I must fight against: any idea, religion, or government which limits or destroys the individual. &#8230; I can understand why a system built on a pattern must try to destroy the free mind, for this is the one thing which can by inspection destroy such a system.</p>
</blockquote>
<p>I think about that as I read about the latest developments in data-driven pedagogy and education technologies that try to read &mdash; in order to shape &mdash; developing minds. &ldquo;Are your brainwaves private, sensitive information?&rdquo; asks <a href="https://www.csoonline.com/article/3239969/security/company-with-no-privacy-policy-to-collect-brainwave-data-on-1-2-million-students.html">a recent article in CSO magazine</a>:</p>
<blockquote class="wp-block-quote has-text-align-none is-layout-flow wp-block-quote-is-layout-flow">
<p>Most people probably never really gave it much thought because it is not something companies usually collect and store. But if kids in school had to start wearing brainwave-detecting headbands that measure their attention levels in real time, couldn&rsquo;t that impact student privacy? The brainwave attention-level results are shared with teachers and school administrators, and are collected and stored by a private company.</p>
</blockquote><figure class="wp-block-pullquote alignleft"><blockquote><p>Telling students that their attention levels (and emotions, and what else about their brains?) will be detected, measured and collected is very likely to impact that “free, exploring mind of the individual human” that Steinbeck wrote about.</p></blockquote></figure>
<p>This is not fiction. A company named BrainCo claims to offer the &ldquo;world&rsquo;s first wearable device specifically designed to detect and analyze users&rsquo; attention levels,&rdquo; in conjunction with &ldquo;the world&rsquo;s first integrated classroom system that improves education outcomes through real-time attention-level reports.&rdquo; <a href="https://www.csoonline.com/article/3239969/security/company-with-no-privacy-policy-to-collect-brainwave-data-on-1-2-million-students.html">CSO reports</a> that BrainCo has sold 20,000 devices to China, and that BrainCo&rsquo;s CEO has said the company&rsquo;s goal is &ldquo;to capture data from 1.2 million people &#8230; [which] will enable us to use artificial intelligence on what will be the world&rsquo;s largest database to improve our algorithms for things like attention and emotion detection.&rdquo;</p>

<p>So the claimed goal is to harvest human brain waves in order to improve artificial intelligence, purportedly with the ultimate goal of improving human education and therefore human intelligence (though, as the CSO article notes, BrainCo representatives &ldquo;did not rule out that students&rsquo; brainwave data might be used &lsquo;for a number of different things&rsquo;&rdquo;).</p>

<p>Of course, the implications of such practices go beyond student privacy; or, rather, student privacy gets at something deeper than concerns about identity theft or potential misuse of disciplinary or grade records. Creativity requires privacy. Telling students that their attention levels (and emotions, and what else about their brains?) will be detected, measured and collected is very likely to impact that &ldquo;free, exploring mind of the individual human&rdquo; that Steinbeck wrote about.&nbsp;There&rsquo;s a reason why &ldquo;Dance like nobody&rsquo;s watching&rdquo; rings so true to so many.</p>

<p>One might add, &ldquo;Think like nobody&rsquo;s strapping a band around your head to collect information about your thinking.&rdquo;</p>

<p>Creativity involves a leap &mdash; a departure from the known, the norm. Departures from the norm are subversions of the status quo. We are social animals, and most of us would be uncomfortable with being seen as subversive. So the brain-analyzing devices themselves might well chill creative thought, even if they were purely placebos. And they are not intended as placebos &mdash; so the algorithms involved might well &ldquo;learn&rdquo; about the chilled thinking that they themselves caused, and magnify and perpetuate its stunting effects.</p>

<p>The &ldquo;glory&rdquo; that Steinbeck wrote about is different from the insights of &ldquo;a system built on a pattern.&rdquo; The age of artificial intelligence forces us to work harder to define human intelligence &mdash; and to fight to defend it.</p>
<hr class="wp-block-separator" />
<p><a href="https://www.linkedin.com/in/irina-raicu-b65a707/"><em>Irina Raicu</em></a><em> is the director of the Internet Ethics program at the </em><a href="http://scu.edu/ethics/"><em>Markkula Center for Applied Ethics</em></a><em>, Santa Clara University.&nbsp; Views are her own. Follow the Internet Ethics program on Twitter at </em><a href="https://twitter.com/IEthics"><em>@IEthics</em></a><em>.</em></p>
<hr class="wp-block-separator" />

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Irina Raicu</name>
			</author>
			
			<title type="html"><![CDATA[We need to take a vacation from social media]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2017/7/28/16051888/social-media-vacation-facebook-diary-shared-experience" />
			<id>https://www.vox.com/2017/7/28/16051888/social-media-vacation-facebook-diary-shared-experience</id>
			<updated>2017-07-28T13:21:27-04:00</updated>
			<published>2017-07-28T06:15:02-04:00</published>
			<category scheme="https://www.vox.com" term="Facebook" /><category scheme="https://www.vox.com" term="Social Media" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[It&#8217;s summer, so lots of people are posting vacation pictures on various social media platforms. I do, too &#8212; but not until I&#8217;m back to my regular life. I still log in while traveling; I look at others&#8217; posts, comment sometimes; I might even post an occasional article. But I don&#8217;t post descriptions or pictures [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Hero Images / Getty Images" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/8942827/selfie_vacation.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>It&rsquo;s summer, so lots of people are posting vacation pictures on various social media platforms. I do, too &mdash; but not until I&rsquo;m back to my regular life.</p>

<p>I still log in while traveling; I look at others&rsquo; posts, comment sometimes; I might even post an occasional article. But I don&rsquo;t post descriptions or pictures of what I did or saw that day, even if I think my friends would enjoy them.</p>

<p>Like most people, I&rsquo;ve read the warnings about not posting travel plans in advance, or sharing updates that might lead potential thieves to break into empty homes. That concern is not the primary driver of my reluctance, though.</p>

<p>It&rsquo;s more that various platforms (and Facebook especially) are, weirdly, both a kind of diary and a public performance &mdash; and, when I travel, I realize that the diary-ness is really an illusion. If Facebook is a diary, it&rsquo;s one from which you read out loud in a public park &mdash; or at least at a motley gathering of friends, relatives and colleagues.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>Traveling with others means enforced physical closeness and a heightened sense of synchronously shared new experiences. It makes one realize what social media does and does not provide.</p></blockquote></figure>
<p>On vacation, for me, the &ldquo;social&rdquo; and &ldquo;media&rdquo; become separated. The social happens in my interactions with the people with whom I&rsquo;m traveling, or the new people I meet along the way. It might happen, too, with the people who are close enough to me to know where (geographically) I am. But Facebook &ldquo;friends,&rdquo; say, encompass so many others who don&rsquo;t have that granular awareness of my life. The interaction with them is a different kind of &ldquo;social.&rdquo; Likeable and rewarding though it is, it is definitely more performative. Travel makes me more aware of this.</p>

<p>On social media, each of my vacations will eventually become a story. While I travel, the script is being improvised on a daily basis; as far as I&rsquo;m concerned, it&rsquo;s not ready for public consumption. Back at home, once I know how the vacation story developed, what the highlights were, how it ended, what I want to call out or remember, I usually do post details and pictures. Back at home, the &ldquo;social&rdquo; and the &ldquo;media&rdquo; once again blur together a bit more. The awareness of what Facebook really is and isn&rsquo;t fades again &mdash; until the next trip.</p>

<p>I realize that many people use various platforms very differently while on vacation, and I don&rsquo;t mean any criticism of their ways. Our experiences of social media (especially given the vast differences among platforms &mdash; say, Facebook versus Snapchat) vary widely. But I&rsquo;m reminded of an article I read, published ever so long ago in social-media time (2014), titled <a href="http://www.newyorker.com/science/maria-konnikova/social-media-affect-math-dunbar-number-friendships">&ldquo;The Limits of Friendship.&rdquo;</a> The article discussed the research of anthropologist and psychologist Robin Dunbar. &ldquo;There is no question,&rdquo; it said, &ldquo;that networks like Facebook are changing the nature of human interaction.&rdquo;</p>
<blockquote class="wp-block-quote has-text-align-none is-layout-flow wp-block-quote-is-layout-flow">
<p>What Facebook does and why it&rsquo;s been so successful in so many ways is it allows you to keep track of people who would otherwise effectively disappear. But one of the things that keeps face-to-face friendships strong is the&nbsp;<a href="http://psycnet.apa.org/psycinfo/2001-00651-001">nature of shared experience</a>: You laugh together, you dance together, you gape at the hot-dog eaters on Coney Island together.</p>

<p>We do have a social-media equivalent &mdash; sharing, liking, knowing that all of your friends have looked at the same cat video on YouTube as you did &mdash; but it lacks the synchronicity of shared experience. It&rsquo;s like a comedy that you watch by yourself &mdash;  you won&rsquo;t laugh as loudly or as often, even if you&rsquo;re fully aware that all your friends think it&rsquo;s hysterical. We&rsquo;ve seen the same movie, but we can&rsquo;t bond over it in the same way.&nbsp;</p>
</blockquote>
<p>Traveling with others means enforced physical closeness and a heightened sense of synchronously shared new experiences. No wonder, then, that it makes one realize what social media does and does not provide.</p>

<p>Come to think of it, my family&rsquo;s never been to Coney Island. Maybe it&rsquo;s time to start planning the next trip.</p>
<hr class="wp-block-separator" />
<p><a href="https://www.linkedin.com/in/irina-raicu-b65a707"><em>Irina Raicu</em></a><em> is the director of the Internet Ethics program at the </em><a href="https://www.scu.edu/ethics/"><em>Markkula Center for Applied Ethics</em></a><em>, Santa Clara University. Follow the program on Twitter at </em><a href="https://twitter.com/IEthics?lang=en"><em>@IEthics</em></a><em>.</em></p>
<hr class="wp-block-separator" />
<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Irina Raicu</name>
			</author>
			
			<title type="html"><![CDATA[Is it time to separate the news from the Facebook newsfeed?]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2017/2/14/14601652/facebook-newsfeed-fake-news-filter-bubble-personalized" />
			<id>https://www.vox.com/2017/2/14/14601652/facebook-newsfeed-fake-news-filter-bubble-personalized</id>
			<updated>2017-02-15T16:09:18-05:00</updated>
			<published>2017-02-14T09:00:01-05:00</published>
			<category scheme="https://www.vox.com" term="Business &amp; Finance" /><category scheme="https://www.vox.com" term="Facebook" /><category scheme="https://www.vox.com" term="Media" /><category scheme="https://www.vox.com" term="Money" /><category scheme="https://www.vox.com" term="Social Media" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[Social media scholars talk a lot about &#8220;context collapse,&#8221; the term that describes what happens when, on a platform like Facebook, users find that they can&#8217;t communicate freely with their friends while their relatives are reading the same posts, or with their relatives while the employers get to read, too, etc. The mix of audiences [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Thanasak Wanichpan" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/7979395/newsfeed_facebook.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Social media scholars talk a lot about &ldquo;context collapse,&rdquo; the term that describes what happens when, on a platform like Facebook, users find that they can&rsquo;t communicate freely with their friends while their relatives are reading the same posts, or with their relatives while the employers get to read, too, etc.</p>

<p>The mix of audiences (on Facebook especially) has led to miscommunications, conflicts and, increasingly, <a href="https://www.theguardian.com/technology/2016/apr/19/facebook-users-sharing-less-personal-data-zuckerberg">self-censorship</a>.</p>

<p>In its drive to encourage users to connect with as many people as possible on its platform, Facebook undermined its own appeal. What started as a way for college students to communicate with other college students has become a &#8230; well, it&rsquo;s not clear what. That&rsquo;s in part because, along the way, another collapse happened.</p>

<p>Writing in AllThingsD in December 2013, Mike Isaac <a href="http://allthingsd.com/20131210/facebook-wants-to-be-a-newspaper-facebook-users-have-their-own-ideas/">highlighted</a> the distinction between most users&rsquo; view of the 2013 version of Facebook and the company&rsquo;s plan at the time:</p>
<blockquote class="wp-block-quote has-text-align-none is-layout-flow wp-block-quote-is-layout-flow">
<p>Most people think of Facebook in a similar way: It&rsquo;s a place to share photos of your kids. It&rsquo;s a way to keep up with friends and family members. It&rsquo;s a place to share a funny, viral story or LOLcat picture you&rsquo;ve stumbled upon on the Web.</p>

<p>This is not how Facebook thinks of Facebook. In Mark Zuckerberg&rsquo;s mind, Facebook should be &ldquo;the best personalized newspaper in the world.&rdquo;</p>
</blockquote>
<p>Fast-forward to 2017, when the Facebook newsfeed is full of links to news articles, and a new Pew Research Center study analyzes &ldquo;<a href="http://www.journalism.org/2017/02/09/how-americans-encounter-recall-and-act-upon-digital-news/">How Americans Encounter, Recall and Act Upon Digital News</a>.&rdquo; Among the study&rsquo;s interesting observations, a few stand out:</p>
<ul class="wp-block-list"><li>“When asked how they arrived at news content in their most recent web interaction, <a href="http://www.journalism.org/2017/02/09/experiential-appendix-key-concepts/">online news consumers</a> were about equally likely to get news by going directly to a news website (36 percent of the times they got news, on average) as getting it through social media (35 percent).”</li><li>“Individuals who said they followed a link to a news story were asked if they could write down the name of the news outlet they landed on. On average, they provided a name 56 percent of the time. But they were far more able to do so when that link came directly from a news organization — such as through an email or text alert from the news organization itself — than when it came from social media or an email or text from a friend.”</li><li>“… 10 percent of consumers, when asked to name the source of the news, wrote in ‘Facebook’ as a specific news outlet.”</li></ul>
<p>Following the recent presidential election, a lot has been written about Facebook&rsquo;s role in the spread of &ldquo;<a href="http://www.politifact.com/truth-o-meter/article/2016/dec/13/2016-lie-year-fake-news/">fake news</a>&rdquo; (of all stripes), as well as its role in the creation of &ldquo;<a href="https://en.wikipedia.org/wiki/Filter_bubble">filter bubbles</a>&rdquo; &mdash; the phenomenon in which people&rsquo;s own inclinations, combined with &ldquo;personalization&rdquo; efforts that further entrench those inclinations, create social media echo chambers where users are not exposed to beliefs (or even facts) different from those they already hold &mdash; which in turn leads them to believe, mistakenly, that most people share their view of the world.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>Following the recent presidential election, a lot has been written about Facebook’s role in the spread of  “fake news,” and its role in the creation of “filter bubbles.”</p></blockquote></figure>
<p>It turns out that the communication medium through which you send out party selfies or pictures of your pets and/or request more pictures of your grandkids and/or describe your dinner or your walk or your breakup in order to make yourself known and be admired or comforted by people who like you is not eminently suited to be, at the same time, a significant part of your serious news diet.</p>

<p>It&rsquo;s as if the family Christmas letter has been grafted, next to a work-related photo album and a collection of flirting texts, onto the trunk of a newspaper or news show curated by your aunt and by Facebook&rsquo;s <a href="http://www.slate.com/articles/technology/cover_story/2016/01/how_facebook_s_news_feed_algorithm_works.html">ever-changing algorithm</a>.</p>

<p>Picture the front page of the [insert name of newspaper you trust] or the homepage of [insert digital media source that you trust] featuring several articles about the new Supreme Court nominee, a photo from your college roommate&rsquo;s child&rsquo;s baseball game, a note from one of your childhood friends who now lives in another country, two articles about executive orders, a friend&rsquo;s comment about no longer eating meat, and a colleague&rsquo;s post about a work-related upcoming event on ethics and robots. Is that not what your Facebook Newsfeed looks like now? Mine does.</p>

<p>What do we turn to Facebook for?</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>We lose when we miss the news that we didn’t realize we should know. We lose when we stop reading articles that aren’t directly about us, and opinions that contradict ours.</p></blockquote></figure>
<p>In the wake of the election, reading about filter bubbles and fakes, I&rsquo;ve been thinking more and more about what I want out of Facebook. Increasingly, I don&rsquo;t want it to be my source of rehashed news stories and my opportunity to preach to the choir. I want it to be my go-to place for pictures of friends and jokes and little self-revelatory notes about people I care about. Maybe the occasional recipe, recommendation of a restaurant or a book. The announcement of a new birth. Even the heart-wrenching update about the health of someone else&rsquo;s loved one. Suggestions for political involvement or directions to marches, sure. The political is personal, after all; I realize that.</p>

<p>But the news is something else. It&rsquo;s not about the &ldquo;personal.&rdquo; The last thing we need, and the last thing we should want, is a &ldquo;personalized newspaper.&rdquo; We lose when we miss the news that we didn&rsquo;t realize we should know. We lose when we stop reading articles that aren&rsquo;t directly about us, and opinions that contradict ours. Serendipity is a blessing, not a waste; it enlarges our sense of what&rsquo;s possible (both good and bad), and our understanding of the true complexity of the world.</p>

<p>Keeping up with the news is important. Communicating with friends and family is important, too. But maybe it&rsquo;s time to separate the &ldquo;news&rdquo; from the newsfeed again &mdash; not because either of them is unnecessary or frivolous, but because they deserve different kinds of attention. Blended together, they now blur into a whole less meaningful than its parts.</p>
<hr class="wp-block-separator" />
<p><a href="https://www.linkedin.com/in/irina-raicu-b65a707"><em>Irina Raicu</em></a><em>&nbsp;is the director of the Internet Ethics program at the&nbsp;</em><a href="http://scu.edu/ethics/"><em>Markkula Center for Applied Ethics</em></a><em>, Santa Clara University. Follow the Internet Ethics program&nbsp;</em><a href="https://twitter.com/iethics"><em>@IEthics</em></a><em>. The opinions expressed in this essay are the author&rsquo;s own.</em></p>

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Irina Raicu</name>
			</author>
			
			<title type="html"><![CDATA[Young adults take more security measures for their online privacy than their elders]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2016/11/2/13390458/young-millennials-oversharing-security-digital-online-privacy" />
			<id>https://www.vox.com/2016/11/2/13390458/young-millennials-oversharing-security-digital-online-privacy</id>
			<updated>2016-11-02T13:30:39-04:00</updated>
			<published>2016-11-02T13:00:03-04:00</published>
			<category scheme="https://www.vox.com" term="Privacy &amp; Security" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[For some time now, conventional wisdom has claimed that young people don&#8217;t care about privacy. As it happens, that conventional wisdom had always been, at best, an oversimplification. By now, it is simply wrong. The kids are all right: A Pew Research study says that “young adults generally are more focused than their elders when [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="antoniodiaz / Shutterstock" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/7390627/passing%2520notes%2520antoniodiaz.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>For some time now, conventional wisdom has claimed that young people don&rsquo;t care about privacy. As it happens, that conventional wisdom had always been, at best, an oversimplification. By now, it is simply wrong.</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>The kids are all right: A Pew Research study says that “young adults generally are more focused than their elders when it comes to online privacy.”</p></blockquote></figure>
<p>To begin with, though, a clarifying question: What do you mean by &ldquo;young people&rdquo;? For example, three recently published studies on youth and privacy assess different age groups. In a recent survey called &ldquo;<a href="http://www.pewresearch.org/fact-tank/2016/09/21/the-state-of-privacy-in-america/">The State of Privacy in Post-Snowden America</a>,&rdquo; the Pew Research Center brackets 18-to-29-year-olds as &ldquo;young adults.&rdquo; Another recent study, titled &ldquo;<a href="https://staysafeonline.org/about-us/news/national-cyber-security-alliance-survey-reveals-the-complex-digital-lives-of-american-teens-and-parents">Keeping Up with Generation App</a>,&rdquo; by the National Cyber Security Alliance, surveyed 13-to-17-year-olds. And in a third, described in an article titled &ldquo;<a href="http://ijoc.org/index.php/ijoc/article/view/4655">&lsquo;What Can I Really Do?&rsquo; Explaining the Privacy Paradox with Online Apathy</a>&rdquo; (published in the International Journal of Communication), researchers Eszter Hargittai and Alice Marwick examine &ldquo;&lsquo;young adults&rsquo; understanding of Internet privacy issues&rdquo; &mdash; based on data gathered from focus groups with participants aged 19-35.</p>

<p>When people talk about young people and privacy, it&rsquo;s useful, first, to point out that the privacy-related attitudes, concerns and practices of 13-year-olds are, as you might imagine, quite different from those of 35-year-olds.</p>

<p>Still, what the three studies cited above all demonstrate is that people between 13 and 35 <em>do</em> care about keeping some control over their information, and take measures to protect their privacy online, even as they sense that most such measures are imperfect solutions.</p>

<p>It may surprise you to find out that 60 percent of the teens surveyed for the NCSA report &ldquo;say they have created accounts that their parents were unaware of, such as on a social media site or for an app.&rdquo; That is a privacy-protective measure: When it comes to privacy violations, the people teens are most worried about are their parents. As the report notes, &ldquo;teens greatly value having some level of privacy from their parents when using the internet.&rdquo;</p>

<p>The older &ldquo;young people&rdquo; surveyed by Hargittai and Marwick report that they deploy a wide variety of privacy-protective measures: &ldquo;Using different sites and apps for different purposes, configuring settings on social media sites, using pseudonyms in certain situations, switching between multiple accounts, turning on incognito options in their browsers, opting out of certain apps or sites, deleting cookies and even using Do-Not-Track browser plugins and password-management apps.&rdquo; How many &ldquo;older adults&rdquo; do such things?</p>
<figure class="wp-block-pullquote alignleft"><blockquote><p>“We’re not an oversharing generation. We’re a generation that’s over sharing — done, finished, kaput, through.”</p></blockquote></figure>
<p>The <a href="http://www.pewresearch.org/fact-tank/2016/09/21/the-state-of-privacy-in-america/">Pew Research study</a> notes that &ldquo;young adults generally are more focused than their elders when it comes to online privacy.&rdquo; That study asked about some privacy-protective strategies, as well: Among the 18-to-29-year-olds surveyed, 74 percent said they had cleared cookies and browser histories, 71 percent had deleted or edited something they had posted, 49 percent had configured their browsers to reject cookies, 42 percent had decided not to use certain sites that demanded their real names, and 41 percent had used temporary user names or email addresses. In each of those categories, the younger users surpassed their elders. The Pew report does also note that younger adults &ldquo;are more likely to have shared personal information online&rdquo; &mdash; but then, they grew up with the opportunity to do so, an opportunity their elders didn&rsquo;t have.</p>

<p>At Santa Clara University, my colleague Laura Robinson and I have also done several informal small-scale surveys trying to gauge students&rsquo; attitudes toward online privacy and their awareness of ways in which they might protect it. Recently, for example, we asked a group of 26 students, all of whom were between 18 and 24 years old, about managing privacy on social media. (Needless to say, they were surveyed anonymously.) Of those, 92 percent reported having monitored and adjusted their privacy settings, 70 percent had used different social media platforms to communicate with different groups, 42 percent had limited the number of friends/connections they had on each particular platform, 42 percent had also chosen not to use certain platforms at all for privacy reasons, and 31 percent had installed ad blockers that prevent online tracking.</p>

<p>More startlingly, perhaps, 92 percent reported that they limit the kinds of things they post online. This echoes one of the findings of the Hargittai/Marwick study: &ldquo;While virtually all of our participants had adopted different approaches to protecting privacy, the only widely agreed-upon technique was self-censoring, or leaving information off the Internet entirely.&rdquo; Hargittai and Marwick add that &ldquo;as users understand their lack of control over their information [online], they retreat in certain ways when it comes to sharing.&rdquo;</p>

<p>For most of us, attitudes about the sharing of personal information change as we get older. As Bianca Bosker notes in <a href="http://www.huffingtonpost.com/bianca-bosker/facebook-10-anniversary_b_4718871.html">&ldquo;The Oversharers are Over Sharing&rdquo;</a>:</p>
<blockquote class="wp-block-quote has-text-align-none is-layout-flow wp-block-quote-is-layout-flow">
<p>My generation, the first on Facebook, was supposed to grow into a noisy army of oversharers. Raised on a steady diet of social media TMI, we were expected to lump fuddy-duddy ideas about privacy and discretion in with bell-bottoms and shoulder pads. &#8230; At the rate we were going back then, and judging by the way adults rolled their eyes at us, we should be broadcasting from the bathroom by now.</p>

<p>But when I poke through 10 years of Facebook, I see something else altogether. We&rsquo;re not an oversharing generation. We&rsquo;re a generation that&rsquo;s over sharing &mdash; done, finished, kaput, through. &hellip; All the chatty candor and hyperactive disclosure of our early years on Facebook now look like just another kind of youthful indulgence.</p>
</blockquote>
<p>As it turns out, though, that &ldquo;youthful indulgence&rdquo; might have been a temporary luxury. It&rsquo;s not just getting older that makes us less chatty and less likely to disclose ourselves; better understanding of the current internet environment makes young people wary, too, even when they wish they could say more. In fact, young people might now be rolling their eyes at their elders.&nbsp;</p>
<hr class="wp-block-separator" />
<p><a href="https://www.linkedin.com/in/irina-raicu-b65a707"><em>Irina Raicu</em></a><em>&nbsp;is the director of the Internet Ethics program at the&nbsp;</em><a href="http://scu.edu/ethics/"><em>Markkula Center for Applied Ethics</em></a><em>, Santa Clara University. Follow the Internet Ethics program&nbsp;</em><a href="https://twitter.com/iethics"><em>@IEthics</em></a><em>.</em></p>

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Irina Raicu</name>
			</author>
			
			<title type="html"><![CDATA[Modern variations on the &#8216;Trolley Problem&#8217; meme]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2016/6/8/11871108/internet-ethics-trolley-problem-meme-drone-tesla-hyperloop" />
			<id>https://www.vox.com/2016/6/8/11871108/internet-ethics-trolley-problem-meme-drone-tesla-hyperloop</id>
			<updated>2016-06-08T08:00:06-04:00</updated>
			<published>2016-06-08T08:00:02-04:00</published>
			<category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[A recent article in the Huffington Post huffs about the &#8220;inexplicable&#8221; and &#8220;absurd&#8221; popularity of a trolley-problem-memes page created and run by two philosophy students (freshmen!) from Slovenia. The article&#8217;s author describes the trolley problem as &#8220;a simple ethical thought experiment, little known outside the corridors of academic philosophy&#8221; &#8212; which suggests that he hasn&#8217;t [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="pathdoc / Shutterstock" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/6603039/autonomous%2520car_pathdoc.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>A recent article in the Huffington Post huffs about <a href="http://www.huffingtonpost.com/linch-zhang/behind-the-absurd-popular_b_10247650.html">the &#8220;inexplicable&#8221; and &#8220;absurd&#8221; popularity of a trolley-problem-memes page</a> created and run by two philosophy students (freshmen!) from Slovenia. The article&rsquo;s author describes <a href="https://en.wikipedia.org/wiki/Trolley_problem">the trolley problem</a> as &#8220;a simple ethical thought experiment, little known outside the corridors of academic philosophy&#8221; &mdash; which suggests that he hasn&rsquo;t hung out much in certain corners of the internet. Indeed, the very popularity of the <a href="https://www.facebook.com/TrolleyProblemMemes/?fref=ts">&#8220;Trolley Problem Memes&#8221; page</a>, which currently has more than 45,000 followers on Facebook, might be evidence that his description is wrong.</p>
<p><q class="right">You are using the driverless mode on the freeway when you notice a wounded person on the road ahead of you. Do you take over, or let the car decide what to do?</q></p>
<p>A different article, in the Atlantic, presents <a href="http://www.theatlantic.com/technology/archive/2015/10/trolley-problem-history-psychology-morality-driverless-cars/409732/">another view of the trolley problem</a>, and notes its renewed relevance: &#8220;In the past 40 years it has occupied the attention of brilliant minds, from academic ethicists to moral psychologists to engineers. It has helped them try to answer profound questions. &#8230; But recently, trolley problems have found new life in a more realistic application: research on driverless cars.&#8221;</p>

<p>The article then paraphrases philosophy professor <a href="http://philosophy.calpoly.edu/faculty/patrick-lin">Patrick Lin</a>, whose work at Cal Poly focuses in part on the ethics of driverless cars. According to Lin, &#8220;On the one hand, [the trolley problem] is a great entry point and teaching tool for engineers with no background in ethics. On the other hand, its prevalence, whimsical tone, and iconic status can shield you from considering a wider range of dilemmas and ethical considerations.&#8221;</p>

<p>&#8220;Iconic status,&#8221; or &#8220;little known&#8221; simple thought experiment?</p>

<p>Either way, inspired by those articles, and at the risk of adding to the &#8220;whimsical tone&#8221; now surrounding this experiment, I decided to consider additional ways to take the trolley problem outside of academia and into our modern world. Below are seven variations on the trolley problem.</p>

<p><strong>The Drone Problem</strong>: You are flying a drone. It&rsquo;s in the path of a landing airplane carrying five people. However, the <a href="http://www.mayoclinic.org/medical-professionals/clinical-updates/trauma/medical-drones-poised-to-take-off">drone is carrying medicine</a> that would save the life of one person on a nearby island that can only be reached quickly by drone. Do you keep flying it?</p>

<p><strong>The Hyperloop Problem</strong>: Same as the original trolley problem, except that instead of trolleys there are <a href="https://en.wikipedia.org/wiki/Hyperloop#Human_factors_considerations">pods floating on layers of air at speeds of up to 769 miles/hour</a>, so you have to make a decision and flip a switch much, much faster than in the trolley problem.</p>

<p><strong>The Driverless Car Problem</strong>: See multiple articles by Patrick Lin and others, including one that discusses <a href="http://www.wired.com/2014/08/heres-a-terrible-idea-robot-cars-with-adjustable-ethics-settings/">&#8220;adjustable ethics settings.&#8221;</a></p>

<p><strong>The VR Trolley Problem</strong>: Same as the original trolley problem, except it is &#8220;happening&#8221; inside your <a href="http://www.nas.nasa.gov/Software/VWT/vr.html">virtual reality</a> set. No real humans would be run over by a real trolley regardless of the choice you make. However, your choices, especially if made repeatedly, will themselves constitute a moral self-training. Which kind of person would you train yourself to be?</p>

<p><strong>The </strong><a href="http://www.wired.com/2015/10/tesla-self-driving-over-air-update-live/"><strong>Tesla Driverless Mode</strong></a><strong> Problem</strong>: You are using the driverless mode on the freeway when you notice a wounded person on the road ahead of you. You think that the car will put on the brakes by itself. You know you can take over the driving, and that you might have time to swerve and avoid that person, but swerving might lead you to crash and injure yourself and/or several other people who are standing on the side of the road. Do you take over, or let the car decide what to do?</p>

<p><strong>The </strong><a href="http://www.latimes.com/business/la-fi-made-in-california-trolleymaker-20150527-story.html"><strong>Solar-Powered Trolley</strong></a><strong> Problem</strong>: Same as the original trolley problem, but now the trolleys are clean and energy-efficient. Does that change your decision?</p>

<p><strong>The &#8220;Smart City&#8221; Trolley Problem</strong>: Can you use sensors, software and predictive analytics to prevent all iterations of the trolley problem? If so, would that create a critical gap in ethics education?</p>

<p>Whimsical tone aside, the trolley problem does continue to be useful &mdash; and not just in academia.</p>
<hr class="wp-block-separator" />
<p><a href="https://www.linkedin.com/in/irina-raicu-b65a707"><em>Irina Raicu</em></a><em> is the director of the Internet Ethics program at the </em><a href="http://scu.edu/ethics/"><em>Markkula Center for Applied Ethics</em></a><em>, Santa Clara University. Follow the Internet Ethics program </em><a href="https://twitter.com/iethics"><em>@IEthics</em></a><em>.</em></p>

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Irina Raicu</name>
			</author>
			
			<title type="html"><![CDATA[On the Ethics of Online Shaming]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2016/2/15/11587868/on-the-ethics-of-online-shaming" />
			<id>https://www.vox.com/2016/2/15/11587868/on-the-ethics-of-online-shaming</id>
			<updated>2019-03-06T05:39:21-05:00</updated>
			<published>2016-02-15T13:00:18-05:00</published>
			<category scheme="https://www.vox.com" term="Business &amp; Finance" /><category scheme="https://www.vox.com" term="Media" /><category scheme="https://www.vox.com" term="Money" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[Recently, a film critic and self-described fan of Amy Schumer tweeted a selfie with the comedian, accompanied by a (possibly offensive) joke. Schumer tweeted back a reply that was characterized by at least one media outlet as &#8220;shaming&#8221; the fan. A different media outlet, however, described the fan&#8217;s own action as &#8220;shaming.&#8221; The dueling headlines [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Dfree/Shutterstock" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/15798913/amy-schumer_dfree.0.1502257703.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Recently, a film critic and self-described fan of Amy Schumer tweeted a selfie with the comedian, accompanied by a (possibly offensive) joke. Schumer tweeted back a reply that was characterized by at least one media outlet as &ldquo;shaming&rdquo; the fan. A different media outlet, however, described the fan&rsquo;s own action as &ldquo;shaming.&rdquo;</p>

<p>The dueling headlines read, &ldquo;<a href="http://www.aol.com/article/2016/01/19/amy-schumer-calls-out-twitter-troll-for-slut-shaming-joke/21299560/">Amy Schumer Calls Out Twitter Troll for Slut-Shaming Joke</a>&rdquo; and &ldquo;<a href="http://www.mediaite.com/online/amy-schumer-ridiculously-shames-kid-for-doing-an-amy-schumer-joke/">Amy Schumer Ridiculously Shames Kid for Doing an Amy Schumer Joke</a>.&rdquo; The latter article pointed out that the fan/troll/critic was 17 years old; the former noted that Schumer&rsquo;s tweet &ldquo;quickly made the rounds online, which prompted a ton of negative feedback for the film critic and praise for the comedian.&rdquo;</p>
<blockquote class="red right"><p>Is online shaming just a new version of something that humans have always done, or is it substantially different, now that the shaming is taking place via the Internet? And, either way, is it the right thing to do?</p></blockquote>
<p>Online shaming has been a <a href="http://www.cnn.com/2015/04/16/living/feat-public-shaming-ronson/">hot topic over the last year</a>. An entire book (&ldquo;<a href="http://www.jonronson.com/shame.html">So You&rsquo;ve Been Publicly Shamed</a>&rdquo;) has already been written about it, and many articles have struggled to define it and analyze the phenomenon. Is it just a new version of something that humans have always done, or is it substantially different, now that the shaming is taking place via the Internet? And, either way, is it the right thing to do? Does the answer to that depend on the <a href="https://www.washingtonpost.com/news/the-intersect/wp/2015/09/16/can-online-shaming-shut-down-the-internets-most-skin-crawly-creeps/">power differential between the &ldquo;shamer&rdquo; and the person or people being &ldquo;shamed</a>,&rdquo; as several commentators have suggested? Does it depend on what triggered the shaming? Does it depend on what exactly &ldquo;a ton of negative feedback&rdquo; really means?</p>

<p>Perhaps one reason we are struggling with this phenomenon is that it presents an <a href="http://plato.stanford.edu/entries/moral-dilemmas/">ethical dilemma</a>, in which <a href="https://www.scu.edu/ethics/ethics-resources/ethical-decision-making/justice-and-fairness/">justice arguments</a> are made on both sides. Some may see online shaming as a way of standing up for justice and equality, especially when powerful people say something mean or stupid and are socially shamed in response (&ldquo;<a href="https://www.washingtonpost.com/news/the-intersect/wp/2015/09/16/can-online-shaming-shut-down-the-internets-most-skin-crawly-creeps/">punching up at a category of smug, entitled, misogynist dudes</a>,&rdquo; as Caitlin Dewey put it in a Washington Post article describing a particular set of cases).</p>

<p>Others, however, may see it as inherently unethical because the shamer has no real control over the proportionality of the response: In many cases of online shaming, the effects seem to be disproportionate to the offense that set them off (when <a href="http://www.theguardian.com/technology/2015/feb/21/internet-shaming-lindsey-stone-jon-ronson">the shaming goes viral and then marks the person</a>, both online and off, potentially forever) &mdash; and proportionality is a key part of a just response.</p>
<blockquote class="red right"><p>We might want to be the kind of people who stand up for ourselves or people we agree with &mdash; but do we want to be the kind of people who shame others, meting out (potentially disproportionate) vigilante justice?</p></blockquote>
<p>Online shaming may also pose a conflict between justice concerns and <a href="https://www.scu.edu/ethics/ethics-resources/ethical-decision-making/ethics-and-virtue/">virtue ethics</a>. We might want to be the kind of people who stand up for ourselves or people we agree with &mdash; but do we want to be the kind of people who shame others, meting out (potentially disproportionate) vigilante justice? In fact, sometimes the people who start an online shaming &ldquo;wave&rdquo; <a href="http://www.indystar.com/story/life/food/2016/01/04/kilroys-goes-viral-after-response-angry-customer/78252026/">later regret their actions</a>. Regret, in this case, seems to be a recognition of the fact that their own actions didn&rsquo;t match up with their values.</p>

<p>And the <a href="https://www.scu.edu/ethics/ethics-resources/ethical-decision-making/the-common-good/">common good</a> is involved, as well. We&rsquo;ve decided, as a society, that public flogging and scarlet letters are not appropriate responses to wrongdoing. Why? Perhaps because we now believe that public shaming diminishes not just the person shamed, but all the people who are participating in it or observing it. As Jon Ronson, the author of &ldquo;So You&rsquo;ve Been Publicly Shamed,&rdquo; notes, &ldquo;<a href="http://www.cnn.com/2015/04/16/living/feat-public-shaming-ronson/">It&rsquo;s so corrosive to create that kind of society</a>.&rdquo;</p>

<p>Even more than older types of public shaming, online shaming lacks context. It occurs, usually, not within an ongoing relationship between the people involved &mdash; and is therefore more likely to be triggered, unfairly, by misunderstandings. It also usually doesn&rsquo;t offer those who are shamed the ability to either explain themselves or redeem themselves. And it is often hasty &mdash; a wave that grows out of individual drops of shame triggered by easy, spur-of-the-moment clicks of a button. The wave then crashes on someone&rsquo;s head; explanations, apologies or other efforts to respond never go viral in the same way as the shaming; and the attention of the denizens of a particular Internet platform (and related media coverage) moves on.</p>

<p>The private/public platforms on which so many of us communicate these days play a part in this. They allow us to easily reach out to people we don&rsquo;t know well (which means we may misread each other). At the same time, they often foster the impression that we&rsquo;re communicating with &ldquo;friends&rdquo; &mdash; and (if only by virtue of the fact that we&rsquo;re all sitting at our own computers, seemingly &ldquo;private&rdquo;) obscure the fact that those friends are themselves nodes in great networks, which can quickly and easily forward communications to other nodes.</p>

<p>What should we do, then? Do we just resign ourselves to others&rsquo; insults or threats online and do nothing, for fear of triggering a shaming wave? No &mdash; that would not be an ethical response, either.</p>
<blockquote class="red right"><p>The private/public platforms on which so many of us communicate these days play a part in this. They allow us to easily reach out to people we don&rsquo;t know well (which means we may misread each other). At the same time, they often foster the impression that we&rsquo;re communicating with &ldquo;friends.&rdquo;</p></blockquote>
<p>One thing we might do is learn to modulate our responses better. We can take a bit longer to respond. We can try to respond less publicly, at least at first: Give people a chance to clarify, explain, recant or apologize &mdash; privately. In addition, if we&rsquo;re not directly involved, we can also decide to let the wronged person respond, without feeling the need to jump in ourselves and magnify that response. And, even if we feel that public shaming is warranted sometimes, we could try to limit it to truly egregious cases (assessing whether the benefits would outweigh the &ldquo;corrosive&rdquo; effects on society).</p>

<p>Finally, <a href="http://www.cnn.com/2015/04/16/living/feat-public-shaming-ronson/">as others have noted</a>, we also need more forgiveness on the Internet. However much some people condemn the European efforts around the mistakenly dubbed &ldquo;right to be forgotten,&rdquo; the need for some kind of Internet forgetting is clear, too &mdash; and, as philosopher Luciano Floridi has noted, <a href="http://www.cnn.com/2015/04/16/living/feat-public-shaming-ronson/">it&rsquo;s very much related to forgiveness</a>.</p>

<p>In the story about Amy Schumer and the unfortunate joke, as soon as the self-described fan realized that Schumer had not taken it in stride, he apologized. Schumer accepted his apology. But we haven&rsquo;t yet figured out how to make forgiveness go viral.</p>
<hr class="wp-block-separator" />
<p><a href="https://www.linkedin.com/in/irina-raicu-b65a707"><em>Irina Raicu</em></a><em> is the director of the Internet Ethics program at the </em><a href="http://scu.edu/ethics/"><em>Markkula Center for Applied Ethics</em></a><em>, Santa Clara University. Follow the Internet Ethics program </em><a href="https://twitter.com/iethics"><em>@IEthics</em></a><em>.</em></p>

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Irina Raicu</name>
			</author>
			
			<title type="html"><![CDATA[Metaphors of Big Data]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2015/11/6/11620416/metaphors-of-big-data" />
			<id>https://www.vox.com/2015/11/6/11620416/metaphors-of-big-data</id>
			<updated>2019-03-06T05:37:32-05:00</updated>
			<published>2015-11-06T05:00:25-05:00</published>
			<category scheme="https://www.vox.com" term="Big Data" /><category scheme="https://www.vox.com" term="Privacy &amp; Security" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[What do bacon, oil, tsunamis, exhaust, deluges, nuclear waste and teenage sex have in common? They are all things to which &#8220;Big Data&#8221; has been likened. Many excellent essays have addressed Big Data metaphors. They include &#8220;Data Is the New &#8216;___&#8217;,&#8221; by Sara Watson; &#8220;Big Data Metaphors We Live By,&#8221; by Kailash Awati and Simon [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="CopyBlogger.com" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/15798407/big-data.0.1485548634.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>What do bacon, oil, tsunamis, exhaust, deluges, nuclear waste and teenage sex have in common? They are all things to which &ldquo;Big Data&rdquo; has been likened.</p>

<p>Many excellent essays have addressed Big Data metaphors. They include &ldquo;<a href="http://dismagazine.com/discussion/73298/sara-m-watson-metaphors-of-big-data/">Data Is the New &lsquo;___&rsquo;,</a>&rdquo; by Sara Watson; &ldquo;<a href="https://medium.com/@kailashawati/big-data-metaphors-we-live-by-98d3fa44ebf8">Big Data Metaphors We Live By</a>,&rdquo; by Kailash Awati and Simon Buckingham Shum; &ldquo;<a href="http://ijoc.org/index.php/ijoc/article/view/2169">Big Data, Big Questions: Metaphors of Big Data</a>,&rdquo; by Cornelius Puschmann and Jean Burgess; and &ldquo;<a href="https://simplysociology.wordpress.com/2013/10/29/swimming-or-drowning-in-the-data-ocean-thoughts-on-the-metaphors-of-big-data/">Swimming or Drowning in the Data Ocean? Thoughts on the Metaphors of Big Data</a>,&rdquo; by Deborah Lupton. Those articles, however, discuss the &ldquo;metaphors of Big Data&rdquo; as if they&rsquo;re all efforts to describe the same thing. But they are not.</p>

<p>The metaphors and similes cited above refer to at least three distinct things. The &ldquo;tsunami&rdquo; and &ldquo;deluge&rdquo; are attempts to illustrate the challenges of handling vast and ever-changing datasets. The teenage-sex simile is a comment on the hype surrounding the notion of Big Data: In 2013, Dan Ariely said that &ldquo;<a href="https://www.facebook.com/dan.ariely/posts/904383595868">Big data is like teenage sex</a>: Everyone talks about it, nobody really knows how to do it, everyone thinks everyone else is doing it, so everyone claims they are doing it &hellip;&rdquo; The better-known &ldquo;oil&rdquo; and &ldquo;bacon&rdquo; metaphors refer to the large datasets themselves, which are being collected by various entities these days, regarded as assets, and mined or chewed up for insights.</p>

<p>It might be more useful to treat those three groups as distinct, and to address separately the metaphors commonly applied to each of them. In particular, I want to focus on the metaphors that are squarely directed at big datasets, at collections of information &mdash; as opposed to Big Data-related processes or hype. Because even Big Data sets are not the same, and our metaphors should reflect that. We can&rsquo;t just discuss even this one subset of the Big Data phenomenon as if we all know what we&rsquo;re talking about. The <em>kinds</em> of data matter.</p>

<p>Take, for example, <a href="http://www.theguardian.com/technology/2008/jan/15/data.security">the metaphor of Big Data as nuclear waste</a>. This metaphor has been applied as a response, a corrective, to the much better known mantra of &ldquo;data is the new oil.&rdquo; The nuclear waste metaphor is, however, a reference to a particular kind of Big Data: Personal data about individual human beings. (Privacy professionals talk a lot about &ldquo;PII&rdquo;: Personally identifiable information &mdash; which is a broader concept. They/we have long discussions about what constitutes PII. This is not one of those discussions.)</p>

<p>There are many large data sets, such as data about atmospheric or oceanic conditions, or about production outputs in various companies, or energy consumption by particular vehicles, that would probably not be described, even by Big Data critics, as &ldquo;radioactive material.&rdquo; Let&rsquo;s separate those out. Let&rsquo;s clarify that there&rsquo;s a distinct problem when intimate personal data about individual human beings is what&rsquo;s being described as &ldquo;the new oil&rdquo; or &ldquo;the new bacon&rdquo; and treated like an ordinary asset.</p>

<p>Technology critic Evgeny Morozov has argued that the commodification of personal details is not a matter of property rights. In a New Republic article titled &ldquo;<a href="http://www.newrepublic.com/article/117703/selling-personal-data-big-techs-war-meaning-life">Selling Your Bulk Online Data Really Means Selling Your Autonomy</a>,&rdquo; he writes:</p>
<blockquote class="memo"><p>&ldquo;Our data constitutes our very humanity. To voluntarily treat it as an &lsquo;asset class&rsquo; is to agree to the fate of an interactive billboard. We shouldn&rsquo;t unquestionably accept the argument that personal data is just like any other commodity and that most of our digital problems would disappear if only, instead of gigantic data monopolists like Google and Facebook, we had an army of smaller data entrepreneurs. We don&rsquo;t let people practice their right to autonomy in order to surrender that very right by selling themselves into slavery. Why make an exception for those who want to sell a slice of their intellect and privacy rather than their bodies?&rdquo;</p></blockquote>
<p>Is that true for <em>any</em> personal data, though? Should we draw even finer distinctions? Strangers have long had access to some details about most of us &mdash; our names, phone numbers and even addresses have been fairly easy to find, even before the advent of the Internet. And marketers have long created, bought and sold lists that grouped customers based on various differentiating criteria. But marketers didn&rsquo;t use to have access to, say, our <a href="http://www.theatlantic.com/technology/archive/2015/11/google-searches-privacy-danger/413614/">search topics</a>, back when we were searching in libraries, not Googling. The post office didn&rsquo;t ask us to agree that it was allowed to open our letters and scan them for keywords that would then be sold to marketers that wanted to reach us with more accurately personalized offers. We would have balked. We should balk now.</p>

<p>Maybe some personal data can be sold without undermining our autonomy, and some can&rsquo;t. Access to a person&rsquo;s name and phone number is not the same as access to his or her Social Security number, or search topics, or communications with his or her coworkers, friends, family or lovers. The intimate details of our lives, and in particular our communications (including those on any social media that does not clearly describe itself as &ldquo;public&rdquo;) should be differentiated from &ldquo;the new oil&rdquo; or &ldquo;the new bacon.&rdquo; They should, indeed, be off the market.</p>

<p>At the same time, we should acknowledge that not all Big Data is radioactive. We need to separate our metaphors, and maybe come up with some new ones, too, in order to give clarity to the issues we now face in the new data economy.</p>
<hr class="wp-block-separator" />
<p><em>Irina Raicu is the Director of the Internet Ethics program at the </em><a href="http://scu.edu/ethics/"><em>Markkula Center for Applied Ethics</em></a><em>, Santa Clara University. Follow the Internet Ethics program on Twitter at </em><a href="https://twitter.com/iethics"><em>@IEthics</em></a><em>.</em></p>

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Irina Raicu</name>
			</author>
			
			<title type="html"><![CDATA[The Right to Be Forgotten, the Privilege to Be Remembered]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2015/2/20/11559248/the-right-to-be-forgotten-the-privilege-to-be-remembered" />
			<id>https://www.vox.com/2015/2/20/11559248/the-right-to-be-forgotten-the-privilege-to-be-remembered</id>
			<updated>2019-03-06T05:18:37-05:00</updated>
			<published>2015-02-20T11:15:12-05:00</published>
			<category scheme="https://www.vox.com" term="Big Data" /><category scheme="https://www.vox.com" term="Big Tech" /><category scheme="https://www.vox.com" term="Business &amp; Finance" /><category scheme="https://www.vox.com" term="Google" /><category scheme="https://www.vox.com" term="Media" /><category scheme="https://www.vox.com" term="Money" /><category scheme="https://www.vox.com" term="Privacy &amp; Security" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[My mom was born in February. She died in February, too. And this year, as in past years since her death, in February lame marketers and data brokers remember her. They send her letters which say, right on the envelope, &#8220;Happy Birthday!&#8221; Personalized, for her. I wish they would forget her. What does it mean [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Uriel Sinai/The New York Times" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/15793874/numbers.0.1502257703.png?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
</figure>
<p>My mom was born in February. She died in February, too. And this year, as in past years since her death, in February lame marketers and data brokers remember her. They send her letters which say, right on the envelope, &ldquo;Happy Birthday!&rdquo; Personalized, for her.</p>

<p>I wish they would forget her.</p>

<p>What does it mean to be &ldquo;forgotten&rdquo;? So many people are, whether they have a right to it or not. But being easily linked, forever, to certain search engine results &mdash; is that the opposite of being forgotten? Is that being &ldquo;remembered&rdquo;?</p>

<p>Last year, the Court of Justice of the European Union <a href="http://www.theguardian.com/technology/2014/may/14/explainer-right-to-be-forgotten-the-newest-cultural-shibboleth">issued a decision</a> that has been broadly mischaracterized as establishing, in Europe, a &ldquo;right to be forgotten.&rdquo; The ruling mandated that search engines delist certain results from certain searches at certain people&rsquo;s request.</p>

<p>Following that, Google <a href="https://www.google.com/advisorycouncil/">appointed an advisory council</a> (buttressed by input from public meetings held in several European capitals) to advise the company in regard to its implementation of the decision.</p>

<p>The council <a href="https://www.google.com/advisorycouncil/">released its report</a> earlier this month. One of the first things that it notes is that the ruling &ldquo;does not establish a general Right to be Forgotten. Implementation of the ruling does not have the effect of &lsquo;forgetting&rsquo; information about a data subject.&rdquo;</p>

<p>Instead, the decision is about what the report calls &ldquo;delisting&rdquo;: &ldquo;It requires Google to remove links returned in search results based on an individual&rsquo;s name when those results are &lsquo;inadequate, irrelevant or no longer relevant, or excessive.&rsquo;&rdquo; However, the report points out, &ldquo;Google is not required to remove those results if there is an overriding public interest in them &lsquo;for particular reasons, such as the role played by the data subject in public life.&rsquo;&rdquo;</p>

<p>(But if I were to write about the &ldquo;right to be delisted,&rdquo; would people know what I was referring to? How do we change the misnomer &mdash; which we perpetuate with each new piece about it?)</p>

<p>And how do we assess &ldquo;the role played by the subject in public life&rdquo;? I remember reading a New Yorker article about Stalin&rsquo;s daughter, Svetlana, who in 1967 &ldquo;<a href="http://www.newyorker.com/magazine/2014/03/31/my-friend-stalins-daughter">became the Cold War&rsquo;s most famous defector</a>.&rdquo; In the West, she went on to publish several books. But then the public lost interest in her.</p>

<p>&ldquo;By the time the Cold War ended,&rdquo; writes Nicholas Thompson in the New Yorker, &ldquo;Svetlana had almost completely disappeared from public view. In the next 20 years, the Times published only one story about her, a five-paragraph squib, in 1992, declaring that she &lsquo;is living in obscurity in a charity hostel.&rsquo;&rdquo;</p>

<p>For purposes of de-linking certain articles from searches on her name, would Google have deemed her, at that point, a &ldquo;public figure&rdquo;? And, if so, would it have deemed her so even if she had not defected or published books? Would she have been a public figure simply by being Stalin&rsquo;s daughter &mdash; and so not de-linkable from stories about her as such?</p>

<p>Some critics of the decision about the &ldquo;right to be delisted&rdquo; have argued that the ruling is vague, and pointed out difficulties of line-drawing &mdash; like the one above. But most laws can be attacked for vagueness, and most pose line-drawing problems, yet we still have laws.</p>

<p>Other critics of the ruling have expressed particular concern over the impact it might have on the preservation and dissemination of information about crimes against humanity. The Advisory Council&rsquo;s report stresses this concern: &ldquo;Where content relates to a historical figure or historical events, the public has a particularly strong interest in accessing it online via a name-based search, and it will weigh against delisting. The strongest instances include links to information regarding crimes against humanity.&rdquo;</p>

<p>For a number of years now, Yad Vashem, the Israeli museum dedicated to commemorating and studying the Holocaust, has been hard at work digitizing its documents, and more recently it has launched an effort to use Big Data analytics and face-recognition techniques to try to identify victims who have remained, after all these years, unidentified.</p>

<p>A recent article about those efforts, titled &ldquo;<a href="http://www.bloomberg.com/news/articles/2015-01-27/inside-the-massive-project-to-uncover-the-holocaust-s-nameless-victims#p1">Inside the Massive Project to Uncover the Holocaust&rsquo;s Nameless Victims</a>,&rdquo; quotes Gabriel Weimann, an Israeli professor, who notes that &ldquo;[p]osting the information online has many advantages, especially for the future generations.&rdquo; Of course, linking the information, to names, would also help future generations learn more.</p>

<p>But is it accurate to describe that as &ldquo;remembering&rdquo;? Is there a difference between remembering and archiving? And how do we balance the potential benefits to future generations with the potential harms to current ones? The same techniques that are being used to identify the nameless dead could be used, by bad actors, to hunt the living.</p>

<p>My mother survived World War II as a little Jewish girl in Eastern Europe. Many people like her were sent to concentration camps. In Israel, some of the relatives of camp survivors are tattooing on their own arms the numbers that had been tattooed, by force, on their elders&rsquo;. That is an effort at something other than &ldquo;remembering&rdquo;: An effort at transmitting a certain memory, passing it into the future. A different way of linking to historical information about crimes against humanity.</p>

<p>But some of the survivors themselves are greeting this practice with shock, anger, disbelief. And maybe fear, too?</p>

<p>Privacy scholars Woodrow Hartzog and Evan Selinger have written about <a href="http://www.theatlantic.com/technology/archive/2013/01/obscurity-a-better-way-to-think-about-your-data-than-privacy/267283/">information obscurity as &ldquo;a protective state</a> that can further a number of goals, such as autonomy, self-fulfillment, socialization, and relative freedom from the abuse of power.&rdquo; Can we find the balance that allows for both obscurity and preservation &mdash; even proclamation &mdash; of the past?</p>

<p>My mother lived long enough to Google herself.</p>

<p>Do we want to be identified, and remembered? That depends: By whom, and for what purpose? The EU decision on delisting gives people some level of control over the information that can be easily found about them. Forgetting, and remembering, are about a lot more.</p>
<hr class="wp-block-separator" />
<p><em>Irina Raicu is the Director of the Internet Ethics program at the </em><a href="http://scu.edu/ethics/"><em>Markkula Center for Applied Ethics</em></a><em>, Santa Clara University. Follow the Internet Ethics program on Twitter at </em><a href="https://twitter.com/iethics"><em>@IEthics</em></a><em>.</em></p>

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Irina Raicu</name>
			</author>
			
			<title type="html"><![CDATA[Metamorphosis]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2014/10/22/11632148/metamorphosis" />
			<id>https://www.vox.com/2014/10/22/11632148/metamorphosis</id>
			<updated>2019-03-06T06:22:54-05:00</updated>
			<published>2014-10-22T16:21:10-04:00</published>
			<category scheme="https://www.vox.com" term="Business &amp; Finance" /><category scheme="https://www.vox.com" term="Media" /><category scheme="https://www.vox.com" term="Money" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[When Sam Gregorsa woke up one morning from unsettling dreams, he reached for his e-reader. It was somewhere under his back, as hard as armored plate. He must have fallen asleep while reading, and then rolled on top of it. He pulled it out, but didn&#8217;t turn it on. Instead, he stretched his arm to [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="St. Martin&#039;s Press" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/15809976/kafka-cover.0.1502257703.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
</figure>
<p><span class="dropcap">W</span>hen Sam Gregorsa woke up one morning from unsettling dreams, he reached for his e-reader. It was somewhere under his back, as hard as armored plate. He must have fallen asleep while reading, and then rolled on top of it. He pulled it out, but didn&rsquo;t turn it on. Instead, he stretched his arm to the edge of the pillow, where his phone was buzzing. A text. &ldquo;Your personal library has been upgraded. Enjoy!&rdquo;</p>
<p>The text had come from Amaizin &mdash; the maker of his e-reader. Sam was intrigued. He used lots of Amaizin services and products, but the e-books were his favorites. Far from being tempted by the siren songs of YouTunes or the visual richness of UsTube, he preferred to sit quietly, alone, vicariously experiencing the sights and sounds and smells and tastes and touches of other lives, so different from his own &mdash; through books.</p>

<p>The old-fashioned, paper kind of books were long gone. After a certain point, nobody had wanted to carry them, buy shelves for them, turn their pages, or figure out what to do with them once they were read. You could fit hundreds, thousands of them on an e-reader. You could get rid of read ones with the swipe of a finger. Trees were spared. Ink was used elsewhere. Luggage got lighter. Presents for avid readers no longer had to be wrapped.</p>

<p>Sam was indeed an avid reader, and his friends often bought him books. Well, usually they bought him Amaizin cards, so he could get for himself whatever books he wanted. To be nice, he sometimes asked the gift-giver for recommendations, but really, most of the time, Sam already knew what he wanted to read next. Or, if he didn&rsquo;t, he was happy to rely on Amaizin&rsquo;s recommendation engine.</p>

<p>The recommendation engine was a thing of wonder, fed by many springs of information. <a href="http://online.wsj.com/articles/SB10001424052702304870304577490950051438304">It kept track of what Sam read, when he read it, and for how long at a time</a>. It kept track of books he didn&rsquo;t finish, or that he made his way through slowly. It noted others that he devoured in one sitting. It kept track of what he highlighted, and words that he looked up via the e-reader&rsquo;s dictionary. And it did that for all his friends, too, and all the other readers who read some of the same books that he did.</p>

<p>But Sam had never before gotten an announcement about his library being upgraded. And it was not his birthday. As he wondered what had triggered the upgrade, he opened the hard shell and turned on the e-reader.</p>
<p><span class="dropcap">A</span>t first, things didn&rsquo;t seem different. Then he noticed that certain books had disappeared. He remembered old stories, from back in 2009, about how certain books had disappeared from a different brand of e-reader: George Orwell&rsquo;s &ldquo;1984&rdquo; and &ldquo;Animal Farm,&rdquo; for example. <a href="http://www.nytimes.com/2009/07/18/technology/companies/18amazon.html?_r=2&amp;">The New York Times had written about one particular customer</a>: &ldquo;A 17-year-old from the Detroit area was reading &lsquo;1984&rsquo; on his Kindle for a summer assignment and lost all his notes and annotations when the file vanished. &lsquo;They didn&rsquo;t just take a book back, they stole my work,&rsquo; he said.&rdquo; Sam almost shuddered. He had lots of notes in his e-books, too.</p>
<p>And he remembered another old story, from back in 2012, when a publisher had replaced a particular word in a classic novel with another: As the headline in Ars Technica had noted at the time, &ldquo;<a href="http://arstechnica.com/information-technology/2012/06/nook-version-of-war-and-peace-turns-the-word-kindled-into-nookd/">Nook version of &lsquo;War and Peace&rsquo; turns the word &lsquo;kindled&rsquo; into &lsquo;Nookd.&rsquo;</a>&rdquo; Of course, the change was made to &ldquo;War and Peace&rdquo; in translation; still, Sam thought books should keep their words as originally published.</p>

<p>Weirdly, though, he also noticed books that had appeared in his collection overnight, but that he&rsquo;d never bought or borrowed or asked for. Some of them sounded interesting &mdash; for example, a memoir by a guy who called himself &ldquo;Bono&rsquo;s Doppelganger.&rdquo; Sam had heard of Bono. But that reminded him of another story he had read. Back in 2014, many Apple users had realized one day that a certain <a href="http://arstechnica.com/apple/2014/09/u2s-new-album-is-showing-up-on-your-iphone-whether-you-want-it-or-not/">U2 album had appeared in their iTunes libraries</a> &ldquo;all by itself.&rdquo; And many of them were not pleased.</p>

<p>This was weird, Sam thought. What if he lived in a country in which possession of certain books was illegal, and his e-reader was full of those books? What if his e-reader was now full of pornography? Or documents that contained trade secrets &mdash; or state secrets? His eyes scanned the list, hoping not to come across &ldquo;Mein Kampf.&rdquo; But there were so many books &hellip;</p>

<p>Marx seemed to have disappeared, but so had Poe and Nietzsche and Ursula K. Le Guin. He didn&rsquo;t recognize the titles and authors of many of the books, but he was too worried to either click on them or look them up online to see what they might be. Who knows what inferences might be drawn about him based on those clicks or those searches? What if people found out, and judged him, and started treating him like a giant cockroach?</p>

<p><a href="http://www.mcsweeneys.net/articles/a-brief-q-a-with-dave-eggers-about-his-new-novel-the-circle">&ldquo;The Circle,&rdquo; by Dave Eggers</a>, was also gone; he hadn&rsquo;t had time to read that yet.</p>

<p>He scanned the list of books one more time, and he was jarred by a discrepancy. A Ray Bradbury novel was still there. Its title, though, was now &ldquo;Fahrenheit 75.&rdquo;</p>
<hr class="wp-block-separator" />
<p><em>Irina Raicu is the director of the </em><a href="http://scu.edu/ethics/"><em>Internet Ethics program at the Markkula Center for Applied Ethics</em></a><em>, Santa Clara University. Follow the Internet Ethics program </em><a href="https://twitter.com/IEthics"><em>@IEthics</em></a>.</p>

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Irina Raicu</name>
			</author>
			
			<title type="html"><![CDATA[&#8220;For These Times&#8221;: Dickens on Big Data]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2014/5/1/11626330/for-these-times-dickens-on-big-data" />
			<id>https://www.vox.com/2014/5/1/11626330/for-these-times-dickens-on-big-data</id>
			<updated>2019-03-06T05:52:38-05:00</updated>
			<published>2014-05-01T09:15:49-04:00</published>
			<category scheme="https://www.vox.com" term="Business &amp; Finance" /><category scheme="https://www.vox.com" term="Media" /><category scheme="https://www.vox.com" term="Money" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[While some writers like to imagine what Plato would have said about the Googleplex and other aspects of current society, when it comes to life regulated and shaped by data and algorithms, Charles Dickens is the one to ask. His novel &#8220;Hard Times&#8221; is subtitled &#8220;For these times,&#8221; and his exploration of oversimplification through numbers [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="Mike Licht/Flickr" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/15802428/charles-dickens-blogging_mike-lichtflickr.0.1502257703.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
</figure>
<p>While some writers like to imagine <a href="http://www.bostonglobe.com/arts/books/2014/03/15/book-review-plato-googleplex-why-philosophy-won-away-rebecca-newberger-goldstein/JWeMh4oS5OcbqBT7vRzNIL/story.html">what Plato would have said about the Googleplex</a> and other aspects of current society, when it comes to life regulated and shaped by data and algorithms, Charles Dickens is the one to ask. His novel &ldquo;Hard Times&rdquo; is subtitled &ldquo;For these times,&rdquo; and his exploration of oversimplification through numbers certainly makes that subtitle apt again.</p>

<p>&ldquo;Hard Times&rdquo; is set in a fictional Victorian mill town in which schools and factories are purportedly run based on data and reason. In Dickens&rsquo;s days, the Utilitarians proposed <a href="https://www.scu.edu/ethics/practicing/decision/calculating.html">a new take on ethics</a>, and social policies drew on utilitarian views. &ldquo;Hard Times&rdquo; has often been called a critique of utilitarianism; however, its critique is not directed primarily at the goal of maximizing happiness and minimizing harm, but at the focus on facts/data/measurable things at the expense of everything else. Dickens was excoriating what today we would call algorithmic regulation and education.</p>

<p>Explaining the impact of &ldquo;algorithmic regulation,&rdquo; social critic <a href="http://www.evgenymorozov.com/">Evgeny Morozov</a> writes about</p>

<p><em>&hellip; the construction of </em><a href="http://www.technologyreview.com/featuredstory/520426/the-real-privacy-problem/"><em>&ldquo;invisible barbed wire&rdquo; around our intellectual and social lives</em></a><em>. Big data, with its many interconnected databases that feed on information and algorithms of dubious provenance, imposes severe constraints on how we mature politically and socially. The German philosopher J&uuml;rgen Habermas was right to warn &mdash; in 1963 &mdash; that &ldquo;an exclusively technical civilization &hellip; is threatened &hellip; by the splitting of human beings into two classes &mdash; the social engineers and the inmates of closed social institutions.&rdquo;</em></p>

<p>Dickens&rsquo;s &ldquo;Hard Times&rdquo; is concerned precisely with the social engineers and the inmates of closed social institutions. Its prototypical &ldquo;social engineer&rdquo; is Thomas Gradgrind &mdash; a key character who advocates data-based scientific education (and nothing else):</p>

<p><em>Thomas Gradgrind, sir. A man of realities. A man of fact and calculations. &hellip; With a rule and a pair of scales, and the multiplication table always in his pocket, sir, ready to weigh and measure any parcel of human nature, and tell you exactly what it comes to. It is a mere question of figures, a case of simple arithmetic.</em></p>

<p>Today he would carry a cellphone in his pocket instead of a multiplication table, but he is otherwise a modern man: A proponent of a certain way of looking at the world, through big data and utopian algorithms.</p>

<p>Had Dickens been writing today, would he have set his book in Silicon Valley? Writers like Dave Eggers do, in novels like &ldquo;<a href="http://www.theguardian.com/books/2013/oct/12/the-circle-dave-eggers-review">The Circle</a>,&rdquo; which explores more recent efforts at trying out theoretical social systems on vast populations.</p>

<p>Morozov frequently does, too, as in his article &ldquo;<a href="http://www.faz.net/aktuell/feuilleton/debatten/the-internet-ideology-why-we-are-allowed-to-hate-silicon-valley-12658406.html">The Internet Ideology: Why We Are Allowed to Hate Silicon Valley</a>,&rdquo; where he argues that the &ldquo;connection between the seeming openness of our technological infrastructures and the intensifying degree of control [by corporations, by governments, etc.] remains poorly understood.&rdquo;</p>

<p>Dickens was certainly concerned by the intensifying control he observed in the ethos of his age. &ldquo;You,&rdquo; says one of his educators to a young student, &ldquo;are to be in all things regulated and governed &hellip; by fact. We hope to have, before long, a board of fact, composed of commissioners of fact, who will force the people to be a people of fact, and of nothing but fact. You must discard the word Fancy altogether. You have nothing to do with it.&rdquo;</p>

<p>Fancy, wonder, imagination, creativity &mdash; all qualities that can&rsquo;t be accurately quantified &mdash; may indeed be downplayed (even if unintentionally) in a fully quantified educational system. In such a system, what happens to the questions that can&rsquo;t be answered once and for all?</p>

<p>As Dickens puts it, &ldquo;Herein lay the spring of the mechanical art and mystery of educating the reason without stooping to the cultivation of the sentiments and affections. Never wonder. By means of addition, subtraction, multiplication, and division, settle everything somehow, and never wonder.&rdquo;</p>

<p>And what about the world beyond the school? In &ldquo;Hard Times,&rdquo; the other people subjected to &ldquo;algorithmic regulation&rdquo; are the workers of Coketown. Describing this fictional Victorian mill town (after having visited a real one), Dickens writes:</p>

<p><em>Fact, fact, fact, everywhere in the material aspect of the town; fact, fact, fact, everywhere in the immaterial. The &hellip; school was all fact, and the school of design was all fact, and the relations between master and man were all fact, and everything was fact between the lying-in hospital and the cemetery, and what you couldn&rsquo;t state in figures, or show to be purchaseable in the cheapest market and saleable in the dearest, was not, and never should be, world without end, Amen.</em></p>

<p>The overarching point, for Dickens, is that many of the most important aspects of human life are not measurable with precision and not amenable to algorithmically designed policies. &ldquo;It is known,&rdquo; he writes in &ldquo;Hard Times&rdquo;:</p>

<p><em>&hellip; to the force of a single pound weight, what the engine will do; but not all the calculators of the National Debt can tell me the capacity for good or evil, for love or hatred, for patriotism or discontent, for the decomposition of virtue into vice, or the reverse, at any single moment in the soul of one of these its quiet servants&hellip;. There is no mystery in it; there is an unfathomable mystery in the meanest of them, for ever.</em></p>

<p>Does the exponentially greater power of our &ldquo;calculators&rdquo; challenge that perception? Does big data mean &ldquo;no mystery,&rdquo; even in human beings?</p>

<p>We are buffeted by claims that Google or Facebook or some other data-collection entities know us better than we know ourselves (or at least <a href="http://www.huffingtonpost.com/2014/02/23/ray-kurzweil_n_4842972.html">better than our spouses do</a>); we are implementing <a href="http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2403028">predictive policing</a>; we ponder <a href="http://www.wired.com/2012/01/algorithmic-education/">algorithmic approaches to education</a>.</p>

<p>As we do so, Dickens&rsquo;s characters, like Thomas Gradgrind&rsquo;s daughter Louisa, call out a warning from another time when we tried this approach. In a moment of crisis, Louisa (who has been raised on a steady diet of facts) tells Gradgrind, &ldquo;With a hunger and thirst upon me, father, which have never been for a moment appeased; with an ardent impulse toward some region where rules, and figures, and definitions were not quite absolute; I have grown up, battling every inch of my way.&rdquo;</p>

<p>In &ldquo;Hard Times,&rdquo; all the children who are shaped by the utilitarian fact-based approach, with no room for wonder and fancy, are stifled and stilted. Eventually, even Gradgrind realizes this. &ldquo;I only entreat you to believe &hellip; I have meant to do right,&rdquo; he tells his daughter. Dickens adds, &ldquo;He said it earnestly, and to do him justice he had. In gauging fathomless deeps with his little mean excise-rod, and in staggering over the universe with his rusty stiff-legged compasses, he had meant to do great things.&rdquo; But this is not a Disney ending; Gradgrind&rsquo;s remorse does not reverse the damage done to Louisa and others like her. And the lives of the algorithmically-governed people of Coketown are miserable.</p>

<p>In a <a href="http://blogs.scientificamerican.com/moral-universe/2014/03/25/quantiphobia-and-turning-morals-into-facts/">recent Scientific American blog post</a>, psychologist Adam Waytz uses the term &ldquo;quantiphobia&rdquo; in reference to the claim that creativity is unquantifiable. He describes himself as &ldquo;bugged&rdquo; by such claims, and wonders &ldquo;where such quantiphobia originates.&rdquo; He then writes that</p>

<p><em>&hellip; both neural and self-report evidence show that people tend to represent morals like preferences more than like facts. Getting back to the issue of quantiphobia, my sense is that when numbers are appended to issues with moral relevance, this moves them out of the realm of preference and into the realm of fact, and this transition unnerves us.</em></p>

<p>Is it irrational to be &ldquo;unnerved&rdquo; by this transition? What Waytz fails to address is the assumption that numbers or facts provide greater or more objective truth than unquantified &ldquo;preferences&rdquo; do. Of course, the process through which &ldquo;numbers are appended to issues&rdquo; is itself subjective &mdash; expressive of preferences. What we choose to measure, and how, is subjective. How we analyze the resulting numbers is subjective. The movement into the realm of fact is not equivalent to a movement into the realm of truth. The refusal to append numbers to certain things is not &ldquo;quantiphobia&rdquo; &mdash; it is wisdom.</p>

<p>This is not to dismiss the very real benefits that can be derived from big-data analytics and algorithmic functions in many contexts. We can garner those and still acknowledge that certain things may be both extremely important and unmeasurable, and that our policies and approaches should reflect that reality.</p>

<p>Dickens throws down a gauntlet for our times: &ldquo;Supposing we were to reserve our arithmetic for material objects, and to govern these awful unknown quantities [i.e., human beings] by other means!&rdquo;</p>

<p>Dear Reader, if you are a &ldquo;<a href="http://www.merriam-webster.com/dictionary/quant">quant</a>,&rdquo; <a href="http://www.gutenberg.org/ebooks/786">please read &ldquo;Hard Times.&rdquo;</a> And no, don&rsquo;t <a href="http://fivethirtyeight.com/features/parsing-is-such-sweet-sorrow/">count the lines</a>.</p>

<p><a href="http://www.scu.edu/ethics/about/people/directors/internet/raicu/"><em>Irina Raicu</em></a><em> is the director of the Internet Ethics program at the </em><a href="http://www.scu.edu/ethics-center/index.cfm"><em>Markkula Center for Applied Ethics</em></a><em>, Santa Clara University. Reach her </em><a href="https://twitter.com/IEthics"><em>@IEthics</em></a>.</p>

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
	</feed>
