<?xml version="1.0" encoding="UTF-8"?><feed
	xmlns="http://www.w3.org/2005/Atom"
	xmlns:thr="http://purl.org/syndication/thread/1.0"
	xml:lang="en-US"
	>
	<title type="text">Jules Polonetsky Omer Tene | Vox</title>
	<subtitle type="text">Our world has too much noise and too little context. Vox helps you understand what matters.</subtitle>

	<updated>2019-03-06T11:17:36+00:00</updated>

	<link rel="alternate" type="text/html" href="https://www.vox.com/author/jules-polonetsky-omer-tene" />
	<id>https://www.vox.com/authors/jules-polonetsky-omer-tene/rss</id>
	<link rel="self" type="application/atom+xml" href="https://www.vox.com/authors/jules-polonetsky-omer-tene/rss" />

	<icon>https://platform.vox.com/wp-content/uploads/sites/2/2024/08/vox_logo_rss_light_mode.png?w=150&amp;h=100&amp;crop=1</icon>
		<entry>
			
			<author>
				<name>Jules Polonetsky Omer Tene</name>
			</author>
			
			<title type="html"><![CDATA[The Facebook Experiment: Gambling? In This Casino?]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2014/7/2/11628536/the-facebook-experiment-is-there-gambling-in-this-casino" />
			<id>https://www.vox.com/2014/7/2/11628536/the-facebook-experiment-is-there-gambling-in-this-casino</id>
			<updated>2019-03-06T06:17:36-05:00</updated>
			<published>2014-07-02T12:55:23-04:00</published>
			<category scheme="https://www.vox.com" term="Big Tech" /><category scheme="https://www.vox.com" term="Business &amp; Finance" /><category scheme="https://www.vox.com" term="Facebook" /><category scheme="https://www.vox.com" term="Google" /><category scheme="https://www.vox.com" term="Media" /><category scheme="https://www.vox.com" term="Money" /><category scheme="https://www.vox.com" term="Privacy &amp; Security" /><category scheme="https://www.vox.com" term="Social Media" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[Critics have spent the last few days castigating Facebook for a large-scale experiment conducted by researchers who wanted to learn the effects of tweaking the dosage of positive or negative comments on a user&#8217;s News Feed. Would people who are exposed to more negative comments than the average delivered to them by the Facebook algorithm [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>
	<img alt="" data-caption="" data-portal-copyright="" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/15808675/claude-rains1.0.1537219264.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
</figure>
<p>Critics have spent the last few days <a href="http://www.nytimes.com/2014/06/30/technology/facebook-tinkers-with-users-emotions-in-news-feed-experiment-stirring-outcry.html?_r=1">castigating Facebook</a> for a large-scale experiment conducted by researchers who wanted to learn the effects of tweaking the dosage of positive or negative comments on a user&rsquo;s News Feed. Would people who are exposed to more negative comments than the average delivered to them by the Facebook algorithm be more or less prone to positivity themselves?</p>

<p>Many <a href="https://medium.com/message/what-does-the-facebook-experiment-teach-us-c858c08e287f">scorned Facebook&rsquo;s actions</a> as an unethical experiment on human subjects conducted without their knowledge or informed consent. Kashmir Hill <a href="http://www.forbes.com/sites/kashmirhill/2014/06/29/facebook-doesnt-understand-the-fuss-about-its-emotion-manipulation-study/">lamented</a> what she called &ldquo;a new level of experimentation, turning Facebook from a fishbowl into a petri dish.&rdquo; Arthur Caplan <a href="http://www.nbcnews.com/health/mental-health/opinion-facebook-experiment-used-silicon-valley-trickery-n144386">wrote</a> that the experiment &ldquo;should send a shiver down the spine of any Facebook user or anyone thinking about becoming one,&rdquo; and that it should never have been performed.</p>

<p>Others were more sanguine, <a href="http://www.talyarkoni.org/blog/2014/06/28/in-defense-of-facebook/">pointing out</a> that in considering the use of algorithms to tailor content &mdash; on Facebook and elsewhere &mdash; one was reminded of Captain Renault&rsquo;s protest as he <a href="https://www.youtube.com/watch?v=SjbPi00k_ME">walked into a casino</a> in &ldquo;Casablanca&rdquo;: &ldquo;I&rsquo;m shocked, <em>shocked</em> to find that gambling is going on in here!&rdquo; They claimed that, far from being an exception to conventional business practice, manipulation of user experience on a digital platform is the market norm. On the Web, on mobile and increasingly in our homes and on wearable devices, data is analyzed to increase user engagement, satisfaction, traction, or shopping appetite.</p>

<p>Indeed, Facebook itself has engaged in experimentation with much more ambitious aspirations than merely gauging user sentiment. Last year, working with researchers from Johns Hopkins University, <a href="http://www.hopkinsmedicine.org/news/media/releases/the_facebook_effect_social_media_dramatically_boosts_organ_donor_registration">Facebook adjusted its profile settings</a> so users could announce their status as an organ donor, or sign up if they weren&rsquo;t already registered. Over a single day, the new feature prompted more than 13,000 individuals to sign up as organ donors &mdash; more than 21 times the daily average. Most observers would agree that increasing organ-donation rates is a laudable goal, but clearly, some kinds of social influence must be considered off-limits or subject to special disclosures.</p>

<p>Big-data analysis is already used in multiple contexts: to personalize the delivery of education in K-12 schools, reduce the time commuters spend on the road, contain greenhouse-gas emissions, detect harmful drug interactions, encourage weight loss, and much more. Such data uses promise tremendous societal benefits, but at the same time create new risks of surveillance, discrimination, and opaque algorithmic decision-making. In this environment, who is best placed to distinguish right from wrong, and to warn before corporate practices cross <a href="http://recode.net/2014/04/18/introducing-a-theory-of-creepy/">the &ldquo;creepy&rdquo; line</a>?</p>

<p>Increasingly, corporate officers find themselves struggling to decipher subtle social norms and make ethical choices that are more befitting of philosophers than business managers or lawyers. Perhaps the most powerful example is the <a href="http://www.nytimes.com/2014/05/14/technology/google-should-erase-web-links-to-some-personal-data-europes-highest-court-says.html?_r=0">European court&rsquo;s decision</a> to appoint Google an arbitrator of thousands of individual contests between privacy rights and freedom of speech. Google reacted by <a href="http://www.usnews.com/news/business/articles/2014/05/30/google-taking-requests-to-censor-results-in-europe">setting up a panel of experts</a> comprising senior officials as well as five external experts, including an Oxford philosopher, a civil-rights activist and a United Nations representative. It will have to deal with a steady barrage of requests from individuals who want to wipe their data record clean.</p>

<p>Google&rsquo;s model will soon have to be replicated by companies tackling a broad swath of policy dilemmas. Should a fitness app &ldquo;manipulate&rdquo; users to coax them to eat less and exercise more? Is an airline overstepping the bounds of social etiquette by Googling passengers&rsquo; names to personalize their experience? Should an app developer offer a student a level-two math app after she completes level one?</p>

<p>These decisions echo the mandates of institutional review boards (IRBs), which operate in research institutions under <a href="http://www.thefacultylounge.org/2014/06/how-an-irb-could-have-legitimately-approved-the-facebook-experimentand-why-that-may-be-a-good-thing.html">formulaic rules</a> and follow strict protocols. It may be a challenge to deploy traditional IRBs in the corporate domain, which is constrained by concerns about confidentiality, patents and trade-secrecy law. But it would be unfortunate if the lesson that industry takes from this episode is to keep algorithmic decisions confidential, or to wall off corporate data coffers from the academic-research community.</p>

<p>Going forward, companies will need to create <a href="http://www.stanfordlawreview.org/online/privacy-and-big-data/consumer-subject-review-boards">new processes</a>, deploying a <a href="http://scholarlycommons.law.northwestern.edu/cgi/viewcontent.cgi?article=1191&amp;context=njtip">toolbox of innovative solutions</a> to engender trust and mitigate normative friction. Fortunately, many companies have already laid the groundwork for such delicate decision-making by appointing chief privacy officers. Others have budding internal ethical review programs.</p>

<p>But big-data analysis <a href="http://www.stanfordlawreview.org/online/privacy-and-big-data/privacy-and-big-data">raises issues</a> that transcend privacy and implicate broader policy concerns around discrimination, filter bubbles, access to data, and the ethics of scientific research. Accordingly, it requires active engagement by both internal and external stakeholders to increase transparency, accountability and trust.</p>

<p>As the companies that serve us play an increasingly intimate role in our lives, understanding how they shape their services to influence users has become a vexing policy issue. Data can be used for control and discrimination, or to support fairness and freedom. Establishing a process for ethical decision-making is key to ensuring that the benefits of data use outweigh its costs.</p>

<p><a href="http://www.futureofprivacy.org/about/about-jules-polonetsky/"><em>Jules Polonetsky</em></a><em> serves as executive director and co-chair of the </em><a href="http://www.futureofprivacy.org/"><em>Future of Privacy Forum</em></a><em>, a Washington, D.C.-based think tank that seeks to advance responsible data practices. Founded five years ago, FPF is supported by more than 80 leading companies, as well as an advisory board comprised of the country&rsquo;s leading academics and advocates. FPF&rsquo;s current projects focus on online data use, smart grid, mobile data, big data, apps and social media. Reach him </em><a href="https://twitter.com/JulesPolonetsky"><em>@JulesPolonetsky</em></a><em>.</em></p>

<p><a href="http://omertene.com/">Omer Tene</a> is vice president of research and education at the <a href="https://www.privacyassociation.org/">International Association of Privacy Professionals</a> (IAPP), where he administers the Westin Fellowship program and fosters ties between industry and academia. Before joining IAPP, he was vice dean of the College of Management School of Law, Rishon Le Zion, Israel. Tene is an affiliate scholar at the Stanford Center for Internet and Society, and a senior fellow at the Future of Privacy Forum. Reach him <a href="https://twitter.com/omertene">@omertene</a>.</p>

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Jules Polonetsky Omer Tene</name>
			</author>
			
			<title type="html"><![CDATA[The Right Response to the &#8220;Right to Delete&#8221;]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2014/5/21/11627140/the-right-response-to-the-right-to-delete" />
			<id>https://www.vox.com/2014/5/21/11627140/the-right-response-to-the-right-to-delete</id>
			<updated>2019-03-06T05:53:50-05:00</updated>
			<published>2014-05-21T14:23:51-04:00</published>
			<category scheme="https://www.vox.com" term="Big Tech" /><category scheme="https://www.vox.com" term="Business &amp; Finance" /><category scheme="https://www.vox.com" term="Facebook" /><category scheme="https://www.vox.com" term="Google" /><category scheme="https://www.vox.com" term="Media" /><category scheme="https://www.vox.com" term="Money" /><category scheme="https://www.vox.com" term="Snapchat" /><category scheme="https://www.vox.com" term="Social Media" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[Last week&#8217;s decision by the European Court of Justice, requiring Google to delete search results that display a Spanish user in a bad light, continues to cause consternation among online experts and supporters of free speech. The decision demonstrates that even the highest of courts can lack basic technological dexterity. At the same time, it [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>
	<img alt="" data-caption="" data-portal-copyright="Lonni/Shutterstock" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/15802742/erase-chalkboard_lonnishutterstock.0.1537276546.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
</figure>
<p>Last week&rsquo;s <a href="http://www.nytimes.com/2014/05/14/technology/google-should-erase-web-links-to-some-personal-data-europes-highest-court-says.html">decision</a> by the European Court of Justice, requiring Google to delete search results that display a Spanish user in a bad light, continues to <a href="http://www.nytimes.com/2014/05/15/opinion/dont-force-google-to-forget.html">cause consternation</a> among online experts and supporters of free speech.</p>

<p>The decision demonstrates that even the highest of courts can lack basic technological dexterity. At the same time, <a href="http://www.nytimes.com/2014/05/21/opinion/dowd-remember-to-forget.html?hpw&amp;rref=opinion">it reflects a genuine concern</a>, which casts a long shadow over the digital revolution. It is a reaction to the uncompromising <a href="http://www.nytimes.com/2010/07/25/magazine/25privacy-t2.html">persistence of data</a>, of which search results about an individual are just one prominent manifestation.</p>

<p>Unwittingly, the European Court appointed Google a global online censor, imposing on it the unenviable burden of policing content on the Web. In doing so, it furnished Google (and similar online intermediaries) with strikingly vague criteria and little process, to boot. And if understaffed privacy regulators intend to handle complaints case by case, they will soon be swamped by an unmanageable deluge of individual <a href="https://www.google.com/transparencyreport/removals/copyright/requests/">take-down requests</a>.</p>

<p>But condemning the Court&rsquo;s decision should not invalidate the concerns it sought to address.</p>

<p>To be sure, free access in milliseconds to all of the world&rsquo;s information from a laptop, smartphone, or (soon) wearable device has brought society <a href="http://www.stanfordlawreview.org/online/privacy-and-big-data/privacy-and-big-data">untold value</a>. It revolutionized access to knowledge, invigorated democratic forces, boosted productivity and spawned innovation on a scale unseen since the dawn of civilization.</p>

<p>But it also created a world where our every misdeed is recorded, forever curated and instantly available. Never mind crimes and misdemeanors &mdash; today&rsquo;s pipeline spews a digital exhaust comprising emails, posts, tweets, photos, location tags, news reports, court decisions, public ledgers, you name it, painting a likeness of ourselves that we may not wish, or deserve, to have chiseled in perpetuity. Such data may, and in fact does, <a href="http://www.pbs.org/mediashift/2008/10/teacher-fired-for-inappropriate-behavior-on-myspace-page289/">come back to haunt us</a> years later, in different contexts and with entirely different audiences.</p>

<p>We are being forced to assume that anything digital will last forever and may find its way to the public domain. But such technological determinism is not an inescapable force of nature. It is a man-made construct that can adapt to accommodate generally acceptable social norms. As Jaron Lanier argued in his book &ldquo;<a href="http://www.amazon.com/You-Are-Not-Gadget-Manifesto/dp/0307389979">You Are Not a Gadget</a>,&rdquo; technology should be designed to serve humans and reflect their values, not the other way around.</p>

<p>Not every comment that we make near the water cooler is recorded for the ages. We constantly write stuff down on notes that end up in the trash.</p>

<p>In the offline world, there is a degree of impermanence to many of our activities; daily conversations and Post-its are not expected to last forever. And we share an understanding of how fleeting any type of action should be: Our expectations of the publicity and permanence of a book, for example, differ greatly from those connected to a scrap of paper left on a desk. Sometimes, our expectations are broken when social norms are violated. Someone kisses and tells, or writes down a secret and betrays a confidence.</p>

<p>But in general, much of our information is subject to fairly clear norms that guide us in who can access what, and for how long. Why can&rsquo;t technology do more to ensure that certain types of recorded data decay or become less accessible with time? Much more than law, which by definition applies uniformly, technology can account for subtle differences in individuals&rsquo; subjective privacy expectations, which fluctuate based on the context and nuance of interpersonal relationships.</p>

<p>Of course, technology can already do this. The newspaper in the Google case could have used the &ldquo;<a href="http://en.wikipedia.org/wiki/Robots_exclusion_standard">robots.txt</a>&rdquo; standard, which signals to search engines not to crawl (spider) its content. Companies adopt email deletion policies to ensure that old records are not available after a set date.</p>
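<p>For illustration only, here is a minimal robots.txt sketch; the site and paths are hypothetical. Served from the site root, it asks all compliant crawlers to skip one archive directory while leaving the rest of the site crawlable:</p>

<pre>
# robots.txt, served from the site root, e.g. https://example.com/robots.txt
User-agent: *             # applies to every compliant crawler
Disallow: /archive/1998/  # please do not fetch pages under this path
</pre>

<p>Note that robots.txt is a request, not an access control: well-behaved crawlers honor it, but it does not delete copies that were already indexed, which is part of why complaints end up at the search engine&rsquo;s door.</p>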

<p>Ironically, the phenomenon of the social Web, where we set the audience for our content and have the ability to delete posts or change access settings, has started to set the direction. Facebook or Google+ posts are visible only to your friends or followers, not to the general public. And the meteoric rise of <a href="http://www.usatoday.com/story/tech/personal/2014/03/25/anonymity-apps-rise/6863433/">discreet social interaction apps</a> like Snapchat, Frankly, Secret and Whisper demonstrates the growing demand for ephemeral communications and non-public expression.</p>

<p>So, let&rsquo;s have many more companies experiment with default settings that allow for data decay. While these solutions are imperfect, they chart a promising path toward a world where some friction allows us to retain and hide a bit of ourselves.</p>
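<p>What would a data-decay default look like in practice? As a thought experiment (a hypothetical sketch, not any company&rsquo;s actual implementation), a service could stamp every post with an expiry date and simply stop surfacing posts once they lapse:</p>

<pre>
from datetime import datetime, timedelta, timezone

# Hypothetical default: posts fade from view after a year.
DEFAULT_TTL = timedelta(days=365)

class Post:
    def __init__(self, text, ttl=DEFAULT_TTL):
        self.text = text
        # Decay is the default; keeping a post forever would be the deliberate opt-out.
        self.expires_at = datetime.now(timezone.utc) + ttl

def visible(posts):
    """Return only the posts that have not yet decayed."""
    now = datetime.now(timezone.utc)
    return [p for p in posts if p.expires_at > now]
</pre>

<p>The substance of the sketch is the default setting: impermanence becomes the baseline, and permanence the deliberate choice.</p>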

<p>More than 120 years ago, in their seminal article, &ldquo;<a href="http://groups.csail.mit.edu/mac/classes/6.805/articles/privacy/Privacy_brand_warr2.html">The Right to Privacy</a>,&rdquo; Samuel Warren and Louis Brandeis warned that &ldquo;numerous mechanical devices threaten to make good the prediction that &lsquo;what is whispered in the closet shall be proclaimed from the house-tops.&rsquo;&rdquo;</p>

<p>Similarly, we need to make sure that new techno-social constructs of the digital age don&rsquo;t end up tearing our social fabric. The best way to do so is by furnishing novel technical options to those whispering and those shouting, rather than by mandating legal constructs that disrupt our cherished norms of free speech.</p>

<p><a href="http://www.futureofprivacy.org/about/about-jules-polonetsky/"><em>Jules Polonetsky</em></a><em> serves as executive director and co-chair of the </em><a href="http://www.futureofprivacy.org/"><em>Future of Privacy Forum</em></a><em>, a Washington, D.C.-based think tank that seeks to advance responsible data practices. Founded five years ago, FPF is supported by more than 80 leading companies, as well as an advisory board comprised of the country&rsquo;s leading academics and advocates. FPF&rsquo;s current projects focus on online data use, smart grid, mobile data, big data, apps and social media. Reach him </em><a href="https://twitter.com/JulesPolonetsky"><em>@JulesPolonetsky</em></a><em>.</em></p>

<p><a href="http://omertene.com/">Omer Tene</a> is vice president of research and education at the <a href="https://www.privacyassociation.org/">International Association of Privacy Professionals</a> (IAPP), where he administers the Westin Fellowship program and fosters ties between industry and academia. Before joining IAPP, he was vice dean of the College of Management School of Law, Rishon Le Zion, Israel. Tene is an affiliate scholar at the Stanford Center for Internet and Society, and a senior fellow at the Future of Privacy Forum. Reach him <a href="https://twitter.com/omertene">@omertene</a>.</p>

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
	</feed>
