<?xml version="1.0" encoding="UTF-8"?><feed
	xmlns="http://www.w3.org/2005/Atom"
	xmlns:thr="http://purl.org/syndication/thread/1.0"
	xml:lang="en-US"
	>
	<title type="text">Jules Polonetsky | Vox</title>
	<subtitle type="text">Our world has too much noise and too little context. Vox helps you understand what matters.</subtitle>

	<updated>2019-03-06T10:18:08+00:00</updated>

	<link rel="alternate" type="text/html" href="https://www.vox.com/author/jules-polonetsky" />
	<id>https://www.vox.com/authors/jules-polonetsky/rss</id>
	<link rel="self" type="application/atom+xml" href="https://www.vox.com/authors/jules-polonetsky/rss" />

	<icon>https://platform.vox.com/wp-content/uploads/sites/2/2024/08/vox_logo_rss_light_mode.png?w=150&amp;h=100&amp;crop=1</icon>
		<entry>
			
			<author>
				<name>Jules Polonetsky</name>
			</author>
			
			<author>
				<name>Dennis Hirsch</name>
			</author>
			
			<title type="html"><![CDATA[The emerging ethical standards for studying corporate data]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2016/6/14/11923286/facebook-emotional-contagion-controversy-data-research-review-policy-ethics" />
			<id>https://www.vox.com/2016/6/14/11923286/facebook-emotional-contagion-controversy-data-research-review-policy-ethics</id>
			<updated>2016-06-14T09:00:09-04:00</updated>
			<published>2016-06-14T09:00:03-04:00</published>
			<category scheme="https://www.vox.com" term="Facebook" /><category scheme="https://www.vox.com" term="Social Media" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[Microsoft scientists, in an article published this week in the Journal of Oncology Practice, demonstrated that by analyzing large samples of search engine queries, they may, in some cases, be able to identify internet users who are suffering from pancreatic cancer, even before they have received a diagnosis of the disease. This is an example [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="In January 2012, as part of an experiment, Facebook deliberately manipulated the News Feeds of nearly 700,000 of its users.  | Chris Jackson / Getty" data-portal-copyright="Chris Jackson / Getty" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/6641545/Facebook%2520eye_%2520Chris%2520Jackson.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
	In January 2012, as part of an experiment, Facebook deliberately manipulated the News Feeds of nearly 700,000 of its users.  | Chris Jackson / Getty	</figcaption>
</figure>
<p>Microsoft scientists, in an article published this week in the Journal of Oncology Practice, demonstrated that by analyzing large samples of search engine queries, they may, in some cases, be able to identify internet users who are suffering from <a href="http://health.nytimes.com/health/guides/disease/pancreatic-carcinoma/overview.html?inline=nyt-classifier&amp;version=meter+at+0&amp;module=meter-Links&amp;pgtype=article&amp;contentId=&amp;mediaId=&amp;referrer=&amp;priority=true&amp;action=click&amp;contentCollection=meter-links-click">pancreatic cancer</a>, even before they have received a diagnosis of the disease.</p>

<p>This is an example of extremely useful research performed by a private company using data that it collects about individuals. Such research faces an important challenge. The ethical review standards that govern academic research using human subject data frequently do not apply in the private context. Without such standards and safeguards, public confidence in research of this type will erode. In today&rsquo;s big-data world, we need effective ethical review processes in the private sector, as well as in the academic one. Some companies are starting to respond to this challenge.</p>
<p><q class="right">Today, Facebook&rsquo;s ethics and policy staff published an important paper that provides a detailed overview of the company&rsquo;s research review process.</q></p>
<p>Studying data about people is not new. It has been a central occupation of researchers in almost every field of scientific endeavor. Whether to seek the causes of disease, to improve the safety of transportation, to understand human behavior or simply to improve general scientific knowledge, researchers have designed experiments, sought the express consent of individuals when warranted and proceeded with their studies.</p>

<p>The <a href="https://en.wikipedia.org/wiki/Common_Rule">Common Rule</a>, a federal government requirement, sets standards for such research when supported by government funds, and makes these studies subject to approval by university Institutional Review Boards (IRBs), which provide ethical guidance, approving or halting the research.</p>

<p>Increasingly, the data sets sought by researchers do not come from limited experiments conducted on a group of volunteers recruited for a study. Large data sets are often available in the public domain, made public by users themselves on websites and services or by open data projects making government data accessible. Other large data sets sought by researchers include the wide range of consumer information held privately by companies of every size. Companies themselves regularly study this data, usually to improve their own services and test new features and often to publish general research valuable to the broader scientific community. The opportunities for breakthroughs can be unexpected and potentially lifesaving, as the recent Microsoft study demonstrates.</p>
<p><q class="left">If companies feel that working with academics to conduct research is too uncertain or risky, innovative work will continue, but it will remain confidential and protected within companies, and unavailable to a wide research audience.</q></p>
<p>None of these data sets are subject to the Common Rule and IRB oversight, either because they aren&rsquo;t linked to federal funding, or because the ethical guidelines of IRBs do not consider public data or data already collected for business purposes to be the types of &#8220;experiments&#8221; on humans that require review. In many cases, such as a private company testing which layout of a website is appealing to web shoppers, such oversight is certainly unnecessary. But other new research, whether conducted by corporate researchers at leading social media companies or by academic researchers analyzing data sets obtained from government sources or from the open web, has generated public debate.</p>

<p>The publicity around the <a href="http://www.theverge.com/2014/12/9/7360441/facebook-screwing-with-user-emotions-was-2014s-most-shared-scientific">Facebook &#8220;Emotional Contagion&#8221; study</a>, which sought to understand the effect of posts by social media users on their friends, helped bring the research ethics question to a broad audience. But many academic or corporate researchers had long been struggling to find the right frameworks for ethical review of the vast amount of research taking place today beyond the traditional academic context.</p>

<p>The questions about such research are numerous, and important. If a new type of research review process is needed, who should be subject to it? Startups don&rsquo;t have the resources to staff special review committees, and major corporations often have hundreds of tests of different kinds happening at any given time. Is only research intended for scientific publication subject to review, or should general product improvement also be reviewed? Who should staff these review committees, and how do they fit in with privacy and security reviews, which often look at related issues? What factors should be assessed to determine the ethics of a project? Are they universal? Or should they differ from culture to culture? What benefits, if any, are valuable enough to justify imposing risks on users?</p>

<p id="Cttn5F"><q class="right">If members of the public feel that they cannot trust the type of big-data research that the private sector is able to carry out today, this could pose serious obstacles for this type of research.</q></p>
<p>Companies are starting to provide some answers. Today, Facebook&rsquo;s ethics and policy staff <a href="https://fpf.org/2016/05/24/roundtable-ethics-privacy-research-reviews/">published an important paper</a> that provides a detailed overview of the company&rsquo;s research review process. Informed by consultations with a wide range of experts, the Facebook process details the specific steps taken by the company to review its internal research work, and is an important step forward for corporate research ethics.</p>

<p>More is happening, in academic and corporate circles, but it can&rsquo;t happen quickly enough. The Center for Democracy and Technology recently <a href="https://cdt.org/insight/cdt-fitbit-report-privacy-practices-rd-wearables-industry/">published</a> a report describing the internal research ethics process at Fitbit. And academics, advocates and researchers in every field are <a href="http://bigdata.fpf.org/">continuing to work</a> through the wide range of issues to be considered for such review processes to become widespread and meaningful.</p>

<p>The stakes are high. If companies feel that working with academics to conduct research is too uncertain or risky, innovative work will continue, but it will remain confidential and protected within companies, and unavailable to a wide research audience.</p>

<p>By the same token, if members of the public feel that they cannot trust the type of big-data research that the private sector is able to carry out today, this could pose serious obstacles for this type of research.</p>

<p>Protecting consumer data, while ensuring that it can be used safely and responsibly for scientific research that may yield the next breakthroughs in knowledge, is an ethical challenge we need to meet.</p>
<hr class="wp-block-separator" />
<p><a href="https://www.linkedin.com/in/julespolonetsky"><em>Jules Polonetsky</em></a><em> is the CEO of the </em><a href="http://www.futureofprivacy.org"><em>Future of Privacy Forum</em></a><em>, a think tank committed to advancing responsible data practices. Reach him </em><a href="https://twitter.com/JulesPolonetsky?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor"><em>@JulesPolonetsky</em></a>.</p>

<p><a href="http://moritzlaw.osu.edu/faculty/professor/dennis-hirsch/"><em>Dennis Hirsch</em></a><em> is the faculty director of the </em><a href="http://oaa.osu.edu/data-analytics.html"><em>Ohio State University Program on Data and Governance</em></a><em>, and is a professor of law at the Moritz College of Law. Reach him </em><a href="https://twitter.com/OSU_Law"><em>@OSU_Law</em></a><em>.</em></p>

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
			<entry>
			
			<author>
				<name>Jules Polonetsky</name>
			</author>
			
			<title type="html"><![CDATA[Effective Regulators, Effective Privacy Choices]]></title>
			<link rel="alternate" type="text/html" href="https://www.vox.com/2016/2/29/11588314/effective-regulators-effective-privacy-choices" />
			<id>https://www.vox.com/2016/2/29/11588314/effective-regulators-effective-privacy-choices</id>
			<updated>2019-03-06T05:18:08-05:00</updated>
			<published>2016-02-29T05:00:40-05:00</published>
			<category scheme="https://www.vox.com" term="Privacy &amp; Security" /><category scheme="https://www.vox.com" term="Technology" />
							<summary type="html"><![CDATA[Tracking individuals across devices as they stream video, browse on tablets and use apps and mobile websites on phones is now standard practice, according to ad industry publications. Enabled by ad networks, data brokers and companies that provide services across multiple screens, advertisers are focused on being able to attribute media content accurately and consistently [&#8230;]]]></summary>
			
							<content type="html">
											<![CDATA[

						
<figure>

<img alt="" data-caption="" data-portal-copyright="optimisecentre.com.au" data-has-syndication-rights="1" src="https://platform.vox.com/wp-content/uploads/sites/2/chorus/uploads/chorus_asset/file/15793732/wifi-loud.0.1488932380.jpg?quality=90&#038;strip=all&#038;crop=0,0,100,100" />
	<figcaption>
		</figcaption>
</figure>
<p>Tracking individuals across devices as they stream video, browse on tablets and use apps and mobile websites on phones is now standard practice, according to <a href="http://www.mediapost.com/publications/article/267400/why-cross-device-programmatic-targeting-is-primed.html">ad industry publications</a>. Enabled by ad networks, data brokers and companies that provide services across multiple screens, advertisers are focused on being able to attribute media content accurately and consistently to individuals, regardless of where content is viewed.</p>

<p>As audiences have dispersed across screens and devices, advertisers have worked with a wide range of companies seeking to link together the disparate mobile identities of customers and prospects. The number of devices and scale of this type of tracking is immense: Data firm <a href="https://www.iovation.com">Iovation</a> claims it has detailed data on three billion unique devices.</p>

<p>What choices do consumers have if they don&rsquo;t want to be tracked across devices? It depends &hellip;</p>
<blockquote class="red right"><p>What choices do consumers have if they don&rsquo;t want to be tracked across devices?</p></blockquote>
<p>Facebook, Microsoft, Apple and Google have policies and consumer choices for various kinds of tracking. ISPs offer similar policies and choices, but they are only able to track consumers on their own networks. The typical consumer today has one ISP at home, may use a different company for mobile data, relies on an office ISP service during work hours and turns to other Wi-Fi networks throughout the day. Some ad networks offer an opt-out that breaks the link between devices, while others agree only to stop targeting ads across devices while continuing to track.</p>

<p>The Federal Trade Commission has taken the lead in examining cross-device tracking with a recent workshop and has advocated for greater transparency and better consumer controls. The industry has started to respond by adding additional opt-out options to its self-regulation code, mandating that companies stop using data across devices for behavioral ads when users opt out, but the FTC continues to advocate for broader consumer controls for such tracking.</p>

<p>Industry practices can be difficult to examine, as many privacy policies do not spell out the use of tracking in great detail, but top experts on ad tech, tracking and privacy who have recently joined the FTC should enable the agency to keep pace with these quickly developing business practices. Already, senior FTC officials have made it clear that failure to disclose cross-device tracking could amount to a violation of Section 5 of the FTC Act. Privacy advocates have urged the FTC to do much more to press this issue.</p>

<p>Cross-device tracking is but one issue the FTC is grappling with as the online data ecosystem becomes more complex. Oracle now offers <a href="http://www.oracle.com/us/corporate/features/data-as-a-service/index.html">data as a service</a>, ensuring that every company has access to data on demand. <a href="http://placeiq.com">PlaceIQ</a> claims to track the location of 100 million devices. Mobile apps may have access to our contacts, text messages, photos, fitness data and more.</p>

<p>So, what role should the Federal Communications Commission play here?</p>

<p>In the upcoming months, as the FCC examines extending the privacy rules enabled by its recent broadband reclassification, it will need to decide the role it can play in setting rules for data use by ISPs. The FCC&rsquo;s ability to impact the consumer experience in the online ecosystem will be less significant than the FTC&rsquo;s, since its efforts will be limited to the tracking enabled by ISPs, which are only one of many technologies and business models involved.</p>

<p>The FTC has been highly effective by basing its enforcement on the standard of whether companies are acting deceptively or unfairly. The FCC should study how consumers are protected online by a long series of FTC enforcement actions and adopt a consistent model that allows the two agencies to collaborate effectively.</p>

<p>One important consideration for regulators is the rapid pace of change in technology and the uses of data. A decade ago, the leaders in the world of ad tracking and targeting were the companies that had access to the most data. Today, data has been democratized; it is available to any vendor with a credit card. <a href="http://www.oracle.com/us/corporate/acquisitions/bluekai/index.html">BlueKai</a>, the key data provider in Oracle&rsquo;s new data division, offers more than 80 comprehensive sources of data to its customers. Debates over which company holds more data miss the more central challenge ad companies are facing: the state management of user identity in a world of diverse devices.</p>

<p>Many companies are providing consumers with information and choices, and the industry has taken steps to develop guidelines for expanded cross-device tracking. Historically, the brawny cop on the beat here has been the FTC, which has brought a range of actions against leading online companies, ad networks, apps, children&rsquo;s sites and others, focused on the misuse of identifiers. A new cop on the beat has also recently started to have an impact on online practices: the rise of the ad blocker, driven by a range of aggressive ad practices, has sparked an industry drive for reforms.</p>

<p>When it comes to the complicated ad tech ecosystem, what consumers need are regulators and technology solutions that work together to ensure easy-to-use, effective choices and logical cross-industry standards.</p>
<hr class="wp-block-separator" />
<p><a href="https://www.linkedin.com/in/julespolonetsky"><em>Jules Polonetsky</em></a><em> is executive director and co-chair of the </em><a href="http://www.futureofprivacy.org"><em>Future of Privacy Forum</em></a><em>, a think tank committed to advancing responsible data practices. Reach him </em><a href="https://twitter.com/JulesPolonetsky?ref_src=twsrc%5Egoogle%7Ctwcamp%5Eserp%7Ctwgr%5Eauthor"><em>@JulesPolonetsky</em></a>.</p>

<p><small><em>This article originally appeared on Recode.net.</em></small></p>
						]]>
									</content>
			
					</entry>
	</feed>
