
Data Discrimination

Bias & new forms of algorithmic discrimination

Can companies determine the risk profile of an individual from their web history, for instance their recent searches? Wonga, a British startup, developed a solution for creditors that uses technology to determine whether an individual is unemployed. Kreditech, a German startup, competes with Wonga’s web-history approach by leveraging a repertoire of 20,000 data points to classify risk, ranging from location data to Facebook social-graph signals such as likes, posts and friends. A loan applicant whose network includes people who are unemployed or have missed bill payments can see his or her probability of receiving a loan diminish. We can likewise be classified by our search history; simply knowing someone who is unemployed can hurt our chances of receiving credit. The company ZestFinance uses criteria such as the speed at which users accept its site’s user agreements, or whether an applicant types exclusively in uppercase or lowercase letters.
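
To make the mechanism concrete, here is a minimal sketch of how such a behavioural risk score might be computed. The features, weights and threshold are invented for illustration; this is not the actual model of Wonga, Kreditech or ZestFinance.

```python
import math

def risk_score(share_unemployed_contacts: float,
               seconds_reading_terms: float,
               types_all_caps: bool) -> float:
    """Toy logistic risk score returning a default probability in [0, 1].

    Every feature and weight here is hypothetical, chosen only to show
    how behavioural signals can be folded into a single number.
    """
    z = (2.0 * share_unemployed_contacts            # social-graph signal
         - 0.02 * min(seconds_reading_terms, 120)   # racing through the terms raises the score
         + 0.5 * (1.0 if types_all_caps else 0.0))  # typing style as a proxy signal
    return 1.0 / (1.0 + math.exp(-z))               # logistic squash to a probability

# An applicant with 30% unemployed contacts who accepts the terms in 5 seconds:
print(round(risk_score(0.30, 5, types_all_caps=True), 2))  # -> 0.73
```

The point is not the particular weights but the architecture: perfectly legal individual behaviours are aggregated into a score that can deny someone credit.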


Algorithms are equally central to “dynamic pricing”, a commercial practice that consists of adjusting prices based on supply and demand. Amazon, for example, changes its prices at least 2.5 million times a day. Depending on the product, the price varies as a function of the day of the week or the weather. Widespread in the airline industry, this pricing method is alluring to many other sectors. In France, a study by the price-comparison website Idealo delivered striking results: do-it-yourself materials, for example, can be 10% cheaper on a Tuesday than on a Thursday, because customers are more likely to place an order close to the weekend. An investigation run by the Wall Street Journal likewise concluded that prices can depend on other criteria, most notably the buyer’s postal code. The investigation demonstrated that office-furniture prices at Staples.com varied by location: when the buyer was close to a competing store, the site offered discounts to entice prospective clients in a contested zone.
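
A hedged sketch of such a pricing rule, using the two signals reported above (day of the week and distance to a competing store); the exact discounts and thresholds are invented for illustration and are not Amazon’s or Staples’ actual logic:

```python
def quote(base_price: float, weekday: str, km_to_competitor: float) -> float:
    """Toy dynamic-pricing rule; all numbers are hypothetical."""
    price = base_price
    if weekday == "Tuesday":        # demand for DIY goods is lower midweek
        price *= 0.90               # the ~10% Tuesday discount reported by Idealo
    if km_to_competitor < 10.0:     # buyer's postal code sits in a contested zone
        price *= 0.95               # undercut the nearby rival store
    return round(price, 2)

print(quote(100.0, "Tuesday", km_to_competitor=3.0))    # -> 85.5
print(quote(100.0, "Thursday", km_to_competitor=50.0))  # -> 100.0
```

Two buyers looking at the same product thus see different prices, for reasons they are never shown.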


This logic also applies to predictive policing. To detect crimes and react in real time, the automated gunshot detection system ShotSpotter provides the location of a gunshot via triangulation within seconds of firing. Big Data can therefore be used to predict and prevent crimes. Former New York Mayor Michael Bloomberg’s primary objective was to arrest criminals in the act and dissuade prospective ones. To that end, he invested in the Domain Awareness System, a project developed by Microsoft alongside the New York Police Department. This system synchronized 9,000 surveillance cameras, emergency and police calls, license-plate recognition technology, radiation detectors, gunshot sounds and court orders. Since 2015, this information has been available in real time on the smartphones of police officers. Its proponents tout 50 million dollars in yearly cost savings and a 6% decline in crime.
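
The acoustic triangulation mentioned above can be illustrated with a short, self-contained sketch. Assuming four hypothetical microphone positions and the differences in the gunshot’s arrival times at each of them, a simple grid search recovers the shot location; real systems like ShotSpotter use more sensors and more refined solvers.

```python
import math

SPEED_OF_SOUND = 343.0  # metres per second in air at ~20 °C

# Hypothetical sensor coordinates in metres.
SENSORS = [(0, 0), (500, 0), (0, 500), (500, 500)]

def arrival_times(src):
    """Time for the bang to reach each sensor from a candidate source point."""
    return [math.dist(src, s) / SPEED_OF_SOUND for s in SENSORS]

def locate(measured):
    """Grid-search the point whose predicted arrival-time *differences*
    best match the measured ones (the shot's absolute time is unknown)."""
    best, best_err = None, float("inf")
    for x in range(0, 501, 5):
        for y in range(0, 501, 5):
            pred = arrival_times((x, y))
            err = sum(((p - pred[0]) - (m - measured[0])) ** 2
                      for p, m in zip(pred, measured))
            if err < best_err:
                best, best_err = (x, y), err
    return best

print(locate(arrival_times((120, 340))))  # -> (120, 340)
```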


In the United States, other applications have already seen the light of day, such as PredPol, which leverages spatio-temporal variables to predict crime at a given time and place; it is essentially a tool to track and catalogue offense data. The Los Angeles County Delinquency Prevention Pilot, another predictive tool, targets children who are susceptible to delinquency using various indicators such as academic performance or drug use. These tools heighten the risk of focusing attention on certain types of social behavior, and they tread close to invading personal privacy and diminishing civil liberties.


These algorithms can in turn become relays for social prejudice. Even granting their results some relevance, the end goal of these systems is to understand risk factors and consequently guide public policy. Using the same principles as methods for anticipating earthquakes, PredPol measures the concentration and contagion of criminality to determine the intensity of risk. If an algorithm concludes that a certain zone is dangerous, law enforcement will naturally flow to it, increasing the number of arrests and prosecutions there at the expense of other zones. In a vicious cycle, arrests rise in these hot spots, raising the zones’ perceived risk, inflating the algorithm’s apparent predictive power and generating a phenomenon known as the self-fulfilling prediction. Such predictive algorithms risk reducing criminality to a set of correlations and statistical measures. As a result, resources are allocated to treating the symptoms of criminality rather than its root causes, which are often socioeconomic.
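
The feedback loop is easy to demonstrate in a few lines. In the toy simulation below, two zones have the same true crime rate, but one starts with more patrols; because incidents are only recorded where officers are present, the initial imbalance is fed back into the next allocation and never corrects. All numbers are invented for illustration, and this is not PredPol’s actual model.

```python
import random

random.seed(0)

TRUE_RATE = [0.3, 0.3]   # both zones are, in truth, equally criminogenic
patrols = [3, 1]         # but zone 0 starts with three times the patrols

for year in range(5):
    # Arrests depend on presence: each patrol yields 50 chances to observe a crime.
    observed = [sum(random.random() < TRUE_RATE[z] for _ in range(patrols[z] * 50))
                for z in (0, 1)]
    # The "predictive" step: allocate next year's 4 patrols where arrests were high.
    total = observed[0] + observed[1]
    patrols = [max(1, round(4 * observed[z] / total)) for z in (0, 1)]
    print(f"year {year}: observed={observed}, next patrols={patrols}")
```

Run it and zone 0 keeps its extra patrols indefinitely: the algorithm’s apparent predictive power is an artefact of where the data was collected.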


Similar surveillance systems exist in Europe, such as Indect, a system financed by the European Union and aimed at managing terrorism and criminality. Indect systematizes the investigation process by cross-referencing surveillance footage, web searches and police files. The German Precobs and the British Squeaky Dolphin monitor social media activity on YouTube, Facebook and Twitter in real time. The deployment of these systems should not go without scrutiny. There need to be assurances that these algorithms do not reflect the biases of their creators when establishing the criteria used to determine threat and risk. These issues force us to finally confront questions that were until now considered taboo: any police judgement must rest on objective reasons, but will it still when an algorithm recommends that judgement? The presumption of innocence is a fundamental right in any State governed by the rule of law, and one of the primary markers of liberal democracies. Respect for this principle should persist in the algorithmic age.


These predictive technologies are designed around the criterion of an individual’s potential dangerousness, not around evidence of culpability. The risk is that future sanctions will be guided by social or economic norms defined in computer code, without a proper foundation in legal norms. We would then punish an individual not for his or her actions, but for his or her profile in a given situation. What will the standard of reference be? French law professor Mireille Delmas-Marty warns us: “In the long run, the equivalence between army and police, criminal and enemy, and finally the confusion between war and peace could be programmed into machines”. Questions pertaining to the place of criminal law, procedural safeguards, and hence the rule of law, should be asked with insistence.


Adrien Basdevant
