This article is republished with permission from BusinessThink at UNSW Business School.
Conduct driven by algorithms is a challenge for competition law
Slip the term ‘algorithmic tacit collusion’ into a casual conversation and, not surprisingly, most people will draw a blank.
The concept, however, is firmly on the agenda for regulators, academics and lawyers – and it has significant privacy and pricing implications for everyday consumers as data sharing and data-driven commerce become increasingly sophisticated.
What does it mean? In essence, at a time when machine learning and the availability of massive amounts of data are reshaping digital markets, more companies are using algorithms – or software robots known as bots – to improve their pricing models and customise services.
At their best, these tools can generate efficiencies and benefits for all; for example, consumers could use a smartphone app to compare fuel prices, eliminating the need to drive around looking for the best deal. At their worst, there are fears algorithms may have anti-competitive effects and make it easy for companies to collude on prices without any human input.
In his paper, Mind the Gap: Platform Ethics and Competition Issues, UNSW Business School senior lecturer and competition law expert Rob Nicholls explores how the algorithm-driven conduct of platform operators is challenging the enforcement of laws.
Whereas the usual legal test for collusive price fixing requires what lawyers call a ‘meeting of the minds’ of the involved parties and a proven commitment to the price-fixing conduct, Nicholls says “it’s not clear that bots meet either of these tests”.
The concern is that promises of greater choice and competition for consumers in digital markets could be compromised if bots – by design or otherwise – take control of pricing.
“We need to ensure through competition law and policy that [these advantages] are not lost,” says Nicholls, who is also a research fellow at the Centre for Law, Markets and Regulation at UNSW Law.
His paper notes that whereas explicit collusion over prices is not permissible, “the illegality of algorithmic tacit collusion is less clear” because of complexities around technology platforms and data sharing. So expect lawyers to be central to the unfolding debate.
On the radar
In June this year, the OECD held a roundtable on algorithms and collusion as part of an examination of competition in the digital economy. Speakers considered potential solutions, including changes to the law on tacit collusion and the possible development of ‘consumer algorithms’ that could track prices and even identify ‘maverick’ sellers who were not engaging in wrongdoing.
The push comes as automated pricing and selling mechanisms are becoming more prevalent in an era when bots can monitor human purchasing activity and mine data in retail, services and other areas of business.
They can also make pricing decisions, as demonstrated by Uber’s controversial ‘surge pricing’ model which uses an algorithm to raise prices under particular conditions and especially when demand for drivers is high.
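Uber's actual algorithm is proprietary, but a surge-style model can be sketched in a few lines. The function below is a hypothetical illustration in which the multiplier grows with the demand-to-supply ratio, up to a cap; the parameter values are invented:

```python
def surge_multiplier(ride_requests, available_drivers,
                     base=1.0, sensitivity=0.5, cap=3.0):
    """Illustrative surge pricing: raise the multiplier as the
    demand-to-supply ratio climbs above 1, capped at a maximum."""
    if available_drivers == 0:
        return cap
    ratio = ride_requests / available_drivers
    excess = max(0.0, ratio - 1.0)   # only surge when demand exceeds supply
    return min(base + sensitivity * excess, cap)

print(surge_multiplier(100, 100))   # balanced market -> 1.0
print(surge_multiplier(300, 100))   # heavy demand -> 2.0
print(surge_multiplier(1000, 50))   # extreme demand -> capped at 3.0
```

The point of the cap and the sensitivity knob is that the machine, not a person, decides when and how hard prices rise – which is exactly the delegation of pricing that concerns regulators.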
Nicholls says other examples demonstrate the power – and limits – of algorithms. Famously, two booksellers using Amazon’s algorithmic pricing to ensure they were making marginally more revenue than their chief competitor ended up elevating the price of a book on evolutionary biology, The Making of a Fly, to more than US$23 million.
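The runaway dynamic behind that US$23 million price is easy to reproduce. The repricing multipliers below are the ones widely reported from the incident; the starting prices are invented for illustration:

```python
# Two sellers reprice against each other once a day. The reported rules
# in the "Making of a Fly" incident: one seller set its price at 0.9983x
# the rival's, the other at 1.270589x the rival's. Because
# 0.9983 * 1.270589 > 1, every repricing round lifts both prices --
# no human ever chooses a number.
a, b = 17.00, 21.00              # hypothetical starting prices (dollars)
for day in range(45):
    a = 0.9983 * b               # undercut the rival slightly
    b = 1.270589 * a             # hold a fixed margin above the rival
print(f"after 45 days: a=${a:,.2f}, b=${b:,.2f}")  # well into six figures
```

Neither rule is irrational on its own; it is the interaction of the two that produces the absurd outcome.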
“The real harm that comes from this is that consumers are not aware as to what’s going on,” Nicholls says. “And it’s pretty hard to work out, if you’re the competition authority, that there is a wrong occurring. It’s hard to detect.”
Price and privacy concerns
A test case in the US has highlighted the dangers of machine or bot-driven collusion. In the first criminal anti-trust case of its kind, David Topkins pleaded guilty in 2015 to manipulating prices for cinema posters sold through Amazon’s online marketplace. He admitted to programming customised algorithms to keep prices artificially high.
While the case is relatively small in its scale, it has shown what can be done with algorithms. In a growing number of sectors, bots are sent out to competitors’ websites to conduct pricing analyses and to inform price-setting.
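To see why this worries competition authorities, consider a deliberately simplified sketch: two repricing bots that only ever observe each other's public prices. The rule and prices below are invented, but a "match, never undercut" policy aligns prices without any communication at all – the kind of outcome that is hard to attack under a 'meeting of the minds' test:

```python
# Minimal sketch of algorithmic tacit coordination (illustrative only):
# each bot reprices to the rival's level whenever the rival is dearer.
def match_rule(my_price, rival_price):
    """Never undercut: move up to the rival's price if it is higher."""
    return max(my_price, rival_price)

p1, p2 = 9.50, 12.00             # hypothetical starting prices
for _ in range(5):
    p1 = match_rule(p1, p2)
    p2 = match_rule(p2, p1)

print(p1, p2)                    # both settle at the higher price
```

Both sellers end up at 12.00 even though no agreement – explicit or tacit in the human sense – was ever made.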
A significant proportion of foreign exchange trading and stock market trading is also driven by algorithms.
Peter Leonard, principal of data commercialisation consultancy Data Synergies and an advisory board member of the NSW Data Analytics Centre, has closely examined the topic of data linkage and data sharing between organisations.
Leonard believes there are legitimate privacy concerns related to unconstrained data sharing – along with the potential for discrimination or bias against certain segments of the community – and that the public should not blindly trust governments and businesses that engage in data collection and sharing.
“There is a grey world between privacy concerns and current regulation,” Leonard says.
While privacy impact assessments (PIAs) shape up as a logical means of addressing some concerns over the use of algorithms, he notes that the issue also raises ethical questions that are “not as cut and dried as privacy analysis”.
Whereas methodologies can be used effectively for PIAs, Leonard says any consideration of ethics requires a range of views that reflect different societal groups, generations and cultures.
However, setting up such review committees could be cumbersome and lead to cases getting bogged down in endless debate, just as has been the case in Australia with medical ethics committee reviews.
“So we can’t let this happen, or it will be a barrier to getting services out there,” Leonard says.
Ariel Ezrachi and Maurice Stucke, the authors of Virtual Competition: The Promise and Perils of the Algorithm-Driven Economy, suggest that the conditions which foster tacit collusion through the use of algorithms are likely to become more and more common, raising the stakes for prosecutors and regulators.
As online purchases become more prevalent, sellers will increasingly rely on sophisticated algorithms to set prices – and proving that there is foul play between bots may be hard.
Nicholls’ paper charts some approaches that could be applied in this new algorithm-driven era, and comments that proposed changes in competition law in Australia could unintentionally act as a guide.
Lawmakers are considering an amendment that would prohibit a firm from engaging “in a concerted practice that has the purpose, or has or is likely to have the effect, of substantially lessening competition”.
If the amendment passes, Nicholls says, Australia may become one of the first jurisdictions to deal with algorithmic tacit collusion under competition law. However, he adds, the legislation was not drafted with automated price fixing in mind.
“The law was intended to deal with forms of price fixing and to replace the price-signalling laws that only ever applied to banks,” he says.
“So it’s likely to have an effect which is an unintended consequence. The issue here is that favourable unintended consequences are fabulous and really rare, but you are always worried that if you found one favourable unintended consequence there must be a dozen unfavourable ones under the floorboards like cockroaches.”
Nicholls expects regulators such as the Australian Securities & Investments Commission and the Australian Competition and Consumer Commission – and their international counterparts – to play a key role in setting standards in the growing world of algorithm-driven business.
Such watchdogs can help ensure that market power does not end up in the hands of too few at a time when tech giants such as Facebook, Netflix, YouTube, Google and Amazon are dominating markets.
At the same time, he warns against regulatory over-reach that could stifle innovation and rob consumers of the potential benefits of more efficient pricing.
“As long as algorithms are not colluding, the outcome of well-informed competitors is that the rivalry will be more intense and the prices [of products and services] are likely to drop,” says Nicholls.