The US Department of Justice (“DOJ”), in collaboration with the Federal Trade Commission (“FTC”) and the European Commission (“EU Commission”), has recently made submissions on this issue to the Organization for Economic Co-operation and Development (the “OECD”), and those submissions reflect varying degrees of confidence in the current tools available to combat anti-competitive online behavior. At the same time, perspectives abound in the antitrust bar, many of which suggest a range of proposed defenses for corporate defendants in hypothetical scenarios or express pessimism about the ability of the antitrust laws to adequately address these new technological scenarios. We take a very different view: the antitrust laws are robust and not in need of further tinkering simply because pricing systems have evolved. These laws reflect over a hundred years of analysis of complex anticompetitive agreements, and they have been no less vital over the last two decades of rapid technological change. In short, the present antitrust analytical frameworks can capably root out—and punish—collusion, even among robots. This article attempts to distill some of the recent dialogue and debate on the issue.
What is an algorithm?
An algorithm (at least in this context) is a piece of software that sets forth a process or set of rules to be followed in order to solve a particular problem. By following this process or set of rules, an algorithm can automatically make decisions depending on the data fed into it. The utility and effect of an algorithm depend on how it is designed and on the quality of the input data. Algorithms generally follow human instruction, but an algorithm can also be programmed to amend its own decision-making rules to account for past experience, becoming a self-learning algorithm. Self-learning algorithms form the basis for technologies such as search engines and self-driving cars, and unsurprisingly pose the greatest threat to effective regulation because of their potential for unchecked anticompetitive behavior.
How are algorithms used?
The capabilities of algorithms are constantly evolving. Algorithms can be designed to track online prices in a market; adjust prices instantaneously to undercut competitors' offers; tailor product or service offerings to individual consumers; or help consumers find the lowest price for a product or service. And, again, all of this can be automated and designed to reduce or eliminate the need for human assistance or supervision.
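To make the mechanics concrete, the kind of automated "track and undercut" repricing described above can be expressed in only a few lines of code. The following is a purely illustrative sketch; the function name, the undercut amount, and the price floor are all hypothetical, not drawn from any actual seller's system.

```python
# Illustrative sketch of a rule-based repricing algorithm: observe competitor
# prices, undercut the lowest one by a small increment, but never price below
# a cost-based floor. All names and numbers are hypothetical.

def reprice(our_price: float, competitor_prices: list[float],
            undercut: float = 0.01, floor: float = 5.00) -> float:
    """Undercut the lowest observed competitor price, subject to a floor."""
    if not competitor_prices:
        return our_price  # no market data observed: hold the current price
    lowest = min(competitor_prices)
    return max(round(lowest - undercut, 2), floor)

# Example: competitors list at $9.99, $10.49, and $12.00
print(reprice(11.00, [9.99, 10.49, 12.00]))  # prints 9.98
```

The point of the sketch is how little logic is required: a loop running this rule against scraped competitor data yields the instantaneous, unsupervised price adjustments the EU Commission's sector inquiry describes.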
The EU Commission’s recent Preliminary Report on the E-commerce Sector Inquiry found that two-thirds of retailers who track their competitors’ prices use automatic systems to do so. Of the companies employing such software, 78% subsequently adjusted their own prices. Another recent study found that more than 500 sellers on the Amazon Marketplace use algorithmic pricing.
What anticompetitive risks do algorithms pose for consumers?
Despite the potential for utility and cost benefits, algorithms pose threats to consumers. Algorithms can be used to spark, implement, and monitor vertical and horizontal anticompetitive restraints among companies. For example, algorithms can be used to facilitate the monitoring of resellers who are unwilling to respect the resale price recommendations of their suppliers; algorithms can also be used to monitor agreed-upon prices. Algorithms can also provide companies with automated mechanisms to signal price changes, implement parallel/common policies, and monitor and punish deviators that are a party to a price-fixing agreement.
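The monitoring function described above is trivially easy to automate, which is precisely the enforcement concern. The sketch below, with entirely hypothetical data, names, and thresholds, illustrates how an algorithm could flag sellers whose observed prices deviate from an agreed-upon level:

```python
# Purely illustrative sketch of automated cartel monitoring: flag any seller
# whose observed price falls materially below an agreed level. Hypothetical
# data, names, and thresholds throughout.

AGREED_PRICE = 10.00
TOLERANCE = 0.05  # permitted deviation before a seller is flagged

def find_deviators(observed: dict[str, float]) -> list[str]:
    """Return sellers priced more than TOLERANCE below the agreed level."""
    return sorted(seller for seller, price in observed.items()
                  if price < AGREED_PRICE - TOLERANCE)

print(find_deviators({"A": 10.00, "B": 9.50, "C": 9.99}))  # prints ['B']
```

Run continuously against scraped marketplace data, a rule like this would let a cartel detect and punish deviation far faster than the periodic human checks that historically limited cartel stability.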
US enforcement agency and practitioner views on algorithms and competition law
The DOJ and FTC’s policy paper for the OECD, along with comments by FTC Commissioner Terrell McSweeny and acting FTC Chair Maureen Ohlhausen, suggest that US regulators are confident that existing antitrust regimes and principles can capably address harms associated with algorithms. Not everyone in the antitrust bar seems to agree, however; various commentators have advanced a different view, pointing out defenses that might exist for claims of algorithmic collusion, and/or otherwise proposing that the existing laws are insufficient to address collusion given these new technological scenarios.
Acting FTC Chair Ohlhausen’s recent remarks before the Concurrences Antitrust in the Financial Sector Conference expressed optimism: “From an antitrust perspective, the expanding use of algorithms raises familiar issues that are well within the existing canon.” Similarly, the DOJ and FTC’s paper submitted to the OECD echoed the same sentiment regarding algorithms’ impact on antitrust enforcement, analyzing scenarios and recent cases that ultimately showed that the antitrust laws are equipped to handle such conduct.
The DOJ/FTC paper analyzed cases dating back to 1993 that demonstrated how technological developments do not preclude antitrust liability. Specifically, in the Airline Tariff Publishing Company case, the court declared that the use by competitors of a common computer system to establish or implement an illegal pricing agreement violated the US antitrust laws. The two US agencies also highlighted a 2015 case in which DOJ brought criminal price-fixing charges against retailers who had agreed to use pricing algorithms on Amazon to eliminate competition among themselves. These cases reaffirmed, from the enforcers’ viewpoint, that price-fixing cartels aided by technology can be challenged successfully, regardless of the means by which they are implemented or operated. Acting FTC Chair Ohlhausen likewise noted: “[W]hether it’s calls, text messages, algorithms, or Morse code, the underlying legal rule is the same . . . agreements to set prices among competitors are always unlawful.”
Notwithstanding this confidence, US regulators are mindful of the challenges that pricing algorithms may present. FTC Commissioner McSweeny’s speech at Oxford on algorithms and coordinated effects noted that “[c]oncerns of algorithmic tacit collusion are still largely theoretical at this point . . . We have a lot to learn about the effects of pricing algorithms and artificial intelligence. Further research will contribute to better and more effective competition enforcement in this area.” Recognizing the need to better understand how algorithms and artificial intelligence software work, she noted that the FTC has created an Office of Technology, Research, and Investigation, which now includes technology specialists and computer scientists.
Some prominent US antitrust attorneys have expressed a contrary view: that the Sherman Act and a century of jurisprudence are likely ill-equipped to address this new technological frontier of pricing decisions made through automation, far removed from their human counterparts. In particular, a number of recent articles by practitioners have advanced the view that prosecutions and civil litigation will be hampered by evidentiary challenges. Smart robots may well find ways to cover their tracks.
The US antitrust authorities have stressed, however—and we agree—that there is already a rich body of case law concerning anticompetitive conduct carried out through agents and employees of the companies acting within the scope of their employment and authority. It is no defense to suggest that algorithms, programmed for autonomy, have learned and executed anticompetitive behavior unbeknownst to the corporation. The software is always a product of its programmers—who of course have the ability to (affirmatively) program compliance with the Sherman Act, even erring on the side of caution. Nor can the company’s decision not to supervise, monitor, or check for compliance on a continuing basis prevent antitrust liability; on the contrary, the decision to employ algorithmic pricing carries with it significant corporate responsibility, as with any decision to disperse pricing authority widely among employees (and permit inter-competitor ‘discussions,’ interactions, and information-sharing).
In short, the technology may be different, but the antitrust principles—and harms—remain the same.
EU views on algorithms and competition law
The EU Commission’s paper published for the OECD in June 2017, combined with recent comments from the Commission’s antitrust chief Margrethe Vestager, show a vigilance by the Commission towards algorithmic pricing.
Thus, in its paper prepared for the OECD, the Commission cautioned that the use of algorithms to collude may breach competition law and may attract standard cartel fines. It also stressed that companies should ultimately be held responsible for the activities of any algorithm or pricing software they deploy, noting: “like an employee or an outside consultant working under a firm’s ‘direction or control,’ an algorithm remains under the firm’s control, and therefore the firm is liable for its actions.”
Some practices are simply an online version of long-established unlawful offline conduct and therefore should not pose challenges to enforcers. However, the Commission recognizes that more subtle practices featuring ‘tacit’ rather than ‘explicit’ collusion may prove more troublesome to identify and regulate. Tacit collusion through algorithms may or may not fall foul of EU competition law, as proving the basic conditions of tacit collusion in order to evidence anti-competitive behavior may be complicated and unclear. The Commission also warns that some of the more sophisticated tools used by companies to observe others’ pricing may not be caught by the wording set out in the EU’s law against anticompetitive communications.
Speaking at the 18th Bundeskartellamt IKK Conference in Berlin, Commissioner Vestager warned: “automated systems could be used to make price-fixing more effective. That may be good news for cartelists. But it’s very bad news for the rest of us . . . we need to make it very clear that companies can’t escape responsibility for collusion by hiding behind a computer program.”
In the same vein, the Court of Justice of the European Union (“CJEU”) made clear in the recent Eturas decision that companies cannot escape liability where collusion has been achieved and executed through automated systems. In that case, the operator of a Lithuanian travel booking system sent an electronic message to its travel agents that proposed to limit discounts to no more than 3%. The CJEU held that travel agents who saw the message and did not distance themselves from the proposal could be liable.
Antitrust authorities in Europe also have been reviewing how the use of algorithmic technology by platforms such as Google, Facebook and Amazon affects consumer markets. Notably, on June 27, 2017, the EU Commission announced a fine of €2.42bn against Google for abusing its dominance as a search engine by giving an illegal advantage to its own comparison-shopping service. The Commission found that Google denied rival shopping comparison websites the opportunity to compete by demoting them within search results through the use of algorithms, thereby denying consumers a genuine choice. The Commission is also currently investigating whether four makers of electronic goods prevented retailers from setting their own prices by using software to make the policing of retailers’ prices more effective.
Online investigations have become a major focus of competition work at the Competition and Markets Authority (“CMA”) in the UK. Last year, the CMA fined online sellers of posters for automated price collusion on Amazon Marketplace. The CMA is also preparing a report on the impact of digital comparison tools on consumer behavior which will focus on the effect of website and smartphone application price-comparison tools on consumer choice.
The way forward
It remains to be seen how the current competition laws will be used to combat algorithmic collusion. Despite differing degrees of confidence in the existing antitrust laws to root out and punish anticompetitive agreements among humans or robots, there can be no disagreement about the importance of keeping pace with developments in the technological sphere. In order to ensure effective, balanced enforcement, authorities will need to understand the inner workings of algorithms and their impact upon business models and competition more broadly. Continued economic studies across various markets and sectors will assist in this understanding by providing vital information on when, and under what conditions, algorithms result in coordinated behavior. It is particularly heartening to learn of the involvement of computer scientists in progressing our collective understanding of these issues.
We look forward to addressing these challenges as advocates and are buoyed by the measured calm reflected in recent enforcer remarks. On the US side, acting FTC Chair Ohlhausen has noted: “There is nothing inherently suspect about using computer algorithms to look carefully at the world around you . . . If conduct was unlawful before, using an algorithm to effectuate it will not magically transform it into lawful behaviour. Likewise, using algorithms in ways that do not offend traditional antitrust norms is unlikely to create novel liability scenarios.”
Similarly, for the EU, Commissioner Vestager has declared: “We certainly shouldn’t panic about the way algorithms are affecting markets. But we do need to keep a close eye on how algorithms are developing. We do need to keep talking about what we’ve learned from our experiences. So that when science fiction becomes reality, we’re ready to deal with it.”
Only time will tell whether the antitrust laws are up to the task and will keep pace with technology. Still, we expect to have some preliminary answers soon as these issues percolate up through the courts. Stay tuned for an update in 12-18 months.
 European Commission, Preliminary Report on the E-commerce Sector Inquiry, Brussels, 15.9.2016, SWD(2016) 312, page 56.
 Chen, Mislove, Wilson, ‘An Empirical Analysis of Algorithmic Pricing on Amazon Marketplace’, Proceedings of the 25th International Conference on World Wide Web, Montréal, Québec, Canada — April 11 - 15, 2016.
 See, e.g., Maurice E. Stucke and Ariel Ezrachi, How Pricing Bots Could Form Cartels and Make Things More Expensive (October 27, 2016), available at https://hbr.org/2016/10/how-pricing-bots-could-form-cartels-and- make-things-more-expensive (discussing collusion scenarios that the current antitrust laws may not be equipped to handle).
 See Maureen K. Ohlhausen, Acting Chairman, U.S. Federal Trade Commission, ‘Should We Fear the Things That Go Beep in the Night? Some Initial Thoughts on the Intersection of Antitrust Law and Algorithmic Pricing,’ pg. 2 (May 23, 2017). Id. at 7 (“[T]he same analytical framework is sufficiently flexible and robust that it can already accommodate several of the current concerns applicable to the widespread use of algorithms.”).
 OECD, Directorate for Financial and Enter. Affairs Competition Comm., Algorithms and Collusion – Note by the United States (DAF/COMP/WD(2017)41), available at https://one.oecd.org/document/DAF/COMP/WD(2017)41/en/pdf.
 United States v. Airline Tariff Publishing Co., 836 F. Supp. 9 (D.D.C. 1993).
 Press Release, DoJ Office of Public Affairs, E-Commerce Exec and Online Retailer Charged with Price Fixing Wall Posters (Dec. 4, 2015), available at: https://www.justice.gov/opa/pr/e-commerce-exec-and-online-retailer-charged-price-fixing-wall-posters
 See Maureen K. Ohlhausen, at 9. See also Pallavi Guniganti, US DOJ Obtains Guilty Pleas in E-commerce Cartel, (Aug. 7, 2017) available at http://globalcompetitionreview.com/article/1145355/us-doj-obtains-guilty-pleas-in-e-commerce-cartel (discussing Antitrust Division settlement with companies who were charged with violations of the Sherman Act who communicated in person and online, using Facebook, Skype and Whatsapp, to make and implement their price-fixing agreements).
 See Terrell McSweeny, Commissioner, U.S. Federal Trade Commission, ‘Algorithms and Coordinated Effects’,University of Oxford Center for Competition Law and Policy, pg. 6 (May 22, 2017), available at https://www.ftc.gov/system/files/documents/public_statements/1220673/mcsweeny_-_oxford_cclp_remarks_-_algorithms_and_coordinated_effects_5-22-17.pdf.
 See, e.g., United States v. Basic Construction Co., 711 F.2d 570, 573 (4th Cir. 1983) (“[A] corporation may be held criminally responsible for antitrust violations committed by its employees . . . even if, such acts were against corporate policy or express instructions.”); United States v. Hilton Hotels Corp., 467 F.2d 1000, 1004-07 (9th Cir. 1972) (same); United States v. Am. Radiator & Standard Sanitary Corp., 433 F.2d 174, 204-05 (3d Cir. 1970) (same).
 Directorate for Financial and Enterprise Affairs Competition Committee, ‘Algorithms and Collusion – Note from the European Union’, 21-23 June 2017, at 4.2.
 Id. at 5.
 Id. at 4.4 and 5.
 Id. at 4.5.
 Case C-74/14 Eturas and others, judgment of 21 January 2016.
 European Commission Press Release, ‘Antitrust: Commission fines Google €2.42 billion for abusing dominance as search engine by giving illegal advantage to own comparison shopping service’, Brussels, 27 June 2017.
 European Commission Press Release, ‘Antitrust: Commission opens three investigations into suspected anticompetitive practices in e-commerce’, Brussels, 2 February 2017.
Decision of the Competition & Markets Authority, Online sales of posters and frames, Case 50233, 12 August 2016.
 Statutory deadline: 28 September 2017.
*Sathya Gosselin is a partner in the Washington, D.C. office, April Jones is a law fellow in the Washington, D.C. office, and Annabel Martin is an associate in the London office.