Algorithms and competition

Bundeskartellamt 18th Conference on Competition, Berlin, 16 March 2017

*Check against delivery*

 

Introduction

Ladies and gentlemen,

I want to thank Andreas Mundt and the Federal Cartel Office for inviting me to be here today.

I admire the way this conference is not afraid to ask the really big questions, about whether the tools of competition enforcement can do all we ask of them. We need to do that, if we want to be sure we’re doing our jobs as well as we can.

The important thing, of course, is to make sure we get answers we can use.

In The Hitchhiker’s Guide to the Galaxy, a computer called Deep Thought is asked to calculate the answer to the ultimate question, of Life, the Universe and Everything. For seven and a half million years, it runs its algorithm. Then it comes up with an answer. The answer to Life, the Universe and Everything, it says, is 42.

Which just goes to show that asking the right questions is important, but understanding the answers matters even more.

 

The power of algorithms

Right now, we need to think especially carefully about the answers that algorithms are giving us.

We’re not yet dealing with an algorithm quite as smart as Deep Thought. But we do have computers that are more powerful than many of us could have imagined a few years ago. And clever algorithms put that power – quite literally – in our hands.

There are search algorithms that find us the information we need from more than four and a half billion web pages. Recommendation algorithms that sift through millions of products on an Internet store to choose the ones that interest us.

The trouble is, it's not easy to know exactly how those algorithms work. How they’ve decided what to show us, and what to hide. And yet the decisions they make affect us all.

When an algorithm makes it harder to find rivals’ products, that could deny those rivals the chance to compete. And the result could be higher prices, and less choice, for consumers. That's precisely the issue in our case with Google Shopping. We’re concerned that the way Google used its algorithms may have given its own comparison shopping service more prominent treatment than it gave to competitors.

The way that algorithms are used to make decisions automatically could even undermine our democracy. These days, social media is a vital source of news. One recent study found that nearly two thirds of US adults get their news this way. So the scope for social media algorithms to create an alternative reality, by showing people one story after another that just isn't true, is a concern for us all.

 

Algorithms and competition

But there’s also another side to algorithms that's attracting the attention of competition authorities. One which our hosts today mentioned in the report which they published last year with the French Competition Authority. I'm thinking about the issue of automated systems that monitor, and even adjust, prices automatically.

These programs crawl the Internet continuously, checking prices in hundreds of online shops.

And these algorithms are very common. Our sector inquiry into e-commerce has shown that two thirds of retailers who track their competitors’ prices use automatic systems to do that. Some of them also use that software to adjust prices automatically.

From the point of view of competition, what matters is how these algorithms are actually used.

A few years ago, two companies were selling a textbook called The Making of a Fly. One of those sellers used an algorithm which essentially matched its rival’s price. That rival had an algorithm which always set a price 27% higher than the first. The result was that prices kept spiralling upwards, until finally someone noticed what was going on, and adjusted the price manually. By that time, the book was selling – or rather, not selling – for 23 million dollars a copy.
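That feedback loop is simple enough to sketch in a few lines of code. This is a purely illustrative simulation of the two rules just described, with made-up starting prices; the real sellers' exact multipliers and timing are not public in this speech.

```python
# Hypothetical simulation of the textbook price spiral described above.
# Seller A matches seller B's price; seller B prices 27% above seller A.
# Starting prices are invented for illustration.

def reprice_a(rival_price: float) -> float:
    """Seller A's rule: match the rival's price."""
    return rival_price

def reprice_b(rival_price: float) -> float:
    """Seller B's rule: price 27% above the rival."""
    return rival_price * 1.27

price_a, price_b = 40.0, 50.0      # assumed starting prices, in dollars
for cycle in range(60):            # each loop is one automated repricing pass
    price_a = reprice_a(price_b)
    price_b = reprice_b(price_a)

print(f"after 60 cycles, seller B asks ${price_b:,.2f}")
```

Because each full cycle multiplies the price by 1.27, the price grows exponentially, and within a few dozen cycles it passes the 23-million-dollar mark with no human having intervened.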

So the effect of an algorithm depends very much on how you set it up.

If you design it to raise prices without limit, that's what it will do. If you ask it to make sure your prices undercut your rivals’, it will do that instead.

And if you want to help consumers find the lowest prices, you can design an algorithm to do that. In fact, there are many applications out there which do just that, for things like air fares.
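The same monitoring data can serve either goal. Here is a minimal sketch of that point, with invented shop names and prices: one function implements a seller-side undercutting rule, the other a consumer-side price finder.

```python
# Illustrative only: the shops, prices and the 1-cent margin are assumptions.

rival_prices = {"shop_a": 19.99, "shop_b": 21.50, "shop_c": 18.75}

def undercut(prices: dict[str, float], margin: float = 0.01) -> float:
    """Seller-side rule: price just below the cheapest rival."""
    return min(prices.values()) - margin

def best_deal(prices: dict[str, float]) -> tuple[str, float]:
    """Consumer-side rule: point shoppers to the cheapest offer."""
    shop = min(prices, key=prices.get)
    return shop, prices[shop]

print(undercut(rival_prices))
print(best_deal(rival_prices))   # ('shop_c', 18.75)
```

The crawler and the data are identical in both cases; only the rule applied to them decides whether consumers gain or lose.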

That's why I don't think competition enforcers need to be suspicious of everyone who uses an automated system for pricing.

 

Cartel enforcement and a new whistleblower tool

But we do need to be alert.

Because automated systems could be used to make price-fixing more effective. That may be good news for cartelists. But it's very bad news for the rest of us. And protecting consumers from cartels is at the very heart of our work.

So I think it's good that you're spending part of this conference looking at our tools for finding out about cartels.

We’ve discovered a lot of cartels thanks to leniency programmes, which allow companies to escape a fine if they're the first to tell us about a cartel. Last year, the Commission fined five truck makers nearly 3 billion euros, for a cartel that lasted fourteen years. We found out about that cartel because one of the companies came forward to avoid a fine.

But we don't just rely on leniency. We pay attention to other methods as well. And that includes encouraging individuals to come forward, when their conscience is troubled by the information they have about a cartel.

That's why we recently launched a new IT system to help people tell us anonymously about cartels. The system means we can communicate both ways with them without risking their anonymity while we gather information. It's similar to the system that’s worked well in Germany since 2013, and I hope it will help us to protect consumers even more effectively.

 

Using automated systems to implement cartels and resale price maintenance

But when we look at the challenges for cartel enforcement in the future, one of the biggest things we need to deal with is the risk that automated systems could lead to more effective cartels.

Every cartel faces the risk that its members will start cheating each other as well as the public. If everyone else’s price is high, you can gain a lot of customers by quietly undercutting them. So whether cartels survive depends on how quickly others spot those lower prices, and cut their own price in retaliation. By doing that quickly, cartelists can make sure that others will be less likely to try cutting prices in the future.

And the trouble is, automated systems help to do exactly that.

They can also help to establish a cartel in the first place.

A few years ago, two companies that sold posters featuring artists like Justin Bieber agreed not to undercut each other on the Amazon Marketplace. But as one employee put it, “logistically it is going to be difficult to follow the pricing effectively on a daily basis.” When you take out the euphemisms, that meant that the companies sold so many different items that it would have been hard to keep their prices aligned manually. Instead, they decided to use pricing algorithms to keep their prices the same.

This isn't just about cartels, either. Manufacturers that want their retailers to stick to minimum retail prices need a way to deal with discount stores that undercut those prices.

With monitoring algorithms, manufacturers can easily spot when that's happening. And they can ask retailers – not always very politely – to put the price back up.
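Part of the concern is how technically trivial that monitoring step is. A minimal sketch, with invented retailer names, a made-up price floor, and assumed observed prices:

```python
# Flag retailers selling below a minimum resale price.
# Every name and number here is an illustrative assumption.

MINIMUM_PRICE = 99.00  # assumed floor set by a manufacturer

observed = {"store_x": 99.00, "discounter_y": 79.99, "store_z": 101.50}

violations = {shop: p for shop, p in observed.items() if p < MINIMUM_PRICE}
print(violations)   # {'discounter_y': 79.99}
```

A few lines like these, run against crawled price data, turn resale price maintenance from a labour-intensive task into an automatic one.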

As competition enforcers, we need to be sure that we can deal with these new challenges.

So far, I think the signs are promising. Companies may be using algorithms to make their price-fixing agreements more effective. But of course, that doesn't stop competition authorities from taking action against those agreements.

In the last two years, the authorities in both the US and the UK have dealt with the companies that used automated systems to help fix prices for posters that they sold online.

And last month, the Commission launched a case which looks at whether four companies broke the competition rules by limiting the ability of retailers to set their own prices for consumer electronics. Part of our concern there is that software may have made those limitations more effective.

 

Automated systems that collude

So far, those cases have dealt with agreements that were put together by humans. The computers only took over when it was time to put them into practice.

But no one should imagine they can get away with price-fixing by allowing software to make those agreements for them.

It's true that the idea of automated systems getting together and reaching a meeting of minds is still science fiction.

But illegal collusion isn't always put together in back rooms. There are many ways that collusion can happen, and some of them are well within the capacity of automated systems.

A few years ago, the operator of a Lithuanian travel booking system sent an electronic message to its travel agents, which proposed to limit discounts to no more than 3%. And the European Court made clear that travel agents who saw that message and did not distance themselves from that proposal could have found themselves caught up in a cartel.

So as competition enforcers, I think we need to make it very clear that companies can’t escape responsibility for collusion by hiding behind a computer program.

 

Compliance by design

The thing is, the digital world is full of examples of new technology that has huge potential to do good, but also great risks if it's abused.

And I think the EU’s new rules on data protection, which will come into force next year, give us valuable ideas about how we can face that challenge. The concept of “data protection by design” makes clear that people’s privacy can never be an afterthought. It has to be built into the way that services work from the very start.

That's also how businesses need to think when they design and use algorithms.

They may not always know exactly how an automated system will use its algorithms to take decisions. What businesses can – and must – do is to ensure antitrust compliance by design. That means pricing algorithms need to be built in a way that doesn't allow them to collude. Like a more honourable version of the computer HAL in the film 2001, they need to respond to an offer of collusion by saying “I'm sorry, I'm afraid I can't do that.”
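One way to read "compliance by design" in code terms is a repricing rule with a built-in refusal: it may follow the market, but it never chases a rival's price upward past a ceiling set independently of rivals. This is a conceptual sketch under assumed cost and markup figures, not a legal standard or a real product's logic.

```python
# Illustrative "compliance by design" repricing rule.
# COST and MAX_MARKUP are invented numbers for the sketch.

COST = 10.00
MAX_MARKUP = 1.50   # never price more than 50% above own cost

def compliant_reprice(rival_price: float) -> float:
    """Track the rival, but refuse to follow it past a hard ceiling."""
    ceiling = COST * MAX_MARKUP
    return min(rival_price, ceiling)

print(compliant_reprice(12.00))        # follows the market: 12.0
print(compliant_reprice(23_000_000))   # refuses the spiral: 15.0
```

The key design choice is that the ceiling depends only on the firm's own costs, so no rival's signal, however high, can pull the algorithm into a spiral.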

 

Conclusion

The challenges that automated systems create are very real. If they help companies to fix prices, they really could make our economy work less well for everyone else.

So as competition enforcers, we need to keep an eye out for cartels that use software to work more effectively. If those tools allow companies to enforce their cartels more strictly, we may need to reflect that in the fines that we impose.

And businesses also need to know that when they decide to use an automated system, they will be held responsible for what it does. So they had better know how that system works.

In The Hitchhiker's Guide to the Galaxy, the Guide in question was a sort of electronic book. Although it was often wildly inaccurate, it was also a huge success. That was partly because of the words printed in big, friendly letters on the cover: “Don't Panic”.

I think that's good advice. We certainly shouldn't panic about the way algorithms are affecting markets.

But we do need to keep a close eye on how algorithms are developing. We do need to keep talking about what we’ve learned from our experiences. So that when science fiction becomes reality, we’re ready to deal with it.

Thank you.