by Ajay Agrawal
on ad conversions so they can better determine their optimal bids. Google
explained that the algorithms use vast amounts of data and continually
refine models of users' conversion behavior to spend an advertiser's dollars
where they bring in the highest conversion rates.
Another important application of AI's strength in improving forecasting
to help markets operate more efficiently is in electricity markets. To operate
efficiently, electricity market makers such as California's Independent Sys-
tem Operator must engage in demand and supply forecasting. An inaccurate
How AI and Machine Learning Can Impact Market Design 569
forecast in the power grid can dramatically affect market outcomes, causing
high variance in prices or, worse, blackouts. By better predicting demand
and supply, market makers can better allocate power generation to the most
efficient power sources and maintain a more stable market.
As the examples above demonstrate, the applications of AI algorithms
to market design are already widespread and diverse. Given the infancy of
the technology, it is a safe bet that AI will play a growing role in the design
and implementation of markets over a wide range of applications. In what
follows, we describe several less obvious ways in which AI has played a key
role in the operation of markets.
23.2 Machine Learning and the Incentive Auction
In the first part of the twentieth century, the most important infrastructure
projects for the United States related to transportation and energy. By the
early twenty-first century, however, it was not just people and goods that
needed to be transported in high volumes, but also information. The
emergence of mobile devices, WiFi networks, video on demand, the Internet
of Things, services supplied through the cloud, and much more has already
created the need for major investments in the communication network, and
with 5G technologies just around the corner, more is coming.
Wireless communications, however, depend on infrastructure and other
resources. The wireless communication rate depends on the channel capac-
ity, which in turn depends jointly on the communication technology used
and the amount of radio spectrum bandwidth devoted to it. To encourage
growth in bandwidth and the rapid development of new uses, the Obama White
House in 2010 issued its National Broadband Plan. That plan set a goal of
freeing a huge amount of bandwidth from older, less productive uses to be
used instead as part of the modern data highway system.
In 2016– 2017, the US Federal Communications Commission (FCC)
designed and ran an auction market to do part of that job. The radio spec-
trum licenses that it sold in that auction raised about $20 billion in gross
revenue. As part of the process of making room for those new licenses, the
FCC purchased TV broadcast rights for about $10 billion, and incurred
nearly $3 billion in costs to move other broadcasters to new TV channels.
Some 84 MHz of spectrum was made available in total, including 70 MHz for
wireless broadband and 14 MHz for unlicensed uses. This section describes
the processes that were used, and the role of AI and machine learning in
improving the underlying algorithms that supported this market.
Reallocating spectrum from one use to another is, in general, neither easy
nor straightforward, in either the planning or the implementation (Leyton-
Brown, Milgrom, and Segal 2017). Planning such a change can involve sur-
prisingly hard computational challenges, and the implementation requires
high levels of coordination. In particular, the reallocation of a portion
570 Paul R. Milgrom and Steven Tadelis
of the spectrum band that had been used for UHF broadcast television
required deciding how many channels to clear, which stations would cease
broadcasting (to make room for the new uses), what TV channels would
be assigned to the remaining stations that continued to broadcast, how to
time the changes to avoid interference during the transition, and to assure
that the TV tower teams, which would replace the old broadcast equipment,
had sufficient capacity, and so on. Several of the computations involved
are, in principle, nondeterministic polynomial time (NP)-hard, making this
a particularly complex market-design problem. One of the most critical
algorithms used for this process—the "feasibility checker"—was developed
with the aid of machine-learning methods.
But why reallocate and reassign TV stations at all? Broadcast television
changed enormously in the late twentieth century. In the early days of
television, all viewing was of over-the-air broadcasts using an analog
technology. Over the decades that followed, cable and satellite services
expanded so much that, by 2010, more than 90 percent of the US population
was reached by these alternative services. Standard-definition TV signals
were replaced by high-definition and, eventually, 4K signals. Digital
television and tuners reduced the importance of channel assignments, so that
the channel used by consumers/viewers did not need to match the channel used
by the broadcaster. Digital encoding made more efficient use of the band,
and it became possible to use multiplexing, so that what was once a single
standard-definition broadcast channel could carry multiple high-definition
broadcasts. Marginal spectrum had fallen in value compared to the
alternative uses.
Still, the reallocation from television broadcasting would be daunting and
beyond what an ordinary market mechanism could likely achieve. The signal
from each of thousands of TV broadcast towers across the United States
can interfere with potential uses for about 200 miles in every direction, so
all of the broadcasts in any frequency needed to be cleared to make the
frequencies available for new uses. Not only would it be necessary to
coordinate among different areas of the United States, but coordination with
Canada and Mexico would improve the allocation, too; most of the population
of Canada lives, and most of its TV stations operate, within 200 miles of
the US border. Because a frequency is not usable until virtually all of the
relevant broadcasters have ceased operation, efficiency would demand that
these changes be coordinated in time, too; they should be roughly
simultaneous. In addition, there needed to be coordination across
frequencies, because which channels will be cleared must be known in advance
before the frequencies can be efficiently divided between uplink uses and
downlink uses.
Among the many issues to be resolved, one would be how to determine
which stations would continue to broadcast after the transition. If the goal
were efficiency, then the problem could be formulated as maximizing the
total value of the TV stations that continue to broadcast after the auction.
Let N
be the set of all currently broadcasting TV stations and let S ⊆ N be a
subset of those TV stations. Let C be the set of available channels to which
to assign stations after the auction, and let ∅ denote the null assignment
for a station that does not continue to broadcast. A channel assignment is a
mapping A : N → C ∪ {∅}. The constraints on the channels available for
assignment are ones that rule out interference between pairs of TV stations,
taking the form A(n_1) = c_1 ⇒ A(n_2) ≠ c_2 for some (c_1, c_2) ∈ C^2. Each
such constraint is described by a four-tuple (n_1, c_1, n_2, c_2). There
were more than a million such constraints in the FCC's problem. A channel
assignment is feasible if it satisfies all the interference constraints; let
𝒜 denote the feasible set of assignments. A set of stations S′ can be
feasibly assigned to continue broadcasting, which we denote by S′ ∈ F(C),
if there exists some feasible channel assignment A ∈ 𝒜 such that ∅ ∉ A(S′).
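To make the notation concrete, here is a minimal Python sketch of these objects, using toy station names, channels, and constraints of our own invention (nothing here is FCC data): a checker that verifies whether a given channel assignment satisfies a list of interference four-tuples, and whether a set of stations avoids the null assignment.

```python
# Toy illustration of the formal objects above (made-up data, not FCC inputs).
# A constraint (n1, c1, n2, c2) forbids A(n1) == c1 together with A(n2) == c2.

NULL = None  # stands in for the null assignment ∅

def satisfies_constraints(assignment, constraints):
    """True if the channel assignment violates no interference four-tuple."""
    for n1, c1, n2, c2 in constraints:
        if assignment.get(n1) == c1 and assignment.get(n2) == c2:
            return False
    return True

def stays_on_air(assignment, stations):
    """True if no station in the set S' receives the null assignment."""
    return all(assignment.get(s) is not NULL for s in stations)

# Three stations, two channels, one pairwise constraint.
constraints = [("A", 1, "B", 1)]        # A and B may not both use channel 1
A = {"A": 1, "B": 2, "C": NULL}         # C ceases broadcasting
print(satisfies_constraints(A, constraints))  # True
print(stays_on_air(A, {"A", "B"}))            # True
```

A real instance would carry more than a million such four-tuples, which is why checking feasibility of a *candidate set of stations* (rather than a given assignment) is the hard part.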
Most of the interference constraints took a special form. Those constraints
assert that no two stations that are geographic neighbors can be assigned to
the same channel. Let us call such stations "linked" and denote the
relationship by (n_1, n_2) ∈ L. For such a pair of stations, the constraint
can be written as A(n_1) = A(n_2) ⇒ A(n_1) = ∅. These are the cochannel
interference constraints. One can think of (N, L) as defining a graph with
nodes N and arcs L. If the cochannel constraints were the only ones, then
determining whether S′ ∈ F(C) would amount to deciding whether there exists
a way to assign channels in C to the stations in N so that no two linked
nodes are on the same channel.
Figure 23.1 shows the graph of the cochannel interference constraints for
the United States and Canada. The constraint graph is most dense in the
eastern half of the United States and along the Pacific Coast.
In the special case of cochannel constraints, the problem of checking the
feasibility of a set of stations is a standard graph-coloring problem. The
problem is to decide whether it is possible to assign a color (channel) to
each node (station) in the graph so that no two linked nodes are given the
same color. Graph coloring is in the class of NP-complete problems, for
which there is no known algorithm that is guaranteed to be fast, and for
which it is commonly hypothesized1 that worst-case solution time grows
exponentially in the problem size. Since the general station assignment
problem includes the graph-coloring problem, it, too, is NP-complete, and
can become intractable at scales such as that of the FCC's problem.
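For the cochannel special case, the feasibility question can be sketched as a naive backtracking search for a proper coloring. The code below is only an illustration of the underlying question on toy data; the FCC's production checker handled the full constraint set with far more sophisticated, SAT-based methods (Leyton-Brown, Milgrom, and Segal 2017).

```python
# Naive backtracking feasibility checker for cochannel constraints only:
# can every station in `stations` get a channel so that no linked pair
# shares one? (Illustration only; the real checker was SAT-based.)

def feasible(stations, channels, links):
    """links: set of frozensets {n1, n2} that may not share a channel."""
    stations = list(stations)

    def extend(assignment, i):
        if i == len(stations):
            return True                     # every station assigned a channel
        s = stations[i]
        for c in channels:
            ok = all(assignment.get(t) != c
                     for t in stations[:i]
                     if frozenset((s, t)) in links)
            if ok:
                assignment[s] = c
                if extend(assignment, i + 1):
                    return True
                del assignment[s]           # backtrack
        return False

    return extend({}, 0)

# A triangle of mutually linked stations needs three channels:
links = {frozenset(p) for p in [("x", "y"), ("y", "z"), ("x", "z")]}
print(feasible({"x", "y", "z"}, [1, 2], links))    # False: two channels too few
print(feasible({"x", "y", "z"}, [1, 2, 3], links)) # True
```

Worst-case running time of this search grows exponentially with the number of stations, which is exactly the intractability the text describes.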
The problem that the FCC would ideally like to solve using an auction is
to maximize the value of the stations that remain on air to broadcast, given
the reduced set of channels C. If the value of station j is v_j, the problem
can be formulated as follows:

max_{S ∈ F(C)} Σ_{j∈S} v_j.
1. The standard computer science hypothesis that P ≠ NP implies that no fast algorithm exists for NP- complete problems.
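On a toy instance, this objective can be made concrete by exhaustive search over subsets and colorings. The sketch below uses made-up station values and cochannel constraints only; real instances are far too large for brute force, which is the point of the discussion that follows.

```python
from itertools import combinations, product

# Exhaustive illustration of max_{S in F(C)} sum of v_j on a toy instance
# (cochannel constraints only; made-up data).

def feasible(stations, channels, links):
    """Can every station get a channel with no linked pair sharing one?"""
    stations = list(stations)
    for colors in product(channels, repeat=len(stations)):
        A = dict(zip(stations, colors))
        if all(A[a] != A[b] for a, b in links if a in A and b in A):
            return True
    return False

def best_set(values, channels, links):
    """Search all subsets S for the feasible one maximizing total value."""
    stations = list(values)
    best, best_val = set(), 0
    for r in range(len(stations) + 1):
        for S in combinations(stations, r):
            if feasible(S, channels, links):
                val = sum(values[s] for s in S)
                if val > best_val:
                    best, best_val = set(S), val
    return best, best_val

# Three mutually linked stations but only two channels: at most two can stay.
values = {"x": 5, "y": 3, "z": 4}
links = [("x", "y"), ("y", "z"), ("x", "z")]
print(best_set(values, [1, 2], links))  # keeps x and z, total value 9
```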
Fig. 23.1 Cochannel interference graph for spectrum reallocation
This problem is very hard. Indeed, as we have just argued, even checking the
condition S ∈ F(C) is NP-complete, and solving the related optimization
exactly is even harder in practice. Computational experiments suggest that,
with weeks of computation, approximate optimization is possible, but with an
optimization shortfall that can be a few percent.
For a TV station owner, it would be daunting to formulate a bid in an
auction in which even the auctioneer, with all the bids in hand, would find
it challenging to determine the winners. Faced with such a problem, some
station owners might choose not to participate. That concern led the FCC
staff to prefer a strategy-proof design, in which the optimal bid for the
owner of a single station is relatively simple, at least in concept: compute
your station's value and bid that amount. As is well known, there is a
unique strategy-proof auction that optimizes the allocation and pays zero to
the losers: the Vickrey auction. According to the Vickrey rules, if the
auctioneer purchases the broadcast rights from station i, it must pay the
owner this price:
p_i = max_{S ∈ F(C)} Σ_{j∈S} v_j − max_{S ∈ F(C), i∈S} Σ_{j∈S, j≠i} v_j.

For a winning station i, the Vickrey price p_i will be larger than the
station's value. With roughly 2,000 stations to include in the optimization,
a 1 percent error in either of the two maximizations would result in a
pricing error for p_i equal to about 2,000 percent of the value of an
average station. Such huge potential pricing errors would likely raise
hackles among some of the potential bidders.
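The two maximizations in the Vickrey price can be checked on a toy instance: p_i is the unconstrained optimum minus the best total value the *other* stations can achieve when station i is forced to stay on the air. The brute-force sketch below uses made-up values and cochannel constraints only; it illustrates the formula, not the FCC's computation.

```python
from itertools import combinations, product

# Toy computation of the Vickrey price p_i (made-up data, brute force).

def feasible(stations, channels, links):
    stations = list(stations)
    for colors in product(channels, repeat=len(stations)):
        A = dict(zip(stations, colors))
        if all(A[a] != A[b] for a, b in links if a in A and b in A):
            return True
    return False

def opt(values, channels, links, must_include=None):
    """Max total value of a feasible on-air set, optionally forcing i into S."""
    best, stations = None, list(values)
    for r in range(len(stations) + 1):
        for S in combinations(stations, r):
            if must_include is not None and must_include not in S:
                continue
            if feasible(S, channels, links):
                val = sum(values[s] for s in S)
                best = val if best is None else max(best, val)
    return best

def vickrey_price(i, values, channels, links):
    unconstrained = opt(values, channels, links)
    others_when_i_stays = opt(values, channels, links, must_include=i) - values[i]
    return unconstrained - others_when_i_stays

values = {"x": 5, "y": 3, "z": 4}
links = [("x", "y"), ("y", "z"), ("x", "z")]  # triangle; only two channels
print(vickrey_price("y", values, [1, 2], links))  # 4, above y's value of 3
```

Here y is excluded from the optimum {x, z}, so its rights are purchased at a price (4) above its value (3), matching the claim that winners receive more than their values. The price is a difference of two near-equal maximizations, which is why small relative errors in either one translate into huge relative pricing errors.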
One way to put the problem of the Vickrey auction into sharp relief is to
imagine the letter that the FCC might write to broadcasters to encourage
their participation:
Dear Mr. Broadcaster:
We have heard your concerns about the complexity of the spectrum
reallocation process. You may even be unsure about whether to participate
or how much to bid. To make things as easy as possible for you, we have
adopted a Nobel Prize– winning auction procedure called the “Vickrey
auction.” In this auction, all you need to do is to tell us what your broad-
cast rights are worth to you. We'll figure out whether you are a winner and,
if so, how much to pay to buy your rights. The rules will ensure that it is in
your interest to report truthfully. That is the magic of the Vickrey auction!
The computations that we do will be very hard ones, and we cannot
guarantee that they will be exactly correct.
Such a letter would leave many station owners uncomfortable and unsure
about whether to participate. The FCC decided to adopt a different design.
What we describe here is a simplified version of the design, in which the
broadcasters' only choices are whether to sell their rights or to reject the
FCC's offer and continue to broadcast. Each individual broadcaster was
comforted by the assurance that it could bid this way, even if it had
additional options, too.2
In the simplified auction, each bidder i was quoted a price p_i(t) at each
round t of the auction, and this price decreased from round to round. In
each round, the bidder could "exit," rejecting the current price and keeping
its broadcast rights, or it could accept the current price. After a round t
of bidding, stations were processed one at a time. When station i was
processed, the auction software would use its feasibility checker to attempt
to determine whether it could feasibly assign station i to continue
broadcasting, given the other stations that had already exited and to which
a channel must be assigned. This is the generalized graph-coloring problem,
mentioned earlier. If the software timed out, or if it determined that it
was impossible to assign the station, then the station would become a winner
and be paid p_i(t − 1). Otherwise, its price would be reduced to p_i(t) and
it would exit or continue, according to the bidder's instructions. It would
be obvious to a station owner that, regardless of the pricing formula and of
how the software performed, its optimal choice when its value is v_i is to
exit if p_i(t) < v_i and otherwise to continue.3
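The round structure just described can be sketched as a short simulation. Everything below is a made-up illustration of the simplified design, not the FCC software: the station names, the price schedule, and the can_repack stub standing in for the real feasibility checker are all our own assumptions.

```python
# Sketch of the simplified descending clock auction: truthful bidders exit
# when the quoted price drops below their value; a station that can no longer
# be feasibly repacked is frozen as a winner at its previous price.

def run_clock_auction(values, price_schedule, can_repack):
    """
    values: station -> private value of continuing to broadcast
    price_schedule: station -> descending prices [p_i(0), p_i(1), ...]
    can_repack(exited, s): stand-in for the feasibility checker -- can s
        still get a channel given the stations that already exited?
    """
    active = set(values)            # still facing descending offers
    exited, winners = set(), {}     # exited stations keep broadcasting
    num_rounds = min(len(p) for p in price_schedule.values())
    for t in range(1, num_rounds):
        for s in sorted(active):    # stations processed one at a time
            if not can_repack(exited, s):
                winners[s] = price_schedule[s][t - 1]   # frozen: paid p_i(t-1)
                active.remove(s)
            elif price_schedule[s][t] < values[s]:
                exited.add(s)       # truthful exit: price fell below value
                active.remove(s)
        if not active:
            break
    return winners, exited

# Two linked stations and a single channel: only one can stay on the air.
values = {"a": 4, "b": 2}
schedule = {"a": [10, 6, 3, 1], "b": [10, 6, 3, 1]}
def can_repack(exited, s):
    return not {"a", "b"} <= exited | {s}   # a and b cannot both broadcast

winners, exited = run_clock_auction(values, schedule, can_repack)
print(winners, exited)  # {'b': 6} {'a'}
```

In the run above, station a exits once the price falls to 3 (below its value of 4); station b then cannot be repacked alongside a, so it is frozen as a winner at its previous price of 6. Note that b's optimal play never depended on the pricing formula or on how the checker performed, which is the obvious-strategy property the text emphasizes.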
The theory of clock auctions of this sort for problems with hard
computations has been developed by Milgrom and Segal (2017), who also report
simulations showing high performance in terms of efficiency and remarkably
low costs of procuring TV broadcast rights.
The performance of this descending auction design depends deeply on the
quality of the feasibility checker. Based on early simulations, our rough
estimate was that each 1 percent of failures in feasibility checking would
add about 1.5 percent—or about $150 million—to the cost of procuring the
broadcast rights. So, solving most of the problems very fast became a high
priority for the auction-design team.
As a theoretical proposition, any known algorithm for feasibility checking
in the spectrum-packing problem has worst-case performance that grows
exponentially in the size of the problem. Nevertheless, if we know the
distribution of likely problems, there can still be algorithms that are fast with
2. In the actual auction, some broadcasters also had the option to switch
from a UHF TV channel to a channel in the high VHF band, or one in the low
VHF band (the so-called HVHF and LVHF options).
3. The pricing formula that the FCC used for each station was p_i(t) =
(Pop_i · Links_i)^0.5 q(t). In this formula, q(t) is the "base clock price"
that scaled the price offers to all the bidders. This price began at a high
level q(0) to encourage participation, and it declined round-by-round during
the auction; Pop_i denotes the population of the area served by the station,
which stands in for the value of the station. By linking prices to
population served, the auctioneer is able to offer higher prices to valuable
stations in high-population areas that it might need to acquire for a
successful auction; Links_i measured the number of other stations to which
station i was linked