EconCS @ UC Berkeley
There have been two major paradigms in market price determination. One is the market equilibrium approach, which has been central to understanding the structure of an economy with many types of products and, when combined with realistic economic data, supports numerical evaluation of an economy's performance under different policies in the framework of Computable General Equilibrium. The other is auction protocols under the general framework of mechanism design, first developed for scarce goods and now widely used in e-commerce for goods of all kinds.
Though the two models study the same subject matter of pricing and allocation, they rest on different principles: one on market clearance, the other on truthful reporting of agents' private information. On the one hand, the market equilibrium approach is criticised from the mechanism design point of view because individual agents may misreport their utility functions. On the other hand, it is well known that truthful mechanisms such as the VCG protocol may fail to extract the full value of the market, which hinders their usefulness in practice.
In this talk, we consider a model where a central agent determines a market clearance price and allocations to the buyers based on their reported utility functions. We consider two issues: (1) assuming full rationality of all agents, what is the equilibrium induced by the agents' bid utilities, and (2) how far can agents' rationality be limited while a market equilibrium solution still holds?
We will discuss our framework for important market models: for Issue 1, the matching market; for Issue 2, the linear, Leontief, and Cobb-Douglas markets.
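As a concrete illustration of one of the market models named above, here is a minimal sketch (our illustration, not the talk's) of equilibrium computation in a Cobb-Douglas Fisher market: buyer i with budget B[i] has utility prod_j x[i][j]**a[i][j] with the shares a[i][j] summing to 1, and with unit supply of each good the clearing prices have the closed form p[j] = sum_i B[i]*a[i][j]. The budgets and shares below are hypothetical.

```python
# Equilibrium in a Cobb-Douglas Fisher market with unit supply of each good.
# A Cobb-Douglas buyer optimally spends the fraction a[i][j] of her budget on
# good j, so the market-clearing price of good j is p[j] = sum_i B[i]*a[i][j].

def cobb_douglas_equilibrium(B, a):
    n, m = len(B), len(a[0])
    prices = [sum(B[i] * a[i][j] for i in range(n)) for j in range(m)]
    demand = [[a[i][j] * B[i] / prices[j] for j in range(m)] for i in range(n)]
    return prices, demand

B = [100.0, 50.0]                    # hypothetical budgets
a = [[0.5, 0.5], [0.2, 0.8]]         # hypothetical expenditure shares
prices, demand = cobb_douglas_equilibrium(B, a)
# Market clearance: total demand for each good equals its unit supply.
for j in range(len(prices)):
    assert abs(sum(row[j] for row in demand) - 1.0) < 1e-9
```

The closed form is what makes the Cobb-Douglas case a natural testbed for the rationality questions in Issue 2: the clearing price is a simple linear function of the reported shares.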
In the combinatorial allocation problem, agents have combinatorial preferences over bundles of possible allocations. The example I will discuss in detail is business school course allocation, where students may have complex preferences over schedules of courses (e.g., two desirable courses could meet at the same time). Other examples of combinatorial allocation include scheduling workers to shifts and scheduling airlines to landing slots.
I will begin by giving the intuition why conventional approaches to this problem fail. I will then introduce a mechanism that implements a fair allocation while being approximately truthful and efficient. Unfortunately, implementing the mechanism requires minimizing a highly discontinuous function in ~100-dimensional space. I will detail the two-stage search technique we used to solve this difficult problem. At the master level, the center uses tabu search over the union of two distinct neighborhoods to suggest prices to the agents; at the agent level, we use MIPs to solve for student demands in parallel at the current prices. Our method scales near-optimally in the number of processors used and is able to solve realistic-size problems fast enough to be used in practice.
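The two-stage structure can be sketched in miniature. This toy version (our illustration, not the Wharton implementation) uses a single price-perturbation neighborhood rather than the union of two neighborhoods, brute-force enumeration of bundles in place of MIPs, and sequential rather than parallel demand solves; all parameters are hypothetical.

```python
import itertools
import random

# Toy two-stage search for approximate market-clearing course prices:
# the master proposes prices and tabu-searches to reduce excess demand;
# each student's demand at given prices is her best affordable schedule.

random.seed(0)
N_STUDENTS, N_COURSES, CAPACITY, BUDGET = 8, 4, 3, 100.0
values = [[random.uniform(0, 50) for _ in range(N_COURSES)]
          for _ in range(N_STUDENTS)]

def demand(prices, vals):
    """Best affordable bundle (brute force here; a MIP at realistic scale)."""
    best, best_u = (), 0.0
    for k in range(N_COURSES + 1):
        for bundle in itertools.combinations(range(N_COURSES), k):
            if sum(prices[c] for c in bundle) <= BUDGET:
                u = sum(vals[c] for c in bundle)
                if u > best_u:
                    best, best_u = bundle, u
    return best

def clearing_error(prices):
    """Squared over-demand; zero means the prices clear the market."""
    counts = [0] * N_COURSES
    for vals in values:          # these agent-level solves run in parallel at scale
        for c in demand(prices, vals):
            counts[c] += 1
    return sum(max(0, counts[c] - CAPACITY) ** 2 for c in range(N_COURSES))

prices = [10.0] * N_COURSES
best_prices, best_err = prices, clearing_error(prices)
tabu = {tuple(prices)}
for _ in range(200):             # master-level tabu search over price vectors
    neighbors = [tuple(p + (5.0 if j == c else 0.0) * s
                       for j, p in enumerate(prices))
                 for c in range(N_COURSES) for s in (+1, -1)]
    candidates = [q for q in neighbors if q not in tabu and min(q) >= 0.0]
    if not candidates:
        break
    prices = list(min(candidates, key=clearing_error))
    tabu.add(tuple(prices))
    err = clearing_error(prices)
    if err < best_err:
        best_prices, best_err = prices, err
    if best_err == 0:
        break
```

The tabu list lets the master move through locally worse price vectors, which matters because the clearing-error objective is highly discontinuous in prices.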
This approach recently won a competition at the Wharton School to serve as their new course selection mechanism starting in Spring 2013. The practical success of the mechanism is exciting for two reasons beyond business schools. First, it offers a wide-ranging solution for repugnant market design with complex preferences. Second, it suggests a highly parallelizable approach to solving large-scale optimization problems.
Joint work with Eric Budish (Chicago Booth).
Feb 28 | Isabelle Stanton (UC Berkeley) on The Structure, Efficacy and Manipulation of Double-Elimination Tournaments.
A double-elimination (DE) tournament is a competition where no participant is eliminated until they have lost two matches. It is structured as two single-elimination tournaments: the winner bracket and the loser bracket. Players who lose once in the winner bracket are mapped to positions in the loser bracket, according to a mapping called the link function. Surprisingly, although the same structure of the winner and loser brackets is used universally, there is no standard definition of the link function. By investigating several design goals, we show that the functions used in practice are not optimal and propose a similar function that is optimal with respect to avoiding repeated match-ups. We empirically show that use of the new link function does not impact the ability of a DE tournament to select a strong winner.
Given our definitions, we address the manipulability of DE tournaments. We show that they are vulnerable to manipulation by a coalition of players who can improve their chance of winning by throwing matches. We also discuss the computational complexity of manipulation by a tournament organizer (agenda control) in two settings: by changing the player seeding in the winner bracket, or by picking the mapping of losers to the loser bracket. We provide algorithms, hardness proofs, and we formulate open problems for future research.
Finally, we empirically compare single and double-elimination tournaments in terms of the probability that the strongest player wins the tournament and show that this probability can be drastically higher in DE tournaments, confirming the intuition that DE tournaments are more robust than SE tournaments.
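The empirical comparison can be reproduced in miniature. The sketch below (our illustration, not the paper's experiments) simulates a four-player single- and double-elimination bracket under a Bradley-Terry noise model, where player i beats j with probability s_i/(s_i+s_j); the strengths are hypothetical.

```python
import random

# Monte Carlo estimate of the chance that the strongest of four players
# wins a single- vs a double-elimination tournament.

random.seed(1)
STRENGTH = [5.0, 1.0, 1.0, 1.0]      # player 0 is the clear favorite

def beats(i, j):
    return random.random() < STRENGTH[i] / (STRENGTH[i] + STRENGTH[j])

def single_elim(p):
    a = p[0] if beats(p[0], p[1]) else p[1]
    b = p[2] if beats(p[2], p[3]) else p[3]
    return a if beats(a, b) else b

def double_elim(p):
    # Winner-bracket round 1; losers drop to the loser bracket.
    w1, l1 = (p[0], p[1]) if beats(p[0], p[1]) else (p[1], p[0])
    w2, l2 = (p[2], p[3]) if beats(p[2], p[3]) else (p[3], p[2])
    lb = l1 if beats(l1, l2) else l2                   # loser-bracket round 1
    wb, wl = (w1, w2) if beats(w1, w2) else (w2, w1)   # winner-bracket final
    lb = lb if beats(lb, wl) else wl                   # loser-bracket final
    # Grand final: the loser-bracket survivor must beat the winner-bracket
    # champion twice, since the champion has not yet lost a match.
    if beats(lb, wb) and beats(lb, wb):
        return lb
    return wb

TRIALS = 20000
p_se = sum(single_elim([0, 1, 2, 3]) == 0 for _ in range(TRIALS)) / TRIALS
p_de = sum(double_elim([0, 1, 2, 3]) == 0 for _ in range(TRIALS)) / TRIALS
assert p_de > p_se       # the double-elimination format is more robust
```

With these strengths the favorite wins the SE bracket about 69% of the time ((5/6)^2 exactly) but wins the DE bracket noticeably more often, since a single upset no longer eliminates him.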
This paper examines investor reaction to stale information using a novel data set containing a time-stamped transcript of the financial news network CNBC. I measure changes in stock price and trading volume at the precise time that a company is mentioned on CNBC in the 24 hours following a corporate news event, and find strong evidence that some investors react to stale news. There is a significant increase in stock price at the precise time that a company is mentioned on CNBC following a positive news event. Surprisingly, there is also a significant increase in stock price at the precise time that a company is mentioned on CNBC following a negative news event. This puzzle is not explained by observable differences between positive and negative news events or their subsequent mentions. Evidence using cross-sectional variation in the number of positive and negative words suggests that media attention can inflate asset prices in the presence of short-sale constraints as investors with the most optimistic valuations are able to buy while those with the most pessimistic valuations are unable to sell short.
An outdated draft of the paper.
I will present auction and optimization problems arising in ad exchanges, and also discuss pricing problems for user data on the web.
Truthfulness is fragile and demanding. It is oftentimes computationally harder than solving the original problem. Even worse, truthfulness can be utterly destroyed by small uncertainties in a mechanism's outcome. One obstacle is that truthful payments depend precisely on outcomes other than the one realized, such as the lengths of non-shortest paths in a shortest-path auction. Single-call mechanisms are a powerful tool that circumvents this obstacle --- they implicitly charge truthful payments, guaranteeing truthfulness in expectation using only the outcome realized by the mechanism. The cost of such truthfulness is a trade-off between the expected quality of the outcome and the risk of large payments.
We largely settle when and to what extent single-call mechanisms are possible. The first single-call construction was discovered by Babaioff, Kleinberg, and Slivkins [BKS10] in single-parameter domains. They give a transformation that turns any monotone, single-parameter allocation rule into a truthful-in-expectation single-call mechanism. Our first result is a natural complement to [BKS10]: we give a new transformation that produces a single-call VCG mechanism from any allocation rule for which VCG payments are truthful. Second, in both the single-parameter and VCG settings, we precisely characterize the possible transformations, showing that a wide variety of transformations are possible but that all take a very simple form. Finally, we study the inherent trade-off between the expected quality of the outcome and the risk of large payments. We show that our construction and [BKS10] simultaneously optimize a variety of metrics in their respective domains.
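The implicit-payment idea can be made concrete in a one-bidder toy. The sketch below is a simplified illustration in the spirit of the [BKS10] single-parameter construction, not their exact transformation: the mechanism calls the allocation rule exactly once, on a bid that is occasionally resampled downward, and the payment charged is truthful only in expectation. The reserve-price allocation rule and all parameters are hypothetical.

```python
import random

# Single-call payments for a monotone single-parameter allocation rule.
# Target: the Myerson payment p(b) = b*x(b) - integral_0^b x(z) dz,
# achieved in expectation using only one call to alloc().

random.seed(2)
RESERVE = 0.5
def alloc(b):                        # a monotone allocation rule: sell iff b >= r
    return 1.0 if b >= RESERVE else 0.0

def single_call_round(b, mu=0.1):
    """One run: a single call to alloc(), plus the payment actually charged."""
    if random.random() < mu:         # resample the bid downward, pay a rebate
        z = b * random.random()      # z ~ Uniform(0, b)
        x = alloc(z)
        return x, -b * x / mu        # rebate scaled by 1/mu (unbiasedness)
    x = alloc(b)                     # otherwise run as bid, charge scaled price
    return x, b * x / (1.0 - mu)

# For b >= RESERVE and this alloc(), the Myerson payment is
# p(b) = b - (b - RESERVE) = RESERVE; the single-call scheme matches it
# in expectation, at the cost of occasional large (here negative) payments.
b = 0.8
pay = sum(single_call_round(b)[1] for _ in range(200000)) / 200000
assert abs(pay - RESERVE) < 0.02
```

The 1/mu scaling is exactly where the trade-off mentioned above lives: shrinking mu degrades the outcome less often but inflates the magnitude of the occasional rebate.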
Our study is motivated by settings where uncertainty in a mechanism renders other known techniques untruthful. As an example, we analyze pay-per-click advertising auctions, where the truthfulness of the standard VCG-based auction is easily broken when the auctioneer's estimated click-through-rates are imprecise.
Mar 27 | Spring break
Apr 3 | Kamal Jain (eBay) on Equilibrium Pricing of Semantically Substitutable Digital Goods.
We extend the Arrow-Debreu market equilibrium existence theorem to the digital economy. Our economic model includes all conventional goods as well as digital goods of mass consumption, such as movies and songs. We also show that the proposed extension preserves the important First and Second Welfare Theorems of the conventional economy.
Joint work with Prof. Vijay V. Vazirani. Paper at http://arxiv.org/pdf/1007.4586.pdf
Apr 10 | Giorgos Zervas (Yale) on The Groupon Effect on Yelp Ratings: A Root Cause Analysis.
Daily deals sites such as Groupon offer deeply discounted goods and services to tens of millions of customers through geographically targeted daily e-mail marketing campaigns. In our prior work we observed that a negative side effect for merchants using Groupons is that, on average, their Yelp ratings decline significantly. However, this previous work was essentially observational, rather than explanatory. In this work, we rigorously consider and evaluate various hypotheses about underlying consumer and merchant behavior in order to understand this phenomenon, which we dub the Groupon effect. We use statistical analysis and mathematical modeling, leveraging a dataset we collected spanning tens of thousands of daily deals and over 7 million Yelp reviews. In particular, we investigate hypotheses such as whether Groupon subscribers are more critical than their peers, or whether some fraction of Groupon merchants provide significantly worse service to customers using Groupons. We suggest an additional novel hypothesis: reviews from Groupon subscribers are lower on average because such reviews correspond to real, unbiased customers, while the body of reviews on Yelp contains some fraction of reviews from biased or even potentially fake sources. Although we focus on a specific question, our work provides broad insights into both consumer and merchant behavior within the daily deals marketplace.
Link to paper: http://arxiv.org/abs/1202.2369
Brief Bio: Giorgos Zervas is a Simons Postdoctoral Fellow at Yale University, and an Affiliate at the Center for Research on Computation and Society (CRCS) at Harvard University. He earned his PhD in Computer Science from Boston University under the supervision of John Byers and Michael Mitzenmacher. He is broadly interested in problems lying in the intersection of computer science, economics, and marketing.
We study the extent to which simple auctions can simultaneously achieve good revenue and efficiency guarantees in single-item settings. Motivated by the optimality of the second price auction with monopoly reserves when the bidders' values are drawn i.i.d. from regular distributions [Myerson '81], and its approximate optimality when they are drawn from independent regular distributions [Hartline Roughgarden '09], we focus our attention on the second price auction with general (not necessarily monopoly) reserve prices, arguably one of the simplest and most intuitive auction formats. As our main result, we show that for a carefully chosen set of reserve prices this auction guarantees at least 20% of both the optimal welfare and the optimal revenue, when the bidders' values are distributed according to independent, not necessarily identical, regular distributions. We also prove a similar guarantee, when the values are drawn i.i.d. from a---possibly irregular---distribution.
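A quick Monte Carlo sanity check (our illustration, not from the paper) of the kind of guarantee described above: a second price auction with a reserve captures a constant fraction of both optimal welfare and optimal revenue. Values here are i.i.d. Exponential(1), a regular distribution whose monopoly reserve is r* = 1 (the virtual value v - (1-F(v))/f(v) = v - 1 crosses zero at 1); the trial count and bidder count are arbitrary choices.

```python
import random

# Second price auction with reserve, i.i.d. Exponential(1) values:
# estimate achieved welfare and revenue against the optimal welfare.

random.seed(3)
N_BIDDERS, TRIALS, RESERVE = 2, 100000, 1.0
welfare = revenue = opt_welfare = 0.0
for _ in range(TRIALS):
    v = sorted(random.expovariate(1.0) for _ in range(N_BIDDERS))
    opt_welfare += v[-1]
    if v[-1] >= RESERVE:                   # item sells to the highest bidder,
        welfare += v[-1]
        revenue += max(v[-2], RESERVE)     # who pays max(second bid, reserve)
welfare, revenue, opt_welfare = (x / TRIALS
                                 for x in (welfare, revenue, opt_welfare))
# Since optimal revenue never exceeds optimal welfare, the ratio
# revenue/opt_welfare also lower-bounds revenue/opt_revenue.
assert welfare / opt_welfare > 0.2
assert revenue / opt_welfare > 0.2
```

In the i.i.d. regular case the monopoly reserve is in fact revenue-optimal [Myerson '81]; the interesting regime in the talk is the non-identical case, where the reserves must be chosen more carefully.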
This is joint work with Costis Daskalakis (MIT).
Apr 24 | Moshe Babaioff (Microsoft) on Peaches, Lemons, and Cookies: Designing Auction Markets with Dispersed Information.
This paper studies the role of information asymmetries in second price, common value auctions. Motivated by information structures that arise commonly in applications such as online advertising, we seek to understand what types of information asymmetries lead to substantial reductions in revenue for the auctioneer. One application of our results concerns online advertising auctions in the presence of "cookies", which allow individual advertisers to recognize advertising opportunities for users who, for example, are customers of their websites. Cookies create substantial information asymmetries both ex ante and at the interim stage, when advertisers form their beliefs. The paper proceeds by first introducing a new refinement, which we call "tremble robust equilibrium" (TRE), which overcomes the problem of multiplicity of equilibria in many domains of interest.
Second, we consider a special information structure, where only one bidder has access to superior information, and show that the seller's revenue in the unique TRE is equal to the expected value of the object conditional on the lowest possible signal, no matter how unlikely it is that this signal is realized. Thus, if cookies identify especially good users, revenue may not be affected much, but if cookies can (even occasionally) be used to identify very poor users, the revenue consequences are severe. In the third part of the paper, we study the case where multiple bidders may be informed, providing additional characterizations of the impact of information structure on revenue. Finally, we consider richer market designs that ensure greater revenue for the auctioneer, for example by auctioning the right to participate in the mechanism.
We present a new model of crowdsourcing, in which a principal seeks production of a single good from a number of potential producers within a limited time frame. There is a hard deadline, after which any good produced has no value to the principal. The output of each producer is a stochastic function of effort expended, which, in light of the deadline, may warrant simultaneous production of multiple goods, despite there being no value for extra goods produced. This motivates crowdsourcing as a model of procurement. We address efficient execution of crowdsourcing from a social planner's perspective, taking into account the value to the principal and the costs to producers (modeled as effort expenditure), while contending with self-interest on the part of all players. A solution to this problem involves an algorithmic aspect that determines an optimal effort level for each producer given the principal's value, and an incentive mechanism that achieves implementation of the socially optimal policy in equilibrium. In contrast to popular "winner take all" contests, the efficient mechanism we propose involves a payment to every producer that expends non-zero effort in the efficient policy.
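A stylized version of the planner's problem (our illustration, with a hypothetical success technology, not the paper's model): suppose the principal earns value V if at least one producer succeeds before the deadline, producer effort e yields success probability 1 - exp(-e), and effort costs c per unit. Welfare then depends only on total effort E, W(E) = V*(1 - exp(-E)) - c*E, which is maximized where the marginal success value V*exp(-E) equals the marginal cost c, i.e. at E* = ln(V/c).

```python
import math

# Socially optimal total effort in a toy deadline-procurement model:
# W(E) = V * (1 - exp(-E)) - c * E, maximized at E* = ln(V / c).

V, c = 10.0, 2.0   # hypothetical principal value and unit effort cost

def welfare(E):
    return V * (1.0 - math.exp(-E)) - c * E

# Grid search recovers the closed-form optimum.
grid = [i / 1000.0 for i in range(5001)]
E_grid = max(grid, key=welfare)
E_star = math.log(V / c)
assert abs(E_grid - E_star) < 1e-2
```

If each producer's effort is capped at some level, the optimal total effort E* must be split across several producers, which is one way to see why efficiency requires simultaneous production, and why every producer exerting effort in the efficient policy is compensated.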
Joint work with Ruggiero Cavallo.