
03 April 2014

Higher food prices reduce poverty and inequality

Standard microeconomic methods consistently suggest that, in the short run, higher food prices increase poverty in developing countries. In contrast, macroeconomic models that allow for an agricultural supply response and consequent wage adjustments suggest that the poor ultimately benefit from higher food prices. In this paper we use international data to systematically test the relationship between changes in domestic food prices and changes in poverty. We find robust evidence that in the long run (one to five years) higher food prices reduce poverty and inequality. The magnitudes of these effects vary across specifications and are not precisely estimated, but they are large enough to suggest that the recent increase in global food prices has significantly accelerated the rate of global poverty reduction. The policy implications of these findings are therefore nuanced: short-run social protection is justified in the face of high food price volatility, but passing on higher prices to producers in the long run is an important means of reducing poverty in the poorest countries.

12 March 2013

The value of internet search engines

MEASURING the value of a good is much trickier than measuring the cost, since value inherently involves consideration of a hypothetical: what would your life be like without that good? 
Economists commonly use two measures to assign monetary value to some good or service: the "compensating variation" and the "equivalent variation". The compensating variation asks how much money we would have to give a person to make up for taking the good away from them while the equivalent variation asks how much money someone would give up to acquire the good in question. The term "consumer surplus" refers to an approximation to these theoretically ideal measures.
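As a toy illustration (not from the original post), the two measures can be computed for a hypothetical consumer with Cobb-Douglas preferences facing a price increase; the functional form and all numbers below are assumptions made for the sketch:

```python
# Toy example: compensating vs. equivalent variation for a
# hypothetical Cobb-Douglas consumer with u(x, y) = x**a * y**(1-a).
# All parameter values are illustrative assumptions.

def indirect_utility(px, py, m, a):
    """Highest utility attainable at prices (px, py) with income m."""
    return m * (a / px) ** a * ((1 - a) / py) ** (1 - a)

def expenditure(px, py, u, a):
    """Minimum spending needed to reach utility u at prices (px, py)."""
    return u * (px / a) ** a * (py / (1 - a)) ** (1 - a)

a, py, m = 0.5, 1.0, 100.0
px_old, px_new = 1.0, 2.0          # the price of x doubles

u_old = indirect_utility(px_old, py, m, a)
u_new = indirect_utility(px_new, py, m, a)

# CV: extra money needed at the NEW prices to restore the OLD utility.
cv = expenditure(px_new, py, u_old, a) - m
# EV: money the consumer would give up at the OLD prices to avoid the change.
ev = m - expenditure(px_old, py, u_new, a)

print(f"CV = {cv:.2f}, EV = {ev:.2f}")
```

With income effects present the two measures differ (here CV ≈ 41.42 while EV ≈ 29.29), which is why consumer surplus is described as an approximation rather than an exact answer.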
If we want to estimate "the value of Google search" we have to look at both the commercial and non-commercial aspects of search: users are searching for answers to questions (some of which are commercial in nature) and advertisers are searching for customers for (mostly) commercial transactions. So it is useful to break the problem up into two pieces: the value of ads to advertisers and publishers, and the value of search results to users.
Suppose Google were to disappear tomorrow. In the first instance, advertisers and publishers would lose a lot of visitors to their web sites. How much are those lost visitors worth to them? This is the question I tried to answer in the "value of Google" study. The tricky part is working out what those web site visitors are worth to advertisers, but it turns out there is a way to infer that value from advertisers' bidding behavior. This allows us to get a back-of-the-envelope estimate of the immediate loss in value from Google vanishing.
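That bidding-based inference can be sketched as follows. In (roughly) truthful ad-auction formats, an advertiser's bid approximates the value it places on a click, so advertiser surplus is bid minus price paid, summed over clicks. The campaign figures below are invented for the illustration:

```python
# Back-of-the-envelope advertiser surplus from ad clicks.
# A bid is treated as an approximation of value per click; surplus
# per click is then value minus price paid. All numbers are invented.

campaigns = [
    # (bid $/click ~ value, price paid $/click, clicks received)
    (2.50, 1.80, 10_000),
    (1.20, 0.90, 25_000),
    (0.60, 0.45,  5_000),
]

surplus = sum((bid - price) * clicks for bid, price, clicks in campaigns)
print(f"estimated advertiser surplus: ${surplus:,.0f}")
```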
Of course, "if Google did not exist, man would have to invent it". So we would expect that as the weeks went on, users would start to use other search engines, and advertisers would spend advertising dollars in different ways, and publishers would find other ways to get ad dollars for their web sites. 
So the long-run loss in value would be substantially less than the immediate impact. Ultimately the lost value would be the difference between Google and the next best advertising alternative, but that would be almost impossible to estimate given the available data.
Turning to the user side, we drew on the work of Yan Chen, Grace YoungJoo Jeon and Yong-Mi Kim of the University of Michigan to estimate the value of online search in general.
Some of us are old enough to remember what life was like before search engines. We had to look through a pile of reference books to find the answers to basic questions. Even small questions, like how to spell a word, or whether it was likely to rain the next day, required some effort to answer. Even trivia was hardly trivial: finding obscure facts involved substantial research.
So one way to measure the value of online search would be to measure how much time it saves us compared to the methods we used in the bad old days before Google. Based on a random sample of Google queries, the UM researchers found that answering a question using the library took about 22 minutes, while answering it using Google took 7 minutes: a saving of roughly 15 minutes per question. (This calculation ignores the cost of actually getting to the library, which in some cases was quite substantial. The UM authors also looked at questions posed to reference librarians and got a similar estimate of time saved.)
I attempted to convert this time to dollar savings using the average wage and came up with about $500 per adult worker per year. This may seem like a lot, but it works out to just $1.37 a day. I would guess that most readers of this blog get $1.37 worth of value per day out of their search engine use.
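The conversion from the annual figure to the daily one is simple arithmetic; the 15-minute saving comes from the UM study and the $500 figure is Varian's estimate, reproduced here only to show the division:

```python
# Converting the annual value estimate to a daily figure.
minutes_saved_per_query = 22 - 7   # library vs. search engine, from the UM study
annual_value = 500                 # estimated $ per adult worker per year
daily_value = annual_value / 365

print(f"{minutes_saved_per_query} minutes saved per query; "
      f"${daily_value:.2f} per day")
```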
When doing this calculation, it is important to take account of the fact that since the cost of getting answers is now so low, we ask a lot more questions. When getting an answer involved a trip to the library and 22 minutes to answer an average question, we only attempted to get answers to important questions. Now that it involves only a few minutes at a search engine to answer questions, we ask many more—and a lot less valuable—questions.
Said another way, we wouldn't even bother to go to the library unless we were willing to spend at least 22 minutes (on average) finding the answer. Now that it takes us only a few seconds or minutes to get an answer, we ask a lot of frivolous questions (along with the important ones, of course).
Taking this effect into account involves estimating a "demand curve for questions" as a function of the "cost of getting answers". I don't know any serious research on this topic, so I made a rough approximation to that demand curve and came up with the $1.37 a day figure. It could be larger or it could be smaller, but I think that is the right order of magnitude.
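The shape of that adjustment can be sketched with an assumed linear demand curve for questions: as the time cost of an answer falls, the number of questions asked rises, but the marginal question is worth less, so the value created is the area under the demand curve rather than the full per-question saving times the new volume. Everything below is an illustrative assumption, not Varian's actual calculation:

```python
# Why cheap answers inflate query counts: with a downward-sloping
# demand curve for questions, the marginal question asked at a
# 2-minute cost is worth far less than one asked at a 22-minute cost.
# The linear demand curve and all numbers are illustrative assumptions.

CHOKE_COST = 30.0   # minutes: above this cost, nobody bothers to ask
Q_MAX = 10.0        # questions per day asked if answers were free

def questions_per_day(cost):
    return max(0.0, Q_MAX * (1 - cost / CHOKE_COST))

def surplus(cost):
    """Area under the demand curve above `cost` (minutes of value/day)."""
    return Q_MAX * (CHOKE_COST - cost) ** 2 / (2 * CHOKE_COST)

naive = questions_per_day(2) * (22 - 2)   # every query valued at the full saving
adjusted = surplus(2)                     # marginal queries valued less
print(f"naive: {naive:.0f} min/day, demand-adjusted: {adjusted:.0f} min/day")
```

The demand-adjusted figure always comes in below the naive one, which is the direction of the correction described above.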
There are many other ways to estimate the value of the internet and the services it provides. However, to the extent that they are based on current practices, they likely underestimate the long-term value of the internet. 
It is now possible for everyone on the planet to have access to all the information humans have ever produced.  The barriers to this utopian dream are not technological, but legal and economic.  When we manage to solve these problems, we will be able to unlock vast pools of human potential that have hitherto been inaccessible.  In the future this will be viewed as a turning point in human history, and economic advances generated by global access to all information will be recognised as the true value of the internet.

20 January 2013


ON THE face of it, economics has had a dreadful decade: it offered no prediction of the subprime or euro crises, and only bitter arguments over how to solve them. But alongside these failures, a small group of the world’s top microeconomists are quietly revolutionising the discipline. Working for big technology firms such as Google, Microsoft and eBay, they are changing the way business decisions are made and markets work.
Take, for example, the challenge of keeping costs down. An important input for a company like Yahoo! is internet bandwidth, which is bought at group level and distributed via an internal market. Demand for bandwidth is quite lumpy, with peaks and troughs at different times of the day. This creates a problem: because spikes in demand must be met, firms run with costly spare capacity much of the time.
This was one of the first questions that Preston McAfee, a former California Institute of Technology professor, looked at when he arrived at Yahoo! in 2007. Mr McAfee, who now works for Google, found that uses of bandwidth fall into two categories: urgent (displaying a web page) and delayable (backups and archiving). He showed how a two-part tariff (high prices when demand peaks, low ones otherwise) could shift less time-sensitive tasks to night-time, allowing Yahoo! to use costly bandwidth more efficiently.
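The two-part tariff idea can be sketched in a few lines: charge a high internal price for bandwidth at peak and a low one at night, and let the delayable jobs respond. Task names, prices and volumes below are invented for the illustration:

```python
# Sketch of peak/off-peak pricing for an internal bandwidth market.
# Delayable jobs (backups, archiving) migrate to the cheap night
# window; urgent jobs (serving pages) stay put.
# All names and numbers are illustrative assumptions.

PEAK_PRICE, OFFPEAK_PRICE = 10, 2   # internal price per unit of bandwidth

tasks = [
    # (name, bandwidth units, delayable?)
    ("serve_pages", 60, False),
    ("backups",     30, True),
    ("archiving",   20, True),
]

# Before pricing: everything runs during the daytime peak.
before = sum(units for _, units, _ in tasks)

# After pricing: a delayable task facing a cheaper night price shifts
# off-peak, so only urgent tasks contribute to peak demand.
after = sum(units for _, units, delayable in tasks
            if not (delayable and OFFPEAK_PRICE < PEAK_PRICE))

print(f"peak bandwidth demand: {before} -> {after} units")
```

The capacity the firm must hold is set by the peak, so flattening it is where the savings come from.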
The solution—two types of task, two prices—has intuitive appeal. But economists’ ideas on how to design markets can seem puzzling at first. One example is the question of how much detail an online car auctioneer should reveal about the condition of the vehicles on offer. Common sense would suggest some information—a car’s age and mileage—is essential, but that total transparency about other things (precise details on subpar paintwork) might deter buyers, lowering the auctioneer’s commissions. Academic theory suggests otherwise: in some types of auction more information always raises revenues.
To test the idea, Steve Tadelis of the University of California at Berkeley (now also working for eBay) and Florian Zettelmeyer of Northwestern University set up a trial, randomly splitting 8,000 cars into two groups. The first group were auctioned with standard information, including age and mileage. The second had a detailed report on the car’s paintwork. The results were striking: cars in the second group had better chances of a sale and sold for higher prices. This effect was most pronounced for cars in poorer condition: the probability of a sale rose by 23%, with prices up by 5%. The extra information meant that buyers were able to spot the type of car they wanted. Competition for cars rose, even the scruffier ones.
But more information is not always better. Studies show that shoppers overwhelmed by choice may simply walk away. Mr Tadelis tested whether it would be better to tailor eBay’s auctions to users’ experience level. The options for new users were narrowed by removing sellers who are harder to assess (for example, those with less-than-perfect feedback on things like shipping times). When new users had a simpler list of sellers to choose from, the number of successful auctions rose and buyers were more likely to use eBay again. Tailoring the market meant gains for buyers, sellers and eBay.
The desire to use theory to challenge conventional thinking is one reason economists are valuable to firms, says Susan Athey, of Stanford University and Microsoft. When Ms Athey arrived at the software giant in 2007 it faced what was seen as an unavoidable trade-off: online advertising was good for revenues, but too much would deter users. If advertisers gained, users would lose. But economic theory challenges this, showing that if firms are dealing with two groups (advertisers and users, say), making one better off often benefits the other too.
Ms Athey and Microsoft’s computer scientists put that theory to work. One idea was to toughen the algorithm that determines whether an ad is shown. This means ads are displayed fewer times, so advertisers lose out in the short term. But in the longer run, other forces come into play. More relevant ads improve the user experience, so user numbers rise. And better-targeted ads mean more users click on the advert, even if it is shown less often. Empirical evidence showed that although advertisers would respond only after some time, the eventual gain was worth the wait. Microsoft made the change.
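The mechanics of a stricter relevance threshold can be sketched simply: ads scoring below the bar are not shown, so impressions fall immediately while the click-through rate of what remains rises. The scores and click rates below are invented for the illustration:

```python
# Sketch of tightening an ad-relevance threshold. Ads below the
# quality bar are not shown at all. All numbers are invented.

ads = [
    # (relevance score, clicks per 1000 impressions if shown)
    (0.9, 40), (0.7, 25), (0.5, 12), (0.3, 4), (0.2, 2),
]

def outcome(threshold):
    shown = [(score, ctr) for score, ctr in ads if score >= threshold]
    impressions = 1000 * len(shown)   # assume each shown ad gets 1000 views
    total_clicks = sum(ctr for _, ctr in shown)
    return impressions, total_clicks

loose_imp, loose_clicks = outcome(0.2)    # lenient threshold
strict_imp, strict_clicks = outcome(0.5)  # toughened threshold

print(f"impressions: {loose_imp} -> {strict_imp}, "
      f"CTR: {1000 * loose_clicks / loose_imp:.1f} -> "
      f"{1000 * strict_clicks / strict_imp:.1f} clicks per 1000")
```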
Microeconomists have their sights on problems outside their home turf too. At the moment the policies picked by central banks and finance ministries are based on old news, since things like GDP, inflation and unemployment are measured with long lags. A team at Google headed by its chief economist, Hal Varian, is using search-engine data to provide more timely measures. Search terms like “job”, “benefits” and “solitaire” are closely correlated with unemployment claims (see chart). These types of relationship help construct new indexes that offer a real-time picture of the economy. If policymakers start to use these in a systematic way, their decisions could be based on how the economy looked yesterday, rather than months ago.
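The mechanics of such an index are straightforward: take a weekly search-volume series and check how it co-moves with officially reported claims. The two series below are synthetic stand-ins for illustration, not Google data:

```python
# Correlating a search-volume index with unemployment claims.
# Both weekly series are synthetic stand-ins for the illustration.

searches = [80, 95, 110, 130, 125, 140, 160, 155]    # index for "job", "benefits"
claims   = [300, 320, 360, 410, 400, 430, 480, 470]  # thousands of claims

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

r = pearson(searches, claims)
print(f"correlation: {r:.3f}")
```

A strongly correlated series that arrives in real time is what makes the search index useful as a nowcasting input.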