Friday, 30 December 2016

The battle for shelf space: Lewis Road Creamery vs. Fonterra

One of my favourite discussion points in the pricing strategy topic of ECON100 has little to do with pricing at all. It is about making the business environment difficult for your competitors by competing with yourself. The principle is simple: if you produce a wide variety of similar substitute products and stock them on supermarket shelves, then there is less shelf space left available for competing products (meaning those made by your competitors). By competing with yourself, you make it more difficult for any other firm to do so, while giving the consumer the impression that there are a lot of competing brands. If you doubt this happens, I suggest going to the laundry powder aisle of the supermarket and comparing the backs of the boxes there. I can almost guarantee that most of the apparent variety of brands is produced by just two (or maybe three) suppliers.

Which brings me to the current bust-up between Lewis Road Creamery and Fonterra, first reported by Liam Dann in the New Zealand Herald on 12 December (see also here and here; the latest news is that LRC will commence legal action against Fonterra, but presumably that has more to do with packaging and less to do with shelf space). The first of those articles has the detail:
Lewis Road Creamery founder Peter Cullinane is accusing Fonterra of negotiating a "greedy" deal with supermarkets which would limit the ability of smaller dairy brands to get space in the chiller.
In an open letter to Fonterra chief executive Theo Spierings, Cullinane questions whether Fonterra is playing fair.
He said he understood that "Fonterra is looking to use its market power to introduce an exclusionary deal with supermarkets in the North Island that would all but remove non-Fonterra brands".
The deal would give Fonterra's white milk brands 95 per cent of the chiller space, he says.
This is something we should not be surprised about at all, knowing that reducing the shelf space for competing products is a viable strategy. The supermarkets gave the following responses:
Yesterday a representative for the Countdown supermarket chain said it was not involved in any deal with Fonterra.
"Countdown does not have any deal with Fonterra of this nature, and would not enter into any agreement like this," said Countdown general manager merchandise, Chris Fisher. "We treat all of our suppliers fairly and shelf space is determined based on the merit and popularity of each product," Fisher said.
A spokesperson for Foodstuffs North Island said the company had "an agreement in place with Fonterra, as we do with other suppliers including Lewis Road Creamery, to supply a range of dairy products, the terms of our supplier agreements are confidential".
I'd be more surprised if the supermarkets weren't negotiating deals with suppliers where a single supplier provides multiple competing products, especially if they involve side-payments from the suppliers. The fewer suppliers the supermarket chains have to deal with, the simpler (and therefore less costly) their logistics will be. Everybody wins. Well, everybody but the consumer who wants fair competition that leads to a variety of products and low prices.

Ultimately though, there may be a question for the Commerce Commission as to whether these agreements overstep the mark in terms of reducing competition. It will be interesting to see how this story progresses in the new year.

Thursday, 29 December 2016

The most forgotten Nobel Prize winners in economics are...

A new paper (open access) in the latest issue of the journal Applied Economics Letters by Aloys Prinz (University of Münster) caught my eye. In the paper, Prinz develops a measure of the memorability of Nobel Prize winners in economics:
In this article, fame, as measured by the number of Google hits at a certain time t, is supposed to be fading with time. The importance of a Nobel Prize laureate seems to be the higher, the more she or he is remembered years after receiving the Prize.
The paper focuses on the most memorable Nobel Prize winners (in 2012) who, unsurprisingly, are Milton Friedman, Paul Krugman, and Joseph Stiglitz. However, taking Prinz's Table 1 and re-ordering it in reverse order of memorability gives us the most forgotten Nobel Prize winners, being (in order of forgottenness) Thomas Sargent, Dale Mortensen, and Christopher Pissarides:


Sargent is surprising given that, in terms of achievement (as measured in this 2013 paper by Claes and De Ceuster; ungated version here), he is tenth overall. It is sad to see the low memorability (or high forgottenness) ranking of Elinor Ostrom, the only woman ever to win the award (again, in spite of a high achievement ranking). I imagine there have been a few more searches for Schelling and Selten in recent months, which would push them up the memorability (and down the forgottenness) rankings. Finally, it would be interesting (to me at least) to know how some of those on the short list for next year's award would measure up.

Wednesday, 28 December 2016

The behavioural economics of Lotto

Lotto (and similar lotteries in other countries) presents a problem for economists' assumption of rational behaviour. A rational, risk-averse person should never choose to play Lotto - it has a negative expected value (playing Lotto hundreds of times will lose you money, on average). This point has been made many times (see here for one example from Stats Chat - a simple search on Stats Chat will find you dozens more posts relating to Lotto).
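To make the negative expected value concrete, here is a quick back-of-the-envelope calculation. The 1-in-3,838,380 figure is just the number of ways of choosing six numbers from forty; the ticket price and single jackpot prize are hypothetical simplifications, not the actual Lotto prize schedule.

```python
from math import comb

# Number of possible draws when choosing 6 numbers from 40 (no order, no repeats)
n_combinations = comb(40, 6)            # 3,838,380

ticket_price = 1.50                     # hypothetical price per line
jackpot = 1_000_000                     # hypothetical single prize, ignoring smaller divisions

p_win = 1 / n_combinations
expected_value = p_win * jackpot - ticket_price

print(f"Probability of winning: 1 in {n_combinations:,}")
print(f"Expected value per line: ${expected_value:.2f}")   # about -$1.24, a loss on average
```

Buying more lines just scales the loss; the average return per dollar spent stays negative.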

So, if a rational person would never play Lotto, there must be some other explanation for that behaviour. I was happy, then, to read this article in The Conversation by Ryan Anderson and David Mitchell (both James Cook University), which uses behavioural economics (among other things) to explain why people play Lotto. One reason relates to availability bias:
The availability bias/heuristic relates to the idea that people judge the likelihood of something based roughly on how readily examples of it come to mind.
For example, you can probably think of news stories about when a shark has bitten a swimmer. One reason is this kind of a story is sensational, and will likely be highly reported. How often have you seen the headline: “No sharks at the beach today”?
Because you can easily bring to mind examples of shark attacks, you might be tempted to conclude shark attacks are far more common than they actually are. In fact, the chances of being attacked by a shark are somewhere in the neighbourhood of one in 12 million.
You hear and read stories about lottery winners all the time. Jackpot winners always make the news, but the battlers who have been playing for 20 years without winning are relegated to obscurity.
Based on this, it’s at least reasonable to think “jackpotting” can’t be that rare. The net effect is that winning seems possible.
Another relates to the sunk cost fallacy:
In economics, a sunk cost is any previous expense that can’t be recovered – like a previous business expenditure on software, education, or advertising. Because this cost has already occurred and can’t be recovered, it should no longer be factored into future decisions. But this is seldom the case.
The sunk-cost fallacy occurs when you make a decision based on the time and resources you have already committed. Research suggests adults are more likely to fall victim to the sunk-cost fallacy than either children or lower-order animals.
In lotto, people will often persevere with what they sometimes know is economically irrational – like buying more lotto tickets – simply because they have already invested so much.
People are susceptible to the sunk cost fallacy because of loss aversion and mental accounting. Loss aversion simply means that we value losses more than we value equivalent gains - we work harder to avoid losses than to capture gains. That would seem to suggest we should avoid the losses that are inherent in Lotto. However, mental accounting (as the name implies) suggests that we keep 'mental accounts', such as an account for Lotto, and we like to keep those accounts in positive balances. Once we have played Lotto, we will continue to play, because stopping while we are behind would mean accepting the loss in that mental account. As long as we keep playing, there is a chance that we win and the mental account turns positive. Note that this also explains much gambling behaviour, as well as why we stay in jobs we hate, relationships we don't enjoy, investments that are not paying off, and so on. We don't want to acknowledge the loss.
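Loss aversion is easy to write down. The sketch below uses the value function from Tversky and Kahneman's (1992) cumulative prospect theory, with their commonly cited parameter estimates (alpha of about 0.88 and lambda of about 2.25); it is purely illustrative, not a model of any particular gambler.

```python
def prospect_value(x, alpha=0.88, lam=2.25):
    """Prospect theory value function: concave for gains, convex and steeper for losses."""
    if x >= 0:
        return x ** alpha
    return -lam * ((-x) ** alpha)

gain = prospect_value(100)     # about 57.5
loss = prospect_value(-100)    # about -129.5
print(abs(loss) / gain)        # 2.25: a $100 loss hurts more than twice as much as a $100 gain feels good
```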

Another behavioural economics explanation that Anderson and Mitchell referenced was our misunderstanding of small probabilities:
Gambling studies professor Robert Williams suggests that although humans have evolved some appreciation for numbers, we don’t really understand big numbers.
We deal with amounts like six, 24 and 120 all the time, but throughout history it’s never really been important to measure out 18 million of something, or count 50 million of something else.
Odds of one in 200 million don’t seem that different to odds of, say, one in 3 million. In both cases success is really unlikely.
Give someone a choice between odds of one in three and one in 200, however, and the difference is really obvious. It’s certainly not that people can’t grasp really big numbers, but that they don’t have much meaning until we stop and think about them.
It's actually worse than Anderson and Mitchell imply. Not only do people not understand small probabilities, we also tend to overweight very unlikely events when making decisions. This is one of the cornerstones of prospect theory, the theory developed by Nobel Prize winner Daniel Kahneman, along with Amos Tversky. If we overweight small probabilities, we behave as if our chances of winning Lotto are better than they really are, and this makes us more likely to play (compared with if we gave the true probability of winning its proper weight).
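To see how large the overweighting can be, here is the Tversky and Kahneman (1992) probability weighting function with their estimated gamma of about 0.61, applied to jackpot-sized odds. The specific odds are illustrative.

```python
def decision_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function (for gains)."""
    return p ** gamma / ((p ** gamma + (1 - p) ** gamma) ** (1 / gamma))

p = 1 / 3_838_380                # roughly the odds of a six-from-forty jackpot
w = decision_weight(p)
print(w / p)                     # the decision weight is several hundred times the true probability
```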

Finally, Anderson and Mitchell also made a point that I have made before, that people facing hard times have less to lose and are more willing to take a punt on Lotto. This isn't behavioural economics at work though - it is people becoming less risk averse. Either way, Lotto is more proof (if any were needed) that our behavioural biases actively work against rational decision-making, and the assumption of rationality is dead.

Tuesday, 27 December 2016

The supply-and-demand puzzle that is the Boxing Day sale

In basic supply-and-demand models, if there is an increase in demand the equilibrium price rises (ceteris paribus, which means 'all else being equal'). However, Boxing Day is the biggest shopping day of the year - so we know demand is high. Why are prices so low then? Surely they should be higher?

Helpfully, the New York Times had a good article last month about the price of turkeys in the lead-up to Thanksgiving in the U.S., which illustrates a similar point:
According to government data, frozen whole-turkey prices drop significantly every November; over the last decade, retail prices have fallen an average of 9 percent between October and November.
That trend seems to defy Econ 101. Think back to those simple supply-and-demand curves from introductory micro, and you’ll probably remember that when the demand curve shifts outward, prices should rise. That’s why Major League Baseball tickets get most expensive during the World Series — games that (theoretically, anyway) many more people want to see. Similarly, airline tickets spike around Christmas...
The most intuitive and popular explanation for a high-demand price dip is that retailers are selling “loss leaders.” Stores advertise very low prices — sometimes even lower than they paid their wholesalers — for big-ticket, attention-grabbing products in order to get people in the door, in the hope that they buy lots of other stuff. You might get your turkey for a song, but then you also buy potatoes, cranberries and pies at the same supermarket — all at regular (or higher) markups. Likewise, Target offers a big discount on select TVs on Friday, which will ideally entice shoppers to come in and buy clothes, gifts and other Christmas knickknacks on that frenzy-fueled trip.
That is the supply-side explanation of what’s going on. But plenty of economists disagree, and argue that it’s actually demand-side forces — changing consumer preferences — that drive these price drops...
Consumers might get more price-sensitive during periods of peak demand and do more comparison-shopping, so stores have to drop their prices if they want to capture sales. Perhaps, during the holidays, the composition of consumers changes; maybe only rich people or people who really love turkey buy it in July, but just about everybody — including lower-income, price sensitive shoppers — buys it in November. Or maybe everyone becomes more price-sensitive in November because they’re cooking for a lot of other people, not just their nuclear families.
Count me among the economists with a preference for demand-side explanations, especially for Boxing Day sales. The discounts are so widespread that loss-leading seems extraordinarily unlikely as an explanation. To me, the most likely explanation for the low prices on Boxing Day is that consumers have more price elastic demand then. To see why that leads to lower prices, consider the diagram below (the profit-maximising diagram for a firm with market power - note that this is a constant-cost firm, to make the diagram a bit easier to read). The black demand curve (D0) is associated with the black marginal revenue curve (MR0). With this demand curve the firm profit-maximises where marginal revenue (MR0) meets marginal cost (MC), at a quantity of Q*. The profit-maximising price (the price that leads to exactly Q* being demanded) is P0. Compare that with the red demand curve (D1), which is associated with the red marginal revenue curve (MR1). The red demand curve is more price elastic (flatter). With this demand curve the firm profit-maximises where marginal revenue (MR1) meets marginal cost (MC), which is still at a quantity of Q*. The profit-maximising price (the price that leads to exactly Q* being demanded) is P1. Note that when demand is more elastic (for a product with the same cost), the profit-maximising price is lower (the mark-up over marginal cost is also lower).
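The same result falls out of the standard markup rule for a firm with market power, P = MC / (1 - 1/|ε|): for a given marginal cost, more elastic demand means a lower profit-maximising price. The cost and elasticity numbers below are purely illustrative.

```python
def profit_max_price(marginal_cost, elasticity):
    """Markup rule for a firm with market power: P = MC / (1 - 1/|e|), valid for |e| > 1."""
    e = abs(elasticity)
    return marginal_cost / (1 - 1 / e)

mc = 50                                  # constant marginal cost (illustrative)
print(profit_max_price(mc, -2))          # less elastic demand: price = 100
print(profit_max_price(mc, -5))          # more elastic demand: price = 62.50 (the Boxing Day case)
```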


Why would demand be more elastic on Boxing Day? Consider the factors associated with higher (more elastic) price elasticity of demand. Are there more, or closer, substitute goods available? Arguable. Do consumers have longer time horizons? Perhaps (especially if you compare with shopping the day before Christmas). Is there a higher significance of price in the total cost to the consumer? Doesn't seem to fit. That leaves the possibility that a higher proportion of income is spent on the good. [*] If consumers spent a large proportion of their income on gifts and food for Christmas, that leaves a much smaller budget constraint for shopping on Boxing Day, which should lead to more elastic demand. That is, consumers with less money left over for shopping on Boxing Day will be more price sensitive, and retailers respond by lowering their prices to attract those price-sensitive shoppers. [**]

Finally, the NYT article had an interesting comparison between frozen turkeys on Thanksgiving (high demand and low price) and roses on Valentine's Day (high demand and high price), which is worth sharing:
There are a few possible reasons why market forces are different for roses and frozen turkeys on their respective holidays. For one, the loss-leader strategy really only works if you’re a multiproduct retailer, says Chevalier. Florists sell only flowers; they’re not willing to take a loss on the one thing they sell in the hope that you’ll buy a bunch of other stuff, since you’re not likely to buy anything else.
More important, roses — like airline seats or World Series tickets — are what economists refer to as “supply inelastic.” It’s costly to ramp up rose production in time for peak demand, since the roses must all be picked (and for the most part, flown in from Colombia and Ecuador) in the single week preceding Valentine’s Day.
Which is also why the biggest discounts on Boxing Day are on items that the retailer can easily hold back for the day, like electronics. I admit I took full advantage of that!

*****

[*] Alert readers and economics students will notice that I have ignored two other factors that are associated with more elastic demand: (1) narrower definition of the market; and (2) normal goods (compared with inferior goods). Both of these factors compare different goods, so don't make sense when comparing elasticity for the same good on Boxing Day and not-on-Boxing-Day.

[**] Another alternative is that the composition of shoppers is different on Boxing Day from other days, with more low-income (and hence, price-sensitive) shoppers shopping on that day. Unless I see some evidence of this, I'm going to discount it as an explanation.

Wednesday, 21 December 2016

Don't free that lobster!

Last month, the BBC reported that a well-meaning vegan bought a 10.4 kg giant lobster in order to set it free. While on the surface this act of inter-species kindness seems like a good idea, if it catches on in a big way and many people do likewise, it is easy to envisage it leading to unintended consequences. Here's how:

Say that the market for lobster (shown in the diagram below) is initially in equilibrium with the price P0 and the quantity of lobster taken from the ocean is Q0. Now, many well-meaning vegans decide to buy lobsters in order to free them. Because there are now more buyers in the market for lobster, this simply increases the demand for lobsters. Increased demand for lobsters (from D0 to D1) increases the price of lobsters (from P0 to P1), and importantly increases the quantity of lobsters caught (from Q0 to Q1). So setting lobsters free creates incentives that lead to an increase in the quantity of lobsters caught - the opposite of what was intended!


Of course, this is very similar to the problems of slave redemption in the Sudan that I have blogged about before. There is one important difference though, not accounted for in the above diagram. In many countries, the lobster catch is governed by a transferable quota system (which I've written on here and here). Even if demand for lobster increases, the number of lobsters caught cannot rise because fishermen are limited in the quantity that they are allowed to catch. In this case (shown in the diagram below), the quantity of lobster caught does not increase (it stays at Q0), but the increase in demand instead leads to a substantial increase in price (to P2 instead of P1). So, the actions of the vegans cause everyone's lobster meals to be significantly more expensive. Nice.
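A simple linear example captures both diagrams. All of the coefficients below are made up purely for illustration; the point is that without a quota the demand shift raises the catch, while with a binding quota it only raises the price.

```python
# Illustrative linear market for lobster (quantities in thousands, prices in dollars).
def demand_price(q, intercept):          # inverse demand: P = intercept - 2Q
    return intercept - 2 * q

def supply_price(q):                     # inverse supply: P = 10 + Q
    return 10 + q

def equilibrium(demand_intercept):
    """Solve intercept - 2Q = 10 + Q for the unconstrained equilibrium."""
    q = (demand_intercept - 10) / 3
    return q, supply_price(q)

q0, p0 = equilibrium(100)                # original demand:          Q0 = 30, P0 = 40
q1, p1 = equilibrium(130)                # vegans shift demand out:  Q1 = 40, P1 = 50 (more lobsters caught)

# With a binding quota the catch is capped at Q0, so the whole shift shows up in the price:
p2 = demand_price(q0, 130)               # P2 = 70: same catch, much more expensive lobster meals
print((q0, p0), (q1, p1), (q0, p2))
```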



Tuesday, 20 December 2016

Thomas Schelling, 1921-2016

Among the several things I missed during my busy last week was the sad passing of Thomas Schelling, a Nobel Prize winner and one of the most important theorists and early writers in game theory. My ECON100 class (and past ECON308/408 Managerial Economics students) will recognise Schelling's name from the Schelling Point, the most likely solution to a coordination game (where there is more than one Nash equilibrium) in game theory. I also very recently (within the last couple of weeks) discussed Schelling's work on polarisation, in particular his model of segregation, with one of my new PhD students who will be working on a microsimulation model of ethnic communities in Auckland (more on that in a far future post - she is only just getting started on that work).
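Schelling's segregation model is simple enough to sketch in a few lines of code. The version below is a minimal one-dimensional (ring) variant with made-up parameters, not the model my student will be building; it just shows how mild individual preferences for similar neighbours can generate a lot of clustering.

```python
import random

random.seed(0)

SIZE, THRESHOLD, RADIUS = 200, 0.4, 4    # ring size, desired same-type share, neighbourhood radius

# A ring of cells: roughly 45% type 'A', 45% type 'B', 10% empty (None).
cells = [random.choice(["A"] * 9 + ["B"] * 9 + [None] * 2) for _ in range(SIZE)]

def satisfied(position, agent):
    """Would an agent of this type be satisfied at this position on the ring?"""
    neighbours = [cells[(position + d) % SIZE] for d in range(-RADIUS, RADIUS + 1) if d != 0]
    occupied = [n for n in neighbours if n is not None]
    return not occupied or sum(n == agent for n in occupied) / len(occupied) >= THRESHOLD

def average_same_type_share():
    """Average share of same-type neighbours across agents: a crude segregation index."""
    shares = []
    for i, agent in enumerate(cells):
        if agent is None:
            continue
        neighbours = [cells[(i + d) % SIZE] for d in range(-RADIUS, RADIUS + 1) if d != 0]
        occupied = [n for n in neighbours if n is not None]
        if occupied:
            shares.append(sum(n == agent for n in occupied) / len(occupied))
    return sum(shares) / len(shares)

print(f"segregation index before: {average_same_type_share():.2f}")

for _ in range(100):                     # unsatisfied agents relocate to an empty cell they prefer
    for i, agent in enumerate(cells):
        if agent is None or satisfied(i, agent):
            continue
        empties = [j for j, c in enumerate(cells) if c is None]
        preferred = [j for j in empties if satisfied(j, agent)]
        destination = random.choice(preferred or empties)
        cells[destination], cells[i] = agent, None

print(f"segregation index after:  {average_same_type_share():.2f}")
```

Even though every agent is happy in a mixed neighbourhood (only 40 per cent of neighbours need to share their type), the moves compound and the ring ends up far more clustered than anyone individually 'wanted' - which is exactly Schelling's point about micromotives and macrobehaviour.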

The Washington Post and the New York Times have excellent summaries of Schelling's life and work. I particularly like these bits from the New York Times (which illustrate both the breadth of his work and its real-world implications):
Among other counterintuitive propositions he put forth, Professor Schelling suggested that one side in a negotiation can strengthen its position by narrowing its options, using as an example a driver in a game of chicken who rips the steering wheel from the steering column and brandishes it so his opponent can see that he no longer controls the car. He also argued that uncertain retaliation is more credible and more efficient than certain retaliation...
In “Meteors, Mischief and Wars,” published in Bulletin of the Atomic Scientists in 1960, Professor Schelling looked at the possibility of an accidental nuclear exchange between the United States and the Soviet Union and reviewed three novels that imagined such an event. The director Stanley Kubrick read his comments on the novel “Red Alert” and adapted the book for “Dr. Strangelove,” on which Professor Schelling was a consultant.
And on the model of segregation I noted above:
Expanding on the work of Morton Grodzins, a political scientist at the University of Chicago who used the term “tip point” to describe the crucial moment when white fears become white flight, Mr. Schelling offered a simple diagram, almost like a game board, to show how mixed urban neighborhoods could quickly become entirely black, even when white residents expressed only a slight preference for living among members of their own race.
His papers on the subject, and his book “Micromotives and Macrobehavior” (1978), achieved wider currency when his ideas were popularized by Malcolm Gladwell in his best-selling book “The Tipping Point” (2000).
You can expect a book review of Micromotives and Macrobehavior sometime in the new year - it has been on my 'to-be-read' list for some time.

Sunday, 18 December 2016

Book review: Circus Maximus

I've been a bit quiet while trying to finish a bunch of things before year's end, but I did manage to finish reading Andrew Zimbalist's book "Circus Maximus: The Economic Gamble behind Hosting the Olympics and the World Cup". I mentioned this book in a post about the lack of economic impact of stadiums and arenas earlier this year.

When I think about the Olympic Games or the FIFA World Cup, and see the crazy investments that cities (or countries in the case of the World Cup) are prepared to make not only for the event once it has been awarded to them, but also in simply bidding for the event, I immediately think about the winner's curse. Say that the potential hosts don't know for sure what the benefit of hosting the event is, but must outbid all other potential hosts in order to be awarded the event. The 'winning' host will be the city (or country) which had the most wildly optimistic view of the benefits of hosting, since they will be the city (or country) willing to bid the most. And it is very likely that that city (or country) will have bid more than the real benefits of hosting are worth.
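The winner's curse logic is easy to simulate: give each potential host an unbiased but noisy estimate of a common true benefit, award the event to the most optimistic bidder, and the winner will overestimate the benefit on average. The numbers below are arbitrary.

```python
import random

random.seed(1)

TRUE_BENEFIT = 5.0        # the (unknown) true benefit of hosting, in billions
NOISE_SD = 2.0            # spread of each city's honest but imperfect estimate
N_BIDDERS = 8
N_AUCTIONS = 10_000

winning_estimates = []
for _ in range(N_AUCTIONS):
    estimates = [random.gauss(TRUE_BENEFIT, NOISE_SD) for _ in range(N_BIDDERS)]
    winning_estimates.append(max(estimates))     # the most optimistic city 'wins' the event

average_winner = sum(winning_estimates) / N_AUCTIONS
print(f"true benefit: {TRUE_BENEFIT}, winning city's average estimate: {average_winner:.2f}")
# Each individual estimate is unbiased, but the maximum of eight of them is not:
# the city that wins the bidding systematically overestimates the benefit of hosting.
```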

Zimbalist's book makes that point much more forcefully than I have above, and much more besides. Barcelona is the poster-child of successful Olympic Games. In one chapter, Zimbalist contrasts the experience of Barcelona, which leveraged its hosting to develop the city into a future powerhouse of tourism, with the experience of Sochi hosting the Winter Olympics, where the expenditure and investment were largely wasteful. This wasteful investment arises because one of the key arguments made in favour of hosting these events is the expected short-term and long-term boosts to tourism. The opportunity cost of this spending on tourism promotion is potentially very great, and Zimbalist astutely remarks at one point:
Each prospective host would do well to ask the question, if we have $10 billion or $20 billion or more to spend on promoting tourism, what is the most effective use of those resources?
How much more good could be done with that money, in terms of tourism promotion, through more conventional means?

Throughout the book, Zimbalist does a good job of outlining the evidence from the academic literature on the cost-benefit calculus of hosting these large events. As one would expect, the winner's curse is broadly evident. Zimbalist also does an excellent job of looking beyond the academic evidence, and bringing together stories that illustrate some of the very real negative impacts of hosting. I hadn't followed the case of Rio de Janeiro and the impacts on the favelas of hosting both the Olympics and the World Cup, but some of the stories are quite horrific. London also comes in for some criticism, for its inability to leverage the Olympics for the betterment of people living in East London, the area immediately surrounding the main Olympic venues.

The first paragraph of the concluding chapter is a useful summary to finish this review on. Zimbalist writes:
The perennial claims that hosting the Olympics or the World Cup is an engine of economic development find little corroboration in independent studies. In the short run, the increasingly massive costs of hosting cannot come close to being matched by the modest revenues that are brought in by the games. The payoff, if there is one, must be realized in the long run. But even the legacy return is at best dubious. Much of the alleged legacy comes in the form of the qualitative gains, and the rest comes over very long periods of time, difficult to trace back to the several-week period of the games or the prior construction. But more often than not, the main legacy consists of white elephants that cost billions to build and millions annually to maintain, along with mountains of debt that must be paid back over ten to thirty years.
If there is one thing missing from this book, it is the consideration of smaller events, such as the Commonwealth Games (the Delhi games are mentioned only a couple of times) and the Rugby World Cup. These smaller events require smaller outlays from the hosts, but also have much smaller potential gains. Nevertheless, this is an excellent book for those who are familiar with the literature on economic impact studies and those who are not, along with anyone with a specific interest in the Olympics or the World Cup.

Monday, 12 December 2016

Video: Economist's Christmas

Tyler Cowen and Alex Tabarrok of Marginal Revolution have been providing us with video debates on various issues this year (I particularly liked the debate on "Is education signalling or skill-building?"). The latest (though I notice it isn't on the same Econ Duel playlist) is on the economics of Christmas:


Enjoy!

Sunday, 11 December 2016

More contingent valuation debates

Back in March I highlighted a series of papers debating the validity (or not) of contingent valuation studies. To recap:
One non-market valuation technique that we discuss in ECON110 is the contingent valuation method (CVM) - a survey-based stated preference approach. We call it stated preference because we essentially ask people the maximum amount they would be willing to pay for the good or service (so they state their preference), or the minimum amount they would be willing to accept in exchange for foregoing the good or service. This differs from a revealed preference approach, where you look at the actual behaviour of people to derive their implied willingness-to-pay or willingness-to-accept.
As I've noted before, I've used CVM in a number of my past research projects, including this one on landmine clearance in Thailand (ungated earlier version here), this one on landmines in Cambodia (ungated earlier version here), and a still incomplete paper on estimating demand for a hypothetical HIV vaccine in China (which I presented as a poster at this conference).
One of the issues highlighted in that earlier debate had to do with scope problems:
Scope problems arise when you think about a good that is made up of component parts. If you ask people how much they are willing to pay for Good A and how much they are willing to pay for Good B, the sum of those two WTP values often turns out to be much more than what people would tell you they are willing to pay for Good A and Good B together. This issue is one I encountered early in my research career, in joint work with Ian Bateman and Andreas Tsoumas (ungated earlier version here).
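A stylised scope (or adding-up) test makes the problem concrete. All of the willingness-to-pay numbers below are invented purely for illustration.

```python
# Hypothetical mean willingness-to-pay (WTP) from three separate survey treatments, in dollars.
wtp_good_a = 40        # e.g. clearing landmines from region A only
wtp_good_b = 35        # e.g. clearing landmines from region B only
wtp_both = 48          # a separate sample asked about clearing both regions

print(wtp_good_a + wtp_good_b, wtp_both)
# The parts 'add up' to 75, but the whole package is valued at only 48:
# stated values are not sufficiently sensitive to the scope of the good being valued.
```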
Which brings me to two new papers published in the journal Ecological Economics. But first, let's back up a little bit. Back in 2009, David Chapman (Stratus Consulting, and lately the US Forest Service) and co-authors wrote this report estimating people's willingness-to-pay to clean up Oklahoma’s Illinois River System and Tenkiller Lake. In 2012, William Desvousges and Kristy Mathews (both consultants), and Kenneth Train (University of California, Berkeley) wrote a pretty scathing review of scope tests in contingent valuation studies (published in Ecological Economics, ungated here), and the Chapman et al. report was one that was singled out. You may remember Desvousges, Mathews, and Train from the CVM debate I discussed in my earlier post.

Four years later, Chapman et al. respond to the Desvousges et al. paper (sorry, I don't see an ungated version online). In their reply, Chapman et al. offer a quite different interpretation of their methods, one that on the surface appears to validate their results. Here's their conclusion:
In summary, DMT argue that Chapman et al.'s scope difference is inadequate because it fails to satisfy theoretical tests related to discounting and to diminishing marginal utility and substitution. Also, according to them, our scope difference is too small. Once the fundamental flaws in their interpretation of the scenarios are corrected, none of these arguments hold. The upshot is that Chapman et al. must be assigned to the long list of studies cited by DMT where their tests of adequacy cannot be applied.
However, Desvousges et al. respond (also no ungated version available). The response is only two pages, but it leaves the matter pretty much resolved (I think). The new interpretation of the methods employed by Chapman et al. has raised some serious questions about the overall validity of the study. Here's what Desvousges et al. say:
...this statement indicates that respondents were given insufficient information to evaluate the benefits of the program relative to the bid amount (the cost.) The authors argue that the value of the program depends on how the environmental services changed over time, and yet the survey did not provide this information to the respondent. So the authors violated a fundamental requirement of CV studies, namely, that the program must be described in sufficient detail to allow the respondent to evaluate its benefits relative to costs. The authors have jumped – to put it colloquially – from the frying pan into the fire: the argument that they use to deflect our criticism about inadequate response to scope creates an even larger problem for their study, that respondents were not given the information needed to evaluate the program.
All of which suggests that, when you are drawn into defending the quality of your work, you should be very careful that you don't end up simply digging a bigger hole for your research to be buried in.

Friday, 9 December 2016

The debate over poached elephants

No, this isn't a post about the best method for cooking endangered species. Back in June, I wrote a post about this NBER Working Paper by Solomon Hsiang (UC Berkeley) and Nitin Sekar (Princeton). In the paper, Hsiang and Sekar demonstrated that a one-time legal sale of stockpiled ivory that happened in 2008 led to a significant increase in elephant poaching.

Since that working paper was released, there has been quite a debate over the validity of the results. First, Quy-Toan Do (World Bank), Andrei Levchenko (University of Michigan) and Lin Ma (National University of Singapore) wrote this blog post on the World Bank website, where they showed that Hsiang and Sekar's results were present only in a subsample of small sites (i.e. sites where there were few elephant carcasses). Do et al. concluded:
In this short discussion, we argued that the results reported by Hsiang and Sekar are confined to only the smallest sites surveyed by the MIKE programme. Aggregate poaching therefore did not experience a step increase in 2008 as argued by the authors. Our data on raw ivory prices in both Africa and China further support the conclusion that the 2008 one-off sale actually had no discernible effects on ivory markets. Rather, we postulate that small changes in the classification of carcasses could account for the results documented by Hsiang and Sekar.
Hsiang and Sekar then responded to the criticisms here, where they argue that the Do et al. results were the "result of a sequence of coding, inferential, and logical errors".

In the latest on this debate, Do et al. have responded to the response. In this latest response, they included all of the Stata code and links to the dataset, so that you can replicate their results and test alternatives yourself.
Hsiang and Sekar's results are now looking increasingly shaky. We'll see if they have any further response (to the response to their response)...
[HT: David McKenzie at Development Impact]


Sunday, 4 December 2016

Big data and loyalty to your insurer could raise your insurance premiums

Back in September, I wrote a post about how the most loyal customers are the ones that firms should charge higher prices to, based on this Wall Street Journal article. Last week, the Telegraph had a similar article:
The financial regulator has warned that insurance companies could start charging higher premiums to customers who are less likely to switch by using “big data”.
In a speech to the Association of British Insurers, Andrew Bailey, chief executive of the Financial Conduct Authority, suggested that big data could be used to “identify customers more likely to be inert” and insurers could use the information to “differentiate pricing between those who shop around and those who do not.”...
James Daley, founder of Fairer Finance, the consumer group, said that to some degree big data was already being used to punish inert customers.
He said: “Insurers already know how their own customers are behaving. Those who don’t switch are penalised for their loyalty with higher premiums. Inert customers will be priced partly on risk and partly on what the insurer can get away with.”
To recap, these insurers are engaging in price discrimination - where firms charge different prices to different customers for the same product or service, and where the price differences don't reflect differences in cost. There are three necessary conditions for effective price discrimination (a stylised example of the resulting pricing follows the list below):
  1. Different groups of customers (a group could be made up of one individual) who have different price elasticities of demand (different sensitivity to price changes);
  2. You need to be able to deduce which customers belong to which groups (so that they get charged the correct price); and
  3. No transfers between the groups (since you don't want the low-price group re-selling to the high-price group).
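To see how this plays out in premiums, here is the standard markup rule for a firm with market power, applied to two customer segments. The cost and elasticity numbers are hypothetical; the point is simply that the less price-sensitive (loyal, inert) group ends up paying more for the same policy.

```python
def profit_max_price(marginal_cost, elasticity):
    """Standard markup rule: P = MC / (1 - 1/|e|), valid for |e| > 1."""
    e = abs(elasticity)
    return marginal_cost / (1 - 1 / e)

cost_of_cover = 200                                   # annual cost of covering the policy (illustrative)
switchers = profit_max_price(cost_of_cover, -6)       # price-sensitive shoppers: premium = 240
loyal = profit_max_price(cost_of_cover, -2)           # inert, loyal customers:   premium = 400
print(switchers, loyal)
# Same product, same cost, different premiums: loyalty is rewarded with the higher price.
```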
 As I noted in my previous post in September:
If you are an insurance company, you want to charge the customers who are most price sensitive a lower price. If customer loyalty is associated with customers who don't shop around, then customer loyalty is also associated with customers who are less price sensitive. Meaning that you want to charge those loyal customers a higher price.
What about big data? The Telegraph article notes:
Earlier this month Admiral, the insurer, announced that it planned to use Facebook status updates and “likes” to help establish which customers were safe drivers and therefore entitled to a discount.
Campaigners called the proposal intrusive, and the social media giant then blocked Admiral’s technology just hours before it was due to launch.
Just last week a telematics provider, Octo, launched an app that shares customers' driving data with insurers so that they could bid for custom. It claimed that the safest drivers would get the lowest premiums.
The problem here is that, while opting out of having your social media profiles available for your insurer to peruse may be an option, opting out would itself send a signal to the insurer. Who is most likely to refuse? The high-risk insured, of course. So, anyone who refuses will likely face higher premiums because of the signal they are providing to their insurer. Again, this is a point I made a couple of months ago.

It seems that we will have to accept the reality that big data, and especially our 'private' social media and activity data, is simply going to determine our insurance premiums in future.



Friday, 2 December 2016

Riccardo Trezzi is immortal

I very much enjoyed reading this new paper by Riccardo Trezzi (US Federal Reserve Board of Governors), forthcoming in Economic Inquiry (sorry I don't see an ungated version anywhere). In the paper, Trezzi creates a time series model of his own ECG, and uses it to estimate his life expectancy:
In this paper, I go well beyond the frontier. I employ time series econometrics techniques to suggest a decomposition of the heart electrical activity using an unobserved components state-space model. My approach is innovative because the model allows not only to study electrical activity at different frequencies with a very limited number of assumptions about the underlying data generating process but also to forecast future cardiac behavior (therefore estimating the date of death), overcoming the “sudden death forecast” issue which typically arises when using standard time-series models.
My results are duo-fold. First, I show how the heart electrical activity can be modeled using a simple state-space approach and that the suggested model has superior out-of-sample properties compared to a set of alternatives. Second, I show that when the Kalman filter is run to forecast future cardiac activity using data of my own ECG I obtain a striking result: the n-step ahead forecast remains positive and well bounded even after one googol period, implying that my life expectancy tends to infinite. Therefore, I am immortal.
And people wonder about the validity of economic forecasts...
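For anyone curious what this looks like mechanically, here is a hand-rolled Kalman filter for the simplest unobserved-components case, a local level model, run on a made-up noisy series. It is nothing like Trezzi's actual ECG model; it just shows why the forecast from this class of model stays flat and well bounded at any horizon.

```python
import random

random.seed(2)

# Simulate a made-up 'signal' from a local level model: a slowly wandering level plus noise.
level, y = 10.0, []
for _ in range(200):
    level += random.gauss(0, 0.1)            # the unobserved level drifts slowly
    y.append(level + random.gauss(0, 0.5))   # noisy observation (standing in for real data)

sigma2_obs, sigma2_level = 0.25, 0.01        # assumed observation and level noise variances
mu, P = y[0], 1.0                            # initial state estimate and its variance

for obs in y[1:]:
    P += sigma2_level                        # predict: state variance grows by the level noise
    K = P / (P + sigma2_obs)                 # Kalman gain
    mu += K * (obs - mu)                     # update the level estimate toward the observation
    P *= (1 - K)

# The n-step-ahead forecast of a local level model is flat at the last filtered level,
# so it stays bounded no matter how far ahead you ask it to look.
print(f"filtered level: {mu:.2f}; forecast at any horizon: {mu:.2f}")
```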

[HT: Marginal Revolution]