Friday, 9 December 2016

The debate over poached elephants

No, this isn't a post about the best method for cooking endangered species. Back in June, I wrote a post about this NBER Working Paper by Solomon Hsiang (UC Berkeley) and Nitin Sekar (Princeton). In the paper, Hsiang and Sekar demonstrated that a one-time legal sale of stockpiled ivory that happened in 2008 led to a significant increase in elephant poaching.

Since that working paper was released, there has been quite a debate over the validity of the results. First, Quy-Toan Do (World Bank), Andrei Levchenko (University of Michigan) and Lin Ma (National University of Singapore) wrote this blog post on the World Bank website, where they showed that Hsiang and Sekar's results were present only in a subsample of small sites (i.e. sites where there were few elephant carcasses). Do et al. concluded:
In this short discussion, we argued that the results reported by Hsiang and Sekar are confined to only the smallest sites surveyed by the MIKE programme. Aggregate poaching therefore did not experience a step increase in 2008 as argued by the authors. Our data on raw ivory prices in both Africa and China further support the conclusion that the 2008 one-off sale actually had no discernible effects on ivory markets. Rather, we postulate that small changes in the classification of carcasses could account for the results documented by Hsiang and Sekar.
Hsiang and Sekar then responded to the criticisms here, where they argue that the Do et al. results were the "result of a sequence of coding, inferential, and logical errors".

In the latest on this debate, Do et al. have responded to the response. This time they included all of their Stata code and links to the dataset, so that anyone can replicate their results and test alternatives.

Hsiang and Sekar's results are now looking increasingly shaky. We'll see if they have any further response (to the response to their response)...

[HT: David McKenzie at Development Impact]

Sunday, 4 December 2016

Big data and loyalty to your insurer could raise your insurance premiums

Back in September, I wrote a post about how the most loyal customers are the ones that firms should charge higher prices to, based on this Wall Street Journal article. Last week, the Telegraph had a similar article:
The financial regulator has warned that insurance companies could start charging higher premiums to customers who are less likely to switch by using “big data”.
In a speech to the Association of British Insurers, Andrew Bailey, chief executive of the Financial Conduct Authority, suggested that big data could be used to “identify customers more likely to be inert” and insurers could use the information to “differentiate pricing between those who shop around and those who do not.”...
James Daley, founder of Fairer Finance, the consumer group, said that to some degree big data was already being used to punish inert customers.
He said: “Insurers already know how their own customers are behaving. Those who don’t switch are penalised for their loyalty with higher premiums. Inert customers will be priced partly on risk and partly on what the insurer can get away with.”
To recap, these insurers are engaging in price discrimination: charging different prices to different customers for the same product or service, where the price differences don't reflect differences in cost. There are three necessary conditions for effective price discrimination:
  1. Different groups of customers (a group could be made up of one individual) who have different price elasticities of demand (different sensitivity to price changes);
  2. You need to be able to deduce which customers belong to which groups (so that they get charged the correct price); and
  3. No transfers between the groups (since you don't want the low-price group re-selling to the high-price group).
As I noted in my previous post in September:
If you are an insurance company, you want to charge the customers who are most price sensitive a lower price. If customer loyalty is associated with customers who don't shop around, then customer loyalty is also associated with customers who are less price sensitive. Meaning that you want to charge those loyal customers a higher price.
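The logic above can be sketched with the standard inverse-elasticity (Lerner) markup rule, under which the profit-maximising price for a group with price elasticity of demand e (greater than one in absolute value) is P = MC × e/(e − 1). The cost and elasticity numbers below are purely illustrative assumptions, not figures from any actual insurer:

```python
# Illustrative sketch of elasticity-based pricing (all numbers assumed):
# less price-sensitive (loyal, inert) customers get a higher markup.

def optimal_price(marginal_cost: float, elasticity: float) -> float:
    """Profit-maximising price under the inverse-elasticity (Lerner) rule."""
    assert elasticity > 1, "demand must be elastic at the optimum"
    return marginal_cost * elasticity / (elasticity - 1)

MC = 100.0                               # assumed cost of servicing a policy
loyal_price = optimal_price(MC, 1.5)     # inert customers: less price sensitive
shopper_price = optimal_price(MC, 4.0)   # switchers: more price sensitive

print(round(loyal_price, 2))    # 300.0
print(round(shopper_price, 2))  # 133.33
```

The loyal group's low elasticity more than doubles their premium relative to the shoppers, which is exactly the pattern the Telegraph article describes.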
What about big data? The Telegraph article notes:
Earlier this month Admiral, the insurer, announced that it planned to use Facebook status updates and “likes” to help establish which customers were safe drivers and therefore entitled to a discount.
Campaigners called the proposal intrusive and the social media giant then blocked Admiral’s technology just hours before it was due to launch.
Just last week a telematics provider, Octo, launched an app that shares customers' driving data with insurers so that they could bid for custom. It claimed that the safest drivers would get the lowest premiums.
The problem here is that opting out of having your social media profiles available for your insurer to peruse may be an option, but it would also provide a signal to the insurer. Who is most likely to refuse? The high-risk insured, of course. So, anyone who refuses will likely face higher premiums because of the signal they are providing to their insurer. Again, this is a point I made a couple of months ago.

It seems that we will have to accept the reality that big data, and especially our 'private' social media and activity data, is simply going to determine our insurance premiums in future.


Friday, 2 December 2016

Riccardo Trezzi is immortal

I very much enjoyed reading this new paper by Riccardo Trezzi (US Federal Reserve Board of Governors), forthcoming in Economic Inquiry (sorry I don't see an ungated version anywhere). In the paper, Trezzi creates a time series model of his own ECG, and uses it to estimate his life expectancy:
In this paper, I go well beyond the frontier. I employ time series econometrics techniques to suggest a decomposition of the heart electrical activity using an unobserved components state-space model. My approach is innovative because the model allows not only to study electrical activity at different frequencies with a very limited number of assumptions about the underlying data generating process but also to forecast future cardiac behavior (therefore estimating the date of death), overcoming the “sudden death forecast” issue which typically arises when using standard time-series models.
My results are duo-fold. First, I show how the heart electrical activity can be modeled using a simple state-space approach and that the suggested model has superior out-of-sample properties compared to a set of alternatives. Second, I show that when the Kalman filter is run to forecast future cardiac activity using data of my own ECG I obtain a striking result: the n-step ahead forecast remains positive and well bounded even after one googol period, implying that my life expectancy tends to infinite. Therefore, I am immortal.
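The punchline rests on a basic property of stationary time-series models: the k-step-ahead point forecast converges to the unconditional mean and stays bounded at any horizon, so it never reaches zero. A minimal sketch with an AR(1) process (the resting heart rate and persistence parameter below are my own illustrative assumptions, not Trezzi's actual state-space model):

```python
# For a stationary AR(1) process
#   x_t = mu + phi * (x_{t-1} - mu) + eps_t,   with |phi| < 1,
# the k-step-ahead point forecast is mu + phi**k * (x0 - mu), which decays
# towards the (positive) mean mu and stays bounded at any horizon k.

def ar1_forecast(x0: float, mu: float, phi: float, k: int) -> float:
    """k-step-ahead point forecast of a stationary AR(1) process."""
    return mu + phi**k * (x0 - mu)

mu, phi, x0 = 70.0, 0.9, 75.0   # assumed: resting heart rate around 70 bpm

print(ar1_forecast(x0, mu, phi, 1))      # 74.5
print(ar1_forecast(x0, mu, phi, 1000))   # essentially 70.0: never hits zero
```

Push k out to a googol periods and the forecast is still (approximately) mu, which is the "immortality" result: the model's long-run forecast simply reverts to the mean, whatever the horizon.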
And people wonder about the validity of economic forecasts...

[HT: Marginal Revolution]

Wednesday, 30 November 2016

Jetstar regional services cause loss of $40 million to main centres' economies

Last week, Jetstar announced a report by Infometrics that suggested their introduction of regional flights to Nelson, Palmerston North, Napier, and New Plymouth boosted the economy of those regions by around $40 million. Here's what the New Zealand Herald reported:
Jetstar's regional operations could boost the economy of four centres it serves by about $40 million a year, according to Infometrics research.
The regional GDP growth could support up to 600 new jobs according to the research which notes domestic air travel prices have fallen by close to 10 per cent during the past 12 months.
Jetstar Group CEO, Jayne Hrdlicka, said the report highlighted how important cheap fares were to growing local economies.
That sounds like a good news story, but as with most (if not all) economic impact studies, it only provides half the picture. That's because flying to the regions doesn't suddenly create new money. So, every dollar that is spent by travellers to the regions is one less dollar that would have been spent somewhere else. In the case of domestic travellers who would not have otherwise travelled to those regions if Jetstar hadn't been flying there (which is the assumption made in the report), every dollar they spend on their trip to Napier is one less dollar they would have spent at home in Auckland. One could make a similar case for international travellers, although perhaps cheaper flights encourage them to spend more on other things than they otherwise would (though this is drawing a pretty long bow).

So, if it's reasonable to believe that Jetstar flights add $40 million to the economies of those regions, it is also reasonable to believe that Jetstar flights cost around $40 million in lost economic activity elsewhere in the country (depending on differences in multiplier effects between different regions), and much of this will likely be from the main centres.
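To see how the offset works, here is some purely hypothetical arithmetic (the diverted spending figure and the multipliers are assumptions for illustration, not numbers from the Infometrics report):

```python
# Hypothetical diversion arithmetic (all numbers assumed): spending moved
# to a region boosts that region, but is offset by lost spending at the
# origin, so the net national effect is only the difference in multipliers.

diverted_spend = 20.0        # $m of visitor spending moved to the regions
regional_multiplier = 2.0    # assumed multiplier in the destination region
origin_multiplier = 1.8      # assumed multiplier where the spending came from

regional_gain = diverted_spend * regional_multiplier   # 40.0: the headline figure
origin_loss = diverted_spend * origin_multiplier       # 36.0: the unreported offset
net_national = regional_gain - origin_loss             # 4.0

print(regional_gain, origin_loss, net_national)
```

The headline number counts only the first line; once the origin's loss is netted off, the national effect is a small residual that depends entirely on the difference between the two multipliers, and could be zero (or negative) if the multipliers are similar.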

To be fair, the Infometrics report (which I obtained a copy of, thanks to the Jetstar media team) does make a similar point that:
...the economic effects of this visitor spending should only be interpreted on a region-by-region basis, rather than as an aggregate figure for New Zealand as a whole. It is likely that some of the increase in visitor spending in regions with additional flights represented spending that was diverted from other parts of New Zealand.
The Infometrics report has some other issues, such as assuming a fixed proportion of business travellers to all four airports, which seems fairly implausible but probably doesn't have a huge impact on the estimates. A bigger issue might be the underlying model for calculating the multiplier effects, since multi-region input-output models (I'm assuming this is what they use) are known to suffer from aggregation bias that overstates the size of multiplier effects. I had a Masters student working on multi-region input-output models some years ago, and that was one of the main things I took away from that work. However, that's a topic that really deserves its own post sometime in the future.

Of course, these problems aren't important to Jetstar, which only wants to show its regional economic impact in the best light possible. The next step for them might be to say: "Ooh, look. We've done such a great job enhancing the economy of these regions. The government should subsidise us to fly to other regions as well so we can boost their economies too". Thankfully, they haven't taken it that far. Yet.

You might argue that boosting the economies of the regions, even if it is at the expense of the main centres, is a good thing. That might be true (it is arguable), but it isn't clear to me that increased air services are the most cost-effective mechanism for developing regional economies. I'd be more convinced by an argument that improved air services are a consequence of economic development, not a source of it.

For now, just take away from this that we should be sceptical whenever firms trumpet their regional economic impact based on these sorts of studies.