# Highlights from the Amsterdam Workshop: Zombies, Arthropods, and Other Technical Terms

This week the JASP team organized their third annual workshop in Amsterdam, attended by 50 participants. Below you’ll find some of the highlights. A select set of course materials is here. The plenary lectures were given by Richard Morey and E.J. Wagenmakers; special thanks go out to Johnny van Doorn for handling the logistics, and to Quentin Gronau for helping out in general.

## Highlight 1: Live-Tweeting Richard Morey’s Lectures

On Monday morning Richard discussed the concept of evidence, relating it to belief updating and, ultimately, the Bayes factor. The JASP twitter account (@JASPstats) provided live updates.

## Highlight 2: A Shiny App for Understanding Bayesian Updating

To understand Bayesian updating, it always helps to see it in action. The Shiny app “A First Lesson in Bayesian Inference” was used to study whether people tend to answer “odd” or “even” to one of life’s more puzzling questions: “is the 1000th digit in the decimal expansion of π odd or even?” Participants answered one at a time, and the uncertainty about the group’s propensity to say “odd” was updated as the answers came in. I believe the final score was 11 “even” and 19 “odd” (and then we stopped data collection because we got bored, which is how Bayesians roll, despite the gnashing of frequentist teeth).
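The updating the app displays can be sketched in a few lines (a sketch of conjugate Beta-binomial updating, not the app’s actual code; the function name is mine, and the uniform Beta(1, 1) prior is an assumption matching the demo’s setup):

```python
# Beta-binomial updating of the group's propensity to say "odd",
# starting from a uniform Beta(1, 1) prior and updating one answer at a time.

def update(a, b, says_odd):
    """Update the Beta(a, b) belief with a single answer."""
    return (a + 1, b) if says_odd else (a, b + 1)

a, b = 1, 1  # uniform prior on the propensity to say "odd"
answers = ["odd"] * 19 + ["even"] * 11  # the workshop's final tally
for ans in answers:
    a, b = update(a, b, ans == "odd")

# Posterior is Beta(1 + 19, 1 + 11) = Beta(20, 12)
posterior_mean = a / (a + b)  # (1 + 19) / (2 + 30) = 0.625
```

Note that the order in which the answers arrive does not matter: sequential updating after each participant and a single batch update on all 30 answers land on the same Beta(20, 12) posterior.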

## Highlight 3: Twelve Hungry Zombies

For a class project, Sunday Mourning interviewed 12 zombies and queried their appetitive state. All 12 zombies indicated they were hungry. How much evidence is this for the general law that “all zombies are hungry”? The null hypothesis or general law is instantiated through a binomial rate parameter θ=1. If the alternative hypothesis assigns θ a uniform prior (yes, yes, I know) then the Bayes factor is 13 in favor of the null. With N hungry zombies, the Bayes factor in favor of the general law equals N+1. The example was used to illustrate how the Bayes factor rewards models that make risky predictions, and, later, how the answer (i.e., the evidence) changes if you change the question (i.e., the prior distribution on θ).
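The arithmetic behind BF = N + 1 is short enough to check directly (a sketch; the function name is mine, but the two marginal likelihoods follow the setup in the text: θ = 1 under the null, a uniform prior on θ under the alternative):

```python
# Bayes factor for "all N zombies are hungry" under a point null vs. a
# uniform prior on the binomial rate theta.
#   p(data | H0: theta = 1)        = 1**N = 1
#   p(data | H1: theta ~ U(0, 1))  = integral of theta**N dtheta = 1 / (N + 1)
# so BF01 = 1 / (1 / (N + 1)) = N + 1.

def bf01_all_hungry(n):
    marginal_h0 = 1.0             # theta = 1 predicts hunger with certainty
    marginal_h1 = 1.0 / (n + 1)   # uniform prior spreads its bets over [0, 1]
    return marginal_h0 / marginal_h1

bf_12_zombies = bf01_all_hungry(12)  # twelve hungry zombies: BF01 = 13
```

The risky prediction pays off: H0 stakes everything on θ = 1, while H1 wastes prior mass on values of θ that the data then contradict. Change the prior on θ (the question) and the Bayes factor (the answer) changes with it.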

Note that the p-value is always 1, and the post-hoc power (yuk!) is always 0, regardless of how many hungry zombies were observed. The zombie example (well, it did not involve zombies, and there weren’t twelve, but otherwise it was identical) inspired Dorothy Wrinch and Harold Jeffreys in 1921 to lay the foundation for the Bayes factor hypothesis test. More specifically, without the Bayes factor (that is, without taking the general law θ=1 seriously), the answer to the question “are all zombies (from an infinite zombie population) hungry?” would always be “absolutely not!”. Wrinch and Jeffreys felt this foregone conclusion was “preposterous”, because we would “never be sure that apples fall from apple trees”. A recent historical overview published in Statistical Science is here (by Alex “The Voice” Etz, an undergrad at the time).
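The p-value claim is easy to verify (a sketch; the function name is mine): under H0 the general law θ = 1 makes “all N zombies are hungry” the only possible outcome, so the observed data can never count as extreme, and the p-value is 1 for every N.

```python
# One-sided p-value for observing n hungry zombies out of n under H0: theta = 1.
# P(at least n of n hungry | theta) = theta**n, which equals 1 when theta = 1,
# no matter how large n grows.

def p_value_all_hungry(n, theta0=1.0):
    return theta0 ** n

p_values = [p_value_all_hungry(n) for n in (1, 12, 1000)]  # all equal 1.0
```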

## Highlight 4: Arthropods

Richard Morey described the Bayesian repeated measures ANOVA (really a linear mixed model) for an analysis of how eager people are to kill, or otherwise get rid of, scary insects. Oh wait, they are arthropods.

## Highlight 5: A Technical Term

The complete link to the paper from the tweet is here.

## Conclusion

We had a blast, and we’d like to thank the participants for their unwavering attention and smart questions. We even discovered a statistical mistake in a PNAS paper (tssk, tssk). We are looking forward to next year’s workshop.