How to Do a Hierarchical Regression in JASP

The latest JASP version, 0.8.3, introduced a plethora of new features, including hierarchical regression. This blog post briefly describes this analysis.

In traditional linear regression, predictors are selected that together form a statistical model; this model is then compared to the null model, which includes only the intercept term. The performance of the specified model is assessed with metrics such as the change in R², together with the associated F-test, relative to the null model. However, it can be informative to specify a null model that includes not only the intercept, but other predictors as well.
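The model comparison described above can be sketched in a few lines of Python. This is a minimal illustration, not JASP's implementation: the data are simulated and the variable names ("adverts", "attract", "sales") merely mirror the example used later in this post. It fits a null model and a fuller model by ordinary least squares and computes the R² change with its F-test.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical simulated data loosely shaped like the example below:
# 'adverts' and 'attract' both carry some signal for 'sales'.
n = 200
adverts = rng.normal(600, 480, n)
attract = rng.normal(7, 1.5, n)
sales = 0.1 * adverts + 11 * attract + rng.normal(0, 60, n)

def r_squared(X, y):
    """Fit OLS with an intercept and return R^2."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    ss_res = resid @ resid
    ss_tot = ((y - y.mean()) ** 2).sum()
    return 1 - ss_res / ss_tot

# "Null" model: intercept + adverts; full model adds attract.
r2_null = r_squared(adverts.reshape(-1, 1), sales)
r2_full = r_squared(np.column_stack([adverts, attract]), sales)

# F-test for the R^2 change contributed by the added predictor(s).
k_null, k_full = 1, 2              # number of predictors, excluding intercept
k_extra = k_full - k_null
df_resid = n - k_full - 1
f_change = ((r2_full - r2_null) / k_extra) / ((1 - r2_full) / df_resid)
p_change = stats.f.sf(f_change, k_extra, df_resid)

print(f"R2 change: {r2_full - r2_null:.4f}")
print(f"F({k_extra}, {df_resid}) = {f_change:.2f}, p = {p_change:.4g}")
```

Because the full model nests the null model, R² can only go up when predictors are added; the F-test asks whether that increase is larger than chance alone would produce.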

To illustrate, we borrow a data set from Andy Field’s popular statistics textbook and consider album sales (the criterion variable that we wish to predict) along with advertisement budget, attractiveness of the band, and the number of airplays the album received [click here to download the JASP file containing the data set, analysis, and annotations]. We can then use linear regression to determine which variables predict album sales. Previously in JASP, we had to include all the predictors in the model and test it against the null model containing only the intercept. This did not allow one to assess the additional effect of a particular predictor after the effects of the other predictors had already been accounted for. This was unfortunate because, if a model with two predictors outperforms the intercept-only model, it does not necessarily mean that both predictors are meaningful. Now, with hierarchical regression added in JASP 0.8.3, the user can choose which predictors to include in the null model.

For example, let’s predict album sales using bands’ attractiveness, while having accounted for advertisement budget. In other words, let’s assess the extent to which bands’ attractiveness has predictive worth over and above the advertisement budget. In order to accomplish this task in JASP, we first specify the linear model, with sales as the dependent variable and “attract” and “adverts” as predictor variables:

To include “adverts” in the null model, we go to the Model submenu, where all predictors are listed, and tick the checkbox next to “adverts”. In the Statistics submenu, we tick the “R² change” checkbox to display the predictive performance of the two models. The model summary table then shows the change in R², together with the corresponding F-statistic and its p-value:
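For readers who want to check the numbers in JASP's model summary table against another tool, the same hierarchical comparison can be sketched with the statsmodels package. The data here are simulated stand-ins; with the actual Field data set loaded into the data frame instead, the F and p values should match JASP's "R² change" output.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

rng = np.random.default_rng(1)
n = 200

# Hypothetical simulated data; replace with the real data set to reproduce
# the analysis from this post.
df = pd.DataFrame({
    "adverts": rng.normal(600, 480, n),
    "attract": rng.normal(7, 1.5, n),
})
df["sales"] = 0.1 * df["adverts"] + 11 * df["attract"] + rng.normal(0, 60, n)

# Model 0 (the "null" model): intercept + adverts.
m0 = smf.ols("sales ~ adverts", data=df).fit()
# Model 1: intercept + adverts + attract.
m1 = smf.ols("sales ~ adverts + attract", data=df).fit()

# anova_lm on the two nested fits performs the F-change test reported
# in a hierarchical-regression model summary table.
print(anova_lm(m0, m1))
print(f"R2 change: {m1.rsquared - m0.rsquared:.4f}")
```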

Both metrics indicate that attractiveness has additional predictive worth over and above the advertisement budget alone. The full procedure can also be viewed in the following .gif image:

Feel free to try out hierarchical regression yourself, on your own data – and tell us on our GitHub page if you feel a feature is still missing!


JASP file including the data set and the analysis discussed here.


About the author

Johnny van Doorn

Johnny van Doorn is a PhD candidate at the Psychological Methods department of the University of Amsterdam. At JASP, he is responsible for Bayesian nonparametric analyses.