Using heteroskedasticity-consistent standard errors and the bootstrap for linear regression analysis in SPSS: A tutorial

Published in Advances in Methods and Practices in Psychological Science, 2026

In the landscape of statistical software, ranging from customizable programming-language-based systems to point-and-click interfaces, SPSS remains a popular choice among researchers. In SPSS, conventional analyses, such as ordinary least squares regression, can be performed easily. However, violations of assumptions such as homoskedasticity or normality of the errors can distort Type I error rates or reduce statistical power. SPSS provides a number of alternative inference methods for linear regression, but accessing them is not always straightforward. To facilitate data analysis when the assumptions underlying conventional inference methods are not met, this tutorial provides applied researchers, particularly SPSS users, with a guide to performing linear regression analyses using heteroskedasticity-consistent (HC) standard errors (HC3 and HC4) and two bootstrap resampling methods (the pairs bootstrap and the wild bootstrap). Each bootstrap method can further be combined with a bootstrap p value, a percentile confidence interval, or a bias-corrected and accelerated confidence interval. For illustration, the methods are then compared using a computer-generated data set. Although this article focuses on applied researchers who mainly use SPSS for their analyses, the supplementary materials include a tutorial showing how to perform everything shown here in R (with custom functions).
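To give a flavor of two of the methods named in the abstract, the sketch below implements HC3 standard errors and a pairs-bootstrap percentile confidence interval for OLS coefficients in Python with NumPy. This is a minimal illustration of the general techniques, not the paper's SPSS procedure or its supplementary R code; the synthetic data, function names, and all parameter choices (e.g. 2,000 bootstrap replicates) are assumptions made here for demonstration.

```python
import numpy as np

def ols_fit(X, y):
    # OLS coefficients via least squares; X includes an intercept column.
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def hc3_se(X, y):
    # HC3 heteroskedasticity-consistent standard errors:
    # squared residuals are inflated by 1/(1 - h_i)^2, where h_i is the leverage.
    beta = ols_fit(X, y)
    resid = y - X @ beta
    XtX_inv = np.linalg.inv(X.T @ X)
    h = np.einsum("ij,jk,ik->i", X, XtX_inv, X)   # leverage (hat-matrix diagonal)
    omega = (resid / (1.0 - h)) ** 2
    cov = XtX_inv @ (X.T * omega) @ X @ XtX_inv   # sandwich estimator
    return np.sqrt(np.diag(cov))

def pairs_bootstrap_ci(X, y, n_boot=2000, alpha=0.05, seed=0):
    # Pairs bootstrap: resample whole (x_i, y_i) rows with replacement,
    # refit OLS each time, and take percentile confidence limits.
    rng = np.random.default_rng(seed)
    n = len(y)
    boots = np.empty((n_boot, X.shape[1]))
    for b in range(n_boot):
        idx = rng.integers(0, n, n)
        boots[b] = ols_fit(X[idx], y[idx])
    lo = np.percentile(boots, 100 * alpha / 2, axis=0)
    hi = np.percentile(boots, 100 * (1 - alpha / 2), axis=0)
    return lo, hi

# Demo on synthetic heteroskedastic data (error spread grows with x).
rng = np.random.default_rng(42)
n = 200
x = rng.uniform(0, 10, n)
y = 1.0 + 0.5 * x + rng.normal(0, 0.2 + 0.3 * x)
X = np.column_stack([np.ones(n), x])

print("coefficients:", ols_fit(X, y))
print("HC3 SEs:     ", hc3_se(X, y))
print("95% pairs-bootstrap percentile CI:", pairs_bootstrap_ci(X, y))
```

Because the pairs bootstrap resamples predictors together with responses, it remains valid under heteroskedasticity without modeling the error variance; the wild bootstrap discussed in the paper instead keeps the design fixed and perturbs the residuals.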

Recommended citation: Rajh-Weber, H., Huber, S. E., & Arendasy, M. (2026). Using heteroskedasticity-consistent standard errors and the bootstrap for linear regression analysis in SPSS: A tutorial. Advances in Methods and Practices in Psychological Science, 9(1), 1-9.