
Regression, regression, regression. It seems these days that all I ever do is build regression models. We all know that the linear scale shows the absolute change whereas the log scale shows the relative change of the data (see this). That is, in a linear regression, you interpret log-transformed variables in terms of percentage changes. Here’s a quick breakdown:

  • If your response variable y is log-transformed and you have one predictor x with coefficient b1, then a one-unit increase in x is associated with approximately a (b1 * 100) percent change in y (exactly, y is multiplied by e^b1; the percentage approximation holds for small b1).
  • If only your predictor is log-transformed (i.e., you regress y on log(x)), then a one percent increase in x is associated with a change of about b1/100 in y, in y’s original units.
  • If both the response and the predictor are log-transformed, then a one percent increase in x is associated with approximately a b1 percent change in y; in other words, b1 is an elasticity. This last case is the one worked through in the sketch below.
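To make that last case concrete, here is a minimal sketch, assuming numpy and statsmodels are available and using simulated data (the 0.7 exponent and all other numbers are made up for illustration):

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
x = rng.uniform(1, 100, size=500)
# True relationship: y = 3 * x^0.7 * multiplicative noise,
# so the log-log slope b1 should come out near 0.7.
y = 3 * x**0.7 * rng.lognormal(mean=0.0, sigma=0.2, size=500)

X = sm.add_constant(np.log(x))    # design matrix: intercept + log(x)
fit = sm.OLS(np.log(y), X).fit()  # regress log(y) on log(x)

b1 = fit.params[1]
print(f"b1 = {b1:.3f}")  # roughly 0.7
print(f"interpretation: a 1% increase in x ~ a {b1:.2f}% increase in y")
```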

Log transformations are useful for highly skewed distributions (i.e., when a Box-Cox test tells you lambda is near 0). Money amounts are often log-transformed because it makes sense to look at the values multiplicatively (nearly all macroeconomics studies I come across essentially take logs). For example, if I make $10K and get a raise of $5K, that’s huge since the raise is 50% of my income. But if I make $100K and get a raise of $5K, it might not matter so much comparatively.
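As a quick check of the Box-Cox point above, here is a small sketch, assuming scipy is available and using simulated, income-like data (the parameters are arbitrary): a fitted lambda near 0 supports taking logs, while a lambda near 1 suggests leaving the variable alone.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Right-skewed, log-normal-ish "incomes" (simulated, purely illustrative).
incomes = rng.lognormal(mean=10.0, sigma=1.0, size=1000)

# boxcox requires strictly positive data; lambda near 0 corresponds to log(x).
transformed, lmbda = stats.boxcox(incomes)
print(f"estimated lambda = {lmbda:.3f}")  # close to 0 here, so a log makes sense
```

Now, there are two interesting things about log transformations that people (myself included) often forget: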

  1. Log transformations (or any transformation of the dependent variable, for that matter) change the p-values of your coefficients, sometimes enough to alter your decision of whether to reject the null hypothesis. In one of my projects, my colleagues and I debated many times over whether or not we should do the log transform because it affected our decision. You can essentially steer your regression conclusions to fit a narrative simply by choosing whether or not to transform your variables. We eventually decided to do the transformation because it makes for better interpretation, though we did make note of it. (Both this and the next point are illustrated in the sketch after this list.)

  2. Taking the log does not make the distribution normal; rather, it makes the distribution less skewed. Skewness isn’t always bad. For example, consider ordinal data like a movie rating from 1-5, which might be skewed high because the film is really good. Is a log transformation really necessary in this case? And even if taking the log of a predictor does make it look normal, recall that the normality assumption in regression is about the error distribution, not the predictor variables.
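Both caveats are easy to see on simulated data. The sketch below, assuming numpy, scipy, and statsmodels (all numbers are illustrative, not from any real study), fits the same predictor with and without logging y, then logs a skewed variable and tests it for normality:

```python
import numpy as np
import statsmodels.api as sm
from scipy import stats

rng = np.random.default_rng(7)
n = 200

# Caveat 1: the same predictor, two models, two p-values.
x = rng.uniform(0, 10, size=n)
y = np.exp(0.05 * x) * rng.lognormal(mean=0.0, sigma=1.0, size=n)  # multiplicative noise
X = sm.add_constant(x)
p_raw = sm.OLS(y, X).fit().pvalues[1]          # regress y on x
p_log = sm.OLS(np.log(y), X).fit().pvalues[1]  # regress log(y) on x
print(f"p-value for x: raw y = {p_raw:.4f}, logged y = {p_log:.4f}")
# The two p-values generally differ, and in borderline cases like this one
# the choice of transformation can flip which side of 0.05 you land on.

# Caveat 2: logging reduces skew but does not guarantee normality.
w = rng.exponential(scale=1.0, size=n)  # skewness of Exp(1) is about 2
print(f"skewness: raw = {stats.skew(w):.2f}, logged = {stats.skew(np.log(w)):.2f}")
print(f"Shapiro-Wilk p on log(w) = {stats.shapiro(np.log(w)).pvalue:.4f}")
# log(w) is less skewed but follows a (negative) Gumbel shape,
# so the normality test typically still rejects.
```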

TL;DR: Think about why you are taking a log transformation. It isn’t always necessary, and it can change the significance of your regression predictors.