NCCR Postdoc opportunity in Geneva

Postdoctoral Position in Sociology (60%)

The nccr – on the move and the HETS School of Social Work of Geneva are seeking applicants for a Postdoctoral Position in Sociology to carry out research and contribute to research projects in the area of post-retirement international mobilities, transnational lifestyles, and care configurations. Applications should be submitted before the 10th of June 2020.

Mixed methods, Stata or similar

Deadline: 10 June (!) 2020

Further Information

An Ode to Low R2

It’s the time of the year when many of us do our share of grading. In my case, it’s quantitative projects, and every time I’m impressed by how much the students learn. One thing that sometimes annoys me is seeing how many of them (MA students) insist on interpreting R2 in absolute terms (rather than using it to compare similar models, for instance). That’s something they seem to learn in their BA course:

[in this simple model with three predictor variables], we only explain 3% of the variance; it’s a ‘bad’ model.

I paraphrased, of course. But I have come to like low R2: They are a testament to the complexity of humans and their social world. They are a testament to the fact that we are not machines; we live in a world where quantitative analysis is about tendencies. Just imagine a world in which, knowing only your age and gender, I could perfectly predict your political preferences… So there you have it: low R2 are great!
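To see how a perfectly real effect can coexist with a low R2, here’s a small simulation sketch in R (the effect size of 0.2 and the sample size are arbitrary choices for illustration):

```r
set.seed(1)
n <- 5000
x <- rnorm(n)
y <- 0.2 * x + rnorm(n)   # a real effect of x, but noise dominates

fit <- lm(y ~ x)
summary(fit)$r.squared    # low (around 0.04), yet the effect is there
```

The estimated coefficient on x is highly significant, even though the model “only” explains a few percent of the variance.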

Causal Inference: The Mixtape

Here’s a nice overview of causal inference by Scott Cunningham. Yes, you get an entire book as a free download, and it has you covered from probability to Pearl’s directed acyclic graphs, from instrumental variables to synthetic control. It comes across as quite friendly, but has enough econometrics to scare many off. I quite enjoyed the historical bits thrown in here and there to explain where the methods came from.

Correlations Graphics in R

Correlations are among the basics of quantitative analysis, and they are well suited to graphical examination. Using plots, we can see, for example, whether it is justified to assume a linear relationship between the variables. Scatter plots are our friends here, and with two variables it is as simple as calling plot() in R:

plot(var1, var2)
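As a self-contained illustration, here is the same call with the built-in mtcars data standing in for the generic var1 and var2:

```r
# Scatter plot of car weight against fuel consumption (built-in mtcars data)
plot(mtcars$wt, mtcars$mpg,
     xlab = "Weight (1000 lbs)", ylab = "Miles per gallon")
```

The downward-sloping cloud of points suggests a (roughly linear) negative association.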

If we have more than two variables, it can be useful to plot a scatter plot matrix: multiple scatter plots in one go. The pairs() command is built in, but in my view not the most useful one out there. Here we use cbind() to combine a few variables, and specify that we don’t want to see the same scatter plots (mirrored) in the upper panel.

pairs(cbind(var1, var2, var3, var4), upper.panel = NULL)
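With the built-in mtcars data as a stand-in for var1 through var4, this might look like:

```r
# Scatter plot matrix of four mtcars variables;
# upper.panel = NULL suppresses the mirrored plots above the diagonal
pairs(mtcars[, c("mpg", "wt", "hp", "disp")], upper.panel = NULL)
```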

A more flexible method is scatterplotMatrix() from library(car). If this is not flexible enough, we can always split the plot and draw whatever we need, but that’s not for today.

library(car)
scatterplotMatrix(cbind(var1, var2, var3, var4))

If we have many more variables, it becomes necessary to draw multiple plots to be able to see what is going on. However, sometimes, after having checked that the associations are more or less linear, we’re simply interested in the strength and direction of the correlations for many combinations of variables. I guess the classic approach is staring at a large table of correlation coefficients, but, as is often the case, graphics can make your life easier, in this case with library(corrplot):

library(corrplot)
corrplot(cor(object_with_many_variables), method = "circle", type = "lower", diag = FALSE)
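Note that corrplot() expects a correlation matrix rather than the raw data, so the matrix is computed first with cor(), which is plain base R. A sketch with a few mtcars variables as stand-ins:

```r
# The correlation matrix that corrplot() would display
# (and the "large table of coefficients" we'd otherwise stare at)
cormat <- cor(mtcars[, c("mpg", "wt", "hp", "disp", "qsec")])
round(cormat, 2)
```

Passing cormat to corrplot(cormat, type = "lower", diag = FALSE) then produces the graphic.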

This is certainly more pleasant than staring at a table…

For all these commands, R offers plenty of ways to tweak the output.