There is no real data on this, and likely never will be.
A meaningful empirical test would require a large codebase, substantial development effort, and experienced developers. But experienced developers are too busy paying their bills to spend 1,000 hours building a toy project.
I’ve seen some tests carried out at universities, but no matter how large the projects are, they don’t involve experienced developers. I’ve gone through many studies and have always come away disappointed.
A wealthy company like Apple or Google might conduct a large-scale study, since the results could yield immediate real-world gains for them.
It probably would have been a useful exercise for Twitter employees before they were nixed.
That being said, pretty much every language is good enough to make something useful.
The desire for increased productivity is such an industrialized way of thinking about coding (especially when it’s for personal projects). What’s most important is mindful consistency.
You’ll get far more done by committing to finishing projects than by obsessing over which language offers marginal productivity gains.
Pick a language you like. Make something you want to exist. Enjoy.