It’s impossible to measure programmer productivity.
We don’t even know what the right measure is.
That doesn’t mean we should throw the baby out with the bathwater and measure nothing.
One thing we can measure is: how many lines of code did you write?
Some people are adamant that lines of code written is a terrible metric.
I get the argument, but I disagree with the conclusion.
If the minimum number of lines of code needed to implement X is 10 thousand, someone who writes 10 lines of code per day will take close to 3 years to complete the task.
Someone who does 100 lines of code, all else being equal, will take 100 days. Big difference.
What you shouldn’t do is reward programmers for how many lines of code they write. That’s lazy management that inevitably leads to programmers gaming the metric and writing worse code as a result.
Still, I like to know a ballpark number of lines of code written.
Then there’s curiosity. Some people tend to vastly over-estimate how many lines of code they can write. I have heard people talking about thousands of lines of code per day.
To track my performance, I wrote a script to calculate number of lines of code written per day.
It takes the latest commit of each day and compares its line count with the latest commit of the previous day that had commits.
It only counts the days with at least one commit.
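A minimal sketch of that idea (this is my reconstruction, not the actual script): use `git log` to find the latest commit of each day that has commits, count total lines in each of those commits, and report the day-over-day difference.

```python
# Sketch: lines of code written per day, from git history.
# Assumes the git CLI is available and the repo uses text files.
import subprocess
from collections import OrderedDict

def last_commit_per_day(repo="."):
    """Map each day with at least one commit to its latest commit hash."""
    out = subprocess.run(
        ["git", "-C", repo, "log", "--pretty=%H %ad", "--date=short"],
        capture_output=True, text=True, check=True,
    ).stdout
    latest = OrderedDict()
    # git log lists newest first, so the first hash seen for a day
    # is that day's latest commit
    for line in out.splitlines():
        sha, day = line.split()
        latest.setdefault(day, sha)
    return OrderedDict(sorted(latest.items()))

def count_lines(repo, sha):
    """Total lines across all files in a given commit."""
    files = subprocess.run(
        ["git", "-C", repo, "ls-tree", "-r", "--name-only", sha],
        capture_output=True, text=True, check=True,
    ).stdout.splitlines()
    total = 0
    for f in files:
        blob = subprocess.run(
            ["git", "-C", repo, "show", f"{sha}:{f}"],
            capture_output=True, text=True,
        ).stdout
        total += blob.count("\n")
    return total

def lines_per_day(day_totals):
    """Given ordered (day, total_lines) pairs, return (day, lines_written)."""
    deltas, prev = [], 0
    for day, total in day_totals:
        deltas.append((day, total - prev))
        prev = total
    return deltas

if __name__ == "__main__":
    days = last_commit_per_day(".")
    totals = [(day, count_lines(".", sha)) for day, sha in days.items()]
    for day, delta in lines_per_day(totals):
        print(day, delta)
```

Days without commits simply don’t appear in the output, matching the rule above; a gap between commit days gets attributed to the later day.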
After 31 days, Filerion consists of 12k lines of code, or 388 per day.
Is 388 per day a lot? I have no idea. I don’t have the data for other people.
My most productive day so far was 927 lines of code.