Google Answers receives two kinds of payments for its efforts in facilitating matches between askers and answerers.
Answerers' experience on the job is easily measured: the number of questions an answerer has previously answered, which I call "contemporary answerer experience" or simply experience.
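As a concrete illustration (not the paper's actual code), contemporary experience can be computed from a chronologically ordered log of answers; the field names and data layout here are hypothetical:

```python
from collections import defaultdict

def contemporary_experience(answers):
    """Given (answerer_id, timestamp) pairs sorted by timestamp, return,
    for each answer, the number of questions that answerer had previously
    answered -- its contemporary experience at the time of answering."""
    counts = defaultdict(int)  # answers completed so far, per answerer
    experience = []
    for answerer_id, _ts in answers:
        experience.append(counts[answerer_id])  # experience *before* this answer
        counts[answerer_id] += 1
    return experience

# Hypothetical log: answerer "a" answers three questions, "b" answers one.
log = [("a", 1), ("b", 2), ("a", 3), ("a", 4)]
print(contemporary_experience(log))  # [0, 0, 1, 2]
```

Note that each answer is credited with the count of *prior* answers only, so an answerer's first answer carries experience zero.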
In principle, the higher ratings of more experienced answerers could result from a selection effect wherein higher quality answerers both enjoy higher ratings and elect to participate more or longer.
Answerers adjust their behavior to suit asker preferences for length and URL count.
But group norms and the limited "lock" function induce a race among answerers.
However, I have no reason to think the bias varies substantially across different kinds of questions or answerers, so this overstatement of effort does not suggest bias in my estimation of factors affecting pay per minute.
Measuring answerer time as detailed above, the base pay for answerers with no experience is on the order of $0.
The answerers who provide longer answers earn higher pay per minute even after controlling for experience.
As answerers gain experience, they often specialize in particular kinds of questions.
I find a statistically significant positive coefficient on the specialization index when predicting experience, implying that on the whole, more experienced answerers are more specialized.
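The text does not define the specialization index at this point; a common construction for such an index, sketched here as an assumption rather than the paper's definition, is a Herfindahl-style sum of squared shares of an answerer's answers across question categories:

```python
from collections import Counter

def specialization_index(categories):
    """Herfindahl-style specialization index: the sum of squared shares of
    an answerer's answers across question categories. Equals 1 when all
    answers fall in a single category and approaches 1/k when answers are
    spread evenly over k categories."""
    n = len(categories)
    counts = Counter(categories)
    return sum((c / n) ** 2 for c in counts.values())

# A hypothetical specialist versus a generalist:
print(specialization_index(["tech"] * 4))                         # 1.0
print(specialization_index(["tech", "law", "health", "travel"]))  # 0.25
```

Under this construction, a positive association between experience and the index means that experienced answerers concentrate their answers in fewer categories.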