Here's a theory you can kick around: Public schools were badly damaged by the feminist movement. Before the 1970s, teaching and nursing were among the very few professions that were wide open to talented women. Because the public schools had a more-or-less captive labor supply, schools didn't have to pay competitive salaries -- teachers, mostly women, couldn't get into other, better-paying fields. Once the rest of the economy opened to women, education lost its hold on the supply of talented women. I keep seeing solutions I'd characterize as "fiat economics": attempts to simply declare that teachers will become better.
You have several choices: