[p2p-research] Is the "lump of labor fallacy" itself a fallacy?

Paul D. Fernhout pdfernhout at kurtz-fernhout.com
Sat Nov 28 23:17:24 CET 2009


Michel Bauwens wrote:
>> I found this, but it is not exactly the same:
>> http://www.livingwagesonoma.org/
> 
> Paul: this is the site where I found it, in one of the articles they
> mentioned that if the New Deal social contract had persisted (i.e.
> productivity growth is shared through higher wages), then wages would have
> been so much higher.

Searching:
http://www.google.com/search?hl=en&q=site:www.livingwagesonoma.org+productivity+%22minimum+wage%22

One item from there, from 2004:
http://www.livingwagesonoma.org/news_read.php?id=73
"""
Yet the federal minimum wage remains stalled for the eighth year in a row at 
$5.15 an hour--a shocking $10,712 annually for fifty-two weeks of full-time 
work. This, of course, at the same time the Bush Administration unblushingly 
escorts the wealthiest Americans onto the tax-break gravy train.
   "Certainly $5.15 an hour is not a living wage," scoffs Robert Pollin, 
economics professor at the University of Massachusetts and one of the 
nation's leading experts on the economics of living-wage law. If the minimum 
wage had been raised with inflation and the productivity rate since 1968, 
when the minimum wage was at its peak, Pollin says, it would be $14.50 an 
hour. Activists have long recognized the stalemate at the federal level that 
has arrested the minimum wage at subpar standards, and the living wage was 
an attempt to remedy that.
"""

Perhaps you remembered US$14.50 as US$40? Or maybe they made another calculation?

I think they may be making a mistake in starting from 1968. That figure 
would be higher if they worked from 1938 and factored in the 30 years of 
productivity gains not fully reflected in the 1968 wage (as I did to get 
US$14.56 starting from 1938, which I then fudged up to US$20 based on more 
recent productivity increases).
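
To make the arithmetic explicit, the adjustment being argued over is just 
compounding: take a base-year wage and multiply it by cumulative inflation 
and cumulative productivity growth. A minimal sketch in Python follows; the 
base wages ($0.25 in 1938, $1.60 in 1968) are the historical federal 
minimums, but the inflation and productivity multipliers are rough 
illustrative assumptions, not authoritative series.

# Sketch: scale a historical minimum wage by cumulative inflation and
# cumulative productivity growth. The multipliers are illustrative only.

def adjusted_wage(base_wage, inflation_multiplier, productivity_multiplier):
    # Base-year wage scaled by price growth and by productivity growth.
    return base_wage * inflation_multiplier * productivity_multiplier

# From 1938: $0.25/hour base, ~15x prices, ~4x productivity (assumed)
print(adjusted_wage(0.25, 15.0, 4.0))   # roughly $15/hour

# From 1968: $1.60/hour base, ~6x prices, ~1.5x productivity (assumed)
print(adjusted_wage(1.60, 6.0, 1.5))    # roughly $14.40/hour

The point is only that the later you set the baseline, the more of the 
earlier productivity gains you silently write off.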

Of course, one can then argue about whether inflation is properly accounted 
for (the exponential improvement in computing at lower cost is being used to 
offset things like enormous housing and college costs), or whether 
productivity is properly accounted for. These are very political numbers, in 
that they tie to things like cost-of-living increases for pensions and 
Social Security. Can one really trust what mainstream economists say about 
them? And some of this is subjective, because the quality of some things has 
improved (cars are better) while the quality of others has worsened (food in 
general is less nutritious now, and almost *all* of it used to be "organic" 
in a way).

For example, the productivity of primary industries like agriculture and 
manufacturing has gone up enormously over the past century. Most farmers are 
literally 50 to 100 times more "productive" than they were 100 years ago 
(ignoring fossil fuel use, environmental damage, and lower food quality, 
negatives that could be remedied fairly easily, as I've mentioned before, 
with organic methods and ground rock dust), considering that nearly everyone 
used to work in agriculture and now almost no one does. Even then, most of 
agriculture goes to produce meat, and about 50% of what is grown is left to 
rot in the fields or otherwise discarded.
http://www.csmonitor.com/2009/1102/p07s01-lign.html
"""
"There's so much food available in fields. It's astounding how much is 
wasted," says Almquist, who graduated from nearby Middlebury College in June 
with a degree in environmental studies. A 2004 report from the University of 
Arizona in Tucson estimates that 40 to 50 percent of all the food that could 
be harvested from fields will never be eaten.
"""

Manufacturing is also several times more productive in a variety of ways 
(maybe even now approaching fifty times or more for specific things like 
producing paperclips or nails by machine). So services may be what is 
dragging down the aggregate productivity increase. If we considered 
productivity in only manufacturing and agriculture, perhaps a 
productivity-adjusted minimum wage should be more like US$100 an hour or 
more? :-) Just guessing; see the sketch below. At some point these numbers 
become meaningless.
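
One way to see why that guess swings so wildly: the economy-wide 
productivity figure is an employment-weighted average across sectors, so 
which sectors you count changes it enormously. Here is a small sketch with 
made-up round numbers (the shares and multipliers are assumptions for 
illustration only, not measured data).

# Illustration only: the aggregate productivity multiplier is an
# employment-share-weighted average, so it depends on which sectors count.

def blended_multiplier(sectors):
    # Employment-share-weighted average of per-sector productivity gains.
    return sum(share * gain for share, gain in sectors.values())

economy = {
    "agriculture":   (0.02, 50.0),  # tiny employment share, huge gain
    "manufacturing": (0.10, 10.0),
    "services":      (0.88, 1.5),   # big share, modest measured gain
}
print(blended_multiplier(economy))   # ~3.3x for the whole economy

# Count only the goods-producing sectors, renormalizing their shares:
goods = {k: v for k, v in economy.items() if k != "services"}
total_share = sum(share for share, _ in goods.values())
goods = {k: (share / total_share, gain) for k, (share, gain) in goods.items()}
print(blended_multiplier(goods))     # ~16.7x for goods production alone

Plug a multiplier like that last one into the same compounding arithmetic 
and the result heads toward the US$100-an-hour guess above, which is really 
just a different (and debatable) choice of what to count.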

Consider this:
http://en.wikipedia.org/wiki/Gross_world_product
"Gross world product (GWP) is the total gross national product of all the 
countries in the world. This also equals the total gross domestic product. 
See measures of national income and output for more details. The per capita 
GWP in 2008 was approximately $10,500 US dollars (USD).[1] The 
Intergovernmental Panel on Climate Change (IPCC), in their Third Assessment 
Report (TAR), predicts a maximum per-capita gross world product in 2100 of 
approximately $140,000 (in year 2000 dollars)."

OK, so people are, with a straight face, projecting that everyone (on 
average) in 2100 will be earning US$140K a year per person (in 2000 
dollars), so a family of four would have an average income of US$560,000 a 
year, even though such a family can get by OK right now on US$50K a year 
(about the median income for a family of four). Yet, with all families on 
the planet predicted to be making roughly ten times as much as now, where is 
the discussion of how our entire economic system will change if most people 
decide to opt out of the "rat race" (where, as a comedian said, even if you 
win you are still a rat)?

Also, in such a world, with average families having incomes of over half a 
million dollars a year, would a "minimum wage" of, say, US$10K or US$20K a 
year make any sense?

Anyway, the numbers, and the policies they connect to, clearly stop making 
sense at some point (if they ever did).

--Paul Fernhout
http://www.pdfernhout.net/


