Motivated by an article in yesterday's NY Times, here is the linkage between real wages and productivity as a matter of economic theory. The basic logic, as taught in economics textbooks, runs as follows:
Economic theory says that the wage
a worker earns, measured in units of output, equals the amount of
output the worker can produce. Otherwise, competitive firms would have
an incentive to alter the number of workers they hire, and these
adjustments would bring wages and productivity in line. If the wage were
below productivity, firms would find it profitable to hire more
workers. This would put upward pressure on wages and, because of
diminishing returns, downward pressure on productivity. Conversely, if
the wage were above productivity, firms would find it profitable to shed
labor, putting downward pressure on wages and upward pressure on
productivity. In equilibrium, the wage of a worker equals what that worker can produce.
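In symbols, this is the standard first-order condition of the competitive firm; the following is a textbook restatement, not anything specific to the Times article:

```latex
% A competitive firm takes the output price P and the nominal wage W
% as given and chooses labor L to maximize profit:
%   max_L  P * F(K, L) - W * L
% The first-order condition is
\[
  P \cdot \frac{\partial F}{\partial L} = W
  \quad\Longrightarrow\quad
  \frac{W}{P} = MPL .
\]
% The real wage W/P (the wage measured in units of output) equals the
% marginal product of labor. If W/P < MPL, hiring another worker raises
% profit; if W/P > MPL, shedding a worker does.
```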
Why don’t real wages and productivity always line up in the data? There are several reasons:
1. The relevant measure of wages is total compensation, which includes cash wages and fringe benefits. Some data include only cash wages. In an era when fringe benefits such as pensions and health care are significant parts of the compensation package, one should not expect cash wages to line up with productivity. (A quick numerical version of this point appears after the list.)
2. The price index is
important. Productivity is calculated from output data. From the
standpoint of testing basic theory, the right deflator to use to
calculate real wages is the price deflator for output. Sometimes,
however, real wages are deflated using a consumption deflator, rather
than an output deflator. To see why this matters, suppose
(hypothetically) the price of an imported good such as oil were to rise
significantly. A consumption price index would rise relative to an
output price index. Real wages computed with a consumption price index
would fall compared with productivity. But this does not disprove the theory: it just means the wrong price index has been used in evaluating the theory. (A numerical sketch of this deflator effect appears after the list.)
3. There is heterogeneity among workers. Productivity
is most easily calculated for the average worker in the economy: total
output divided by total hours worked. Not every type of worker, however,
will experience the same productivity change as the average. Average productivity is best compared with average
real wages. If you see average productivity compared with median wages or with the wages of only production workers, you should be concerned that the comparison is, from the standpoint of economic theory, the wrong one. (A simple example of how mean and median wages can diverge appears after the list.)
4. Labor is, of course, not the only input into
production. Capital is the other major input. According to theory, the
right measure of productivity for determining real wages is the marginal
product of labor, that is, the amount of output an incremental worker would produce, holding constant the amount of capital. With the standard Cobb-Douglas production function, marginal productivity (dY/dL) is proportional to average productivity (Y/L), which is what we can measure in the data; the algebra is sketched after the list. (See Chapter 3 of my intermediate macro text for a discussion of the Cobb-Douglas production function.) Keep in mind, however, that the Cobb-Douglas assumption of
constant factor shares is not perfect. In recent years, labor’s share in
income has fallen off a bit. (Between 2000 and 2005, employee
compensation as a percentage of gross domestic income fell from 58.2 to
56.8 percent.) From the Cobb-Douglas perspective, this means that the
marginal productivity of labor has fallen relative to average
productivity. This modest drop in labor’s share is not well understood,
but its importance should not be exaggerated. The Cobb-Douglas
production function, together with the neoclassical theory of
distribution, still seems a pretty good approximation for the U.S.
economy.
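Here is a quick numerical version of point 1; all figures are hypothetical and chosen only for illustration:

```python
# Hypothetical illustration of point 1: cash wages vs. total compensation.
cash_wage = [20.00, 20.40]   # dollars per hour, periods 0 and 1 (+2%)
fringe    = [4.00, 4.80]     # pensions, health care, etc. (+20%)

for t in (0, 1):
    compensation = cash_wage[t] + fringe[t]
    print(f"period {t}: cash wage={cash_wage[t]:.2f}  "
          f"total compensation={compensation:.2f}")

# Total compensation rises from 24.00 to 25.20 (+5%), and it is total
# compensation, not the cash wage alone (+2%), that theory ties to
# productivity.
```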
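Point 2 works similarly; the numbers below are again made up, chosen only to show the mechanism:

```python
# Hypothetical two-period example of the deflator effect (point 2).
# An oil price increase raises consumption prices relative to the
# prices of domestically produced output.

nominal_wage    = [20.00, 21.00]   # dollars per hour, periods 0 and 1
productivity    = [20.00, 21.00]   # output per hour, in base-period prices
output_deflator = [1.00, 1.00]     # price index for output (flat)
cons_deflator   = [1.00, 1.05]     # consumption prices rise 5% with oil

for t in (0, 1):
    w_output = nominal_wage[t] / output_deflator[t]  # theory's real wage
    w_cons   = nominal_wage[t] / cons_deflator[t]    # commonly reported
    print(f"period {t}: productivity={productivity[t]:.2f}  "
          f"wage/output deflator={w_output:.2f}  "
          f"wage/consumption deflator={w_cons:.2f}")

# Output-deflated wages rise 5 percent, exactly matching productivity
# growth, while consumption-deflated wages are flat. The apparent gap
# comes from the price index, not from a failure of the theory.
```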
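Point 3 can be illustrated the same way; the figures below are made up, and serve only to show how mean and median wages can diverge when gains are skewed toward high earners:

```python
import statistics

# Hypothetical wages for five workers (point 3): only the top earner gains.
wages_before = [10, 12, 14, 16, 48]   # dollars per hour
wages_after  = [10, 12, 14, 16, 68]

for label, w in (("before", wages_before), ("after", wages_after)):
    print(f"{label}: mean={statistics.mean(w):.1f}  "
          f"median={statistics.median(w):.1f}")

# The mean rises from 20.0 to 24.0 (+20%), which is what should be
# compared with average productivity; the median stays at 14.0.
# Comparing average productivity with the median wage would wrongly
# suggest that wages have come unhinged from productivity.
```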
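Finally, here is the Cobb-Douglas algebra behind point 4, standard textbook material:

```latex
% Cobb-Douglas production with constant labor share \alpha:
\[
  Y = A K^{1-\alpha} L^{\alpha}
\]
% The marginal product of labor is proportional to average productivity:
\[
  MPL = \frac{\partial Y}{\partial L}
      = \alpha A K^{1-\alpha} L^{\alpha-1}
      = \alpha \, \frac{Y}{L}
\]
% Combining this with the competitive condition W/P = MPL gives a
% constant labor share of income:
\[
  \frac{W}{P} = \alpha \frac{Y}{L}
  \quad\Longrightarrow\quad
  \frac{WL}{PY} = \alpha
\]
% Through this lens, the drop in measured labor share from 58.2 to 56.8
% percent of gross domestic income over 2000-2005 reads as the marginal
% product of labor falling relative to average productivity Y/L.
```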
An old article about wages and productivity