For any company, its most valuable asset is the people it employs to bring its stated mission to life. Even as automation continues to eliminate jobs once done by humans, the value that human minds bring to the table endures.
Unfortunately, many companies are coming to this realization much too slowly, especially in the fast food and retail industries. It’s a truth universally acknowledged that jobs in both of these sectors are essentially thankless. Part-time employees in these positions work long hours for an abysmal minimum wage, with no tips and sub-par working conditions. As a result, it can be difficult for companies to hold on to valuable employees for long.
To combat this, some major companies are starting to find other ways to keep employees on their payroll. Earlier this year, retail giant Walmart, which typically has a dreadful retention rate (estimated to be around 50 percent, per The New York Times), announced a plan to subsidize at least part of its 1.4 million employees’ college educations. More recently, a Sacramento, California, Chick-fil-A location made headlines by raising the minimum wage for employees to $17 an hour, a significant increase from California’s current minimum wage of $11 an hour.
Chik-Fil-A in Sacramento is raising their minimum wage to $17 an hour. I'M QUITTING MY JOB AND MOVING TO SACRAMENTO
— Elizabeth (@trillizabeth) May 31, 2018
But these two examples are outliers. For most of the country’s part-time workers, the federal minimum wage remains at a measly $7.25 an hour, where it’s stayed since it was last raised in 2009. It would be easy to assume that the answer is simply to pay workers more, but it’s actually much more complicated than that.
The Complicated Wage Issue
In the United States, the minimum wage is a bewildering topic. “One quickly learns that it’s a very messy area of economic or social science inquiry,” says Jeff Clemens, PhD, an assistant professor of economics at the University of California at San Diego. Clemens has been studying minimum wage since 2008, when the Great Recession impacted what he calls “low-skill” workers in the workforce.
“I had taken an interest in whether, or to what extent, [minimum wage] contributed to the significant declines in employment that occurred among low-skilled individuals during the period surrounding the Great Recession,” he says. “That was the recession during which employment among groups that would include teenagers, high-school dropouts, and other low-skilled folks went down pretty dramatically—more so than you might have expected it to during regular recessions.”
Curiously, this was also a period when the federal minimum wage rose from $5.15 an hour, where it had been stuck for a decade, to $7.25 an hour. According to Clemens, this happened “in a way that was differentially binding across U.S. states, depending on how high their minimum wage had been to begin with,” he says. “So there was a natural opportunity to go in and investigate whether being exposed to a particularly large increase had led to a particularly large change in employment.”
The resultant paper, written by Clemens and his colleague Michael Wither, had some surprising results. It found that minimum wage increases during that period actually reduced the employment-to-population ratio nationwide by 0.6 percentage points, accounting for 12 percent of the overall decline in employment from 2006 to 2012. The thinking is that as “low-skilled” workers’ output is devalued by business cycles, trade patterns, and technology, raising the minimum wage leaves those workers increasingly competing against higher-skilled individuals for the same jobs. They then either look elsewhere for work or don’t work at all.
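As a rough back-of-the-envelope check on those figures (an illustrative calculation, not one taken from the paper itself), if a 0.6 percentage-point reduction accounted for 12 percent of the overall decline, the implied total decline works out to about 5 percentage points:

```python
# Illustrative sanity check of the Clemens-Wither figures (not from the paper).
minimum_wage_effect_pp = 0.6   # decline attributed to minimum wage hikes, in percentage points
share_of_total_decline = 0.12  # fraction of the overall 2006-2012 decline it accounts for

# Implied overall decline in the employment-to-population ratio
total_decline_pp = minimum_wage_effect_pp / share_of_total_decline
print(f"Implied total decline: {total_decline_pp:.1f} percentage points")  # 5.0
```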
There’s also been a significant decrease in people ages 16–24 working summer jobs over the last 30 years. According to the Chicago Tribune, the last peak rate was 77.5 percent back in 1989. Last July, only four in ten teens had a summer job; back in 1978, that number was seven in ten. For some jobs in particular, like lifeguarding, where the labor force typically consists of young adults looking for summer work, this decline has forced managers across the country to turn to older adults to fill out their staffs.
Teens in high school can't get jobs because you have 30 year olds working in all the mcdonalds and mall spots
— coco mama. (@_nononini_) February 17, 2015
There are a number of factors at play in why teens are less likely to join the labor force, and employers are constantly forced to get creative about how to keep their employees. But Clemens says that companies need to think about more than just wages when it comes to retention.
Beyond the Wage
Getting paid a living wage is nice, but there are plenty of other things that employers need to provide employees in order to keep them around.
“Firms are effective providers of a bunch of things that workers, or just people in general, might want,” Clemens says. These things that workers might want are known as benefits. Benefits include health insurance, of course, but also perks you might not immediately think of, like a gym membership or even how nice the building is.
“The tax benefit comes from the fact that many of these things like health insurance, being the most famous of them, are purchased by the employer with pre-tax dollars,” says Clemens. “So if you get insurance through your employer, and say your tax rate is 25 percent, then you could essentially get the same insurance plan for 25 percent less than you would get if your employer gave you the money and then you had to turn around and buy insurance out on the market.”
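Clemens’s point can be sketched with some hypothetical numbers (the $6,000 plan cost below is assumed for illustration; only the 25 percent tax rate comes from his example). If the employer buys the plan with pre-tax dollars, it costs exactly the sticker price; if the employer instead pays you the equivalent in wages, those wages are taxed before you can buy the same plan yourself:

```python
# Illustrative sketch of the pre-tax benefit Clemens describes.
# The plan cost is a hypothetical number; the 25% rate comes from his example.
plan_cost = 6_000.0  # assumed annual cost of a health insurance plan
tax_rate = 0.25      # the 25 percent tax rate from the example

# To leave you plan_cost after taxes, the employer would have to pay you
# more gross wages than the plan itself costs when bought pre-tax.
gross_wages_needed = plan_cost / (1 - tax_rate)
extra_cost = gross_wages_needed - plan_cost

print(f"Gross wages needed to buy the plan yourself: ${gross_wages_needed:,.0f}")  # $8,000
print(f"Extra cost of the taxed route: ${extra_cost:,.0f}")  # $2,000
```

Put differently, the employer-purchased plan is effectively 25 percent cheaper, which is exactly the intuition in the quote above.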
Buying into an employer’s benefits is also advantageous to workers because of what economists call economies of scale. “The market that you would pay as an individual if you have to purchase insurance is famously dysfunctional relative to the market that your employer buys insurance in,” Clemens says. “That’s just a matter of your employer having a large scope, being able to handle the administrative cost.”
“And also in the insurance context,” he continues, “it’s the fact that it’s a big pool of people [buying the insurance]. The insurance company doesn’t have to worry that you’re a sick person that keeps showing up and being expensive to them.”
With these kinds of factors in play, it is beneficial to both the employer and the employee to compensate workers with more than just cash wages. And while the number of firms offering health insurance coverage to their part-time employees jumped 8 percent from 2014 to 2017, it still lags far behind the number for full-time employees. Thirty-four percent of the organizations surveyed offered health insurance coverage to their part-time workers, but 99 percent offered it to their full-time workers.
“You should expect that different workers and different [firms] might find it mutually advantageous in one context to ramp up the generosity of the health insurance, and in another context, they might decide that the higher wages are the way to go,” Clemens says.
Some food for thought: Cammi Caramella of Business News Daily pointed to Chipotle, Starbucks, Lowe’s, and Costco (among other companies) as employers who offer solid benefits packages to part-time employees. On the wages side of things, Richard Best of Investopedia pointed to Nordstrom, Costco, and Lowe’s for their higher-than-average hourly wages.
What does this mean for the consumer?
Of course, paying workers more means that eventually, consumers will have to share some of that burden as well. Over the last several years, there has been a major push to raise the federal minimum wage to $15 an hour, prompting studies and analyses of how doing so would affect the labor force and consumers. According to a study by Purdue University, raising part-time wages to $15 an hour would increase fast food prices by up to 4.3 percent.
Other research suggests that raising the minimum wage might not actually have the desired effect as employers struggle to keep up with the rising costs of doing business. Clemens says that for companies, this all points back to the bottom line.
“One of the things that seems very important to firms is the extent to which they are able to take the higher labor cost associated with the minimum wage and pass that on to consumers by raising prices,” he says. “And that, in turn, depends on how able consumers are to substitute away from the firm.”
Simply put, will people still want to buy a Big Mac if the price increases from $3.99 to $4.16? It all depends on what you’re buying and whether you can get it elsewhere.
That might be a more difficult problem for retail outlets than it is for fast food restaurants. “The way economists tend to think of fast food is that it is a local service,” Clemens says. “Part of the experience is that you’re there at the fast food restaurant. When you’re in that context, consumers are sort of stuck with the firms that are there, and all of them are going to get exposed to whatever minimum wage increase the state or the local area has enacted.”
“That’s very different from retail,” Clemens continues, “where you might think that the kind of expanding role of Amazon and internet retail is making it increasingly likely that consumers can substitute away from going to the local brick-and-mortar retail outlet and towards Amazon, who might be buying wholesale from anywhere in the country or even the world.”
Good managers who value their employees won’t deny that they’d like to pay people more. But there are many factors besides wages that make a workplace valuable. Benefits are crucial, but so is a general quality of life that makes people want to get up every morning and go to work.
As retail and fast food companies around the country struggle with how to stay competitive in a rising market while paying their employees more, they’ll have to continue to find creative ways to keep employees coming back year after year. Call it the cost of doing business.