
Some Lessons from a Just War Fought Unjustly

The horrible “unintentional” murder of seven aid workers in Gaza carries many lessons. One is the importance of signaling the moral principles that are expected to be followed. One week after the barbaric attack against Israeli civilians, I wrote:

The basic individualist ethic nurtured by Western civilization rejects group identities and tribal intuitions that justify collective punishments. In times of war, individualist ethics may be difficult to uphold by those waging a just defensive war, but its recognition is essential as a standard to distinguish collectivist barbarians and civilized individualists. Just as it makes no sense to hold the Israelis responsible for the barbaric attack of which they were victims, it is nonsensical to think of ordinary Gazans, under the yoke of Hamas thuggery, as collectively guilty and artisans of their own misfortune.

Two months later, after three Israeli hostages were killed by the army that was supposed to liberate them, I wrote, more explicitly (my apologies for quoting myself again):

A man has got to do what a man has got to do, within certain moral constraints. After October 7, the Israeli government should have proclaimed the principle as loud as the calls for revenge were heard, and should have endeavored to lead by example.

It would not have been too late for the Israeli government to state clearly, and to start repeating, that they were not after a collective punishment irrespective of guilt, and that they intended to respect the laws of war, Western individualist ethics, and simple human decency towards civilians. It was probably still time for the Israeli government to keep the moral support of decent people in the international community and to transmit to its armed forces a strong message of moral restraint. Thousands of civilian lives in Gaza would have been saved. The probability of “unintentionally” killing aid workers would have been reduced. The Israelis would not have wasted so much of their capital of sympathy.
Moral principles are a good strategy. There is also a lesson regarding the circulation of information, if the Financial Times is correct:

The fatal strikes followed a series of mistaken assumptions that could have been prevented had the military properly passed along the details of the humanitarian convoy to the commanders who ultimately ordered the strikes. World Central Kitchen had shared those details with the proper military authorities, but they were lost somewhere in the chain of communication, the investigation found.

Students of bureaucracy know that there is much noise in the internal communications of any large organization. The military is a large bureaucracy, a point on which Gordon Tullock insisted. Anthony Downs, one of the early public choice economists, noted in his 1967 book Inside Bureaucracy:

When information must be passed through many officials, each of whom condenses it somewhat before passing it on to the next, the final output will be very different in quality from the original input; that is, significant distortion will occur. In a bureau hierarchy, information passed upward to the topmost officials tends to be distorted so as to reflect more what they would like to hear, or their preconceived views, than reality warrants.

Soldiers and officers on the ground don’t receive the exact same orders that were issued at the top of the pyramid. Signaling moral principles could have clarified communications about the required moral restraint.

The Israeli military apologized to the World Central Kitchen, to which the killed aid workers belonged, and said it had “dismissed two officers and reprimanded three,” according to the Financial Times. This does not seem sufficient to compensate for the multiple faults committed by the Israeli government during this war. Of course, we are still waiting for the remaining leaders of Hamas to condemn those who have participated in or organized the butchery of October 7.
Their refusal to do so, as well as their use of their population as human shields, should not make us forget that, as the American Secretary of State correctly said, the state of Israel should follow “higher standards.” All this assumes that the Israeli government does want to abide by higher standards. The Economist writes (“What Israel’s Killing of Aid Workers Means for Gaza,” April 3, 2024):

Isaac Herzog, the Israeli president, called Mr Andrés [the founder of the World Central Kitchen] and expressed “deep sorrow”. The army chief pledged a thorough investigation (though Israel has a poor track record of those). Israel’s prime minister was less contrite: in a bizarre videotaped statement, a smiling Mr Netanyahu said that he was recovering well from hernia surgery and then acknowledged the “tragic event” in Gaza. “This happens in war,” he said. … For months Mr Netanyahu has refused to order the Israeli army to distribute aid in Gaza itself.

******************************

The featured image of my post is courtesy of DALL-E, laboring under my instructions. As ze said zirself, “the images depict Ker, the goddess of death in Greek mythology, walking through the remnants of a city devastated by aerial bombing…”

Ker, the Greek goddess of death, at work in Gaza


Milei’s Message to the World

The first few months of Javier Milei’s administration have already had major consequences in Argentina as the country has started to move away from years of failed leftist economic policies. This is of course relevant to Argentines, who were the ones who actually suffered because of their country’s dismal economic situation. But the interest that Milei has awakened throughout the world means that his government could have a major impact in Latin America and perhaps other regions as well. In that context, the opportunities that the Milei administration opens for the rest of the world were the main themes that I addressed recently in a panel on Argentina at Atlas Network’s Liberty Forum in San José, Costa Rica.

The main problem is that Milei sometimes seems ‘many-faced.’ He likes to be compared to people like Donald Trump, Santiago Abascal and other right-wing nationalist leaders, but he also presents himself as an ‘anarcho-capitalist’ who is determined to destroy Argentina’s corporatist, state-based economic system and replace it with free markets. Indeed, Milei’s administration is probably to be found somewhere along a continuum between liberalism and conservatism. He is neither the angel that his supporters claim he is nor the demon that his critics bemoan. His own government seems sui generis: In his first hundred days, there has been radical change in some aspects (such as achieving a balanced budget in just one month or deregulating several markets via decree), moderate change in others (such as gradually reducing energy subsidies), and still no change in others (particularly in terms of taxes and privatization).

But despite the way Milei governs, the message that is coming from Argentina is clearly liberal: In Latin America and other parts of the world, the president has become a symbol that is much more important than his own government. Argentines abroad constantly get asked about him in almost any context.
(This also happened to me on my way to the Liberty Forum venue, by the way, as my Costa Rican Uber driver expressed to me his hope that Milei will succeed.) People outside Argentina clearly associate Milei with a series of liberal principles: that only capitalism can bring about progress, that neither a household nor a government should spend more money than it has, that socialism hurts people.

Milei knows that governing is harder than talking, which is probably why his public appearances have been limited since he took office. But Milei also knows that his government has a meaning beyond his own country, and it is in light of this that we must understand the message that he repeats before any audience that will listen to him. The message to all is the same: that progress and justice can only be achieved through freedom. Time after time, Milei repeats to whomever will listen that government intervention does not just destroy incentives for the creation of wealth, but also that the state itself is a criminal organization. Of course, denouncing a mafia and then being elected to lead it is probably an uncomfortable position to be in, but the message is explicit: The state is the problem. And there are no other leaders in the Western world who say this as clearly today.

The times when Reagan and Thatcher praised individual effort and vigorously called out socialism seem buried in the past. Meanwhile, nationalists and collectivists of all sorts gain ground throughout the world. So despite the fact that Milei sometimes surrounds himself with people who despise liberalism, his message is not that of a ‘right-wing extremist.’ There is no racism nor any call for protectionism in his speeches, nor other illiberal elements in general. And his way of governing also reflects his ideas, as Milei has not turned authoritarian as his critics predicted. If liberals can complain about something, it is that he is too moderate, but never that he is too extreme.
Milei’s message certainly means very little to the millions of Argentines who suffer inflation, poverty, and more generally the hopelessness of being adrift until so recently. But if the president succeeds in bringing his country back from the abyss, it will not be just locals who benefit, but also people throughout the world who will be able to point to a clear beacon of liberty. In any case, Milei’s success or failure will undoubtedly cross Argentina’s borders. His liberal message is already doing so.

Marcos Falcone is the Project Manager of Fundación Libertad and a regular contributor to Forbes Argentina. His writing has also appeared in The Washington Post, National Review, and Reason, among others. He is based in Buenos Aires, Argentina.


“All the new jobs are going to immigrants”

Language can be quite ambiguous. The quotation in the title of this post could be viewed as being either true or false, depending on how one defines certain words. Consider the graph shown in this tweet by Matt Yglesias: It shows an increase of employment of slightly over 2 million (in green), with almost all of the gains going to immigrants. (BTW, using the more accurate payroll data, employment has actually increased by nearly 6 million since the pre-Covid peak of February 2020.)

Here’s another way of cutting up the data. The following are my educated guesstimates of employment changes since the end of 2019:

American boomers and immigrants: Total employment is down significantly
American Gen Xers: Employment is relatively stable
American millennials and Gen Zers: Employment is up very sharply

I believe those are true statements. Do they suggest that “All the new jobs are going to younger native-born Americans”? That statement is just as true as the statement that all the new jobs are going to immigrants. And just as false.

Here’s one way to think of the picture. The boomer generation is very large. Each year, several million boomers decide to retire. Each year, a roughly equal number of young native-born Americans enter the labor force. The net effect is relatively little change in total employment. (Maybe a slight increase.) Thus the overall change in total employment will be roughly equal to the gain in employment among immigrants. My example seemed weird because it lumped together boomers and immigrants. But it’s also weird to lump together boomers and Gen Zers into a group called “native-born”. Their employment levels are moving in radically different directions.

Context is everything. Now I’ll provide a context where the claim made in the post title is basically true, and another context where the exact same claim is basically false. Let’s start with a true context.

Example A: “I’m worried about America’s future.
Because of declining birthrates, there’s a real danger that we’ll enter a period of falling population. This will make it difficult to confront the threat of rising powers like China, and will make the Social Security system even more insolvent. In order to avoid this situation, all the net gain in employment would have to come from immigrants.”

You may disagree with the policy preferences, but at least it’s a coherent argument. The data really do suggest that, if not for immigration, low birthrates would gradually reduce our workforce.

Example B: “I heard that all the new jobs are going to immigrants. A new factory opened in my town, but I didn’t even apply for the job. Why bother if employers are only interested in hiring immigrants, not American-born labor?”

This claim is clearly misleading. Most new jobs in the US are filled by native-born workers. The net change in total employment is similar to the net change in the stock of immigrant workers, but that doesn’t mean that individual Americans are unable to find jobs. The phrase “new jobs” could indicate individual jobs being filled, or the net change in total employment. Language is ambiguous.

This sort of misuse of statistics occurs in all sorts of contexts. Suppose you read that illegal immigrants to the US committed 100 murders in 2023. That makes it sound like they made the country a more dangerous place. But suppose that 120 illegal immigrants were murdered in 2023. In that case, it’s quite possible (but not certain) that the risk of a native-born person being murdered was actually reduced by the illegal immigration. Net flows and gross flows are two very different concepts, and should be handled with care.
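The boomer/immigrant decomposition above can be made concrete with a toy calculation. The numbers below are invented purely for illustration (they are not the actual employment figures), but they show how the same net change supports two opposite-sounding headlines:

```python
# Invented, illustrative numbers (millions of workers): net employment
# change by group over some period.
changes = {
    "immigrants": +2.0,
    "boomers (native-born)": -4.0,
    "Gen Xers (native-born)": 0.0,
    "millennials/Gen Z (native-born)": +4.0,
}

# The total net change equals the immigrant gain...
total_net = sum(changes.values())  # +2.0 million

# ...yet the net change among native-born workers is zero, because
# boomer retirements offset young natives entering the labor force.
native_net = total_net - changes["immigrants"]  # 0.0 million

# Both headlines, "all new jobs went to immigrants" and "all new jobs
# went to young native-born Americans," describe the same data; each
# depends entirely on how the groups are bundled.
print(total_net, native_net)
```

Regrouping the same rows (say, boomers with immigrants) changes the headline without changing a single underlying number, which is the point of the post.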


My Weekly Reading for April 8, 2024

I’m at the Association for Private Enterprise Education (APEE) meetings in Las Vegas, and my weekly reading roundup has been delayed by a day.

Don’t Blame Decriminalization for Oregon Drug Deaths by Jacob Sullum, Reason, May 2024. Excerpt:

In March, Oregon legislators overwhelmingly approved recriminalization of low-level drug possession, reversing a landmark reform that voters endorsed in 2020. Although critics of that ballot initiative, Measure 110, cited escalating drug-related deaths as a reason to reinstate criminal penalties, there is little evidence that decriminalization contributed to that problem.

Deaths involving opioids have been rising nationwide for more than two decades. That trend was accelerated by the emergence of illicit fentanyl as a heroin booster and substitute, a development that hit Western states after it was apparent in other parts of the country. “Overdose mortality rates started climbing in [the] Northeast, South, and Midwest in 2014 as the percent of deaths related to fentanyl increased,” RTI International epidemiologist Alex Kral noted at a January conference in Salem, Oregon. “Overdose mortality rates in Western states did not start rising until 2020, during COVID and a year after the introduction of fentanyl.” That lag explains why Oregon saw a sharper rise in opioid-related deaths than most of the country after 2019. But so did California, Nevada, and Washington, neighboring states where drug possession remained a crime.

What Presidential Rankings Get Wrong by Stephen F. Hayward, The Coolidge Review, March 25, 2024. Excerpt:

Still, I have always thought historians who disliked Coolidge had a secondary purpose in attaching the Silent Cal label to him: they hoped you would ignore what he said—because if you read it, you might be persuaded by it. Take the 1922 speech Vice President Coolidge gave before the American Bar Association.
Coolidge wrote his own speeches, and that address is a brilliant and prescient analysis of what today we call the administrative state and why it can’t give us effective government. When you read it, you realize he is contesting all the premises of Woodrow Wilson and the early administrative state before that term came into use.

Note: An overall evaluation of Coolidge should include his and his Treasury Secretary Andrew Mellon’s reduction of marginal tax rates at all levels. Unfortunately, it should also include his support of tough restrictions on immigration and his allowing federal Prohibition enforcers to poison alcohol.

Why Palantir Cofounder Joe Lonsdale Left California for Texas, interview of Lonsdale by Nick Gillespie, Reason, April 3, 2024. Excerpt:

Gillespie: Can you ballpark what percentage of stimulus payments were either wrong or shouldn’t have been made?

Lonsdale: I don’t have that information myself. Palantir is a nonpartisan company. I don’t even run it anymore. I’m close to a lot of people behind it. I remember at the time, even President [Barack] Obama agreed, for example, there’s lots of fraud in Medicaid. We should probably go after it. He visited us. He was going to do it. His office ended up stopping us from doing it. They didn’t want that. Their office doesn’t want us to, because they don’t want the narrative out there admitting how bad it is, which is frustrating, because we can actually fix most of it.

Gillespie: Is there any way to change that political calculus?

Lonsdale: You need a really strong, really competent president who’s willing to do it. Policy-wise, the Trump administration was willing to, but there’s a certain level of confidence that wasn’t always there on the follow-through. And people have pushed back, and they drop it.

Gillespie: I mean, every president’s like this. I’m not going to touch your Medicare. And that might mean I’m not going to touch your Medicare even if you’re getting it under the wrong circumstances.
Lonsdale: It’s not even for people. I think most of the fraud goes to a lot of very sketchy doctors and health systems. Those places are very powerful special interests. And it just creates a huge headache to go after them. And you need a president who wants to focus on the issue. And listen, there are lots of things to focus on. I’m not telling you this is the most important thing. It does bother me as an American that we waste $100 billion or whatever it is on this nonsense.


Rituals Without Religion (with Michael Norton)

While religion may play less of a role in many people’s lives, rituals–the lifeblood of religion–remain central to the human experience. Listen as Michael Norton of the Harvard Business School explains how and why rituals remain at the center of our lives–they give meaning to life-cycle events and secular holidays, calm our fears, and give […] The post Rituals Without Religion (with Michael Norton) appeared first on Econlib.


An Added Perversity of the $8 Cap on Late Fees

Co-blogger Vance Ginn has nicely laid out some of the perverse, probably unintended, but definitely foreseeable, consequences of the federal government’s proposed $8 cap on the amount that credit card companies are allowed to charge credit card holders when they are late on a payment. I want to point out two other consequences, both of which are perverse but one of which is especially perverse.

First, though, my personal story. Every once in a while, while I’m traveling or particularly busy, I’ve let slip a payment date and paid a credit card balance late. It happened only a couple of times because the credit card company taught me virtue with a $30 to $35 late fee. Ouch! I got very careful.

Now to my point about consequences. If the regulation is implemented, then, as Vance points out, credit card companies will adjust. He names a few adjustments. One that he doesn’t mention is that they will, to the extent that they can, try to figure out ways of charging more to people who are late. It might be by upping their interest rate once they’ve recorded x number of late payments over y number of months. It might be other adjustments that we don’t know of but that some of the credit card companies’ best minds will think hard about. What that approach has going for it is that it targets the higher charges at those who are causing the problem. These other approaches they might take are presumably less efficient than high late charges, or else they would already have been taking them.

But what if the credit card companies can’t figure out how to target the higher charges at those creating the problem? That’s when we get the kinds of adjustments Vance talks about, such as higher interest rates on everyone. This is particularly perverse because it causes people who were not creating the problem to pay more.

One other point, which relates not to credit card fees but to usury laws.
Shortly after I moved to the United States, in the fall of 1972, I applied for a Visa credit card with a credit limit of–are you ready?–$250. That was the lowest amount you could apply for. I was turned down. My guess is that the reasons were twofold: (1) I was not a permanent U.S. resident, so the company might have feared having trouble collecting if I didn’t pay and moved back to Canada; and (2) I had no credit history–no car loans, no loans of any kind. I thought that living in the United States longer would help. So in 1974, I applied for a Mastercard with a limit of $400, the lowest that Mastercard granted. I was turned down.

By 1975, I finally got a credit card. I think I know the reason: the change in usury laws. When credit card companies gave out cards in a state, they were under the usury laws of that state. If I recall correctly, the limit on interest rates in California at the time was 11%. That wouldn’t be a good rate, from a credit card company’s viewpoint, for an unknown risk who could easily leave the United States. But a federal court decision around 1975 established that credit card companies could charge an interest rate consistent with the usury laws of the state in which the credit card company was located. So a number of them located in South Dakota and other states that had no limits on interest rates. I finally got a card with a high interest rate. And I rarely had to pay it because I made it a point, except in extreme circumstances, to pay the full balance down each month.

We often hear about the absurdly high interest rates that credit card companies charge to young people with no credit history. But they’re simply adjusting for risk. I would rather have had a credit card charging 24% interest in 1972 than no card in 1972.


The Miners’ Strike

The Miners’ Strike is one of the most controversial events in modern British history. One version is that Margaret Thatcher sought the conflict, prosecuted it ruthlessly, and destroyed a viable industry to crush the powerful National Union of Mineworkers (NUM). The truth is very different.

The British coal industry peaked just before World War One, when it employed 1 million men in 3,000 pits producing 300 million tonnes of coal annually. But it faced increasing competition from cheaper foreign coal producers and new, cheaper fuels. When the industry was nationalized in 1947, 700,000 men in 958 pits were producing just 200 million tonnes annually. In 1950, the Plan for Coal pumped £520 million into the industry to boost production to 240 million tonnes a year. This target was never met. In 1956, the record post-war year for coal production, 228 million tonnes were produced, too little to meet demand, and 17 million tonnes had to be imported.

The 1960s saw British Rail ditch coal and steam for oil and electricity. Improved technology also squeezed employment: between 1955 and 1969, the share of coal which was power loaded rose from 9.2% to 92.2%. The industry’s decline accelerated. Between 1957 and 1963, 264 pits closed, and between 1963 and 1968, 346,000 miners left the industry. In 1967 alone there were 12,900 forced redundancies. Under Harold Wilson’s Labour government, one pit closed every week.

Nineteen sixty-nine was the last year when coal accounted for more than half of Britain’s energy consumption. By 1970 there were just 300 pits left – a fall of two thirds in 25 years. By 1974 coal accounted for less than one third of British energy consumption. Wilson’s government published a new Plan for Coal promising to increase production from 110 million tonnes to 135 million tonnes annually by 1985. This target was never met.

Elected in 1979, Thatcher’s Conservative government attempted to limit industrial subsidies.
The NUM threatened to strike, and Thatcher gave in; £200 million was pumped into the industry and £50 million went to industries which switched from oil to British coal. Companies which had bought coal abroad were banned from importing it, and 3 million tonnes of coal piled up at Rotterdam, costing the British taxpayer £30 million annually. Seeing a showdown with the NUM as inevitable, Thatcher began stockpiling enough coal and coke around Britain to keep the country supplied for at least six months.

The industry was now losing £1.2 million daily, with annual interest payments of £467 million. The National Coal Board needed a grant of £875 million, and the Monopolies and Mergers Commission found that 75% of British pits were losing money. The reason was obvious. By 1984 it cost £44 to mine a tonne of British coal when the United States, Australia, and South Africa were selling it on the world market for £32 a tonne. Productivity increases were 20% below the level set in the 1974 Plan for Coal. Taxpayers were subsidizing the mining industry to the tune of £1.3 billion annually, not including the cost to taxpayer-funded industries such as steel and electricity which were obliged to buy British coal. But when Arthur Scargill, president of the NUM, was asked by a Parliamentary committee at what level of loss it was acceptable to close a pit, he answered, “As far as I can see, the loss is without limits.”

On 6 March 1984, the government announced the closure of 20 mines and another 70 in the longer term. Peter Walker, energy minister, suggested offering miners at pits slated for closure the choice of a job at another pit or a voluntary redundancy package, with another £800 million ploughed into the industry. He told Thatcher, “I think this meets every emotional issue the miners have. And it’s expensive, but not as expensive as a coal strike.” Thatcher replied, “You know, I agree with you.” Scargill rejected the offer.
While Thatcher viewed a showdown as unwelcome but unavoidable, Scargill – who once said “I do not believe compromise with the capitalist system of society will achieve anything” – actively sought one. After Thatcher’s landslide reelection in 1983, he said he would not “accept that we are landed for the next four years with this government”. Knowing his membership wouldn’t support a strike, he didn’t call the required ballot, instead declaring the NUM’s support for regional strikes. Thus were the miners dragged into a bitter strike which would last a year and end in their total defeat.

It was Scargill, not Thatcher, who sought the strike, though when it came she prosecuted it ruthlessly. But it didn’t destroy a viable industry. The tragic truth of the Miners’ Strike is that by the time it came the British mining industry had been dying for decades.

John Phelan is an economist at the Center of the American Experiment.


Public Choice vs. Homo Politicus

Old habits of thought are difficult to shed. Public choice theory entered economics three quarters of a century ago, but many analysts and journalists have barely noticed. A Financial Times column is a case in point (Gillian Tett, “Snickers Wars Reveal the Enduring Perversity of Human Behaviour,” April 4, 2024):

First, business competition does not always deliver true efficiency; markets can fail. Second, this market failure arises because consumers are not the all-knowing rational agents that they appear in economic models. They have cognitive biases that lead them to make poor choices and leave them ill-equipped to make judgments about inflation.

The public choice intellectual revolution started with a simple analytical assumption: just as the typical individual generally seeks his own interest in private choices, he continues to do the same when he enters the public-choice sausage machine as a politician or government bureaucrat. (His voting behavior can be different because he has no influence on the outcome of elections and referendums, so he can be altruistic or otherwise ethical at no cost.) The self-interest assumption has proven very useful in explaining how governments actually work, as opposed to assuming a nirvana government acting benevolently and with perfect knowledge to correct “market failures.” In reality, government failures are generally worse for most individuals than market failures. In short, politicians and government bureaucrats are just ordinary individuals with ordinary incentives but to whom immense coercive power is granted over other ordinary individuals.

That this discovery waited 300,000 years—including fewer than 3,000 years of intellectual history—to be correctly formalized is not surprising. During nearly all these centuries and over nearly all the surface of the globe, individuals of the species Homo sapiens thought that political authority figures were part of a superior sort of mankind.
Such beliefs probably had evolutionary (survival) benefits. As Bertrand de Jouvenel wrote, individuals obey authority because it “has become a habit of the species.”

Typically and boringly, the columnist’s solution to “market failures” is to give governments—homo politicus and homo bureaucraticus—more power to control individual and private choices. As if it were obvious that a free consumer is less rational than a coercive politician. As if the former’s individual choices were less economically efficient and more dangerous than, say, what Joe Biden, Donald Trump or Katherine Tai would impose on him.

As for the behavior of the voters themselves, public choice theory explains the reasons behind an observation of Joseph Schumpeter in his famous 1950 book Capitalism, Socialism, and Democracy:

[The private citizen] is a member of an unworkable committee, the committee of the whole nation, and this is why he expends less disciplined effort on mastering a political problem than he expends on a game of bridge. … Thus the typical citizen drops down to a lower level of mental performance as soon as he enters the political field. He argues and analyzes in a way which he would readily recognize as infantile within the sphere of his own interests. He becomes a primitive again.

Those who already had an ideological preference for collective choices over private choices, for authority over liberty, for command over contract, were more likely to miss the public choice revolution. But they later quickly embraced “behavioral economics,” which ignores the individual’s cognitive biases when he enters the political realm. Such is the main cognitive bias of behavioral economists or, at least, of their admirers.
******************************

DALL-E described as follows the image I, somewhat tendentiously, instructed him to generate: “Here are the images depicting the stark contrast between the hardship under market failures and the transition to a political land of plenty and happiness, under the guidance of a loving political leader.”

Glorious politician rescuing his people from market failures


Which price indices are most useful?

I don’t believe that it makes sense to speak of the true rate of inflation. After all, no one seems to know what inflation is supposed to be measuring. Some economists might argue that it represents the increase in pay you’d need so that you are not worse off in terms of utility. But what does that mean?

Suppose I met a Gen Zer who said he’d rather earn $100,000 today than $100,000 in 1955 (when I was born). After all, today he can have better medical care, better Asian cuisine, better TVs, the internet, smart phones, etc., etc. Would that imply that there had been no rise in the “cost of living” since 1955, at least for that individual? In my view, that would be a silly way to view inflation. But given textbook definitions, how can I say he’d be wrong?

If I’m correct that inflation is somewhat subjective, I’d still insist that some price indices are more useful than others. Josh Hendrickson directed me to a paper by Marijn A. Bolhuis, Judd N. L. Cramer, Karl Oskar Schulz, and Lawrence H. Summers (BCSS), which estimates recent inflation using the techniques that were used prior to 1983. They put more weight on things like financing costs, which have risen sharply during a period of rising interest rates. In their revised estimates, 12-month CPI inflation peaked at 18% in November 2022, and remained at 9% even in November 2023. (The official figures show CPI inflation peaking at only 9.1%.) Unless I’m mistaken, the revised data imply a 28.6% total increase in the CPI between November 2021 and November 2023. Let’s compare that to some other data points:

Revised CPI: +28.6% between 11/21 and 11/23
Nominal GDP: +13.4% between 2021:Q4 and 2023:Q4
Nominal consumption: +12.9% between 11/21 and 11/23
Nominal average hourly earnings: +9.6% between 11/21 and 11/23

Taken at face value, a 28.6% rise in the price level at a time of much slower nominal growth implies that the US fell into one of the deepest depressions in US history.
In fairness, it’s not quite right to compare the CPI with nominal GDP, as the CPI only measures the prices of consumer goods; for that you’d need the GDP deflator. But notice that nominal consumption rose even more slowly than nominal GDP (although both are actually growing rapidly by 21st-century standards). So if the revised CPI figures are true, then it seems as though real consumption must have plunged at an astounding rate, comparable to a major economic depression such as the 1930s.

I suppose one could argue that the same techniques that BCSS used to adjust the CPI might also affect nominal aggregates such as consumption and NGDP. Even so, it’s hard to believe that any plausible adjustment to aggregate consumption growth could come close to closing the gap with the revised CPI inflation estimate. In addition, any problems with nominal consumption would not bias the estimate of nominal average hourly earnings, which rose by only 9.6% in total. It is technically possible that nominal wages rose by 9.6% at a time when the cost of living rose by 28.6%, but what would that imply about the rest of the economy? Wouldn’t it imply a major economic crisis in which workers were unable to afford anything beyond the most basic necessities? And yet, everywhere I look I see evidence of a booming economy.

To take one example, car sales tend to fall sharply during “hard times.” And yet car sales have increased sharply during this period of rising interest rates. And car sales are generally far more cyclical than other types of consumption, like health care, education, and haircuts. Why have they risen sharply since November 2021?

All our economic data points suggest strong output growth. The job market is extremely strong, with low unemployment and very robust growth in total employment. If you adjust for demographics (the aging population), the employment-population ratio is back near the peak levels of 1999-2000.
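The depression-sized gap can be made concrete by deflating each nominal growth figure by the revised 28.6% price-level increase (a back-of-the-envelope sketch; as noted above, the CPI is not the strictly correct deflator for NGDP):

```python
# Deflate each nominal growth rate by the revised 28.6% CPI increase
# to see the implied real change, Nov 2021 -> Nov 2023.
revised_cpi = 0.286

nominal_growth = {
    "NGDP": 0.134,
    "consumption": 0.129,
    "avg hourly earnings": 0.096,
}

for name, g in nominal_growth.items():
    # real change = (1 + nominal growth) / (1 + inflation) - 1
    real_change = (1 + g) / (1 + revised_cpi) - 1
    print(f"Implied real {name}: {real_change:+.1%}")
```

The implied declines of roughly 12% in real consumption and 15% in real hourly earnings over two years are what the post means by "one of the deepest depressions in US history."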
If you ask people why we need inflation estimates, they’ll typically say something to the effect that inflation adjustments allow us to figure out how the economy is actually performing, without the distortions created by the declining purchasing power of money. In other words, we use inflation to convert nominal variables into real variables. But when I try to apply the BCSS inflation estimates to any plausible nominal variable in the US economy, I come up with real variables that literally make no sense.

To be clear, this is not a criticism of the BCSS paper, which focuses on one very narrow question: why is consumer sentiment so poor, despite a strong labor market? The authors may be correct in claiming that rising financing costs largely explain the public’s surprisingly sour mood. Rather, my argument here is that these inflation estimates are not useful in the conventional sense. If we try to use them to convert nominal variables into real variables, we end up with nonsense. What am I missing?


Biden’s War on Credit

Through the Consumer Financial Protection Bureau (CFPB), the Biden administration has proposed a regulation capping how much credit card companies can charge us when we’re late on a payment at just $8. This sounds great on the surface, right? Lower fees mean less stress when we’re struggling to make ends meet, with inflation-adjusted average weekly earnings down 4.2 percent. But, as with many things that seem too good to be true, there’s a catch. This well-meaning price control could hit hardest the very people it’s supposed to help.

First, why do credit card companies charge late fees? It’s not just about making an extra buck. These fees support more credit availability for everyone and encourage us to pay on time, which helps the credit system run smoothly.

Now, the CFPB is shaking things up by setting a price ceiling on these fees at $8. While it could save us some money if we slip up and pay late, credit card companies will find ways to compensate for the lost income. And how will they do that? They might start charging more for other things, tightening who they give credit to, or raising interest rates. In the end, credit could be more expensive and harder to get for all of us.

It’s not just individuals who could feel the squeeze, but small businesses, too. Many small businesses rely on credit to manage their cash flow and growth. If banks start being pickier about whom they lend to or raise their fees, these small businesses will find it more costly to get credit. This isn’t just bad news for them; it’s bad news for everyone, as the result will be higher prices for consumers, lower wages, and fewer jobs for workers.

Remember that small banks and credit unions are a big deal for the local economy. These institutions often depend on fees to keep things running. If they can charge less for late payments, they might not be able to lend as much.
This could hit communities hard, making it tougher for people to get loans for starting a small business, buying a home, or financing a project.

Economists have long warned about the dangers of well-intentioned but poorly thought-out regulations. By setting a one-size-fits-all rule for late fees, the government would make credit more expensive and less accessible for everyone. The idea is to protect us from unfair fees, but the real-world result would be the opposite if access to credit were limited for those who need it most.

History shows that often the biggest challenge is protecting consumers from the consequences of government actions. In trying to shield us from high late fees, the government will set us up for a situation where credit is harder to come by and more expensive. This doesn’t mean we shouldn’t try to protect consumers. But we need to think carefully about the consequences of our actions and let markets work, which is the best way to protect consumers, as they have sovereignty over their purchases.

While capping credit card late fees sounds like a simple fix, the ripple effects would be complex and wide-reaching. It’s crucial to keep credit accessible and affordable, support small businesses, and ensure the financial system remains robust. Let’s look at the implications of this price control regulation before rushing into it. Price controls never work as intended, as history has proven. Instead, we should let people in the marketplace determine what’s best for them rather than relying on the Biden administration’s top-down, one-size-fits-none approach.

Vance Ginn, Ph.D., is the president of Ginn Economic Consulting, host of the Let People Prosper Show, and was previously the associate director for economic policy at the White House’s Office of Management and Budget, 2019-20. Follow him on X.com at @VanceGinn.
