Most Americans think gun violence has increased, even though it has actually decreased significantly since the 1980s, The Economist writes. The magazine attributes this to a rise in highly publicized mass shootings, which may give people a mistaken impression of overall gun-violence trends. But it may also reflect a widespread sense that things are getting worse and that America is in decline. The right tends to complain about an alleged erosion of American values and the dismantling of the free market, while the left sees a growing income gap as a sign that the U.S. is becoming a worse place to live. Recognizing that some things are actually getting better might do America's political discourse some good.