Interest rates in the U.S. are set by the central bank, the Federal Reserve (or “the Fed”). The Fed raises or lowers rates depending on the health of the U.S. economy. Since the Great Recession, we have been in a historically low interest rate environment: the Fed cut rates in 2008 to “stimulate” the economy, that is, to encourage borrowing and help move us forward from the financial crisis. In 2007, a 3-month U.S. Treasury bill was yielding about 3.4%, and after the bottom of the crisis it took until the summer of 2017 for that yield to climb back above 1%.
For historical reference, from 1900 to 2012 the average rate on the 3-month Treasury bill was about 3.8%. The highest rate we’ve seen was 15.5% in 1980, and the lowest has been 0.0%, reached during the 1930s and 1940s and again for roughly the past seven years. You’ll notice that the two periods of extremely low interest rates occurred during the two worst economic stretches in U.S. history. This goes back to the idea of the Fed lowering rates in the expectation of stimulating the economy.
Savers benefit from higher interest rates because they can demand a higher return on their capital. A person in the 1980s would have had no problem finding a bank CD yielding above 10%. Nowadays you’re lucky if you find anything close to 1%! Borrowers, on the other hand, benefit during times of lower interest rates. For example, the average 30-year mortgage rate is currently right around 4%, whereas back in the 1980s it hovered around 15%.
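To put those comparisons in dollar terms, here is a minimal back-of-the-envelope sketch in Python. The $10,000 CD balance and $200,000 loan amount are hypothetical figures chosen purely for illustration; the rates are the ones quoted above.

```python
# Back-of-the-envelope sketch: what higher or lower rates mean in dollars.
# The CD balance and loan amount below are hypothetical illustrations.

def monthly_mortgage_payment(principal, annual_rate, years=30):
    """Standard fixed-rate amortization formula for a monthly payment."""
    r = annual_rate / 12   # monthly interest rate
    n = years * 12         # total number of monthly payments
    return principal * r * (1 + r) ** n / ((1 + r) ** n - 1)

# The saver's side: annual interest on a CD at a 1980s-style rate vs. today's.
cd_balance = 10_000
for cd_rate in (0.10, 0.01):
    print(f"CD at {cd_rate:.0%}: about ${cd_balance * cd_rate:,.0f} in interest per year")

# The borrower's side: a 30-year mortgage at today's rate vs. a 1980s rate.
loan = 200_000
for mortgage_rate in (0.04, 0.15):
    payment = monthly_mortgage_payment(loan, mortgage_rate)
    print(f"Mortgage at {mortgage_rate:.0%}: about ${payment:,.0f} per month")

# Rough output:
#   CD at 10%: about $1,000 in interest per year
#   CD at 1%: about $100 in interest per year
#   Mortgage at 4%: about $955 per month
#   Mortgage at 15%: about $2,529 per month
```

In other words, the same loan costs roughly two and a half times as much each month at 15% as it does at 4%, which is why borrowers feel low rates so directly.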
Where do we go from here?
Since late 2015, the Fed has raised rates by a quarter of a percentage point four times, for a total increase of one point. It has taken its time because the economic recovery since the Great Recession hasn’t been as robust as most had hoped. More recently, we are seeing slightly stronger growth in the U.S. economy, so many analysts are expecting another 0.25% rate hike before 2018. Time will tell.