Do you think this is the end of America playing a significant role on the world stage?
America keeps sinking deeper and deeper into this mess while other countries get back on their feet. Most countries don't want American travelers right now. Some are already considering other sources of trade. I think we're seeing the end of the American empire.
The stock market? What a terrible indicator of how people are actually doing. Roughly 85% of all stocks are owned by the wealthiest 10%. Tens of millions are jobless, and the fall is going to be worse. A vaccine won't happen until next year. Deaths are increasing and testing is decreasing (unless you live in an opposite reality?). This is the kind of thinking that is running America into the ground: this false sense of security, this delusion.