What are you even criticizing? The idea that democracy can change how the country works? We've seen that happen several times in my lifetime.
Obviously "America Bad" is always going to be true to some extent, since it's arguably the most powerful and violent empire in history, but that doesn't give you a blank check to hate it incoherently.