Do Americans feel their country has developed into a better place because of independence?

Posted by Duke_Tristan@reddit | AskAnAmerican | 26 comments

My friend and I were talking about this because of the King's visit. If the US had remained a colony, it would likely have developed into a Dominion and eventually gained independence anyway, like Australia, New Zealand, and Canada.

Would we likely see big differences in the US today?

Maybe a political system where the elected head of government has less executive power, like the prime ministers of Commonwealth countries? No Civil War, since Britain banned the slave trade in 1807 and abolished slavery across the Empire in 1833? No written constitution, so perhaps no gun culture? Some kind of universal healthcare?

On the other hand, perhaps the US doesn't expand beyond the original east coast colonies, and therefore doesn't become a global superpower?

What do you guys think?