The British explored and colonized America for several reasons. Some individuals or groups went to America for economic opportunity. Others went for freedom of religion: they wanted to practice their faith without being persecuted. But England's main reason for colonizing America was mercantilism. Mercantilism is the economic policy of exporting more than a country imports, so that it sells more to other countries than it buys from them and wealth flows in. This policy made England a wealthy country. England established colonies in America and the Caribbean and allowed them to trade only with England, no one else. Because of this, as the colonies grew, they began to realize that England was monopolizing their trade. This grievance, along with others, later led to the Declaration of Independence, but that goes beyond your question. I hope that cleared things up for you.