Are Indigenous Americans now fully integrated into American society?
Like they have a right to education, a right to employment, a right to vote, and a right to own a house and land in America?
Anonymous
A "right" to own a house and land in America? A right? Where does that right come from?
Money??
Anonymous
The same reason as blacks cause of their dnc issued victim card.
NPG Starlett
Yes, indeed. In fact, some Native American communities hold legal ownership of land in America (such as reservation lands in Nevada), and many of them run casinos, demonstrating sharp business sense.