When people discuss decolonization, they may mean many different things. Generally, decolonization is described as the process by which colonized areas and nations are returned to their pre-colonial state. Colonizers have historically taken the lands of others, destroyed foreign cultures, and imposed their own cultures on native peoples. Decolonization strives to undo as much of this damage as possible, both politically and culturally. Women have been disproportionately affected by colonization, as colonizing powers were largely patriarchal and believed in limited rights for women. Some scholars of decolonization focus on movements within colonies that pushed for independence, such as creole nationalism.

Successful decolonization can result in the establishment of independent states, but it need not be that large-scale. Colonization often persists in how colonized people act and conduct their lives: people have been forced to abandon their cultures and adopt another, and histories are being forgotten. Decolonization can thus also refer to the simple act of learning about one's own culture rather than following the dominant cultural norms around oneself.