I'm reading a very interesting and thought-provoking book at the moment - Roots by Alex Haley. Whilst reading a section about how the black captives were lined up in cages while the white purchasers laughed at them and made monkey sounds, I began to think about what a civilised culture really is.
In the West we are dominated entirely by money. We work to earn money so we can buy food and pay for a roof over our heads. We distance ourselves from other human beings - there is little community spirit left, apart from in a few odd situations, and the family unit is ever shrinking. We destroy everything we touch, and kill anyone who doesn't agree with our perspective on life. We have a long history of trying to force our views, opinions and culture on others whom we deem less intelligent.
Compare this to a society where people live at one with nature, grow their own food and keep their own animals. Where the community is central to life, and the family is an important part of everyone's life. Where violence is not tolerated, and crimes are punished swiftly and effectively. Where everyone works hard for the benefit of all, not just for their own selfish needs.
Where did we go so wrong? Or, in your opinion, did we even go wrong?