Question: Is India still dominated by western culture? (asked by Bandana in General, 2019-02-18)
Answers
It is true that India is still 'imperialized' by western culture, and Indians themselves are responsible for it, whether by exporting themselves to the West or by importing the West here.
Notwithstanding the fact that India is so rich and diverse, with a plethora of festivals and religions, it still succumbs to the influence of first-world countries.
The constant hype around the advancement, technology, schools, and living standards of Western countries, combined with a lack of self-belief, all contribute to this.
Western culture cannot be belittled or negated completely; there are a few things worth imbibing from it, such as self-esteem, gender equality, and the confidence not to be afraid of anyone.
It would not be wrong to say that Indian people are suffering from ‘an identity crisis’.