When Western Medicine Bites at the Roots

The Western Medicine profession is no longer just the profession of medical doctors; it is also becoming a black hole for black doctors and a culture of whitewashing. The profession has become an integral part of Western civilization, but this transformation has been slow, with many …