Western Medicine

Western Medicine – Western, or allopathic, medicine is recognized in the United States as mainstream or conventional medicine, as opposed to alternative and complementary medicine. It is the approach taught by the vast majority of American medical schools and residency programs.