How did western medicine get its name?

The word "western" refers to the peoples and cultures of the Americas and Europe. In the 19th century, the term "Western medicine" was coined to describe the practice of treating the sick with a mixture of herbs and remedies […]


What is a dictionary of western medicine?

The western medicine dictionary is a collection of definitions for some of the most common terms used in western medicine. The dictionary was created by the Oxford English Dictionary and is the first of its kind in the world, said Dr. Andrew Stokes, the dictionary's director. The dictionary is a place to put words into context and […]
