Nursing Is One of the Most Dangerous Jobs in the United States
Nursing is probably not the first profession that comes to mind when you think of dangerous jobs. Unfortunately, it has become one of the riskiest occupations in the country. According to a study cited by The Washington Post, nurses, and…