MONDAY, Dec. 26, 2011 (HealthDay News) -- Cold weather brings drier air that can damage skin, but there are steps people can take to look and feel better, according to Dr. Amy McMichael, a dermatologist at Wake Forest Baptist Medical Center in Winston-Salem, N.C.
McMichael offered the following tips for protecting skin during the winter months:
More information
The U.S. National Institutes of Health has more information on dry skin.