Health Tip: If You Have a Wart
Signs that you should visit a doctor
(HealthDay News) -- Warts are skin growths that are usually harmless and can be treated without a doctor's care.
But sometimes you should see a doctor about a wart. The American Academy of Dermatology lists these warning signs that an office visit is needed:
- Suspecting that the growth may not actually be a wart.
- Having a wart on the genitals or face.
- Developing a lot of warts, or warts that bleed, burn, itch or cause pain.
- Having a weakened immune system or having diabetes. People with diabetes should never try to remove warts on their feet.