Wounds to the skin and underlying tissue can heal on their own over time, but scars often remain after those wounds heal. Once you have a scar, it will not go away completely. However, proper care can help wounds heal with less scarring, and once a wound has closed, you can take steps to improve how the scar looks or feels.

Talk to your doctor about how to care for your skin to reduce the appearance of scarring, and about what to expect from any scar care he or she may recommend. See your doctor if you:

- Want to improve your self-confidence because of scars that are very visible on the face, hands, or arms.
- Have pain, itching, swelling, scabs, or other discomfort as the skin heals.
- Have pain or discomfort in underlying tissues, tendons, or nerves because of how the scar heals.
- Tend to scar easily.