Yes, Hot Flashes Can Last 15 Years For Women
DEAR DR. ROACH: I started going through natural menopause in my late 50s. I will be 65 in September. Should I still be having hot flashes? — B.G.
ANSWER: Unfortunately, some women (perhaps 25 percent) will continue to have hot flashes 15 years after the start of menopause. There are many treatments for hot flashes, and although estrogens are the most effective, the risk of starting hormone replacement at age 65 is considerably higher than at age 50, so most clinicians are uncomfortable prescribing them.
Before thinking about medication, however, many women (and a few men, such as those on hormonal treatment for prostate cancer) have found non-medicinal ways to deal with them. Layers of clothing are key: take extra layers off when hot, but recognize that enough body heat can be lost that a person actually can shiver once the flash is over. Keeping the room cooler will help as well. Anyone living with you may just have to deal with that. Handheld fans are great for some.
Prescription medications include medicines normally used for depression (like venlafaxine, citalopram or paroxetine) and for seizures (gabapentin often is prescribed, especially if symptoms are worse at night). Various supplements are used, but estrogen-containing ones, even plant estrogens, may carry significant though poorly quantified risks. A new class of medicines, the neurokinin-3 (NK3) receptor antagonists, still is being studied and may be very beneficial, but it seems like I've been saying that for a while now. Food and Drug Administration approval sometimes can take years.
DEAR DR. ROACH: I am wondering why the medical profession seems to be ignoring the potential of silver in combating the rise of drug-resistant bacteria. It seems that silver was used to combat bacteria for thousands of years, up until the advent of antibiotics, which supplanted its use.
I have heard that silver vessels were used to prevent spoilage of food and wine by the Greeks and Romans, that silver was placed in water barrels by pioneers crossing the prairies, and that silver nitrate was placed in my eyes at birth just in case my mother had gonorrhea (I’m 74).
My understanding is that silver impedes the metabolic process in bacteria, and as such prevents reproduction even if it doesn’t kill them, but that it is much less toxic to “higher” organisms such as humans. Furthermore, I read in 2013 in a prestigious journal that low levels of silver render antibiotics much more effective. Why aren’t we using silver? — B.C.
ANSWER: Topical silver, in ionized form such as silver nitrate or silver sulfadiazine, is indeed an effective antibacterial (it has activity against some fungi as well). Metallic silver, however, such as coins or cups, is not the most effective form.
I read the 2013 paper you mentioned, and it is clear that silver can make antibiotics more effective. However, the use of systemic silver in humans has shown problems. A silver-coated heart valve was effective at preventing infection, but was so toxic to heart tissue that it had to be taken off the market. Excess silver ingestion causes argyria, a permanent blue-gray discoloration of the skin. Silver is a heavy metal, and the body has no way of removing it, so it can accumulate. Because of these known risks, a high level of certainty is required before using silver systemically. Topical silver remains a potent tool used clinically every day.
I should note that commercially available “colloidal silver” preparations are of uncertain purity, and the FDA has ruled that colloidal silver is neither safe nor effective for any condition. I recommend strongly against using it, while I support continued research into legitimate medical uses of silver.
* * *
Dr. Roach regrets that he is unable to answer individual letters, but will incorporate them in the column whenever possible. Readers may email questions to ToYourGoodHealth@med.cornell.edu or send mail to 628 Virginia Dr., Orlando, FL 32803.