Informed consent is the permission that doctors and researchers must obtain from patients and research subjects before beginning a clinical trial or any invasive medical treatment. "But how many patients truly understand the alternatives or the risks and benefits of the test or treatment they are undergoing? Are patients really being informed?" asks Scientific American's Deborah Franklin.