The Women’s Health Initiative

The Women's Health Initiative (WHI) is a long-term national health study focused on strategies for preventing heart disease, breast and colorectal cancer, and fractures in postmenopausal women.