Do Vaccines Make Us Healthier?

Three groundbreaking studies examined what federal health officials admit they’ve never investigated — whether vaccines improve overall health. The results were “jaw-dropping.”

Do vaccines make us healthier?

Everyone, no matter where they stand on the vaccine debate, should want to know the answer to that question.

Yet our national health agencies have never tried to find out.

In the video below, you’ll learn about three groundbreaking studies that set out to answer this most basic question.

The studies (here, here and here) were conducted independently, using different methods — but all three compared the overall health of vaccinated people versus the overall health of the tiny fraction of people in the U.S. who have never been vaccinated.

The results, revealed in the video below, are “striking.”

Watch here:
