Finding 6: The Public Polls Got It Wrong

What the polls were saying

“Poll: Majority in Calif. Back Gay Marriage”
May 28, 2008, Associated Press

“Poll Shows Majority in State Opposed to Prop 8”
July 17, 2008, KTVU

“Opposition to Prop. 8 up to 55%, poll shows”
September 28, 2008, San Diego Union

“Prop 8 trails, yet gap keeps narrowing”
October 31, 2008, San Diego Union

The two major public polls in California created the misimpression that No on 8 was favored to win.

The misleading public poll results were a significant bad break for No on 8. For many in the LGBT community and its allies, the public polls masked the urgency of the situation.

· The Public Policy Institute of California (PPIC) and the Field Poll—both reputable institutions—consistently showed No on 8 ahead by comfortable margins from May through Election Day.

· They published their numbers without caveat, even though PPIC's own polling from 2003 through 2008 had found California voters almost evenly divided on same-sex marriage, with neither side holding majority support for its position.

· All along, the Lake Research polling showed No on 8 behind, the election close and difficult, and the situation urgent.

· No on 8's executive committee and the second campaign manager made the smart choice to overrule the consultants and go public with the less optimistic internal poll results.

· The overly optimistic public polling very likely made it harder for No on 8 to raise money. When No on 8 went public with the more accurate internal polling numbers, online fundraising immediately surged; within two weeks, it grew from a minor revenue stream to, at times, more than $1 million per day.


Public Polling Firms and Prop 8

Why Worry, Be Happy

Chart 28: PPIC and Field's Standard Horse Race Questions

In September, when No on 8's financial situation was most dire, both PPIC and Field showed No on 8 with a double-digit lead. Although the impact of the polls' publication cannot be quantified, the lopsided win they seemed to predict may have left some No on 8 supporters feeling no urgency to donate when in fact the money was desperately needed. At the time, members of the executive committee who were fundraising reported resistance from potential donors who cited this very reason for declining to give. For a detailed discussion of how to get the truth out of a poll, see Appendix J.

No on 8’s Consultants Misjudged the Situation

From May through September, No on 8's consultants consistently advised that it would harm the campaign to share publicly its much more pessimistic—and, in the final analysis, more accurate—internal polling results from Lake Research. Some of the consultants reasoned that donors would be more likely to support a campaign that looked like a winner; they felt that publicly sharing the Lake numbers would depress fundraising.

Their viewpoint may have been informed by their experience working for incumbent candidates seeking reelection, or by operating in a milieu where the norm is for consultants to routinely withhold all "negative" information about their campaigns. For incumbent candidates, an air of inevitability attracts money even absent urgency; donors to leading candidates hope to buy future access.

Donors to ballot measure campaigns, however, have no such motive to fund the favored side; the dynamics are very different. LGBT voters and our allies take very personally the exploitation and arousal of anti-gay prejudice that is part and parcel of anti-LGBT campaigns. Awareness of the precariousness of the situation is therefore a powerful motivator for LGBT and allied donors to give, because on a gut level many know the price we pay if we lose.

For the No on 8 campaign, the reflexive withholding of "negative" information was a mistake. To communicate urgency credibly, clearly, and powerfully, No on 8 had to provide evidence; only then would more people dig deep. The campaign unquestionably asked for help early on and continuously. But for months, it did not share its internal polling data, even though that data was by far the strongest evidence that urgency was warranted, and it was unknown to all but a handful of potential donors. Until the campaign shared the information, it was not making its best case for people to help.

No on 8’s Smart Choice to Overrule the Consultants

The best evidence for these conclusions came in early October, when No on 8's executive committee and the second campaign manager overruled the consultants and went public with the campaign's underdog status. Telling the community the true situation immediately boosted online fundraising; within two weeks, it went from a relatively small part of the campaign's income to as much as $1 million per day.

Unfortunately, until then, the decision not to make the full case for urgency may have contributed significantly to the financial stress under which No on 8 operated at the crucial time when both sides were placing their TV buys for late September and early October. Yes on 8 used its financial advantage to get its message out first. Although No on 8 technically went on the air first, the buy for its first ad was too small to ensure that the ad would penetrate, particularly in the L.A. media market. By contrast, Yes on 8 blanketed the state with its first two ads, "Newsom" and "Princes," at the same time that No on 8 had to shave its buy simply to get on the air at all.

No on 8's fundraising would probably have improved to some degree in the closing weeks even without disclosure of its poll findings. Fundraising often improves in the late stages of campaigns, and as the later Field and PPIC polls showed the election tightening, some potential donors would have recognized the urgency of the situation on their own. But it is doubtful that the improvement would have come close to matching the actual surge without the frank disclosure of the internal polling, in part because any surge would have begun later than it did. Additional evidence that the actual No on 8 surge exceeded the norm is that the campaign's own consultants were astonished by the online fundraising improvement; it greatly surpassed anyone's preexisting expectations and exceeded the goals set in the fundraising budget by more than 1,000%.

For all details on the ad buys of both campaigns, see Appendix E. For more information on the PPIC, Field, Lake, and other polling, and the contrasts among them, see Appendix D, the charts chronology. For more details on the factors contributing to No on 8 fundraising success, see Finding 9.