1. Kunsoo
    Joined
    03 Feb '07
    Moves
    193785
    11 Nov '12 06:08
    Gallup was the worst. ARG and Rasmussen were also well off the mark.

    Of the 20 or so pollsters rated, only 4 had polls in which Obama underperformed his actual result, and even that underperformance could have been caused by the loss of votes in NY and NJ due to the hurricane.

    http://fivethirtyeight.blogs.nytimes.com/2012/11/10/which-polls-fared-best-and-worst-in-the-2012-presidential-race/#more-37396
2. Standard member sh76
    Civis Americanus Sum
    New York
    Joined
    26 Dec '07
    Moves
    17585
    11 Nov '12 16:35
    As much as I love Nate, this line has me scratching my head:

    Our method of evaluating pollsters has typically involved looking at all the polls that a firm conducted over the final three weeks of the campaign, rather than its very last poll alone. The reason for this is that some polling firms may engage in “herding” toward the end of the campaign, changing their methods and assumptions such that their results are more in line with those of other polling firms.


    The election is not a snapshot of the country during the last 3 weeks. The election is a snapshot of the country on election day. It is plain from all the data that there was a shift towards Obama in the final days. To say that a pollster was inaccurate because it predicted Obama underperforming with 2 weeks to go seems silly.
3. Barts
    Joined
    06 Aug '06
    Moves
    1945
    11 Nov '12 21:49
    Originally posted by sh76
    As much as I love Nate, this line has me scratching my head:

    [quote]Our method of evaluating pollsters has typically involved looking at all the polls that a firm conducted over the final three weeks of the campaign, rather than its very last poll alone. The reason for this is that some polling firms may engage in “herding” toward the end of the campaign, ch ...[text shortened]... llster was inaccurate because it predicted Obama underperforming with 2 weeks to go seems silly.
    This might be valid, if the comparison is to "consensus polls" of that period.

    For example,

    3 weeks before the election, the average poll gives candidate A a 10-point advantage, while polling firm X gives him a 15-point advantage.

    On election day, the average poll gives candidate A a 5-point advantage, which is also the actual margin in the election. Polling firm X also gives him a 5-point advantage.

    In this case, you could argue that the average of the polls was accurately tracking the voters' intentions. If that is the case, then the poll from X 3 weeks before the election was off by 5 points, and that should be reflected in the pollster's 'report card'. It gets a bit more complicated if polls as a whole are shown to have had a bias in an election, and of course there is the chance that this bias changed over the election cycle. Maybe the difference 3 weeks before the election really was 15 points, and there was a bias in the average of polls which dissipated before election day, so firm X was right all along.
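    To make the bookkeeping in that example concrete, here is a minimal Python sketch. The numbers are the hypothetical ones above, and the score_pollster helper is purely illustrative - an assumption for this post, not Nate's actual method.

    # Illustrative scoring of a pollster against the contemporaneous
    # consensus (average of all polls) as well as the final result.
    # Margins are for candidate A, keyed by days before the election.
    consensus = {21: 10, 0: 5}   # poll average: A +10, then A +5
    firm_x = {21: 15, 0: 5}      # firm X: A +15, then A +5
    actual_margin = 5            # election-day result: A +5

    def score_pollster(polls, consensus, actual):
        """Print each poll's error vs. the consensus at the time it
        was taken, and vs. the actual election result."""
        for days_out, margin in sorted(polls.items(), reverse=True):
            vs_consensus = margin - consensus[days_out]
            vs_actual = margin - actual
            print(f"{days_out:2d} days out: A+{margin}, "
                  f"error vs consensus {vs_consensus:+d}, "
                  f"error vs result {vs_actual:+d}")

    score_pollster(firm_x, consensus, actual_margin)
    # 21 days out: A+15, error vs consensus +5, error vs result +10
    #  0 days out: A+5, error vs consensus +0, error vs result +0

    Whether the "error vs consensus" column is the right yardstick depends on whether the consensus itself was unbiased 3 weeks out, which is exactly the caveat in the last paragraph.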
4. Standard member sh76
    Civis Americanus Sum
    New York
    Joined
    26 Dec '07
    Moves
    17585
    11 Nov '12 22:14
    Originally posted by Barts
    This might be valid, if the comparison is to "consensus polls" of that period.

    For example,

    3 weeks before the election, the average poll gives candidate A a 10 point advantage, polling firm X gives him a 15 point advantage.

    On election day, the average poll gives candidate A a 5 point advantage, which is also the actual difference in the election. P ...[text shortened]... in the average of polls which dissipated before election day, so firm X was right all along.
    Okay, but I don't see in the article that this is what he was doing.
5. Standard member no1marauder
    Naturally Right
    Somewhere Else
    Joined
    22 Jun '04
    Moves
    42677
    11 Nov '12 22:48
    Originally posted by sh76
    As much as I love Nate, this line has me scratching my head:

    [quote]Our method of evaluating pollsters has typically involved looking at all the polls that a firm conducted over the final three weeks of the campaign, rather than its very last poll alone. The reason for this is that some polling firms may engage in “herding” toward the end of the campaign, ch ...[text shortened]... llster was inaccurate because it predicted Obama underperforming with 2 weeks to go seems silly.
    What he's saying is that some firms "cook" their results close to Election Day if it becomes apparent they are outliers at that point. The most obvious example this year is Gallup, which consistently showed Romney up by 5-7 points in the weeks before the election, a result no other poll came close to. Then, on the day before the election, Gallup produced a poll showing Romney up by 1 point, which was at least close to the other polling. To ignore the weeks of inaccurate polling would be to give Gallup too much credit; polls before Election Day are supposed to be a "snapshot" too, and where one is consistently an outlier, that should be reflected in the evaluation of the pollsters' accuracy.
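    As a toy illustration of why scoring the whole final stretch penalizes this pattern while a last-poll-only score would not, here is a short Python sketch. The Gallup-like series and the assumed final margin of roughly Obama +4 are rough illustrative numbers, not exact data.

    # Toy comparison: scoring a "herding" pollster on its last poll only
    # vs. on all of its polls from the final weeks. Margins are
    # Romney-minus-Obama, so an assumed result of Obama +4 is -4.
    herding_firm = [6, 7, 5, 6, 1]   # weeks of R+5..7, then R+1 at the end
    actual = -4                      # assumed final margin, Obama +4

    last_poll_error = abs(herding_firm[-1] - actual)
    avg_error = sum(abs(m - actual) for m in herding_firm) / len(herding_firm)

    print(f"last-poll-only error: {last_poll_error} points")    # 5
    print(f"final-weeks average error: {avg_error:.1f} points") # 9.0

    Scoring only the last poll rewards the late move toward the pack; averaging the window keeps the outlier weeks on the record.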
6. Standard member Soothfast
    0,1,1,2,3,5,8,13,21,
    Planet Rain
    Joined
    04 Mar '04
    Moves
    2701
    13 Nov '12 08:07
    I'd love to know what "secret recipe" Gallup was using to come up with its ridiculous poll results in October. Maybe they were getting money under the table from Richie Mitt & Friends.

    What's interesting to me is that of the top eight most accurate polling firms on Nate's list, only two were ever accepted by Real Clear Politics as being worthy of inclusion in their poll averages.

    Maybe in 2014 RCP will drop its objection to Internet polls.
7. Kunsoo
    Joined
    03 Feb '07
    Moves
    193785
    14 Nov '12 23:08
    Originally posted by sh76
    As much as I love Nate, this line has me scratching my head:

    [quote]Our method of evaluating pollsters has typically involved looking at all the polls that a firm conducted over the final three weeks of the campaign, rather than its very last poll alone. The reason for this is that some polling firms may engage in “herding” toward the end of the campaign, ch ...[text shortened]... llster was inaccurate because it predicted Obama underperforming with 2 weeks to go seems silly.
    But some pollsters have a clear pattern of "herding", and it looks suspicious - like someone trying to manipulate the election, then moving the results to the mean at the last minute to preserve their reputation so they can manipulate again the next time around.
8. Standard member Soothfast
    0,1,1,2,3,5,8,13,21,
    Planet Rain
    Joined
    04 Mar '04
    Moves
    2701
    14 Nov '12 23:15
    Originally posted by Kunsoo
    But some pollsters have a clear pattern of "herding", and it looks suspicious - like someone trying to manipulate the election, then moving the results to the mean at the last minute to preserve their reputation so they can manipulate again the next time around.
    Those bastards.