Ah, summer camp. For those of us who were generally allergic to the outdoors as kids, summer camp did not necessarily mean cabins, building fires, and outdoor recreation. In my case, summer camp usually meant summer orchestra, which was the best. Not only did I get to play music for hours every day, I also got to spend time with a bunch of other orchestra kids who would not even bat an eye if I wanted to discuss which Suzuki book they were on or the pros and cons of catgut strings.
In many ways, AAPOR feels similar to those years of summer orchestra—conference attendees discuss sampling issues and response rates as if they were discussing the weather. This year over one thousand people met at AAPOR to discuss all the intricacies of public opinion research. Beth and I had a really difficult time choosing which talks to go to because they all sounded so relevant and interesting.
This conference in particular gives us a good sense of the changing landscape of survey research and of the big issues in the field. One big issue was which mode to use for survey research. Phone surveys have been the gold standard for opinion research for a while, but with the increasing rate of cell phone use (especially cell-phone-only households) and decreasing response rates, it has become harder and more expensive to maintain phone survey quality. There was at least one presentation about a survey that has transitioned to cell phone only. Other talks discussed the quality of online panel surveys or using multiple survey modes (e.g., following up a paper survey by phone, or an online survey with a paper questionnaire) to increase response rates.
Another big issue was incorporating all the available data into public opinion research. New research has started looking at whether social media and other big datasets can be analyzed as another way of measuring public opinion. A final big issue had to do with transparency in survey research. Although there is a ton of survey data floating around, it is not always clear who is following correct and ethical survey methodology standards and who is not. Moreover, a lot of survey data reports do not include enough information to even judge what standards are being used. AAPOR’s Transparency Initiative encourages organizations to include certain survey methodology information with the public release of data, which allows people to judge the quality of the data.
On top of all the great discussions of survey methodology and of the future of public opinion research, Beth and I also got to meet (briefly) Nate Silver, who received an award from AAPOR this year. (Someone may have texted this photo to her mom.) After the awards reception, AAPOR held a casino night to raise money for student awards. Silver participated in the poker tournament and kindly posed for many photos. He has also posted his own thoughts about some issues raised at the conference.
PHOTO CREDIT: DAVID QUACH