Let me tell you about something that happened to me yesterday. I was reading the latest article on MediaPost News about a study of Twitter users.
http://www.mediapost.com/publications/?fa=Articles.showArticle&art_aid=104808#commentscomments
The article stated that, according to the survey, Twitter users are motivated by learning new things. Like you, perhaps, I thought, yes, that sounds like me. So I wanted to read more, and I even tracked down the original source of the study.
As I was reading the full details of the study, I came across an interesting statistic: the study said that the average tweeter spends 2.75 hours a day on Twitter. Now, I don't know about you, but that seemed awfully high to me, and I immediately wondered who they had surveyed. So I went to the very bottom of the study, to the "about the study" section, and lo and behold, I found my answer.
This study was not sent to a random cross-section of Twitter users by any means. It went to a self-selected group of Twitter users who responded to tweets asking for participation in a survey. In other words, this study sampled "power tweeters." I hate to tell you, but that selection bias negates the results of this survey. The bottom line is that we simply cannot use its results to make statements about all Twitter users.
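To see concretely why that matters, here is a toy Python simulation, just a sketch with made-up numbers, not data from the study: if the heaviest users are the ones most likely to see and answer a survey announced via tweets, the respondents' average comes out far above the true population average.

```python
# Toy simulation of self-selection bias. All numbers are invented for
# illustration; they are NOT from the MarketingCharts study.
import random

random.seed(0)

# Imagine the "true" population of Twitter users: most are light users,
# a few are heavy "power tweeters." Mean usage here is about 0.5 hours/day.
population = [random.expovariate(1 / 0.5) for _ in range(100_000)]

# Suppose the survey is announced via tweets, so the chance of even seeing
# (and answering) it grows with how much time you spend on Twitter.
def responds(hours):
    return random.random() < min(1.0, hours / 3)

respondents = [h for h in population if responds(h)]

print(f"True population average:  {sum(population) / len(population):.2f} hours/day")
print(f"Self-selected sample avg: {sum(respondents) / len(respondents):.2f} hours/day")
# The sample average lands well above the population average purely because
# of who chose to respond, not because of anything about the population.
```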
Lesson learned: be careful with all those statistics being tossed about lately based on studies. Check the sources carefully, and check how the studies were conducted.
For the full study see:
http://www.marketingcharts.com/interactive/tweeters-motivated-by-learning-immediacy-8864/
Perry
This is an age-old debate in market research circles, and I have run across the same problems when surveying our own students and prospects to determine demand for a particular program. If we send out a survey asking about interest in a course, certificate, or degree, those who are interested are more likely to respond favorably. Those who were not interested could easily have opted not to answer the survey at all, which means their answers were never counted in the final tabulation. The results, therefore, become misleading.
What we've learned to do in these circumstances is ask additional questions that determine the "likelihood" of a student registering for the course. In one survey we conducted to determine which of our graduate students would be interested in an intensive foreign language noncredit certificate, the results in favor of the "idea" were overwhelming: about 98 percent. But when respondents were asked about their motivation for registering and the barriers that would prevent them from registering, the "likelihood" figure was less than half, at 48 percent.
Thus, the interpretation and conclusions are clearer, more meaningful, and more reliable. While the level of interest was high, barriers such as cost, course load, and scheduling flexibility affected motivation. In our final report, we recommended that SCPS carefully consider these factors in its decision to develop (or not develop) the certificate program.
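Here is a minimal Python sketch of how tabulating "interest" separately from "likelihood" might look; the responses and barrier labels below are invented for illustration and are not our actual survey data.

```python
# Illustrative sketch: separate "in favor of the idea" from "likely to register"
# and count the barriers cited. All response data here is made up.
from collections import Counter

survey_responses = [
    # (interested_in_idea, likely_to_register, barrier_cited)
    (True,  True,  None),
    (True,  False, "cost"),
    (True,  False, "course load"),
    (True,  True,  None),
    (True,  False, "scheduling"),
    (False, False, None),
]

n = len(survey_responses)
interested = sum(1 for i, _, _ in survey_responses if i)
likely     = sum(1 for _, l, _ in survey_responses if l)

print(f"In favor of the idea: {interested / n:.0%}")
print(f"Likely to register:   {likely / n:.0%}")

# Counting the barriers cited shows *why* interest doesn't translate into
# registrations, which is what a recommendation should actually rest on.
barriers = Counter(b for _, _, b in survey_responses if b)
print("Barriers:", dict(barriers))
```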
Rebecca MB. Pearson, Research Associate, NYU-SCPS
rebecca.pearson@nyu.edu