There's nothing like racking up a load of retweets for a cheap ego boost, but retweets also bring more tangible benefits: they drive site traffic and increase exposure for your business.
There are tactics we assume will work (e.g. putting 'Justin Bieber' in the headline), but a new report by Hewlett Packard and UCLA tries to create a model that allows news agencies to predict how many retweets they will get.
It suggests this can be done with an accuracy of 84%, which is a bold claim.
This fosters "the possibility of appropriate decision making to modify an article and the manner of its publication".
We've previously blogged about how to optimise Twitter headlines and 20 ways to get more retweets, but the HP report looks at four different factors (a rough sketch of how these might be encoded follows the list):
- The news source that generates and posts the article.
- The category of news this article falls under.
- The subjectivity of the language in the article.
- Named entities mentioned in the article.
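The report doesn't spell out its feature pipeline, so here is a minimal sketch, assuming a simple dictionary-based encoding of those four signals; every field name and example value below is hypothetical rather than drawn from the study:

```python
# Hypothetical encoding of the four factors described above.
# Field names and example values are illustrative, not from the report.

article = {
    "source": "Mashable",           # the news source that posted the article
    "category": "Technology",       # the news category it falls under
    "subjectivity": 0.6,            # 0 = objective language, 1 = subjective
    "named_entities": ["Google"],   # well-known people or organisations mentioned
}

def to_features(article):
    """Flatten an article record into a plain feature dictionary."""
    return {
        f"source={article['source']}": 1,        # one-hot style categorical feature
        f"category={article['category']}": 1,
        "subjectivity": article["subjectivity"],
        "num_named_entities": len(article["named_entities"]),
    }

print(to_features(article))
```

Dictionaries like these can then be one-hot encoded (for instance with scikit-learn's DictVectorizer) and fed to any standard classifier or regression model.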
The data from the study was collected from Feedzilla, with popularity of articles measured as the number of times a news URL is posted on Twitter.
Hot Topics
News related to technology was the most prominent category in the study in terms of the sheer number of links posted. However, categories such as health had fewer published links but higher rates of tweets per link.
These categories perhaps have a niche following and loyal readers who are intent on posting and retweeting their links.
A graph in the report compares the average number of tweets per article (t-density) in each category with the number of links posted.
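T-density is just tweets divided by links posted, so a back-of-the-envelope calculation looks like this (the figures are invented for illustration, not taken from the report):

```python
# Tweets-per-link ("t-density") by category; the figures below are made up.
categories = {
    #              (total tweets, links posted)
    "Technology": (90_000, 10_000),
    "Health":     (12_000,    800),
}

for name, (tweets, links) in categories.items():
    print(f"{name}: {tweets / links:.1f} tweets per link")

# A category can publish far fewer links yet still show a higher t-density.
```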
Traditional News vs. New Media
The report's authors also looked at whether news sources featured more often on Google News (using ratings service NewsKnife) are also more popular on Twitter.
They found that while traditionally prominent news agencies such as Reuters and the Wall Street Journal tend to perform better on Google, the top ten sources on Twitter include marketing and tech blogs such as Mashable, Search Engine Land and the Google Blog.
It is also worth noting that there is a bias toward news and opinion on web marketing, indicating that these sites actively use their own techniques to increase their visibility on Twitter.
A comparison found that The Christian Science Monitor received an average of 16 tweets per article, while Mashable gained an average of nearly 1,000.
Predicting zero-tweet articles
As you might expect, the most significant feature in predicting which links will get zero tweets is the source, followed by its category. This is perfectly logical, as major sites such as Mashable have spambots that automatically retweet their content.
However, the study also found that name-dropping a well-known celebrity, and the tone of the language, provided additional information for this prediction.
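As a hedged illustration of that kind of zero-vs-some prediction, the sketch below trains a toy classifier and inspects its feature importances; the data, the choice of a random forest, and the importances it produces are assumptions for illustration, not the report's actual method or results:

```python
# Toy zero-vs-nonzero tweet classifier; data and model choice are illustrative only.
from sklearn.feature_extraction import DictVectorizer
from sklearn.ensemble import RandomForestClassifier

rows = [
    ({"source": "Mashable",  "category": "Technology", "subjectivity": 0.4}, 1),
    ({"source": "SmallBlog", "category": "Health",     "subjectivity": 0.7}, 0),
    ({"source": "Mashable",  "category": "Business",   "subjectivity": 0.3}, 1),
    ({"source": "SmallBlog", "category": "Technology", "subjectivity": 0.5}, 0),
]

vec = DictVectorizer(sparse=False)
X = vec.fit_transform([features for features, _ in rows])
y = [got_tweets for _, got_tweets in rows]

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

# Inspect which features the model leans on; with real data, the report found
# the source mattered most, followed by the category.
for name, weight in sorted(zip(vec.get_feature_names_out(), model.feature_importances_),
                           key=lambda pair: -pair[1]):
    print(f"{name}: {weight:.2f}")
```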
Conclusion
The authors admit that while it is still impossible to predict the exact number of retweets, they can provide "a range of popularity for the article on Twitter".
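In practice that means predicting a popularity bracket rather than an exact count; a trivial way to express such binning (the cut-off points here are an assumption, not thresholds from the paper) would be:

```python
# Map a tweet count onto a coarse popularity range; thresholds are assumed.
def popularity_range(tweet_count):
    if tweet_count == 0:
        return "none"
    if tweet_count <= 20:
        return "low"
    if tweet_count <= 100:
        return "medium"
    return "high"

print(popularity_range(16))    # e.g. a typical Christian Science Monitor article
print(popularity_range(1000))  # e.g. a typical Mashable article
```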
The report also concludes that while everyone is looking to create content that goes viral, a great many articles spread in only moderate numbers.
These articles can still reach highly interested and informed readers, so the mid-ranges of popularity should not be dismissed.
That said, the most important factor for predicting popularity is something that news organisations can't affect in the short term: the source of the article.
The news category classification in their algorithm didn't perform well as a predictor on its own, and neither the language of an article (subjective vs. objective) nor the inclusion of a famous "named entity" was nearly as important as the source.
So the likes of Mashable, the Google Blog and Allfacebook have a built-in advantage on social media that other organisations can only overcome with several years' hard work in community management.