Facebook has been quietly experimenting with reducing the amount of political content it puts in users’ news feeds. The move is a tacit acknowledgment that the way the company’s algorithms work can be a problem.

The heart of the matter is the distinction between provoking a response and providing content people want. Social media algorithms – the rules their computers follow in deciding the content that you see – rely heavily on people’s behavior to make these decisions. In particular, they watch for content that people respond to or “engage” with by liking, commenting and sharing.

As a computer scientist…
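The engagement-driven ranking described above can be sketched in a few lines. This is a hypothetical illustration, not Facebook's actual system: the `Post` fields, the scoring weights, and the `engagement_score` and `rank_feed` functions are all invented for the example.

```python
# Minimal sketch of engagement-based feed ranking. The weights are
# illustrative assumptions; real ranking systems are far more complex.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    comments: int
    shares: int

def engagement_score(post: Post) -> float:
    # Weight comments and shares above likes, treating them as
    # stronger engagement signals (weights chosen for illustration).
    return post.likes * 1.0 + post.comments * 2.0 + post.shares * 3.0

def rank_feed(posts: list[Post]) -> list[Post]:
    # The highest-engagement content floats to the top of the feed,
    # regardless of whether people actually *wanted* to see it.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("cat video", likes=120, comments=5, shares=2),
    Post("political rant", likes=40, comments=90, shares=60),
])
print([p.title for p in feed])  # ['political rant', 'cat video']
```

Note how the provocative post wins even with far fewer likes: reactions like comments and shares dominate the score, which is exactly the dynamic the article points to.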
This story continues at The Next Web