
Meta has announced a new feature for Threads, allowing users to manage the amount of fact-checked content displayed in their feed.
The move is intended to let users control their exposure to misinformation, giving them a say in how prominently controversial topics appear on the platform.
The control settings offer three levels of adjustment: "Don't reduce," "Reduce," and "Reduce more." None of these options hides content entirely; instead, they affect how posts identified as containing false or misleading information are ranked in the feed.
Threads fact-check: How to set it up
Accessing the feature requires following a specific path within the Threads interface: tap the two lines in the upper-right corner of the profile tab, then select Account > Other account settings (which opens Instagram) > Content preferences > Reduced by fact-checking to customise the setting.
Meta positions these controls as a means to grant users more influence over the algorithm determining post rankings in their feed. The company emphasises its responsiveness to user demands for greater autonomy in shaping its app experience.
Threads fact-check: Concerns
While the concept seems appealing on the surface, some users have raised concerns about potential censorship. NBC News highlighted a post suggesting the tool could be used to suppress content related to the Israel-Hamas war, raising questions about the limits and implications of such user-controlled features.
Meta employs third-party fact-checkers to assess the accuracy of content on Instagram and Facebook. While fact-checkers cannot directly rate Threads content, Meta plans to transfer ratings from Instagram and Facebook to "near-identical content on Threads."
This indirect application raises questions about the consistency and accuracy of the fact-checking process across Meta's platforms.