Facebook is building a hidden, bot-only platform to learn about trolls and scammers – The Verge
Facebook wants to stop people from abusing its systems, so it's building a world of bots that can imitate them. Company researchers have released a paper on a "Web Enabled Simulation" (WES) for testing the platform: essentially a shadow Facebook where nonexistent users can like, share, and friend (or harass, abuse, and scam) away from human eyes.
Facebook describes building a scaled-down, walled-off simulation of its platform populated by fake users modeling different kinds of real behavior. For example, a "scammer" bot might be trained to connect with "target" bots that exhibit behaviors resembling those of real-life Facebook scam victims. Other bots might be trained to invade fake users' privacy or seek out "bad" content that breaks Facebook's rules.
Software simulations are common, of course, and Facebook is expanding on an earlier automated testing tool called Sapienz. But it calls WES systems distinct because they turn lots of bots loose on something very close to an actual social media platform, not a mockup mimicking its features. While the bots aren't clicking around a literal app or webpage, they send actions like friend requests through Facebook's real code, triggering the same kinds of processes a real user would.
That could help Facebook detect bugs. Researchers can build WES users whose sole purpose is stealing information from other bots, for example, and set them loose on the system. If they suddenly find ways to access more data after an update, that could indicate a vulnerability human scammers might exploit, and no real users would have been affected.
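The bug-hunting idea above can be sketched in miniature. Everything below is illustrative: the `Platform` and `ScammerBot` classes and their methods are hypothetical stand-ins for Facebook's actual infrastructure, which the paper does not detail. The point is only the pattern: run the same harvesting bot against the platform before and after a simulated update, and flag any data it could suddenly see.

```python
class Platform:
    """Stand-in for real platform code that bots and users both go through."""
    def __init__(self, leak_private_fields=False):
        self.users = {}
        # Simulated regression: an "update" that exposes private fields.
        self.leak_private_fields = leak_private_fields

    def register(self, user_id, public, private):
        self.users[user_id] = {"public": public, "private": private}

    def view_profile(self, viewer_id, target_id):
        profile = dict(self.users[target_id]["public"])
        if self.leak_private_fields:  # the bug a WES bot would surface
            profile.update(self.users[target_id]["private"])
        return profile


class ScammerBot:
    """Isolated bot whose only goal is harvesting other bots' data."""
    def __init__(self, bot_id):
        self.bot_id = bot_id

    def harvest(self, platform, target_ids):
        return {t: platform.view_profile(self.bot_id, t) for t in target_ids}


def detect_regression(baseline, updated):
    """List targets where the bot could suddenly see extra fields."""
    return [t for t in updated if set(updated[t]) - set(baseline[t])]


# Run the same bot against the platform before and after a simulated update.
before = Platform(leak_private_fields=False)
after = Platform(leak_private_fields=True)
for p in (before, after):
    p.register("target-1", public={"name": "Alice"},
               private={"email": "alice@example.com"})

bot = ScammerBot("scammer-1")
baseline = bot.harvest(before, ["target-1"])
updated = bot.harvest(after, ["target-1"])
print(detect_regression(baseline, updated))  # → ['target-1']
```

Because the harvesting bot only ever touches other fake users, the check runs with no real accounts at risk, which is the property the researchers emphasize.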
Some bots could get read-only access to the "real" Facebook, as long as they weren't accessing data that violated privacy rules. They could then react to that data in a purely read-only capacity. In other cases, however, Facebook wants to build up an entire parallel social graph. Inside that large-scale fake network, it can deploy "fully isolated bots that can exhibit arbitrary actions and observations," and it can model how users might respond to changes in the platform, something Facebook often does by invisibly rolling out tests to small numbers of real people.
Researchers do, however, caution that "bots must be suitably isolated from real users to ensure that the simulation, although executed on real platform code, does not lead to unexpected interactions between bots and real users."
Facebook calls its system WW, which Protocol plausibly pegs as an abbreviation for "WES World." But as that sentence makes clear, Facebook isn't building Westworld here at all. It's making a simulacron: a world of artificial personality units designed to teach us more about ourselves. While researchers are presumably limiting these interactions for the sake of real users, they're also helpfully preventing any catastrophic existential crises among the bots. Which is only polite, because if you're building a fake universe full of tiny beings who don't know their true nature, you've basically guaranteed that you're starring in a remake of World on a Wire and living in a simulation yourself.