
HOPE Étudiant β

Examine individual changes


This page lets you examine the variables generated by the abuse filter for an individual change and test them against filters.
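
If the wiki's API is enabled, the same test can be run programmatically. The sketch below is a minimal example in Python, assuming the AbuseFilter extension's abusefiltercheckmatch API module is available and that the account making the request has whatever permissions it requires; the endpoint URL, the log entry ID and the filter text are illustrative placeholders, not values taken from this page.

    # Minimal sketch: ask the wiki whether a filter condition would have
    # matched a change already recorded in the abuse log. Assumes the
    # "abusefiltercheckmatch" module of the AbuseFilter extension is enabled.
    import requests

    API_URL = "https://example.org/w/api.php"   # placeholder endpoint

    # Condition in AbuseFilter rule syntax, to be tested against the change.
    FILTER = r'added_lines irlike "nude-milf\.top" & action == "edit"'

    response = requests.get(API_URL, params={
        "action": "abusefiltercheckmatch",
        "filter": FILTER,
        "logid": 12345,        # placeholder abuse-log entry ID
        "format": "json",
    })
    print(response.json())     # response indicates whether the filter matches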

Variables generated for this change

Variable: Value

Whether or not the edit is marked as minor (minor_edit): (empty)
Name of the user account (user_name): AlfonzoKuefer23
Groups (including implicit) the user is in (user_groups): user, autoconfirmed
Whether or not the user is editing through the mobile interface (user_mobile): (empty)
Page ID (article_articleid): 0
Page namespace (article_namespace): 0
Page title, without the namespace (article_text): Facebook Gets About 500 000 Reports Of Revenge Porn A Month Report Says
Full page title (article_prefixedtext): Facebook Gets About 500 000 Reports Of Revenge Porn A Month Report Says
Action (action): edit
Edit summary/reason (summary): (empty)
Old content model (old_content_model): (empty)
New content model (new_content_model): wikitext
Old page wikitext, before the edit (old_wikitext): (empty)
New page wikitext, after the edit (new_wikitext):
id="article-body" class="row" section="article-body" data-component="trackCWV"><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>has been working for years on tools to prevent and remove revenge porn on its apps, but that apparently trying to share these images. Facebook, which also owns popular apps Instagram, [http://nude-milf.top/ http://nude-milf.top/] Messenger and WhatsApp, has to  each month, according to a report Monday from News. <br>Facebook, the world's largest social network, earlier this year that can spot , also known as nonconsensual intimate images, before being reported by users. In 2017, the company also launched a pilot program that let users  in an effort to prevent them from being shared on the social network. <br>.shortcode.newsletter h2 width:100%!important;<br>However, Facebook's Radha Plumb told NBC News that the initial explanation of the pilot wasn't clear enough, and after negative feedback the company launched a research program in 2018 to explore and support victims. <br><br>"In hearing how terrible the experiences of having your image shared was, the product team was really motivated in trying to figure out what we could do that was better than just responding to reports," Plumb, head of product policy research at Facebook, told NBC News. <br><br>Facebook reportedly now has a team of 25 people, not including content moderators, focused on preventing the nonconsensual sharing of intimate photos and videos.<br><br>Facebook didn't immediately respond to a request for comment. <br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>Now playing:<br>Watch this:<br><br>Zuckerberg introduces Facebook Protect, Pixel 4 reviews...<br><br><br><br><br><br><br>1:08
Unified diff of the changes made by the edit (edit_diff):
@@ -1,1 +1,1 @@ - +id="article-body" class="row" section="article-body" data-component="trackCWV"><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>has been working for years on tools to prevent and remove revenge porn on its apps, but that apparently trying to share these images. Facebook, which also owns popular apps Instagram, [http://nude-milf.top/ http://nude-milf.top/] Messenger and WhatsApp, has to  each month, according to a report Monday from News. <br>Facebook, the world's largest social network, earlier this year that can spot , also known as nonconsensual intimate images, before being reported by users. In 2017, the company also launched a pilot program that let users  in an effort to prevent them from being shared on the social network. <br>.shortcode.newsletter h2 width:100%!important;<br>However, Facebook's Radha Plumb told NBC News that the initial explanation of the pilot wasn't clear enough, and after negative feedback the company launched a research program in 2018 to explore and support victims. <br><br>"In hearing how terrible the experiences of having your image shared was, the product team was really motivated in trying to figure out what we could do that was better than just responding to reports," Plumb, head of product policy research at Facebook, told NBC News. <br><br>Facebook reportedly now has a team of 25 people, not including content moderators, focused on preventing the nonconsensual sharing of intimate photos and videos.<br><br>Facebook didn't immediately respond to a request for comment. <br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>Now playing:<br>Watch this:<br><br>Zuckerberg introduces Facebook Protect, Pixel 4 reviews...<br><br><br><br><br><br><br>1:08
Lines added by the edit (added_lines):
id="article-body" class="row" section="article-body" data-component="trackCWV"><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>has been working for years on tools to prevent and remove revenge porn on its apps, but that apparently trying to share these images. Facebook, which also owns popular apps Instagram, [http://nude-milf.top/ http://nude-milf.top/] Messenger and WhatsApp, has to  each month, according to a report Monday from News. <br>Facebook, the world's largest social network, earlier this year that can spot , also known as nonconsensual intimate images, before being reported by users. In 2017, the company also launched a pilot program that let users  in an effort to prevent them from being shared on the social network. <br>.shortcode.newsletter h2 width:100%!important;<br>However, Facebook's Radha Plumb told NBC News that the initial explanation of the pilot wasn't clear enough, and after negative feedback the company launched a research program in 2018 to explore and support victims. <br><br>"In hearing how terrible the experiences of having your image shared was, the product team was really motivated in trying to figure out what we could do that was better than just responding to reports," Plumb, head of product policy research at Facebook, told NBC News. <br><br>Facebook reportedly now has a team of 25 people, not including content moderators, focused on preventing the nonconsensual sharing of intimate photos and videos.<br><br>Facebook didn't immediately respond to a request for comment. <br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br><br>Now playing:<br>Watch this:<br><br>Zuckerberg introduces Facebook Protect, Pixel 4 reviews...<br><br><br><br><br><br><br>1:08
Unix timestamp of the edit (timestamp):
1657099244
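
As a purely local illustration (it does not call the AbuseFilter engine), the following Python sketch mirrors a few of the variables listed above in a plain dictionary and applies the kind of condition a link-spam filter might encode; the regular expression and the set of exempt groups are assumptions made for the example, not part of this log entry.

    import re

    # A few of the variables from the table above, as plain Python values.
    # added_lines holds only an excerpt of the full value shown above.
    change = {
        "user_name": "AlfonzoKuefer23",
        "user_groups": ["user", "autoconfirmed"],
        "action": "edit",
        "article_namespace": 0,
        "added_lines": "[http://nude-milf.top/ http://nude-milf.top/] Messenger and WhatsApp",
        "timestamp": 1657099244,
    }

    # Illustrative test: an external link to the spam domain, added in the
    # main namespace by a user who is not in an exempt group.
    SPAM_LINK = re.compile(r"https?://\S*nude-milf\.top", re.IGNORECASE)
    EXEMPT_GROUPS = {"sysop", "bot"}   # hypothetical exempt groups

    def would_flag(c):
        return (c["action"] == "edit"
                and c["article_namespace"] == 0
                and not EXEMPT_GROUPS.intersection(c["user_groups"])
                and SPAM_LINK.search(c["added_lines"]) is not None)

    print(would_flag(change))   # prints True for the change examined here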