Cyberbullying is intimidation and harassment carried out through digital channels such as social networks, instant messaging applications, online gaming platforms, and mobile devices.

According to the United Nations Children’s Fund (UNICEF), this behavior is often repeated and is intended to instill fear and distrust in its victims, harming the person’s integrity.

It can take the form of behavior that invades the victim’s privacy, such as sending hurtful, abusive, or threatening messages, images, or videos, or spreading misleading content about the person through messaging applications or social networks.


This abusive behavior is even more pronounced on dating apps, according to a survey conducted by the app Bumble in collaboration with the consultancy firm Ipsos. Cyberstalking, online sexual harassment, offensive insults, and revenge porn are the most common negative practices in this digital environment.

The research, conducted in Colombia and Mexico, found that women (97%) and members of the LGBTIQ+ community (82%) are the populations most affected in dating conversations. Ninety-two percent of respondents believe cyberbullying affects these groups more frequently than men.

“It is a brutal, everyday form of violence that those who experience it are forced to confront. Women online often feel distressed, abused, and vulnerable,” said Payton Iheme, Bumble’s vice president and head of Global Public Policy, during a presentation of the results at the app’s headquarters in Austin, Texas.


The platform uses machine learning, a field of artificial intelligence (AI) focused on developing computer systems that automatically improve their performance on specific tasks through experience.


Using these algorithms, the dating app is trained to identify negative patterns and make predictions, so that abusive behavior can be flagged and prioritized quickly and automatically, reducing the harasser’s room to act.
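Bumble has not disclosed how its moderation models are built, so the following is only a minimal sketch of the general technique described above: a supervised text classifier (TF-IDF features plus logistic regression in scikit-learn) trained on labelled chat messages, whose probability score is used to flag content for review. Every message, label, and threshold here is invented for illustration.

```python
# Minimal sketch of learned abuse detection in chat messages.
# The training data and threshold are hypothetical, not Bumble's.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labelled examples: 1 = abusive, 0 = acceptable.
messages = [
    "you are worthless, nobody wants you",
    "send me photos or I'll share your profile everywhere",
    "hey, how was your weekend?",
    "I really enjoyed our conversation yesterday",
]
labels = [1, 1, 0, 0]

# TF-IDF features + logistic regression: a simple "improve through experience" setup.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(messages, labels)

def flag_message(text: str, threshold: float = 0.8) -> bool:
    """Return True if the message should be prioritized for moderation."""
    prob_abusive = model.predict_proba([text])[0][1]
    return prob_abusive >= threshold

# With realistic training data, the first call would tend to return True
# and the second False; with this toy dataset the scores are only indicative.
print(flag_message("nobody wants you, loser"))
print(flag_message("want to grab coffee sometime?"))
```

In production such a classifier would be trained on far larger datasets and combined with human review, but the basic loop of learning patterns from labelled examples and scoring new messages is the same.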

“Bumble has a history of combating misogyny, harassment, and toxicity online. We have implemented safety features such as Private Detector, an AI tool that helps protect our community from unwanted sexually explicit images,” the app’s website states.

This means the app also detects unsolicited photos sent by harassers, such as explicit ‘six-pack’ shots, helping to prevent this kind of abuse when sexually explicit photographs are sent without prior consent.
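The article does not detail how Private Detector works internally, so the sketch below only illustrates the general “blur first, let the recipient decide” flow: an incoming photo is blurred when an explicit-content score crosses a threshold. The explicit_score function is a hypothetical stand-in for a trained image classifier, and the file name and threshold are assumptions.

```python
# Sketch of blurring a received photo when a classifier flags it as explicit.
# explicit_score is a placeholder; a real system would use a trained model.
from PIL import Image, ImageFilter

def explicit_score(image: Image.Image) -> float:
    """Placeholder for a trained explicit-content classifier.
    Returns a probability between 0 and 1; hard-coded here for illustration."""
    return 0.95  # pretend the classifier is confident the image is explicit

def prepare_incoming_photo(path: str, threshold: float = 0.9) -> Image.Image:
    """Blur a received photo if it is scored as explicit,
    so the recipient can choose whether to reveal the original."""
    img = Image.open(path)
    if explicit_score(img) >= threshold:
        return img.filter(ImageFilter.GaussianBlur(radius=25))
    return img

# Hypothetical usage:
# preview = prepare_incoming_photo("incoming_photo.jpg")
# preview.show()
```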

The company’s aim is for more platforms to adopt this approach in order to tackle the problem and promote a safe and secure online space.

NATHALIA GÓMEZ PARRA
DIGITAL REACH DESK
EL TIEMPO

Source: Exame

