In 2010, Media Lab students Karthik Dinakar SM ’12, PhD ’17, and Birago Jones SM ’12 began a class project to develop a tool to help content moderation teams at companies like Twitter (now X) and YouTube spot troubling posts. The project generated considerable interest, and the pair were invited to demonstrate their work at a cyberbullying summit at the White House. The night before the event, they realized that their model was missing nuances in…