Some of the most recognizable names in tech, including Snap, Google, Twitch, Meta, and Discord, unveiled Lantern, a pioneering initiative aimed at thwarting child predators who exploit platform vulnerabilities to elude detection. This vital project marks a significant stride forward in safeguarding children, who spend an average of up to nine hours a day online. It underscores a critical truth: protecting our children in the digital realm is a shared responsibility, requiring a unified front from tech behemoths and emerging players alike.
The digital landscape must be inherently safe, with privacy and security embedded at its core, not bolted on as an afterthought. This philosophy propelled my journey to Discord, catalyzed by a profound personal reckoning. In 2017, harrowing discussions about children's online experiences triggered a series of panic attacks for me, leading to a restless quest for solutions. That path culminated, with a gentle nudge from my wife, in the creation of a company that used AI to detect online abuse and make the internet safer for everyone; it was later acquired by Discord. This mission has been, and remains, deeply personal for me.
Discord, now an eight-year-old messaging service with mobile, web, and stand-alone apps and over 150 million monthly users, transcends being just a platform. It's a sanctuary, a space where communities, from gamers to study groups, and notably LGBTQ+ teens, find belonging and safety. Our ethos is distinct: we don't sell user data or clutter experiences with ads. Discord isn’t the place you go to get famous or attract followers. We align our interests squarely with our users, fostering a safe and authentic environment.
However, our identity as something other than a social media company doesn't preclude collaboration with social media companies in safeguarding online spaces. Just as physical venues like airports, stadiums, and hospitals require tailored security measures, digital platforms need safety solutions suited to their unique architectures. With all the different ways that kids experience life online, a one-size-fits-all approach simply won't work. This is not a solitary endeavor but a collective, industry-wide mandate. Innovating in safety is as crucial as innovating in product features, and it requires open-source sharing of advancements and knowledge. For example, our team at Discord has independently developed new technologies to better detect child sexual abuse material (CSAM), and we have chosen to make them open source so that other platforms can use this technology without paying a penny.
The tech industry must confront head-on the most severe threats online, including sexual exploitation and illegal content sharing. Parents rightfully expect the digital products used by their children to embody these safety principles. Lantern is a pivotal advancement in this ongoing mission.
Yet as technology evolves, introducing AI, VR, and other innovations, the safety landscape continually shifts, presenting new challenges that no single entity can tackle alone. The millions the tech sector invests must be used to ensure users on our platforms are actually safe, not to persuade Washington that they're safe. Collaboration is key: our industry can lead by sharing advancements with one another and working alongside legislators, advocacy groups, and academics to ensure a universally safe online experience for our children.
The essence of online safety lies in unprecedented collaboration on a global scale. The launch of Lantern is not just a step but a leap forward, signaling a new era of shared responsibility and action in the digital world. It's time for us to build a safer tomorrow.