Mia Tretta | Could AI Have Stopped Me from Being Shot?

SCV Voices: Guest Commentary

Gun violence has become an alarming trend among America’s youth, tragically emerging as the leading cause of death for children and teens in this country. Forty-four percent of Americans live in a household with a firearm, and 40% use artificial intelligence; both trends are rising rapidly in America. We know gun violence is a threat, but whether AI is dangerous is still up for debate.

When it comes to gun violence prevention, how can AI be used to our advantage? Any possible solution deserves to be pursued, because one in five people has lost a family member to this epidemic, often leaving them feeling as if their suffering goes unnoticed.

Many of us believe, “It will never happen to me,” until we are forced to confront the stark reality that it could happen to anyone. It did to me.

On Nov. 14, 2019, I walked into Saugus High School preoccupied with the usual concerns of adolescence — grades and who was asking whom to Sadie Hawkins. That day, a boy I didn’t know pulled a gun from his backpack and opened fire. In just eight seconds, he injured three and killed two — one of whom was my best friend — before taking his own life. A bullet tore through my stomach, shattering the illusion of safety I once had. As I lay in my hospital bed, I faced a new reality: one where gun violence had invaded my life, leaving permanent scars.

During my recovery, I heard the term “ghost gun” frequently — a phrase that was foreign to me at the time. I was stunned by the loopholes in our system that allowed a 16-year-old to inflict such harm. Now, as a student at Brown University, I see many are more focused on the rise of AI than on the 118 people who lose their lives to gun violence every day.

This raises an important question: What role can AI, specifically ChatGPT, play in preventing gun violence? When faced with dangerous inquiries like, “How can I get a gun with no background check?” or “How can I make a Glock switch?” the response is consistent: “I can’t assist with that.” This simple refusal to provide harmful information can have profound implications. Even if it only prompts someone to open a new tab and reconsider their actions, that moment of hesitation could save a life.

If the shooter had run up against those limits, could there have been a different outcome? Machine learning could analyze patterns in online behavior to identify concerning activities and flag them to law enforcement. This might help pinpoint individuals at risk of committing violence, enabling proactive interventions.

AI could trigger wellness check alerts based on specific criteria, such as alarming online searches or communications. However, this raises ethical concerns about privacy and the possibility of overreach. Technologies that connect with local law enforcement could improve communication between communities and police, fostering trust and safety. AI tools could facilitate the sharing of information about potential threats while respecting individual rights. At the end of the day, if one life is saved, it will be worthwhile.

Every life saved means one less family experiencing heartache — one less mother left grieving, one less father planning a funeral, and one less child without a parent.

By fostering a culture of safety and responsibility, we can harness technology to be part of the solution, turning our focus toward prevention and healing. Together, we can advocate for change and strive for a future where gun violence is no longer a daily threat to our communities.

Mia Tretta, a Saugus High School graduate, is a student at Brown University.
