We Need to Do a Better Job Protecting the Vulnerable in a Technologically Advanced World
Scripture is full of warnings concerning how what we see can have harmful effects on us. Jesus Himself said in Matthew 6:22-23, “The eye is the lamp of the body. So, if your eye is healthy, your whole body will be full of light, but if your eye is bad, your whole body will be full of darkness. If then the light in you is darkness, how great is the darkness!”
And yet, we live in a technologically advanced world, making it nearly impossible to fully protect our eyes from the threats we hardly see coming. Social media, in particular, is a unique environment. Can it be used for good? Certainly. However, in many ways, it has become a petri dish of online pathogens. How many people, both young and old, have unwillingly stumbled across pornographic content online? How many intentionally look for it?
How many young girls have been roped into trafficking schemes through online chatting platforms? How many have purposefully exploited themselves out of a desperate desire for attention or extra money? Cyberbullying is surely not tracked to the extent it ought to be, and the mental harms are through the roof: constant comparisons, comment-section arguments, and whatever else gets shoved in our faces throughout the day. You name it!
Something particularly disturbing is the way online toxins have caused many to do irreversible harm to themselves. Take the story of Chase Nasca, for instance, which is not just heartbreaking, but immensely eye-opening to the dangers of social media and its influence on vulnerable users.
In 2022, 16-year-old Chase ended his life by stepping in front of a train. Court documents from 2025 suggest that TikTok’s algorithm may have had a lot to do with it. Chase’s parents filed a lawsuit against TikTok’s parent company ByteDance in March 2023 on the grounds that the social media app helped drive Chase to take his life by targeting him with an algorithm that promoted “railroad themed suicide videos.” In December 2024, the company tried to dismiss the case, claiming it went against their First Amendment right to “protected speech.” However, they did acknowledge “using geolocation data to push ‘relevant’ content to users.”
The social media giant, which has over two billion active users, also “argued that product liability laws couldn’t be brought against it since it doesn’t provide a ‘tangible’ product.” But Chase’s parents are not willing to let them off the hook.
As the court documents allege, “Some of the videos TikTok directed to Chase, who lived a quarter mile from the LIRR tracks, encouraged young people to end their lives by stepping in front of a moving train. This was no coincidence.” The filing also highlighted how “TikTok used Chase’s geolocating data to send him railroad-themed suicide videos both before and after his death.”
Perhaps most tragically, Chase went to TikTok with the hope of finding “uplifting and motivational” content. However, the Chinese-owned platform pushed morbid videos “focused on his proximity to the Long Island Rail Road” time and again, without anyone else being aware of what Chase was exposed to. Before long, it was too much. On February 18, 2022, on his walk home from the gym, Chase sent this message to his friend just moments before stepping on the tracks: “I can’t do it anymore.”
Chase is not the only one who’s become a victim of the mental distress social media often leads to. Just last year, 14-year-old Sewell Setzer III took his own life in an act that appeared to be influenced by an artificial intelligence (AI) chatbot. “For months, Sewell had become increasingly isolated from his real life as he engaged in highly sexualized conversations with the bot, according to a wrongful death lawsuit filed in a federal court,” the Associated Press reported. The boy began to feel he had a deep affection for the bot. The wildly inappropriate conversations led to one final dialogue, in which the chat showed Sewell saying he was “coming home.” Just after the bot told him to “come home,” the boy took his life.
The fact that incidents like these are occurring more frequently points to a much deeper problem than a cursory glance will reveal. The case-by-case instances are undeniably heartbreaking, but so is the fact that social media giants are willing and able to push suicidal content onto the phone of a 13- or 16-year-old child in the first place. It’s unsettling that several states have yet to pass legislation that would require age verification measures to protect kids from explicit content. The fact that some parents are mocked for being cautious about how much technology their children have proves that not enough people are concerned about this very serious topic.
As Christians, we understand that even the Bible, God’s inerrant word, explains the importance of filling our mind with that which is good. Paul wrote in Philippians 4:8, “Finally, brothers, whatever is true, whatever is honorable, whatever is just, whatever is pure, whatever is lovely, whatever is commendable, if there is any excellence, if there is anything worthy of praise, think about these things.” And in verse 9, Paul emphasized, “What you have learned and received and heard and seen in me — practice these things, and the God of peace will be with you.”
We’re called to hold every thought captive, and that begins with being cautious of what we let in. Of course, there is only so much we can control. But that is why we must make every effort to control what we can. We avoid avenues that lead us directly into the realm of self-deprecation, suicidality, anxiety, anger, and other such emotions. We strive to live lives with our minds fixed on Christ, who is Himself all that is good, true, and beautiful. Focusing on Him is what allows our darkness to be cast out by His glorious light. As Christians, this is what helps us get through this earthly endeavor, and we’re called to share this hope and this light with those around us. I don’t see how social media is any different from the other corrupt aspects of this fallen world.
As I mentioned before, of course social media can be used for good. Since it likely isn’t going away, we should utilize it in healthy and fruitful ways. Each and every one of us has the power to do so. And yet, I dare say that is not enough. We must stand up and say, “no more.”
No more losing children to suicide because their online platforms encouraged them to end their lives. No more letting people, specifically minors, casually stumble across explicit, inappropriate content. We must call out the companies that allow this to happen and put pressure on our lawmakers to put kids first, to put basic well-being and safety first. It doesn’t matter if no one around you is doing it. You have the power to use your voice, and it is my encouragement that you do so.
Especially for the church, I pray we lead by example. Jesus said we will be known for our love, and time and again we are called to stand firm in the truth. May we be the first to love these people who have become victims of a technologically advanced world by standing firm in the truth, and never resting until those in authority hear us and do something about it.
Sarah Holliday is a reporter at The Washington Stand.