A Christian entertainment platform for children, TruPlay, was blocked by Google and told it could not advertise on the Google Play store, the platform’s founder and CEO, Brent Dusing, told Breitbart.
Dusing told the outlet that Google informed TruPlay it “can’t promote religious content.” Despite this, Google allows the gaming platform Roblox “to advertise with images of pentagrams, drawn in blood with dismembered bodies. They have transgender games for kids on Roblox — I think you have to have quite a bit of faith to believe in that,” he explained. “They also allow Buddhist products to advertise.”
“So it’s not that they’re against religion, it’s that they’re against Christianity,” Dusing added, noting that “AI models are programmed with a moral code,” so “Google’s AI system reads Christian values as dangerous and harmful.”
Google informed TruPlay that it would not approve an update to the app after it determined there was a violation of Google Play policy.
“We determined that your app content is not appropriate for children,” the message read, adding that prohibited categories include “Apps that depict or encourage gratuitous violence or dangerous activities involving the intended audience” and “Apps that include violence, gore, or shocking content not appropriate for children.”
“For example, your app contains content which is inappropriate for the intended audience,” Google claimed, showing an image of Jesus on the cross.
While Google went after TruPlay, it took no comparable action against a platform that allows “pentagrams and blood to be shown and displayed to children,” Dusing said.
TruPlay has since received an email from Google stating that its appeal of the decision was accepted. “Developers are able to appeal enforcement if they believe an error was made,” a Google spokesperson told Breitbart. “In this case, the developer appeal was approved, and their app update is live on the Play Store.”
Roblox, meanwhile, has been the subject of investigations and lawsuits over a predator scandal. Kentucky Attorney General Russell Coleman filed a lawsuit against the platform in October, alleging that it failed to protect minors. For example, “assassination simulators” began appearing on the platform following the death of Charlie Kirk, and children as young as five had access to “animated bloody depictions of the September 10 shooting.”