Every morning, residents of Daejeon, South Korea, receive emergency disaster alerts on their smartphones—typically for typhoons, fires, or other real hazards. But on April 8, a different kind of warning appeared. A wolf that had escaped from a local zoo was spotted near a road intersection, the message said. The alert was based on a single photograph.
Police Arrested a Man in His 40s for Spreading an AI-Generated Image
Daejeon police announced the arrest of a man in his 40s (name withheld) on charges of obstructing the search for "Newk-gu," a wolf that had escaped from the O-World zoo. The suspect allegedly used a generative AI program to create and distribute a fake image of the wolf. The photo, which showed the animal walking near an intersection, spread online just hours after Newk-gu went missing on April 8. Police identified the suspect by analyzing CCTV footage and the man's AI program usage records. During questioning, the suspect reportedly said he did it "for fun." He is being investigated for obstruction of public duty through fraud, a felony that carries a maximum sentence of five years in prison or a fine of up to 10 million won (approximately $6,700).
What Used to Require Photoshop Expertise Now Takes Anyone Minutes
In the past, convincing fake photos required professional editing tools and real skill. This case shows how dramatically generative AI has changed that calculus: the suspect produced a realistic image of a wolf resembling the actual zoo animal in seconds, with no technical expertise. The city of Daejeon used the AI-generated image as the basis for its emergency disaster text alert and even presented the photo as official material during a press briefing. Search teams rushed to the location shown in the image, but the real wolf was nowhere near it. Two-year-old Newk-gu was finally captured safely near a highway last week, nine days after the escape.
The Real Vulnerability Isn't a Prank—It's the Lack of Verification
For developers and public safety officials, the takeaway goes beyond a single hoax. The incident exposed a critical weakness in how public institutions handle AI-generated content: the city of Daejeon used the photo in its emergency alert and official briefings without verifying its authenticity, which suggests no verification protocol for generative AI content existed at all. Meanwhile, Newk-gu has returned to the zoo, where he has become something of a local mascot. A video posted by the zoo of the wolf eating meat has surpassed one million views, though the zoo has decided not to post further updates so the animal can recover.
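What might a minimal verification step look like? As a rough illustration only, an intake check could scan a submitted photo's raw bytes for provenance markers such as C2PA Content Credentials labels or generator names, and default to human review whenever a photo is the sole source. Everything below (the marker list, function names, and policy strings) is a hypothetical sketch assuming Python 3.9+, not the city's process and not a reliable AI-image detector: absent markers prove nothing, since metadata is trivially stripped.

```python
# Hypothetical triage sketch: the marker list is illustrative, not exhaustive.
KNOWN_MARKERS = [
    b"c2pa",        # C2PA / Content Credentials manifest label
    b"jumb",        # JUMBF container box type that carries C2PA data
    b"midjourney",  # example generator names sometimes left in metadata
    b"dall-e",
]

def find_provenance_markers(image_bytes: bytes) -> list[str]:
    """Return any known provenance/generator markers found in the raw bytes."""
    data = image_bytes.lower()
    return [m.decode() for m in KNOWN_MARKERS if m in data]

def alert_policy(image_bytes: bytes, independently_confirmed: bool) -> str:
    """Decide whether a tip photo may drive an emergency alert.

    The point of the policy is its default: an unconfirmed, single-source
    photo is 'hold for review', never 'broadcast'.
    """
    markers = find_provenance_markers(image_bytes)
    if markers:
        return "hold for review: provenance markers found " + str(markers)
    if not independently_confirmed:
        return "hold for review: unverified single-source photo"
    return "eligible for alert"
```

The design point is the failure mode, not the scanner: a clean scan plus no independent confirmation still yields "hold for review", which is exactly the step that was missing before Daejeon's alert went out.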
If a single AI-generated photo can redirect a real search operation, the next target may not be a wolf.




