Analysis

People are scrolling past perfect images to find proof a human made them

The hunger for imperfection has become the defining creative tension of this moment — felt in the way people linger on a grainy concert photograph, skip past a suspiciously smooth face, or drive across a city to buy a handmade booklet they could have read as a PDF.
Molly Se-kyung

This is not a nostalgic impulse. It is a detective’s impulse. Across underground music scenes, photography communities, fan cultures, and branded content, audiences have developed an informal but increasingly reliable skill: the ability to detect when a creative object was not made by a person who was actually present in the moment it describes. And what they are doing, behaviorally, is moving away from anything that fails that test — and toward anything that passes it, regardless of technical quality.

The data behind this shift is striking. Consumer enthusiasm for AI-generated creator content collapsed from 60 percent in 2023 to 26 percent by late 2025, according to a Billion Dollar Boy survey of six thousand consumers in the United States and United Kingdom. The Sprout Social Q4 2025 Pulse Survey found that more than half of social media users are now actively concerned when brands post AI-generated content without disclosing it. The term “AI slop” was named Word of the Year 2025 by both Merriam-Webster and the Australian National Dictionary — a linguistic marker that a new category of distrust had found its name. Research by Kapwing estimated that between 21 and 33 percent of YouTube’s feed may already consist of AI-generated or semi-automated content.

Against this backdrop, four distinct behavioral patterns have emerged across different demographics, geographies, and creative contexts — each describing the same underlying search for what theorists might call indexical truth: evidence that something happened to a real person, in a real place, at a real moment in time.

In Warsaw, a loosely networked community of musicians and visual artists documenting Poland’s dream pop and post-rock scenes has spent the past two years deliberately shooting shows on expired 35mm film. The technical results are often unpredictable: color shifts, halation, the occasional blank frame. The community publishes its photographs in limited-run zines sold at show doors for a few dozen złoty. They do not post high-resolution versions online. The photographs circulate as photographs — as objects with a traceable relationship to a specific night, a specific band, a specific room. The deliberate refusal to optimize is a form of testimony. It says: a person with a camera was in this basement at this hour, and this is the chemical record of what the light looked like.

In Helsinki, a smaller but equally intentional scene has formed around the documentation of car culture — specifically the circuit-driving and slow-cruising tradition practiced in provincial Finnish towns on summer weekends. The photographers who document these gatherings favor point-and-shoot cameras and flat flash. The images look, to an outside eye, technically unambitious. But within the community, this visual grammar carries enormous cultural weight. The photographs are understood to be documents rather than artworks — proof of presence at a specific social ritual that has no mainstream cultural representation. Several collections have been exhibited at independent spaces in Helsinki and Tampere, where the rough visual quality is explicitly presented as part of the meaning. The imperfection is the evidence.

The same dynamic is playing out, with significant friction, inside the K-pop industry — but the stakes there are vastly higher. K-pop fan communities have developed sophisticated informal protocols for detecting AI-generated promotional imagery: too-even skin tone, lighting that does not correspond to any physical space, expressions that read as composited rather than captured. When agencies deploy these images — and the evidence of deployment is now common — fan communities document the detection and circulate it widely. The emotional response is not merely aesthetic disappointment. It is closer to betrayal. The parasocial investment K-pop audiences make in their artists depends on an implicit belief that what they are consuming is traceable to a specific person who was present, who felt something, who had that particular expression on that particular day. AI imagery destroys that traceability. A senior creative at a Seoul-based agency told Dazed Digital that the fan objection is not to technology per se: it comes down to humanism and authenticity, and more specifically, to the feeling of being deceived.

In the United States and United Kingdom, the behavioral shift has entered brand contracts. Talent and marketing agencies are now inserting contractual language prohibiting AI-generated imagery in creator deals. Some clients require full disclosure of any AI tool involvement, even in scripting or ideation. The motivating factor is consumer detection. Audiences who can spot AI imagery in a K-pop promotional image can also spot it in an influencer’s product post — and when they spot it, they disengage. The creative director of one agency summarized the market reality with unusual directness: audiences can tell when somebody’s used a ChatGPT script, and a creator who outsources the work wholesale is no longer using AI as a tool to accelerate their own creativity.

The human cost embedded in this shift is not simple. It asks something uncomfortable of the creative class: that they not only produce human-made work but make that human origin legible. The old standard was that technical quality spoke for itself — a beautiful image was a beautiful image, regardless of how it was produced. The new standard adds a provenance requirement. A beautiful image also needs to be demonstrably the product of a person who was somewhere specific, doing something real, bearing witness to something that could not have been generated from a statistical model of prior images. That is a profoundly different creative obligation.

It also puts pressure on a longstanding assumption about what polish communicates. For decades, high production value was coded as professional credibility. In music, in photography, in advertising, in editorial content, the smoothed and perfected output was the prestigious output. That assumption is now unreliable. Overpolished, overmanicured content increasingly resembles generative AI output, as Kara Redman, CEO of brand strategy agency Backroom, observed — and in a market where resemblance to AI is a credibility problem, the strategic value of imperfection has inverted entirely. Less polish is now the signal of more effort, not less.

What is left, as this inversion settles into creative culture, is a premium on a specific and very old quality: the sense that something was made by a person who had something at stake in making it. Not technical correctness. Not visual optimization. The visible trace of a human being who was present — in the basement in Warsaw, in the parking lot in Finland, in the studio in Seoul — and chose to record what they found there, imperfections included.

The audiences who are moving toward this quality are not rejecting technology. They are using it — using precisely the algorithmic infrastructure of social platforms — to find the things that cannot be algorithmically reproduced. That paradox is unlikely to resolve soon. If anything, as generative tools become more capable, the search for indexical truth will become more deliberate, more specific, and more culturally valuable — because what is being searched for is not a style. It is proof of life.
