The always-sharp P.J. Doland has one of those five-dollar-bill-on-the-sidewalk ideas that, once you’ve read it, seems so naturally sensible that you almost can’t believe it hasn’t already been implemented: The Not Safe for Work HTML attribute. The idea is that coders adopt, and browser makers begin to recognize, an “nsfw” attribute that can be applied to hyperlinks, div blocks, paragraphs, images, whole pages, whatever. Users (or their employers, or their parents) then configure their browsers to occlude content or disregard links that Web authors have determined might be problematic. And search engine filters like Google’s SafeSearch could in turn use the flags as input in calculating the likelihood that a given site has content people might want filtered.
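Just to make the user-side half of that concrete, here’s a minimal sketch of what a browser (or a browser extension’s content script) might do with such a flag. Everything specific here is my own assumption rather than anything in Doland’s write-up: the bare attribute name “nsfw,” the blur treatment, and the crude handling of flagged links.

```typescript
// Hypothetical sketch: honor an assumed bare "nsfw" attribute on the user's
// side, e.g. as a content script injected into pages the user visits.
// The attribute name and the visual treatment are assumptions for illustration.

function occludeFlaggedContent(root: ParentNode = document): void {
  // Occlude anything an author has flagged: images, divs, paragraphs, etc.
  root.querySelectorAll<HTMLElement>("[nsfw]").forEach((el) => {
    el.style.filter = "blur(12px)";
    el.setAttribute("aria-hidden", "true");
  });

  // Disregard flagged hyperlinks by stripping their targets.
  root.querySelectorAll<HTMLAnchorElement>("a[nsfw]").forEach((a) => {
    a.removeAttribute("href");
    a.title = "Link suppressed by your NSFW filter";
  });
}

occludeFlaggedContent();
```

Blurring rather than removing flagged elements keeps the page layout intact, and stripping the href is the bluntest possible way of “disregarding” a link; a real implementation would presumably live in the browser itself, behind a preference the user (or employer, or parent) toggles.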
The only (though perennial) fear is that if it becomes sufficiently standard, some wise legislator will decide that failure to apply the attribute to Objectively Naughty content is tantamount to an attempt to slip raunch to unwilling or unwitting viewers, and will make it mandatory, For The Children. But it’s probably a bad idea to shy away from smart self-regulatory schemes just because they’d be objectionable if implemented coercively. In fact, when you think about it, this provides a sort of benign buffer for censorious impulses, much as the existence of filter software, however imperfect, persuaded judges that more aggressive censorship was not the “least restrictive means” of keeping “harmful” content away from minors. (Leaving aside, of course, the irksome question of exactly what evidence has been adduced to suggest that exposure to pixelated video of a blowjob will actually harm anyone.) In other words, if we ever do lose a fight over Internet censorship, I’d rather it be at the edge of a ditch like this, where private users still have ultimate control over how their browsers handle the attribute, than along some deeper chasm. Though as my lovely housemate recently observed, such attempts are farcically doomed enterprises on a global Web in any event.