Filtering

The Internet safety policy must include what the law defines as a “technology protection measure,” i.e., a software filter or blocker that prevents the display of certain visual depictions, photographs, and illustrations. No particular brand of filter is required, but the filter must perform specific duties. It must govern Internet access by both adults and minors and block three types of visual depictions.

  • Obscenity.
  • Child pornography.
  • Material that is “harmful to minors.”

The law does not provide an express definition of obscenity. In Miller v. California (1973), the Supreme Court laid out its famous three-part “community standards” test, now typically used to determine what is obscene. The test requires a court to determine three things:

  • Whether “the average person, applying contemporary community standards,” would find that the material, taken as a whole, appeals to the prurient interest.
  • Whether the work depicts or describes, in a patently offensive way, sexual conduct specifically defined by the applicable state or federal law to be obscene.
  • Whether the work, taken as a whole, lacks serious literary, artistic, political, or scientific value.

The law goes into specific detail about what constitutes material “harmful to minors,” defining a minor as anyone under the age of 17. It states that the term “means any picture, image, graphic image file, or other visual depiction” with the following characteristics:

  • Taken as a whole and with respect to minors, appeals to a prurient interest in nudity, sex, or excretion.
  • Depicts, describes, or represents in a patently offensive way with respect to what is suitable for minors, an actual or simulated sexual act or sexual contact, actual or simulated normal or perverted sexual acts, or a lewd exhibition of the genitals.
  • Taken as a whole, lacks serious literary, artistic, political, or scientific value as to minors.

Adults are not subject to the restrictions on material harmful to minors. The law also regulates only visual depictions; text on web pages is not covered.
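The blocking rules described above can be summarized as a small decision procedure. The following is a toy sketch of that logic only; the category names, function, and age handling are illustrative assumptions, not part of the law or of any real filtering product:

```python
# Hypothetical sketch of the access rules described above.
# Category labels are illustrative assumptions, not legal terms of art.

BLOCKED_FOR_EVERYONE = {"obscenity", "child_pornography"}
BLOCKED_FOR_MINORS_ONLY = {"harmful_to_minors"}

MINOR_AGE_CUTOFF = 17  # the law defines a minor as anyone under 17


def is_blocked(categories: set, age: int, is_visual: bool) -> bool:
    """Return True if a depiction should be blocked for this user."""
    if not is_visual:
        return False  # the law covers visual depictions, not text
    if categories & BLOCKED_FOR_EVERYONE:
        return True  # obscenity and child pornography: blocked for all
    if age < MINOR_AGE_CUTOFF and categories & BLOCKED_FOR_MINORS_ONLY:
        return True  # "harmful to minors" material: blocked for minors only
    return False


# An adult may view material that is merely "harmful to minors";
# a 16-year-old may not; obscenity is blocked for everyone.
print(is_blocked({"harmful_to_minors"}, age=35, is_visual=True))  # False
print(is_blocked({"harmful_to_minors"}, age=16, is_visual=True))  # True
print(is_blocked({"obscenity"}, age=35, is_visual=True))          # True
```

Note how the sketch captures the law's two asymmetries: the adult/minor distinction applies only to the third category, and nothing is blocked unless it is a visual depiction.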

The law makes a further distinction between what is “harmful” to minors and what is merely “inappropriate” for minors. While it carefully defines harmful material, it leaves the definition of inappropriate material up to local community control. The FCC declined to be more specific in this area in its rulemaking capacity, instead leaving such definitions up to school boards, library boards, and other local authorities.


Inside Filtering