Civil and Criminal Liabilities for Libraries Related to Using or Failing to Use Internet Filtering Software or Other Content Screening Mechanisms
by Jenner & Block
reprinted under US Copyright § 107 Fair Use
and obtained thanks to Archive.org
SUBJECT: Civil and Criminal Liabilities For Libraries Related to Using or Failing to Use Internet Filtering Software or Other Content Screening Mechanisms
The American Library Association has asked us to prepare a memorandum summarizing the potential liability if a public library uses filtering software or other content screening mechanisms for Internet access, or decides not to use filtering software or other content screening mechanisms.
Before we begin that analysis, we must caution that this memorandum is merely a general discussion of these issues, and is not an opinion letter. Because laws differ from state to state, this memorandum necessarily cannot serve as the basis for legal judgments for any library deciding whether to use filtering software or other content screening mechanisms. Additionally, the law related to Internet use and filtering is changing rapidly as new legislation is adopted and new court challenges are filed. Libraries that offer Internet access should seek legal advice for an analysis of their own particular situation and the current laws of their own state and jurisdiction.
In Reno v. American Civil Liberties Union, 117 S.Ct. 2329 (1997), the United States Supreme Court held unanimously that the government could not criminalize the "display" on the Internet of material that would be "indecent" or "patently offensive" for minors, because such a ban would restrict adult access to constitutionally protected material. The Supreme Court held that the Communications Decency Act was unconstitutional because the government had failed to demonstrate that less restrictive alternatives were unavailable, such as user-based filtering software that parents could use to block or filter their child's access to certain materials.
In some jurisdictions, government authorities are now mandating use of such filtering software by libraries, sometimes out of concern that libraries will be liable under existing statutes if a minor accesses material deemed "obscene," "indecent," or "harmful to minors." The use of filtering software by public libraries rather than by parents raises serious First Amendment concerns.
It is important to note that this memorandum is a discussion merely of potential liability. There are no final, definitive judicial rulings at this time. These issues are percolating throughout the country. Indeed, the Library Board of Trustees in Loudoun County, Virginia was sued on December 23, 1997 for installing filtering software on library computers. A patron has sued the Free Library of Philadelphia on the ground that Internet screening practices interfere with his First Amendment rights. There is likely to be further litigation in other jurisdictions.
There is a serious risk that libraries will be found to have violated the First Amendment if they use filtering software to prevent access to constitutionally protected speech. Given that serious risk, libraries in jurisdictions where "harmful to minors" statutes exist must work with state and local legislatures to ensure that they can protect important First Amendment rights while at the same time avoiding potential civil or criminal liability.
- CIVIL LIABILITY FOR INSTALLING FILTERING SOFTWARE ON LIBRARY COMPUTERS
Many courts have held that public libraries are limited public fora, open to the public for "the communication of the written word."1 The First Amendment prohibits the state from discriminating based on content in public fora unless it can demonstrate that the restriction is necessary to achieve a "compelling" government interest and there are no less restrictive alternatives for achieving that interest.2 Thus, libraries cannot ban books on abortion, even if the ban encompasses both books supporting and opposing abortion, because the ban would be based on the content of the material, i.e., the subject of abortion (rather than other content-neutral criteria such as quality of writing, placement on best sellers' lists, etc.).3 Filtering and other screening mechanisms are, at the least, "content-based" because the filter is based on the sexual, violent or other (presumed) content of the material, and they are often "viewpoint" based because they block certain viewpoints (e.g., anti-Semitic speech). These mechanisms can only be utilized, therefore, if the government (in this case, the library) can demonstrate both a compelling need and that the means utilized is the least restrictive means available to achieve that purpose. In a recent challenge to filtering software in Loudoun County, Virginia -- where the library has installed X-Stop filtering software on all terminals -- a district court held in a preliminary ruling that libraries are limited by the First Amendment in their ability to block patron access to the Internet. The Library Board had argued at oral argument that "a public library could constitutionally prohibit access to speech simply because it was authored by African-Americans, or because it espoused a particular political viewpoint, for example pro-Republican." Feb. 27, 1998 Hearing Transcript at 48.
The district court rejected that argument, holding that "the First Amendment applies to, and limits, the discretion of a public library to place content-based restrictions on access to constitutionally protected materials within its collection." Slip Opinion at 26. The district court held:
- By purchasing Internet access, each Loudoun library has made all Internet publications instantly accessible to its patrons. Unlike an Interlibrary loan or outright book purchase, no appreciable expenditure of library time or resources is required to make a particular Internet publication available to a library patron. In contrast, a library must actually expend resources to restrict Internet access to a publication that is otherwise immediately available. In effect, by purchasing one such publication, the library has purchased them all. The Internet therefore more closely resembles plaintiffs' analogy of a collection of encyclopedias from which defendants have laboriously redacted portions deemed unfit for library patrons. As such, the Library Board's action is more appropriately characterized as a removal decision. We therefore conclude that the principles discussed in the . . . [Board of Education v. Pico, 457 U.S. 853 (1982)], plurality are relevant and apply to the Library Board's decision to promulgate and enforce the Policy.
- The following are various options libraries are considering and some of the potential liabilities associated with each.
- Library uses commercially available blocking software to block broad categories of material on all terminals.
Commercially available blocking software typically blocks broad categories of materials, including materials that are not obscene and do not constitute child pornography and thus are protected by the First Amendment. Adult users could successfully sue public libraries that use filtering software on all terminals because the software would prevent adults from accessing constitutionally protected material. In concluding that the Communications Decency Act was unconstitutional, the Supreme Court held:
- It is true that we have repeatedly recognized the governmental interest in protecting children from harmful materials. . . . But that interest does not justify an unnecessarily broad suppression of speech addressed to adults. As we have explained, the Government may not "reduc[e] the adult population . . . to . . . only what is fit for children."
Older minors also could sue on First Amendment grounds. Minors have First Amendment rights.4 Filtering software that broadly blocks access to content would almost certainly prevent minors from accessing material that has serious value and would be constitutionally protected for older minors, even if that material might be considered harmful and unprotected for younger minors. For example, the Fourth Circuit upheld a Virginia law restricting the display in newsstands of material deemed "harmful to minors" only after the Virginia Supreme Court had interpreted the law to exclude from its prohibition any material "found to have a serious literary, artistic, political or scientific value for a legitimate minority of normal, older adolescents."5 The Eleventh Circuit went even further, holding that a "harmful to minors" restriction could not constitutionally be applied to material in which "any reasonable minor, including a seventeen-year-old, would find serious value."6 Since there are almost certainly less restrictive means of keeping harmful material from young minors, see discussion below, courts would likely sustain an older minor's challenge to blanket use of filtering.
Even younger minors could potentially sue on First Amendment grounds because filtering software would prevent them from accessing constitutionally protected material that is neither obscene nor "harmful" to any minors. Commercially available software often restricts very broad categories of material. Thus, filtering software often blocks sites with the name "sex" or "breast." As a result, sites with information on "breast cancer" and "safe sex" might be blocked -- even though such sites plainly would not be "obscene," "indecent," or "patently offensive" (under most definitions) and plainly would contain constitutionally protected material for everyone.
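The overbreadth described above can be illustrated with a toy sketch of keyword-based blocking. This is a hypothetical simplification written for this memorandum, not the actual logic of X-Stop or any other commercial product; the keyword list and URLs are invented for illustration.

```python
# Toy illustration of naive keyword blocking (a hypothetical sketch,
# not any real filtering product's logic): a site is blocked whenever a
# listed keyword appears anywhere in its URL.
BLOCKED_KEYWORDS = ["sex", "breast"]

def is_blocked(url: str) -> bool:
    """Return True if any blocked keyword appears anywhere in the URL."""
    url = url.lower()
    return any(keyword in url for keyword in BLOCKED_KEYWORDS)

# Overbreadth in action: health and education sites are swept up along
# with whatever the filter was meant to target.
print(is_blocked("example.org/breast-cancer-screening"))   # blocked
print(is_blocked("example.org/safe-sex-education"))        # blocked
print(is_blocked("example.org/middlesex-county-history"))  # blocked ("sex" inside "Middlesex")
print(is_blocked("example.org/astronomy"))                 # not blocked
```

Because the filter matches substrings rather than evaluating content, constitutionally protected health, educational, and even geographic material is blocked alongside the targeted material, which is the core of the First Amendment problem discussed above.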
- Library installs filtering software on all terminals with option for adults to request that software be turned off.
- Library installs filtering software on terminals in children's section of library with unfiltered access on other terminals in library.
Although under this scenario adults would have unfettered access to everything on the Internet, minors still could sue if they were banned from using the non-filtered terminals, because they would not have access to constitutionally protected material of serious value for minors. Because this area is untested, it is not certain whether a minor would be successful in such a suit.
The Supreme Court long has recognized that minors enjoy First Amendment rights. Tinker v. Des Moines Independent Community School District, 393 U.S. 503 (1969). It is also well established that minors' First Amendment rights include the right to receive information. Board of Education v. Pico, 457 U.S. 853 (1982) (plurality opinion). The First Amendment rights of minors are limited in two ways. First, the Supreme Court has held that school officials have significant latitude, in the context of curriculum decisions, to restrict the rights of minors.7 Of course, that curriculum rationale would not extend to officials at public libraries or, presumably, school libraries. Unlike classrooms, libraries serve to provide the community with a broad array of information and are not entrusted with fulfilling specific educational goals, or with inculcating values.
Second, the rights of minors also are limited to the extent that states may deem certain materials "obscene" for minors even if the materials are not deemed "obscene" for adults.8 Accordingly, most states have enacted "harmful to minors" statutes in an attempt to restrict minors' access to materials that would be protected by the First Amendment for adults. There are, however, limits on restricting minors' access to "harmful" material. For example, the Supreme Court has made clear that states may not simply ban minors' exposure to a full category of speech, such as nudity, when only a subset of that category can plausibly be deemed "obscene" for them.9 Additionally, some decisions have suggested that states must determine "obscenity" as to minors by reference to the entire population of minors -- including the oldest minors. Accordingly, any court that reviews a filtering policy would have to take minors' First Amendment rights into account.
- Library provides minors unfiltered access to Internet only with parental permission or provides unfiltered access only if the parents affirmatively object to filtered access.
While some proponents of restricted Internet access have advocated that no minor should have unfiltered access to the Internet in a public library -- even if that minor's parent would permit such access -- others have proposed parental permission requirements in lieu of mandated filtering for every child. Parental permission requirements also pose significant constitutional concerns. Minors have some constitutional rights that exist regardless of whether their parents want them to exercise those rights. For example, even where states require parental notification or consent if a minor is seeking an abortion, those states have been required, as a constitutional matter, to provide a bypass mechanism for a young woman who demonstrates either that she is mature enough to make the decision herself or that parental notification is not in her best interests. Lambert v. Wicklund, 117 S. Ct. 1169 (1997). Surely the potential consequences of an abortion are far greater, and of more legitimate concern to parents, than the consequences of reading a website discussion about abortion.
The American Library Association Library Bill of Rights recognizes that parents, not librarians, should be responsible for what their own children read or borrow from the library.10 The Library Bill of Rights recognizes that librarians should not be in the position of determining what minors can and cannot read. Libraries and librarians should not be instruments of control over the reading selections of minors. Of course, it also is not technologically or administratively possible for libraries to install the infinite range of filtering mechanisms that would be needed to reflect the range of parental choices regarding what would be appropriate for their children. Some parents might want their children to have unlimited access to the Internet; others might want their children to have no access. Some parents might want to filter all "sexual" materials, but nothing else; others might want to filter all "violent" materials, but nothing else. Some parents might want to filter any violence or sexual material; others might want to filter only "excessive" violence or sex. Some parents might want to filter all references to drugs; others might want to filter all references to political information. Obviously, in the context of Internet access, the ALA's policy supporting parental control does little to answer the difficult questions libraries must confront. Libraries will of necessity be forced to make whatever filtering decisions they deem most consistent with the primary library mission of providing access to a broad and diverse range of materials.
But although libraries cannot provide whatever filtering options individual parents might desire, librarians certainly can facilitate parental decision-making by providing clear notification to parents about whatever choices the library has made regarding Internet use and filtering. Thus, for example, if parents are notified that a particular public library has decided to offer unfiltered access to the Internet for all patrons regardless of age, parents can decide not to send their own minor children to the library, or could decide to accompany their children to the library.
There are no judicial cases addressing the constitutionality of parental permission requirements for unrestricted Internet access. Thus, imposing parental permission requirements could subject a public library to lawsuit alleging that such requirements infringe minors' First Amendment rights.
- The Academic Library
Different standards likely will apply in academic library settings as opposed to public library settings. Academic libraries will certainly be given greater latitude than public libraries, on the basis of educational objectives, when restricting access. In Loving v. Boren, 1997 U.S. Dist. Lexis 2921 (W.D. Ok. Jan. 28, 1997), a district judge approved the University of Oklahoma's segregation on its news server of certain newsgroups the university believed might contain obscenity. The university makes the excised newsgroups available only to adults who state that they are using the material for academic or research purposes. Although Loving did not thoroughly address the many difficult First Amendment issues surrounding filtering or access restriction, and is therefore not a strong precedent, the court found that the First Amendment did not prevent the university from allowing access to newsgroups only for "the very [purposes] for which the system was purchased." Id. at *6. It also found that the university computers are not a public forum, see ibid., a finding that may limit the decision's relevance for public libraries and that suggests that some courts might treat public libraries and school libraries differently.
In Urofsky v. Allen, 1998 U.S. Dist. Lexis 2139 (E.D. Va. Feb. 26, 1998), Virginia had enacted a law prohibiting state employees from using state-provided computers to access sexually explicit content on the Internet. The law allowed agency heads to grant waivers in cases where an employee's work required access to forbidden material; information about all waivers would be available to the public. Several state university professors challenged the Act, claiming it interfered with their research. Judge Leonie M. Brinkema (the same judge presiding over the Mainstream Loudoun case) struck down the law on First Amendment grounds. Judge Brinkema analyzed the law as a mass restriction on public employee speech; accordingly, she applied the balancing test of Pickering v. Board of Educ., 391 U.S. 563 (1968), as modified by United States v. National Treasury Employees Union, 513 U.S. 454 (1995). She concluded that the law significantly burdened public employees' access to information and the public's right to benefit from public employees' use of that information in their work. On the other side of the balance, Judge Brinkema found the law both overinclusive and underinclusive as to the government's stated interests of preventing workplace distractions and avoiding hostile work environments, and she noted the existence of content-neutral alternatives to serve those interests. She dismissed the waiver process as unduly stringent, designed to intimidate agency heads into denying waivers, and unworkable in practice.
- CIVIL LIABILITY FOR INSTALLING FILTERING SOFTWARE THAT FAILS TO FILTER OBSCENE MATERIAL OR MATERIAL DEEMED HARMFUL TO MINORS
The use of filtering software also might subject libraries to other types of litigation. If a library uses filtering software and a minor is nevertheless able to access obscene or "harmful" material on the Internet, a parent might sue the library for failing to "protect" the minor. Although the parent ultimately may be unsuccessful in demonstrating that the library had a duty to "protect" the minor from accessing certain materials on library computer screens, or that the minor was actually harmed by such exposure, the litigation itself would be costly and time-consuming.
In the context of interpreting the provisions of the Communications Decency Act, some groups took the position that the "good Samaritan" provision of that Act would provide blanket immunity from civil liability for anyone who, in good faith, uses filtering in order to protect children from inappropriate material. This theory is untested. A library contemplating the use of filtering software should, therefore, seek the advice of legal counsel to determine whether state "harmful to minors" statutes provide such protection from civil suit for the failure of filtering software to filter out material later deemed to be harmful to minors. There also remains the unanswered question of how much filtering is sufficient to provide blanket immunity to the library. Must the library filter out material that would be deemed harmful for all minors or only for young minors? Neither the courts nor the legislatures have answered this question.
- CIVIL LIABILITY FOR FAILURE TO USE FILTERING SOFTWARE
There is also the potential that a parent might sue a library that fails to use filtering software, alleging that his or her child was harmed by being able to access obscene or harmful material from the Internet through library computers. It is not clear whether the tort law of any state would impose a duty on the library to protect minors from harmful material in the library. Nor is it at all certain that a parent suing the library could demonstrate that actual harm occurred because of such Internet access. Laws will differ from jurisdiction to jurisdiction. Libraries must seek legal advice in their own jurisdictions to determine whether such civil liability exists.
Although there are no guaranteed methods for avoiding civil liability, it would be prudent for libraries to take the following precautions in order to foster positive community relations with patrons and to alert patrons to library policies. Libraries should post their Internet policies in a clear and conspicuous manner, thereby alerting parents to the fact that filtering software is not used (if that is the case). Libraries should offer minors access to Internet sites and direct links to sites developed for children, thereby reducing the already slight chance that a minor would accidentally access any obscene or harmful material.11 Libraries can also adopt and conspicuously publicize a clear policy statement -- posted at computer terminals with Internet links -- that library policy prohibits the use of library equipment to access material that is obscene or child pornography or, in the case of minors, material that is harmful to minors.
- CRIMINAL LIABILITY FOR FAILURE TO USE FILTERING SOFTWARE UNDER OBSCENITY, CHILD PORNOGRAPHY AND HARMFUL TO MINORS STATUTES
States prohibit the distribution of obscenity and child pornography. Such material is also prohibited by federal criminal statutes. In many jurisdictions, it also violates state statutes to display material that is deemed "harmful to minors." State statutes governing obscenity, child pornography and "harmful to minors" material differ from jurisdiction to jurisdiction. Some jurisdictions provide a library exemption. Others do not. Many libraries are concerned that the existence of obscenity, child pornography and "harmful to minors" statutes requires them to use filtering software to limit their potential criminal liability under such statutes. Librarians in such jurisdictions must carefully examine their potential liability. However, it is far from certain that such statutes would apply in the context of Internet access from a library computer.
First, such statutes will have scienter requirements. In other words, these criminal statutes will require some "knowledge" on the part of the accused. The library must determine whether the criminal obscenity, child pornography and harmful to minors statutes in its jurisdiction require that the library "know" that harmful material or obscene material was displayed to minors, or whether it would be sufficient to impose liability simply because the defendant failed to take reasonable precautions to prevent the display of such material to minors. The library must explore with counsel whether it would be sufficient to provide notice to Internet users that the library facilities may not be used to access illegal material, and that the privilege of using the Internet at the library is subject to being revoked if the policy against accessing illegal materials is violated.
Second, the library must seek legal advice to determine whether the definitions governing criminal conduct are vague or overbroad. If the statute does not provide a clear definition for the type of material that would be deemed obscene or "harmful" to minors, then the library probably could not be held liable for violating the statute.
Third, the library must determine whether the obscenity, child pornography and harmful to minors statutes in its jurisdiction apply to Internet displays. The library must determine whether providing access to the Internet is sufficient to satisfy "display" or "distribution" criteria in the criminal statutes.
Finally, in some jurisdictions, a library can avoid criminal liability by taking "reasonable precautions" to prevent the display of "harmful" or obscene material to minors. Arguably, under such statutes there must be precautions short of resorting to filtering software that would be deemed reasonable because, as explained above, using filtering software for all patrons would almost certainly result in First Amendment violations. This is, however, an untested area. We would suggest, therefore, that libraries vigorously urge their legislators and municipal leaders to draft legislation that provides libraries with exemptions from liability under criminal statutes.
While legislative efforts are pending, libraries also should consider seeking Attorney General opinions to clarify the extent of their obligations under obscenity, child pornography and harmful to minors statutes. Libraries -- perhaps through their State Library Associations -- can ask the Attorney General of their state to interpret their obligations under their criminal statutes. Libraries can ask for guidance on how to take reasonable precautions that protect the library from criminal liability, but do not put the library in the position of violating the First Amendment rights of its patrons. Although such opinions generally are only advisory and not binding on courts, they certainly will provide the library with useful guidance and will allow the library to show that it was acting in a good faith effort to comply with all statutory and constitutional obligations.
- CIVIL LIABILITY FOR AN ALLEGED HOSTILE WORK ENVIRONMENT RELATED TO PATRON OR EMPLOYEE INTERNET USE
Federal civil rights laws afford employees the "right to work in an environment free from discriminatory intimidation, ridicule and insult." Meritor Savings Bank, FSB v. Vinson, 477 U.S. 57, 65 (1986). The argument has been made that Internet access to materials that are offensive due to their sexually explicit nature or their messages regarding race and ethnicity may subject the library to liability for creating a hostile work environment for its employees. Employers can be subject to liability for the actions of their employees in creating a hostile work environment for another employee. Courts recently have held that an employer can be held liable for the actions of a customer or non-employee when the employer ratifies or acquiesces in harassing behavior by failing to take prompt remedial action and the employer knew or should have known of the offending behavior.12
It is a fundamental principle of First Amendment jurisprudence, however, that a person has a constitutional right to send and receive non-obscene material that may be offensive to others. The government cannot ban speech merely because it is offensive. Libraries thus have the obligation both to provide a non-hostile work environment, and also to insure that patrons and employees can fully exercise their constitutional rights. There is very little guidance from the courts on how these potentially conflicting responsibilities can be resolved.
These issues can, of course, arise regardless of whether the Internet is involved. An employee or patron can bring materials from outside the library to harass an employee of the library. There are several principles that a library must consider in determining what policy is necessary. First, in determining whether behavior is harassment, a court will look to whether it is physically threatening or humiliating, or merely an offensive utterance, and whether it unreasonably interferes with the victim's work. Courts have held that where sexually explicit material was for private use only, read only in private places or shared consensually, the prohibition on possessing such material in the workplace in furtherance of a sexual harassment policy violated the reader's First Amendment rights. Johnson v. County of Los Angeles Fire Dep't, 865 F. Supp. 1430 (C.D. Cal. 1994). A court thus should conclude that an image on an Internet screen alone does not create a hostile work environment for the librarian who happens to see the screen. A hostile work environment might be created, however, if the patron taunted a library employee with images on the Internet screen or left printed images of the materials with the librarian for the purpose of harassing that employee.
Second, there are some steps a library can take to minimize its potential liability. For example, a library should adopt a policy making clear that the library does not condone, encourage or tolerate the harassment of employees by other employees or by patrons through the use of any means, whether or not those means include images from the Internet. The library also should establish a procedure for immediately addressing complaints of a hostile work environment by an employee. Such complaints should without delay be directed for investigation to an employment counselor or legal counsel. Finally, the library can take steps to minimize the exposure of employees to images from the Internet that might be deemed offensive by placing Internet terminals in more private areas of the library.
Filtering would not, however, appear to be the solution to this problem. Filtering cannot guarantee that offensive material will be blocked. Nor can filters prevent employees or patrons from using other materials to harass an employee. The obligation of the library as employer to provide its employees with a non-hostile work environment extends beyond the Internet.
Courts have suggested in some instances that requiring recipients to affirmatively request protected speech to which stigma might attach is unconstitutional. See generally Denver Area Telecommunications Consortium, Inc. v. Federal Communications Commission, 116 S.Ct. 2374 (1996). Proponents of filtering have argued that no stigma attaches if the adult must simply ask for unfiltered access rather than for access to a particular site. In fact, librarians have reported that patrons in their libraries plainly would be stigmatized by having to request unfiltered access, especially if they have to request access to particular sites. In the Loudoun County cases, the district court rejected the Library Board's argument that the ability of patrons to submit a written request to have a site unblocked eliminates any First Amendment problems. The court held that "the unblocking policy forces adult patrons to petition the Government for access to otherwise protected speech." Slip Opinion, at 35.
Additionally, permitting adults to request that filtering or blocking be removed would not address the constitutional rights of minors for whom filtering would almost certainly block constitutionally protected material. As explained above, minors have First Amendment rights that could be violated by broadly applied filtering software.
1Kreimer v. Bureau of Police, 958 F.2d 1242, 1259 (3d Cir. 1992); see also Concerned Women for America, Inc. v. Lafayette County, 883 F.2d 32, 34 (5th Cir. 1989) (library auditorium); Wayfield v. Town of Tisbury, 925 F. Supp. 880, 884-85 (D. Mass. 1996) (holding that state law created a due process right of access to library); Brinkmeier v. City of Freeport, 1993 U.S. Dist. Lexis 9255, *10 (N.D. Ill. July 2, 1993); but see AFSCME Local 2477 v. Billington, 740 F. Supp. 1, 7 (D.D.C. 1990) (holding Library of Congress not a public forum).
2Perry Education Ass'n v. Perry Local Educators' Ass'n, 460 U.S. 37, 46 (1983).
3Banning material because of viewpoint would be unconstitutional not only in a public forum, but also in a non-public forum. R.A.V. v. City of St. Paul, 112 S.Ct. 2538, 2545-46 (1992); City Council v. Taxpayers for Vincent, 466 U.S. 789, 804 (1984).
4Tinker v. Des Moines Independent Community School Dist., 393 U.S. 503, 511 (1969) (affirming minors' First Amendment rights).
5American Booksellers Ass'n v. Virginia, 882 F.2d 125, 127 (4th Cir. 1989) (quoting Commonwealth v. American Booksellers Ass'n, 372 S.E.2d 618, 624 (Va. 1988)), cert. denied, 494 U.S. 1056 (1990).
6American Booksellers v. Webb, 919 F.2d 1493, 1504-05 & n.20 (11th Cir. 1990) (emphasis added), cert. denied, 500 U.S. 942 (1991).
7Board of Education v. Pico, 457 U.S. 853, 871 (1982); Hazelwood School District v. Kuhlmeier, 484 U.S. 260 (1988).
8Ginsberg v. New York, 390 U.S. 629 (1968).
9Erznoznik v. City of Jacksonville, 422 U.S. 205, 212-14 (1975).
10Free Access to Libraries for Minors, An Interpretation of the LIBRARY BILL OF RIGHTS (providing that "[l]ibrarians and governing bodies should maintain that parents -- and only parents -- have the right and the responsibility to restrict the access of their children -- and only their children -- to library resources").
11Of course, libraries should use care in selecting the sites to which children are directed because the risk to the library would increase substantially if those sites contained material that would be obscene or harmful to minors. The American Library Association recently compiled a list of sites for children at http://www.ala.org/parentspage/greatsites.
12See, e.g., Rodriguez-Hernandez v. Miranda-Velez, 132 F.3d 848 (1st Cir. 1998); Folkerson v. Circus Circus Enterprises, Inc., 107 F.3d 754 (9th Cir. 1997); Powell v. Las Vegas Hilton Corp., 841 F. Supp. 1024 (D. Nev. 1992).
Links to non-ALA sites have been provided because these sites may have information of interest. Neither the American Library Association nor the Office for Intellectual Freedom necessarily endorses the views expressed or the facts presented on these sites; and furthermore, ALA and OIF do not endorse any commercial products that may be advertised or available on these sites.