I find this article to be quite ironic. To understand the irony, let me back up just a bit and explain the backstory. A few years ago the Australian government decided that they wanted to step up their efforts to protect families in their country from the barrage of pornography that invades our homes via the Internet. Toward this end, they set aside government money to purchase an Internet content filter for any family in the country that wanted one. It was actually a great idea.
However, as is always the case, the devil was in the details, and the success of this type of program is entirely in the implementation. At the time I was working as the Chief Technology Officer of ContentWatch, makers of the well-known NetNanny filter. We had recently purchased the NetNanny brand and had changed the underlying technology from a purely list-based filter (i.e., a blacklist of URLs, or web addresses) to a dynamic content analysis engine (i.e., the technology “reads” the web page and uses a linguistic algorithm to determine whether to block it or not). We were working hard to educate the industry about the fact that a list-based filter would not be able to keep up with the new URLs that would appear on the Internet in the near future. Of course, I am a bit biased, but I believe that we were ahead of the curve.
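To make the distinction concrete, here is a minimal sketch of the two approaches. The function names, word lists, and scoring threshold are my own illustrative assumptions for this example, not NetNanny's actual implementation, which was far more sophisticated.

```python
# Hypothetical contrast between a list-based filter and a dynamic
# content filter. All names and word lists here are illustrative
# assumptions, not any vendor's real technology.

BLACKLIST = {"badsite.example.com"}          # list-based: a fixed set of known URLs
FLAGGED_TERMS = {"explicit", "adult-only"}   # dynamic: terms scored in the page text
THRESHOLD = 2                                # block if flagged terms reach this count

def list_based_filter(url: str) -> bool:
    """Block only if the URL is already on the blacklist."""
    return url in BLACKLIST

def dynamic_content_filter(page_text: str) -> bool:
    """Block based on the page's current content, not its address."""
    words = page_text.lower().split()
    score = sum(words.count(term) for term in FLAGGED_TERMS)
    return score >= THRESHOLD

# A brand-new site slips past the list-based filter (its URL is unknown)...
print(list_based_filter("newsite.example.com"))   # False: not blocked
# ...but a content filter catches it by reading the page itself.
print(dynamic_content_filter("explicit adult-only explicit material"))  # True: blocked
```

The same logic cuts both ways: if a blacklisted site cleans up its content, the content-based check stops blocking it automatically, while the URL list keeps blocking until someone edits the list.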
When we submitted NetNanny to the government entity that was selecting the handful of Internet filters that would be available through this program, we were found to block 97% of the URLs that the Australian government had found over the years to be pornographic. As we looked into the 3% that we did not block, we found that many of them were websites that were once pornographic, but no longer hosted illicit content – the content had changed, and our algorithm recognized that, and did not block the page. We spent quite a bit of time discussing the difference between a dynamic content filter and a list-based filter, in an effort to help them understand that a dynamic analysis of the content on-the-fly was better than a URL list. However, the rules had been set, and to be selected a filter had to block 100%, regardless of the content.
Now for the irony of the above story: NetNanny would have picked up the Wikipedia change that is mentioned in the article, and would have blocked the page – because it looks at the content, not at the website address.
It seems that now, two years later, they are coming to the realization that a list-based approach is not the best way to filter the Internet, and they are now informing parents that a “watchful eye is better than filters”. This statement is not entirely true – I would say that a watchful eye is just as important as a filter – and that a dynamic analysis filter is better than a list-based filter. Neither is perfect, and both have their weaknesses (the pros and cons of both are outlined in my forthcoming book entitled “Cyber Safety: Maintaining Morality in a Digital World”).
It is certainly true that a filter will not block everything, and even that a dynamic filter will block some pages that it shouldn’t. Nothing takes the place of a parent’s watchful eye, but we need to be very careful not to throw out the baby with the bathwater – filters have their place, and provide a needed initial blockade against the filth available on the Internet – but parents also need to know that a filter is not a “set it and forget it” type of technology.
The bottom line: Every home with children should have a filter on their Internet connection, but having a filter installed doesn’t take the responsibility away from parents to stay involved in what their children are doing online.