Archive for October, 2008
Google’s editorial guidelines for sponsored links allow political advertising. This can sometimes have strange effects on its AdSense platform. Google Blogoscoped has a story about an anti-same-sex-marriage advertisement being served on the website of a gay community site:
When Google recently decided to allow anti-abortion advertising because of a threat of litigation in the UK, I made this screenshot, showing an anti-abortion advertisement next to the story reporting on Google’s move:
As online media’s sophistication reaches new heights, maybe we should start taking the freedom of Internet users more seriously again? I am not at all convinced this should be seen as ‘innovation’. Real innovation would be a technological tool that measures to what extent media are trying to influence their audiences subconsciously.
Yesterday, I contributed to EDRI-gram with this piece on a new Dutch Code of Conduct for Notice and Take-Down:
5. New Dutch Notice-and-Take-Down Code raises questions
The Dutch government and leading market participants have adopted a new
Notice-and-Take-Down Code of Conduct. The Code seeks to clarify the
responsibilities of internet intermediaries (hosting providers in
particular) when confronted with a notice that online information is
punishable (under Dutch penal law) or unlawful. Reactions to the code
are mixed. Many hosting providers have not signed the Code. Others have
called it symbolic. In fact, the Code seems to obscure the current legal
obligations of internet service providers with regard to punishable and
unlawful material. Unfortunately, the Code does not even mention the
right to freedom of expression and the issue of censorship.
Although the code has no legal status, it goes further than the Dutch law in
a number of ways. The Code states that a notice of a public prosecutor
that material is punishable cannot be questioned by a provider, because
the public prosecutor has already established its illegal character.
However, a recent academic study by the Centre for Cybercrime Studies
(Cycris), commissioned by the Dutch government, revealed the inadequacy
of Dutch laws concerning Notice and Takedown. In particular, it found
that the public prosecutor does not have an adequate legal instrument to
order material to be taken down. The study concluded that “there are
insufficient guarantees built into the process to protect the interests
of Internet users and the information freedoms”. The Dutch government
has responded that it is reviewing the relevant laws, but it has
completely ignored the problem in the context of this new Code.
In the case of notices of punishable and unlawful material from parties
other than the public prosecutor, the Code provides that an intermediary will
remove the material if it is ‘unequivocally’ punishable or unlawful. If
not, the party seeking removal can either seek involvement of law
enforcement agencies or start a civil procedure. There is no explicit
mention of a put-back procedure. The Code does state that intermediaries
have to be careful not to remove more content than the notice points to.
The Code does not change the circumstances under which rights holders
can retrieve identifying data of alleged infringers of copyright. For
this reason, BREIN, the representative of the rights holders in the
Netherlands, has made clear it sees the current Code as unsatisfactory.
To complicate matters, the Code introduces the concept of ‘undesirable’
or ‘harmful’ material. It defines this as material that is not illegal
or unlawful under Dutch law, but material that a provider itself does
not want to host, because of its ‘undesirable’ or ‘harmful’ character.
The Code states that the provider is free to develop such criteria and
treat notices of ‘undesirable’ or ‘harmful’ material the same way as
notices of illegal material. Clearly, government involvement in this
part of the Code of Practice is problematic from the perspective of
freedom of expression. The Code does not clarify which categories of
content can legitimately be considered as ‘undesirable’ or ‘harmful’ by
an intermediary. And unfortunately, the Code does not explicitly forbid
law enforcement agencies from sending notices of ‘undesirable’ or ‘harmful’
material, even though such notices would seem to be illegal.
The Code was adopted in the context of the National Infrastructure
Cybercrime, a public private partnership, which includes several branches of
the Dutch Government, major broadband providers such as KPN, XS4all, and
cable providers. There is no official list of participants in the Code.
Notice-And-Take-Down Code of Conduct (9.10.2008)
Dutch ‘Notice-and-Take-Down’ Code of Conduct issued (14.10.2008)
Cycris Research on art. 54a of the Dutch Penal Code (13.05.2008)
Hosters en Brein sluiten piraterij-compromis (In Dutch only) (9.10.2008)
(Contribution by Joris van Hoboken – EDRi-member Bits of Freedom)
NYTimes Bits reports about Google promoting its G1 on its homepage!
Are these two worlds colliding? Engineers and Google employees who want to believe they work for some kind of special force for good, and business people who want to run a successful business? Below is a statement from Google, taken from Bits, of course pointing to the good cause Google is all about, i.e. promoting earthquake relief efforts. Please, get real, Google…
“We are currently running a homepage promotion. These promotions appear when we launch a major product or service that users might be interested in, or to support a cause that users care about, including the May Sichuan earthquake relief efforts. Because these promotions appear on our homepage for only a few days at a time, we don’t consider them in our official homepage word count.”
Following up on my last post and trying to answer the question Siva Vaidhyanathan is asking: Should we care about Google’s First Click Free and the possible centralizing force of Google’s policy in this regard?
If there were evidence that Googlebot and Google users are treated ‘better’ than others, I would say it is problematic. There does not seem to be such evidence, and I can think of no obvious incentives.
Going a step further, what Google could (try to) do is ‘contract’ with publishers for special treatment of selected users. This would allow Google to extract rent from its knowledge about its users, by selecting users that are more likely to be turned into paying customers for the publisher. However, Google does not appear to do this, or to be willing to do it. This is more the kind of behavior it might show on a site like YouTube.
I would love to see evidence of whether exclusion of websites for certain users happens on a global scale, for instance Europeans using Google.com finding a payment form for a US online newspaper while US citizens find a First Click Free version. I can imagine Google might make an exception in these cases and not consider this cloaking, but I still have to find the answer.
In some countries (France being an example, I believe) there seem to be special deals between Google and publishers in the context of Google News. I have never seen or heard the details of these contracts. There is an initiative in Europe (ACAP) that has developed a more detailed robots instruction protocol, which tries to solve some of the conflicts between search engines and publishers, in my opinion not always in the interest of users. It is based on the idea that robots.txt relates to copyright licensing.
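To see the kind of per-crawler differentiation at stake here, note that even plain robots.txt already lets a publisher speak differently to different bots. A minimal sketch (the site, paths, and bot names are hypothetical), using Python’s standard library parser:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: the archive is closed to all crawlers,
# except Googlebot, which may fetch everything.
rules = """
User-agent: Googlebot
Disallow:

User-agent: *
Disallow: /archive/
""".strip().splitlines()

parser = RobotFileParser()
parser.modified()  # mark rules as freshly loaded (parse() alone does not)
parser.parse(rules)

print(parser.can_fetch("Googlebot", "http://example.com/archive/story.html"))
print(parser.can_fetch("SomeOtherBot", "http://example.com/archive/story.html"))
```

ACAP layers richer, licensing-style instructions on top of this simple allow/disallow model, which is exactly why it shifts the protocol’s meaning from crawler etiquette towards copyright terms.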
Finally, opaque personalization and geo-targeting of results have in practice done what Carr seems to point to. There is no baseline, and there is not one Web for people using Google. Today, for instance, I found that Google.com shows different results for [mccain] in the US and in the Netherlands. The French-fries and potato corporation McCain shows up quite prominently in the Netherlands but not in the US, where I am currently located. I plan to post some screenshots later.
The European Court of Human Rights has an interesting case pending dealing with the right to private life, freedom of expression and effective remedies in the context of the Internet. The case facts are fairly simple but raise a series of complex questions under the European Convention of Human Rights, for instance whether the victim of an online infringement of his right to private life has a right to be able to identify the offender through an intermediary that holds identifying data.
The facts are as follows. An unknown person placed an online solicitation for a sexual relationship under the name of a 12-year-old boy, listing the boy’s name, phone number, date of birth and a picture. A second person contacted the boy and was later identified and prosecuted for it. The notice was taken down, but the publisher of the notice remained unknown, except for his or her dynamic IP address at the time the notice was placed. The victim tried to identify the publisher (with the help of law enforcement) through the ISP that had issued the dynamic IP address. At that time, however, Finnish law did not give the police the authority to order the ISP to hand over the data, because of the low punishment for the crime of acting under a fake identity. Finnish courts affirmed this impossibility. The complainant claims Finnish law does not give him an effective remedy (Article 13 ECHR) under the Convention with regard to an infringement of his private life (Article 8 ECHR).
How will the Court proceed? It seems clear there was an actual infringement of the applicant’s right to private life. Importantly, Article 8 ECHR puts Finland under a positive obligation to ensure respect for private life between private parties (see Von Hannover v. Germany). But how far does such an obligation go? The same Article 8 ECHR lies at the basis of the EU privacy directives, which restrict the availability and accessibility of identifiable traffic data such as IP addresses. And finally, there is a question about the possible protection of anonymous communications under Article 10 of the Convention.
The German Data Retention Working Group has sent comments to the Court (link does not always work).
The Dutch Senate is holding its expert meeting on data retention on 11 November 2008. It also received the answers of the government to its questions relating to the legislative proposal implementing the data retention directive. The answers are a repetition of earlier positions and arguments by the government. The expert meeting will result in additional questions and answers after which it will be ready for a Senate plenary.
Just a few loose thoughts. In two interesting posts on his blog, Nicholas Carr writes about the centripetal forces towards Google. In the case of Google’s new First Click Free policy (for comments see this post at Google Blogoscoped, including an interesting discussion with Matt Cutts in the comments), Google defines its policy in such a way that it reinforces the centripetality towards its search operations.
Seth Finkelstein, in a related post and comments, states that it is plausible that the highly ranked Wikipedia links in Google suck away some attention from better specialist results. I think I can agree with that. Google + Wikipedia is simply the combination requiring the least possible effort, which explains its prominence. I am not sure whether we have to consider this a problem.
First, one has to be somewhat knowledgeable to be able to find and understand ‘specialist’ reports. That takes effort on the side of users, and I do not think a search engine could easily take away the need for that effort.
Second, attention may be the measure of success in the market for eyeballs. From the perspective of real debate and valuable information exchange, however, such attention needs to be qualified. Was the attention meaningful? Did the reader learn something new, or use the information for subsequent action? Maybe 10 readers is sometimes better than 50,000. Measuring in terms of eyeballs merely reflects Google’s choice of popularity as a measure of quality.
I am quite worried about the possibility of publishers ‘speaking’ to Googlebot and Google users differently than to other bots and users, and in fact about all such intensified formal interaction between online publishers and Google. The idea that a newspaper gets to choose whom to speak to in such a way just seems entirely illegitimate to me. It is an interesting legal question whether it is legal in such cases to identify oneself (falsely) as Googlebot or as a Google user, if that is technologically possible. The question becomes different if newspapers get paid for privileged access for Googlebot and Google users. That changes the character of Google’s service.
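For concreteness, the differentiation that a First-Click-Free-style policy requires on the publisher’s side could look something like the sketch below. This is not Google’s or any publisher’s actual implementation; the function name and inputs are hypothetical, and a real system would also verify crawler IP addresses rather than trust self-declared headers:

```python
def choose_version(user_agent: str, referer: str, articles_read: int) -> str:
    """Hypothetical publisher-side logic for a First-Click-Free-style policy."""
    ua = (user_agent or "").lower()
    ref = (referer or "").lower()

    # The crawler must see the full text, or the page cannot rank on its content.
    if "googlebot" in ua:
        return "full"

    # A visitor arriving from a Google results page gets the full article
    # on the first click; subsequent clicks hit the paywall.
    if "google." in ref and articles_read == 0:
        return "full"

    return "paywall"
```

Note that the decision rests entirely on the requester’s claimed identity: the User-Agent and Referer headers are self-declared and trivially spoofed, which is exactly what makes the cloaking and false-identification questions above non-trivial.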
Advocate-General Yves Bot has offered his opinion in the case about the data retention directive challenge by Ireland. The European Court of Justice AG’s opinion is clear, but disappointingly superficial and flawed in some of its reasoning.
Currently, the EU has three pillars. The directive was adopted under the first pillar, the most integrated part of the EU, also called the (European) Community and governed by the EC Treaty. The third pillar is reserved for police and judicial cooperation; there, every member state has a veto. The Constitution and the Lisbon Treaty would have changed this structure significantly, but neither has been ratified.
The Data Retention Directive (2006/24/EC) is based on Article 95 of the EC Treaty, which provides a legal basis for directives regulating the internal market. It amends the ePrivacy Directive (2002/58/EC), which harmonizes privacy in the market for electronic communications and is itself also based on Article 95 EC. The constitutional problem is that a legislative measure can sometimes be adopted under the wrong legal basis for politically strategic reasons (such as preventing a veto by the Irish government). A more fundamental problem is that the pillars in some way reflect (amongst many other things) that measures relating to law enforcement and criminal justice have a more fundamental impact on the relation between the State and its citizens than internal market regulations. Criminal procedural laws, such as data retention laws trying to guarantee traceability, are different from laws regulating roaming or consumer protection.
The AG agrees with the Council that the primary goal of the Data Retention Directive is the harmonization of the internal market. He finds evidence for this in the directive’s mention of obstacles to the internal market arising because some member states adopted data retention legislation and others did not:
85. It follows that, in the absence of harmonisation, a provider of electronic communications services would be faced with costs related to the retention of data which differ according to the Member State in which he wishes to provide those services. Such differences may constitute obstacles to the free movement of electronic communications services between the Member States and may therefore create obstacles to the establishment and functioning of the internal market in electronic communications. They may, in particular, slow down the cross-border development of new electronic communications services which are regularly introduced in the information society. They may also give rise to distortions in competition between undertakings operating on the electronic communications market.
Note the use of the word ‘may’. It is not clear how different costs would obstruct the development of the internal market. It is even more difficult to understand how the directive would prevent this, because costs are something the directive does not harmonize. The AG should have known this. Some member states have decided to let the industry pay all costs, some let the industry pay part, and some, such as the UK, have decided to refund costs. The directive does not solve this problem of costs to any extent. This makes the following conclusions flawed, because they are both based on the cost argument:
86. As is clear from recital 6 in the preamble to Directive 2006/24, such disparities between the laws of the Member States ‘present obstacles to the internal market for electronic communications, since service providers are faced with different requirements regarding the types of traffic and location data to be retained and the conditions and periods of retention’.
87. In so far as Directive 2006/24 proceeds with harmonisation of national laws on the obligation to retain data (Article 3), the categories of data to be retained (Article 5), periods of retention of data (Article 6), and data protection and data security (Article 7), I take the view that it facilitates the development of the internal market for electronic communications by providing common requirements for service providers.
What the AG should have done here is consider the original purpose of the ePrivacy Directive 2002/58/EC. That directive protects the privacy of users of electronic communications networks while ensuring the functioning of the internal market. As with the general Privacy Directive, the idea is that privacy legislation can be an obstacle to the internal market, because it can block the free processing of personal data across the EU. For this internal market reason, these directives harmonize the protection of privacy in a uniform manner.
The question the AG should have asked himself is to what extent the Data Retention Directive ensures the free processing of the traffic and location data in question across the EU. Of course it does not, and I am happy about that, but precisely for this reason I think it should have been discussed (and vetoed) in the third pillar. Data retention is organized at the national level, country by country, with rather extreme differences. To think of it in terms of preventing obstacles to the internal market is simply flawed. The AG summarizes the test as follows:
In summary, in order to justify recourse to Article 95 EC as the legal basis, what matters is that the measure adopted on that basis must actually be intended to improve the conditions for the establishment and functioning of the internal market.
This test is simply not fulfilled.
Although this is unnecessary for the AG’s conclusion, the AG also takes the view that the directive does not provide at all for harmonization of access to data for law enforcement:
Directive 2006/24 contains measures which relate to a stage prior to the implementation of police and judicial cooperation in criminal matters. It does not harmonise the issue of access to data by the competent national law-enforcement authorities.
In my opinion this is wrong. The directive states that it “aims to harmonise Member States’ provisions concerning the obligations of the providers of [providers] with respect to the retention of [data], in order to ensure that the data are available for the purpose of the investigation, detection and prosecution of serious crime, as defined by each Member State in its national law.” So the directive harmonizes that the dataset defined in the directive is available for the detection and prosecution of serious crime. This can only mean that the data are accessible as well.
The infringement of the right to privacy
As was to be expected, the AG did not address the question of the legitimacy of the infringement of privacy; that issue was not before the Court. The following consideration touches on it, stating that the mention of a need to infringe privacy is vital for its justification. A rather formal approach:
the mention of such an overriding requirement of public interest is vital in order to justify the interference by the Community legislature in the right to privacy of the users of electronic communications services.