A lot of my time is spent thinking about the Internet as a public place. That may seem like an obvious and intuitive concept to grasp, but it is practically difficult for a number of reasons. Some of these reasons are legal, such as copyright law, and others are technical.
Many of Facebook's struggles are, at their core, symptoms of a public vs. private schizophrenia that massive centralized platforms are beginning to suffer from. Wikipedia is one solid counter-example: most of its decisions and policies are the result of decentralized consensus or votes.
The current row over whether Facebook should allow Holocaust deniers to organize on the site at first appears to be a freedom of speech issue. This is certainly how the Facebook team has justified allowing certain groups to stay online. But because it is all happening on Facebook's servers, it is also (and perhaps singularly) a Facebook Terms of Service issue.
Facebook has the right to throw people off its service for reasons it deems appropriate, just as Club Penguin has the right to prevent children from cursing at each other while playing a video game. Facebook is not the United States government, and it is therefore not subject to the same kind of First Amendment scrutiny when censoring speech.
But Facebook is a government of some kind. With over 175 million users, the site is now more populous than most countries. They're also holding elections and convening debate over the rights and responsibilities of their users. It's clear that they are governing users' actions much in the same way that a government governs citizens' actions, but it is now totally unclear what inalienable rights Facebook users have when engaging with their friends and colleagues in what has become a public space. It is my hope that projects like Autonomo.us will help shift the debate towards greater user freedom and data portability in the long run, but we aren't there yet. More specifically, whether Facebook respects an external bill of rights (as drafted by Autonomo.us) is a separate issue from whether Facebook will ever legally be considered a public or private space. This battle has already occurred in the physical world, and the law seems conflicted over whether massive private spaces can be considered public. In Iowa, malls are considered private property, but New Jersey's State Supreme Court disagrees, and in the 1980 Supreme Court decision Pruneyard Shopping Center v. Robins, the court held that states like California could affirm free speech rights in places like malls.
The ToS modification fiasco is another example of Facebook's public vs. private schizophrenia. At the heart of the blow-up over the revised Terms of Service was a sentence claiming that users' content "will survive" on Facebook even after the user deletes their account. Consumerist rightly interpreted this phrase as allowing Facebook to exploit (if not behave as if it owns) your content in perpetuity. This was a dire and cynical prediction, but not an unfounded one. Julius Harper did a masterful job of organizing the outrage over the modified ToS and was subsequently invited into the negotiations, which was certainly a step in the right direction.
A good-will interpretation of Facebook's new phrasing was that the site's administrators couldn't be absolutely sure that all of your content would be gone once you deleted your account. Consequently, Facebook's lawyers wanted to preclude liability (privacy, copyright, and otherwise) if your content happened to show up somewhere in a backup or internally archived version of the site. Anyone familiar with running (and backing up) a user platform will be aware of the complexity involved in keeping track of user data across many servers, so do not dismiss this as an easy task until you've talked to a server administrator.
But there was also a feature-based reasoning behind Facebook's ToS modification. Facebook did not want to be obligated to remove messages, wall posts, and photos from other users' accounts and inboxes simply because one user deleted their account.
If Alice sent Bob a message on Facebook and then deleted her account, should Facebook be obligated to remove Alice's message from Bob's Facebook Inbox? This is something the site could do very easily. We've all seen instances of our friends removing status updates, profile information, or photos, so there's no question Facebook could unilaterally perform the same action without our permission. But our intuition says that it shouldn't. Even though Bob may not own the copyright to reproduce Alice's content, he should at least be afforded the dignity of perpetually retaining a record of his communication with her, despite her desire to remove her presence from Facebook.
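The design choice at stake here can be sketched as a toy data model. This is purely illustrative and assumes nothing about Facebook's actual architecture: the class names and structure are my own invention. The point is simply that if each recipient holds an independent copy of a message, deleting the sender's account need not (and, per the intuition above, should not) touch anyone else's inbox:

```python
from dataclasses import dataclass, field

@dataclass
class Message:
    sender: str
    body: str

@dataclass
class MessageStore:
    # Hypothetical model: each recipient keeps their own copy of every
    # message they receive, keyed by recipient username.
    inboxes: dict = field(default_factory=dict)

    def send(self, sender: str, recipient: str, body: str) -> None:
        self.inboxes.setdefault(recipient, []).append(Message(sender, body))

    def delete_account(self, user: str) -> None:
        # Remove the user's own inbox, but deliberately leave messages
        # they sent sitting in other users' inboxes untouched.
        self.inboxes.pop(user, None)

store = MessageStore()
store.send("alice", "bob", "Hello, Bob!")
store.delete_account("alice")

# Bob still holds a record of Alice's message.
assert [m.body for m in store.inboxes["bob"]] == ["Hello, Bob!"]
```

Under this model, "deleting an account" is scoped to the departing user's own data, which mirrors the e-mail norm discussed next: Alice's departure does not reach into Bob's records.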
This is how the Internet works: if Alice and Bob were communicating over e-mail, there would be no question as to whether Bob had the right to retain Alice's e-mails even if she deleted her e-mail account.
But Facebook is not the public Internet, where no user has control over servers across the world. Quite the opposite: Facebook does have control over everything and can actually unilaterally delete e-mails out of inboxes. This presents a unique liability and responsibility that the company's lawyers were interested in attenuating. I wouldn't be surprised if it was motivated by the threat of a lawsuit from an angry user wanting *all* of their content off the site, including messages sent to other users.
Ultimately, Facebook's desire to retain the metaphors of Internet communication is at odds with the company's power to unilaterally control that communication. While Facebook actually has the power to delete Alice's e-mails from Bob's Facebook Inbox, they choose not to, out of respect for norms established long ago on the public Internet. In other words, Facebook is attempting to behave like a public space while remaining a private company by crafting its own rules and laws.
There's also the issue of public disclosure of private facts on Facebook. American law prevents me from disclosing private facts about Alice that are not newsworthy. However, if Alice had disclosed such private facts in a public space (perhaps in front of a large audience), I could pass the facts on to others and even publish them.
But what if Alice discloses her private fact on her Facebook profile? It remains private in the sense that only her friends and I can see it by logging into Facebook's private service, but it is also arguably public in the sense that her friends and I are an audience. Does it matter how many friends she has? What privacy settings does she have in place?
The public and private nature of Facebook feels very complicated.
In the end, I don't think the phrase "walled garden" suits the scale and character of these kinds of issues anymore, as we're no longer talking just about access to content. These issues are about government, control, public spaces, and censorship, so our freedom and laws should apply accordingly.