Apple CEO Tim Cook made a very interesting statement at Brussels’ International Data Privacy Day.
Technology does not need vast troves of personal data stitched together across dozens of websites and apps in order to succeed. Advertising existed and thrived for decades without it, and we’re here today because the path of least resistance is rarely the path of wisdom.
If a business is built on misleading users, on data exploitation, on choices that are no choices at all, then it does not deserve our praise. It deserves reform.
We should not look away from the bigger picture. At a moment when rampant disinformation and conspiracy theory are juiced by algorithms, we can no longer turn a blind eye to a theory of technology that says all engagement is good engagement, the longer the better, and all with the goal of collecting as much data as possible.
Too many are still asking the question, ‘How much can we get away with?’ when they need to be asking, ‘What are the consequences?’
What are the consequences of prioritizing conspiracy theories and violent incitement simply because of the high rates of engagement?
What are the consequences of not just tolerating but rewarding content that undermines public trust in life-saving vaccinations?
What are the consequences of seeing thousands of users joining extremist groups and then perpetuating an algorithm that recommends even more?
It is long past time to stop pretending that this approach doesn’t come with a cost: of polarization, of lost trust, and yes, of violence.
A social dilemma cannot be allowed to become a social catastrophe.
The New York Times has an interesting story about a company that uses facial recognition to completely eliminate privacy as we know it.
If we only had one goal, to deter crime at all costs, one of the best strategies would be to put all of us into individual cages. The reason we don’t choose that option is because we all agree that it comes at the cost of what we see as essential to living freely.
Where we sometimes disagree is whether privacy is essential to living freely.
1) One argument is that if you have nothing bad to hide, you have no need for privacy, and that privacy only creates an environment for society’s ills to fester. This was the initial philosophy behind Facebook and the trusted social graph it wanted to build: you could only use your legal name; you couldn’t be anonymous. Anonymity, the thinking goes, leads to things like toxic YouTube comment threads, and worse, it enables criminals to get away with bad things.
2) The other argument is that privacy actually safeguards our freedoms, and it comes down to understanding how imperfect systems are at defining and managing our realities. Facebook learned that its original philosophy had to change when it was confronted with the needs of abuse victims, victims of social persecution, people in witness protection, victims of identity theft, victims of government persecution, and so on. None of these people are bad people, but they have a legitimate need for privacy and for control over their identity. Their very freedom depends on it.
It’s a no-brainer that tools like this will deter crime, and that they have the potential to make life easier. But they only make the world better for the few people who have the privilege of not needing privacy. And it’s all relative: a system that goes down that slippery slope keeps eating away at the weakest and most vulnerable, and by the time the more privileged folks get to understand first-hand what has been lost in the mix, it has become much harder to fix.
MSNBC has an interesting story on the differences in how social networks are used in Japan as compared to the US.
Welcome to Japan’s online social scene, where you’re unlikely to meet anyone you don’t know already. The early promises of a new, open social frontier, akin to the identity-centric world of Facebook and MySpace in the U.S., have been replaced by a realm where people stay safely within their circles of friends and few reveal themselves to strangers.
It reveals some interesting facts about people’s expectations of privacy.
People rarely give their first names to those they don’t know well. Spontaneous exchanges are uncommon even on the tightly packed trains and streets of Tokyo. TV news shows often blur the faces of those caught in background footage and photos to protect their privacy.
This is quite in line with last month’s open letter to Google from a Japanese blogger, pointing out the cultural inappropriateness of Google Street View.
According to the morals of urban area residents in Japan, the assumption that “it is scenery [viewable] from public roads and therefore it must be public” is in fact incorrect. Quite the contrary, [these morals state that] “people walking along public roads must avert their glance from the living spaces right before their eyes”.
"The iPhone was welcomed here with long lines of gadget fans. But it’s also being seen as shockingly alien to this nation’s quirky and closed mobile world… For example, young people in Japan take for granted the ability to share phone numbers, e-mail addresses and other contact information by beaming it from one phone to another over infrared connections. Being without those instantaneous exchanges would be the death knell on the Japanese dating circuit," Kageyama reports. "While the iPhone has Bluetooth wireless links, it has no infrared connection." "Also missing from Steve Jobs’ much-praised design: a hole in the handset for hanging trinkets. Westerners may scoff at them as childish, but having them is a common social practice in Japan," Kageyama reports.
This is a good example of the tension between centralization and specialization of service and control. Making one device or service for everyone is cheap; making it fit the long tail requires intense customization effort and is much harder to achieve.
I have always been very excited about this but have been putting it off. However, the IEEE doesn’t want to let us on-the-edge engineers rest in peace: they went ahead and dedicated an entire issue of Spectrum magazine to embedding RFIDs inside human bodies. Reading about the experiences of the few people who have done it helps reduce the anxiety around it and very strongly tempts me to go ahead and get it done. I have already looked up where I can order the RFID chips and readers online. I am at the last step: I need to place the order and schedule an appointment with a doctor to perform the three-minute insertion procedure.
I suggested this idea to fellow students at SI and got many concerned responses: why do you want to make it easy for Big Brother? Is it safe? What’s the point?
Big Brother (privacy): It’s a passive device for identification and authentication, just like fingerprints, so it is not as scary as the potential scenario of a GPS-enabled chip that radios in to Big Brother at intervals. Safety: well, the IEEE seems to endorse it; they haven’t made active and scary disclaimers about the risks involved, if any. And animals have been RFID-tagged for a long time now. The point: well, to be honest, there is no point. It’s just a cool thing to do, like getting a tattoo; only a geekier one. There is absolutely no compelling reason why the RFID should be under my skin; can I really not be OK with it being in my pocket?
I wanted to find out what Google intends to do with the extremely large amount of information it gathers from its social networking website, Orkut. Going through the terms, I felt happy for a second while I was mid-sentence, but that immediately turned into a frown. I will explain why. This is the sentence I was reading.
You can terminate your account at any time. To learn how, click here. If you terminate your account, your profile, including any messages in your inbox, will be removed from the site and deleted from orkut servers. Because of the way we maintain this service such deletion may not be immediate, and residual copies of your profile information may remain on backup media.
It starts out by assuring you that when you delete your account from Orkut, your profile gets deleted completely and all messages in your inbox are removed from the site and deleted from Orkut’s servers.
It sounds good on first read, but if you are familiar with Orkut you realize that there is much more information in your friends network, your scrapbook, your usage and communication statistics, and so on, which will stay forever. They never make any explicit mention of this data.
Secondly, even if they mean to encompass all of this data under the meta-term “message”, the latter part of the passage undercuts the effectiveness of the first sentence.
“… residual copies may remain on backup media…”. For how long? No mention.
Why can’t Google be more open and frank about how they intend to use and process the data and let people make a conscious choice rather than trying to mislead them?