In the world of Big Data, privacy and security have two distinct meanings and functions, but they are interrelated. Security concerns the enforcement of policies governing computer use and electronic communications, including identity and authentication, authorization, availability, confidentiality, integrity and auditability. A failure in security becomes a threat to privacy, but violations of privacy can occur without any failure in security.

The promise of Big Data collection and analysis is that the derived data can be used for purposes that benefit individuals, the economy, national security, medical research and urban planning, to name only a few. However, both the sheer volume of data being collected and the unexpected uses to which it is put are raising privacy concerns.

Profiling, discrimination, over-criminalization and other restrictions on freedom, including what you do in the privacy of your own home, become major concerns for privacy advocates, especially when data is being collected on just about everything and in many forms: text, web data, tweets and other social media feeds, sensor data, audio, video, clickstreams, log files and more.

My House is My Castle

Historically, persons and their personal effects were kept private within the home, which is protected from intrusion by law. However, with the increasing use of smartphones, Wi-Fi, sensors, cameras, cloud technology and the whole "Internet of Things" environment, an individual's private behavior is becoming increasingly public, challenging current policy, legislative and judicial approaches to privacy law.

While technology enables many new conveniences, such as smart-home and smart-health applications, the data collected through these applications may be used for purposes beyond their original intent, and without the user's knowledge. Even though most of this data is relatively benign, there is a good chance that more information is collected than is minimally necessary for the application to function, and that it may be put to secondary uses. While this information could be beneficial, it could also be harmful, depending on the context in which it is used.

Consider an environmental monitoring sensor, say a smoke detector, that can distinguish cigarette smoke and report it to your health insurance provider, which matters if you claim a non-smoker's deduction on your premium. But what if the smoker was a guest? It's all about context.

This is where the age-old statement "just because we can, doesn't mean we should" comes into play.

Privacy and Changing Social Norms

Another area affecting privacy is shifting social norms. Social norms describe how members of a group are expected to behave in a given context, and they can vary significantly from culture to culture and generation to generation.

What may be perfectly acceptable to a 20-year-old "digital native," such as the urge to document every aspect of one's life, may be abhorrent to a 50-year-old. Personally, I am happy that I was a teenager long before every action was photographed, posted, tweeted, pinned and tagged.

Consider ambient social apps, technologies that use mobile device location awareness to help users learn about the people around them. One example is Highlight, a mobile app that surfaces information about nearby users: if someone standing near you also has Highlight, their profile shows up on your phone, and you can see their name, photos of them, mutual friends and anything else they have chosen to share. It will even show on a map which connections are nearby, whether they are moving, and in which direction they are going, rather like the Marauder's Map in Harry Potter.

On one hand, it would be cool to know if one of my friends was traveling in the same city as me at the same time. On the other hand, I am not sure I would want a job recruiter to know my whereabouts, or perhaps an old boyfriend. Granted, by installing the app I have consented to its use, so I've chosen this path, but it would be better if I could sometimes just be anonymous.

The Good, The Bad and The Ugly

There is no argument that the availability of all of this data and technology can provide immense benefits. Fusing different kinds of data and processing them in real time can deliver a personalized message, product or service to consumers before they even know they want it. This fusion of data can also be used to predict events, preferences and behaviors with substantial impact on society as a whole, such as preventing a widespread disease outbreak.

Unfortunately, this same information also leaves room for subtle and not-so-subtle forms of discrimination in pricing, services and opportunities; can alter the balance of power between government and citizen; can discourage the exercise of free speech or free association; and can reveal intimate personal details, discerned even from seemingly anonymized data.

Proactive Prevention

Privacy protection as it relates to Big Data has taken multiple forms. The simplest is notice and consent; the key is for individuals to actually read the terms, and if you don't like them, don't use the app. Other initiatives include Do Not Track, which lets users opt out of tracking by websites, including analytics services, advertising networks and social platforms, and the EU's Right to Erasure, which allows individuals to request that service providers delete certain information retained about them. And while these may be useful now, they are not likely to resolve the issues associated with privacy and Big Data going forward.
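Mechanically, Do Not Track is nothing more than an HTTP request header (`DNT: 1`) that the browser sends and that the website is expected to honor voluntarily. A minimal sketch of a server-side check, here in Python against a WSGI-style `environ` dictionary (the function name `should_track` is illustrative, not part of any standard), might look like this:

```python
def should_track(environ):
    """Return False when the client has opted out via Do Not Track.

    WSGI exposes the "DNT" request header as the key "HTTP_DNT";
    a value of "1" means the user has asked not to be tracked.
    """
    return environ.get("HTTP_DNT") != "1"


# A request from a browser sending "DNT: 1" should not be tracked,
# while a request with no DNT header carries no opt-out signal.
print(should_track({"HTTP_DNT": "1"}))  # → False
print(should_track({}))                 # → True
```

Of course, the weakness of the mechanism is visible right in the sketch: nothing forces the server to call such a check, which is precisely why Do Not Track depends on goodwill rather than enforcement.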

Big Data is only going to get bigger with the continuing growth of "smart" devices and applications. As such, privacy policy will need to focus on how this data is used, rather than on how it is collected, stored and retained.

But in the end, it is really all about individual choice, or, as they say, "public by default, private by effort."

You are what you post, and what everyone connected to you posts as well.