Our data privacy dialogue
Most of us are aware that our use of social media, online shopping, email newsletter sign-ups, store loyalty cards and travel apps on our smartphones all generates data. Many of us, sometimes begrudgingly, accept that marketing teams use this data to create offers and experiences tailored to us. Despite this, data security and privacy – particularly relating to personal data – are topics that generate concern.
Today is Data Privacy Day – an international endeavour that aims to create awareness about the importance of respecting privacy, safeguarding data and enabling trust. We want to explore these themes, and our responsibilities at the Urban Big Data Centre, in the first of a series of blogs on data privacy.
We all have a right to privacy – protected by laws and conventions – and want to ensure our personal information is used fairly, lawfully and only when necessary.
At the Urban Big Data Centre, we want to ensure that – while protecting privacy – researchers have the amount and quality of data they need to identify trends and patterns. Typically, this includes the use of ‘de-identified’ data – where all information that can directly identify someone, such as name, address and National Insurance number, is removed – and ‘anonymised’ data, which involves turning data into a form that does not identify individuals and where identification is unlikely to take place.
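To make the idea of de-identification concrete, here is a minimal sketch in Python: direct identifiers are stripped from a record before it reaches researchers. The field names and the record are purely illustrative, not the schema of any Urban Big Data Centre dataset.

```python
# Fields treated as direct identifiers in this hypothetical record.
DIRECT_IDENTIFIERS = {"name", "address", "ni_number"}

def de_identify(record: dict) -> dict:
    """Return a copy of the record with direct identifiers removed."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

record = {
    "name": "Jane Doe",
    "address": "1 Example Street, Glasgow",
    "ni_number": "QQ123456C",
    "travel_mode": "bus",
    "trips_per_week": 9,
}

# The research-facing copy retains only the analytic fields.
research_copy = de_identify(record)
print(research_copy)
```

Note that removing direct identifiers alone does not guarantee anonymity – combinations of remaining fields can sometimes still single someone out, which is why anonymisation and the security controls described below are applied as well.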
For example, our iMCD project participants wore lifelogging sensors that captured images as they went about their daily lives. Although participants consented to this photographic record, people in the public spaces around them were not aware they were being photographed. To respect their privacy, we are now developing a sophisticated face detection method that blurs people’s faces in these images before they are shared with researchers.
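The blurring step can be illustrated with a toy example. Assume a face detector has already returned a bounding box; the region is then replaced with its average value so the original pixels cannot be recovered. This is only a sketch – real pipelines use image libraries such as OpenCV, and a grayscale image is represented here as a plain 2D list.

```python
def blur_region(image, top, left, height, width):
    """Replace a rectangular region with its mean value (a crude
    pixelation), destroying the detail inside the bounding box."""
    pixels = [image[r][c]
              for r in range(top, top + height)
              for c in range(left, left + width)]
    mean = sum(pixels) // len(pixels)
    for r in range(top, top + height):
        for c in range(left, left + width):
            image[r][c] = mean
    return image

# A 4x4 'image'; suppose the detector flagged the top-left 2x2 as a face.
img = [[10, 20, 30, 40],
       [50, 60, 70, 80],
       [90, 100, 110, 120],
       [130, 140, 150, 160]]
blur_region(img, top=0, left=0, height=2, width=2)
```

In practice the detection step (finding where faces are) is the hard part; the obscuring step, as above, is comparatively simple.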
Data security is also vital to maintaining privacy, and how we store and safeguard the data we hold is of the utmost importance to us.
Information held by the Urban Big Data Centre is subject to a range of security controls. For example, controlled data – such as personal data – are held with the electronic Data Research and Innovation Service (eDRIS). This is a highly secure computing environment where it is possible to closely monitor who works on the data and ensure no personal data leaves the system. Datasets constructed for each project are also destroyed on completion of the research.
Security controls are not the only safeguards needed when working with data – we also need to consider those who have access to it. That is why we have an independent research approvals process to ensure that each research project merits access to the data. We also train researchers to use data safely, lawfully and responsibly. It can be a long process, as we have a number of safeguards in place to protect people’s privacy and ensure the data are secure at all times.
Data and trust
We understand that saying we will respect privacy and safeguard data is not enough – you need to trust that the payoff of researchers having access to your data is worth the perceived risks.
Research company Gartner has predicted that 50% of people in large cities will be sharing their data for the benefit of smart city initiatives by 2019, which indicates that we are more willing to share our personal data if we can see a clear potential public benefit to doing this.
This is also backed up by the Economic and Social Research Council (ESRC) commissioned public dialogues on the (re)use of private sector data for social research. The dialogues “demonstrated that there is wide public support for the use and re-use of private sector data for social research...” and that “the benefits of using private sector data outweigh the risks for this specific purpose”. Participants in these dialogues appreciated, for example, social research using private sector data that helped to improve local, regional and national policies and had the potential to benefit local communities.
Data for good
We know that transparency is key when it comes to enabling trust, so this year we will:
- Publish more blogs and case studies showing examples of private data used for public benefit
- Continue to hold and attend events to raise awareness of the work we are doing
- Publish responses to FAQs to make sure we are answering your questions
- Keep providing high-quality data and help people to use it.
We want to carry on the data privacy conversation beyond Data Privacy Day. Do you trust that your data is secure and used for the right reasons? Please use the comments form below or contact us via our social media channels to let us know what you think.
Next up on the blog: An in-depth look at face detection for our iMCD project to safeguard privacy.