“Technology crashes against social norms”


Irish Data Protection Commissioner Helen Dixon has a long-term vision of how best to navigate and protect privacy in an increasingly post-truth world. She talks to John Kennedy.

While we are on the subject of data privacy, Irish Data Protection Commissioner (DPC) Helen Dixon keeps a well-researched and heavily annotated copy of the General Data Protection Regulation (GDPR), which came into force in May of this year. I suspect she never lets it out of her sight.

Since taking office in 2014, Dixon has presided over a quadrupling of the DPC’s annual budget, the opening of a new office in Dublin and an increase in staff to over 100 people. She has also spearheaded GDPR awareness and education for Irish businesses and citizens while simultaneously being at the heart of historic legal battles over social media and the transfer of Europeans’ data to the United States.

“The truth about technology is that you can’t just lock it outside the school door”
– HELEN DIXON

The genesis of our conversation was a panel I hosted at the recent Irish Government Data Summit in September. Joining Dixon on the panel were former Google Ads director Sridhar Ramaswamy and Luukas Ilves of the Lisbon Council.

Earlier that week, I wrote an editorial advocating the introduction of data privacy education in Irish schools. My argument was that even though the digital revolution has been under way for decades, this kind of education is needed and we have to start somewhere.

I did not realize at the time how fortuitous that would be, as the Office of the Data Protection Commissioner (ODPC) was then about to start piloting education modules on data privacy in Irish classrooms ahead of potential policy decisions in this area. Three schools in Dublin and Meath recently launched a pilot of lesson plans designed by ODPC staff who are trained in education, organized with the support of the Office of the Ombudsman for Children in Ireland. The lesson plans are designed to engage children in three different age groups: 9-10, 14-15 and 16+. The move is a first, and a potentially timely one for a generation that has never known life before the internet, social media and smartphones.

The initial pilot and feedback will inform the potential creation of a national lesson plan. “We hope the curriculum supports it,” Dixon said. “We have written to the Secretary General of the Department of Education to inform him of what we are doing in this area. We worked with the office of the Ombudsman for Children, Niall Muldoon, who is supporting us in this pilot project.”

Privacy in the post-truth world

You sense that, in addition to leading a nation through one of the greatest cultural and economic upheavals brought on by digital technology, Dixon is intellectually immersed and invested in the subject. Her conversation is punctuated by references to A Theory of Creepy by Omer Tene and Jules Polonetsky, and the brilliant work of Yuval Noah Harari in Homo Deus: A Brief History of Tomorrow.

As we talked about how social media, for example, has infiltrated our being and our consciousness, Dixon likened comments angrily posted on Facebook or Twitter to tattoos. “The difference is you can get laser treatment to remove tattoos.”

She continued: “I remember a conversation I had with the IMF’s Christine Lagarde when she was in Dublin over the summer. We briefly talked about the whole data issue and she very succinctly said, ‘Three things with data: competition, privacy and civic education.’

“And you have hit on the aspect that gets the least focus and that we need to talk about more, and the hardest part is civic education.”

The problem Dixon is grappling with is that technology is absorbed by society faster than people can figure out how best to protect themselves.

“It’s a problem because technology clashes with social norms and forces us to evolve. I have used the example Jules Polonetsky wrote about in A Theory of Creepy, and one of the things that has been discussed in terms of changing social norms against the backdrop of this whole area is the caller ID example.

“Back when you got your first cell phone in the 1990s, caller ID was introduced and there was outrage that if you were to call AA, for example, your number could be displayed, and individuals sought to hide it.

“Now it has come 360 degrees and none of us will answer our phone until we see and identify the number. If you’re a regulator deciding whether it’s OK for numbers to be displayed or not, it is a very difficult call. And, of course, as in many areas of data protection, it comes down to the consumer’s choice to control under what circumstances they wish their number to be displayed.”

This scenario of how sentiment can change as people become more digitally savvy is a priority for Dixon as she considers how a society like Ireland can better protect its children in an increasingly digital world.

“As an office we are very busy dealing with GDPR, and we have many particularly large-scale investigations underway as well as high-profile data breach investigations, but we don’t want to lose sight of this.

“We can build on a lot of what we’ve put in place. The GDPR is very high-level and principles-based, and while it helpfully indicates that children should have specific protections, there is no specificity on how to do this. And that’s what we want to get into now.”

Dixon is realistic when it comes to the issue of data, privacy and children, especially in a year in which Ireland chose to set the digital age of consent at 16.

“The truth about technology is that you can’t just lock it outside the school door. I am interested to see what children themselves understand about privacy and what parts of their lives they devote to it. It’s not about being prescriptive.”

Regulating the blockchain

Another distinct issue looming on the horizon for data authorities around the world is the rise of blockchain technology. It’s inevitable that the myriad uses of blockchain will put it on the radar of data regulators.

“The very nature of the blockchain with its distributed and peer-to-peer aspects poses a challenge in terms of identifying the players we recognize under data protection regulations.”

Dixon said that while the technology is changing rapidly, every conversation data protection authorities have had so far has been at a high conceptual level.

“At some of the conferences I have attended, there has been some very vigorous debate about how blockchain could be a very privacy-friendly technology. And then you have the counter-arguments that it is the opposite: based not on data minimization but on long retention periods, with uncertain means of control and of exercising rights.

“Our job is to monitor the application of the GDPR but also to help interpret it so that it can be applied in a way that protects rights. We will watch this space and watch it carefully.”
