Taking us up to the lunch break at PrivSec Dublin, Rob Leslie, CEO and Founder of Sedicii, discusses AI, privacy and data ethics in the Privacy, Security and Emerging Technology theatre.
Recognising the huge swathes of data on which key global industries thrive, Rob asks what ethical questions arise as AI becomes more commonplace.
What happens if you do not turn off your smartwatch? The device could feed data to an insurance company, which might then adjust premiums according to an ailment the wearer is not even aware of.
What happens when a self-driving car faces a life-or-death decision in a crash? An algorithm might have to choose who survives: the driver, a pedestrian or a bystander.
Research from the US shows how drones can be deployed in swarms, each device loaded with facial recognition software. Such drones could fly up to a person and, on confirming a positive ID, carry out an execution.
Where do we draw the ethical line, Rob asks, before stating: “the law is miles and miles and miles behind technology.”
Rob describes how our current markets are governed by the drive to get a product to the consumer, with little regard for privacy.
“Until regulation catches up and forces companies to take this on board, our situation is not going to change. We ignore the law at our peril,” he adds.
“Privacy doesn’t trump everything. The rule of law trumps everything. Technology has driven us down a path where we actually have little choice. We struggle to push back against the tech giants because we struggle to live without the services they provide. We need to think about how these big companies operate and we need to place controls around them.
“AI is a misnomer; there’s not a lot of intelligence. The machine does what it’s told based on what we tell it to do. We have to be really careful about how we write these algorithms.
“We’ve got to start shrinking the data pool. Could we create mechanisms whereby data is provided and used solely for the purpose for which it was required?
“Do we need a Hippocratic oath for technologists? I say we do and we need to get people signed up because the scope for misuse is too big,” he adds.