
We are living in the age of artificial intelligence (AI) and big data. While we are yet to see sentient robots, we do have powerful AI algorithms that can aid medical diagnoses, navigate driverless cars, and make criminal sentencing decisions.
These come with risks: you may have heard of the scandal involving Cambridge Analytica, which Facebook allowed to harvest the personal data of more than 80 million of its users without their permission.
Cambridge Analytica then used these data to target American voters in the 2016 presidential election.
Other companies have also attempted to profit from our personal data, often without first getting our informed consent.
It is not surprising that many of us feel like we do not always have the greatest control over how our personal information is used.
It is our data, after all. Shouldn't we have some say in what happens to it?
While apps and websites often present us with consent forms, they are frequently long, vague, and hard to understand.
No wonder an estimated 94 per cent of Australians do not read them all. This is not irrational behaviour, however; we simply do not have the time.
So how do we protect our personal information, while also enjoying the benefits of big data?
Perhaps it's time to re-think the way these consent forms are presented.
Instead of wordy, hard-to-read forms, we could use pictorial contracts, which are often used for people who cannot read, to make the fine print easier to understand.
The easier the information is to digest, the more fully informed our consent becomes, giving us greater control over our data.
New Curtin University research also suggests we look to universities for inspiration.
In Australia, whenever a psychologist runs a study involving human participants, they must first obtain ethics approval to ensure their participants are not taken advantage of.
Why not require tech companies, and other institutions, to first get ethics approval before using our data?
Ethics approval might prevent the sorts of data abuses that have recently come to light.
Since customers are increasingly making purchasing decisions based on their personal values, it may even be profitable for companies to listen to what their customers want.
Whatever methods are used to improve data consent, we should act sooner rather than later.
We are still in the relatively early days of big data use, so it is important that we think hard now about the ethical implications of this technology, to prevent further abuses in the future.
Dr Adam Andreotta is from Curtin University's School of Management and Marketing.