rachelkellynz

Losing Ourselves In The Digital Future

In business conversations across multiple companies, I often reflect upon how quickly we lose ourselves in the dream of technology and its potential - specifically around data mining, machine learning and AI.


We're often quick to estimate the business value and slow to consider the consequences. 


That's because there's a disconnect between what we can do in the digital world and what's right to do.


These ethical questions are not new. They're just new to computer science.


Fundamentally, our capacity to create and destroy life through mishandled data is rapidly outpacing our moral capacity to command such power at the scale, scope, and anonymity of 'digital'.

As we've seen in other disciplines, ethics and morality boil down to people and trust.

People at the very centre of each decision. 


Building and maintaining trust in every line of code.


Followed closely by accountability when that trust is broken.


Fourteen years ago, my Master's thesis required ethics approval because I took biological samples from humans to develop forensic technology to put criminals behind bars. It was a worthy cause, one recognised with a William Georgetti Scholarship in New Zealand, but I still needed ethical approval from my peers.


And just as I needed a morality check to take pieces of a person for biotechnology research, where is the accountability when we take pieces of a person in the form of data?

Who owns it?


What is the expected use of it?


Can the original owner see how it's being used?


What does all the legal jargon mean when they accept the 'Terms of Use'?


Why is a company allowed to abdicate its responsibility for the data?


Shouldn't a Global Bill of Digital Rights require companies to use plain, everyday language so people can make informed decisions about their data?


But the law shouldn't have to 'keep up' for us to do the 'right thing'. The GDPR shouldn't have been needed, but it was.


It should just be about being a good human. A good company. I'm trying to be that, aren't you?


So, this is a little reminder to keep people at the very centre of your decisions, because how you choose to manage people's information today will shift the line of morality a foot closer to good or a foot closer to bad.


And like every decision in a morally 'grey' area, those seemingly small choices eventually creep you to one side or the other.

I only hope that we don't find ourselves lost in a dark place with no hope of return.



Welcome to the complexities and responsibilities of 'creating life'. Artificial or not.

#machinelearning #ethics #datamining #computerscience #digitalbillofrights

If anyone knows of a simple yet legally effective plain-language template for 'terms of use', please let me know in the comments below. I'm in the middle of converting the coHired legal speak into something simple and would love any help.

