One question for Nita Farahany, a philosopher at Duke University who studies the implications of emerging neuroscience, genomics, and artificial intelligence for law and society.

Photo courtesy of Nita Farahany

Will neurotech force us to update human rights?

Yes. And that moment will pass us by quickly if we don’t seize it. Seizing it would enable us to embrace and reap the benefits of the coming age of neural interface technology. That means recognizing the fundamental human right to cognitive liberty—our right to think freely, our self-determination over our brains and mental experiences. And then updating three existing rights: the right to privacy, freedom of thought, and self-determination.


Updating the right to privacy requires that we explicitly recognize a right to mental privacy. Freedom of thought is already recognized in international human rights law, but it has been focused on the right to free exercise of religion. We need to recognize a right against manipulation of thought, or having our thoughts used against us. And we’ve long recognized a collective right to self-determination of peoples or nations, but we need a right to self-determination over our own bodies, which will include, for example, a right to receive information about ourselves.

If a corporation or an employer wants to implement fatigue monitoring in the workplace, for example, the default would be that the employee has a right to mental privacy. That means, if my brain data is being tracked, I have a right to receive information about what is being tracked. It’s recognizing that by default people have rights over their cognitive liberty, and the exceptions have to be legally carved out. There would have to be robust consent, and robust information given to consumers about what data is being collected, how it’s being used, and whether it can be shared or commodified.

I’ve written a book that’s coming out in March, called The Battle for Your Brain: Defending the Right to Think Freely in the Age of Neurotechnology. One of the chapters in the book explores the line between persuasion and manipulation. I go into the example of Facebook experimenting on people, changing their timelines to feature negative or positive content. It was deeply offensive. Part of it was the lack of informed consent, but a bigger part was that it felt as if people’s emotions were being toyed with just to see if they could make somebody unhappy in ways that you could measure.

In a world of neurotechnology, you can measure the effect of those experiments much more precisely, because you can see what’s happening to the brain as you make those changes. But these technologies aren’t just devices that read the brain. Many of them are writing devices—you can make changes to the brain. That requires us to think about who controls the technology and what they can do with it, including intentionally manipulating your brain in ways that might cause you harm. What rules are we going to put in place to safeguard people against that?


I’m optimistic we can get this done. There’s already momentum at the human rights level. The value of starting at the human rights level is that lots of specific laws will have to be implemented to realize a right to cognitive liberty locally, nationally, and across the world, and if you start with a powerful legal and moral obligation that’s universal, it’s easier to get those laws updated. People recognize the unique sensitivity of their thoughts and emotions. It’s not just the right to keep people out of your thoughts, or the right not to be manipulated. It’s a positive right to make choices about what your mental experiences are going to be like, whether that’s enhancements, or access to technology, or the ability to use and read out information from that technology.

Lead image: AndryDj / Shutterstock
