
Who Will Guard the Guards Themselves?


We have already recognized the ease, and the completeness, with which we have surrendered our data to the information oligarchs. But what is next? We have all seen big-data internet companies take an interest in brain-computer interfaces as a means to understand your preferences under the guise of "augmentation." Will we continue to surrender our information - and possibly ourselves - to corporations if it means the advancement of science, society, or even mere convenience? The advancement of the greater good? Not to get political, but when does the self supersede the collective, or should we be asking the question the other way around?

Ethics is a touchy topic, yet it is the elephant in the room. When the Constitution of the United States was written, people were fighting with muskets. Now that we have advanced to fully automatic weapons, should we revisit the implications of technology on the Second Amendment, or should we agree that technology has advanced in both weapons and defense alike? Similarly, in neurotechnology there is a growing push to understand how the deeper regions of the brain work. These regions may yield high-resolution information that can inform motor control (as in neuroprosthetics), localize epileptic foci, or, in some cases, reveal emotions. With diagnoses of depression on the rise and a growing emphasis on understanding brain signals and their function, could we end up with the same problem in neurotechnology that we have with guns?

Unfortunately, this is not a situation where laws and regulations can lead the way. The regulations were written around a form of technology that is already outmoded today and will advance exponentially in the next few years. The regulators do not, and likely cannot, know what the future uses of brain waves and signals might be. So we, as a community of scientists, should think about these ethical questions and keep an active conversation going about this evolving field. It is very possible that future mining of data from high-resolution interfaces will intersect with the definition of 'self' in ways that previous generations could only imagine in their dreams, their nightmares, and their dystopian science fiction. It may also deliver promise the likes of which we have never known. We need to prepare ourselves by asking questions that seem unlikely today but could become reality within a hiccup of the geological clock. To that end, we have asked Lauren Sankary, J.D. and Paul Ford, Ph.D. onto the show to discuss what those questions might be and where we should look for the pitfalls in the ethics of neurotechnology.

Resources:

Bioethical Considerations in Today's Neurology and Neurosurgery Practice

International Neuroethics Society

Informed Consent

I Am Human

IEEE BRAIN Ethics Panel

