Salmon Upstream, LLC

Developing and implementing real-world community solutions.

Electric skateboards, social media, and the future of human society

What could these possibly have in common? Quite unexpectedly, consent.

Confused? Me too. For sure, I didn't see this coming. I have had plenty of crazy ideas, chance interactions, and eerie coincidences, but who would have thought the missing element would be something so simple and yet so frustratingly complex as consent.

Last Monday night, a friend showed our "man night" group the 2018 documentary Three Identical Strangers (which says more than a little about our group; it's OK, I am proud to be a nerd). This is the true story of identical triplets who were separated at birth and adopted by three different families, only to discover each other's existence some nineteen years later. What unfolds is the fact that they were just one of several sets of twins and triplets placed as part of a well-orchestrated medical study seeking answers to the fundamental debate of nature vs. nurture. Though that discussion is intellectually compelling, I tend to lump it in with discussions about the nature of the universe: realistically pointless. I say that because I am much more interested in issues you can actually do something about, and these are not those. If it's all nature, does that mean we lock up kids whose parents break the law because it's in their DNA? If there is no God, does that mean it's OK to be an asshat?

The much more actionable debate is the one about consent, because whether you know it or not, it is fundamental to what is currently going wrong with our world. And we may be able to actually do something about it.

The ethical issues relating to this twin study parallel those demonstrated in Rebecca Skloot's The Immortal Life of Henrietta Lacks. Both of these stories follow societal changes in medical ethics, and most significantly, changes in our definition of consent. In the 1950s and 60s, this wasn't even a topic of discussion. In medicine today, the term consent isn't even sufficient: it is informed consent. And that's where things take a hard turn.

Sixty years ago, scientists were just as thirsty for knowledge as they are today, and the need to expand our understanding of things like human disease was used to justify being, perhaps, a bit less than forthcoming with potential research subjects. Just as no one told Henrietta Lacks that they were going to culture the cells of her cervical cancer, no one told the families adopting these babies that they were involved in a twin study. They didn't even tell them the kids had siblings. Look, if we tell them we are studying them, it will muck up the data. So let's just not tell them, mmmK? It's all for the greater good.

But it wasn't all for the greater good. Studies can be so poorly designed that they won't teach us anything of value, or so dangerous that they expose people to unnecessary harm. The ethical principles of medical research were more formally defined in the National Research Act of 1974, which created the National Commission for the Protection of Human Subjects of Biomedical and Behavioral Research. The guiding tenets include respect for persons, especially vulnerable populations like children or the incarcerated; and beneficence, which expands upon that protection and requires weighing risks against the potential benefits to society.

There is another important factor, one with exponentially increasing influence: money. Just as Cyndi Lauper said, money changes everything.

Today, any study of any kind on human subjects in the US is supposed to be reviewed by an institutional review board (IRB). The role of the IRB is to determine whether the design of a study meets our current ethical principles. That review includes careful evaluation of informed consent.

It's not enough to design a safe and important study; your subjects have to agree to participate. And to agree, they have to completely understand what they are agreeing to: every possible risk, every possible benefit, all of it. The burden is on the researchers to explain it to them and to be sure they understand. It's not enough to just have them sign some form; someone trained in obtaining informed consent has to go through the documentation with them. They have to be able to say no without any adverse consequences, and they have to be able to opt out at any time.

And it can't be about money. Not even a little.

But it doesn't stop there, because there is something so inherently obvious in any research experiment that we often forget about it: no one knows what is going to happen. (Uh, that's the point, eh?) So, another job of the IRB is to monitor for unintended consequences, and to mandate changes to the protocol or even cancel the study altogether if unforeseen adverse outcomes cannot be mitigated.

There is an article about a Harvard study that went a bit sideways. The researchers used de-identified social data from Facebook. The study was IRB approved. They were careful in the design -- this was not some nefarious plot -- but they ran into unintended consequences: when research assistants downloaded the information, they realized that it was possible to piece together who some of the people were, simply because the data itself contained information that could re-identify them (they were in these classes and did these activities and ate at these places, etc.). So information that was intended to be anonymous was very much not anonymous. Oops.
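To make that failure mode concrete, here is a minimal Python sketch of how re-identification can happen. Everything in it is invented for illustration (the names, classes, and restaurants are not from the actual study); the point is simply that a handful of quasi-identifiers, matched against any public source, can undo the anonymization.

```python
# Purely illustrative data -- nothing here is from the actual study.
# "De-identified" records still carry quasi-identifiers: classes taken,
# activities, favorite places.
deidentified = [
    {"id": "subj_017", "classes": {"Econ 101", "CS 50"}, "eats_at": "Felipe's"},
    {"id": "subj_042", "classes": {"Gov 20", "Stat 110"}, "eats_at": "Pinocchio's"},
]

# A public source with real names attached: a club roster, a course blog,
# tagged photos, anything.
public = [
    {"name": "Alice B.", "classes": {"Econ 101", "CS 50"}, "eats_at": "Felipe's"},
    {"name": "Carol D.", "classes": {"Gov 20", "Stat 110"}, "eats_at": "Pinocchio's"},
]

# Match on the quasi-identifiers alone; a unique match defeats the anonymity.
for subject in deidentified:
    matches = [p["name"] for p in public
               if p["classes"] == subject["classes"]
               and p["eats_at"] == subject["eats_at"]]
    if len(matches) == 1:
        print(f'{subject["id"]} is probably {matches[0]}')
```

No hacking required; the "anonymous" data re-identifies itself.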

Turns out, these are really difficult studies to do, because the technology and the way we interact with it is changing so rapidly that neither researchers nor IRBs can keep up.

And here's the real kicker: anyone reading this droning blog is already participating in a variety of similar, ongoing research studies. And not one of these studies would pass the most lax IRB on earth, for about half a dozen reasons. Because they aren't being called research. And they never get to an IRB.

And they are all about money.

Google. Apple. Facebook. Siri. Alexa. Exactly where is your data going? Why does my iPhone know I am going to trivia at the Yellow Sub on Wednesday? How did Maps know to fill in the exact street this morning when I punched 2-4-0 into the search window without entering a single letter? Why, after a discussion of drones one night at a friend's house, did my wife's Facebook page begin showing ads for drones -- that night?

This is medical research on human subjects. These experiments are associated with very real risk, and not just to privacy: emotional, financial, even physical risk. And the only people the designers are talking to about this risk are people with a vested interest in the outcomes. And lawyers. Lawyers who also have a vested interest in the outcome, which is a whole different level of wrong. And these experiments are all about money, which is a fundamental no-no.

Yes, you checked a box, and you "accepted" the risks. Or did you? Did someone explain those risks? Did an external third party ensure that every remote possibility was adequately described in language that you understand? Remember, ethics mandates that the onus is on them to do their due diligence as researchers when they invite you to participate.

But that's the rub, isn't it? It's not medical research, it's business. And business ethics are governed by one set of rules: whatever you can get away with. Just look at Juul.

The following Thursday, I had the incredible privilege of meeting with a team of some of the smartest people I have ever been exposed to, kinda the nerd equivalent of a backstage pass. As he often does when I get to ranting, Mark Zuckerberg came up. The point was made that when it all started, no one, not even Zuckerberg, knew what Facebook would become. Like the iPhone itself, it was impossible to predict its individual or societal impact. Facebook today was likened to a country -- a virtual nation -- which is now a nation without a government.

Unintended consequences.

Social networks are here to stay. Smartphones are here to stay. And no one can predict how all of this is going to shake out. But one thing is imperative: we need to at least try to do it right. Which, in the case of Facebook, would mandate the removal of money from the equation. Let me know how that idea goes at the next shareholder meeting.

When I go on about a purpose-built social network, this is what I mean. Your data really is yours, and more and more, that data can be used to help you live better. And since we now all have a vested interest in the success or failure of everyone else, we need to think about how we might empower people to both use their data effectively, and keep it safe.

And you have to obtain informed consent.

Which means that individuals control their data. All of it. Not just a few things like social security numbers or credit card info, every single bit and byte. Because no one has any idea when it is all going to go sideways.
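What would that control look like in practice? Here is a minimal sketch, in Python, of one hypothetical design (the names and API are mine, not any existing product's): every field of a person's data is gated by an explicit, purpose-specific grant that the person can revoke at any time.

```python
# Hypothetical sketch of per-field, revocable consent. The design and names
# are invented for illustration; they describe no existing product.
class ConsentLedger:
    def __init__(self):
        self._grants = {}  # (user, field) -> purpose the grant covers

    def grant(self, user, field, purpose):
        self._grants[(user, field)] = purpose

    def revoke(self, user, field):
        self._grants.pop((user, field), None)  # opt out at any time

    def allowed(self, user, field, purpose):
        return self._grants.get((user, field)) == purpose


def read_field(ledger, user, field, purpose):
    # Every single read is checked: no grant for this purpose, no data.
    if not ledger.allowed(user, field, purpose):
        raise PermissionError(f"no consent to use {field} for {purpose}")
    return f"<{user}'s {field}>"


ledger = ConsentLedger()
ledger.grant("lia", "location_history", purpose="commute_planning")
print(read_field(ledger, "lia", "location_history", "commute_planning"))

ledger.revoke("lia", "location_history")
# Any further read, for any purpose, now raises PermissionError: the default
# answer is "no", and saying no (or changing your mind) costs the user nothing.
```

The design choice that matters is the default: access is denied unless a specific grant for a specific purpose exists, rather than granted unless someone objects.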

Here is an interesting tidbit I have learned while getting informed consent from patients for medical research: if you try to take something from someone, they almost always say no. But if you ask, and what you are doing might help advance mankind (even just a little), they almost always say yes. You will get more and better data if you just ask. But you have to be genuine, both in the way you ask and in what you are doing.

The standard practice today is to try to take. We need to change that to one in which we ask. Only then will we be able to actually achieve the things we are trying to do, the things we need to do: help people be more successful, happier, and healthier in life. It's such a subtle difference, but sometimes the devil is actually in the details.

Wait a minute -- what about the electric skateboard? That's what I came here for. I thought there would be videos of people falling off electric skateboards...

So I got this Onewheel, and who would have predicted this thing would be so useful? Probably not the guys who built it, who were really looking to emulate snowboarding. This thing primarily appeals to kids who are very used to falling off of things and getting hurt and posting videos of people falling off of things and getting hurt. Of course, the thing came with a dozen warnings and pesky little lawyer-written terms-of-use agreements jammed in various places in the box. The risks of falling off and getting hurt are more than obvious.

And then the thing turns out to be useful. And old people like me start using them. Old people who aren't used to falling off things and getting hurt and would rather not fall off anything or get hurt, and sure don't want to be in any online video doing any such thing.

Those included warnings? They give you information about how the thing works and what not to do. No one can fault them for that. But there isn't anything about what might happen if the battery or some controller fails, at which point the thing apparently stops dead and does what the kids online call a "nosedive". Which often results in you falling off and getting hurt.

Unintended consequences.

And this is business, not research. So the lawyers and those with a vested interest aren't going to tell the participants anything about how this might occasionally go a little sideways (or suddenly stop dead, as the case may be). Because that would be bad for business. And the plan would probably be OK were the thing not so damn useful. Because those kids are used to falling off and getting hurt. No one predicted that old people might end up in the mix. And I have a feeling that is going to go about like it did when three genetically identical brothers stumbled into each other.

HERE IS THE BLUEPRINT FOR PHISION, A USER-CONTROLLED SOLUTION