Earlier this year, JustLabs published an extensive report about how the process of turning phenomena into data, otherwise known as datafication, could bring about fundamental changes to the understanding, defense, and promotion of human rights by the year 2030. What became clear from the report’s findings is that the time to address such changes is now. Thankfully, there are already teams of rights practitioners addressing how data and technology infringe upon rights today.
In partnership with Amnesty International, JustLabs designed a workshop in April 2022 to tackle two urgent challenges facing the human rights field: activist surveillance and the overly broad non-disclosure agreements (NDAs) used by large technology companies. Over the course of two days, Michael Kleinman, director of Amnesty International’s Silicon Valley Initiative, and a team of rights practitioners from the private and public sectors, academics, funders, and researchers delved into the problem of tech NDAs, following a methodology that focused on the key tensions to manage when constructing actionable solutions.
In the following interview, Kleinman shares key takeaways from the workshop. His responses have been edited for clarity and brevity.
Why is addressing overly broad NDAs so urgent?
MK: The urgency, which is only going to continue to grow over time, is that we basically live our lives using a handful of tech company platforms and services. Imagine going through your daily life and trying to totally cut out the biggest tech companies. Imagine trying to go through your life even for a day saying, “I'm not going to use Google, Amazon, or Netflix. I'm not going to use TikTok, Twitter, Facebook, Microsoft, or even Zoom.” They are so intertwined in our lives that it's really impossible not to rely on their products and services.
We rely on these products and services not just in order to work, but for our lives outside of work. It’s how we communicate with friends and family. It's how we plan. And so these companies have tremendous potential control over our lives because of the massive amounts of data that they're collecting about each of us, which is concerning because at least in the U.S. (the situation is a little better in Europe) there are very few limitations on what companies can do with that data. And the rise of machine learning and artificial intelligence means that companies can make increasingly accurate inferences about our most private behavior from the data that we provide.
When you think about platforms like Facebook or Twitter, changes in the terms of service, content moderation policy, and their algorithms have an immediate impact on our lives. They change who we are able to connect with (whether or not people get de-platformed), the information we see, and the reach of that information. We often have no idea that any of this is happening. So when you think about the huge amounts of influence these companies have over our lives, it's incredibly concerning that they are so opaque and we really have almost no idea what happens inside these companies.
It's a sad fact about modern life that we are reliant on tech whistleblowers to give us insight into the decision-making at large companies that impacts almost every aspect of our lives. And one of the major tools that companies use to try to keep that kind of information from leaking out is NDAs.
If you were talking to a recent college graduate who was about to enter the technology workforce, what would you tell them about big tech NDAs?
MK: I would tell them three things. First, I would say that you don't necessarily have a lot of leverage to change the NDA, but you do need to read it and you do need to at least understand what you're getting into.
The second thing I would tell them, and here's hopefully where there's work that can be done, is that no matter how broad the NDA, different jurisdictions have carve-outs, so even an agreement that seems ironclad has limits. For instance, in California, an NDA cannot silence you from talking about sexual harassment. So I would direct them to whatever resources are available so that they understand the legal limitations of whatever NDA they're signing.
The third thing I would say is that if, once you're employed, you see something really critical and upsetting that you think you need to reveal or expose, or if at any point you're thinking about becoming a whistleblower, that's a really significant step and you need to be prepared.
I would also let them know that if this were to happen, there are resources that can be really helpful, like the Tech Worker Handbook. In other words, I would try to take a realistic approach: an entry-level college grad isn't necessarily going to have the leverage to force a big company to renegotiate the language of an NDA, but that doesn't mean they have no power at all.
How does that approach fit into the problem you wanted to address during the workshop?
MK: We [Amnesty Tech] have been working to support tech whistleblowers in particular, through our partnership with the Signals Network, which helped produce the Tech Worker Handbook. And increasingly, we have grown aware of the ways in which NDAs freeze and chill what is a necessary conversation. It's not just employees, either; journalists and outside researchers are sometimes asked to sign NDAs as well. As we began to dig deeper into this issue, the salience of NDAs became increasingly clear.
What perspectives had you not considered prior to attending the workshop?
MK: When we started this conversation, I thought that the focus on NDAs was too narrow. I thought at the very least we should focus on other ways in which tech companies try to control the dissemination of critical information about what they're doing. As we delved deeper into the issue, I increasingly realized that NDAs were such a foundational issue when it comes to chilling speech that a focus specifically on NDAs actually did make sense.
Was there a part of the methodology that you were hesitant about, but that actually ended up opening your thinking?
MK: It was actually the dashboard exercise. At first I thought, “Why are we doing this? We all know how to have a conversation. What is the point of adding levers and dials and all of this?” It was really only as the workshop was going on that I had my “aha moment”: using the dashboard forced us to do two things that led to an incredibly positive and productive conversation. First, it forced us to prioritize and think about cause and effect, which is something that is rarely brought out explicitly in a conversation, and that was incredibly powerful.
(This image of a dashboard illustrates the various tensions that would need to be managed if a certain intervention or solution were pursued. Photo courtesy of JustLabs.)
Second, more than anything else, it brought structure to the conversation. The great difficulty with two- or three-day workshops, especially ones that bring together a lot of people who are relatively well informed but have not worked together before, is that it's far too easy for the conversation to remain at a very abstract level and to run in circles.
And so what the dashboard did was it really funneled and focused our conversation and led to a much more concrete outcome than I think we would have achieved just talking about it on our own.
Could you talk a little bit about what those outcomes were?
MK: We identified two necessary parallel tracks to address this issue of overly broad NDAs. The first was to ensure that employees know their rights. Often people don't read NDAs. Even if you do read them, they can be full of legalese, so you're not exactly sure what you're signing, and different jurisdictions have different carve-outs. We realized that any project to address this needed to start by engaging tech workers and soon-to-be tech workers (people who are being recruited) where they currently are, to make sure they understand their rights, and that when confronted with an NDA, even if they don't have the leverage to renegotiate it, they at least understand what they're signing and what its limits are.
And then the second track was to address the world as it is and ask how we start to create the world that we want. We understand there are legitimate corporate concerns around intellectual property and trade secrets. But how do we get to a world where NDAs are correctly calibrated, so that when people do see something of public interest, they can say something?
That requires research and a much better understanding of the range of current NDAs. How can we collect NDAs? How can we analyze the different NDA structures that companies use? How can we do a much more intensive analysis of the legal protections for workers in different jurisdictions?
Based on that, how can we then come up with a model NDA that we think is much more appropriate? And then, what are the most effective ways to push for legislation or regulation in different jurisdictions to move us towards that sort of model NDA? That was the second track: a much more in-depth research and proto-campaigning effort.
What were the immediate next steps for you and your group after you returned home? And what stage are you currently in?
MK: A bunch of us came up with a draft concept note for how to address these two tracks. We've sent the concept note to a donor, and now we're in the midst of conversations to see what might be funded, because to really move forward there need to be resources. And so we're in the stage of trying to identify and secure those resources.
Could you paint a scenario of how the landscape of NDAs would change if this kind of project were funded?
MK: In the short term, we would have a workforce that is much more aware of its rights, and in the longer term, in the U.S., Europe, and other key jurisdictions like India, we'd start to see either legislation or regulation that pushes back against overly broad NDAs.
What would an ideal NDA look like for you?
MK: An ideal NDA would not silence people when it comes to sexual or racial discrimination or harassment within a company. Second, it would balance the tension between a company's need to protect its intellectual property and trade secrets and the public's right to know about the decisions a company makes that have massive public policy implications. Where and how you draw that line, how you create that balance: that's what we need more research to figure out.