Smartphone Surveillance
Anti-abortion campaigners have turned to technology to stalk women at women’s health clinics: by using GPS tracking to find out who is, or has been, near a clinic that offers abortions, and sending them advertisements to dissuade them from getting an abortion. Apparently it’s totally legal, too. Using a horribly creepy marketing technique called geofencing, they can send women who are sitting in abortion clinics advertisements pressuring them to have babies, or to give them up for adoption. One evil-sounding advertising agency boasts of “being able to reach every Planned Parenthood in the US and...gather a tremendous amount of information”, then using that data to send anti-choice ads to women while they are actually at the clinic. Euggghhh. Not to mention the data they are gathering from women who are at, or near, the clinics - or what they will do with it in the future.
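For the curious, here’s roughly what a geofence check boils down to - a minimal Python sketch, where the coordinates, the 50-metre radius and the function names are all my own illustrative assumptions, not anything taken from a real ad platform:

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_M = 6_371_000  # mean Earth radius, in metres

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in metres."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = (sin(dlat / 2) ** 2
         + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2)
    return 2 * EARTH_RADIUS_M * asin(sqrt(a))

def inside_geofence(point, fence_centre, radius_m=50):
    """True if a device's reported location falls inside the fenced circle."""
    return haversine_m(*point, *fence_centre) <= radius_m

# Hypothetical coordinates: a fence drawn around a clinic, and a phone
# location reported by any ad-supported app with location permission.
clinic = (40.7128, -74.0060)
phone = (40.7130, -74.0058)
print(inside_geofence(phone, clinic))  # True -> the device can be targeted
```

The unsettling part is how little is involved: any app that already has location permission can run a check like this and hand the result straight to an ad network.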
Women in tech, or tech in women?
Some start-up ideas make me think: “Wow, how did nobody think of that before?” Others evoke the exact opposite reaction: why oh why would anybody ever come up with that idea? Here’s an example of the latter - a tampon with Bluetooth connectivity that lets the user know when it needs to be changed. And they’re not the only ones going down this weird line of thought: there’s a whole bunch of “smart” products targeted at menstrual needs, and even - perhaps most pointless of all - a pregnancy test that connects to your phone via Bluetooth. What kind of data will these devices be gathering, and where will it be sent or stored?
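To make the concern concrete: Bluetooth Low Energy devices constantly advertise their presence to anyone in radio range, not just to their owner’s app. Here’s a minimal sketch using the open-source bleak library (the library choice is mine; what it finds depends entirely on what’s nearby):

```python
import asyncio
from bleak import BleakScanner

async def main():
    # Passively listen for BLE advertisements for five seconds. Any
    # "smart" tampon, pregnancy test or fitness tracker that is
    # advertising will show up here - to any phone or laptop in range.
    devices = await BleakScanner.discover(timeout=5.0)
    for device in devices:
        print(device.address, device.name)

asyncio.run(main())
```

And that broadcast layer is before we even get to whatever the companion app uploads to the vendor’s servers.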
On the topic of technologies that make controlling women even easier: it’s not new, but the Gates Foundation is funding the development of a remote-controlled contraceptive implant that would last up to 16 years. If we try for a second to imagine the worst-case scenario for this kind of device, it’s terrifying.
Responsible approaches to image data
So, this might not strike you immediately as a responsible data issue, but I believe it is. Censorship on social media platforms has serious, discriminatory consequences, and the fact that these examples target women’s bodies is yet another way technology is used to control women. As an example: Instagram bans photos showing women’s nipples (in certain cases) - but always allows men’s.
Their approach has led to a backlash and the #freethenipple campaign, with some creative answers - like the Tata Top, which makes bikini tops with nipples printed on them, or these women, who photoshopped male nipples over their own to show just how ridiculous the rules are. Digital data comes in many formats - including images - and the way it is managed (or censored) says a lot about how we want to visualise and understand the world around us.
Instagram also took down this photo of a woman menstruating, saying it broke their community guidelines. For more examples of censorship of women’s bodies on social media, and their consequences, I’d recommend this talk by Jillian C. York and Addie Wagenknecht, held at re:publica last month.

Calibrating for white skin
I have struggled with taking photos with white friends for years. Turns out, it’s not (just) me being bad at taking photos - it’s yet another example of bias built into technology. As Syreeta McFadden explains in this great piece, the first colour film was calibrated to a woman with pale white skin, called Shirley. Essentially, light settings and colour standards were designed for white skin tones - not for brown or black ones - meaning that “standard” film renders dark skin strangely: sometimes charcoal-coloured, sometimes far darker than in reality, sometimes unrealistically lightened. There’s a whole history of how filmmakers and professionals have dealt with this in the past - by throwing a lot of light on black actors, or (please tell me this doesn’t still happen) by “rubbing Vaseline on black actors’ skin”.
Nowadays, there are things that can be done to fine-tune brightness and colour when it comes to filming - but, wow, isn’t it almost unbelievable that in this high-tech age, it’s still difficult to take a photo of a black person and a white person together?

- photo from Imgur
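To put rough numbers on the problem, here’s a toy worked example - the reflectance values are illustrative assumptions on my part, not measurements from any real film stock:

```python
from math import log2

# Hypothetical scene reflectances on a 0..1 scale: a pale face vs. a
# dark face. These numbers are illustrative, not measured.
pale, dark = 0.60, 0.08

# Meter the shot so the pale reference face lands on 18% middle grey -
# effectively what calibrating to a "Shirley" card does.
gain = 0.18 / pale

stops_below_mid = log2(0.18 / (gain * dark))
print(f"dark face sits {stops_below_mid:.1f} stops below middle grey")
# -> about 2.9 stops down, deep in the shadow "toe" of older film
#    stocks, where tonal detail compresses into near-black - hence the
#    charcoal look unless extra light is thrown on the subject.
```

Same scene, same camera - the only variable is whose skin the calibration was built around.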
Responsible Data food for thought
Back in 1951, a black woman called Henrietta Lacks died of cervical cancer, and her cells - taken without her or her family’s knowledge - became one of the most important tools in medicine. Henrietta’s cells, known as ‘HeLa’, could not only reproduce but grow incredibly rapidly outside the body, and so they were taken, grown, bought and sold for billions, leading to breakthroughs like the polio vaccine and chemotherapy drugs, to name just a few. Yet through all of this, and to this day, her family has remained living in extreme poverty.
“She's the most important person in the world and her family living in poverty. If our mother is so important to science, why can't we get health insurance?” - quote from one of her children, taken from The Immortal Life of Henrietta Lacks by Rebecca Skloot.
Clearly, there’s injustice at play here - but how do we remedy the fact that laws are not keeping up with technological advancements as quickly as they need to? It turns out that what scientists in the 50s and 60s did with Henrietta’s cells was legal at the time - though it would not be legal today. For Henrietta’s family, changes in the law after the event don’t alter the fact that they had no idea until years afterwards that Henrietta’s cells were still alive, nor that many other people had made millions from those cells without the family’s consent or knowledge. I’ve just finished reading a fantastic book, The Immortal Life of Henrietta Lacks, which tells the story of Henrietta and her family, as well as the injustices and data ethics issues that we now face - and if you want a quicker intro to the story, check out this Radio Lab podcast.
Community updates
- IF, a creative company working on “the future of the internet”, is looking for a researcher and a creative technologist to help them work on design, data and privacy issues: apply by June 17th.
- A project we worked on with MercyCorps just launched - a Data Starter Kit for Humanitarian Field Staff, complete with a series of tip sheets on responsible data issues in the context of cash transfers.
- AlgorithmWatch, a newly launched non-profit initiative looking at algorithmic decision-making, is asking for the public’s help to gather sources and types of algorithms, as well as use cases for algorithmic accountability.
- At the engine room, we’re kicking off a new project with Global Open Data for Agriculture and Nutrition (GODAN), looking at responsible data implications and considerations around open data use in the agriculture sector, with a special focus on its least powerful players.
- Relatedly: Cadasta have been working on open property rights, and recently published a series of useful outputs, including a risk assessment framework. Their work is ongoing - get in touch with Lindsay Ferris if you want to know more.
- Open Knowledge Finland is hosting an event on MyData from August 31st to September 2nd in Helsinki - though I must say, the speaker line-up is currently, unfortunately, rather un-diverse…
- The Alan Turing Institute, the UK’s new national institute for data science, has an open call for Fellowships, with a deadline of July 13th.
- Rahul Bhargava just published this great summary of a talk he gave on “practising data science responsibly”, with some practical recommendations for businesses working in this space and a whole bunch of links to other smart people thinking about these topics.
- On June 9th, CUNY is hosting an event on Critical Data Visualisation in NYC. We’re very happy to see some familiar faces on the schedule from the Responsible Data Visualisation event we co-ran in January, and we’re looking forward to seeing the outputs!
As always: feedback, suggestions or links for the next newsletter are very welcome!
- Zara & the engine room.