The second instalment of the Strangers Talks series – the Babbel employee initiative exploring issues of difference and diversity – was an exploration of representation and gender in marketing. Looking at imagery from marketing campaigns across different moments in advertising’s history, Babbel’s Ben Davies unpacked the persistence of stereotypes in all manner of marketing, and the often insidious messages they carry. It was provocative enough to warrant a bit of follow-up discussion, here.
This was, on the surface anyway, a rather specific topic, given your talk was effectively one of the inaugural presentations in the series. And I guess I’m wondering whether you were working less from a place of principle or aspiration, and more from a place of necessity. Did the intersection of gender and marketing seem particularly pressing to you for some reason?
I think for me, this was very much a necessity. Within the movement for gender equality, there is discussion happening constantly about portrayals of women and men in various media, be it in television shows or in music, but what struck me as odd was that images from marketing rarely made it into discussions on portrayals of gender. Perhaps this is because we don’t consider marketing anything more than this annoying thing that tries to get us to spend our money, but the fact remains that images from advertising make up a large portion of the imagery we are exposed to every day. Even on an unconscious level, this will start to have an effect on a person.
Big innovations in machine learning have made some unsettling headlines over the last year, holding a mirror to our own persistent biases by adopting them. When it comes to gender stereotypes, there’s a double jeopardy nestled in how machines learn languages. Babbel’s computational linguist, Kate McCurdy, has been looking at how algorithms conflate semantic and grammatical gender, what this could mean for any application of so-called Artificial Intelligence, and how we might think about correcting course.
So, how about we start by just breaking down your project?
So, I’m looking at grammatical gender in word embeddings. Word embeddings are a kind of natural language processing technology that is used in a lot of applications. The core of this is an algorithm that learns the meaning of a word based on the words that appear around it. In the past few years, we’ve seen pretty major developments in this area. Lots of research is happening, and big companies like Facebook and Google are using these technologies. A couple of years ago, there was this new algorithm that allowed you to train a model quite quickly and get representations of word meaning that seemed to be really impressive. So, you could just let it loose on a corpus and it would automatically learn, for example, that “dog” and “cat” and “animal” are all related, or that “apple” and “banana” are related, without anybody explicitly telling it to. This is quite powerful, and it’s being used in a lot of technological applications. But we’ve started to notice that there are some issues with it.
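The core idea described above – that a word’s meaning can be inferred from the words that appear around it – can be sketched in a few lines of plain Python. This is not the algorithm Kate refers to (which trains dense vectors with a neural network); it is a deliberately simplified co-occurrence-count version, using a hypothetical toy corpus, just to show why “dog” and “cat” end up similar when they share contexts:

```python
from collections import Counter, defaultdict
import math

def train_embeddings(sentences, window=2):
    """For each word, count the words appearing within `window`
    positions of it; the count vector serves as a crude embedding."""
    vectors = defaultdict(Counter)
    for tokens in sentences:
        for i, word in enumerate(tokens):
            for j in range(max(0, i - window), min(len(tokens), i + window + 1)):
                if j != i:
                    vectors[word][tokens[j]] += 1
    return vectors

def cosine(u, v):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(u[w] * v[w] for w in u if w in v)
    norm = math.sqrt(sum(c * c for c in u.values())) * \
           math.sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

# Toy corpus (invented for illustration): words that occur in
# similar contexts receive similar vectors.
corpus = [
    "the dog chased the ball".split(),
    "the cat chased the mouse".split(),
    "the dog ate the food".split(),
    "the cat ate the fish".split(),
    "she peeled the apple slowly".split(),
    "she peeled the banana slowly".split(),
]
vecs = train_embeddings(corpus)
print(cosine(vecs["dog"], vecs["cat"]))      # high: shared contexts
print(cosine(vecs["apple"], vecs["banana"])) # high: shared contexts
print(cosine(vecs["dog"], vecs["banana"]))   # lower: few shared contexts
```

Real systems replace the raw counts with learned dense vectors, which is also where the trouble starts: if a corpus consistently places certain words near gendered terms, those associations are baked into the vectors, with no one explicitly telling the model to do so.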
Toward the end of 2017, a number of Babbel employees launched an ongoing series of internal presentations now known as The Stranger Talks – a sort of salon aimed at highlighting difference and diversity as ways of innovating and transforming how we work. The inaugural talk, given by Lars in our Didactics team, served as an introduction to the project’s central themes. We sat down to chat about it and reflect on its impact.