Report on 'Racial Bias in Natural Language Processing'

  • Walter Pasquarelli
    27 August 2019

Dear colleagues, I would like to share with you our latest report on racial bias in natural language processing. We found that introducing natural language processing, in its current form, into government risks excluding the needs and opinions of people of colour.

We reviewed the current academic literature and interviewed experts in natural language processing. We concluded that, should governments widely adopt natural language processing systems, there is a risk of racial bias in three areas:

  • racial prejudice embedded in the language of training data;
  • weaknesses in filters designed to catch racist language;
  • algorithms' inability to handle linguistic variation.
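To make the second and third points concrete, here is a minimal illustrative sketch (not taken from the report) of the kind of naive keyword-based filter that exhibits both weaknesses. The blocklist, the placeholder tokens standing in for real slurs, and the example sentences are all hypothetical.

```python
# Illustrative sketch only: a naive keyword filter of the kind such
# reports critique. BLOCKLIST and all example strings are hypothetical.

BLOCKLIST = {"slur1", "slur2"}  # placeholder tokens standing in for real slurs

def is_flagged(text: str) -> bool:
    """Flag text if any blocklisted token appears as an exact word."""
    return any(token in BLOCKLIST for token in text.lower().split())

# An exact match is caught:
print(is_flagged("post containing slur1"))          # True

# Trivial spelling variation slips through (linguistic variation):
print(is_flagged("post containing s1ur1"))          # False

# Context-blind matching also flags a victim reporting abuse:
print(is_flagged("someone called me slur1 today"))  # True
```

Production filters are of course more sophisticated, but the underlying problem is the same: filters tuned to one standard variety of English tend both to miss disguised abuse and to over-flag dialect or in-group speech.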

As usual, I'm happy to answer any questions or comments.

walter.pasquarelli@oxfordinsights.com