The goal of this project is to build a prototype of a scripted chatbot that does not reinforce harmful gender stereotypes, and that recommends books with strong, diverse characters as well as books written by people from social groups whose views and narratives have historically been sidelined in popular literature.
Xela is a simple, gender-neutral chatbot that aims to help users find more diverse books. It was designed following a set of inclusivity guidelines, the 'PIA Standards', written by AI researcher Josie Young. The bot offers recommendations across several categories: Race/Ethnicity, Gender Identity + Sexuality, Feminism, and Disabilities.
Most popular voice assistants are gendered female and tend to respond with disturbing passivity to verbal or sexual abuse from their users. Companies design these assistants to be unfailingly chirpy and polite (even in the face of harassment) because that behaviour maximises a user's desire to keep engaging with the chatbot. This raises serious ethical concerns: female-coded virtual assistants reinforce stereotypes of women as servile and submissive. It is therefore vital that companies are mindful of unconscious bias when designing chatbots.
Xela is a scripted chatbot whose persona is gender neutral, with imagery suggestive of a cat. Its conversation design follows the aforementioned PIA Standards, and the bot's tone is polite yet assertive.
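A scripted bot of this kind can be sketched as a keyword lookup over its recommendation categories. The category names below come from the project description; the keyword lists, function name, dictionary layout, and placeholder book titles are illustrative assumptions, not Xela's actual implementation.

```python
# Hypothetical sketch of a scripted recommendation bot in the spirit of Xela.
# Categories are from the project description; keywords and placeholder
# titles are assumptions for illustration only.
CATEGORIES = {
    "Race/Ethnicity": (("race", "ethnicity"), ["Example Title A", "Example Title B"]),
    "Gender Identity + Sexuality": (("gender", "sexuality", "lgbt"), ["Example Title C"]),
    "Feminism": (("feminism", "feminist"), ["Example Title D"]),
    "Disabilities": (("disability", "disabilities"), ["Example Title E"]),
}

def respond(user_message: str) -> str:
    """Match the user's message against known categories and reply
    politely but assertively, as the scripted design intends."""
    text = user_message.lower()
    for category, (keywords, books) in CATEGORIES.items():
        if any(keyword in text for keyword in keywords):
            return f"Here are some {category} reads: " + ", ".join(books) + "."
    # Fallback: steer the user toward the supported categories.
    return ("I can recommend books in these categories: "
            + "; ".join(CATEGORIES) + ". Which interests you?")
```

Because the bot is scripted rather than generative, every reply is authored in advance, which is what makes it possible to audit the persona and tone against the PIA Standards.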
This project was created as part of the FutureLearn course "Designing a Feminist Chatbot" by the UAL Creative Computing Institute.