SOPhiA 2022

Salzburgiense Concilium Omnibus Philosophis Analyticis


Programme - Talk

Do we need a cure for gender-biased AI?
(Philosophy of Mind, English)

In this talk, I will apply a post-phenomenological approach based on the I-technology-world relation in order to examine gender biases in technologies and how they influence ordinary users. The particular focus of this presentation is on gender biases at the level of cognitive skills. For that purpose, I analyze AI personal assistants that are developed with the aim of mimicking human interaction.

The philosophical approach to this politically pressing topic seeks not only to unveil hidden gender biases manifested in the cognitive skills of personal assistants, but also to argue for morally responsible technologies. I will show that the excuse offered by AI developers, namely that their creations work purely on statistics and algorithms, leads to discrimination and reinforces biased views in the humans with whom these systems interact. Due to the feedback mechanism, gender-biased inclinations are fed back into the system and caught in a loop; the outcome is a deepening of the biases.

The concept of implicit stereotypes as "culture in mind", put forward by Perry Hinton, can be extended to technologies. On this view, technology could in fact be used as a weapon to combat culturally conditioned stereotypes, for example through long-term associative training aimed at influencing both explicit and implicit processes of cognition.

Chair: Martin Niederl
Time: 15:20-15:50, 07 September 2022 (Wednesday)
Location: HS E.002
Remark: (Online Talk)

Palina Yaroshyk 
(University of Warsaw, Poland)
