We live, contends Alexander Galloway, in an algorithmic culture. Algorithms are now inescapably embedded in everyday life, transforming processes and objects from cultural artefacts into “smart” systems. But unlike most algorithms, which are obscured behind the black box of postindustrial processes, intelligent personal assistant software such as Apple’s Siri is imbued with voice and personality. That is, it is given a materiality and tangibility. This paper interrogates the nature of this materiality and, specifically, the manifestation of the gendered voice. It is my contention that the gendered voice of Siri is symptomatic of the difficulty of performing trust and transparency in what is essentially an intangible process. As Christian Sandvig has argued, transparency and trust are processes that must be seen in order to be believed, but algorithms, for the most part, cannot be seen. Thus, for these “robots,” the performance of human sociality — the use of language, humour, and the presentation of gender — comprises cunning manoeuvres that contribute to the performance of “trust” in the theatre of persuasion. Continuing Sandvig’s trajectory, this research explores the relationship between gender, sociality, and immediacy in these artificial systems.
Number of pages: 11
Publication status: Published - 2017