Description
Incorporating machine learning technology into artistic and creative practice often calls forth new and critical ways of looking at data and data-informed automation processes. Many generative artificial intelligence tools (especially those concerned with neural synthesis) that have become easily accessible in recent years tend to afford a peculiar yet increasingly relevant practice of data curation. This curatorial practice seemingly re-embodies subjective position-taking and lived experience, whilst making visible many idiosyncratic attributes of being human that are often overlooked in holistic statistical analyses.

This paper thus revolves around this emergent process of re-embodiment through the lens of two recent AI-assisted inter-media projects – that’s what they said (2022) and The Wernicke’s Area (2022). The former uses deepfake technology, real-time audio neural synthesis, and machine-generated news articles to articulate certain socio-political undertones of being a Chinese creative practitioner living in the West. The latter, informed by medical and biographical data obtained through an international multi-disciplinary collaboration, evokes a new understanding of the real complexities of living with epilepsy.
| Period | 16 Jun 2023 |
|---|---|
| Event title | Music Ex Machina: Methods and Methodologies for Technology-Centred Practice-Based Research in Contemporary Music |
| Event type | Conference |
| Location | London, United Kingdom |
| Degree of Recognition | International |
Related content

- Ombra / The Wernicke's Area (Research output: Non-textual form › Composition)
- The Wernicke’s Area (Research output: Non-textual form › Composition)
- that’s what they said (Research output: Non-textual form › Composition)