From senses to sensors: autonomous cars and probing what machine learning does to mobilities studies
Cars are now being programmed to learn how to drive themselves. While autonomous cars are often portrayed as the next step in the automotive industry, they have already begun roaming the streets in some US cities. Building on a growing body of critical scholarship on the development of autonomous cars, we explore what machine learning is in open environments like cities by juxtaposing it with the field of mobilities studies. We do so by revisiting core concepts in mobilities studies: movement, representation and embodied experience. Our analysis of machine learning centres on the transition from human senses to sensors mounted on cars, and what this implies in terms of autonomy. While much of the discussion related to this transition is already foregrounded in mobilities studies, due to this field's emphasis on complexities and its understanding of automobility as a socio-technological system, questions about autonomy still emerge in a slightly new light with the advent of machine learning. We conclude by suggesting that in mobilities studies autonomy has always been seen as intertwined with technology, yet we argue that machine learning unfolds autonomy as intrinsic to technology, as the space between the car, the driver and the context collapses with autonomous cars.