Sensing the city, update one: our approach
Cities | 29 November 2009 | 10.00am
This update is in three parts: ‘our approach’; ‘the hardware’; and ‘sensing’.
A quick technical update on our mobile phone sensing project with UTS (see earlier post for context). This project is exploring technical approaches to sensing the presence of mobile phones in transit environments (bus, train, ferry etc.), as well as among pedestrians, in order to provide real-time data on such activity, potentially informing urban planning and transport planning decisions.
Such approaches might reveal how the city is being used, in real-time. This write-up will get a little geeky in places, but we share it in the hope you’ll find something interesting in the overall idea or the particular approach, and do feel free to contribute via the comments form at the bottom of each post. We’re interested in your feedback.
Our colleagues at the Centre for Real-Time Information Networks (CRIN) at UTS have made significant progress in terms of both the sensing process and the hardware prototypes.
Dealing with the first part, we've been exploring a 'stack' approach to sensing phones: scanning first for Bluetooth, then wi-fi, then GSM, and so on. This ordering reflects partly ease of sensing, and partly the ethical issues, i.e. if people have Bluetooth turned on, or wi-fi set to connect to routers automatically, can we assume they are more likely to be happy to be sensed? (Probably not: poor design in mobile phone software means many people leave these features on by default without paying much attention to them thereafter. But part of the point of the research is to explore these issues of privacy and security as well as technical approaches.)
And dealing with the first of those wireless technologies, CRIN have made particular progress in sensing Bluetooth. Using the basic Bluetooth scanning functionality in a PC or Mac Mini, say, we can sense people with Bluetooth turned on and visible if they're walking past slowly, because the default scan rate is relatively slow: it takes a while for a scanner to detect and enumerate the phones in the vicinity. A single scan takes over a second, and the time needed depends on the number of devices; in essence, scanning runs in multiples of 1.28 seconds, with more multiples increasing the likelihood of finding all devices. (A good quick summary can be found in this PDF.)
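The arithmetic here can be sketched quickly. The 1.28-second figure comes from the Bluetooth inquiry procedure, and the commonly recommended full inquiry is eight such multiples; the function below is our own illustration, not from the project's code:

```python
# Bluetooth inquiry runs in multiples of 1.28 s; the commonly
# recommended full inquiry is 8 multiples, i.e. about 10.24 s.
INQUIRY_SLOT = 1.28  # seconds per inquiry multiple

def inquiry_duration(multiples: int) -> float:
    """Total inquiry time for a given number of 1.28 s multiples."""
    return multiples * INQUIRY_SLOT

full_scan = inquiry_duration(8)   # the full scan: 10.24 s
quick_scan = inquiry_duration(2)  # a quicker, less complete scan: 2.56 s
```

So a scanner that waits for a full inquiry before reporting will only see a pedestrian reliably if they stay in range for ten seconds or so, which is the nub of the problem described above.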
As we're trying to spot a couple of things – for example, both passengers in transit or waiting at a bus-stop (more static) and also pedestrians (moving at around 1-5 m/s) – and given the likelihood of groups in these scenarios and the low numbers of people scannable per pass, we needed to improve the Bluetooth scan rate.
There are legal and illegal ways to do this. Choosing the former route, CRIN have made great progress in speeding up the scan time and improving the detection rate. The software is being written in Python on Linux, rather than, say, Processing on Mac OS X, where the limited direct interfaces between the Bluetooth drivers and Java, plus the overhead of the higher-level language, would slow things down a little. Several hardware approaches have also been explored, with the current solution using multiple Bluetooth dongles in an array, with staggered scan times.
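As a rough sketch of the staggering idea (our own illustration, not CRIN's actual scheduler): if each dongle still needs a full inquiry cycle, offsetting the dongles' start times evenly means a fresh inquiry begins far more often than any single dongle could manage.

```python
# Sketch of staggered scanning across an array of dongles. With n
# dongles each running a full inquiry cycle, starting them cycle/n
# seconds apart means a new inquiry begins every cycle/n seconds.
FULL_CYCLE = 10.24  # one full inquiry: 8 x 1.28 s

def stagger_offsets(n_dongles: int, cycle: float = FULL_CYCLE) -> list[float]:
    """Start-time offsets (in seconds) for each dongle in the array."""
    return [i * cycle / n_dongles for i in range(n_dongles)]

offsets = stagger_offsets(4)  # a new inquiry starts every 2.56 s
```

With four dongles, some inquiry is always within 2.56 seconds of starting, rather than up to 10.24 seconds away.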
The effective range of (class one) Bluetooth turns out to be around 5-20 m, depending on the particular dongles employed, the structures in that environment, and so on (wi-fi has a much broader range). Of course, in transit on a bus, train, or tram, or relatively stationary at a bus-stop or platform, the captive audience is much easier to spot.
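Putting the range and walking-speed figures together gives a feel for the problem. This back-of-the-envelope check uses only the numbers quoted above, with a deliberately simplified model of a walker crossing the full diameter of the coverage circle:

```python
# How long does a walker spend inside a scanner's range, and does a
# full Bluetooth inquiry (10.24 s) fit into that window? Simplified:
# the walker crosses the full diameter of the coverage circle.
FULL_INQUIRY = 10.24  # seconds

def dwell_time(range_m: float, speed_m_s: float) -> float:
    """Seconds spent inside a scanner's radio range (radius range_m)."""
    return 2 * range_m / speed_m_s

# An average walker (1.2 m/s) past a 20 m dongle: about 33 s in range,
# so a full inquiry completes comfortably.
print(dwell_time(20, 1.2) >= FULL_INQUIRY)  # True
# A runner (5 m/s) past a 5 m dongle: only 2 s in range.
print(dwell_time(5, 5.0) >= FULL_INQUIRY)   # False
```

This is why the static transit scenarios are easy, and why fast-moving pedestrians near a weak dongle force the scan rate improvements described above.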
(NB: the RTA assumes people walk about 1.2 m/s on average, according to their transport planning regulations.)
Essentially, the array of Bluetooth dongles is now able to scan for phones much faster, and certainly fast enough for our intended environments of buses, trains, bus-stops, platforms, stations etc. Recall that the original rationale for this project is to generate real-time feeds on transit activity in urban areas: most current transit data is not real-time, is not particularly scalable, and doesn't uncover individual 'multi-modal' trips, where someone might walk to a bus-stop and then switch to a train, say.
Given this impetus, the scan rate achieved by the Bluetooth array above is certainly good enough as a start. The next requirement is to wirelessly communicate this data in real-time to 'the cloud', via a small robust 'box' that could be installed in such environments.
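A minimal sketch of what such a box might send, assuming (our assumption, not the project's actual design) that device addresses are salted and hashed before leaving the box, in keeping with the privacy concerns discussed earlier. The sensor name, field names, and endpoint are all hypothetical:

```python
import hashlib
import json
import time

# Hypothetical upload payload: device addresses are salted and hashed
# on the box itself, so the real-time feed carries presence counts
# rather than trackable hardware identifiers.
SALT = b"rotate-me-regularly"  # illustrative salt, not a real scheme

def make_reading(sensor_id: str, bt_addresses: list[str]) -> str:
    """JSON payload for one scan: hashed sightings plus a timestamp."""
    hashed = [hashlib.sha256(SALT + addr.encode()).hexdigest()[:16]
              for addr in bt_addresses]
    return json.dumps({
        "sensor": sensor_id,
        "timestamp": int(time.time()),
        "count": len(hashed),
        "sightings": hashed,
    })

payload = make_reading("busstop-07", ["00:1A:2B:3C:4D:5E"])
# urllib.request.urlopen(endpoint, data=payload.encode())  # actual upload
```

Truncating the hash trades a small collision risk for making the identifiers even harder to reverse; how (and whether) to retain any device-level identity across sightings is exactly the kind of privacy question the project is meant to explore.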
In the next post, we’ll discuss the emerging hardware prototype.