Over the last few years there’s been a noticeable rise in the number of drivers opting to fit a “black box” to their cars to obtain cheaper insurance. According to some recent reports, these black boxes could save drivers as much as $360 a year.

The idea is that the box uses satellite positioning to record where and how the car is being driven, and transmits that data to the insurance company, allowing insurers to monitor how people drive and to offer discounts, and even refunds, to those deemed to be driving more safely. As a result, black box drivers tend to drive quite cautiously, avoiding fast acceleration and never exceeding the speed limit, wherever they happen to be.

While this type of driving can be rather annoying to more experienced drivers, the whole concept of the "black box" exposes some fundamental truths about the way rules work. For example, you may notice the trend of some black box drivers adding bumper stickers to their cars, informing fellow road users that they're not driving slowly on purpose, but only doing it for cheaper insurance.

These bumper stickers put into words a process that we all tacitly accept on a day-to-day basis but never quite admit: that there is a certain amount of "flexibility" in the application of the law, whether on the roads or elsewhere. This "flexibility" rests on the simple fact that it is impossible for the authorities to enforce all rules on all people at all times.

In this case, the black box driver freely admits that they'd break the rules if they weren't driving under the watchful gaze of their electronic overseer; they're only obeying the rules because they're being watched. But what's most fascinating about this example is that, logically speaking, the black box doesn't even need to contain any electrical gadgetry at all.

Certainly, the driver wouldn't know any different, as they'd still drive with the same acute awareness of the rules of the road. Of course, this would mean the insurance companies wouldn't receive any telemetric data, but then, what are they using the data for, if not to compel "safe" driving?

Rules, codes, and conventions

The reason I find this all so interesting is that it ties in closely with my own research on human and robotic behavior, and the power structures that shape our everyday lives. Think of CCTV warning signs, for example, or the target pictures painted on men's urinals. But beyond these more overt forms of control, there are also many hidden social structures that encode our behavior and prompt us to act in certain ways.

The cinema is a good example. While there will always be those who flout the rules to a minor degree (such as people who check their phones during a film or chat noisily to their friends), instances of major disruption are few and far between, as most people adhere to the unspoken rules of cinema etiquette.

In a similar vein, there are no formal rules about how to behave at a wedding ceremony, a funeral, or a job interview. While there may not be a written code as such, we all tend to have an idea of what constitutes appropriate behavior. In this way, we self-manage our own conformity, and in doing so share the same hidden cultural codes with those around us. We do this to avoid censure and embarrassment, as we are constantly aware of the gaze of others.

The invisible black box

To bring this back to the car insurance example, what's fascinating is that it doesn't matter whether or not the black box really is watching our every move. Rather, what matters is that we think we're being watched, and we modify our behavior accordingly. This links to Jeremy Bentham's famous panopticon, conceived in the 18th century.
In his writings, Bentham describes an “ideal” form of prison where inmates live under the constant threat of surveillance. While in reality, each individual prisoner is barely watched at all, there remains the chance that they might be watched at any point.

While Bentham's panopticon has since fallen out of favor, the concept lives on, and ties in closely with our understanding of surveillance culture and biopolitics: that is, the way the state takes life as its central objective and frames our lives as constantly under threat.

You may, for example, notice that in many shopping outlets, the warning signs about CCTV coverage are far more prominent than the cameras themselves. This is because the security team can't watch all people at all times; instead, the possibility of surveillance is used as a means to encourage good behavior.

The same concept applies to the insurance black box. While insurance companies undoubtedly do monitor the habits of drivers, the purpose of the box isn't so much to monitor as to enforce good behavior. For this reason, it doesn't really matter whether there are any electronic gizmos inside the box at all. The important thing is not that we are watched, but that we obey.

Whose rules are they anyway?

This whole concept of the black box, and what it says about our surveillance culture, poses some interesting dilemmas. We accept conformity and regulation as a way to keep us safe from harm. But at the same time, we also want "freedom," and to feel as though we're still in control.

This is why transgression is such an important part of modern life. If we don't believe we have the free will to break rules, then we are forced to confront the tension at the heart of our everyday lives: we want "freedom," but we also want safety, and to live our lives free from harm. We can't have it both ways, so breaking the rules, if only just a little bit, gives us some access to the (illusory) "freedom" that we give up as citizens of the surveillance state.

Something to think about next time you drive 80 miles an hour down the motorway.

Mike Ryder is an associate lecturer in literature and philosophy at Lancaster University.
This article was republished from The Conversation under a Creative Commons license. Read the original article.