Robots and hypocrisy
Most of us wouldn’t think twice about stopping briefly by the side of a quiet road marked with a double-yellow line. But will your self-driving car be of a similar mind? The car’s decision-making software is likely to be subjected to a great deal of scrutiny, especially if evidence of law-breaking is ever found. So it’s safer for the programmers simply to make the car extremely reluctant to bend the law at all.
Software cannot safely be hypocritical because, unlike the human mind, its guiding logic is available to be inspected in detail - if not by a casual investigator, then certainly by a judge with a warrant. Moreover, while it is possible to write obfuscated software whose behaviour is difficult to determine, adopting such practices would suggest there is something to hide. Humans only come in the obfuscated version, and so enjoy the plausible deniability that brings.
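To make the point concrete, here is a minimal, hypothetical sketch of what an inspectable stopping rule might look like. The `Road` type and `can_stop_here` function are invented for illustration and not taken from any real vehicle system:

```python
# A hypothetical illustration of why software can't quietly be hypocritical:
# the rule is written down explicitly, and anyone with access can read it.

from dataclasses import dataclass

@dataclass
class Road:
    has_double_yellow_line: bool
    is_quiet: bool  # visible to the car's sensors, but legally irrelevant

def can_stop_here(road: Road) -> bool:
    """A strictly law-abiding stopping policy.

    Note what is absent: there is no "but the road is quiet" clause.
    A judge reading this function can see the car never bends the rule,
    and any leniency added here would be just as plainly visible.
    """
    return not road.has_double_yellow_line

# A human driver effectively runs an opaque version of this function,
# with unwritten exceptions that no warrant can extract.
assert can_stop_here(Road(has_double_yellow_line=True, is_quiet=True)) is False
```

The asymmetry is in the audit, not the behaviour: the human and the machine could act identically most of the time, but only the machine's exceptions leave a written record.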
One advantage of having strictly law-abiding software may be that we will be forced to be explicit about where there is leeway in the law. Laws that nobody really follows may simply be repealed, like bans on recreational drug use. On the other hand, in some cases we may be unable to overcome our hypocrisy, and will simply have to live in compliance with what we say we want rather than what we actually want. We may get a world with no recreational drug use, even if nobody actually wants that!
This is a reason why a future that relies more on ems (human brain emulations) may be more comfortable for non-em humans. Although ems will probably face selection pressure to be law-abiding, it would be very hard to truly purge them of any tendency to bend the law, since they will be emulations of opaque biological brains. If it is an em driving the car, rather than a written program, they might indeed let you out at the double-yellow line!