What happens if you get into a collision with an autonomous car? Who is legally responsible? The driver? The GPS? The carmaker? Google? And whose fault should it be? The answer is that, legally, we don’t know. The development of autonomous cars is accompanied by numerous legal uncertainties that, if left unresolved, may discourage companies from investing in such technologies, despite their great promise for reducing the number of deaths and injuries on our highways. State legislatures should be encouraged to develop a clear legal framework that encourages investment in the development of autonomous vehicles.

Currently, carmakers are developing and releasing driver assistance technologies that include semi-autonomous systems. How will the law treat these threshold technologies? Existing driver assistance features (like radar systems that automatically brake prior to an accident) come with countless owner’s manual disclaimers that the driver is at all times responsible, but what about the next level of automation? If you could make your car follow the car in front of it perfectly, would the driver be responsible then? A legal regime should encourage the prudent development of these threshold technologies in order to advance toward more perfect safety systems. Fear of exposure to liability could discourage innovation in imperfect driver assistance technologies, but these partial measures must be developed in order to advance toward safer and more autonomous systems. A legal liability scheme demanding crash-free perfection would be an enemy of progress.

What of the fully autonomous vehicle? Do we require that there be a licensed driver? Could a Cornell student from New York City who has never driven a car rent an autonomous Ithaca Carshare? It’s hard to imagine why anyone would need to prove they can parallel park if their car can do so automatically.
Nevada has updated its laws to allow texting while “driving” a licensed autonomous car, but it has prohibited being intoxicated in that same vehicle.

In present-day software use, users frequently click “accept” on a license agreement that disclaims any harm resulting from use of the software. It is understood that any software product will have some latent glitch. What does this mean when the software is driving a family down the highway? Would you get in an autonomous car that made you click “accept” to a license agreement disclaiming any harm that results from the car driving off a bridge? Would a court enforce such a contract?

What should the law be regarding speeding and autonomous cars? Is the car company negligent in designing a car with a feature that allows it to knowingly break the law? A current car is capable of breaking the speed limit; an autonomous car would know it is breaking the speed limit. One radical suggestion (which would certainly require clear legal rules) is to have the car decide for itself what the optimal safe speed is. After all, an autonomous car would have a better grasp of its handling, the current driving conditions, obstacles on the road and visibility than a speed limit that has remained static on that stretch of road for 50 or more years. An autonomous car also may not cope well with the constant bending of traffic laws you see on any busy highway or at any busy intersection. Would a car that followed the traffic laws perfectly seriously disrupt the flow of traffic, and would that be unsafe and counterproductive?

The invention of autonomous cars requires new laws to resolve these questions. I do not claim to have the answers as to what the best laws should be.
To some degree, the exact laws that are adopted (so long as they are reasonable and do not deter innovation) matter less than the existence of a clear legal regime within which the developers of autonomous vehicles can innovate and integrate their products onto our roads. Legal uncertainty inhibits innovation by risk-averse companies and people afraid of potentially ruinous liability. One hopes the law will be updated early on, both to encourage the development of autonomous vehicles and to contemplate the law soberly and impartially before major and vested interest groups can hijack the lawmaking process.

The great hope for autonomous cars is that fatal car accidents will cease to be regular news. Studies find that up to 90 percent of car accidents are due to human error. To realize this hope, our motor vehicle laws must be updated. It would be counterproductive and tragic if the motor vehicle laws, laws created to provide for safe and orderly public roads, stymied our progress by lagging so far behind the development of such revolutionary safety technologies as semi-autonomous and autonomous vehicles.

Twenty years ago, a discussion of changing our laws for autonomous vehicles would have been relegated to a science fiction conference. Today, the Nevada motor vehicle laws allow texting while “driving” an autonomous vehicle, and other states are considering following suit. Twenty years from now, our legal debate may be whether to allow society to face the risk of letting human drivers behind the wheel at all. That is, so long as the fear of legal liability doesn’t keep us from getting there.
Nick Kaasik is a first-year law student at Cornell Law School. He may be reached at firstname.lastname@example.org. Barely Legal appears alternate Fridays this semester.