Self-driving cars and federal regulations
As the development of self-driving vehicles continues, one concern that has been raised is who or what can be considered a driver. The National Highway Traffic Safety Administration (NHTSA), in a letter to California-based Google, has addressed this issue by declaring that the software controlling the company's self-driving cars can be considered a driver under federal regulations.
This is a major step toward getting autonomous vehicles on the road. One implication is that, in the event of a car accident, the developer of the software or artificial intelligence controlling an autonomous vehicle could be held responsible if investigators find that the error or action that caused the accident originated with the software. However, other challenges remain before more on-road testing is permitted and, eventually, large numbers of these vehicles are allowed on the road. These include how to address safety tests and features, many of which depend on human action, such as the regulation requiring cars to have braking systems activated by foot control.
Google is concerned that humans might be tempted to override the autonomous system and take control of the vehicle, which could create its own problems in the event of an accident. The NHTSA recognizes that allowing more self-driving vehicles on the road will require a complete rewrite of the regulations governing cars and safety features, and that Google must prove its cars meet the standards developed for human drivers.
If fully self-driving cars are allowed on the road, human drivers might end up in collisions with them. In some cases, since the software controlling the car can be considered the driver, an attorney representing an injured victim may choose to seek damages from the manufacturer of the autonomous vehicle.