Bill Proposes Rules For Self-Driving Cars

A Rochester Democrat is proposing a series of regulations to govern the use of self-driving cars in New York state.

Sen. Jeremy Cooney, D-Rochester, has proposed S. 8468 to amend the state Vehicle and Traffic Law. It authorizes the operation of a fully autonomous vehicle without a human driver.

No state agency or local government would be allowed to prohibit the use of autonomous vehicles, automated driving systems or on-demand autonomous vehicle networks, nor would they be allowed to levy taxes or fees specific to the operation of autonomous vehicles. A proper registration and title would still be required.

“This legislation creates a regulatory framework for autonomous vehicles to be commercially deployed in the State of New York by detailing requirements for: the safe operation of a fully autonomous vehicle with and without a human driver, licensing and insurance requirements for autonomous vehicles, duties following a potential collision involving an autonomous vehicle, and integration into an on-demand autonomous vehicle network,” Cooney wrote in his legislative memorandum.

The Rochester Democrat wrote that New York law currently does not regulate the commercial use of autonomous vehicles, though there is a limited testing program that expires April 1, 2023. He also cited several statistics showing increases in deaths from car accidents and in accidents caused by alcohol or drugs.

“Autonomous Vehicle technology is one solution that can help reduce the number of traffic fatalities and collisions due to human factors such as distraction, exhaustion, and intoxication,” Cooney wrote.

The technology isn’t quite ready for use. In February, Tesla recalled nearly 54,000 cars and SUVs because their “Full Self-Driving” software let them roll through stop signs without coming to a complete halt. The recall shows that Tesla programmed its vehicles to violate the law in most states, where police will ticket drivers for disregarding stop signs. The Governors Highway Safety Association, which represents state highway safety offices, said it is not aware of any state where a rolling stop is legal. Tesla agreed to the recall after two meetings with officials from the National Highway Traffic Safety Administration, according to documents. Tesla said in the documents that it knows of no crashes or injuries caused by the feature.

Selected Tesla drivers are “beta testing” the “Full Self-Driving” software on public roads. The company says the cars cannot drive themselves and drivers must be ready to take action at all times. The “rolling stop” feature let the Teslas go through all-way stop signs as long as the owner enabled the function. The vehicles had to be traveling below 5.6 mph while approaching the intersection, with no “relevant” moving cars, pedestrians or bicyclists detected nearby, and all roads leading to the intersection had to have speed limits of 30 mph or less, the documents said. The Teslas were then allowed to go through the intersection at 0.1 mph to 5.6 mph without coming to a complete stop.

Philip Koopman, a professor of electrical and computer engineering at Carnegie Mellon University, said 4-way stop signs are commonly placed to protect intersections used by children when no crossing guard is present. He said Tesla’s “machine learning” system can mistakenly identify objects. “What happens when FSD decides a child crossing the street is not ‘relevant’ and fails to stop?” he asked. “This is an unsafe behavior and should never have been put in vehicles.”

Koopman said traveling through a stop sign at 5.6 mph is akin to treating it as a yield sign.

In November, NHTSA said it was looking into a complaint from a California Tesla driver that the “Full Self-Driving” software caused a crash. The driver complained to the agency that a Model Y went into the wrong lane and was hit by another vehicle. The SUV gave the driver an alert halfway through the turn, and the driver tried to turn the wheel to avoid other traffic, according to the complaint. But the car took control and “forced itself into the incorrect lane,” the driver reported. No one was hurt in the Nov. 3 crash.

In December, Tesla agreed to update its software to prevent video games from being played on center touch screens while its vehicles are moving.

NHTSA also is investigating why Teslas using the company’s less-sophisticated “Autopilot” driver-assist system have repeatedly crashed into emergency vehicles parked on roadways.

Last week Tesla said in its earnings release that “Full Self-Driving” software is now being tested by owners in nearly 60,000 vehicles in the U.S., up from only about 2,000 in the third quarter. The software, which costs $12,000, will accelerate Tesla’s profitability, the company said.

CEO Elon Musk said he’d be shocked if the software can’t drive more safely than humans this year. In 2019, Musk predicted a fleet of autonomous Tesla robotaxis on the roads by the end of 2020.

— The Associated Press contributed to this report.
