Speaking of moral dilemmas, I think it's moral to pass legislation that legalizes autonomous cars and removes liability (absent defects) from car companies and drivers, even with the knowledge that those cars will kill people. It's for the greater public good in many ways, most importantly safety.
Re: the trolley problem, I think once again it's moral to legislate that 3 lives outweigh 1. Also, once that rule is programmed into the AI, I think it's actually cleaner than a human pulling the lever to switch the tracks: the car changing its course to kill only 1 person isn't anyone pulling a lever in the moment, it's just following what it was programmed to do.
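To make the "legislated rule" idea concrete, here's a toy sketch of a planner that just minimizes expected fatalities across its available trajectories. Everything here (the function name, the trajectory dicts, the fatality estimates) is invented for illustration; no real AV stack exposes an API like this.

```python
# Toy sketch of a legislated "fewest deaths wins" rule.
# Assumes (hypothetically) the planner can attach an expected-fatality
# estimate to each feasible trajectory.

def choose_trajectory(trajectories):
    """Pick the trajectory with the fewest expected fatalities."""
    return min(trajectories, key=lambda t: t["expected_fatalities"])

options = [
    {"name": "stay_course", "expected_fatalities": 3},
    {"name": "swerve",      "expected_fatalities": 1},
]
print(choose_trajectory(options)["name"])  # swerve
```

The point of the sketch is that the "lever pull" happens once, at programming/legislation time, not at the moment of the crash.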
Re: drivers'/passengers' lives vs pedestrians/other drivers, I think the car should prioritize its own passengers over the other party, which is more likely to be at fault for the impending accident. But I'm not so confident that this is the moral thing to do.
Then things get much stickier once you start weighing potential injuries (at worst) to passengers against the likely death of a pedestrian, or 1 passenger against 3 pedestrians; this stuff is all too much for me.
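Just to show why that trade-off is so sticky, here's a hypothetical harm-scoring sketch that weighs expected injuries against expected deaths. The weights are completely made up; the whole difficulty is that someone would have to legislate numbers like these.

```python
# Hypothetical harm score: invented weights, purely for illustration.
INJURY_WEIGHT = 1.0
DEATH_WEIGHT = 10.0  # assumption: one death "costs" ten injuries

def harm_score(expected_injuries, expected_deaths):
    """Lower is better; combines injuries and deaths on one scale."""
    return INJURY_WEIGHT * expected_injuries + DEATH_WEIGHT * expected_deaths

# 4 passengers likely injured vs 1 pedestrian likely killed:
print(harm_score(4, 0) < harm_score(0, 1))  # True
```

Under these made-up weights the car would injure its 4 passengers rather than kill 1 pedestrian, but change DEATH_WEIGHT to 3 and the answer flips, which is exactly why I find this part too much.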