LMAO
LIDAR doesn't even work in *heavy* rain!!!!!! Gather the point cloud data from a heavy rain and you'll see how noisy the data is. Completely useless.
Humans drive fine in rain with vision. If rain/fog was so bad, humans would pull over until conditions clear as stated in the DMV handbook. Follow the same rules. Simple.
LOL shadows????? That's your silver bullet argument? Give me a break. Not hard to detect shadows. Clearly you're not understanding how the system works.
This is not a competition between lidars and cameras, as they are different technologies with different strengths.
Almost all useful detection technologies (apart from very-short-range scanning) use electromagnetic radiation. Frequencies or wavelengths range from tens of gigahertz (radars) to hundreds of nanometers (UV). In practice, there is a gap between radars and IR, but terahertz technologies are emerging to fill that gap. Detection can be either passive (e.g., cameras), relying on external radiation sources, or active, with its own radiation source (radar, lidar).
In general, a smaller wavelength (optical) gives better resolution than a longer one (radar) but worse penetration and shorter range. This is dictated by physics (diffraction). Active technologies are useful in distance measurements and independent of external illumination. Passive technologies are simpler (and hence less expensive), and they can be used to detect radiation sources, which is especially useful for thermal radiation (IR).
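To put rough numbers on the diffraction argument, here is a small sketch using the Rayleigh criterion (theta ≈ 1.22 λ/D). The aperture size and the specific radar/lidar wavelengths are illustrative assumptions, not specs of any real sensor:

```python
# Sketch: diffraction-limited angular resolution, Rayleigh criterion.
# Aperture and wavelengths are assumed values for illustration only.
C = 299_792_458.0  # speed of light, m/s

def angular_resolution_rad(wavelength_m: float, aperture_m: float) -> float:
    """Diffraction-limited angular resolution: theta ~ 1.22 * lambda / D."""
    return 1.22 * wavelength_m / aperture_m

radar_wavelength = C / 77e9  # 77 GHz automotive radar -> ~3.9 mm
lidar_wavelength = 905e-9    # 905 nm NIR lidar
aperture = 0.05              # same 5 cm aperture for both, for a fair comparison

for name, lam in [("radar", radar_wavelength), ("lidar", lidar_wavelength)]:
    theta = angular_resolution_rad(lam, aperture)
    # lateral spot size at 100 m range
    print(f"{name}: theta = {theta:.2e} rad, spot at 100 m = {theta * 100:.3f} m")
```

With the same aperture, the radar beam at 100 m is meters wide while the lidar spot is millimeters wide, which is the resolution gap the wavelength dictates.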
There is no silver bullet. Camera-only approach sacrifices a lot of depth information. Accurate and reliable depth detection from two or more camera images is possible in most cases but difficult and prone to pattern-related alignment errors. That is one of the purposes of camouflage patterns.
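The depth-from-disparity sensitivity can be sketched from the pinhole stereo model Z = f·B/d: a one-pixel matching error (exactly what a repeating pattern causes) grows quadratically with range. Focal length, baseline, and the 1 px error are assumed values for illustration:

```python
# Sketch: stereo depth and first-order error propagation.
# f_px, baseline, and the 1 px mismatch are illustrative assumptions.
def depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Pinhole stereo depth: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def depth_error_m(focal_px: float, baseline_m: float,
                  depth: float, disparity_err_px: float) -> float:
    """First-order sensitivity: dZ ~ Z^2 / (f * B) * dd."""
    return depth ** 2 / (focal_px * baseline_m) * disparity_err_px

f_px, baseline = 1000.0, 0.3  # 1000 px focal length, 30 cm baseline
for z in (5.0, 20.0, 50.0):
    d = f_px * baseline / z
    err = depth_error_m(f_px, baseline, z, 1.0)  # 1 px wrong match
    print(f"Z = {z:4.0f} m  disparity = {d:6.1f} px  error per 1 px = {err:.2f} m")
```

At 5 m a one-pixel error costs centimeters; at 50 m it costs meters. That is why pattern-related alignment errors are so damaging at range.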
It is true that if the ambient illumination is at a good level, passive optical detection is more robust than active in heavy rain or fog. This is due to the physics of photons reflecting back from rain drops between the transmitter and the target. That is why fog lights exist; the problem is exactly the same. On the other hand, there is no difference in the dark, as the only light sources are the car's own headlights, and then we are talking about active detection (without the timing benefits).
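The penalty the active sensor pays can be sketched with Beer–Lambert extinction: the passive camera's signal traverses the fog once, while the active return traverses it twice (emitter to target and back). The extinction coefficient below is an assumed, illustrative value:

```python
import math

# Sketch: Beer-Lambert extinction through fog/rain.
# sigma is an assumed extinction coefficient, not a measured value.
def transmittance(sigma_per_m: float, path_m: float) -> float:
    """Fraction of light surviving a path of given length."""
    return math.exp(-sigma_per_m * path_m)

sigma = 0.02   # roughly moderate fog, assumed (1/m)
target = 50.0  # target range, m

one_way = transmittance(sigma, target)      # passive: scene light -> camera
two_way = transmittance(sigma, 2 * target)  # active: emitter -> target -> sensor
print(f"passive (one-way): {one_way:.3f}, active (two-way): {two_way:.3f}")
# On top of the squared attenuation, backscatter from drops near the
# emitter raises the noise floor for the active sensor.
```

The two-way transmittance is the one-way value squared, so whatever the fog does to a camera, it does roughly twice over (in log terms) to a lidar or radar return.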
Another example would be a situation with dark concrete, moderate rain, opposite-direction traffic with bright headlights, poor road lighting, and a pedestrian in black clothes. The light from the oncoming headlights scatters from the rain and from droplets on the lens cover, degrading the contrast and essentially hiding the dark object. In this case, a NIR lidar would be able to see the pedestrian (by virtue of being an active technology, and because at its wavelength the black clothes would not be that black anymore), and a passive IR imager would also see the heat from their face. A terahertz radar might be very useful as well.
While it is true that humans drive in heavy rain, humans do not necessarily do it at an acceptable risk level. Very few people actually follow the DMV handbook. A multi-modal machine vision system would be much more capable of "seeing" what happens in the environment.
Trying to make it work with optical cameras only is a rational decision from the commercial point of view; cameras are dirt cheap. Discarding the capabilities offered by other technologies, however, reduces the amount of information significantly.
There is no reason to think lidars would be hideously expensive in the future. ToF (time-of-flight) cameras are essentially lidars and cost relatively little. Lidars can be made inexpensive once the volumes scale up. There are also some interesting opportunities if the headlights are modulated and used as an active source for ToF imaging, but this requires some R&D before it happens.
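For a sense of why modulated-light ToF is cheap in principle, here is a sketch of continuous-wave ToF ranging, where distance comes from the measured phase shift of the modulation. The 20 MHz modulation frequency is an assumed value (many ToF cameras use tens of MHz):

```python
import math

# Sketch: continuous-wave time-of-flight ranging.
# f_mod is an assumed modulation frequency for illustration.
C = 299_792_458.0  # speed of light, m/s

def cw_tof_distance_m(phase_rad: float, f_mod_hz: float) -> float:
    """Distance from measured phase shift: d = c * phi / (4 * pi * f_mod)."""
    return C * phase_rad / (4 * math.pi * f_mod_hz)

def ambiguity_range_m(f_mod_hz: float) -> float:
    """Maximum unambiguous range: c / (2 * f_mod)."""
    return C / (2 * f_mod_hz)

f_mod = 20e6  # 20 MHz, assumed
print(f"unambiguous range: {ambiguity_range_m(f_mod):.2f} m")          # ~7.49 m
print(f"phase pi/2 -> {cw_tof_distance_m(math.pi / 2, f_mod):.2f} m")  # ~1.87 m
```

The sensing itself is just phase detection on a modulated source, which is why the same principle could, with some R&D, be driven by modulated headlights rather than a dedicated laser.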
(Disclosure: I am involved in remote sensing industry but not in automotive lidars.)