Yes, it's a myth. Check @GrumpyCoder's post above. The technical report he posted shows that heat has no practical impact on semiconductor longevity at the temperatures these systems are usually exposed to (100°C or below), and there are other papers along the same lines. There's also indirect evidence: Apple laptops don't fail in droves even though they regularly hit 100°C. My research group, for instance, routinely runs heavy-duty statistical simulations on laptops overnight. I ran the stats for my PhD years ago, which kept my laptop at a more or less continuous 90-100°C for about two weeks. Number of CPU failures in the group in the 12 years I've been there? I can remember exactly one, across hundreds and hundreds of machines.
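For what it's worth, the standard way reliability engineers quantify temperature-driven wear-out is the Arrhenius model. Here's a back-of-the-envelope sketch in Python; the 0.7 eV activation energy and the 60°C vs 95°C comparison are textbook-typical placeholders of mine, not numbers from that report:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K


def arrhenius_acceleration(t_low_c: float, t_high_c: float,
                           ea_ev: float = 0.7) -> float:
    """Acceleration factor for thermally driven wear-out between two temps.

    AF = exp((Ea / k) * (1/T_low - 1/T_high)), temperatures in kelvin.
    Ea = 0.7 eV is a commonly assumed activation energy, not a measurement.
    """
    t_low_k = t_low_c + 273.15
    t_high_k = t_high_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1 / t_low_k - 1 / t_high_k))


# Running at ~95C instead of ~60C speeds up thermal wear-out roughly 10x.
print(f"AF 60C -> 95C: {arrhenius_acceleration(60, 95):.1f}x")
```

A ~10x acceleration only matters if the baseline lifetime was marginal to begin with. If the chip would have lasted many decades at 60°C, a tenth of that still comfortably outlives the laptop, which is consistent with the experience above.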
The myth most likely originates with the old overclocking community, where people did see CPUs die after heavy overclocks without sufficient cooling. But the main culprit there is voltage: overvolting accelerates wear-out mechanisms like electromigration far more aggressively than heat alone. With a well-made consumer laptop running at stock settings, you don't have to worry about any of this.
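If you want that intuition in numbers: electromigration is usually modeled with Black's equation, where current density enters as a power law on top of the Arrhenius temperature term. A crude sketch; the exponent n = 2, Ea = 0.9 eV, the 30% current-density bump, and both temperatures are placeholder values of mine, not measured data:

```python
import math

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K


def relative_mttf(j_rel: float, temp_c: float,
                  n: float = 2.0, ea_ev: float = 0.9) -> float:
    """Black's-equation-style relative MTTF for electromigration.

    MTTF is proportional to J**(-n) * exp(Ea / (k * T)). j_rel is current
    density relative to stock; n ~ 2 and Ea ~ 0.9 eV are textbook-typical
    values, not measurements of any particular chip.
    """
    t_k = temp_c + 273.15
    return j_rel ** (-n) * math.exp(ea_ev / (K_BOLTZMANN_EV * t_k))


stock = relative_mttf(1.0, 80)        # stock voltage, ~80C under load
overvolted = relative_mttf(1.3, 95)   # ~30% more current density, hotter

# Prints roughly 0.18x, i.e. lifetime cut to under a fifth of stock.
print(f"Overvolted lifetime vs stock: {overvolted / stock:.2f}x")
```

The takeaway is that the voltage bump hurts twice: directly through the current-density term, and indirectly through the extra heat it dissipates (dynamic power scales roughly with V^2). That's why heavily overvolted chips die while merely hot ones don't.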