...there is a theory that 48 Hz causes brainwaves to more or less synchronize with it. And because in a cinema there are no other distracting lights, the whole audience gets slightly hypnotized. So that is why after 15 minutes everybody stops talking and silently watches the movie...
This is the reason that Roger Ebert so vigorously lobbies for 24 fps. He says that 24 is more closely related to the brain-wave frequencies that reflect a mental state of fantasy and passive acceptance, meaning that you can more easily become emotionally immersed in the drama.
Actually, those brain waves (alpha waves sit around 8-12 Hz) are far away from 24 Hz, and nearly as far from 24 as they are from 30 Hz (the typical video frame rate), which is the rate he seems to abhor so resolutely, so I think the argument is pretty thin.
To answer the OP's Q, I think you have to define "cinematic". I think the larger definition has much more to do with the shallow depth of field a Panavision camera gives you, as opposed to a garden-variety video camera, and with the gamma response curve. Both seem more important in defining "cinematic" by my personal yardstick than having a frame rate so low that the motion artifacts it induces become an expected part of the experience.
And BTW, the critical flicker frequency relevant to human perception can't be boiled down to a single number; image brightness, screen size and viewing distance, and ambient room light, among a host of other factors, modulate it significantly. Film is projected with each frame illuminated twice (a two-blade shutter), so the image rate is 24 fps but the flicker rate is 48 Hz. That is not by any stretch a "magic number"; it was designed to be sufficient for "most viewers" under "most circumstances" only, meaning the perception of fluidity is a dynamic, subjective process that can't really be quantified all that simply.
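The frame-to-flicker arithmetic itself is trivial, if you want to see it spelled out (this little sketch is mine, not anything from a projector spec; three-blade shutters, which were also common, are shown for comparison):

    def flicker_hz(frame_rate, shutter_blades=2):
        """Projector flicker: each frame is flashed once per shutter blade."""
        return frame_rate * shutter_blades

    print(flicker_hz(24))     # 48 Hz: the classic two-blade 35mm shutter
    print(flicker_hz(24, 3))  # 72 Hz: three-blade shutters push flicker higher

The point stands: 48 and 72 are engineering compromises against human flicker sensitivity, not magic numbers.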
But if you define "cinematic" by how many frame-rate-induced motion artifacts there are, such as flicker and judder, that can be a valid definition as well.
Some posters loathe frame rates faster than 24, and they have a right to. Even if you discount the "snob" factor, which is also alive and well, I suspect a lot of this comes from a lifelong habituation to the flicker of film and the judder of 3:2 pulldown (coupled with a stubborn refusal to adapt to change). IOW, all film and all TV that any of us were exposed to until just recently was full of flicker and judder artifacts, and it seems foreign when those artifacts are taken away. It takes some getting used to. My advice? Get used to it. 4K and 8K video is coming, and it will have at minimum a 48 fps frame rate.
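For anyone who hasn't seen why 3:2 pulldown judders, the cadence is easy to sketch (ignoring interlaced field parity; this code is purely illustrative):

    def pulldown_3_2(film_frames):
        """Map 24 fps film frames onto 60 Hz fields with the 3:2 cadence.
        Frames alternately occupy 3 then 2 fields, so on-screen hold times
        alternate between 3/60 s and 2/60 s -- that uneven cadence is judder."""
        fields = []
        for i, frame in enumerate(film_frames):
            fields.extend([frame] * (3 if i % 2 == 0 else 2))
        return fields

    print(pulldown_3_2(["A", "B", "C", "D"]))
    # ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
    # 10 fields per 4 film frames, i.e. 24 fps * 10/4 = 60 fields per second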
Frame interpolation to 60p, or to 120 Hz or 240 Hz, removes those artifacts. Many see the result as plastic and weird, and invoke the "soap opera effect" argument. For myself, it took a grand total of about 4 seconds to abandon any love of flicker or judder artifacts when I first bought a 120 Hz TV. I see it as a significant, major improvement. I will not buy a new set without 240 Hz frame interpolation, possibly 480 if it's available. I am more interested in video looking closer to how things move in real life than in having it seem closer to what film and its limitations might provide.
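Mechanically, "frame interpolation" just means synthesizing in-between frames. Real sets do motion-compensated interpolation, estimating motion vectors and shifting pixels along them; the crudest possible stand-in, a plain cross-fade, looks like this (numpy assumed, function name mine):

    import numpy as np

    def blend_interpolate(frame_a, frame_b, n):
        """Insert n intermediate frames between frame_a and frame_b by
        linear cross-fading. Real TVs shift pixels along estimated motion
        vectors instead, which avoids the ghosting this naive blend
        produces on fast motion."""
        a = frame_a.astype(np.float32)
        b = frame_b.astype(np.float32)
        out = []
        for i in range(1, n + 1):
            t = i / (n + 1)  # fractional position between the two frames
            out.append(((1 - t) * a + t * b).astype(frame_a.dtype))
        return out

    # 24 fps -> 120 fps needs 4 synthesized frames per original pair.
    a = np.zeros((1080, 1920, 3), dtype=np.uint8)
    b = np.full((1080, 1920, 3), 255, dtype=np.uint8)
    mids = blend_interpolate(a, b, 4)
    print(len(mids), mids[0].mean())  # 4 frames; first is ~20% of the way to b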
But then I sit very close to the screen, at about a 53-degree viewing angle. Sitting ultra-close means that the reduction of motion artifacts matters more to me than it might to some. The recommended distance for a 60" screen is 7.8 feet: the ATSC suggests a 30-degree angle (which 60" at 7.8 feet is), and THX suggests 36 degrees. Most people sit so far away that their eyes can't resolve the roughly 1/60th of a degree of arc that an HD pixel subtends at the recommended distance, meaning they don't enjoy the full benefit of HD resolution; they might as well be watching in SD. If you have a 42" screen and sit 12 feet away, that's you. For folks sitting at reasonable or far distances, maybe the motion artifacts of film are more comforting than annoying, which is what they are to me.
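You can check those numbers yourself; the geometry fits in a few lines (the ~1 arcminute figure is the standard 20/20 acuity limit; the screen sizes and distances are the ones above):

    import math

    def viewing_angle_deg(diagonal_in, distance_ft, aspect=16/9):
        """Horizontal angle the screen subtends at the eye, in degrees."""
        width_in = diagonal_in * aspect / math.hypot(aspect, 1)
        return 2 * math.degrees(math.atan((width_in / 2) / (distance_ft * 12)))

    def pixel_arcmin(diagonal_in, distance_ft, h_pixels=1920, aspect=16/9):
        """Angle one pixel subtends, in arcminutes (the eye resolves ~1)."""
        width_in = diagonal_in * aspect / math.hypot(aspect, 1)
        return math.degrees(math.atan((width_in / h_pixels) / (distance_ft * 12))) * 60

    print(viewing_angle_deg(60, 7.8))  # ~31 degrees: the ATSC setup
    print(pixel_arcmin(60, 7.8))       # ~1.0 arcmin: right at the acuity limit
    print(viewing_angle_deg(42, 12))   # ~14.5 degrees
    print(pixel_arcmin(42, 12))        # ~0.45 arcmin: HD detail the eye can't see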
I also predict that 720p is becoming a dead format. I was a huge supporter of 720p, partly because of its frame rate and the reduction in motion artifacts. But I see a day in the not-too-distant future where all content is sent as 1080p24, because that format is not bandwidth-intensive: it is similar in file size to 720p60 and 1080i30, unlike the impractical and never-yet-used 1080p60. Virtually all TVs will have frame-rate interpolation available to convert it to true 1080p60, or even up to 1080p480, which would be a much easier implementation for broadcast or internet delivery than 4K.
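The raw pixel rates back that up, at least to a first approximation (uncompressed rates, ignoring codec efficiency):

    # Raw pixel rates: a rough proxy for relative bandwidth demand.
    formats = {
        "720p60":  (1280, 720, 60),
        "1080i30": (1920, 1080, 30),  # 60 fields/s = 30 full frames of pixels
        "1080p24": (1920, 1080, 24),
        "1080p60": (1920, 1080, 60),
    }
    for name, (w, h, rate) in formats.items():
        print(f"{name}: {w * h * rate / 1e6:6.1f} Mpixels/s")
    # 1080p24 (~49.8) lands in the same ballpark as 720p60 (~55.3) and
    # 1080i30 (~62.2), while 1080p60 needs roughly 2.5x the raw pixel rate.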
Sets that interpolation hasn't trickled down to will still be OK with 1080p24; they will just show the same motion artifacts as film. Many would probably prefer that anyway, since they seem to want to cling to those artifacts rather than have accurate motion reproduction.