There are lots of things that can cause "judder" in an image, but the telecine process is not one of them. If you have a television that will take a true 24p signal (or even 23.976p or PsF), please post the brand here; sincerely, I am trying to find an alternative to the very expensive and tiny broadcast monitors we use in the edit suite. Otherwise, all televisions in North America are playing everything at 29.97 fps (sometimes erroneously referred to as 30) or, at best, 59.94/60, and most content is still interlaced. So even if you have a 1080p24 file, you cannot play it back on your television without conversion (tell me I am wrong, give me the model of your television monitor, and you will make me and my producers very happy).
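To make the "not really 30" point concrete: the NTSC-derived rates are the nominal rates scaled by 1000/1001, which is where 23.976, 29.97, and 59.94 all come from. A quick sketch (my illustration, not anything from a spec document):

```python
# NTSC-derived rates are the nominal rates scaled by 1000/1001,
# a legacy of the color subcarrier frequency chosen for NTSC color TV.
nominal = {"film": 24, "video": 30, "fields": 60}
actual = {name: rate * 1000 / 1001 for name, rate in nominal.items()}

print(round(actual["film"], 3))    # 23.976
print(round(actual["video"], 2))   # 29.97
print(round(actual["fields"], 2))  # 59.94
```

So "30 fps" NTSC video is actually 30000/1001 fps, which is why the 29.97 figure keeps showing up.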
Curiously, most of Europe plays back at 25 fps, so even they are not looking at the original 24 frames.
In North America, throughout television history, almost all dramatic television and film has been recorded at either 24 fps or 23.976 fps and broadcast at 29.97 fps (59.94 fields per second). We have been looking at telecine all our lives. Any judder you perceive has to do with a faulty conversion, an improper connection (sending an interlaced signal to a progressive monitor, or vice versa), or improper settings. Not telecine.
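For anyone curious how 24 fps film ends up as 29.97 fps broadcast, the standard trick is the 3:2 pulldown cadence: every 4 film frames are spread across 10 video fields (5 interlaced frames). A minimal sketch, with illustrative frame names (A, B, C, D) of my own choosing:

```python
# Sketch of classic 3:2 pulldown: every 4 film frames (at 23.976 fps)
# become 10 video fields, i.e. 5 interlaced frames at 29.97 fps.

def pulldown_32(film_frames):
    """Map film frames to interlaced video fields using a 3:2 cadence."""
    fields = []
    for i, frame in enumerate(film_frames):
        # Cadence repeats every 4 frames: 3 fields, 2, 3, 2 -> 10 fields total
        count = 3 if i % 4 in (0, 2) else 2
        fields.extend([frame] * count)
    return fields

print(pulldown_32(["A", "B", "C", "D"]))
# -> ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Done properly (and displayed properly), that cadence is what we have all been watching for decades without complaint; judder shows up when the cadence is broken or the signal is mishandled downstream.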
Oh, and while we're killing myths: the reason film is shot at 24 fps is not because it looks better (it really doesn't, especially in action sequences) but because 24 fps was the slowest they could run the film and still make the sound appear to be in sync. It was an economic choice, not an aesthetic one. In fact, before the introduction of sound, most silent films were shot at 16 fps, saving even more money on film stock. So why shoot digital film at 24 fps when, for instance, The Hobbit was acquired at 48 fps? Again, it's all about the money. Storage ain't cheap.