The whole attempt to get LCDs to run at arbitrary refresh rates seems like it will never work consistently enough to be part of any releasable system.
Normally it should be the video source that defines the timing; the display should only have to meet minimum requirements to follow along with the given rhythm. In any case, this is really only a problem with arcade games; everywhere else refresh rates are pretty standard.
All media is already converted in production to the correct refresh rate for your region. As far as TVs are concerned, this is all just about motion interpolation to compensate for large screens. Extras such as a 'frame rate scaler' are just for NTSC/PAL conversion, and that's as far as REAL-TIME framerate conversion goes.
But wait a second: if the MAXIMUM refresh rate of some LCD is 60fps, why would it not be able to do 57fps or 53fps?
I would imagine that there is a vast amount of variability between different LCD drivers in terms of what frame rates they actually display.
Are you saying LCDs can't be driven by the rhythm of the video source, that they have their own internal clock, and that even if they were made to tick at exactly 60Hz, some LCDs would still end up at 59.93Hz, some at 60.42Hz, and so on?
I would also imagine that the rate at which it attempts to display frames and the rate at which the actual liquid crystals can change are both variable, as was suggested previously. Since monitor speeds are generally stated as 'gray to gray', I would also be led to believe that there is variability in the speed at which different colours change into one another, again independent of the rate at which they are being instructed to change.
Variability in crystal response shouldn't matter as long as the crystals can satisfy some minimum to maintain the internal refresh rate in the worst case. Maybe colors will go funny if you really push them hard, but the framerate itself shouldn't suffer because of it; more likely the crystals would just degrade if they can't cope with the given refresh rate.
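To put that constraint in numbers, a quick back-of-the-envelope sketch in Python (the response time is a hypothetical figure, not from any particular panel): the crystal transition only threatens the refresh rate if the worst-case response exceeds one refresh interval.

```python
# Back-of-the-envelope: a pixel transition only threatens the refresh rate
# if its worst-case response time exceeds one refresh interval.

refresh_hz = 60.0
frame_time_ms = 1000.0 / refresh_hz   # ~16.7 ms per refresh at 60Hz
worst_case_gtg_ms = 12.0              # hypothetical worst gray-to-gray time

print(f"frame time: {frame_time_ms:.1f} ms, worst response: {worst_case_gtg_ms} ms")
print("keeps up" if worst_case_gtg_ms <= frame_time_ms else "colors smear across frames")
```

As long as the 12ms transition fits inside the 16.7ms interval, the panel keeps its rate; pushing the rate until the interval shrinks below the response time is where colors would "go funny" first.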
I doubt there is any technical reason why an LCD could not be updated at a fully arbitrary frame rate. In fact, I imagine there are ones on the market right now that would happily accept a random refresh rate and display it just fine.
I think there is a reason, even if only an economic one. But didn't you just suggest they have their own internal clock and work in a kind of non-cooperative, asynchronous mode, completely independent of the rhythm given by the video source?
However, I would assume that most are simply going to ASSUME they will be run at ~60Hz and happily drop frames that don't line up, since 99.9% of consumers will not only not care but not even notice.
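Here's a rough sketch of what that dropping/lining-up looks like, assuming (hypothetically) a panel that redraws at a fixed 60Hz while being fed a 57fps source: at each panel refresh, the most recent complete source frame gets shown, so nothing is lost, but some frames are displayed twice.

```python
# Sketch: a fixed-60Hz panel sampling a 57fps source over one second.
# At each panel refresh the latest complete source frame is shown, so
# 60 - 57 = 3 frames per second get displayed twice (visible as judder).

PANEL_HZ = 60
SOURCE_FPS = 57

shown = []  # source frame index displayed at each panel refresh
for tick in range(PANEL_HZ):
    t = tick / PANEL_HZ                 # time of this panel refresh
    shown.append(int(t * SOURCE_FPS))   # latest frame delivered by time t

repeats = sum(1 for a, b in zip(shown, shown[1:]) if a == b)
print(f"refreshes: {len(shown)}, unique frames: {len(set(shown))}, repeated: {repeats}")
```

Every source frame still appears, but three of them hold for two refresh intervals each second; that uneven cadence is exactly what the 99.9% wouldn't notice and an arcade purist would.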
That sounds like truth.
In a traditional CRT, the actual sweep of the electron gun was driven directly by the sync signal, which is why arbitrary rates were a non-issue.
Yes, so what drives the refresh rate of an LCD: the rhythm of the video source or the rhythm of an internal clock? And even if it is an internal clock, why would it need to be fixed? I don't know, but it certainly looks like LCDs come with completely fixed output refresh rates.
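If the panel really does run on its own internal clock, even a tiny offset from the source rate produces a slow "beat": the two drift in and out of phase, and roughly once per beat period the panel has to drop or repeat a frame. A sketch using the hypothetical 59.93Hz figure mentioned earlier:

```python
# Sketch: beat period between a 60.00 fps source and a panel whose
# internal clock ticks at 59.93 Hz (hypothetical figure from above).
# Once per beat period the panel slips a full frame behind the source
# and has to drop (or repeat) one to catch up.

source_fps = 60.00
panel_hz = 59.93

beat_period = 1.0 / abs(source_fps - panel_hz)  # seconds per slipped frame
print(f"one frame dropped or repeated roughly every {beat_period:.1f} s")
```

A 0.07Hz mismatch means a visible hitch only every ~14 seconds, which may be exactly why nobody designing consumer panels bothers to lock to the source.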