I have noticed that some arcade games use the same clock signal for the CPU, the video hardware (GPU/TIA/video ROM), the ROM chips, and the RAM chips, while other arcade games use a different clock frequency for each of these chips.

I don't understand why, or what difference it makes to drive every chip from one shared clock compared to giving each chip its own clock signal.