Godzil (./8083) :It's explained on Wikipedia:
The strangest part is that, a priori, there was mathematically a way to avoid changing the display frequency. I don't know why they did it in the end; there must be some silly reason other than "oh, we hadn't thought of that solution".
https://en.wikipedia.org/wiki/NTSC#Color_encoding : If non-linear distortion happens to the broadcast signal, the 3.579545 MHz color carrier may beat with the sound carrier to produce a dot pattern on the screen. To make the resulting pattern less noticeable, designers adjusted the original 15,750 Hz scanline rate down by a factor of 1.001 (0.1%) to match the audio carrier frequency divided by the factor 286, resulting in a field rate of approximately 59.94 Hz. This adjustment ensures that the difference between the sound carrier and the color subcarrier (the most problematic intermodulation product of the two carriers) is an odd multiple of half the line rate, which is the necessary condition for the dots on successive lines to be opposite in phase, making them least noticeable.
The 59.94 rate is derived from the following calculations. Designers chose to make the chrominance subcarrier frequency an n + 0.5 multiple of the line frequency to minimize interference between the luminance signal and the chrominance signal. (Another way this is often stated is that the color subcarrier frequency is an odd multiple of half the line frequency.) They then chose to make the audio subcarrier frequency an integer multiple of the line frequency to minimize visible (intermodulation) interference between the audio signal and the chrominance signal. The original black-and-white standard, with its 15,750 Hz line frequency and 4.5 MHz audio subcarrier, does not meet these requirements, so designers had either to raise the audio subcarrier frequency or lower the line frequency. Raising the audio subcarrier frequency would prevent existing (black and white) receivers from properly tuning in the audio signal. Lowering the line frequency is comparatively innocuous, because the horizontal and vertical synchronization information in the NTSC signal allows a receiver to tolerate a substantial amount of variation in the line frequency. So the engineers chose the line frequency to be changed for the color standard. In the black-and-white standard, the ratio of audio subcarrier frequency to line frequency is 4.5 MHz⁄15,750 Hz = 285.71. In the color standard, this becomes rounded to the integer 286, which means the color standard's line rate is 4.5 MHz⁄286 ≈ 15,734 Hz. Maintaining the same number of scan lines per field (and frame), the lower line rate must yield a lower field rate. Dividing 4500000⁄286 lines per second by 262.5 lines per field gives approximately 59.94 fields per second.
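The derivation quoted above can be checked numerically. A minimal sketch (the constant names are mine, not from the article):

```python
# Recompute the NTSC color-standard frequencies from the derivation above.

AUDIO_SUBCARRIER = 4_500_000.0  # Hz, kept from the black-and-white standard
LINES_PER_FIELD = 262.5         # 525 interlaced lines per frame, 2 fields

# B&W ratio of audio subcarrier to line rate: 4.5 MHz / 15,750 Hz ≈ 285.71,
# rounded to the integer 286 for the color standard.
ratio = round(AUDIO_SUBCARRIER / 15_750)   # 286
line_rate = AUDIO_SUBCARRIER / ratio       # ≈ 15,734.27 Hz
field_rate = line_rate / LINES_PER_FIELD   # ≈ 59.94 Hz

# The color subcarrier is an odd multiple (455) of half the line rate.
color_subcarrier = 455 * line_rate / 2     # ≈ 3,579,545.45 Hz

# The problematic beat (sound carrier minus color subcarrier) works out to
# an odd multiple of half the line rate, so the dot pattern alternates
# phase on successive scanlines.
beat = AUDIO_SUBCARRIER - color_subcarrier
print(ratio, round(line_rate, 2), round(field_rate, 4))
print(round(color_subcarrier, 2), beat / (line_rate / 2))
```

Note that the beat frequency divided by half the line rate comes out to exactly 117, an odd integer, which is the condition the quote describes.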
Godzil (./8083) :There are advantages over PAL (insensitivity to phase errors), but also quite a few drawbacks (more expensive and more complicated to implement, impossible to mix two sources directly even when genlocked, chrominance that "bleeds" more vertically, "SECAM fire"). All chauvinism aside, I don't think SECAM is really better; besides, it was chosen in France on political rather than technical grounds (between a system designed by a German and one designed by a Frenchman, shortly after the Second World War, the choice was a bit biased).
Then came SECAM, which in theory has the best picture quality.
Godzil (./8083) :Actually there are two slight variants of NTSC on this point:
Fun fact: on NTSC, the NES has a black level that is too dark and must not be used, because otherwise the television will interpret that black as a VBLANK (sync) signal.