Originally Posted by 1620_nz
Do you mean wearing them out?
Frequency intermodulation distortion is literally the cantilever vibrating and oscillating violently in the groove AGAINST the high-frequency groove modulations in the polyvinyl chloride traces. This can be due to the inertia of the cantilever's tip mass and/or to high-frequency resonances that feed back between the groove's actual high-frequency content and the resonant characteristics of the cantilever.
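To see why a stylus that can't follow the groove generates new tones rather than just quieter ones, here's a minimal numerical sketch (all numbers are illustrative, nothing here is a model of a real cartridge): pass two clean tones through a mildly nonlinear transfer function and new energy shows up at the sum and difference frequencies. That's intermodulation in a nutshell.

```python
import math

N = 1024          # samples in the analysis window
f1, f2 = 60, 70   # the two clean tone bins (cycles per window)

def dft_bin_mag(x, k):
    """Magnitude of DFT bin k by direct correlation (naive, fine for a demo)."""
    n = len(x)
    re = sum(x[i] * math.cos(2 * math.pi * k * i / n) for i in range(n))
    im = sum(x[i] * math.sin(2 * math.pi * k * i / n) for i in range(n))
    return math.hypot(re, im) / n

clean = [math.sin(2 * math.pi * f1 * i / N) + math.sin(2 * math.pi * f2 * i / N)
         for i in range(N)]
# Crude stand-in for a misbehaving stylus: y = x + 0.2*x^2
# (the 0.2 is an arbitrary illustration value, not measured from anything).
bent = [x + 0.2 * x * x for x in clean]

# Difference tone (f2-f1 = 10) and sum tone (f1+f2 = 130): absent from the
# clean signal, clearly present after the nonlinearity.
print("clean    :", round(dft_bin_mag(clean, f2 - f1), 4), round(dft_bin_mag(clean, f1 + f2), 4))
print("nonlinear:", round(dft_bin_mag(bent, f2 - f1), 4), round(dft_bin_mag(bent, f1 + f2), 4))
```

The nasty part for vinyl is that those new tones aren't just heard; the stylus is physically carving them into the groove wall.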
The Stanton 500 series tips, for instance, tend to have an intentional resonance that's part of their frequency response. It's one of the reasons the 500 is so linear from mid bass to mid treble, but it makes it susceptible to re-grooving the highs, even with a very thin, delicate cantilever like the Emk2 (or whatever version they've got now). The AL2 has that resonance AND a heavier tip mass. Flat, but flawed, and it's not particularly finessed, extended, or refined. A mint AL2 on a mint record can afford a very brief, silky, linear playback, but that record will never sound that way again. And I'm not just talking about a little grunge and increased noise. Raspy, nasty, damaged treble transients will be left behind. Even on cantilevers that don't have an intentional resonance in the treble band to smooth the response, an overly massy cantilever will still cause errant oscillations on high-amplitude high frequencies.
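The tip-mass point above is just mass-on-a-spring physics: treat the stylus/vinyl contact as a stiffness k driving a tip mass m, and the resonance sits at f0 = (1/2π)·√(k/m). A quick back-of-envelope sketch (every number below is hypothetical for illustration, not a Stanton spec) shows how adding tip mass drags the resonance down toward the audible treble:

```python
import math

def resonance_hz(stiffness_n_per_m, tip_mass_kg):
    """Resonant frequency of a simple mass-spring system: f0 = sqrt(k/m) / 2pi."""
    return math.sqrt(stiffness_n_per_m / tip_mass_kg) / (2 * math.pi)

k = 5e4  # effective tip/groove contact stiffness in N/m (made-up illustrative value)

light = resonance_hz(k, 0.3e-6)  # ~0.3 mg tip mass
heavy = resonance_hz(k, 1.0e-6)  # ~1.0 mg tip mass

print(f"light tip: ~{light/1000:.0f} kHz, heavy tip: ~{heavy/1000:.0f} kHz")
```

Roughly tripling the tip mass pulls the resonance down by a factor of √3, which is the difference between a peak safely above the audio band and one sitting right on top of your treble transients.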
This is related to one of the reasons good vinyl playback usually sounds "better" than your average digital release... not the treble distortion itself, obviously, but this potential for overt distortion, which forces (or used to, with competent mastering houses at least) vinyl mastering engineers to preserve the full dynamic range and not overdo the highs. Cutting a vinyl master plate too hot overall, too compressed, and/or with too-loud high frequencies could cause even well-engineered cartridges and cantilevers to distort audibly and also damage the record. In vinyl, when the cartridge mechanically distorts, it IS damaging the record. You can't separate the two, especially with the delicate high-frequency modulations in those grooves. The loudness war has traditionally been off-limits in the vinyl format, though there are certainly exceptions in cases of really bad pressings on recycled vinyl, cutting too hot (to maximize S/N ratio), or using too much compression.
Well-recorded, mixed, and mastered digital at 20-bit/48 kHz can sound as good as the very best analog, but it's easier to sneak problems by than it is with a vinyl master. And for that matter, a good 16/48 recording can equal the best vinyl. Most commercial digital releases, though, are too compressed, and far too many of them show signs of waveforms clipped at 0 dBFS. If you used that kind of program to cut vinyl, the diamond would never track the resulting groove accurately, leading to groove damage, diamond wear, and bad sound getting worse very rapidly with each play.
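Those 0 dBFS clipped waveforms are easy to spot programmatically, by the way: look for runs of consecutive samples pinned at full scale (flat tops). A quick sketch, with threshold and run length picked arbitrarily for illustration:

```python
def clipped_runs(samples, full_scale=1.0, min_run=3):
    """Count runs of >= min_run consecutive samples pinned at full scale."""
    runs, run = 0, 0
    for s in samples:
        if abs(s) >= full_scale:
            run += 1
        else:
            if run >= min_run:
                runs += 1
            run = 0
    if run >= min_run:  # don't miss a run that ends at the last sample
        runs += 1
    return runs

# Made-up waveform: one flat-topped peak of three full-scale samples.
wave = [0.2, 0.9, 1.0, 1.0, 1.0, 0.8, -1.0, -1.0, -0.5, 1.0]
print(clipped_runs(wave))  # prints 1
```

A single full-scale sample can be a legitimate peak; it's the consecutive runs that tell you the converter (or the limiter) flat-topped the waveform.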