It's not empty hype, but there's no guarantee you'll be able to hear the difference.
Modern albums are typically recorded at sample rates and bit depths far higher than those supported by CDs or typical home listening equipment. Whereas a CD tops out at 44.1 kHz/16-bit digital audio, studio masters have typically been recorded at 96 kHz/24-bit for quite a while now.
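For a rough sense of how much raw data those two formats carry, the arithmetic is simple (stereo, uncompressed PCM; the helper function here is just for illustration):

```python
# Raw PCM bitrate = sample rate x bit depth x channel count.
def pcm_kbps(rate_hz, bits, channels=2):
    return rate_hz * bits * channels / 1000

print(pcm_kbps(44_100, 16))  # CD audio: 1411.2 kbps
print(pcm_kbps(96_000, 24))  # typical studio master: 4608.0 kbps
```

So a 96 kHz/24-bit master carries more than three times the raw data of the CD version before any compression happens.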
In the past, record labels typically supplied the 44.1 kHz/16-bit CD masters to online music retailers like iTunes, who then converted them into AAC (iTunes) or MP3 (most everybody else) format for sale. The music therefore went through two destructive conversions:
Non-"Mastered For iTunes" (3 steps)
(#1) 96 kHz/24-bit studio master recording → (#2) 44.1 kHz/16-bit CD master
→ (#3) 44.1 kHz/16-bit MP3/AAC version for online sale
Clearly, the middle step is a bit of a waste of time. "Mastered For iTunes" recordings skip the middleman, so to speak.
"Mastered For iTunes" (2 steps)
(#1) 96 kHz/24-bit studio master recording → (#2) 44.1 kHz/16-bit MP3/AAC for online sale
Mathematically, this holds water: all other things being equal, less information is discarded this way. Whether you'll actually be able to hear the difference is another story.
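Here's a toy sketch of why skipping the intermediate step discards less information. The "codec" below is just a crude FFT low-pass and the quantizer is a simple dithered rounder — both are illustrative stand-ins I made up, not real AAC/MP3 encoding or real mastering — but they show the key point: noise added by the 16-bit intermediate survives encoding and ends up in the final file.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "studio master": smooth random signal at full float precision.
n = 4096
master = np.cumsum(rng.standard_normal(n))
master /= np.max(np.abs(master))  # normalize to [-1, 1]

def quantize(x, bits):
    """Round to a fixed grid with TPDF dither (stand-in for CD mastering)."""
    step = 2.0 / (2 ** bits)
    dither = (rng.random(x.shape) - rng.random(x.shape)) * step
    return np.round((x + dither) / step) * step

def toy_codec(x, keep=1024):
    """Stand-in for a lossy codec: keep only the lowest `keep` FFT bins."""
    spec = np.fft.rfft(x)
    spec[keep:] = 0
    return np.fft.irfft(spec, n=len(x))

# 3-step path: master -> 16-bit intermediate -> lossy encode
via_cd = toy_codec(quantize(master, 16))
# 2-step path: master -> lossy encode directly
direct = toy_codec(master)

rms = lambda e: np.sqrt(np.mean(e ** 2))
print("codec loss alone:               ", rms(direct - master))
print("extra noise from 16-bit detour: ", rms(via_cd - direct))
```

The second printed number is small but nonzero: that's quantization noise from the intermediate 16-bit master that the final lossy file carries for no benefit.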
You're particularly unlikely to hear a difference if the studio master was recorded, mixed, or mastered poorly as a casualty of the ongoing "Loudness Wars." From a fidelity standpoint, those recordings are essentially pre-ruined before they ever reach step #2 or #3 in either scenario. =)