Transducer, amplification and source absolutely make a difference
Absolutely. There is a measurable difference between all of these things, and that has an effect on the sound you hear.
and are generally priced accordingly
With high-end audio, the price has little to do with the cost of components and manufacturing and everything to do with advertising, gimmicks, and branding.
Listen for yourself.
Funny you should mention that. Properly testing audio is a complicated matter because of how people hear sound. We perceive different frequency response depending on playback volume (the well-known equal-loudness contours). We hear different frequency response because of constructive/destructive wave interference from the shape of the room we're sitting in: moving your head as little as an inch can change the frequency response at your ears by as much as 10 dB in an untreated room. We even hear frequencies differently depending on ambient air temperature, pressure, our own blood pressure, etc. It's very difficult to isolate sound tests down to individual variables. This is one of the reasons so many myths abound . . . it's very easy to convince people during a test that there's a difference even when there is none.
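To make that head-movement claim concrete, here's a quick sketch (with made-up room geometry, not measurements from any real room) of the comb filtering that a single wall reflection produces. The nulls shift as soon as the path-length difference changes, which is exactly what happens when you move your head:

```python
from itertools import count

# Toy model: direct sound plus one wall reflection. Frequencies whose
# half-wavelength divides oddly into the path-length difference arrive
# out of phase at the listener and cancel (comb filtering).
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def null_frequencies(path_diff_m, max_hz=20_000):
    """Frequencies where the direct and reflected waves cancel."""
    nulls = []
    for k in count():
        f = (2 * k + 1) * SPEED_OF_SOUND / (2 * path_diff_m)
        if f > max_hz:
            return nulls
        nulls.append(round(f, 1))

# Hypothetical seat: 2.0 m direct path, 2.5 m reflected path.
print(null_frequencies(0.5)[:4])    # [343.0, 1029.0, 1715.0, 2401.0]

# Move the head ~2.5 cm so the difference grows to 0.525 m:
print(null_frequencies(0.525)[:4])  # [326.7, 980.0, 1633.3, 2286.7]
```

With a strong reflection the notches at those nulls can easily exceed 10 dB, and as the output shows, an inch of movement slides the whole comb to different frequencies.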
I've participated in, and set up, multiple double-blind tests on audio cables and equipment in acoustically treated environments. The people who most strongly claim to hear nuance between digital cables largely hear with their eyes, and when you force them to use their ears they come to very different conclusions.
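For anyone wondering what that looks like in practice, here's a minimal sketch of the scoring logic behind an ABX trial. The `play` callback and the trial count are hypothetical placeholders, not any particular rig; the point is that the hidden assignment never touches the listener's eyes:

```python
import random
from math import comb

def run_abx(play, n_trials=16, seed=None):
    """Minimal ABX harness. `play` routes the hidden clip into the 'X'
    slot (without revealing which it is) and returns the listener's
    answer, 'A' or 'B'. Neither listener nor operator sees the truth."""
    rng = random.Random(seed)
    correct = 0
    for _ in range(n_trials):
        x = rng.choice('AB')        # hidden truth for this trial
        correct += (play(x) == x)   # listener names which clip X was
    # One-sided binomial p-value: odds of scoring this well by guessing.
    p = sum(comb(n_trials, k) for k in range(correct, n_trials + 1)) / 2**n_trials
    return correct, p
```

Guessing averages 8 of 16; a listener needs something like 12 of 16 (p ≈ 0.04) before a claimed difference is worth taking seriously.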
With analog cables, though (especially longer runs carrying low-level signals from high-impedance sources), there is an audible difference between high- and low-capacitance wires, because the cable's capacitance and the source's output impedance form a low-pass filter. Telling the difference between short runs of high- and low-capacitance wire is very difficult, and I've yet to find someone able to tell the difference between two low-capacitance cables of the same length.
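The physics here is just a first-order RC filter: source output impedance times total cable capacitance sets a treble roll-off. A rough sketch with hypothetical numbers shows why long runs from weak, high-impedance sources are the one place where analog cables genuinely matter:

```python
from math import pi

def cutoff_hz(source_impedance_ohms, cap_per_m_pf, length_m):
    """-3 dB point of the low-pass formed by source output impedance
    and total cable capacitance (first-order RC model)."""
    c_farads = cap_per_m_pf * 1e-12 * length_m
    return 1 / (2 * pi * source_impedance_ohms * c_farads)

# A high-impedance source (passive guitar pickup, ~250 kohm) into
# 10 m of cheap 300 pF/m cable: the roll-off lands mid-band.
print(f"{cutoff_hz(250_000, 300, 10):,.0f} Hz")  # 212 Hz -- clearly audible

# The same cable driven by a modern ~100 ohm line output:
print(f"{cutoff_hz(100, 300, 10):,.0f} Hz")      # 530,516 Hz -- far past hearing
```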
Don't know any DAC that takes HDMI, but USB, coaxial, optical and I2S cables do make a difference to trained ears.
Hmm. Not sure exactly what you're talking about here. Surely you know that HDMI signals are wholly digital. Every HDMI audio signal that passes through the wire then has to go through a DAC of some kind before it hits your speakers.
I was referring specifically to the ridiculously overpriced cables that big box stores try to sell with their home theater systems when I mentioned HDMI. Thanks to the differential signaling and error checking built into the design, HDMI just isn't susceptible to gradual signal degradation: it either works or it fails obviously. If your HDMI cable passes signal, it's good signal . . . regardless of what you paid for it.
USB is another perfectly valid example. If you have a USB cable and it's functioning properly (and is shielded), it will sound identical to any other USB cable, provided it's within the length the USB spec allows. At least, that's what my testing shows.
I guess it's possible that if the USB timing is improperly implemented on the sender side you might run into jitter problems . . . but that's not the cable's fault. If you try to push more data in a burst through the cable than the standard allows for, that could also cause problems. And if you're running your cheap USB cable next to an MRI, you might have problems too . . . but I'd be shocked if a multi-thousand-dollar cable didn't have the same issues. There's no magic.
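To put jitter in perspective: the standard first-order approximation for the best-case SNR of a full-scale sine sampled with a jittery clock is SNR = -20 * log10(2 * pi * f * tj). A quick sketch with illustrative figures:

```python
from math import pi, log10

def jitter_limited_snr_db(signal_hz, rms_jitter_s):
    """Best-case SNR of a full-scale sine sampled with the given RMS
    clock jitter (first-order approximation)."""
    return -20 * log10(2 * pi * signal_hz * rms_jitter_s)

# Even a sloppy 1 ns of RMS jitter on a 10 kHz tone leaves ~84 dB of
# SNR, roughly 14-bit performance:
print(f"{jitter_limited_snr_db(10_000, 1e-9):.1f} dB")   # 84.0 dB

# A competent interface at 50 ps: the noise sits ~110 dB down.
print(f"{jitter_limited_snr_db(10_000, 50e-12):.1f} dB") # 110.1 dB
```

And note where the jitter lives: in the sender's and receiver's clocks, not in the copper. With USB audio the data arrives in buffered packets and the DAC re-times it locally, so the cable contributes nothing to it.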
Sadly, a lot of manufacturers have spent a lot of money lying to people about this. That's why misinformed folks like you think that 'trained ears' can magically tell the difference between these things. It's also why I said the audio industry often makes me sick . . . lying and misinformation are omnipresent.