Years ago my friend who was heavy into audio used to laugh at some of the cable specs and reviews in the audio magazines. One vendor actually said you should connect their cables between the amp and speakers and run them for some number of hours, then disconnect them and swap them end-for-end to continue. Something to do with "conditioning" the cables. He was sorely tempted to start his own cable company with some high-quality mike cable and silver-solder it to connectors.
As for digital, I hate digital TV. In the old days you could still see through a little snow and hear through a little static, but now it's just large chunks of picture and audio dropping out. Verizon just had to replace the terminal unit at my house because of noise or signal loss on the upper channels, making them unwatchable (and unrecordable for time shifting). I need my "My Favorite Martian" fix!
Galvanic isolation is a step in the right direction - however it only provides perfect isolation at DC. Check the interwinding capacitance of your S/PDIF transformer - the ones with the best isolation sport a figure <1 pF. If yours is higher you might still gain an improvement by fitting a CM choke to your cable (or clamping on a ferrite). Also check to see if internally there's a dedicated ground wire going to the digital input termination (75 ohms) from the mains trafo secondary. A lot of DACs just use their internal groundplane to carry the CM noise current, impacting the sensitive analog circuits using that groundplane as their 0V reference.
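To see why a sub-1 pF figure matters, it helps to put numbers on the coupling path: the interwinding capacitance acts roughly like a series capacitor for common-mode RF noise, with impedance magnitude |Z| = 1/(2πfC). A minimal sketch (the 100 MHz evaluation frequency and the capacitance values other than 1 pF are illustrative assumptions, not from the post):

```python
import math

def cap_impedance_ohms(c_farads, f_hz):
    """Magnitude of a capacitor's impedance: |Z| = 1 / (2*pi*f*C)."""
    return 1.0 / (2 * math.pi * f_hz * c_farads)

# The transformer's interwinding capacitance couples common-mode noise
# across the isolation barrier; a smaller C means a higher blocking
# impedance at RF, i.e. better isolation.
for c_pf in (1, 10, 50):
    z = cap_impedance_ohms(c_pf * 1e-12, 100e6)  # evaluated at 100 MHz
    print(f"{c_pf:>3} pF at 100 MHz: |Z| ~ {z:.0f} ohms")
```

A 1 pF barrier presents roughly 1.6 kΩ to 100 MHz noise, while a 50 pF one is down around 32 Ω - barely an obstacle at all, which is why the choke or ferrite can still help.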
That is one reason why I prefer optical cables over the coax variant: you get proper galvanic isolation between the high-frequency digital device (TV, Blu-ray player, PC, etc.) and your analog system (the DAC has to be designed like an analog device...).
My DAC at home is from the studio market, not consumer; it has a galvanically isolated coaxial input. This also helps keep the output signal quality independent of the cabling in use (length, noise along the way).
@bob.....this is, as you say, garbage. Unless there is a lot of noise, crosstalk or some other very good reason, these cables will perform no better than the ones you buy at the $2 shop.
However....the guys who can sell these cables to the people with more money than sense will wind up richer than all of the posters on this column. If they can do that, good luck to them. Why didn't I think of this? :-)
I hope this cable is better than the audio cables I have tested from Audioquest. Audioquest RCA audio cables exhibit significant microphonics, so much so that vibrations from my home theater subwoofer created an oscillator. Yes, certainly on long runs cables can be a factor in digital AV, but the garbage I have seen from Audioquest is not the answer.
Your title here is an example of denialism. Based on what you write in the piece, a more accurate title would be 'I currently know of no way for a digital cable to make an audible difference'. Surely engineering is about finding ways to deliver satisfactory products to customers - if the customers are hearing something then it's our remit to explore the reason for that rather than lapse into knee-jerk denials of what they say they hear?
Whilst it's demonstrably true that digital cables don't change bits, they can and do present different common-mode impedances to RF noise in a system. Many DACs are sensitive to common-mode noise on their digital inputs due to poor grounding and noise control design decisions - so that for example adding a ferrite clamp to a digital cable can make an audible difference.
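The ferrite-clamp effect can be sketched as a simple impedance divider: the clamp inserts a lossy series impedance into the common-mode path, reducing the noise voltage that reaches the DAC's input. All the values below (a 250 Ω ferrite at 100 MHz, 50 Ω source and load impedances) are illustrative assumptions for the sketch, not measurements from the comment:

```python
import math

def cm_attenuation_db(ferrite_z, source_z, load_z):
    """Extra common-mode attenuation (dB) from adding a series ferrite.

    Models the path as a voltage divider: compares the fraction of the
    noise voltage appearing across the load with and without the
    ferrite's impedance in series.
    """
    before = load_z / (source_z + load_z)              # no ferrite
    after = load_z / (source_z + ferrite_z + load_z)   # ferrite in series
    return 20 * math.log10(before / after)

# Hypothetical clamp-on ferrite: ~250 ohms at 100 MHz, in a 50/50 ohm path.
print(f"{cm_attenuation_db(250, 50, 50):.1f} dB")  # ~10.9 dB
```

The point of the sketch is that the benefit depends entirely on the surrounding common-mode impedances - which is why the same ferrite can be clearly effective on one system and do nothing measurable on another.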