This information is from Prog Rock Man's original thread. It took considerable effort to compile and deserves to be readily available to read.

1. ABX Double Blind Comparator.

2. Effects of Cable, Loudspeaker and Amplifier Interactions, an engineering paper from 1991.

3. Do all amplifiers sound the same? Original Stereo Review blind test.

4. Cable directionality.

5. Head-Fi ABX Cable Taste Test, Aug 2006.

6. HiFi Wigwam, The Great Cable Debate. Power cable ABX test, Oct 2005.

7. What Hifi, The Big Question on cables. Sept 2009.

8. Secrets of Home Theatre and High Fidelity. Can We Hear Differences Between A/C Power Cords? An ABX blind test. December 2004.

9. Boston Audio Society, an ABX test of Ivor Tiefenbrun, the founder of Linn. August 1984.

10. The (in)famous Audioholics forum post, cables vs coathanger! June 2004.

11. Matrixhifi.com from Spain. ABX test of two systems. June 2006.

12. AVReview. Blind cable test. April 2008.

13. Journal of the Audio Engineering Society, ABX test of CD/SACD/DVD-A. Sept 2007.

14. What Hifi, blind test of HDMI cables. July 2010.

15. Floyd Toole of Harman International (AKG, Infinity, JBL), Audio - Science in the Service of Art, 1998.

16. Sean Olive, Director of Acoustic Research at Harman International, blog on The Dishonesty of Sighted Listening Tests, 2009.

17. Russ Andrews re-cables David Gilmour's recording studio (not a blind test), 2000-2001

18. DIY Audio forum, confessions of a poster. 2003

19. The Boston Audio Society, discussion of two blind tests and their analysis 1990

20. Cowan Audio, an Australian audiophile and a blind test between CD players 1997

21. Pio2001's own ABX test between CD and vinyl in Hydrogenaudio 2003

22. Tom Nousaine, article To Tweak or Not to Tweak?, 1988

23. AV Science Forum, Monster vs Opus cables. 2002

24. Stereo.net, blind testing of two pre-amps April 2008

25. Stereomojo Digital amp shootout 2007

26. Head-Fi ABX Cable Test by member Edwood Aug 2006

27. Les Numeriques. A blind test of HDMI cables by a French site (Google Translator used)

28. Home Cinema Fr .Com, a French test of interconnects (Google Translator used) May 2005

29. Sound & Vision. Article by Tom Nousaine with 3 Blind Tests of speaker cables. c1995

30. Insane About Sound, Blind Tests of CD vs Audio Files and expensive vs cheap speaker cable. Wall Street Journal Jan 2008

31. AV Science forum, Observations of a controlled cable test Nov 2007

32. The Audio Critic, ABX test of amps Spring 1997

33. Expert Reviews. Blind test of HDMI cables. Expert reviews 8 Feb 2011

34. Blind test of six DACs, Stereomojo

35. The Wilson ipod experiment CES 2004. Stereophile Jan 2004

36. An evening spent comparing Nordost ICs and speaker cables. AVForums June 2006

37. A blind test of old and new violins. Westerlunds Violinverkstand AB March 2006

38. The Edge of audibility, blind test of recordings made with and without a mains filter. Pink Fish Media forum June 2011

39. Try a blind test of bit rates. mp3ornot.com

40. Blind test of CD transports Stereo.net.au Oct 2008

41. ABX test of tracks with various levels of jitter added. HDD Audio forum March/April 2009

42. Stereophile ABX test of power amps July 1997

43. Head-fi. A forum member testing cables sighted and blind Nov 2011

44. Audio Society of Minnesota. Speaker cable listening test. April 2012

45. The Richard Clark Amplifier Challenge - Reported by Tom Morrow June 2006

"Do the results indicate I should buy the cheapest amp?"

46. Audio Video Revolution Forum, thread on blind speaker tests, Nov 2007.

47. PSB Speakers, blind comparison test of four speakers, Nov 2005.

Conclusion

The original thread is here: http://www.head-fi.org/t/486598/testing-audiophile-claims-and-myths

So, we love to have a good discussion/argument/rant here (and on all the other audio forums I have seen) about the many claims audiophiles make that others dismiss as myths. The arguments go round in circles: "I hear a difference" - "but there cannot be a difference, it is all in your mind" - "have you tried different cables?" - "I don't need to, it is all in your mind", and so on. We all know how it goes.

Occasionally there are attempts to test such claims. WHF's own Big Question is an example: three What Hifi forum members are invited to their listening rooms and have been blind tested on everything from cables to bit rates. From the issues I have read, the differences come out as real rather than mythical. Different bit rates have been correctly identified, and different cables have produced different sounds in the same hifi kit. But these are blind listening reviews, which are different from ABX tests, where people are asked to correctly identify products.

Here is a list of blind listening and ABX tests that I have found on the internet, with their conclusions summarised. The aim is to see what the overall result of these tests gives us, and whether they provide evidence to back up or deny the reality of alleged audiophile myths. Before you read on, here is a test you can try out yourself... and here is a very interesting article on a debate between audio sceptic Arny Krueger and Stereophile editor John Atkinson on ABX testing. Finally, for those who say blind testing is designed to produce fails and discredit audiophiles, some positive tests where differences have been identified are included below.

This is a web site dedicated to such testing.
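Many of the results below are judged against chance, so it helps to have a yardstick for what pure guessing looks like. This is my addition, not part of the original thread: a minimal Python sketch of the one-sided binomial tail, i.e. the probability of scoring at least `correct` out of `trials` by luck alone.

```python
from math import comb

def guess_probability(correct, trials, p=0.5):
    """Chance of scoring `correct` or more out of `trials` by guessing
    alone, where each trial has hit probability `p` (0.5 for ABX)."""
    return sum(comb(trials, k) * p**k * (1 - p)**(trials - k)
               for k in range(correct, trials + 1))

# Scores of the kind reported in the tests below:
print(guess_probability(9, 16))    # about 0.40 - a 9/16 score is unremarkable
print(guess_probability(12, 12))   # about 0.00024 - a perfect 12-trial run
print(guess_probability(95, 150))  # small, but see the BAS caveat below about
                                   # false positives on same-amplifier trials
```

By this yardstick, a 9-of-16 score is well within guessing range, while a requirement of two perfect 12-trial runs (as in the Richard Clark challenge below) leaves roughly a one in seventeen million chance of passing by luck.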
Back in May of 1977 there was a comparison of amplifiers which found that, over three tests of two amps each, listeners could tell a difference in two, but not in the third, which was an even split.

A test of interconnects and speaker cables found that no one could pick out the differences between a series of wires, from blister-pack $2.50 cable to $990 speaker cable. All the results were even, with approximately 50% going for the cheap and expensive options.

There is an interesting comparison of 'video cables' which found that once a cable was over 50 feet long it was easy to tell it from the 6 foot one.

DACs don't fare well either: one test found an original CDP distinguishable from a more modern one, but an expensive stand-alone DAC sounding the same as a CDP.

None of the tests involve a large number of people, and some are of just one person.

Twelve cables are tested, from Levinson to Kimber and including car jump leads and lamp cable, from $2 to $419 per metre. The results are based on the theory that loudspeaker cable should transmit all frequencies unscathed to any speaker from any amplifier, and that loss is due to resistance. There is an assumption that letting through more frequencies with less distortion will sound better, but that seems reasonable to me.

The best performance was with multi-core cables. The car jump leads did not do well, and cable intended for digital transmission did! The most expensive cable does not get a mention in the conclusions, but the cheapest is praised for its performance and Kimber does well. Sadly there is not a definitive list of the cost of the cables against their performance, so it is not clear whether cost equals performance, but the suggestion is that construction equals performance. (The original Bruce Coppola link is broken, and I cannot find any existing link at this time.)

A number of amplifiers across various price points and types are tested.
The listeners are self-declared believers and sceptics as to whether audiophile claims are true or not. There were 13 sessions with different numbers of listeners each time. The difference between sceptic and believer performance was small, with 2 sceptics getting the highest correct scores and 1 believer getting the lowest. The overall average was 50.5% correct, the same as you would expect from random guessing. The cheapest Pioneer amp was perfectly capable of outperforming the more expensive amps and was 'strikingly similar to the Levinson'.

As an extra to this, and for an explanation of how amps can all sound the same, here is a Wikipedia entry on Bob Carver and his blind test amp challenges.

Not the best link, as it only refers to a test without giving many specifics. The cable maker Belden conducted a test with an unnamed magazine which found the result was perfectly random. I liked the next sentence, which was "Belden is still happy to manufacture and sell directional cables to enthusiasts".

Three cables, from Canare, Radio Shack and a silver one, were put into the same sleeving to disguise them, a mark put on each so that only the originator knew which was which, and then sent around various forum members. The result was that only one forum member got all three correct. The cheap Radio Shack cable and the silver one were the most mixed up. Unfortunately I cannot see from the thread, which is huge, how many members took part or what the exact results were.

This is a very well done, large-scale ABX test. A similar set-up to Head-Fi, where four mains cables - including 2 kettle leads (stock power cords that had come with hifi products), an audiophile one and a DIY one - and a tester CD were sent out to forum members. The results were inconclusive to say the least. For example, the kettle lead was C.
There were 23 answers: 4 said that the kettle lead was A, 6 said that it was B, 8 said that it was C, and 5 said that they didn't know. The overall conclusion was that the kettle lead could not be properly identified, nor could any cable be shown to be better than another.

From the Sept 2009 issue. Three forum members were invited to WHF and blind tested while they thought the kit (Roksan, Cyrus, Spendor) was being changed, but instead the cables were. The same three tracks were used throughout. The kit started out with the cheapest cables WHF could find and no one liked it, saying it sounded flat and dull. Then a Lindy mains conditioner and Copperline Alpha power cords were introduced, and the sound improved. The IC was changed to some Atlas Equators, and two out of the three tracks were said to have improved, with better bass and detail. Last, the 60p per metre speaker cable was changed for £6 per metre Chord Carnival Silverscreen. Again, changes were noticed, but they were not big. Various swaps took place after that which confirmed the above, that the power cords made the biggest difference. When the test was revealed, the participants were surprised, to say the least!

A comprehensive article with pictures; the overall result was 73 out of 149 tests, so 49% accuracy, the same as chance.

A rather complex testing of Ivor Tiefenbrun himself, who at that time was very pro vinyl and anti digital (almost the opposite of how Linn operate now!).
There are various different tests, and the overall conclusion was:

"In summary, then, no evidence was provided by Tiefenbrun during this series of tests that indicates ability to identify reliably:
(a) the presence of an undriven transducer in the room,
(b) the presence of the Sony PCM-F1 digital processor in the audio chain, or
(c) the presence of the relay contacts of the A/B/X switchbox in the circuit."

Even the founder of Linn could not back up claims he had been making when subjected to an ABX test of those claims.

Two systems, one cheap (A) with a Sony DVD and Behringer amp (supported on a folding chair) with cheapo cables, and the other more expensive (B) with Classe, YBA, Wadia, expensive cables and proper stands, were hidden behind a sheet and wired to the same speakers. The results were:

38 persons participated in this test
14 chose the "A" system as the best sounding one
10 chose the "B" system as the best sounding one
14 were not able to hear differences or didn't choose any as the best.

Some of AVR's forum members attended a Sevenoaks hifi shop and listened to the same kit with two cheap Maplins cables at £2 and £8 and a Chord Signature at £500. They found the cheaper Maplins cable easy to differentiate, and the more expensive one harder to differentiate from the Chord. Their resident sceptic agreed he could hear differences. The final conclusion was:

"....from our sample of 20 near-individual tests, we got 14 correct answers. That works out at 70 per cent correct...."

So that is the second ABX, joining What Hifi, which suggests there is indeed a difference. But like What Hifi it shows the difference in results between blind and ABX testing, and how easy it is to confuse the two types of test.
http://www.avreview.co.uk/news/article/mps/uan/1863#ixzz0nGpGRfCB - note: link broken, unable to find another.

You need to be a member of the AES to access the article here. (EDIT: the link has changed and I cannot find the actual test or the summary referred to.)

Another What Hifi test, of three forum members who are unaware that the change being made is between three HDMI cables. As far as they know, equipment could be being changed. The cables are a freebie, a Chord costing £75 and a QED costing £150. Throughout the test all three struggle to find any difference, but are more confident that there is a difference in the sound than in the picture. They preferred the freebie cable over the Chord one and found it to be as good as the most expensive QED. That result is common in blind testing and really differentiates it from ABX tests. In my opinion, the way the differences between the cables are reported can be explained by the fact that it would have taken three brave testers to say there was no difference: they had been invited to a test expecting to be able to identify differences.

A paper written by Floyd Toole which covers a number of topics about scientific measurements and audio. Go to pages 10 and 11 and there is a paragraph on blind testing. It shows how the 'differences' between speakers were greater in sighted tests than in blind tests. The obvious conclusion is that in sighted tests, factors other than sound come into play when deciding what sounds better.

Research using 40 Harman employees, comparing the results of blind vs sighted tests of four loudspeakers. As with the paper above by his fellow Harman director, sighted tests show bias that blind tests do not. Below the article are various responses to the blog, including a very interesting exchange between Alan Sircom, editor of Hifi Plus magazine, and Sean Olive.
Alan Sircom makes the very interesting point that volume has a role to play in blind tests:

"Here's an interesting test to explain what I mean: run a blind test a group of products under level-matched conditions. Then run the same test (still blind), allowing the users to set the volume to their own personal taste for each loudspeaker under test. From my (admittedly dated and anecdotal) testing on this, the level-matched group will go for the one with the flattest frequency response, as will those who turn the volume 'down', but those who turn the dial the other way often choose the loudspeaker with the biggest peak at around 1kHz, saying how 'dynamic' it sounds."

I had not thought of that before. You will end up with different conclusions between a blind test where the volume is fixed and one where the volume can be adjusted. Adjustment allows preferences for different sounds to be expressed, without other influences being present that clearly have nothing to do with sound.

This is not a blind test, but I think it is worth including here. The studio used (and I think owned) by David Gilmour was re-cabled with Kimber cables by Russ Andrews. This was apparently after extensive AB testing. I would have loved that to be after extensive ABX testing!

(Thanks to Pio2001 for finding the tests and links below.)

A forum member joined and confessed: "Then I started to hear about some convincing blind tests and finally conducted my own. I was stunned at the results. I couldn't tell a $300 amp from a $3000 in the store I was working at. Neither could anyone else who worked there." Then he did his own blind test on a mate, between an Onkyo SR500 Dolby Digital receiver and a Bryston 4B 300 wpc power amp with a Bryston 2-channel pre-amp, owned by his mate.
The 'red faced' mate could not tell the difference.

The BAS, in an article discussing a CD tweak blind test by Stereophile: "In the CD-tweak test Atkinson and Hammond conducted a 3222-trial single-blind listening experiment to determine whether CD tweaks (green ink, Armor-All, expensive transports) altered the sound of compact-disc playback. Subjects overall were able to identify tweaked vs untweaked CDs only 48.3% of the time, and the proportion that scored highly (five, six, or seven out of seven trials - Stereophile's definition of a keen-eared listener) was well within the range to be expected if subjects had been merely guessing."

Then the BAS are very critical of a Hifi News analysis of a blind test of amps from 2006: "Listeners scored 63.3% correct during those trials where the amplifiers were different (95 of the 150 A-BB-A trials). However, subjects scored correctly only 65% of the time when the amplifiers were the same (26 of 40 A-A/B-B trials.) Another way of saying this is that subjects reported a difference 35% of the time (14/40 trials) when there could have been no difference."

An $1800 unnamed player (they were reluctant to name it) versus a $300 Sony, which resulted in both listeners only guessing and scoring about 50%. William Cowan stated that a sighted test beforehand had made them say "This will be easy, lets get on with the blind test". Oops! The results were 3/7 and 5/8 correct.

A test with identical CDP and speakers but different amps and cables, one set costing $300 and the other $5000. The result, with 7 listeners of varying interest in hifi and 10 trials, was a fail.

Not particularly rigorous, in that there were not enough trials, but as the poster states: "And to cut to the chase, Mike could not identify the Monster from the Opus MM with any accuracy (nor the reverse, which also would have been a positive result if he had been consistently wrong) using our testing methodology.
We stopped the test a little less than halfway through, I think we got through 8 A/Bs before we gave up."

It's an Australian forum, so the conclusion is typically forthright: "CONCLUSION: There is bugger all between the 2 preamps, they were so close that any difference could not be reliably picked." The test was run well, despite the doubts the tester had.

Various amps were tested blind in pairs, with the preferred amp going through to the next round. The winner was one of the cheaper amps, the Trends Audio TA-10 at $130, which is the tiny one on the top right of the pile.

Three ICs made with Canare, solid silver and Rat Shack cables, but dressed to look the same. Only one person could tell the difference, which is what you would expect when there is no audible difference and people are most likely guessing.

Nine participants using no-name, Belkin and Monster HDMI cables. Only one claimed to have a preference, but his feedback was inconsistent.

The cables included ones from Taralabs, VDH and Audioquest, plus DIY ones. The result was that no one could reliably tell a difference.

All three are fails by the listeners, using their own hifi systems and with their own choice of track, volume and time.

Tests set up at an audio show in Las Vegas found WAV files (52%) doing better than MP3 (33%) when compared with CD, and in a test of $2000 Sigma speaker cables vs hardware store cable, 61% of the 39 who took the test preferred the more expensive cable.
So nothing conclusive for any of the tests, but interestingly John Atkinson and Michael Fremer from Stereophile magazine were described as easily picking out the more expensive cable.

A blind test between Monster cables and Opus MM, which as far as I can find is $33,000 worth of cable, but the owner of the very high end kit and cables was unable to tell the difference.

A letter by Tom Nousaine to The Audio Critic in which he describes an ABX test of the owner of a very high end system, where a Pass Labs Aleph 1.2 200w monoblock amp was randomly swapped with a Yamaha AX-700 100w integrated amp. In the first test the owner correctly identified 3 out of 10, then 5 out of 10. His wife then got 9 out of 16 and a friend 4 out of 10. The letter is split between pages 6 and 7 of the link.

Two TVs, two Sony PS3s and a James Bond film played side by side, with the only variable being the changed HDMI cables. What is interesting is that there was little difference with the picture, but much more perceived difference with the sound. But then, many preferred the sound of the cheap cables to the expensive ones. Note - not an ABX test, and the reviewer acknowledges there could also be slight differences between the TVs and PS3s to contend with.

Like the other blind (as opposed to ABX) tests, this one found the cheapest and most expensive DACs in the final, with only a hair's width between the two in terms of sound.

Tenth paragraph down. A 'trick' blind test where a group at a consumer technology tradeshow thought they were listening to a $20,000 CDP, but were actually, happily, listening to an iPod playing uncompressed WAV files. Sight really does have a major role to play in sound!

Further to the above iPod experiment, a report from a member of the AVForums on his experience of sighted and blind listening tests at a dealer's. The conclusion comparing the tests:

"And here's what I heard.
1. All the cables sounded subtly different with one exception.
2.
Differences were less apparent with some music than others.
3. My assessment and experiences "blind" were different to my experiences "sighted"."

This is really a bit of fun, but it again shows how we hear differently sighted to blind. In this test 6 violins, three from c1700 (including a Stradivari) and three modern, were played to a group of string teachers who cast votes 1 to 3 for their preferred violin. The stage was kept dark, so they could not see which was which. The Stradivari came last; a modern brand won.

You can download and try the recordings yourself. Of those who have already, 2 preferred one, 6 the other and 10 had no preference.

A really well set out and easy to use blind test of different bit rates.

Well set up and described, but to reinforce the Australian stereotype, after one set of failed tests they admitted no one could hear a difference, gave up and drank some beers instead! (New link via a NZ forum.)

One member, MM, has recorded his scores and they are no better than random.

There were 505 listeners, producing a nicely made graph of results which is a bell curve around random, just as you would get from guessing. Yet Stereophile claim the test was a success, as some people did better than average. There could be some truth in that, as there have been blind test passes for amps. Even so, it is a very small proportion of those tested, who really need to be tested again to confirm whether or not they were just lucky. The test is not statistically significant enough to say there is an audible difference.

This provides yet more evidence that sighted and blind testing produce consistently different results, whereby people can hear a difference when sighted and cannot when listening blind.

The results are very mixed, with no cable making any clear difference.
They accept there is no objective difference, but since the difference found can easily be explained by random selection, they conclude a subjective difference is there and so, allegedly, "cables do make a difference".

This is a listening test intended to show that, as long as a modern audio amplifier is operated within its linear range (below clipping), the differences between amps are inaudible to the human ear. It is an ABX test which, to pass, requires two sets of 12 correct identifications. Reputedly over a thousand people have taken the test and none have passed.

"No. You should buy the best amplifier for your purpose. Some of the factors to consider are: reliability, build quality, cooling performance, flexibility, quality of mechanical connections, reputation of manufacturer, special features, size, weight, aesthetics, and cost. Buying the cheapest amplifier will likely get you an unreliable amplifier that is difficult to use and might not have the needed features. The only factor that this test indicates you can ignore is sound quality below clipping."

Which is a relief for those who have shelled out a lot on a nice amp.

Positive results which strongly suggest speakers are clearly different, even under blind testing conditions, both objectively and subjectively.

The writer is happy he did not pick out the cheapo speaker, but he makes no mention of whether or not the speakers were easily identified as different.

The clear conclusion is that ABX testing does not back up many audiophile claims, so they become audiophile myths: cables do not inherently change sound, and any change in sound quality comes from the listener's mind and the interaction between their senses. What is claimed to be audible is not reliably so. Blind testing is also sometimes passed off as ABX.
But blind testing is not really testing; it is a review of a product without seeing it, and that allows claims to be made about sound which have not been verified.

If hifi is all about sound, and more specifically sound quality, then once the other senses have been removed we should be able to hear differences which can be verified by identifying one product from another through listening alone. But time and again we cannot.

So you can either buy good but inexpensive hifi products such as cables, amps and CDPs and be satisfied that the sound they produce is superb (you do need to spend time with speakers, as they really do sound identifiably different), or you can buy expensive hifi products such as cable tech, luxuriate in the build and image, and identify one hifi from another by looks and sound. But you cannot buy expensive and identify it from cheap by sound alone.

Here are The Institution of Engineering and Technology's conclusions on audiophile myths, which back up the above conclusions.
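A closing footnote on the statistics (my addition, not from the original thread): the "bell curve around random" seen in the 505-listener amplifier test is exactly what pure guessing predicts, lucky high scorers included. The sketch below assumes, purely for illustration, 10 trials per listener - the real per-listener trial count is not given above.

```python
from math import comb

listeners, trials = 505, 10  # 505 from the test above; 10 trials per listener is assumed

# Expected number of listeners at each score if every one of them guesses (p = 0.5)
expected = [listeners * comb(trials, k) / 2**trials for k in range(trials + 1)]

for k, e in enumerate(expected):
    bar = "#" * round(e / 4)  # crude text histogram, one '#' per ~4 listeners
    print(f"{k:2d}/{trials} correct: {bar} (~{e:.1f} listeners)")
```

The result is a bell centred on 5/10, and under these assumptions the upper tail predicts around five listeners scoring 9 or better with no audible difference at all - which is why a few above-average scores, on their own, do not make a test statistically significant; the high scorers need retesting.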