Even better: An electromagnetically shielded, gold plated optical audio cable. Sold at Sears [1].
I stopped buying cables retail a long time ago. My favorite supplier: RiteAV. They're (IMHO) the Amazon of cables. Huge selection, modest prices [2], and great customer service [3].
Not to knock your suggestion, but I've had similarly great experiences with http://monoprice.com . It seems similar to RiteAV, but much bigger and their prices seem on par (maybe 6-10 cents more expensive):
[1] made me hysterical for a few minutes. I know this old audiophile who insisted on buying the gold-plated kind when I helped him set up his home theater. I literally spent hours trying to convince him that it didn't matter, but apparently learned behavior just beats logical arguments and evidence.
I think the reason there is a market for these kinds of ripoffs is that back in the analog era the quality of cables really mattered, and that's when the people who buy these learned what they know of tech. I don't think it gets harder to learn new ideas as you get older, but unlearning the ones that have worked for you gets progressively harder the longer you have had them. I only hope I can be more rational when I get older.
My patience for the line, "It is digital. It either works or it doesn't" ended a long time ago.
"Digital" signals are analog signals, but the only difference is that they are restricted to being above or below some threshold. But around that threshold there is a gray area where you can't really tell whether your signal will be interpreted as 1 or 0.
I am not defending the need for high-end HDMI cables. But saying "It's digital so don't worry" is wrong.
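To make that gray area concrete, here's a quick simulation of a receiver slicing a noisy signal at a fixed threshold. The voltage levels and noise figures are made up purely for illustration, not taken from any HDMI spec:

```python
import random

random.seed(0)

THRESHOLD = 0.5   # receiver decides: above = 1, below = 0
SWING = 1.0       # ideal signal levels: 0.0 and 1.0

def received(bit, noise_sigma):
    # ideal level plus Gaussian noise picked up along the cable
    return bit * SWING + random.gauss(0, noise_sigma)

def error_rate(noise_sigma, n=100_000):
    # send n random bits through the noisy channel and count misreads
    errors = 0
    for _ in range(n):
        bit = random.randint(0, 1)
        decoded = 1 if received(bit, noise_sigma) > THRESHOLD else 0
        errors += decoded != bit
    return errors / n

for sigma in (0.05, 0.15, 0.25):
    print(f"noise sigma {sigma}: BER ~ {error_rate(sigma):.4f}")
```

With low noise the threshold decision is essentially always right; as the noise approaches half the signal swing, bit errors ramp up smoothly rather than switching from "works" to "doesn't" in one step.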
There are reasons that "digital" cables use twisted pair. Wow, even some "digital" cable standards specify the number of twists per inch required for reliable operation. And wow, some cable standards even specify shielded cable is required. I wonder why.
Here is a fun exercise for those who think "digital" signals don't need engineering steps to protect signal integrity: build all of your cables (or PCBs, for that matter) with straight, untwisted, unshielded signal runs, and you will have an excellent time trying to debug why your communications don't work. Bonus fun for making the cable length 1/4 of your signal's wavelength.
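For a sense of scale, a quarter wavelength at HDMI-ish signaling frequencies is only centimeters. The velocity factor below is an assumed ballpark (it depends on the cable's dielectric), so treat these as rough numbers:

```python
# Rough quarter-wavelength arithmetic; velocity factor is an assumption.
C = 299_792_458          # speed of light in vacuum, m/s
VELOCITY_FACTOR = 0.7    # signal speed in cable as a fraction of c (ballpark)

def quarter_wave_m(freq_hz):
    wavelength = C * VELOCITY_FACTOR / freq_hz
    return wavelength / 4

# e.g. a ~1.5 Gbps serial lane has a fundamental around 750 MHz
for freq in (100e6, 750e6, 1.5e9):
    print(f"{freq/1e6:.0f} MHz: quarter wave ~ {quarter_wave_m(freq)*100:.1f} cm")
```

At 750 MHz the quarter wave works out to roughly 7 cm, which is why connector and cable geometry matter at these rates.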
I think the point of the "either works or it doesn't" is a comparison versus analog. With an analog signal you could get slightly better results by getting better cables (or worse with worse cables). If you have a poor digital cable it flat out doesn't work. If the cable you're using works, buying a more expensive cable isn't going to make it work any better. Either the 1s and 0s made it from the source to the destination or they didn't.
No one is claiming that there isn't engineering that needs to happen to make a digital signal reliably travel from the source to the receiver.
For HDMI in particular, there's no error correction at all on the video data (though there is on the audio and control data).
So if one bit gets flipped in the video data as it goes through the cable, you're going to see that wrong bit on screen. You'd have to have an insanely high bit error rate for your eyes to notice it, but it's there, and it gradually gets worse as the error rate increases.
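Some rough arithmetic on what a given bit error rate means on screen. The assumptions here (1080p60, 24 bits per pixel, active video only, no blanking) are mine, just to get an order of magnitude:

```python
# Back-of-the-envelope: how often does a given bit error rate (BER)
# put a wrong pixel on screen? Assumes 1080p60 at 24 bits per pixel,
# counting active video bits only.
width, height, fps, bits_per_pixel = 1920, 1080, 60, 24

video_bits_per_second = width * height * fps * bits_per_pixel

for ber in (1e-12, 1e-9, 1e-6):
    errors_per_second = video_bits_per_second * ber
    print(f"BER {ber:g}: ~{errors_per_second:.4f} flipped bits/sec")
```

At roughly 3 billion video bits per second, a BER of 1e-12 means a bad bit every few minutes (invisible in practice), while 1e-9 already means a few wrong pixels every second - a gradual degradation, not a cliff.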
What could flip a bit in a cable? Any number of things. EM interference from outside a badly shielded cable. EM crosstalk between conductors inside a badly designed cable. Signal reflections at badly designed connectors. Basically anything that can happen to any other cable, whether the signal that goes over it is considered to be analog or digital.
I assume the reason HDMI applies EC to the audio and control data is that errors in those channels are more noticeable to the user, whereas errors at reasonable rates in the video will mostly sneak by.
I've had this happen. You see what are called "sparkles" when you get a lot of failures (and it truly looks like sparkling glitter on your TV). It turned out to be the TV's connector more than the cable itself. The same $4 cable worked fine on a replacement set.
Totally true. HDMI takes many steps to try to reduce errors to a reasonable rate. In practice, this rate is usually low enough to ignore, even with cheap cables.
I was just making the point that there is indeed a progressive degradation in image quality with increasing error rate, even in HDMI.
IMHO, seeing sparkles on the screen means 'does not work.' Hearing audio errors means 'does not work.' When pushing a digital signal around, if you have significant errors you'll know quickly at the receiving end that the cable is bad. If you hook up your cheap cable and don't notice any of these errors, getting a better cable isn't going to make the picture sharper or the sound clearer like in the days of pushing analog signals around.
Clearly you've never had an Ethernet cable that got caught in a door. Or was terminated poorly. Or doesn't have the insulator sheath under the crimp. Or was run parallel to a set of fluorescent light ballasts. All of these can reduce performance on Ethernet without causing it to fail outright.
There is a reason why every single TIA/EIA 568 cable plant is carefully tested for things like Near End Cross Talk (NEXT), latency, skew, etc.
There is also a reason why there is an entire _discipline_ dedicated to ensuring that the "Digital" signal not only has a channel, but a meticulously crafted one. (See: http://www.linktionary.com/t/tia_cabling.html for details)
To reiterate the OP's point - there is no "digital signal" - signals are analog in nature, and they need an analog channel to carry them. That analog channel needs to be engineered carefully to get the lowest BER (bit error rate) possible.
What's interesting is the failure case of the digital codec on the other end - some may reduce the transmit rate in the presence of an increased BER resulting from dropped packets, while others may seize up and cease to function. In the degraded case, having a pristine analog channel from the transmitter to the receiver will certainly result in a better digital signal.
There is also the issue of future proofing - we're installing an entire plant of Category 6a cable right now in one of our data centers for a gigabit environment. Category 6a is like the "Monster" version of Category 5e/6 - it seems like a douchey way to do a network install, until you consider the following: (1) this cable plant will be in use for 10-15 years, and (2) we're testing a bunch of Nexus 5010s that will likely make good use of that 6a for 10 Gigabit. So, in addition to passing _today's_ signal flawlessly, it will also pass tomorrow's. This might be an important factor if your HDMI cable is being run through hard-to-access cabinets/walls.
Now, with all that said, it is still probably the case that the $150 Monster cables (under normal circumstances) don't have a lower BER than a $10 cable, but in the face of RF interference they may actually perform better. I don't know, but I wouldn't discard the possibility without a bit of research and education.
Anything running over Ethernet has packets with checksums. Yes, noise will cause packets to drop, and that results in lower overall throughput, while connectivity still remains possible.
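For the curious, here's the idea in miniature, using Python's zlib.crc32 as a stand-in for Ethernet's frame check sequence (the real FCS is also a CRC-32, though computed at the link layer):

```python
import zlib

# Ethernet frames carry a CRC-32 check value; a corrupted frame fails
# the check and is dropped, and higher layers (e.g. TCP) resend it.
payload = b"some frame payload"
fcs = zlib.crc32(payload)

# flip a single bit "in transit"
corrupted = bytearray(payload)
corrupted[3] ^= 0x01

print(zlib.crc32(payload) == fcs)            # intact frame passes the check
print(zlib.crc32(bytes(corrupted)) == fcs)   # corruption detected, frame dropped
```

That drop-and-resend mechanism is exactly why a noisy Ethernet link shows up as reduced throughput rather than visible corruption.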
The signal running over HDMI is not packet-based. You don't have any mechanism like with TCP where packets can get dropped and resent later. So if interference causes bits to flip on an HDMI connection, you will immediately see the interference on your TV, and it won't be pretty; it would probably look something like an HDTV antenna signal that loses reception--missing or discolored chunks of the screen or sparkly pixels. The effect is immediate and disruptive.
The end result is that Monster Cable might do a better job of keeping your picture intact during a freak electrical storm, but there is no practical difference otherwise. If you are able to see a picture with HDMI, and you don't expect any wildly varying levels of RF interference in your environment, your cable is as good as anything else.
Again, nobody is disagreeing. This isn't a defense of Monster, it's a typical It's More Complicated Than That jibe. Don't buy Monster unless that flowchart checks out, but also realize that it's not impossible to see signal degradation on those cheaper digital cables.
A $10 cable should fail before a $250 cable, however the frequency of failures for an infrequently moved or manipulated cable should still give it a longevity of a few years.
A $10 cable won't fail 25 times within the lifespan of the technology it's attached to. HDMI will be obsolete before you see 25 failures, unless you enjoy hammering nails into your cables or some other bizarre behaviour.
I think most people will gladly pay more for a better cable, but they'd like to pay something commensurate with the engineering effort, and production and material costs. The markup is insane (nearly $200 apparently).
One has to wonder if Monster and their retail partners would make just as much money selling them at $50, due to higher volume, with the added bonus of people not totally misleading everyone with the value proposition.
I spent about $5 more on my HDMI cables than I would have to get ones with a heavy-duty rubberised outside instead of the cheap and tacky bendy-plastic that breaks easily from over-manipulation.
You're right: people will pay more for better quality, but misleading people into believing something is better quality through outright lies actually violates most consumer protection laws. I've seen no $250 Monster cables at Best Buy here in Ontario - the most expensive I saw was a $60 10 ft cable by Monster - but the Consumer Protection Act is very strong and very strict here. Best Buy has screwed me over before, and the mention of the CPA to a manager gets the problem resolved quickly. It appears they know they're in the wrong in many cases, but people let them get away with it - in my experience, usually on their extended warranties.
You're misreading what I was saying. In an HDMI setup, if you use a cheap cable and the picture and sound are fine, buying a more expensive cable isn't going to change anything. With HDMI in particular, if the cable is failing you'll know right away, with either a poor picture or poor sound. It won't have the gradations that were possible when pushing an analog signal around - i.e. the picture looks good, but it could be sharper with a better cable.
I'm not sure why you brought Ethernet into the mix, but generally you're talking about much longer runs than HDMI, and it's often not as easy to see when you are losing packets without specifically checking for drops (IMHO, poor network performance IS failing outright, just not as easily seen as a bunch of blocks on the TV). Even then, the cable you use is either delivering all the packets or it isn't (up to an acceptable level of errors/loss for the length you are running). Delivering a stronger 1/0 isn't going to change anything as long as it's strong enough for the receiver to differentiate. That's where "it either works or it doesn't" comes from.
All signals are analog. Every single one. (Actually, I take that back. With the exception of quantum computers, to date, all signals are analog).
A "digital" cable doesn't just work or not work. It is possible to receive spurious bit errors for many reasons. For example, you turn up the volume to your awesome surround sound speakers, which causes more current to flow through your speaker wires that happen to be running parallel to your horrible HDMI cable. The increased current causes interference in your low-power HDMI cable, causing bit errors in the transmission.
>> "It is possible to receive spurious bit errors for many reasons."
Surely that's why there's error correction layered on top?
Instead of arguing about what might happen, why not find an HDMI device that'll show you how many unrecoverable bit errors it has detected in the stream... Can't be that hard to do.
If my hard disk can send data over a tiny crappy unshielded cheap USB cable without a single bit error, I'd be willing to bet money that an HDMI cable isn't that much different.
ugh, I hated signals theory at uni... but let me dust off that knowledge...
> All signals are analog.
I see your reasoning in saying this ("on the wire", etc.), but you're incorrect. Not all signals are considered analog (which is important to the point being made).
Analog signals are generally susceptible to noise because they are usually non-recoverable; error checking is hard on a continuous, non-discrete signal.
>It is possible to receive spurious bit errors for many reasons.
Yes, which is why error checking exists in all these systems. In the context of the wire alone, yes, it is not a case of "it just works". But in the case of the communications subsystem (i.e. the transmitter, medium, receiver) it really does either work or not work :)
Ultimately, though, the phrase "it's digital; either it works or it doesn't" is making no comment about the actual signal. It's really saying: an expensive cable is not, thanks to the digital subsystem, likely to make any difference to error rates (or more succinctly: the performance of the error correction exceeds the noise effect of the transport medium).
Hm, no idea, but I find that very hard to believe. I suspect digital transmission of information is almost impossible without error correction. IANAEE, though (I am not an electronics engineer).
Note, it's traditional on HN to add the phrase [EDIT] when making substantive modifications to comments, particularly if someone has already responded to your original comment.
The point is not that you can use anything, the point is that you only need a certain level, and then it's good enough. All my HDMI cables are from monoprice.com, I don't think I paid more than $4 for a single cable, and I can't imagine what kind of EM interference I'd need to mess with them. I have so many electronics in my house I'm probably on a watch list.
I guess the argument should have been that once you cross a certain threshold, cable quality doesn’t matter. That was of course also true for analog signals (human senses are only so good), but the transition from good signal to bad signal might have different characteristics with digital signals.
While an image from an analog signal will only get noisier and noisier as signal quality decreases, images from a digital signal will first show artifacts and then just stop displaying altogether. That's a rather sudden transition. One second the signal is there, the next it's gone. Something like that just doesn't happen with analog signals. (I'm basing this on my experience with terrestrial digital broadcasting - it could be totally wrong for HDMI connections, but I doubt it.)
I would guess that how sudden that transition is has something to do with the error correction and the codec of the signal. How many wrong bits can the error correction reconstruct? And, as soon as the wrong bits get past the error correction, how many wrong bits can the decoder handle and still reconstruct an image (albeit one with artifacts)? It would certainly be interesting to know how that problem was solved for HDMI connections. If you know the maximum bit error rates for your cables, figuring out which are the right ones should be easy, given enough testing :)
I think "It is digital. It either works or it doesn't" in the context of A/V refers to the quality of the converted analog signal - what comes out of the speakers or is visible on the screen. And the signal getting through can either work or not, no matter how well engineered the cable is, isn't that correct?
But the reason your digital TV shows the correct screen is that the cable transmitted the "digital" bits between your DVD player and the TV correctly. If you have a poorly made cable you will get bit errors. Electromagnetic interference from other A/V equipment could also cause bit errors. A well made cable with twisted pairs, proper shielding (few people do this correctly), and well formed connector crimps will prevent those bit errors. With bit errors, what your awesome Blu-ray player decodes off of the Blu-ray disc will not be what is shown on the screen (assuming there isn't bit error correction in the TV, etc.).
And your speakers are playing an analog signal. Cable quality matters even more there.
Yup, the correct signal might not get through if you don't have perfect shielding. And a speck of dust on a CD could theoretically ruin a whole song.
That's why CDs (and cellphones, modems, and countless other digital devices) use channel encoding.[1] That way you don't have to have a perfectly noise-free signal to reconstruct the original information.
I don't know that many specifics of the DVD or Blu-Ray standards, but I'd put money on them using Reed-Solomon or something similar.
> But the reason why your digital TV shows the correct screen is because the cable transmitted the "digital" bits between your DVD player and the TV correctly. If you have a poorly made cable you will get bit errors.
Yes. Which is why the poorly made cable can be said to not work.
> And your speakers are playing an analog signal.
Huh? I've never heard of HDMI cables capable of carrying analog sound.
Compact discs use something far better (in that context): a combination of two interleaved Reed–Solomon codes called CIRC, which is error-correcting (not just error-detecting like a CRC). Every CD player has to implement it in order to read a disc. It's designed to be strong against burst errors, which is why it can handle scratches on a CD pretty well. Samples can also be interpolated if something's really unreadable.
Even with an error-detecting code, there comes a point where there's too many errors to know if you received a valid code word or if the errors just "canceled each other". If 000 and 111 are your only valid words, it's still possible that a 111 gets turned into a 000 (3 consecutive errors) and there's no way to know about it...
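The 000/111 example is a 3-bit repetition code, which you can play with in a few lines. Note how a single flip is corrected, but three flips produce a perfectly "valid" wrong answer:

```python
def encode(bit):
    return [bit] * 3                     # 0 -> 000, 1 -> 111

def decode(word):
    return 1 if sum(word) >= 2 else 0    # majority vote

def flip(word, positions):
    # flip the bits at the given positions (simulated channel errors)
    return [b ^ (i in positions) for i, b in enumerate(word)]

word = encode(1)                         # [1, 1, 1]
print(decode(flip(word, {0})))           # 1: single error corrected
print(decode(flip(word, {0, 1, 2})))     # 0: three errors look like a valid 000
```

Real codes (Hamming, Reed-Solomon, etc.) are far more efficient, but they share the same limit: past some number of errors per code word, the decoder is confidently wrong.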
There probably are CRCs run during the decoding process, but CRCs only tell you "yes this is correct" or "no, something went wrong."
There are additional protection schemes that can detect and correct bit errors. Those are probably also run during the decoding process. The problem is that you can only correct so many wrong bits in a given word, and beyond that no correction can be made. That is probably why the damaged disc half works.
Yes, it depends on the way you handle the response of the CRC; you could say "give me the data, I don't care" or "please try to read it again, it's wrong".
In my opinion, that just proves it's not as simple as "it just works or not", but rather: depending on multiple factors, such as the error correction and the way your algorithm works on the receiver, it may "half work" no matter what kind of cable you plug in.
There is still a qualitative difference between digital and analog signals. With analog signals, you can always get higher fidelity by upgrading components (until something else becomes the bottleneck). With digital, there are gradients of "wrong" but there is only one "right." Once you hit the threshold of bits not being flipped, getting more expensive and higher-quality components will not get you better-sounding playback. A digital cable can achieve literally perfect playback. Analog cannot reach perfection, which means it's always possible to improve.
I'm not an electrical engineer, so I don't know where the threshold of perfection lies, but if Consumer Reports can't tell the fancy cable apart from the Radio Shack equivalent of Two-Buck Chuck, I'm not going to worry about it.
Not really. I have two $12 25-foot HDMI cables that work just great. I bought the cheapest ones I could find because, hey, if it doesn't work I'm only out $12. But they've been working great for a couple of years now.
"The real time to worry about the quality of those cables is when you go over 6 feet," would make sense as something said to the cable manufacturer. Translated to consumer-speak, that would be "it's really quite hard to find cheap HDMI cables longer than 6 feet from a well-known brand." (Of course, shady brands will just sell the crappy cables and rely on people's innate laziness and stores with no-refunds policies.)
It takes a lot of noise to alter the digital signal like this. If we're talking long distance, like a telephone line, the signal gets weak enough that it can happen regularly, but on the cable from the DVD to the TV the voltage drop is ridiculously low. Noise doesn't affect your signal by 4 volts unless you have a serious problem with your system.
Actually, I think that it takes the exact same amount of noise to flip a bit as it does to transmit a bit.
Through the principle of superposition, if you transmit a 0 with X watts, and I transmit a 1 with X watts, our combined signal will be somewhere in the middle, which can cause a bit flip.
I am not talking about voltage drop from end to end, which is a separate issue. I am talking about interference occurring adjacent to any point on the cable that you pick.
OK, when it comes to HDMI, the goal was always good quality. The difference between analog and digital is that digital ENCODES the information; it either gets there or it doesn't. The only thing the cable gives you is bandwidth.
Analog is different. The signal is itself the image, so a poor signal = poor image.
MAYBE if you have a projector projecting onto a movie theater screen, a better HDMI cable will matter... maybe? Because of bandwidth limitations, if the sender is smart enough it might send info at a lower resolution. Other than that, I can't see theoretically why a good HDMI cable would help.
Also, on my 42'' TV, I've yet to be shown a performance improvement when connecting my Xbox 360 via component or HDMI. It looks identical. And if HDMI is better than component, then I don't see a reason why you would ever need a good HDMI cable unless you are a movie theater and the theoretical problem does exist.
Edit regarding bit errors: let's assume that the cable you bought is a good-quality cable. By good quality I mean shielded, not cut, and with no shorts - a reasonable expectation. I've yet to see an argument for buying more expensive, super gold-plated uber network cables. Hey, why is your network cable, which handles SO MUCH DATA, not gold-plated and shielded?
My network cable that handles all my data doesn't have gold plating or shielding because the cable as designed is sufficient to operate at Ethernet speeds, temperatures, humidity, etc.
If you want to operate a faster network, or one in a harsh environment, you will be using shielded cables with gold-plated contacts and more twists per inch.
I bought some cables from http://www.bluejeanscable.com/ a few months later. I needed to make a few longer than normal runs, and they had reasonable prices and were direct about the capabilities of their different products outside of spec.
There's a bit more to it than that. An HDMI cable contains three serial data links, each operating at roughly 1.5 Gbps. At rates that high, some shielding is required, or the cable will pick up enough noise to damage the signal. The HDMI link does have some protection built in (contrary to a comment on that site): 8 data bits are transmitted using 10 bits on the wire, an encoding chosen to make the signal more robust, so minor errors can be tolerated.
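The ~1.5 Gbps figure falls out of simple arithmetic if you assume 1080p60's standard 148.5 MHz pixel clock and 10 wire bits per 8 data bits:

```python
# Rough arithmetic behind the "~1.5 Gbps per lane" figure, assuming
# 1080p60 with its standard 148.5 MHz pixel clock.
pixel_clock_hz = 148_500_000   # 1080p60 pixel clock
bits_on_wire_per_pixel = 10    # 8 data bits encoded as 10 wire bits per lane

lane_bit_rate = pixel_clock_hz * bits_on_wire_per_pixel
print(f"per-lane bit rate: {lane_bit_rate / 1e9:.3f} Gbps")
```

Each of the three data lanes carries one color channel at this rate, which is why cable quality starts to matter at full HD resolutions and beyond.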
The important thing, and the central argument of that article, is that with a digital link, once there are no bit errors, no further improvements can be made, and even a cheap cable is plenty good enough for a short HDMI connection. Gold-plated connectors are still a good idea. Corrosion is as much of a problem there as anywhere.
Should you use a truly awful cable, and those do exist, you will probably notice an occasional "spark" on the display.
The graphic is totally useless, the same information could have been conveyed in 30 lines of text. An "infographic" should have some kind of graph or chart or something that adds insight, not a picture of 9 playstations and a dopey flow chart.
Second order effects, of course. If Samsung ships a cheap HDMI cable in the box, it's very little skin off their nose; they raise the price by their cost, about $2, and even in the competitive TV market that's probably absorbable to some extent.
But what happens next? They ship their TV to Best Buy, who quickly notices the TV ships with an HDMI cable, "robbing" them of the chance to sell a Monster cable. Or worse, the customer might still buy one, get them both home, notice there's no difference (or at least no $400 difference) and return the Monster cable. Ack! That's a $100+ profit opportunity that Samsung is costing them. How will Best Buy make up for it? They're going to jack the price of the Samsung TV up by some fraction of the $138 that they can no longer make. Not necessarily the whole amount, it would be prorated based on the average rate of Monster purchase; Best Buy has its own problems if it jacks the price of a TV up as it is a competitive market for them, too!
In fact, it is so competitive that it is not hard to imagine that Best Buy will discover they can't effectively sell that TV for ~$50 more (guessing) and just plain take it off the market. I am not saying this is inevitable, just that it is definitely a very possible outcome. If they don't, the very-price-sensitive American consumers will certainly notice the price difference in sufficient numbers to reduce sales of that TV.
So, how does this feed back to Samsung? They put in an HDMI cable, and either their retail sales drop, or Best Buy drops their TV entirely.
So, whose fault is this? Money grubbing Samsung? Money grubbing Best Buy? Well, the capitalistic model tends to assume that the customer is informed, and when the customer is not informed, they can be scammed. Here, the customers are not informed. So I split the blame between Monster, who are aggressively lying to customers, and the customer base itself. (Best Buy to some extent here too, for the same reasons as Monster.) Not one, not the other, both. Too many audio/videophiles will aggressively defend their purchase of expensive cables, even after it is explained that their justification is technical gibberish, and as the market leaders they deserve some of the blame.
The other reason I assign blame this way is that if you fix the root problem - a critical mass of customers actually believing expensive cables work better - the rest of the problems go away. Best Buy raises the prices of all TVs a bit to make up the profit margin, as do all similar retailers. (Most are working on single-digit % profit margins from the top corporate POV, so if they lose something like the Monster cable they will need to make it up elsewhere.) Customers don't even notice, because TVs continue to work like computers with constant price drops, so it manifests as a brief interruption in otherwise-falling prices rather than a huge, visible increase. Samsung starts sticking cheap-but-effective cables in, and instead of being punished by the retailers, customers reward them with happy thoughts about good service and a good out-of-box experience (which are hard to quantify but certainly produce bankable assets in the end).
The other option: When buying a TV through a different channel where there isn't a good Monster upsell opportunity, perhaps something like Amazon (which still has the upsell opportunity in some sense, but will present customers with many 1-star reviews and give them a chance to become informed properly), ship an HDMI cable then.
Second order effects, second order effects, second order effects. Always think second order effects when thinking about economics. It's never the simple story, the economic entities react to each other.
Assuming my discussion above is correct, which is a big assumption, yes.
One of the other consequences of misinformed consumers is sometimes merely being an informed consumer allows you to sponge off a bit of value from the uninformed ones too. See also "being in the small fraction of people that turn in their rebates"; everybody bought based on the advertised price, but only you and a few others paid it. (If you're like me, you properly factor in time-to-prepare-rebate and risk-of-no-return, though I personally have had very good success. I don't bother with $2 rebates; take out the stamp, the time, the risk of missing it, and you end up with little or possibly even negative. But I've scored $50 rebates for things I might have paid full price for willingly anyhow. That's a win.)
A) Not everyone is going to use an HDMI cable. (Wrong length, already have one, etc. Ex: how often did you use the phone line that came with a new phone?)
I think it's mostly a combination of B and C. I have enough monitor and pc power cables to last me the rest of my life at this point but they still package new ones whenever I buy a power supply or monitor.
There is only one "good" (dubious) reason to buy a really expensive cable, and that is range. There is a performance difference between the cheap and gold cables, due to the different conductivities. However, over short distances this isn't observable, as the receiver will correctly interpret the digital signal even when there is some noise on the line - this is the beauty of digital signals. It is there or it isn't, and with decent ECC on the line, which I'm assuming HDMI has, even with a fair bit of degradation it won't be an issue.
However, as you extend the cable, the signal will degrade at a rate that depends on the quality of the cable, and eventually you will reach a distance where the digital signal, even with ECC, is lost, and it will no longer work. With a higher-quality cable this range is greater (we are talking ranges in the 10s to 100s of meters) - certainly no issue for a normal TV setup, and you would have to question why you even needed such a long cable to start with; move your source/receiver closer to each other.
The only other factor which might show up is the sensitivity and power of the transceivers, and you might find a cable of 100m works on some gear and not on others, but if you do, you really are using the wrong tool for the job - go buy an optical repeater :)
I'm guessing on distances here, but you get the idea. I used to work in high-frequency data transmission systems; I have no idea of the detailed specs of HDMI, but this should give you an idea.
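The distance argument above can be sketched as a toy model. All numbers here are invented for illustration; real cable specs quote attenuation in dB per meter at a given frequency:

```python
# Toy model: the link works while total cable attenuation stays under
# the link margin. Attenuation and margin figures below are made up.
def max_length_m(atten_db_per_m, margin_db):
    return margin_db / atten_db_per_m

cheap = max_length_m(atten_db_per_m=1.0, margin_db=20)   # lossier cable
good = max_length_m(atten_db_per_m=0.5, margin_db=20)    # better cable

print(f"cheap cable works to ~{cheap:.0f} m, better cable to ~{good:.0f} m")
```

The point being: within either cable's working range the picture is identical; the better cable only buys you extra reach, not extra quality.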
Except when you're buying a new TV and you're at the checkout, you realize you need a cable to finish your hookup. This is why you never see a $10 cable in a retail store, like the poster above mentioned.
It's all about the psychology of needing it now instead of being able to wait 4 more days.
This is why you use monoprice.com. Every time you need a cable, you actually order 4. It's still 1/4 the price of buying it from a retail store, and then when a friend needs a cable for that new tv he just bought, you can hand him one, for 1/4 the price of what he'd pay in stores, and he thanks you, AND you've just paid for your own cables.
Really, just don't ever buy a cable in a store. Ever. Getting a TV? Plan it ahead of time. Buying new tech? Make sure you have the cables already.
We're talking about an average 20x price markup on these cables, and sometimes significantly more (network cables I'm looking at you!).
Knowing your options is the first step in not getting screwed.
The original poster is precisely correct from my experience.
The only time I purchased a cable online was when I bought someone a gift that required the HD cable.
All the other times, we ran around the corner to Shopko and paid whatever their price was because we had a new HD system setup and wanted it in HD. Now.
I really don't think it's a casual purchase for most people, and I certainly never considered buying more than one at a time.
And when you've paid $150 for a Disneyland park-hopper pass, that $12 slice of pizza and small soda doesn't seem that bad either, but you're still being taken to the cleaners.
Does your city have a Fry's electronics? Last time I was there, they had HDMI cables on sale for something like $4; I bought a gaggle of them.
If you don't have a Fry's, you're missing out on something wonderful. Every Fry's has about a hundred people who absolutely and completely despise every single aspect of their jobs, and have been forced to wear white button-down shirts with ties.
Fry's sells everything: from green laser pointers, to 2-post telco racks, to embedded computer parts, to washing machines, to televisions, to cheap Chinese toys, to candy bars, to tasers and pepper spray and home security gear.
And I've had the same experience as you. I've never seen a store so stocked with salespeople (and 70 cash registers!) but with so little interest in the process of selling and supporting material.
But my original point was that the typical person sitting at a checkout with an HDTV doesn't know what the HDMI cable does, much less that they can buy it at Fry's or Monoprice for 25x cheaper.
HDTV was new enough to me the last time I made a purchase that I assumed an HDMI cable would be in the box. It wasn't, and next time I will plan ahead. Not everyone will do that, so maybe a little same-day cable delivery business would earn an enterprising college student a few extra bucks for beer. It could work where the population is sufficiently dense.
That's unfortunate. I guess it goes to show how much those "Hacker Safe" badges are worth. But how do you know they store your password as plaintext?
I was talking about the way they handled it in general. They didn't email any customers, and they still don't know which cards/orders were affected. I think they should at least be able to tell me which customers have been complaining (e.g. those who made orders between DATE_A and DATE_B).
Also, they deleted data that they previously assured me would be kept and accessible (the last 4 digits of credit cards, so that I can check the appropriate bill). It took me over a week and at least 5 different CSRs to find this out; I was given the runaround before that. Live support and phone support both told me they couldn't help, and claimed that the email support team was the only group that could. Over a week later, I still have not received a response.
It's rare to see even a $10 one at retail. Often the lowest-priced one is $20-25, and there are a bunch of $40 options. And that's not even counting Monster, which this article is singling out.
At the local Fry's (in Austin), there is a section of HDMI cables by the TVs and DVD/Blu-ray players. Cheapest in that section is about $20-$25 as you mention. However, over on the other side of the store, near the computer components, there is a section full of cables (coax, etc.). They have HDMI cables over there too - for ~$10 or so. I guess in some stores you just have to know where to look.
I've read exactly one semi-scientific comparison of popular DVI cables of varying price. Amazingly, the conclusion was that Monster cables really are better. However, the difference had no effect unless you were running a high-bandwidth signal (>=1920x1080 at 60Hz) over a long distance (>=12 feet? I don't recall exactly). The cheaper long cables would work for low-bandwidth signals, but would lose sync and fail at high res.
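For perspective on why that resolution is the breaking point, here's a quick back-of-the-envelope in Python. The timing totals (2200x1125 including blanking) are the standard CEA-861 figures for 1080p60, and the 10-bits-per-8 encoding is TMDS; neither comes from the comparison itself, so treat this as a rough sketch:

```python
# Rough estimate of the bit rate a DVI/HDMI cable must carry at 1080p60.
# Assumed timings: 2200 x 1125 total pixels (1920x1080 active + blanking).
total_h = 2200   # total horizontal pixels per line
total_v = 1125   # total lines per frame
refresh = 60     # Hz

pixel_clock = total_h * total_v * refresh           # pixels per second
print(f"pixel clock: {pixel_clock / 1e6:.1f} MHz")  # 148.5 MHz

# TMDS encodes each 8-bit color component as 10 bits, one component
# per differential pair, so each of the three data pairs carries:
bits_per_pair = pixel_clock * 10
print(f"per-pair rate: {bits_per_pair / 1e9:.3f} Gbit/s")  # 1.485 Gbit/s

total_rate = bits_per_pair * 3
print(f"aggregate: {total_rate / 1e9:.3f} Gbit/s")         # 4.455 Gbit/s
```

Nearly 1.5 Gbit/s per pair is fast enough that cable losses genuinely start to matter over long runs, which is consistent with the cheap cables only failing at high res and long length.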
I suspect Monster has it in their catalog for government contract fulfillment where they can just drop it in without any review of cost when A/V components are ordered (seriously).
It's not aimed at people who actually look at price tags.
Have you ever gone into a Best Buy or similar and seen the cable selection? The "cheap" cables are $50, and the Monster cables are sold right beside them. Many people compare prices on the TV before coming into the store, but are not armed with good information when the salesperson tells them that their $1200 TV is worthless without $250 in cables.
It's more insidious at a Best Buy where large purchases like HDTV with accessories are often financed and the salesperson says they can have the best without paying for it (right now).
The average consumer has no clue, and doesn't want to learn, that digital isn't like analog. All they know is that with a poor signal they get nothing on their TV (or a blocky picture), whereas on their old TV they at least got a viewable picture with some noise.
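That "works perfectly, then falls off a cliff" behavior is easy to see in a toy simulation. This is not any real HDMI coding, just bits sent as +/-1 volt with Gaussian noise and a sign-based decoder:

```python
import random

def ber(noise_sigma, n_bits=20_000, seed=1):
    """Send bits as +/-1 volt, add Gaussian noise, decode by sign.
    Returns the fraction of bits decoded incorrectly."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(n_bits):
        bit = rng.choice((-1.0, 1.0))
        received = bit + rng.gauss(0.0, noise_sigma)
        if (received > 0) != (bit > 0):
            errors += 1
    return errors / n_bits

# Mild noise: essentially every bit decodes correctly -> perfect picture.
# Heavy noise: errors appear abruptly -> blocking and dropouts, not the
# gentle "snow" an analog set would show for the same interference.
for sigma in (0.1, 0.3, 0.5, 0.8):
    print(f"noise sigma {sigma}: BER = {ber(sigma):.4f}")
```

At low noise the error rate is indistinguishable from zero, then it climbs steeply once the noise approaches the decision threshold: the digital cliff.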
So what if some dude with extra money pays a premium for a brand name? If it wasn't for them helping with the profits, Best Buy wouldn't be able to sell us a $400 dual-core laptop at a razor-thin margin.
An excellent place for purchasing AV and other sorts of cables at sane prices is monoprice.com. I've been very happy with both the pricing, which is consistently the lowest I've seen, and the construction quality. Sane pricing does not imply any loss in construction quality, nor does the converse hold, though US consumers are conditioned to treat price as a signal of quality to varying extents.
Unless you know exactly why you need the expensive one, you don't need it.
Unless a reasonable-length cable is damaged or badly constructed in such a way that it distorts the transmitted waveforms so much that they're not recoverable, any reasonable conductor will work. This stuff hasn't been rocket science since about a year after RS-232 was invented.
My personal favorite store to get cables from: Fleet Farm.
Typically, you can find almost any kind of cable for $5-$10, some of the common ones for less. I think my 25-foot flat ethernet cable came in at a whopping $12, but it's been the most durable I've owned.
They've also got rational pants, which my wife in particular likes.
I always thought that the "high end" cable market was a rip-off. Even as someone who loves high-quality audio, it's REALLY hard to tell the difference between a moderately priced cable and a high-end one.
The idea of basing expensive audiophile equipment on unbalanced audio seems silly; if they really cared about quality, wouldn't they be using balanced signaling?
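The point about balanced signaling can be shown with a few lines of arithmetic. The signal and hum values below are made up for illustration; the mechanism (send +s and -s on a pair, subtract at the receiver) is the real one:

```python
# Toy illustration of why balanced (differential) signaling rejects
# interference: noise that couples equally into both conductors cancels
# when the receiver takes the difference. Sample values are arbitrary.
signal = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]
hum = [0.3, -0.2, 0.4, 0.1, -0.3, 0.2, -0.1, 0.4]  # common-mode noise

# Unbalanced: one conductor plus ground; the hum adds straight in.
unbalanced_rx = [s + n for s, n in zip(signal, hum)]

# Balanced: the signal goes out as +s on one wire and -s on the other;
# the same hum couples into both wires, then a difference amp subtracts.
hot = [+s + n for s, n in zip(signal, hum)]
cold = [-s + n for s, n in zip(signal, hum)]
balanced_rx = [(h - c) / 2 for h, c in zip(hot, cold)]

print("unbalanced error:", max(abs(r - s) for r, s in zip(unbalanced_rx, signal)))
print("balanced error:  ", max(abs(r - s) for r, s in zip(balanced_rx, signal)))
```

The unbalanced receiver sees the full hum; the balanced one recovers the signal almost exactly, because anything common to both wires drops out of the difference.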
1. http://www.sears.com/shc/s/p_10153_12605_05750807000P?vName=...
2. HDMI cables start at $3.50 - http://www.riteav.com/hdmi-cables-c-141_147.html?osCsid=cjmg...
3. I've never seen another retailer with as high a satisfaction rating: http://www.resellerratings.com/store/RiteAV