I always had Intel CPUs, simply because I watched my friends try to be "smarter" than everyone else and melt their cheapo AMD rigs on a yearly basis when they overclocked them to keep up. Ofc they didn't want to spend the difference on serious cooling... Mind you, they did the same to cheaper Intel CPUs as well. You could cook an omelette on an AMD, but you could sear a steak with a Celeron :D
I still remember the fights in the classroom about someone clocking their AMD 20% higher than factory, and pissing on Intels cos you could only get them up about 10-15% before you needed serious cooling (and the performance gains dropped off too). What they always failed to mention was that their AMD rig only performed 5% faster with the +20% boost and kept crashing every 10-15 minutes, while clocking up a mid-to-high-end Intel by 5-10% gave you an almost 1-to-1 gain in performance, stayed stable, and needed no changes to the cooling.
I'm talking about the years 2000-2005 ish. I have no idea how today's bunch would handle a 10-15% OC, and I couldn't care less really.
When all is said and done, if you have a clean system with something like a 3.5 GHz i5 or an AMD equivalent and a program doesn't run flawlessly, it's most likely the programmer's fault, and it wouldn't run any faster with an OC. Programmers got way too laid back and forgot how to optimise, while raw computing power hasn't improved that much in the last decade, so the need for optimisation is still there.
AMD (Ati back in the day) and NVidia graphics cards are the same deal. You simply get what you pay for, or you pay the price later when you try to get more out of the cheaper stuff. I had a few of the early Geforce cards, then switched to Ati because I liked the cheapness. And boy, they really were cheap when it came to build quality and driver support. I had Asus stuff mostly, with the exception of an MSi and a Sapphire card. All were crap in some way, and performed worse in proportion to how much cheaper than the NVidia cards they were. Ofc the difference in build quality is much closer these days: NVidia cards now have the same badly designed, loud, rattling-after-a-week kind of fans and dodgy connectors on the board that AMD cards have lol.
I still have a 1st gen Core2Duo (that's almost 10 years, damn I'm gettin old), and it ran 10% overclocked since day one, on its factory heatsink and fan. I only cleaned it like once a year, and it lived in an open house for at least 5 of those years! It's a mystery how it survived all the food and dust that got into that fan... I finally retired it in December, as older i5s got really cheap on the used market.
So yeah, I ain't gonna switch.