This is kind of a shit article. Most of these are just old hardware that eventually had modern improvements, not "trends."
A "trend" is cold cathode black lights inside the case, not a silly naming scheme for CPU revisions.
Ya, acrylic side cases were a trend, and maybe 3D monitors, but everything else in there was just specific technology that has been replaced by better technology.
The blower gpu fans were definitely a trend. I remember buying third party coolers and strapping 120mm fans onto them with zip ties.
Blower fans had a technical reason to exist that isn’t very relevant anymore.
It used to be about keeping the card profile low so you could keep the other PCI slots populated. These days, though, everything including Wi-Fi comes pre-populated on the motherboard, so it's rare to add any extra PCI cards to a modern personal system.
ATX boards had most relevant slots below that already, especially due to SLI / CF being a thing at that time. I know because I used wlan and audio cards back then - that is with the third party cooler + fans, which blocked like 3 slots.
IDK I would say 3d monitors are a trend that died pretty hard
A trend implies a level of popularity. There was none.
It's ultimately just failed (or "pre-successful") technology that wasn't able to do the job well enough at a sufficient price to develop a market.
How was IDE a hardware trend?
It's an XDA article, what did you expect.
None of these are trends. They're all hardware standards, and all but one of them are still very much here anyway
Molex connectors were almost universally hated for being flimsy and requiring a lot of effort to connect properly. They were fortunately replaced by SATA connectors.
I can understand the "lot of effort", but flimsy? Those things were built like a tank. SATA connectors certainly aren't more-durable (not that that normally matters, inside a case).
Yes they were flimsy. When pushing them together the crimped ends would get pushed out the back of the plastic connector casing. Or they wouldn't align properly and would require either major force or fiddly realignment.
I remember instances where the force required to disconnect the connector caused me to slip and rip a wire out.
They also came from a time when hard drives could draw several amps while in use and much more on spin-up. There was a good reason why SCSI drive arrays used to spin each disk up one-by-one.
Molex connectors are good for 10 amps or so, SATA connectors couldn't have handled that amount of current.
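For rough numbers, here's a back-of-the-envelope comparison of the 12 V power budget of the two connectors. The per-pin current ratings are commonly cited figures and vary by manufacturer, so treat them as assumptions rather than spec-sheet facts:

```python
# Rough 12 V power-budget comparison: Molex 4-pin vs SATA power connector.
# Per-pin ratings below are commonly quoted figures (assumptions),
# not taken from a specific manufacturer datasheet.
MOLEX_AMPS_PER_PIN = 11.0  # often quoted for the 4-pin Mate-n-Lok style
SATA_AMPS_PER_PIN = 1.5    # commonly cited; each SATA voltage rail uses 3 pins

molex_12v_watts = 12.0 * MOLEX_AMPS_PER_PIN      # single 12 V pin
sata_12v_watts = 12.0 * SATA_AMPS_PER_PIN * 3    # three paralleled 12 V pins

print(f"Molex 12 V budget: {molex_12v_watts:.0f} W")  # 132 W
print(f"SATA  12 V budget: {sata_12v_watts:.0f} W")   # 54 W
```

Even with three paralleled pins per rail, SATA's budget is well under half of what a single Molex 12 V pin can carry, which matches the point about spin-up current.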
I have seen so many flimsy molex connectors. SATA was far, far, far more robust. They were enormously flimsy. Are you thinking of the right connector?
RGB. Please. Finding hardware that doesn't light up like a Christmas tree is harder than it should be. Even a simple power LED can light up an entire room.
I don't really mind RGB, but my complaint is why every single LED has to be vivid electric blue. I want old red LEDs back, they were nice, they didn't scorch my retinas.
Agreed. My PC case came with a blue power light, after one night of watching the blinking illuminate my entire room I ripped it out and swapped in a dim red one myself.
For a quick fix, you can make blue power LEDs slightly more tolerable by sticking a piece of yellow post-it note on top of them, it turns them white.
Not anytime soon. Way too cheap to include (like cents for a mouse or RAM and a few dollars for a keyboard), and way too popular not to include. Well, at least you can disable it.
right, you can disable them using their unique software, which you have to install for every component, signing away your life (cough cough Disney) in the process
I remember my first serious build: a blue acrylic case with as many black-light-reactive components as I could get.
My case is an old Tower Server Case tucked away behind my monitors. Loads of space and no need for cable management.
That bastard would slice you open and gut you like a pig at the first opportunity though.
I have sacrificed to the case god already
I remember the first full build I did. All of my fans had LEDs, the case had LEDs. The first time I tried to play on it in the dark basement, it was blinding. I disconnected all of the case LEDs and replaced my fans with plain black ones.
Oh man I went through this phase too. I had the clear acrylic case and a bunch of those UV CCFL tubes.
The thing that I wish would go away is oversized graphics cards that take up 3 or more slots. There need to be more options for liquid cooling that don't require modifying the card.
I think I’m misunderstanding your comment. Once you liquid cool the card, it’s no longer an oversized behemoth. My reference 4080S is only taking up a single slot.
Most graphics cards have massive air coolers that block other PCIe slots. I want more water cooled options since they are low profile. I just don't want to have to void the warranty on a brand new card to install a water block.
I know for sure that installing a water block does not void the warranty on reference Nvidia cards. I’ve read that Asus (and evga rip) are the same. Not sure about MSI, and have read that Gigabyte will try to void warranty.
The PCB is still big in those cases.
Sure, but the PCB with water block only takes up a single PCIe slot, and is shortened enough to fit in pretty much any case. Is my water cooled 4080S longer than my water cooled RX 480? Yes. Substantially longer? No. Thicker? Also no, basically same thickness.
I am thinking that maybe more liquid cooling will happen with the whole AI thing on the datacenter side. That has a lot of parallel compute cards generating a lot of heat. Easier to move it with liquid than air.
Some other liquid-cooling annoyances:
- Cases don't really have a standard-size mounting spot for the radiators.
- I want to use one radiator for all of the things that require cooling. Like, I'd rather have an AIO device that provides multiple cold plates.
I really doubt liquid is easier for a data center. They have airflow solved pretty well and noise doesn't really matter. Liquid failing could potentially do way more damage, and might require shutting down whole areas for repair/damage prevention in the case of a single leak.
If they did do liquid at scale, it wouldn't be done in a way it would work down to consumers. It would be like custom boards with full coverage blocks for the whole system that tied into whole room water chillers or something.
That would require cooler mount standards. I don't think AMD or Nvidia currently have a standard.
The worst one is still around: GPUs requiring more and more power. I wish there were more focus on efficiency. It won't be long until water cooling is mandatory just to get all the heat away.
They are getting more efficient. The GTX 590 from 2011 has a TDP of 375W. The RTX 4080 has 320W while offering over ten times better performance. The 4060 outperforms the 1060, 2060, and 3060 while having a lower TDP than any of them.
If you want low TDP, the RX 6400 is twice as powerful as the 590 while having a TDP of 53W.
It's the very top of the line stuff like 4090 that push the limit by achieving that very last 10% performance bump at the cost of using double the power, and that's kinda like complaining a Bugatti Veyron gets terrible highway MPG figures.
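Taking the TDP figures above and assuming the hedged "over ten times better performance" as a flat 10x, the perf-per-watt improvement works out like this:

```python
# Back-of-the-envelope perf-per-watt comparison, GTX 590 vs RTX 4080.
# TDPs are the figures quoted in this thread; the 10x performance
# multiplier is an assumption standing in for "over ten times better".
gtx590_tdp = 375.0   # watts
rtx4080_tdp = 320.0  # watts
perf_multiplier = 10.0

# perf/watt gain = (perf ratio) / (power ratio)
efficiency_gain = perf_multiplier * (gtx590_tdp / rtx4080_tdp)
print(f"perf/watt improvement: ~{efficiency_gain:.1f}x")  # ~11.7x
```

So even with the conservative 10x figure, efficiency improved by more than an order of magnitude; the total board power only looks bad because the top-end cards spend it chasing the last few percent.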
The capacitor plague era. Ever wonder why we don't see a lot of PCs from the early 2000s? This is why: everything with a bad cap would eventually fail and kill the board, so you essentially had to call the OEM to fix it.
Intel's slot CPU interface. Sure, it cleaned up motherboard layouts, but the need for more comprehensive cooling solutions that would soon follow made this a bad direction to go in.
Did bottom-PSU ATX cases ever disappear? Floor dust suckers.
Nope lol