 

Nintendo Wii: Sold like gangbusters.

64bit Processors: The computing standard.

Battlestar Galactica: Considered one of the greatest sci-fi series of all time.

Facebook: Continues to be the world’s leading social media platform, with literally BILLIONS of users.

High Definition: HD only got even more HD.

iPhone: Set the standard for mobile smartphone form factor and function to this day 16 years later.

[–] kbity@kbin.social 7 points 1 year ago (2 children)

To be fair, a lot of these are accurate, or at least were at the time.

  • Multi-GPU just never caught on. There's a reason you don't see even the most hardcore gaming machines running SLI today.

  • The Wii's novelty wore off fairly quickly (about the time Kinect happened), and it didn't have much of a lasting impact on the gaming industry once mobile gaming slurped up the casual market.

  • Spore is largely forgotten, despite the enormous hype it had before release. It's kind of the Avatar of video games.

  • It took years for 64-bit to become relevant to the average user (and hell, there are still devices being sold with only 4GB of memory even today!). Plenty of Core 2 Duo machines still shipped with 32-bit versions of Windows, and people didn't notice or care: basically none of the apps average people used were 64-bit native back then, and you were lucky to have more than 4GB in your entire machine, let alone need more than that for one program.

  • Battlestar Galactica (2003) fell off sharply after season 2 and its ending was some of the most insulting back-to-nature religious tripe that has ever had the gall to label itself as science-fiction.

  • Downloading movies over the internet ultimately fell through the cracks outside of piracy. Most people stream films and TV now, and people who want the extra quality tend to buy a Blu-Ray disc rather than download from iTunes (can you even still do that with modern shows?)

  • I definitely know people who didn't get an HDTV until 4K screens hit the market, and people still buy standard-def DVDs. Hell, they're still outselling Blu-Rays close to 20 years later. Calling HD a dud is questionable, but it was definitely not seen as a must-have by the general public, partly because that shit was expensive back in 2008.

  • The Eee PC and the other netbooks were only good when they were running a lightweight operating system like Linux or Windows XP. Once Windows 7 Starter became the operating system of choice for netbooks, the user experience fell off a cliff and people tired of them. Which is a shame, because I love little devices like UMPCs.

  • The original iPhone was really limited for 2007. No third-party applications, no 3G support, no voice memos, you could only get it on a single carrier... the iPhone family did make a huge impact in the long run, but it wasn't until the 3GS that it was a true competitor to something like a Symbian device.

The only entry on this list that's really off the mark is Facebook, which even at the time was quickly reshaping the world. And I say that as someone who hates Zuck's guts and has proudly never had a Facebook account.

[–] Cowbob45@beehaw.org 2 points 1 year ago (1 children)

Are you joking? I thought the Wii was a wild success, I remember it being very popular.

[–] i_am_not_a_robot@discuss.tchncs.de 2 points 1 year ago (1 children)

Multi-GPU video cards (not multiple video cards) might be making a comeback.

[–] kbity@kbin.social 1 points 1 year ago* (last edited 1 year ago) (1 children)

Possibly, now that we have much tighter integration between different chips using die-to-die interconnects like Apple's "UltraFusion" and AMD's "Infinity Fabric" to avoid the latency and microstutter issues that came with old-fashioned multi-GPU cards like the GTX 690 and Radeon HD 7990.

As long as software can make proper use of the multiple processing units, I think multi-GPU cards have a chance to make a comeback... at least if anyone can actually afford the bloody things. Frankly, GPU pricing is a bit fucked at the moment even before we consider the idea of cards with multiple dies.
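To make the "software can make proper use of the multiple processing units" point concrete, here's a minimal sketch (not from this thread; the kernel and workload are made up) of explicit multi-GPU work splitting in CUDA. Unlike the old SLI days, where the driver tried to split frames for you, the application itself has to enumerate the devices and hand each one a slice of the work:

```cuda
// Hypothetical example: split one array across however many GPUs the system reports.
#include <algorithm>
#include <cstdio>
#include <vector>
#include <cuda_runtime.h>

__global__ void add_one(float *data, int n) {
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n) data[i] += 1.0f;   // placeholder workload
}

int main() {
    int device_count = 0;
    cudaGetDeviceCount(&device_count);   // 1 on a single card, 2+ on a multi-GPU rig
    if (device_count == 0) { printf("no CUDA devices found\n"); return 1; }

    const int n = 1 << 20;
    std::vector<float> host(n, 0.0f);
    int chunk = (n + device_count - 1) / device_count;

    // Each GPU processes its own slice; nothing here happens automatically.
    for (int dev = 0; dev < device_count; ++dev) {
        int begin = dev * chunk;
        int count = std::min(chunk, n - begin);
        if (count <= 0) break;

        cudaSetDevice(dev);
        float *d = nullptr;
        cudaMalloc(&d, count * sizeof(float));
        cudaMemcpy(d, host.data() + begin, count * sizeof(float), cudaMemcpyHostToDevice);
        add_one<<<(count + 255) / 256, 256>>>(d, count);
        cudaMemcpy(host.data() + begin, d, count * sizeof(float), cudaMemcpyDeviceToHost);
        cudaFree(d);
    }
    printf("first element after processing: %f\n", host[0]);
    return 0;
}
```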

[–] Kalothar@lemmy.fmhy.ml 0 points 1 year ago (1 children)

Is it worth it to do this with something like a 3070?

[–] Johanno@lemmy.fmhy.ml 0 points 1 year ago (1 children)

You can't, I think. The new GPUs don't even have an option for SLI or anything similar. If you want a multi-GPU setup where the GPUs work together, you'll need specific hardware.

[–] Kalothar@lemmy.fmhy.ml 1 points 1 year ago

That was actually my assumption; my roommate in college had two 970s using an SLI bridge. I couldn't remember if that stopped being a thing. It always seemed like such cool tech at the time.

Now I feel like a grizzled veteran after surviving the 10, 20, and 30 series, and then on arrival of the 40 series I finally upgraded to a 3070.

My single 970 was a workhorse for so long, and now it's dedicated to The Sims 4 on my girlfriend's computer hahah.