this post was submitted on 21 Nov 2023

Homelab
[–] gizzlyxbear@alien.top 1 points 10 months ago (1 children)

I only have a passing interest in homelab stuff, but I just wanted to say that I got really excited because I thought those were a bunch of PS2s. Thought this was gonna be some weird FFXI or Battlefront LAN system.

[–] ghafri@alien.top 1 points 10 months ago

Well, they all do run some graphics, so there's some truth to that

[–] freakierice@alien.top 1 points 10 months ago

Other than mounting the switch above and running the network cables up the rear, there doesn't seem to be an easy solution

[–] 16golfr@alien.top 1 points 10 months ago (1 children)

Serious question: what is this for?

[–] ghafri@alien.top 1 points 10 months ago

I run certain AI bots 24/7 that require integrated CPU graphics

[–] unevoljitelj@alien.top 1 points 10 months ago

what is the use case for this?

[–] magic_champignon@alien.top 1 points 10 months ago (1 children)

What are you running here?

[–] ghafri@alien.top 1 points 10 months ago

AI bots that use integrated graphics

[–] theresmorethan42@alien.top 1 points 10 months ago

Get a cheap UCS or Supermicro server on eBay. You can get a stupid number of cores and RAM for next to nothing; then you have one box and, altogether, probably a LOT less power draw

[–] Mintfresh22@alien.top 1 points 10 months ago

Looks great now.

[–] Poooturd@alien.top 1 points 10 months ago

What about a bunch of PoE power splitters and a PoE switch?

[–] DonkeyOfWallStreet@alien.top 1 points 10 months ago (1 children)

10" racks

Consolidate power might not be possible 12v? Even 20amps is 4-6 computers. Just get 10" pdu's they have 3 plugs. I know there's 17 computers and 3 switches that's like 7 pdu's..

20 shelves

2x 10u 10" racks..

You'll need to have a 10u empty to keep going for expandability.
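
A quick sanity check on that PDU count, as a sketch only (it assumes 3 outlets per 10" PDU and one plug per device):

```python
import math

devices = 17 + 3           # 17 mini PCs + 3 switches
outlets_per_pdu = 3        # assumption: a 10" PDU with 3 outlets
pdus = math.ceil(devices / outlets_per_pdu)
print(f"{devices} plugs / {outlets_per_pdu} outlets per PDU -> {pdus} PDUs")
# 20 plugs / 3 outlets per PDU -> 7 PDUs
```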

[–] ghafri@alien.top 1 points 10 months ago (1 children)

Do you have some Amazon links or picture links for those?

[–] bmensah8dgrp@alien.top 1 points 10 months ago

Have a look at racksolutions for some ideas.

[–] Chuffed_Canadian@alien.top 1 points 10 months ago

Low hanging fruit: ditch the dumb switches and get a 48 port or something. Place it on the shelf above. If you're using the two switches to ensure more throughput per machine, consider putting in a 10gbit uplink. Beyond that, consider mounting/stacking the monitors differently and changing out the wood shelves for metal ones for airflow. The trick is to know in which direction they dissipate heat. If they're fanless, the heat is gonna go up, which will limit your ability to stack.
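
To put rough numbers on the uplink suggestion, here's a sketch; the per-client NIC speed is an assumption (the post doesn't say), and the 10 Gbit figure is the proposed uplink:

```python
clients = 17                 # mini PCs, each assumed to have a 1 GbE NIC
client_gbps = 1.0
uplink_gbps = 10.0           # the proposed 10 Gbit uplink

demand = clients * client_gbps               # worst-case simultaneous demand
ratio = demand / uplink_gbps
print(f"worst case {demand:.0f} Gbit/s over a {uplink_gbps:.0f} Gbit uplink "
      f"-> {ratio:.1f}:1 oversubscription")  # 17 Gbit/s -> 1.7:1
```

A 1.7:1 worst case is mild for homelab traffic, so one decent switch with a 10gbit uplink would do the job of all three.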

[–] PopNo626@alien.top 1 points 10 months ago

Get some 2x2 boards, a crown stapler, and window/door screen and make 3 panels big enough to cover the exposed sides. I did that and everyone said it looked cool. And it only cost like $40 because I already had the crown stapler.

[–] bryan_vaz@alien.top 1 points 10 months ago

The setup is fine, since you're obviously trying to be budget conscious. I'd say cable management is really what you need more than anything else.

However, if you're handy with wood and a saw, or sheet metal and a brake (I assume you don't have a 3d printer):

  • Stack them in sets of 4. The design of those NUCs allows them to be stacked without any thermal issues, as long as the ambient air is kept low enough.
  • Put the power supplies for each stack at the bottom of the stack in a 2x2 pattern (two side by side, stacked 2 high) - they should then match the footprint of each NUC.
  • Set up a frame to hold each stack; I'd recommend 4 posts of 1/2"x2" flat stock wood (or 18GA steel strapping), 2 running along each side, with 6"x6" platforms at each stack level to hold the frame together.
  • For stack platforms, at a minimum I recommend one at the bottom to hold your termination points (see below), one about 1" above the bottom one for the power supplies, one separating the power supplies and NUCs, and, if you're using sheet metal, one for each NUC. You can also add a cross bar at the top to use as a handle.
  • For power, those DC bricks use a standard C13 connector, so I'd recommend a short 4xC13-to-1xC14 splitter so that the entire stack of 4 NUCs needs only one C13 cord (if you're in Asia, your local electronics shop should be able to make you a few splitters for pretty cheap if they don't already have some kicking around).
  • Both the NUCs and the DC bricks should be zip-tied, string-tied, or velcroed to the stack frame.
  • Route the single C14 input to the bottom of the stack to the termination level (the run will probably be about 6").
  • Same for the network cables: run them down the side, zip-tied to the frame, and have them terminate at the bottom termination level. If you're decent at terminating, or know someone who is fast at it, you can just crimp custom-length cables for each of the 4 NUCs. Otherwise it should be 2x12" cables and 2x18" cables per full stack (tallied in the sketch below).
  • At the bottom, I'd personally recommend terminating with a CAT5e/CAT6 jack rather than a plug.
  • Then orient your termination points so they point forward; that way all the cables hide at the back of the stack.

Once you have all your NUCs in modular stacks, you can just run network cables and C13 power cords neatly along the front of the shelf, and have them input at the bottom of each stack. Then just tie your runs to each stack in a neat bundle and mount your switches to the underside of those shelves.

If you need to service a stack, you can just disconnect the stack in question at the termination points at the bottom of the stack and lift the whole assembly out, without having to play with wires. If you want to be extra fancy, you can even set up grooves or guides on the shelf to ensure the stacks are always placed in the correct position.
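
If it helps, here's the parts math for that scheme in script form, as a rough sketch only; it assumes the 4-per-stack layout above, with the 17th NUC in a short stack of its own:

```python
import math

nucs = 17
per_stack = 4
stacks = math.ceil(nucs / per_stack)     # 5 stacks; the last holds a single NUC

splitters = stacks                       # one 4xC13-to-1xC14 splitter per stack
posts = 4 * stacks                       # 4 flat-stock posts per frame
cables = {'12"': 0, '18"': 0}
for s in range(stacks):
    units = min(per_stack, nucs - s * per_stack)
    cables['12"'] += min(units, 2)       # lower two NUCs get the short runs
    cables['18"'] += max(units - 2, 0)   # upper two get the longer runs

print(f"stacks={stacks} splitters={splitters} posts={posts} cables={cables}")
# stacks=5 splitters=5 posts=20 cables={'12"': 9, '18"': 8}
```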

[–] Professional-Fee2235@alien.top 1 points 10 months ago

Personally, I'd buy a big power supply at the voltage it needs, get a board with fuses, and connect everything to that.

But it seems you need a few more shelves and a pack of zip-ties first.

[–] chandleya@alien.top 1 points 10 months ago (2 children)

This looks like a problem that would be solved more cheaply, and with less mess, by a used Lenovo ThinkStation P720, a pair of Xeon Gold 6140s, a dozen 32GB DIMMs, a 4TB NVMe, and a copy of VMware.

Why on earth do you have so many minis?
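
For scale, a back-of-envelope comparison; the 6140 core counts come from its spec sheet, while the per-mini figures are placeholder guesses, since the post doesn't say what the minis are:

```python
# One used P720: 2x Xeon Gold 6140 (18 cores / 36 threads each), 12x 32 GB DIMMs
server_cores = 2 * 18          # 36 physical cores
server_threads = 2 * 36        # 72 threads
server_ram_gb = 12 * 32        # 384 GB

# The 17 minis: per-unit specs below are assumptions, not from the post
minis, mini_cores, mini_ram_gb = 17, 4, 16

print(f"server: {server_cores}C/{server_threads}T, {server_ram_gb} GB RAM")
print(f"minis:  {minis * mini_cores}C total, {minis * mini_ram_gb} GB RAM total")
```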

[–] NavySeal2k@alien.top 1 points 10 months ago

To mitigate terminal server cost, I guess. And no, virtualizing Windows is no solution, because to be legal you need expensive Open License Windows licenses plus Software Assurance on those licenses...

[–] ghafri@alien.top 1 points 10 months ago

But can it be solved if I need to run graphics on each of those devices, and one virtualized GPU is not good enough?

[–] linerror@alien.top 1 points 10 months ago (1 children)

You can replace the 17 power supplies with a single server PSU... these should all be 12 V DC input with standard 2.1x5.5 mm barrel connectors...

Flip them up onto their faces, with the rear facing up. If you want to get fancy, you could even make a base that integrates a little wedge and rod for the power button.

Get at least a 24-port switch and micro ethernet cables.

This could be cleaned up to a single row on one shelf with no visible wires other than the ethernet and power lead running up and to the back. 1 switch under the machines... 1 PSU instead of 17... even one power cord...

or a little cable management at the very least...
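
Rough sizing for that single server PSU, as a sketch; the per-box wattage is an assumption, so check the labels on the actual bricks before buying anything:

```python
import math

nucs = 17
volts = 12.0          # the 12 V DC barrel input mentioned above
amps_each = 5.0       # assumption: ~60 W per box -> 5 A at 12 V
headroom = 1.25       # 25% margin so the PSU isn't run flat out

total_amps = nucs * amps_each
watts = math.ceil(total_amps * volts * headroom)
print(f"{total_amps:.0f} A at {volts:.0f} V -> size for ~{watts} W")
# 85 A at 12 V -> size for ~1275 W, i.e. a single 1200-1600 W server PSU
```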

[–] ghafri@alien.top 1 points 10 months ago (1 children)

I didn't know about server PSUs, so I'll check that out. 24-port switches are expensive; all 3 of my switches together are far cheaper than those big switches, though I'm not sure whether adding more 8-port switches means I'd be daisy-chaining.

[–] linerror@alien.top 1 points 10 months ago (2 children)

Your clients are 1GbE... you can get a brand-new unmanaged 24-port GbE switch for $50/£56, a fraction of the cost of one of those machines... https://www.amazon.co.uk/Tenda-Ethernet-Internet-Splitter-TEG1024D/dp/B09DPLVLPY/ -- not to mention you can get a used managed switch for less than half that.

If £56 is going to break the bank, then get some double-sided velcro and clean up the mess. Wish you had mentioned your obscenely limited budget...

[–] YamStallion@alien.top 1 points 10 months ago

You could buy some cable track for the ethernet cables. It's pretty cheap too! You can even turn the boxes around and practice cable routing to make it nice!

Bonus points if you get a patch panel with keystone couplers to bring small cables to each unit.

[–] dkdurcan@alien.top 1 points 10 months ago

I would suggest getting a rack with shelves, rack-mountable PDU strips, and a rack-mountable switch.

[–] Mythril_Zombie@alien.top 1 points 10 months ago (1 children)

How to make this better?

Replace it with a real server and virtualize the snot out of it.

[–] ghafri@alien.top 1 points 10 months ago

Won't work, I'm running integrated graphics on each separately

[–] SocietyTomorrow@alien.top 1 points 10 months ago

If only someone made a distributed power supply box for 19 V (like the ones for CCTV), you could ditch all the bricks.

[–] TheLazyGamerAU@alien.top 1 points 10 months ago (1 children)

I'm sure you could probably replace all of these with a single mid-range CPU lmao.

[–] ghafri@alien.top 1 points 10 months ago (1 children)

Yes, but I run integrated graphics on each, and it's required

[–] theofpa@alien.top 1 points 10 months ago

IKEA kvissle

[–] I-make-ada-spaghetti@alien.top 1 points 10 months ago

Stack them sideways and get a patch panel.

[–] pixelvengeur@alien.top 1 points 10 months ago

PoE perhaps? If those boxes support it, that would massively reduce your cable clutter
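
One caveat worth checking first: per-port PoE budgets. A quick sketch, where the per-standard figures are the spec limits at the powered device, and the per-box draw is a guess (most mini PCs would also need a PoE splitter rather than taking PoE natively):

```python
# Power available at the powered device under each PoE standard (watts)
pd_watts = {"802.3af (PoE)": 12.95, "802.3at (PoE+)": 25.5, "802.3bt (PoE++)": 71.3}

box_draw = 45.0   # assumption: per-box draw in watts; measure before buying
for standard, watts in pd_watts.items():
    verdict = "enough" if watts >= box_draw else "not enough"
    print(f"{standard}: {watts} W per port -> {verdict}")
```

Only 802.3bt leaves real headroom at that draw, and PoE++ switches plus 17 splitters add up fast.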

[–] kY2iB3yH0mN8wI2h@alien.top 1 points 10 months ago

perhaps some context here would help?

replace them with one box = problem solved

[–] TehBIGrat@alien.top 1 points 10 months ago

Rack Mount Bays and PoE kits for them.

[–] Icinglips@alien.top 1 points 10 months ago (1 children)

Get a mini rack system that fits on the big shelf and stack up the box things. Then go from there. And tie wraps for the power cables.

[–] ghafri@alien.top 1 points 10 months ago (2 children)

What about the constant 85-degree heat it emits?

[–] Jim_Screechy@alien.top 1 points 10 months ago (1 children)

Meh, this isn't what it seems. OP is just being cute. My guess is that he is imaging these hosts from a WDS server or similar. I've done deployments that mimic this setup exactly (though considerably more neatly), and it's very typical to misconstrue what is actually happening at a glance.

[–] ghafri@alien.top 1 points 10 months ago

I run AI bots that require graphics

[–] iMadrid11@alien.top 1 points 10 months ago

Mount the mini PCs on shelves inside a server cabinet.

Patch panels for shorter, cleaner Ethernet cable runs.

A rack-mount power supply instead of bricks plugged into power strips.

[–] robinskit@alien.top 1 points 10 months ago

Why do you have 17 mini PCs? What are you doing with them?

[–] TechnicianTop931@alien.top 1 points 10 months ago

What is going on in this picture? What devices are there and what are they doing?

[–] Wvalko@alien.top 1 points 10 months ago (1 children)

I was up against this last year: 130 Lenovo M75q's, and how to fit all 130 into a 42U rack, including power bricks. The solution was 3D printing.

[–] ghafri@alien.top 1 points 10 months ago

Any pics of this you could share?
