this post was submitted on 12 Aug 2024
467 points (98.5% liked)

Open Source

[–] princessnorah@lemmy.blahaj.zone 2 points 2 months ago (1 children)

It really depends. Once every 1-5 minutes? Sure, that could add up. Once every 1-5 hours tho? You're likely fine.

[–] MangoPenguin@lemmy.blahaj.zone 3 points 2 months ago* (last edited 2 months ago) (1 children)

True, although once per hour would still be a lot of data.

For example, running a fast.com test uses about 1.5GB of data for a single run, so around 1TB per month if run hourly.
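The back-of-the-envelope arithmetic checks out. A quick sketch, using the 1.5GB-per-test figure from the comment above and a 30-day month (the function name is just for illustration):

```python
# Rough monthly data usage for recurring speed tests.
# 1.5 GB per test is the figure quoted above; 30 days is an approximation.

def monthly_usage_gb(gb_per_test: float, tests_per_day: int, days: int = 30) -> float:
    """Total data consumed per month by tests run on a fixed schedule."""
    return gb_per_test * tests_per_day * days

# Hourly: 24 tests/day at 1.5 GB each.
hourly = monthly_usage_gb(1.5, 24)
print(f"hourly: {hourly:.0f} GB/month")  # 1080 GB, roughly 1TB
```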

[–] princessnorah@lemmy.blahaj.zone 1 points 2 months ago

Once every 6 hours would only be 180GB a month. A script that runs it every six hours, but increases the frequency if the speed drops below a certain threshold, could work well. I guess it all depends on how accurate you need the data to be.
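The adaptive schedule described above is easy to sketch. A minimal version in Python, where the 50 Mbps threshold and both intervals are illustrative assumptions, and `measure_mbps` is a hypothetical callback for whatever tool actually runs the test (e.g. a fast.com or speedtest-cli run):

```python
import time

# Illustrative values -- none of these come from the thread itself.
NORMAL_INTERVAL = 6 * 3600   # seconds between tests when speed looks healthy
FAST_INTERVAL = 1 * 3600     # seconds between tests after a slow result
THRESHOLD_MBPS = 50.0        # below this, tighten the schedule

def next_interval(measured_mbps: float) -> int:
    """Pick the wait before the next test based on the last measurement."""
    return FAST_INTERVAL if measured_mbps < THRESHOLD_MBPS else NORMAL_INTERVAL

def run_forever(measure_mbps) -> None:
    """Loop: run a test, then sleep longer or shorter depending on the result.

    measure_mbps is a caller-supplied function returning the measured
    download speed in Mbps (hypothetical here).
    """
    while True:
        speed = measure_mbps()
        time.sleep(next_interval(speed))
```

With these numbers the steady-state cost stays near the 180GB/month figure, only rising toward the hourly rate while the connection is actually underperforming.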