It really depends. Once every 1-5 minutes, sure, that could add up. Once every 1-5 hours though? You're likely fine.
True, although once per hour would still be a lot of data.
For example, running a single fast.com test uses about 1.5 GB of data, which works out to around 1 TB per month if run hourly.
Once every 6 hours would only be about 180 GB per month. A script that runs the test every six hours, but increases the frequency whenever the measured speed drops below a certain threshold, could work well (rough sketch below). I guess it all depends on how accurate you need the data to be.
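Something along these lines could handle that adaptive schedule. It's only a sketch: it assumes the speedtest-cli tool is installed and parses its --json output, and the 6-hour/1-hour intervals and the 50 Mbit/s threshold are made-up numbers you'd tune yourself (or swap the test for whatever you actually use).

```python
#!/usr/bin/env python3
"""Adaptive speed-test scheduler (rough sketch, not a finished tool).

Runs a bandwidth test every BASE_INTERVAL seconds, but tightens the
interval to FAST_INTERVAL while the measured speed is below THRESHOLD_MBPS.
"""

import json
import subprocess
import time

BASE_INTERVAL = 6 * 60 * 60   # normal cadence: every 6 hours (~180 GB/month at 1.5 GB/test)
FAST_INTERVAL = 1 * 60 * 60   # tightened cadence: hourly while the line looks slow
THRESHOLD_MBPS = 50           # placeholder threshold; pick whatever "too slow" means for you


def run_speed_test() -> float:
    """Return measured download speed in Mbit/s.

    Assumes the speedtest-cli command is available; replace this body
    with your preferred test (e.g. something that drives fast.com).
    """
    out = subprocess.run(
        ["speedtest-cli", "--json"], capture_output=True, text=True, check=True
    )
    result = json.loads(out.stdout)
    return result["download"] / 1_000_000  # speedtest-cli reports bits/s


def main() -> None:
    while True:
        try:
            mbps = run_speed_test()
            print(f"{time.strftime('%Y-%m-%d %H:%M:%S')}  {mbps:.1f} Mbit/s")
            interval = FAST_INTERVAL if mbps < THRESHOLD_MBPS else BASE_INTERVAL
        except (subprocess.CalledProcessError, json.JSONDecodeError) as exc:
            print(f"speed test failed: {exc}")
            interval = FAST_INTERVAL  # retry sooner after a failure
        time.sleep(interval)


if __name__ == "__main__":
    main()
```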