this post was submitted on 18 Aug 2024
220 points (97.0% liked)

Linux


I'm writing a program that wraps dd to try and warn you if you are doing anything stupid, so I have been giving the man page a good read. While doing this, I noticed that dd supports sizes all the way up to quettabytes, a unit orders of magnitude larger than all the data on the entire internet.
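As a rough idea of the kind of sanity check I mean, here is a minimal sketch of a pre-flight test that refuses to write to a mounted device (the function name and messages are my own placeholders, not the actual program):

```shell
#!/bin/sh
# Hypothetical pre-flight check for a dd wrapper: refuse to write to a
# block device if it (or any partition on it) appears in /proc/mounts.
# Function name and messages are illustrative assumptions.
safe_dd_check() {
    target="$1"                       # e.g. /dev/sdb
    if grep -q "^$target" /proc/mounts; then
        echo "refusing: $target (or a partition on it) is mounted" >&2
        return 1
    fi
    echo "$target looks safe to write"
}
```

A real wrapper would layer more checks on top (is the target the boot disk? is it bigger than the image?), but mount state catches the most common foot-gun.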

This has caused me to wonder what the largest storage operation you guys have done is. I've taken a couple of images of hard drives that were a single terabyte large, but I was wondering if the sysadmins among you have had to do something with, e.g., a giant RAID 10 array.

42 comments
[–] [email protected] 3 points 3 months ago

I routinely take 1-4 TB images of SSDs before making major changes to the disk. I run fstrim on all partitions and pipe dd's output through zstd before writing it to disk, and the images shrink to the actually-used size or a bit smaller. My largest backup ever was probably ~20 TB cloned from one array to another over 40/56 GbE; the deltas after that were tiny by comparison.
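The trim-then-compress trick can be demonstrated on a sparse scratch file instead of a real device (all paths here are throwaway examples, and zstd is assumed to be installed):

```shell
# Demo of the dd | zstd idea: after trimming, unused blocks read back
# as zeros, and mostly-zero input compresses to a fraction of its size.
# A sparse file stands in for the trimmed disk; paths are throwaway.
truncate -s 64M scratch.img                          # sparse: reads back as zeros
printf 'some real data' | dd of=scratch.img conv=notrunc 2>/dev/null
dd if=scratch.img bs=4M 2>/dev/null | zstd -q -f -o scratch.img.zst
ls -l scratch.img scratch.img.zst                    # .zst is tiny by comparison
```

Restoring is just the reverse pipeline: `zstd -dc scratch.img.zst | dd of=target bs=4M`.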

[–] [email protected] 3 points 3 months ago

As a single file? Likely a 20 GB ISO.
As a collective job, 3 TB of videos between hard drives for Jellyfin.

[–] [email protected] 3 points 3 months ago (2 children)

I mean, dd claims it can handle a quettabyte, but how can we be sure?

[–] [email protected] 2 points 3 months ago
dd if=/dev/zero of=/dev/null status=progress
[–] [email protected] 2 points 3 months ago* (last edited 3 months ago)

20 TB (out of 21 TB usable), with a second 6x6 TB ZFS raidz2 server as my send target.

[–] [email protected] 2 points 3 months ago

I recently copied ~1.6 TB from my old file server to my new one. I think that may be my largest non-work-related transfer.

[–] [email protected] 1 points 3 months ago (1 children)

While I haven't personally had to move a data center, I imagine that would be a pretty big transfer. Probably not dd, though.

[–] [email protected] 1 points 3 months ago

I can't imagine how nerve-wracking it would be to run dd on something like that lol. I still don't trust myself to copy a USB stick with my unimportant bullshit on it with dd, let alone a server with anything important on it!

[–] [email protected] 1 points 3 months ago

Probably some video game that is ~150-200 GiB. Does that count?

[–] [email protected] 1 points 3 months ago

I think it would be my whole broken Manjaro install; I just used dd to make a copy so I could work on it later lol. About 500 gigs.
