Free Download Manager site redirected Linux users to malware for years
(www.bleepingcomputer.com)
Now I need to know who the hell has installed Free Download Manager on Linux.
And via a website too. That's like pushing a car. One of the main strengths of Linux is its open repositories, maintained by reputable sources and checked by thousands of reputable people. Packages are checksummed and therefore can't be swapped out by malicious parties. Even the AUR is arguably a safer and more regulated source. And it's actually in there.
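That checking is easy to replicate by hand, too. A minimal sketch of what the package manager automates, verifying a downloaded image against a project's published checksum list (the SHA256SUMS filename is the usual convention, used here as a placeholder):

# Compare the ISO's actual hash against the published SHA256SUMS file
sha256sum -c SHA256SUMS --ignore-missing

--ignore-missing just skips entries in the list for files you didn't download.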
Everyone knows real admins do
curl https://raw.githubusercontent.com/something/or/other/install.sh | sudo bash
Instructions unclear, "command not found: 404".
The same people that would have given that poor Nigerian prince their bank account details.
It's still my favorite download manager on Windows. It often downloads files significantly faster than the download manager built into browsers. Luckily I never installed it on Linux, since I have a habit of only installing from package managers.
Do you know of a good download manager for Linux?
How much faster are we talking?
I’ve honestly never looked at my downloads and thought, huh, you should be quicker. Well, maybe in the '90s.
FDM does some clever things to boost download speeds. It splits up a download into different chunks and downloads them concurrently. It makes a big difference for large files (for example, Linux ISOs).
It only makes a difference if the server is capping the speed per connection. If it isn't, it won't make a difference.
I guess many servers are capping speeds, then. Makes sense, since I almost never see downloads actually take advantage of my gigabit internet speeds.
It's interesting to me people still download things in that fashion. What are you downloading?
I occasionally download something from a web server, but not enough to care about using a download manager that might make it marginally faster. Most larger files I'm downloading are either TV shows and movies from torrents and Usenet, or games on Steam. All of which will easily saturate a 1 Gbps connection.
I'm curious as to how it would achieve that.
It can’t split a file before it has the file. And all downloads are split up. They’re called packets.
Not saying it doesn’t do it, just wondering how.
It could make multiple requests to the server, asking each request to resume starting at a certain byte.
Interesting.
I feel I’ll save this rabbit hole for the weekend and go have a look at what they do.
The key thing to know is that a client can do an HTTP HEAD request to get just the Content-Length of the file, and then perform GET requests with the Range request header to fetch a specific chunk of the file. This mechanism was introduced in HTTP 1.1 (byte-serving).
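A minimal sketch of that with plain curl (the URL and byte offsets are placeholders, and it assumes the server advertises Accept-Ranges):

# HEAD request: fetch only the headers to learn the file size
curl -sI https://example.com/big.iso | grep -i -e content-length -e accept-ranges
# Fetch two halves concurrently with Range requests, then stitch them together
curl -s -r 0-499999999 -o part1 https://example.com/big.iso &
curl -s -r 500000000- -o part2 https://example.com/big.iso &
wait
cat part1 part2 > big.iso && rm part1 part2

That's essentially what FDM, axel, and aria2 automate, plus retrying any chunks that fail.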
Huh, that’s super interesting. Thanks for sharing.
Right? I've not thought about download speeds since the 2000s.
Just grabbed a gig file: it would take about 8 minutes with a standard download in Firefox. Use a manager or axel and it will be 30 seconds. Then again, speed isn't everything; it's also nice to have auto-retry and completion.
I was just going to recommend this too: use axel, aria2, or even the ancient hget.
JDownloader, XDM, FileCentipede (this one is the closest to IDM, although it uses closed-source libraries), kGet, etc.
And JDownloader is the most useful one for easy downloads from file hosters.
axel. Use axel -n8 to make 8 connections/segments, which it will assemble when it's done.
Even with wget, wget -c can resume some downloads.
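And for anyone who prefers aria2, a rough equivalent of that axel command (the URL is a placeholder; -x, -s, and -c are aria2c's connection, split, and continue flags):

# Up to 8 connections, split into 8 segments, resume if a partial file exists
aria2c -x8 -s8 -c https://example.com/big.iso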
Gotta admit, it was me. I've only used a computer for a short time.
I got my first laptop 3 years ago, and it broke after just 2 months. And anyway, with an AMD Athlon 64, it greatly struggled with a browser. So really, I only started seriously using a computer at the start of 2021, when I got another, usable laptop. And that's when I downloaded freedownloadmanager.deb. Thankfully, I didn't get that redirect, so it was a legitimate file.
Oh, I know someone who adds the word “free” to various search words like “free pdf reader” or “free flash player” (happened a very long time ago). He’s also the kind of person who I can imagine having a bunch of viruses and malware on his computer.
People not well versed in Linux.
You know, the non-techies, who the Linux community claims should know such things but obviously don't.
I once did.
Or what Free Download Manager even is.
I've installed and used it, and still do.
My internet connection is not that reliable, and when I download big files that are not torrents (say >1000 MB) and the download is interrupted by a disconnect, Firefox often has trouble resuming it while FDM doesn't.
FDM also lets me set download speed limits, which means I can still browse the internet while downloading.
It's not my main tool for downloading stuff, but it has its uses.
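If the speed cap is all you need, plain wget and curl can do that too via their --limit-rate flags (the URL is a placeholder):

# Cap wget at 500 KB/s so browsing stays usable while it downloads
wget --limit-rate=500k https://example.com/big.iso
# Same idea with curl, capped at 1 MB/s
curl --limit-rate 1M -O https://example.com/big.iso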