Opened 15 years ago

Closed 10 years ago

#20652 closed enhancement (wontfix)

'segmented downloading' for fetch

Reported by: macports.users@…
Owned by: macports-tickets@…
Priority: Normal
Milestone:
Component: base
Version:
Keywords:
Cc: ryandesign (Ryan Carsten Schmidt)
Port:

Description (last modified by nerdling (Jeremy Lavergne))

If a port has multiple master_sites listed (either explicitly or implicitly through the likes of sourceforge), we should consider splitting the file download chunks into pieces.

The size of the file can be determined with a curl HEAD request, and equal portions can then be assigned to each download. In the future, we could use our ping calculations to skew the even split, sizing each mirror's chunk to fit its ping time.
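The approach described above can be sketched roughly as follows. This is an illustrative Python sketch, not MacPorts base code (which is written in Tcl); the function names `remote_size`, `split_ranges`, and `fetch_segment` are hypothetical, and it assumes every mirror serves an identical file and supports HTTP Range requests:

```python
import urllib.request

def remote_size(url):
    """Ask the server for the file size via a HEAD request
    (the 'curl header call' mentioned above)."""
    req = urllib.request.Request(url, method="HEAD")
    with urllib.request.urlopen(req) as resp:
        return int(resp.headers["Content-Length"])

def split_ranges(size, n):
    """Divide a file of `size` bytes into `n` contiguous byte ranges,
    expressed as inclusive (start, end) offsets as used by HTTP Range
    headers. Leftover bytes go to the earliest segments."""
    base, extra = divmod(size, n)
    ranges, start = [], 0
    for i in range(n):
        length = base + (1 if i < extra else 0)
        ranges.append((start, start + length - 1))
        start += length
    return ranges

def fetch_segment(url, start, end):
    """Fetch one byte range from a mirror with an HTTP Range request.
    The segments can then be concatenated in order."""
    req = urllib.request.Request(url, headers={"Range": f"bytes={start}-{end}"})
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

In practice each `fetch_segment` call would run concurrently, one per mirror, and the pieces would be written into the distfile at their offsets. A ping-weighted variant would replace the even split in `split_ranges` with per-mirror weights.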

This feature should likely be implemented as opt-in, since most users don't expect port to try to fully saturate their bandwidth.

Change History (8)

comment:1 Changed 15 years ago by macports.users@…

Have port start multiple fetches at the same time from different mirrors to speed up the download process. This and #2421 should get users' bandwidth screaming!

comment:2 Changed 15 years ago by nerdling (Jeremy Lavergne)

Description: modified (diff)
Keywords: fetch axel curl libcurl removed
Version: 1.7.1

comment:3 Changed 15 years ago by jmroot (Joshua Root)

Type: request → enhancement

comment:4 Changed 15 years ago by tobypeterson

I don't understand the utility here. If you're on a slow connection, any single server should be capable of saturating your connection. If you're on a fast connection, it doesn't really matter.

comment:5 Changed 15 years ago by fracai

Sometimes the remote server either throttles individual connections or, due to its own limited bandwidth, is the slow link. Not that this specific case should be handled by fetch, but even opening multiple connections to the same server can sometimes accelerate the download. In these cases, opening connections to multiple servers distributes the load among the mirrors while increasing download speed for the user.

comment:6 Changed 15 years ago by macports.users@…

comment:7 in reply to: 5 Changed 13 years ago by ryandesign (Ryan Carsten Schmidt)

Replying to arno+macports@…:

Sometimes the remote server either throttles individual connections or, due to its own limited bandwidth, is the slow link. Not that this specific case should be handled by fetch, but even opening multiple connections to the same server can sometimes accelerate the download.

Multiple connections to the same server sounds like an entirely bad idea. If the server does not throttle its bandwidth, then all this would do is use up more server and client resources. And if the server does throttle its bandwidth, then either it won't be any faster because it's throttled by IP, or it will increase the speed, effectively circumventing the upstream bandwidth throttle, meaning the server administrator might rightly decide to ban you.

In these cases, opening connections to multiple servers distributes the load among the mirrors while increasing the utility to the user.

The whole idea seems like a whole lot of work and increased complexity in MacPorts base for very little if any benefit. We have a lot more important issues to solve than this, which is a non-issue as far as I'm concerned, and I think we should close this ticket as "wontfix".

comment:8 Changed 10 years ago by ryandesign (Ryan Carsten Schmidt)

Cc: ryandesign@… added
Resolution: wontfix
Status: new → closed