The web server can finally serve large files (Score:3, Interesting)
When I looked at the release notes sent out by email, I saw this under "New functionality":
"httpd(8) can now serve files larger than 2GB in size."
I'm very surprised by this.
Re: (Score:2)
When I looked at the release notes sent out by email, I saw this under "New functionality":
"httpd(8) can now serve files larger than 2GB in size."
I'm very surprised by this.
apache has been able to do that since 2.2. Of course, a web page larger than 2 gigs is a bug not a feature...
http://httpd.apache.org/docs/2.2/new_features_2_2.html
Large File Support
httpd is now built with support for files larger than 2GB on modern 32-bit Unix systems. Support for handling >2GB request bodies has also been added.
Re: (Score:2)
apache has been able to do that since 2.2. Of course, a web page larger than 2 gigs is a bug not a feature...
You *are* aware that HTTP is used to transfer more than just HTML, right?
Re: (Score:2)
apache has been able to do that since 2.2. Of course, a web page larger than 2 gigs is a bug not a feature...
You *are* aware that HTTP is used to transfer more than just HTML, right?
Like he said, it's a bug, not a feature. Torrent and FTP are much more efficient, especially when handling interrupted transfers, which HTTP doesn't handle well. Unreliable networks can make a net-based installation process drag on and on or even freeze.
Re:The web server can finally serve large files (Score:2)
Torrent and FTP are much more efficient
FTP wastes server resources and complicates interactions with firewalls and NAT by using separate control and data connections. FTP and HTTP both have resume functionality nowadays.
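For what it's worth, HTTP's resume mechanism is just a Range header: the client asks for bytes from where it left off and appends the 206 Partial Content body to what it already has. A toy sketch (the helper names and the in-memory "remote" file are mine, standing in for a real server):

```python
import io

def range_header(resume_from):
    """Build the request header asking the server to resume the
    transfer at byte offset `resume_from` (RFC 7233 byte ranges)."""
    return {"Range": "bytes=%d-" % resume_from}

def resume_download(remote, partial):
    """Toy resume: `remote` is a file-like stand-in for the server's
    copy. A real client would send range_header(len(partial)) and
    append the 206 Partial Content body to its partial file."""
    remote.seek(len(partial))  # server skips what we already have
    return partial + remote.read()
```

So an interrupted HTTP transfer only re-fetches the missing tail, same as FTP's REST command.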
Torrent is designed for peer-to-peer distribution of pieces; this can save the server a lot of bandwidth, but it also adds a lot of checking overhead and is somewhat controversial.
Unreliable networks can make a net-based installation process drag on and on or even freeze.
That is more likely a case of poorly chosen timeouts and retry logic than any fundamental problem with the HTTP protocol.
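The fix for a flaky network is on the client side: retry with exponential backoff instead of letting one dropped connection stall the whole install. A minimal sketch (function name and parameters are mine, not from any particular installer):

```python
import time

def fetch_with_retries(fetch, attempts=4, base_delay=0.1):
    """Call a flaky `fetch` callable, retrying transient network
    errors with exponential backoff. Re-raises after the last
    attempt so a genuinely dead connection still fails loudly."""
    for attempt in range(attempts):
        try:
            return fetch()
        except OSError:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * 2 ** attempt)  # 0.1s, 0.2s, 0.4s...
```

Combined with Range-based resume, each retry only re-fetches the missing bytes, so even a lossy link converges instead of freezing.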