Ubuntu

cannot upload files greater than 4 GB

Reported by Ulli Horlacher on 2008-05-27
Affects               Status        Importance  Assigned to  Milestone
XULRunner             Fix Released  Medium
firefox-3.0 (Ubuntu)  Fix Released  Low         Unassigned

Bug Description

Binary package hint: firefox-3.0

System is:

Ubuntu 8.04 with kernel 2.6.24-16-generic x86_64
Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b5) Gecko/2008041515 Firefox/3.0b5
3.0~b5+nobinonly-0ubuntu3

When using an HTML upload form (multipart/form-data) and attaching a 5 GB file, Firefox only sends 1 GB.

This is the file locally:

root@pussy:/export/tmp# ll
-rw-r--r-- root root 5368709120 2008-05-27 14:21:08 5GB.tmp

In the webserver log I see:

CONNECT 2008-05-27 14:27:25 pussy.rus.uni-stuttgart.de 129.69.1.33
POST /fup?to=framstag&from=framstag HTTP/1.1
Host: fex:8080
User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b5) Gecko/2008041515 Firefox/3.0b5
Keep-Alive: 300
Connection: keep-alive
Referer: http://fex:8080/fup
Content-Type: multipart/form-data; boundary=---------------------------70512304014310478592060806319
Content-Length: 1073742513

As you can see: the Content-Length is wrong! It should be:
Content-Length: 5368709120

Webserver is http://fex.rus.uni-stuttgart.de/ and designed for HUGE uploads.
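
For reference, the reported value is exactly what a 32-bit length counter would produce. A minimal sketch (standalone C++, not Firefox code; the 689 bytes of multipart framing are inferred from the log above, not measured) reproduces the number:

#include <cstdint>
#include <cstdio>

int main() {
    uint64_t fileSize = 5368709120ULL;  // the 5 GB file from the report
    uint64_t framing  = 689;            // multipart boundary + part headers (assumed)
    uint32_t sent = static_cast<uint32_t>(fileSize + framing);  // wraps mod 2^32
    std::printf("Content-Length: %u\n", sent);  // prints 1073742513, as in the log
}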

ProblemType: Bug
Architecture: amd64
Date: Tue May 27 14:42:14 2008
DistroRelease: Ubuntu 8.04
Package: firefox-3.0 3.0~b5+nobinonly-0ubuntu3
PackageArchitecture: amd64
ProcEnviron:
 LANGUAGE=en_US:en
 LC_COLLATE=en_US
 PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/client/bin
 LANG=en_US
 SHELL=/bin/bash
SourcePackage: firefox-3.0
Uname: Linux 2.6.24-16-generic x86_64

the only way to really support >2 GB downloads would be to switch all interfaces over to 64-bit versions and make those interfaces the only way to do things. this means deprecating several key xpcom/necko interfaces (namely nsIInputStream and nsIStreamListener).

Someone needs to really see how common these large files are now, and/or gauge when they will be common. I've seen one game demo that broke the 1 GB mark, and I suspect this will become more and more common. That's still not 2 GB, but I'm sure it's coming in the not-too-distant future.

Another alternative is to do something similar to what Microsoft did: provide alternative methods that allow a secondary 32-bit value to be passed. This isn't as clean, but it allows existing code to work unchanged if it's known that it doesn't have to deal with such large files.
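
For illustration, a rough sketch of that dual-value style, loosely modelled on Win32's GetFileSize and its high-word out-parameter (the class and method names here are made up, not a Mozilla API):

#include <cstdint>

// Hypothetical: a stream keeps its size as 64 bits internally, but the old
// 32-bit getter keeps working; callers that care about >4 GB files pass a
// pointer to receive the high 32 bits as well.
class BigFile {
public:
    explicit BigFile(uint64_t size) : mSize(size) {}

    uint32_t GetSize(uint32_t* highPart = nullptr) const {
        if (highPart)
            *highPart = static_cast<uint32_t>(mSize >> 32);
        return static_cast<uint32_t>(mSize);   // existing callers are unchanged
    }

private:
    uint64_t mSize;
};

// A caller that knows about the second value reassembles the full size:
//   uint32_t high; uint32_t low = file.GetSize(&high);
//   uint64_t full = (static_cast<uint64_t>(high) << 32) | low;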

darin: are those interfaces frozen?

biesi: last i checked ;-)

Shouldn't this really be "allow 2GB filesizes via 64 bit interfaces" ?

I guess it depends on what you mean by 64-bit interfaces. Win32, a 32-bit API, provides 32-bit functions that handle access to files larger than 2 GB.

*** Bug 205443 has been marked as a duplicate of this bug. ***

*** Bug 215450 has been marked as a duplicate of this bug. ***

*** Bug 131439 has been marked as a duplicate of this bug. ***

Going to make this more general, so that it can apply to upload as well as download.

*** Bug 215091 has been marked as a duplicate of this bug. ***

*** Bug 225866 has been marked as a duplicate of this bug. ***

*** Bug 226391 has been marked as a duplicate of this bug. ***

Well I guess we've arrived at the not too distant future

Inf / 10 == Inf :(

ok, list of frozen interfaces that would require changes for files >2 GB (a hypothetical sketch of what the 64-bit changes could look like follows the list):

nsIChannel.idl:
  attribute long contentLength;

nsIStreamListener:
 67 in unsigned long aOffset,
 68 in unsigned long aCount);
(parameters of onDataAvailable)

nsIFile.idl: surprisingly this requires no changes.

nsIInputStream.idl: several functions + nsWriteSegmentFun
nsIOutputStream.idl: basically same as nsIInputStream

nsIScriptableInputStream:
 49 unsigned long available();
 55 string read(in unsigned long aCount);

afaik this is a complete list of the frozen interfaces that would require changes

of the unfrozen ones, nsIWebProgressListener.idl comes to mind, but most likely
there are others.

unfrozen ifaces in xpcom:
nsIAsync{Input,Output}Stream: in unsigned long aRequestedCount
nsIByteArrayInputStream.idl: (maybe)
NS_NewByteArrayInputStream (nsIByteArrayInputStream ** aResult, char * buffer,
unsigned long size);
(size would be the part to change, but who would create a byte input stream with
more than 4 GB?)

nsIObjectInputStream: unlikely (putBuffer(in charPtr aBuffer, in PRUint32 aLength))

nsIObservableOutputStream.idl: void onWrite(in nsIOutputStream outStr, in
unsigned long amount);

nsIPipe.idl: segmentSize, segmentCount

nsISeekableStream.idl:
    void seek(in long whence, in long offset);
    unsigned long tell();

nsIStringStream.idl: similar to nsIByteArrayInputStream
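
As promised above, a hypothetical C++ rendering of the kind of widening this implies (the real interfaces are XPIDL, and the exact signatures Mozilla eventually chose may differ):

#include <cstdint>

// Illustrative only; names mirror the interfaces listed above.
class nsIChannel64 {                 // cf. "attribute long contentLength"
public:
    virtual int64_t GetContentLength() const = 0;
    virtual void    SetContentLength(int64_t aLength) = 0;
    virtual ~nsIChannel64() = default;
};

class nsIStreamListener64 {          // cf. onDataAvailable(aOffset, aCount)
public:
    // The offset must be 64-bit to address data past 4 GB; the per-call count
    // can remain 32-bit, since a single call never delivers anywhere near 4 GB.
    virtual void OnDataAvailable(uint64_t aOffset, uint32_t aCount) = 0;
    virtual ~nsIStreamListener64() = default;
};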

*** Bug 215450 has been marked as a duplicate of this bug. ***

*** Bug 228968 has been marked as a duplicate of this bug. ***

I think we're going to see more 2 GB+ downloads in the future; as said earlier, some game demos are already creeping up to the 1 GB mark. 2 GB is only a matter of time.

DVD images could also be that large.

*** Bug 229979 has been marked as a duplicate of this bug. ***

I think that list of frozen interfaces is wrong - if unsigned longs are used,
that gives us 4GB, not 2GB. The content length is an issue, though happily, not
for mailnews, since we only open streams on parts of a file :-)

sure, 4 GB are better than 2 GB, but I don't think we should limit these APIs to
4 GB either.

2 GB -> 4 GB is a band-aid, not a fix.
For starters, a DVD image is easily above 4 GB...

48- or 64-bit sizes feel like the way to go here, unless people want to keep revisiting this issue every 6 months.

nsISeekableStream now supports 64 bit streams (though some implementations will
truncate at 32 bits and ASSERT)

*** Bug 242859 has been marked as a duplicate of this bug. ***

User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.6) Gecko/20040113
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.6) Gecko/20040113

The Download Manager has problems with files over 4 GB. For example, the DVD ISO of Fedora Core 2 (see URL) has a size of 4 GB + 79 MB. The download manager shows only the 79 MB and cancels the download after the 79 MB are reached.

Other software, e.g. Opera 7.5, has similar problems (it also shows only 79 MB), but doesn't stop the transfer.

Reproducible: Always
Steps to Reproduce:
1. Start a download with a size bigger than 4 GB (e.g. 4 GB + 79 MB).
2. The download manager shows the wrong size (79 MB).
3. Wait until the downloaded amount exceeds the displayed 79 MB.

Actual Results:
The download stops after 79 MB.

Expected Results:
- recognize the real file size
- not cancel the download

Some other programs have similar problems. Checked with wget 1.8 (Linux),
Leechget (WinXP), Opera 7.5 (WinXP). Only Opera continued downloading but
without showing the real file size or correct estimated transfer time.

Tried to save the file on a WinXP NTFS partition, which can handle files >4 GB.

hmm, interesting. haven't seen this specific bug before.

probably httpchannel truncates the content-length to 32bit, and stops reading
from the socket after having read the truncated number of bytes?

yeah, that's probably true. if the file were served using 'Transfer-Encoding:
chunked', then it's possible that we might be able to download the whole thing.
 if a Content-Length header is specified, then we will indeed treat it as a
32-bit numeric value :-(
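
A standalone sketch of that failure mode (not the actual nsHttpChannel code; the byte count is an illustrative 4 GiB + 79 MiB, not the exact ISO size): parsing such a Content-Length into a 32-bit value yields exactly the 79 MB the download manager displayed.

#include <cstdint>
#include <cstdio>
#include <cstdlib>

int main() {
    const char* contentLength = "4377804800";               // approx. 4 GiB + 79 MiB
    uint64_t full = std::strtoull(contentLength, nullptr, 10);
    uint32_t kept = static_cast<uint32_t>(full);             // what a 32-bit field keeps
    std::printf("64-bit: %llu bytes\n", (unsigned long long)full);
    std::printf("32-bit: %u bytes (79 MiB, where the download stops)\n", kept);
}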

dup 184452, please.

(In reply to comment #3)
> dup 184452, please.

hmm, this is about a very specific problem of the http channel... don't think it
should be marked as duplicate

*** Bug 245115 has been marked as a duplicate of this bug. ***

Created an attachment (id=150153)
patch

this is not exactly the minimal patch to fix this bug... it contains parts of
bug 227057, but is not complete for that (nsIResumableEntityID should really
just be a string...) but it would be some effort to untangle that now, so I
decided to leave this as-is; if you want I can try to split this up again.

Yes, the XPCOM part is needed. Otherwise, nsInputStreamPump cancels the
channel, thinking the consumer didn't read any data, while in fact the counter
just overflowed and a number like 10 is smaller than a number like
PR_UINT32_MAX.
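
Roughly the situation being described, as a standalone sketch (the real check lives in nsInputStreamPump): once a 32-bit stream offset passes 2^32 it wraps, and a progress comparison between the old and new offsets draws the wrong conclusion.

#include <cstdint>
#include <cstdio>

int main() {
    uint32_t offsetBefore = 4294967290u;           // just below 2^32
    uint32_t offsetAfter  = offsetBefore + 16u;    // wraps around to 10
    bool progressed = offsetAfter > offsetBefore;  // false after the wrap
    std::printf("before=%u after=%u progressed=%d\n", offsetBefore, offsetAfter, progressed);
    // With 64-bit offsets, as in the patch, offsetAfter would be 4294967306 and
    // the comparison would remain correct.
}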

with this patch, TestProtocols can successfully read a 4294967306-byte file (i.e. 4 GB + 10 bytes) over HTTP (using a simple script as a "server"; apache doesn't seem to support files > 4 GB on x86, much to my annoyance)

other things that would need doing include making some (more) interfaces use 64-bit numbers (like nsIWebProgressListener, nsITransportEventSink, nsIProgressEventSink), but that would make this a tree-wide change, which I didn't feel like doing currently

(From update of attachment 150153)
>Index: netwerk/base/src/nsInputStreamPump.cpp

>+ printf("*** offsets: after: %lld, before: %lld\n", offsetAfter, offsetBefore);

you didn't mean to keep this printf in, right?

>Index: netwerk/base/src/nsInputStreamPump.h

>- PRUint32 mStreamOffset;
>- PRUint32 mStreamLength;
>+ nsInt64 mStreamOffset;
>+ nsInt64 mStreamLength;

slightly concerning to me that this also means a change from
unsigned to signed arithmetic. please be sure to double-check
that we aren't making any assumptions anywhere about these being
unsigned.

>Index: netwerk/base/src/nsResumableEntityID.cpp

>+ mSize(LL_INIT(0xffffffff, 0xffffffff)) {

LL_MAXUINT?

>+ if (LL_EQ(mSize, LL_INIT(0xffffffff, 0xffffffff)))

same here,

>+ size = LL_INIT(0xffffffff, 0xffffffff);

and here.

>Index: netwerk/protocol/ftp/src/nsFTPChannel.cpp

>+ mStartPos(LL_INIT(0xffffffff, 0xffffffff))

are you trying to avoid the overhead of calling LL_MaxUint?
i don't like hardcoding magic numbers like this. what if
you were to accidentally type only 7 f's? ;-)

>Index: netwerk/protocol/ftp/src/nsFtpConnectionThread.cpp

>+ mFileSize = LL_INIT(0xffffffff, 0xffffffff);

more occurrences of this guy.

>+ PR_sscanf(mResponseMsg.get() + 4, "%llu", &mFileSize);
>+ // XXX this so sucks
>+ PRUint32 size32;
>+ LL_L2UI(size32, mFileSize);
>+ if (NS_FAILED(mChannel->SetContentLength(size32))) return FTP_ERROR;

so create a new internal API for FTP... or is it the case that the
nsFTPChannel's mContentLength is only ever used with GetContentLength?

>Index: netwerk/protocol/http/src/nsHttpChannel.cpp

>+ PRUint64 size = LL_INIT(0xffffffff, 0xffffffff);

ding!

>Index: netwerk/protocol/http/src/nsHttpResponseHead.cpp

>+ mHeaders.SetHeader(nsHttp::Content_Length, nsPrintfCString("%lld", len));

hmm... the default buffer size for nsPrintfCString is not large
enough for 64-bit integers. perhaps we should increase the default
buffer size from 15 to 20 (not including trailing null byte).
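
For reference, the largest unsigned 64-bit value is 18446744073709551615, i.e. 20 decimal digits, which is where the suggested size of 20 comes from. A quick standalone check (plain snprintf, not nsPrintfCString):

#include <cinttypes>
#include <cstdint>
#include <cstdio>

int main() {
    char buf[32];
    int needed = std::snprintf(buf, sizeof(buf), "%" PRIu64, UINT64_MAX);
    std::printf("%s takes %d characters\n", buf, needed);   // 20 characters
}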

ok, please come up with a better solution for LL_MAXUINT if you don't
want to simply use it.

it also seems like nsUint64 would be handy in some cases, no?

overall this patch looks really good... thanks for doing this biesi!!

i also saw this code passing 64-bit values to nsITransportEventSink. i guess
those get silently downcast to 32-bit values. at some point we need to rev all
of our progress APIs to work with 64-bit values. my gut feeling is that we'll
need to preserve existing progress APIs (even though they are not frozen) just
because of the fact that so many extensions and embedders already use them.
nsIWebProgressListener is effectively frozen whether we like it or not! :-(

(In reply to comment #6)
> you didn't mean to keep this printf in, right?

oops, indeed not

> so create a new internal API for FTP... or is it the case that the
> nsFTPChannel's mContentLength is only ever used with GetContentLength?

yeah, it is.

> ok, please come up with a better solution for LL_MAXUINT if you don't
> want to simply use it.

I'll use LL_MaxUint() - I was just not aware it existed. bug 245923 filed.

> it also seems like nsUint64 would be handy in some cases, no?

indeed. I was kinda trying to only touch netwerk/ in this patch. Bug 245927
filed; adding dependency for now...

(In reply to comment #7)
> i also saw this code passing 64-bit values to nsITransportEventSink. i guess
> those get silently downcast to 32-bit values.

yes, nsInt64 has an operator PRUint32()

> at some point we need to rev all
> of our progress APIs to work with 64-bit values. my gut feeling is that we'll
> need to preserve existing progress APIs (even though they are not frozen) just
> because of the fact that so many extensions and embedders already use them.
> nsIWebProgressListener is effectively frozen whether we like it or not! :-(

that's unfortunate :( there's always the possibility of nsIWebProgressListener2
of course...

Created an attachment (id=150552)
patch v2

now using LL_MaxUint, and a few other changes

(From update of attachment 150552)
Looks good.

this was checked in on 2004-06-16 12:51

Is bug 247599 related to this? Probably it's the same issue of 32- vs. 64-bit representation, but maybe in a different part of Mozilla.

*** Bug 334496 has been marked as a duplicate of this bug. ***

This 4 GB size limit is a problem that I've run into developing FireFTP. It seems the problem lies with nsIBinaryOutputStream.writeBytes:

Error: [Exception... "Component returned failure code: 0x80004005 (NS_ERROR_FAILURE) [nsIBinaryOutputStream.writeBytes]" nsresult: "0x80004005 (NS_ERROR_FAILURE)" location: "JS frame :: chrome://fireftp/content/js/connection/dataSocket.js :: anonymous :: line 258" data: no]

Can be confirmed trying to download the x64 version of Windows Vista beta 2, which exceeds 4 GB by 13.7 MB.

(In reply to comment #10)
> Can be confirmed trying to download the x64-version of Windows Vista beta 2

Can you give the URL of this file?

(In reply to comment #11)
> (In reply to comment #10)
> > Can be confirmed trying to download the x64-version of Windows Vista beta 2
>
> Can you give the URL of this file?
>

The URL is http://download.windowsvista.com/dl/preview/beta2/en/x64/iso/vista_5384.4.060518-1455_winmain_beta2_x64fre_client-LB2CxFRE_EN_DVD.iso

I ran into the same problem trying to download this (with Firefox 1.5.0.4, Windows XP SP2, NTFS) -- it fails at exactly 4GB.

*** Bug 350903 has been marked as a duplicate of this bug. ***

Besides DVDs and video media files, I keep running into this bug downloading the English Wikipedia for offline processing. The compressed version of the database (one entry per article, without revision history) has been over 2 GB for some time. A recent image is at http://download.wikipedia.org/enwiki/20060920/enwiki-20060920-pages-meta-current.xml.bz2

I hope SeaMonkey/Firefox/Mozilla will be able to download this before Microsoft fixes XP's built-in ftp utility....

Hello, I give my vote to you.

*** Bug 304161 has been marked as a duplicate of this bug. ***

*** Bug 336001 has been marked as a duplicate of this bug. ***

Sites that post multi-gigabyte data files are becoming common reality.
Firefox cannot handle them gracefully (neither can IE, but who cares?).
Of today's mainstream browsers (not counting command-line tools),
only the latest Opera was able to download this file correctly
(i.e. completely):

ftp://ftp.ncbi.nih.gov/pub/geo/DATA/supplementary/series/GSE2109/GSE2109_RAW.tar
(and there are many more similar huge files there)

So I am voting for this bug!

My download (MSDN Service Pack 1)
http://download.microsoft.com/download/8/5/4/854f7409-47bd-41a2-b3b2-1a4875294550/MSDVDEUDVDX1370478.img
stopped at 2 GB (90%).
But this download has 2,320,840,704 bytes.
So I cannot download files greater than 2 GB with Firefox.

mass reassigning to nobody.

(In reply to comment #94 by Josef Goldbrunner)
> http://download.microsoft.com/download/8/5/4/854f7409-47bd-41a2-b3b2-1a4875294550/MSDVDEUDVDX1370478.img

This download (2.2 GB) works for me without problems (Firefox 2.0.0.6, actually Debian's Iceweasel). The download completes without errors, and the DVD image is OK. Which version of Firefox were you using?
Can this possibly be a build difference between Josef's version and mine (Windows vs. Linux)?

Also, I have no problems downloading files larger than 4 GB via HTTP and FTP. Why am I not hitting this bug?

*** Bug 412262 has been marked as a duplicate of this bug. ***

Ulli Horlacher (framstag) wrote :

On Tue, May 27, 2008 at 12:57:04PM -0000, Ulli Horlacher wrote:
> Public bug reported:
>
> Binary package hint: firefox-3.0
>
> System is:
>
> Ubuntu 8.04 with kernel 2.6.24-16-generic x86_64
> Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b5) Gecko/2008041515 Firefox/3.0b5
> 3.0~b5+nobinonly-0ubuntu3
>
> When using an HTML upload form (multipart/form-data) and attaching a 5 GB
> file, Firefox only sends 1 GB.

most likely known upstream. we should find the right bug id in
bugzilla.mozilla.org

 affects xulrunner
 status incomplete

 affects ubuntu/firefox-3.0
 status incomplete
 importance low

 - Alexander

Changed in firefox-3.0:
importance: Undecided → Low
status: New → Incomplete

I am having this problem with both Firefox 3.0 and 2.0.0.14 for Windows; it truncates downloads to 2 GiB.
Somebody's using a signed 32-bit integer somewhere...
One of the places I am having this problem with is www.opensuse.com, trying to download the openSUSE 11.0 DVD, which is 4.3 GB.
I have files on my web site which are also 4.6 GB.

*please* fix!

Alexander Sack (asac) on 2008-11-23
Changed in xulrunner:
importance: Undecided → Unknown
status: Incomplete → Unknown
Changed in firefox-3.0:
status: Incomplete → Triaged
Changed in xulrunner:
status: Unknown → Confirmed

Created an attachment (id=406896)
Firefox 3.5 passed the 2GB mark successfully

Running Firefox 3.5.3, I was able to download the 10.1 Gentoo Live DVD, a 2.6 GB file found at http://distfiles.gentoo.org/releases/amd64/10.1/livedvd-amd64-multilib-10.1.iso

User Agent string: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.3) Gecko/20090910 Ubuntu/9.04 (jaunty) Firefox/3.5.3

Package information: 3.5.3+build1+nobinonly-0ubuntu0.9.04.2

This package was built from http://archive.ubuntu.com/ubuntu/pool/universe/f/firefox-3.5/firefox-3.5_3.5.3+build1+nobinonly.orig.tar.gz patched with http://archive.ubuntu.com/ubuntu/pool/universe/f/firefox-3.5/firefox-3.5_3.5.3+build1+nobinonly-0ubuntu0.9.04.2.diff.gz with cosmetic local modifications following the steps on the first post of http://ubuntuforums.org/showthread.php?t=1225754

Is it safe to mark this bug as fixed?

I know the FF 3.5.2 download manager handles files over 2 GiB. I just tested it with a local web page and a DVD ISO file.

Agreed. Just downloaded the 10,181.73 MiB (9.94 GiB) file from http://download.wikimedia.org/enwiki/20091009/enwiki-20091009-pages-meta-current.xml.bz2 using FF 3.5.3 on Windows XP SP2. It's fixed. It works great.

Looks like all the bugs this is depending on are fixed and so this is now working. Marking fixed.

I always saw this bug as being about all parts of Necko that handle 32-bit file sizes, and not all of those are fixed. In particular, upload isn't...

Oh, true, completely forgot about upload. Can't seem to find any place where I could even begin to upload a >2 GB file.

You do it locally, with a .html page saved on your disk (or a local webserver) that has some sort of upload system on it. Unfortunately the only upload system I know is Uber-Uploader, and I've never checked if it handles 2 GB files.

(In reply to comment #104)
> Oh, true, completely forgot about upload. Can't seem to find any place where I
> could even begin to upload >2gig file.

You can upload multi-GB files on YouTube (through regular POST, not flash upload): http://www.youtube.com/my_videos_upload?nobeta

Cheers from the YT team - we hope this can be fixed soon obviously :P

For reference, to make large files:
Linux:
dd if=/dev/zero of=4gbfile bs=1024 count=4194304
Windows:
fsutil file createnew d:\temp\4gbfile.txt 4294967296

That page specifically states that the files are to be up to 2 GB.

(In reply to comment #107)
> That page specifically states that the files are to be up to 2 GB.

So, if you try testing with a larger than 2GB file, yes, it will be marked as 'too big'. But the POST should upload completely (that is, if this bug were fixed :P)

If you need an additional partner account to test with, let me know (those are up to 20GB - email me directly). Although, I do see that you guys have the http://www.youtube.com/firefox channel - maybe you can use that one for testing? (obviously making the test videos just private ones)

(In reply to comment #106)
> For reference, to make large files:
> Linux:
> dd if=/dev/zero of=4gbfile bs=1024 count=4194304

You really want to use:
dd if=/dev/zero of=/tmp/4gbfile bs=1024 seek=4194304 count=1
faster, and doesn't actually require 4 GB of disk space :)
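
Roughly the same trick as a tiny C++ program, in case that is handy for a cross-platform test harness (a sketch; it assumes a 64-bit std::streamoff, and whether the file is actually sparse depends on the filesystem):

#include <fstream>

int main() {
    std::ofstream f("4gbfile", std::ios::binary);
    f.seekp(4294967295LL);   // seek to offset 4 GiB - 1...
    f.put('\0');             // ...and write one byte; the file is now exactly 4 GiB
}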

Changed in xulrunner:
importance: Unknown → Medium
ruediix@gmail.com (ruedii) wrote :

Unfixable in Firefox 3.0: Firefox 3.0 is discontinued and the fix was not implemented until 3.5.3, thus no patch is available for people insisting on staying with 3.0.x for some reason.

However, Hardy and later have 3.6.x in their security updates for Firefox 3.0, which fixes this bug as well as several security issues.

Fix:
Use the security update that switches to the newer 3.6.x version on Hardy and later.
Use upgrade packages from Mozilla.org or other sources for releases older than Hardy.

This is generally a good idea anyway, as older releases of Firefox have notable security bugs that are fixed in the latest releases.

Changed in firefox-3.0 (Ubuntu):
status: Triaged → Fix Released
ruediix@gmail.com (ruedii) wrote :

I found further data; I am uncertain of the upload status. However, all related download bugs included here are fixed.

I have downloaded a large number of 4.7 GB files by HTTP in Firefox since this release.
However, my biggest uploads have never hit the 4 GB limit yet.

I get mixed reports on upload. There may or may not be remaining issues with upload. The standard workaround is to use an external program for file-specific transfers when handling uploads over 4 GB. However, this isn't nearly as easy on forms.

A lot of advanced server software has limitations of 2 GB or 4 GB. This usually isn't related to the HTTP server software itself, but to the additional programs and libraries used by PHP or other CGI.

Since many servers don't support the upload of large files by HTTP, this could be an issue. You need to verify that it can function.

Here is an outline of a test to verify whether the bug is present.
1. On your controlled home network, loopback device, or VM, set up an HTTP server that is verified to handle large files and does not use external programs for any file transfer handling.

2. Set it up with a forms-based page that simply tells the server to accept the specified file from the client without any external programs handling the file. (External programs may handle the filename and the forms that process it and relay it back to the server.)

3. Try to upload a large file to said local server using Firefox.

4. Check the file in the server's directory to see whether it uploaded properly.

On Thu 2011-04-07 (14:28), <email address hidden> wrote:

> I found further data; I am uncertain of the upload status. However,
> all related download bugs included here are fixed.
>
> I have downloaded a large number of 4.7 GB files by HTTP in Firefox since this release.
> However, my biggest uploads have never hit the 4 GB limit yet.

Download was never a problem with firefox, but the upload (HTTP POST) was.
And the bug is still there with firefox 3.6.16 (tested on Ubuntu 8.04.4).

> issues with upload. The standard workaround is to use an external
> program for file-specific transfers when handling uploads over 4 GB.

This workaround does not work for users who are not allowed to install
software. Besides this, "fixing bugs" by substituting the software with
other software is a bad idea.

> A lot of advanced server software has limitations of 2 GB or 4 GB.
> This usually isn't related to the HTTP server software itself, but to the
> additional programs and libraries used by PHP or other CGI.
>
> Since many servers don't support the upload of large files by HTTP, this
> could be an issue. You need to verify that it can function.

I have written my own webserver for this kind of file transfer, which
is already an Ubuntu package; see
http://packages.ubuntu.com/lucid/fex

> Here is an outline of a test to verify whether the bug is present.
> 1. On your controlled home network, loopback device, or VM, set up an HTTP server that is verified to handle large files and does not use external programs for any file transfer handling.
>
> 2. Set it up with a forms-based page that simply tells the server to
> accept the specified file from the client without any external
> programs handling the file. (External programs may handle the filename
> and the forms that process it and relay it back to the server.)
>
> 3. Try to upload a large file to said local server using Firefox.
>
> 4. Check the file in the server's directory to see whether it uploaded
> properly.

I have tested it with a 2 GB file and in the fexserver log I see no
connect at all. Uploads of files < 2 GB are OK:

CONNECT:8080 2011-04-18 11:58:28 diaspora.rus.uni-stuttgart.de 129.69.13.139 [26471_0]
POST /fup HTTP/1.1
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.16) Gecko/20110323 Ubuntu/8.04 (hardy) Firefox/3.6.16
Referer: http://fex.rus.uni-stuttgart.de:8080/fup
Content-Length: 2,098,221

You can test it by yourself with:
http://fex.rus.uni-stuttgart.de:8080/fup?skey=57e5ae9d103e4f19622bf4322f1a8609

--
Ullrich Horlacher Server- und Arbeitsplatzsysteme
Rechenzentrum E-Mail: <email address hidden>
Universitaet Stuttgart Tel: ++49-711-685-65868
Allmandring 30 Fax: ++49-711-682357
70550 Stuttgart (Germany) WWW: http://www.rus.uni-stuttgart.de/

So what's up with this bug?

Firefox 15 still truncates downloads to 4 GB.

Chromium downloaded all of the 4.2 GB. I'm not trolling here, just trying to prove that it wasn't a server or filesystem/OS problem (openSUSE 12.3 Factory).

This might work in Firefox 18 now that bug 784912 has been fixed.

It is also probably relevant that bug 215450 was fixed recently.

Guys, downloads should have been working for a long time now. If they're not, please do file a new bug with steps to reproduce.

Since uploads are also fixed, resolving this bug.

Changed in xulrunner:
status: Confirmed → Fix Released