cannot upload files greater than 4 GB

Bug #235200 reported by Ulli Horlacher
Affects                 Status         Importance   Assigned to   Milestone
XULRunner               Fix Released   Medium
firefox-3.0 (Ubuntu)    Fix Released   Low          Unassigned

Bug Description

Binary package hint: firefox-3.0

System is:

Ubuntu 8.04 with kernel 2.6.24-16-generic x86_64
Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b5) Gecko/2008041515 Firefox/3.0b5
3.0~b5+nobinonly-0ubuntu3

When using an HTML upload form (multipart/form-data) and attaching a 5 GB file, firefox only sends 1 GB.

This is the file locally:

root@pussy:/export/tmp# ll
-rw-r--r-- root root 5368709120 2008-05-27 14:21:08 5GB.tmp

In the webserver log I see:

CONNECT 2008-05-27 14:27:25 pussy.rus.uni-stuttgart.de 129.69.1.33
POST /fup?to=framstag&from=framstag HTTP/1.1
Host: fex:8080
User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b5) Gecko/2008041515 Firefox/3.0b5
Keep-Alive: 300
Connection: keep-alive
Referer: http://fex:8080/fup
Content-Type: multipart/form-data; boundary=---------------------------70512304014310478592060806319
Content-Length: 1073742513

As you can see: the Content-Length is wrong! It should be:
Content-Length: 5368709120
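
The reported value is exactly what a 32-bit wraparound predicts: 5368709120 mod 2^32 = 1073741824, and the remaining 689 bytes are presumably the multipart boundary lines and part headers. A minimal C++ sketch of the truncation (variable names are illustrative, not from the Firefox source):

  #include <cstdint>
  #include <cstdio>

  int main() {
      uint64_t fileSize = 5368709120ULL;        // real size of 5GB.tmp
      uint32_t truncated = (uint32_t)fileSize;  // what a 32-bit counter keeps
      printf("%u\n", truncated);                // 1073741824; plus ~689 bytes of
      return 0;                                 // multipart overhead = 1073742513
  }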

Webserver is http://fex.rus.uni-stuttgart.de/ and designed for HUGE uploads.

ProblemType: Bug
Architecture: amd64
Date: Tue May 27 14:42:14 2008
DistroRelease: Ubuntu 8.04
Package: firefox-3.0 3.0~b5+nobinonly-0ubuntu3
PackageArchitecture: amd64
ProcEnviron:
 LANGUAGE=en_US:en
 LC_COLLATE=en_US
 PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/client/bin
 LANG=en_US
 SHELL=/bin/bash
SourcePackage: firefox-3.0
Uname: Linux 2.6.24-16-generic x86_64

Tags: apport-bug
Revision history for this message
In , Darin-moz (darin-moz) wrote :

the only way to really support >2G downloads would be to switch all interfaces
over and make those interfaces be the only way to do things. this means
deprecating several key xpcom/necko interfaces (namely nsIInputStream and
nsIStreamListener).

Revision history for this message
In , Dbradley (dbradley) wrote :

Someone really needs to see how common these large files are now, and/or gauge
when they will be common. I've seen one game demo that broke the 1 gig mark, and
I suspect this will become more and more common. That's still not 2 gigs, but I'm
sure it's coming in the not too distant future.

Another alternative is to do something similar to what Microsoft did: provide
alternative interfaces that allow a secondary 32-bit value to be passed. This
isn't as clean, but it allows existing code to work unchanged if it's known not
to have to deal with such large files.
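
The Win32 pattern referred to here passes a 64-bit size as two 32-bit halves (GetFileSize, for example, returns the low half and fills an optional lpFileSizeHigh out-parameter); recombining them is a shift and an OR. A small illustrative sketch:

  #include <cstdint>
  #include <cstdio>

  // Recombine a size passed as two 32-bit halves, Win32-style.
  uint64_t combine(uint32_t lo, uint32_t hi) {
      return ((uint64_t)hi << 32) | lo;
  }

  int main() {
      // 5 GiB = 1 * 2^32 + 1073741824
      printf("%llu\n", (unsigned long long)combine(1073741824u, 1u)); // 5368709120
      return 0;
  }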

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

darin: are those interfaces frozen?

Revision history for this message
In , Darin-moz (darin-moz) wrote :

biesi: last i checked ;-)

Revision history for this message
In , Benc-meer (benc-meer) wrote :

Shouldn't this really be "allow 2GB filesizes via 64 bit interfaces" ?

Revision history for this message
In , Dbradley (dbradley) wrote :

I guess it depends on what you mean by 64-bit interfaces. Win32, a 32-bit API,
provides functions that handle access to files larger than 2 gigs.

Revision history for this message
In , Darin-moz (darin-moz) wrote :

*** Bug 205443 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Bugzilla-mcsmurf (bugzilla-mcsmurf) wrote :

*** Bug 215450 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Bugzilla-mcsmurf (bugzilla-mcsmurf) wrote :

*** Bug 131439 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Dbradley (dbradley) wrote :

Going to make this more general, so that it can apply to upload as well as download.

Revision history for this message
In , Bzbarsky (bzbarsky) wrote :

*** Bug 215091 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Bzbarsky (bzbarsky) wrote :

*** Bug 225866 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Bugzilla-accessibleinter (bugzilla-accessibleinter) wrote :

*** Bug 226391 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Dbradley (dbradley) wrote :

Well I guess we've arrived at the not too distant future

Revision history for this message
In , Darin-moz (darin-moz) wrote :

Inf / 10 == Inf :(

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

ok, list of frozen interfaces that would require changes for files >2gb:

nsIChannel.idl:
  attribute long contentLength;

nsIStreamListener:
 67 in unsigned long aOffset,
 68 in unsigned long aCount);
(parameters of onDataAvailable)

nsIFile.idl: surprisingly this requires no changes.

nsIInputStream.idl: several functions + nsWriteSegmentFun
nsIOutputStream.idl: basically same as nsIInputStream

nsIScriptableInputStream:
 49 unsigned long available();
 55 string read(in unsigned long aCount);

-afaik this is a complete list of the frozen interfaces that would require changes-

of the unfrozen ones, nsIWebProgressListener.idl comes to mind, but most likely
there are others.
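
For reference, the 2 GB vs. 4 GB ceilings follow directly from the IDL types: XPIDL "long" maps to a signed 32-bit integer (PRInt32) and "unsigned long" to an unsigned one (PRUint32). A standalone C++ illustration of the two limits:

  #include <cstdint>
  #include <cstdio>

  int main() {
      // XPIDL "long" -> PRInt32, "unsigned long" -> PRUint32
      int32_t  signedCap   = INT32_MAX;    // 2147483647 bytes, just under 2 GiB
      uint32_t unsignedCap = UINT32_MAX;   // 4294967295 bytes, just under 4 GiB
      printf("signed cap:   %d\n", signedCap);
      printf("unsigned cap: %u\n", unsignedCap);
      return 0;
  }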

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

unfrozen ifaces in xpcom:
nsIAsync{Input,Output}Stream: in unsigned long aRequestedCount
nsIByteArrayInputStream.idl: (maybe)
NS_NewByteArrayInputStream (nsIByteArrayInputStream ** aResult, char * buffer,
unsigned long size);
(size would be the part to change, but who would create a byte input stream with
more than 4 GB?)

nsIObjectInputStream: unlikely (putBuffer(in charPtr aBuffer, in PRUint32 aLength))

nsIObservableOutputStream.idl: void onWrite(in nsIOutputStream outStr, in
unsigned long amount);

nsIPipe.idl: segmentSize, segmentCount

nsISeekableStream.idl:
    void seek(in long whence, in long offset);
    unsigned long tell();

nsIStringStream.idl: similar to nsIByteArrayInputStream

Revision history for this message
In , Jhatax (jhatax) wrote :

*** Bug 215450 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

*** Bug 228968 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Robert-accettura (raccettura) wrote :

I think we're going to see more 2GB+ downloads in the future; as said earlier,
some game demos are already creeping up to the 1GB mark. 2GB is only a
matter of time.

DVD images could also be that large.

Revision history for this message
In , Bugzilla-mcsmurf (bugzilla-mcsmurf) wrote :

*** Bug 229979 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Bienvenu (bienvenu) wrote :

I think that list of frozen interfaces is wrong - if unsigned longs are used,
that gives us 4GB, not 2GB. The content length is an issue, though happily, not
for mailnews, since we only open streams on parts of a file :-)

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

sure, 4 GB are better than 2 GB, but I don't think we should limit these APIs to
4 GB either.

Revision history for this message
In , StefanHuszics (stefan-huszics) wrote :

2GB -> 4GB is a band-aid, not a fix.
For starters, a DVD image is easily above 4 GB...

48- or 64-bit sizes feel like the way to go here, unless people want to keep
revisiting this issue every 6 months.

Revision history for this message
In , Bienvenu (bienvenu) wrote :

nsISeekableStream now supports 64 bit streams (though some implementations will
truncate at 32 bits and ASSERT)

Revision history for this message
In , Wd-pobox (wd-pobox) wrote :

*** Bug 242859 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Anmeldungen (anmeldungen) wrote :

User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.6) Gecko/20040113
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.6) Gecko/20040113

The Download Manager has problems with files over 4gb. For example the DVD-ISO
from Linux Fedora Core 2 (=> URL) has a size of 4gb+79mb. The download manager
shows only the 79mb and cancels the download after the 79mb are reached.

Other software like e.g. Opera 7.5 has similar problems (shows only 79mb, too),
but doesn't stop the transfer.

Reproducible: Always
Steps to Reproduce:
1. start a download with a size bigger than 4gb (e.g. 4gb+79mb)
2. download manager shows wrong size (79mb)
3. wait until the file size above the 4gb mark is reached (=> >79mb)

Actual Results:
The download stops after 79mb.

Expected Results:
- recognize the real file size
- not cancel the download

Some other programs have similar problems. Checked with wget 1.8 (Linux),
Leechget (WinXP), Opera 7.5 (WinXP). Only Opera continued downloading but
without showing the real file size or correct estimated transfer time.

Tried to save the file on a WinXP NTFS partition that can handle files >4gb.

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

hmm, interesting. haven't seen this specific bug before.

probably httpchannel truncates the content-length to 32bit, and stops reading
from the socket after having read the truncated number of bytes?

Revision history for this message
In , Darin-moz (darin-moz) wrote :

yeah, that's probably true. if the file were served using 'Transfer-Encoding:
chunked', then it's possible that we might be able to download the whole thing.
 if a Content-Length header is specified, then we will indeed treat it as a
32-bit numeric value :-(
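
A sketch of that failure mode (illustrative C++, not the actual necko code): if the parsed Content-Length lands in a 32-bit field, a 4 GiB + 79 MiB response wraps to just the ~79 MiB remainder, which matches the symptom reported above. The channel then reads only that many bytes from the socket and reports the download as finished.

  #include <cstdint>
  #include <cstdio>
  #include <cstdlib>

  int main() {
      const char *header = "4377804800";   // 4 GiB + 79 MiB, roughly the size above
      uint64_t parsed = strtoull(header, nullptr, 10);
      uint32_t stored = (uint32_t)parsed;  // a 32-bit field silently wraps
      printf("parsed: %llu\n", (unsigned long long)parsed); // 4377804800
      printf("stored: %u\n", stored);      // 82837504, i.e. exactly 79 MiB
      return 0;
  }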

Revision history for this message
In , M-kato (m-kato) wrote :

dup 184452, please.

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

(In reply to comment #3)
> dup 184452, please.

hmm, this is about a very specific problem of the http channel... don't think it
should be marked as duplicate

Revision history for this message
In , Bugzilla-accessibleinter (bugzilla-accessibleinter) wrote :

*** Bug 245115 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

Created an attachment (id=150153)
patch

this is not exactly the minimal patch to fix this bug... it contains parts of
bug 227057, but is not complete for that (nsIResumableEntityID should really
just be a string...); it would be some effort to untangle that now, so I
decided to leave this as-is. if you want, I can try to split this up again.

Yes, the XPCOM part is needed. Otherwise, nsInputStreamPump cancels the
channel, thinking the consumer didn't read any data, while in fact the counter
just overflowed and a number like 10 is smaller than a number like
PR_UINT32_MAX.

with this patch, TestProtocols can successfully read a 4294967306-byte file
(4 GiB + 10 bytes) over HTTP (using a simple script as a "server"; apache
doesn't seem to support files > 4 GB on x86, much to my annoyance)

other things that would need doing are making some (more) interfaces use 64-bit
numbers (like nsIWebProgressListener, nsITransportEventSink,
nsIProgressEventSink), but that would make this a tree-wide change, which I
didn't feel like doing currently
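
The wraparound described in the XPCOM part above is easy to reproduce in isolation; a sketch with hypothetical variable names (not the actual nsInputStreamPump members):

  #include <cstdint>
  #include <cstdio>

  int main() {
      uint32_t offsetBefore = UINT32_MAX - 5;    // just below the 4 GiB boundary
      uint32_t offsetAfter  = offsetBefore + 16; // wraps around to 10
      // An unsigned "did the consumer read anything?" check now sees
      // offsetAfter < offsetBefore and cancels the channel, even though
      // 16 bytes really were consumed.
      printf("before=%u after=%u progress? %s\n", offsetBefore, offsetAfter,
             offsetAfter > offsetBefore ? "yes" : "no (spurious cancel)");
      return 0;
  }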

Revision history for this message
In , Darin-moz (darin-moz) wrote :

(From update of attachment 150153)
>Index: netwerk/base/src/nsInputStreamPump.cpp

>+ printf("*** offsets: after: %lld, before: %lld\n", offsetAfter, offsetBefore);

you didn't mean to keep this printf in, right?

>Index: netwerk/base/src/nsInputStreamPump.h

>- PRUint32 mStreamOffset;
>- PRUint32 mStreamLength;
>+ nsInt64 mStreamOffset;
>+ nsInt64 mStreamLength;

slightly concerning to me that this also means a change from
unsigned to signed arithmetic. please be sure to double-check
that we aren't making any assumptions anywhere about these being
unsigned.

>Index: netwerk/base/src/nsResumableEntityID.cpp

>+ mSize(LL_INIT(0xffffffff, 0xffffffff)) {

LL_MAXUINT?

>+ if (LL_EQ(mSize, LL_INIT(0xffffffff, 0xffffffff)))

same here,

>+ size = LL_INIT(0xffffffff, 0xffffffff);

and here.

>Index: netwerk/protocol/ftp/src/nsFTPChannel.cpp

>+ mStartPos(LL_INIT(0xffffffff, 0xffffffff))

are you trying to avoid the overhead of calling LL_MaxUint?
i don't like hardcoding magic numbers like this. what if
you were to accidentally type only 7 f's? ;-)

>Index: netwerk/protocol/ftp/src/nsFtpConnectionThread.cpp

>+ mFileSize = LL_INIT(0xffffffff, 0xffffffff);

more occurrences of this guy.

>+ PR_sscanf(mResponseMsg.get() + 4, "%llu", &mFileSize);
>+ // XXX this so sucks
>+ PRUint32 size32;
>+ LL_L2UI(size32, mFileSize);
>+ if (NS_FAILED(mChannel->SetContentLength(size32))) return FTP_ERROR;

so create a new internal API for FTP... or is it the case that the
nsFTPChannel's mContentLength is only ever used with GetContentLength?

>Index: netwerk/protocol/http/src/nsHttpChannel.cpp

>+ PRUint64 size = LL_INIT(0xffffffff, 0xffffffff);

ding!

>Index: netwerk/protocol/http/src/nsHttpResponseHead.cpp

>+ mHeaders.SetHeader(nsHttp::Content_Length, nsPrintfCString("%lld", len));

hmm... the default buffer size for nsPrintfCString is not large
enough for 64-bit integers. perhaps we should increase the default
buffer size from 15 to 20 (not including trailing null byte).

ok, please come up with a better solution for LL_MAXUINT if you don't
want to simply use it.

it also seems like nsUint64 would be handy in some cases, no?

overall this patch looks really good... thanks for doing this biesi!!
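
The buffer-size suggestion above matches the decimal width of 64-bit integers: UINT64_MAX is 18446744073709551615, which is 20 digits (a signed minimum also needs 20 characters including the sign). A quick standalone check:

  #include <cstdint>
  #include <cstdio>

  int main() {
      char buf[21];  // 20 digits for UINT64_MAX plus the terminating null
      int n = snprintf(buf, sizeof(buf), "%llu", (unsigned long long)UINT64_MAX);
      printf("%s uses %d characters\n", buf, n);  // 18446744073709551615 uses 20
      return 0;
  }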

Revision history for this message
In , Darin-moz (darin-moz) wrote :

i also saw this code passing 64-bit values to nsITransportEventSink. i guess
those get silently downcast to 32-bit values. at some point we need to rev all
of our progress APIs to work with 64-bit values. my gut feeling is that we'll
need to preserve existing progress APIs (even though they are not frozen) just
because of the fact that so many extensions and embedders already use them.
nsIWebProgressListener is effectively frozen whether we like it or not! :-(

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

(In reply to comment #6)
> you didn't mean to keep this printf in, right?

oops, indeed not

> so create a new internal API for FTP... or is it the case that the
> nsFTPChannel's mContentLength is only ever used with GetContentLength?

yeah, it is.

> ok, please come up with a better solution for LL_MAXUINT if you don't
> want to simply use it.

I'll use LL_MaxUint() - I was just not aware it existed. bug 245923 filed.

> it also seems like nsUint64 would be handy in some cases, no?

indeed. I was kinda trying to only touch netwerk/ in this patch. Bug 245927
filed; adding dependency for now...

(In reply to comment #7)
> i also saw this code passing 64-bit values to nsITransportEventSink. i guess
> those get silently downcast to 32-bit values.

yes, nsInt64 has an operator PRUint32()

> at some point we need to rev all
> of our progress APIs to work with 64-bit values. my gut feeling is that we'll
> need to preserve existing progress APIs (even though they are not frozen) just
> because of the fact that so many extensions and embedders already use them.
> nsIWebProgressListener is effectively frozen whether we like it or not! :-(

that's unfortunate :( there's always the possibility of nsIWebProgressListener2
of course...

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

Created an attachment (id=150552)
patch v2

now using LL_MaxUint, and a few other changes

Revision history for this message
In , Bryner (bryner) wrote :

(From update of attachment 150552)
Looks good.

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

this was checked in 2004-06-16 12:51

Revision history for this message
In , Bogdan-stroe (bogdan-stroe) wrote :

Is bug 247599 related to this? Probably it's the same issue of 32- vs 64-bit
representation, but maybe in a different part of Mozilla.

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

(In reply to comment #28)
> Is bug 247599 related to this? Probably its the same issue of 32 vs 64 bits
> representation, but maybe in a different part of Mozilla.

um, 4 MB can fit easily into a 32 bit variable. that bug is not related.

Revision history for this message
In , Mike Connor (mconnor) wrote :

*** Bug 248482 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Bugzilla-mcsmurf (bugzilla-mcsmurf) wrote :

*** Bug 252872 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Quark29 (quark29) wrote :

*** Bug 254643 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Peter-vanderwoude (peter-vanderwoude) wrote :

*** Bug 256338 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Jo-hermans (jo-hermans) wrote :

*** Bug 260859 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Marek-beyer (marek-beyer) wrote :

here is an example of an application:

- a web site with a form to choose software/update packages
- after submitting, the server sends you an ISO image ready for burning to CD/DVD

Problems:

- the content-length header is limited to 2GB in mozilla (some browsers: 4GB)
- without a content-length header, the download stops after 2.4GB (of 5.5GB)

so it's time for the future :) or we have to use CDs forever

Revision history for this message
In , Dbradley (dbradley) wrote :

I really think this needs some serious attention. This will be just another
excuse for people not to use Gecko based browsers. I'm sure in intranet
environments such large files are going to be more and more common.

Revision history for this message
In , Bugzilla-accessibleinter (bugzilla-accessibleinter) wrote :

*** Bug 266323 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Olivierv-fnmfg (olivierv-fnmfg) wrote :

I just ran into this bug.

While I'm just a user, I'm commenting to agree with the fact that while
currently uncommon, files in excess of 2 GB (or 4 GB) will be seen with
increasing regularity. In my case, FC3 DVD ISO at 2.4 GB. I realize the bug is
mostly cosmetic, but everything else in Mozilla/Firefox is so polished that it
really stood out.

Revision history for this message
In , Allu-lumisade (allu-lumisade) wrote :

This probably will never get fixed, although it's constantly being reported as a
new bug (even I couldn't find it on bugzilla the first time I reported it).

So much for the 1.0 hype:
- Tabs still open into windows which have no toolbars or tab bars, making it
impossible to use them (i.e. opening links in a popup window)
- DM does not behave correctly with files larger than 4GB
- DM retry function seldom works
- Crashes with known _overflow_ exploits
(http://lcamtuf.coredump.cx/mangleme/gallery/)
- on win32, does not recover quickly from being minimized for a few hours (over
night); it's rather funny how I can leave opera or ie windows open and start
using them in the morning with no lag whatsoever, but with mozilla? no no.

Revision history for this message
In , Logan+mozilla-bmo (logan+mozilla-bmo) wrote :

This is not a Firefox download manager tracking bug nor is it a place for you to
rant about the hype surrounding Firefox 1.0 and the problems you've had with it.

Revision history for this message
In , Allu-lumisade (allu-lumisade) wrote :

Oh, sorry. I thought this was the Firefox-product.. When I reported the bug I
put it under Firefox/DM, but this seems to be general Browser/Networking.

Once again, I'm truly sorry for the confusion here.

Revision history for this message
In , Philringnalda (philringnalda) wrote :

*** Bug 272315 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Bugzilla-spray (bugzilla-spray) wrote :

*** Bug 269531 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Peter-vanderwoude (peter-vanderwoude) wrote :

*** Bug 277785 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Thewulf (thewulf) wrote :

Created an attachment (id=177200)
DOwnload Screenshot

Screenshot of this bug in action

Revision history for this message
In , Bugzilla-mcsmurf (bugzilla-mcsmurf) wrote :

<email address hidden>: That's Bug 228968 AND DON'T ATTACH ANY SCREENSHOTS ON THAT
BUG EITHER. We really know it's displaying negative values, no need for more
screenshots.

Revision history for this message
In , Eero (eero) wrote :

This bug has been around for a very long time; it should be fixed _fast_.

just my .5 cents

Revision history for this message
In , Thewulf (thewulf) wrote :

I just didn't see any existing screenshot, so I figured I might as well. No biggie.

Revision history for this message
In , Kevin Brosnan (kbrosnan) wrote :

*** Bug 286187 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Matti-mversen (matti-mversen) wrote :

*** Bug 288939 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Gabrielmayrandchadwick+mozilla (gabrielmayrandchadwick+mozilla) wrote :

*** Bug 288939 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Matti-mversen (matti-mversen) wrote :

*** Bug 290236 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Andreas-fink (andreas-fink) wrote :

I have already run into this bug multiple times: once with a Fedora DVD image and
once with Mac OS X DVD disk images. Both are around 2.5GB. So it's definitely time
to fix this.

It is VERY ANNOYING if you download a 2.5GB file (took me 4h) and end up
with a 2.0GB file without any error message whatsoever. You will burn it to DVD
and try to boot it. You will burn it again and again until you realize your image
file is too small for what it should be.

So you have already wasted more time on this than what it takes to fix it :).

Revision history for this message
In , Matti-mversen (matti-mversen) wrote :

*** Bug 293036 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Jaime-bugzilla (jaime-bugzilla) wrote :

*** Bug 293615 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Derek Balling (dredd) wrote :

2.5 Years into this bug, and I still have to go use "some other browser" if I
want to download DVD ISOs (for things like Linux, etc.).

Just another user voting that this really really needs to get fixed at some
point in the near future.

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

(In reply to comment #53)
> 2.5 Years into this bug, and I still have to go use "some other browser" if I
> want to download DVD ISOs (for things like Linux, etc.).

that is fixed (in versions newer than 1.0.x). my understanding is that this bug
refers to other places as well, not just downloads, and not all of those are fixed.

Revision history for this message
In , Andreas-fink (andreas-fink) wrote :

(In reply to comment #54)
> (In reply to comment #53)
> > 2.5 Years into this bug, and I still have to go use "some other browser" if I
> > want to download DVD ISOs (for things like Linux, etc.).
>
> that is fixed (in versions newer than 1.0.x). my understanding is that this bug
> refers to other places as well, not just downloads, and not all of those are
fixed.

This is not true. It is not fixed. Of course you can download 2.5GB files with
Firefox. The download just stops after 2048MB and you THINK it has downloaded
everything. It would have been nice to have a dialog box pop up at the
beginning saying the file is too big or such. But no, you have to wait hours and
hours to realize that your whole download is wasted bandwidth.

A very SERIOUS bug, especially since it's so old by now.
I had this again in Firefox 1.0.2, and I'm sure it's still there in 1.0.4 (and
no, I won't try it until someone confirms it's fixed, as downloading >2GB takes an
awful lot of time for me).

Revision history for this message
In , Rparenton (rparenton) wrote :

He said newer than 1.0.x, which means 1.1, which is still in alpha.

Revision history for this message
In , Tachyon-eagle (tachyon-eagle) wrote :

yes, I can confirm that it's fixed for the 1.1 tree. I am using a nightly snapshot
of Deer Park alpha 1 (2005.05.30) and I've just downloaded a 2.5 GB iso which
passes its own crc test ok. However, I haven't found a >4GB file on a fast nearby
network, so I don't know anything about >4GB files yet; but if there are 64-bit
interfaces already (and it seems like that from the source code to me) it will be
ok too.

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

> but if there are 64bit
> interfaces already (and it seems like that from source codes to me)

yeah, for downloads there are. not for uploads... and maybe some other stuff too.

Revision history for this message
In , Mozilla-potophobia (mozilla-potophobia) wrote :

(In reply to comment #55)
> And I'm sure its still there in 1.0.4

I am using 1.0.4, and I just downloaded FC3 via FTP, which is 2.3GB. The
download counter went negative near the end, but the download was still valid.
Firefox produced a 2.3GB file that contains data throughout the entire file, so
I believe it's legit. That would suggest the bug is /partially/ fixed.

Revision history for this message
In , Andreas-fink (andreas-fink) wrote :

I tried the same but in my case it used HTTP instead of FTP.
I have a file of 2117734496 bytes (1.97GB) instead of 2.7GB as it should be.
So the protocol type does make a difference.

Revision history for this message
In , Bojan-antonovic (bojan-antonovic) wrote :

(In reply to comment #57)
> yes, I can confirm that it's fixed for 1.1 tree , I am using nightly snapshot of
> Deer Park alpha 1 (2005.05.30) and I've just donwloaded 2.5 GB iso which passes
> its own crc test ok. However I haven't found > 4GB file on fast nearby network
> so I don't know anything about > 4GB files yet, but if there are 64bit
> interfaces already (and it seems like that from source codes to me) it will be
> ok too.

Debian 3.1 is >4 GB. See:
http://cdimage.debian.org/debian-cd/3.1_r0a/i386/iso-dvd/debian-31r0a-i386-binary-1.iso

(or use a mirror)

Firefox 1.0.4 stops at 2 GB.

Revision history for this message
In , Tachyon-eagle (tachyon-eagle) wrote :

(In reply to comment #61)
> Firefox 1.0.4 stopps at 2 GB.

yes, indeed, but use the 1.1 version tree (
http://ftp.mozilla.org/pub/mozilla.org/firefox/nightly/latest-trunk/ ), which
has it fixed by now, and you'll see that it works for > 4 GB with FTP or HTTP
(I've tried your iso link with both and ran verification on the files; they are
absolutely ok and over 4 GB)

Revision history for this message
In , Ken2006 (ken2006) wrote :

It appears that the download manager, or at least its support for byte-range
resume, does not work yet. When I try to resume ('retry', as the download mgr
calls it), the download manager app does not actually send the new (range) request...

sample: http://up.ascentmedia.com/upweb/test.jsp?file=FC4-i386-DVD.iso

Revision history for this message
In , Ken2006 (ken2006) wrote :

(In reply to comment #63)
> .. or at least support for byte-range-resume does not work yet.

Seems to be true for files less than 2GB as well.

Revision history for this message
In , Elmar-ludwig (elmar-ludwig) wrote :

*** Bug 301543 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Rflint (rflint) wrote :

*** Bug 308424 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Jrl-cats (jrl-cats) wrote :

*** Bug 231788 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Heinz-stuesser (heinz-stuesser) wrote :

I think there is another serious problem, and a reason for fixing this bug: when
trying to download files > 4GB (for example
ftp://sunsite.informatik.rwth-aachen.de/pub/Linux/suse/i386/9.3/iso/SUSE-9.3-Eval-DVD.iso)
the browser (firefox 1.0.7, running under SuSE Linux 9.1) crashes when reaching
the 4GB border. I suppose the reason might be a "division by zero", as the
average transfer rate flipped to negative numbers after passing 2GB and then
grew until it reached zero - and the browser crashed.

Revision history for this message
In , Ajschult (ajschult) wrote :

Heinz: that's a different problem. please file a new bug if you can reproduce it
with firefox 1.5 beta.

Revision history for this message
In , Kevin Brosnan (kbrosnan) wrote :

*** Bug 311344 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Bojan-antonovic (bojan-antonovic) wrote :

Firefox 1.5 RC1 (Mac) cuts files of length over 4 GB down to 4 GB when "downloading" from a file on the hard disk. The Debian ISOs are over 4 GB.

Bojan

Revision history for this message
In , Bojan-antonovic (bojan-antonovic) wrote :

Firefox 1.0.7 (Mac) also cuts files of length over 4 GB down to 4 GB, like Firefox 1.5 RC1, when "downloading" from a file on the hard disk. It shows the negative size of the downloaded part as described elsewhere in this bug report.

The local download is a good trick for testing the download manager. Could someone, ideally with a local network at home, please retest the 4 GB limit with all download methods (HTTP, FTP, hard disk) before Firefox 1.5 is released? Tests with HTTP and FTP seem to have been made and work. However, this bug should be fixed no matter how obscure the download method is.

Bojan

Revision history for this message
In , Rflint (rflint) wrote :

*** Bug 317085 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Bojan-antonovic (bojan-antonovic) wrote :

Local-to-local downloads are still limited to 4 GB in Firefox 1.5.

Bojan

Revision history for this message
In , Rflint (rflint) wrote :

*** Bug 320136 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Oldiesmann (oldiesmann) wrote :

This is still broken in Firefox 1.5... I just spent the past couple of hours trying to download the DVD image of Fedora Core 5 (~3GB in size), and it was running smoothly the entire time. Once it hit 2GB though, it just stopped and Firefox said the download was finished when in fact it really wasn't. The download manager showed the correct status and file size, but gave me no indication that it would quit automatically after 2GB.

Revision history for this message
In , Bugs-mozilla-org (bugs-mozilla-org) wrote :

I'm using Firefox 1.5.0.1 on Windows, and was downloading my purchase of Oblivion (which is slightly bigger than 4 GB). After 4 GB (and three hours), it stopped with a write error that suggested I try saving the file somewhere else.

Two problems:

1) It really needs to support > 32 bits. fpos_t exists for a reason.

2) As long as it doesn't support > 32 bits, it needs to tell me that it won't work, with a good error message, ideally before it tries to download.

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

this is the wrong bug. but as everyone keeps posting to it...

What filesystem are you saving to when it doesn't work in 1.5?

Revision history for this message
In , Cls-seawood (cls-seawood) wrote :

biesi, what's the correct bug? I noticed that after I downloaded the FC5 iso images, the DVD iso doesn't even show up in the directory listings. Via http this could very well be a bug in apache 2.0, but via file:// it's a moz bug. In fact, via file://, ff 1.5 won't even load the directory that contains the image. The FS is ext3.

Revision history for this message
In , Bugs-mozilla-org (bugs-mozilla-org) wrote :

I'm saving to NTFS when it doesn't work in 1.5.0.1, running Windows XP SP2.

Btw: If this is the wrong bug, then what bug should I re-direct to? This bug was what came up when doing a search.

Revision history for this message
In , Ken2006 (ken2006) wrote :

Yes - if this is the 'wrong bug', which one is correct? Is there one that is general to the file & stream interfaces and not just networking or Necko? The interfaces listed earlier appear to be core. I could not find a 'closer' bug...

Or do we need a new bug/RFE for the core file/stream (not network) interfaces? One that gets the attention of the specific owners? My searching did not find another bug/RFE for file/stream that suggests (rightfully) deprecating ALL 32-bit file/stream interfaces (deprecate the 32-bit ones and add adjunct new 64-bit interfaces, not change existing ones, as seems to have been implied here and as the reason for no action).

Otherwise this seems to be the closest match for well-doers (who are not product specialists) to express frustration (it is 3-1/2 years old and 'new', after all).

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

bug 243974 is right, and FIXED, which is what I thought the state of that issue was. the relevant download interfaces do support this; in fact, last I tested this it worked for me. new issues for specific download problems should get new bugs.

comment 54 describes what this bug is about.

Revision history for this message
In , Bugs-mozilla-org (bugs-mozilla-org) wrote :

User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.1) Gecko/20060111 Firefox/1.5.0.1
Build Identifier: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.8.0.1) Gecko/20060111 Firefox/1.5.0.1

This does not appear to be a dup of https://bugzilla.mozilla.org/show_bug.cgi?id=243974
This does not appear to be a dup of https://bugzilla.mozilla.org/show_bug.cgi?id=184452

When I purchased Elder Scrolls IV: Oblivion from Direct2Drive.com, and downloaded it, the download went for a few hours, and then failed with a disk write error ("could not write the file to this location, try saving it elsewhere").

The downloaded file is actually somewhat larger than 4 GB, and the failure was consistent with a failure at the 4 GB point; the file created was also 4 GB in size after the write error.

Because of the error message and the symptoms, I believe this to be different from the other two bugs mentioned.

Reproducible: Always

Steps to Reproduce:
1. Purchase Oblivion from Direct2Drive.com (on WXP SP2 machine running NTFS).
2. Download in Mozilla (rejecting their custom ActiveX download control).
3. Wait a few hours while downloading.

Actual Results:
Get error about not being able to write in the download directory, suggesting trying to save elsewhere.

Expected Results:
Download of file > 4 GB should complete correctly, or, absolutely worst case, give me the error up-front.

Given that this was a file write error, I think the problem is with the file output part, not with the network reading part.

Revision history for this message
In , Bugs-mozilla-org (bugs-mozilla-org) wrote :

This is not the right bug for file download disk writing problems with files > 4 GB. Also, bug 243974 was not consistent with the symptoms I saw in 1.5.0.1.

See new bug: https://bugzilla.mozilla.org/show_bug.cgi?id=331647

Revision history for this message
In , Dtownsend (dtownsend) wrote :

What is the URL of the file you are downloading?

Revision history for this message
In , Steve-england (steve-england) wrote :

NTFS has a 4GB file limit?

Revision history for this message
In , Steve-england (steve-england) wrote :

OK, ignore comment 2 - the limit is for FAT32. Which leads me to ask: were you downloading to a FAT32 partition?

Revision history for this message
In , Bugs-mozilla-org (bugs-mozilla-org) wrote :

As I said, I'm using NTFS on Windows XP SP2. No FAT anywhere.

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

that testing this requires a purchase first makes it somewhat hard to debug :(

Revision history for this message
In , Dbradley (dbradley) wrote :

My first thought was that something probably wasn't using the extended size of a Win32 API function, but I would expect that to have just processed a small part of the file. So it's probably something a little less straightforward.

I would suspect a quick test would be to host a >4gb file on a local web server and then try to download it. I wouldn't think that would be all that hard.

Revision history for this message
In , Vseerror (vseerror) wrote :

*** Bug 299598 has been marked as a duplicate of this bug. ***

Revision history for this message
In , efa (efa) wrote :

This bug is directly linked from the kernel.org FAQ, so we get a lot of requests to fix it.
http://www.kernel.org/faq/#largefiles

Revision history for this message
In , Bugzilla-mcsmurf (bugzilla-mcsmurf) wrote :

Ok, I did some tests here on Windows 2000 with an NTFS drive and a current SeaMonkey trunk build. I created a test file for this with "cat /dev/zero > 4_gb.test".
First test: download a 4.76 GB (5,119,916,032 bytes) file from file:///I:/4_gb.test to I:\4_gb.test2.test (I: is an NTFS drive), i.e. a local download. At a size of 3.99 GB (4,294,967,295 bytes, that is 2^32 - 1) it stops the download with "Finished, -1 KB downloaded" (in the download progress window).
Second test: download the same file from an Apache 2.0.55 server via http://localhost/i/4_gb.test. At a size of 4.00 GB (4,294,967,296 bytes) you get an alert "I:\4_gb.test3.test.part could not be saved, because an unknown error occurred. Try saving to a different location."
Third test: download the same file from an Apache 2.0.55 server via http://localhost/i/4_gb.test, but this time with wget. The download works fine.

Feel free to repeat those tests with Windows XP if you want.

Revision history for this message
In , Bugzilla-mcsmurf (bugzilla-mcsmurf) wrote :

*** Bug 334496 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Mime (mime) wrote :

This 4GB size limit is a problem that I've run into in developing FireFTP. It seems the problem lies with nsIBinaryOutputStream.writeBytes:

Error: [Exception... "Component returned failure code: 0x80004005 (NS_ERROR_FAILURE) [nsIBinaryOutputStream.writeBytes]" nsresult: "0x80004005 (NS_ERROR_FAILURE)" location: "JS frame :: chrome://fireftp/content/js/connection/dataSocket.js :: anonymous :: line 258" data: no]

Revision history for this message
In , Dispater42 (dispater42) wrote :

Can be confirmed when trying to download the x64 version of Windows Vista beta 2, which exceeds 4 GB by 13.7 MB.

Revision history for this message
In , S-chapel (s-chapel) wrote :

(In reply to comment #10)
> Can be confirmed trying to download the x64-version of Windows Vista beta 2

Can you give the URL of this file?

Revision history for this message
In , Hsds04a (hsds04a) wrote :

(In reply to comment #11)
> (In reply to comment #10)
> > Can be confirmed trying to download the x64-version of Windows Vista beta 2
>
> Can you give the URL of this file?
>

The URL is http://download.windowsvista.com/dl/preview/beta2/en/x64/iso/vista_5384.4.060518-1455_winmain_beta2_x64fre_client-LB2CxFRE_EN_DVD.iso

I ran into the same problem trying to download this (with Firefox 1.5.0.4, Windows XP SP2, NTFS) -- it fails at exactly 4GB.

Revision history for this message
In , Vyv03354 (vyv03354) wrote :

Fixed by bug 340956.

Revision history for this message
In , Ria-klaassen (ria-klaassen) wrote :

*** Bug 350903 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Sciguyryan (sciguyryan) wrote :

*** Bug 350903 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Wallykramer (wallykramer) wrote :

Besides DVDs and video media files, I keep running into this bug downloading the English Wikipedia for offline processing. The compressed version of the database (one entry per article, without revision history) has been over 2 Gbytes for some time. A recent image is at http://download.wikipedia.org/enwiki/20060920/enwiki-20060920-pages-meta-current.xml.bz2

I hope SeaMonkey/Firefox/Mozilla will be able to download this before Microsoft fixes XP's built-in ftp utility....

Revision history for this message
In , Dongjian-btest (dongjian-btest) wrote :

Hello, I give my vote to you.

Revision history for this message
In , Hskupin (hskupin) wrote :

*** Bug 304161 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Cbook (cbook) wrote :

*** Bug 336001 has been marked as a duplicate of this bug. ***

Revision history for this message
In , Lavr (lavr) wrote :

Sites that post multi-gigabyte data files have become common reality.
Firefox cannot handle them gracefully (neither can IE, but who cares?).
Of today's mainstream browsers (not counting command line tools),
only the latest Opera was able to download this file correctly
(i.e. completely):

ftp://ftp.ncbi.nih.gov/pub/geo/DATA/supplementary/series/GSE2109/GSE2109_RAW.tar
(and there are many more similarly huge files there)

So I am voting for this bug!

Revision history for this message
In , Auru (auru) wrote :

My download (MSDN Service Pack 1)
http://download.microsoft.com/download/8/5/4/854f7409-47bd-41a2-b3b2-1a4875294550/MSDVDEUDVDX1370478.img
stopped at 2 GB (90 %).
But this download has 2,320,840,704 bytes.
So I cannot download files greater than 2 GB with Firefox.

Revision history for this message
In , Mozbugz-dougt (mozbugz-dougt) wrote :

mass reassigning to nobody.

Revision history for this message
In , bytewise (bytewise) wrote :

(In reply to comment #94 by Josef Goldbrunner)
> http://download.microsoft.com/download/8/5/4/854f7409-47bd-41a2-b3b2-1a4875294550/MSDVDEUDVDX1370478.img

This download (2.2G) works for me without problems (Firefox 2.0.0.6, actually Debian's Iceweasel). The download completes without errors, and the DVD image is ok. Which version of Firefox were you using?
Could this possibly be a build difference between Josef's version and mine (Windows vs. Linux)?

Also, I have no problems downloading files larger than 4G via http and ftp. Why am I not hitting this bug?

Revision history for this message
In , Jo-hermans (jo-hermans) wrote :

*** Bug 412262 has been marked as a duplicate of this bug. ***

Revision history for this message
Alexander Sack (asac) wrote : Re: [Bug 235200] [NEW] cannot upload files greater than 4 GB

On Tue, May 27, 2008 at 12:57:04PM -0000, Ulli Horlacher wrote:
> Public bug reported:
>
> Binary package hint: firefox-3.0
>
> System is:
>
> Ubuntu 8.04 with kernel 2.6.24-16-generic x86_64
> Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9b5) Gecko/2008041515 Firefox/3.0b5
> 3.0~b5+nobinonly-0ubuntu3
>
> When using a HTML upload form (multipart/form-data) and attaching a file
> with 5 GB, firefox only sends 1 GB.

most likely known upstream. we should find the right bug id in
bugzilla.mozilla.org

 affects xulrunner
 status incomplete

 affects ubuntu/firefox-3.0
 status incomplete
 importance low

 - Alexander

Changed in firefox-3.0:
importance: Undecided → Low
status: New → Incomplete
Revision history for this message
In , Jim Michaels (jmichae3-yahoo) wrote :

I am having this problem with both firefox 3.0 and 2.0.0.14 for windows; they truncate downloads to 2GiB.
Somebody's using a signed 32-bit integer somewhere...
One of the places I am having this problem with is www.opensuse.com, trying to download the openSUSE 11.0 DVD, which is 4.3GB.
I have files on my web site which are also 4.6GB.

*please* fix!

Alexander Sack (asac)
Changed in xulrunner:
importance: Undecided → Unknown
status: Incomplete → Unknown
Changed in firefox-3.0:
status: Incomplete → Triaged
Changed in xulrunner:
status: Unknown → Confirmed
Revision history for this message
In , Joe Amenta (airbreather) wrote :

Created an attachment (id=406896)
Firefox 3.5 passed the 2GB mark successfully

Running Firefox 3.5.3, I was able to download the 10.1 Gentoo Live DVD, a 2.6 GB file found at http://distfiles.gentoo.org/releases/amd64/10.1/livedvd-amd64-multilib-10.1.iso

User Agent string: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.9.1.3) Gecko/20090910 Ubuntu/9.04 (jaunty) Firefox/3.5.3

Package information: 3.5.3+build1+nobinonly-0ubuntu0.9.04.2

This package was built from http://archive.ubuntu.com/ubuntu/pool/universe/f/firefox-3.5/firefox-3.5_3.5.3+build1+nobinonly.orig.tar.gz patched with http://archive.ubuntu.com/ubuntu/pool/universe/f/firefox-3.5/firefox-3.5_3.5.3+build1+nobinonly-0ubuntu0.9.04.2.diff.gz with cosmetic local modifications following the steps on the first post of http://ubuntuforums.org/showthread.php?t=1225754

Is it safe to mark this bug as fixed?

Revision history for this message
In , Jim Michaels (jmichae3-yahoo) wrote :

I know the FF 3.5.2 download manager handles files over 2GiB. I just tested it with a local web page and a DVD ISO file.

Revision history for this message
In , Wallykramer (wallykramer) wrote :

Agreed. Just downloaded the 10,181.73 MiB (9.94 GiB) file from http://download.wikimedia.org/enwiki/20091009/enwiki-20091009-pages-meta-current.xml.bz2 using FF 3.5.3 on Windows XP SP2. It's fixed. It works great.

Revision history for this message
In , Dbradley (dbradley) wrote :

Looks like all the bugs this is depending on are fixed and so this is now working. Marking fixed.

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

I always saw this bug as being about all the parts of necko that handle 32-bit file sizes, and not all of those are fixed. In particular, upload isn't...

Revision history for this message
In , Dbradley (dbradley) wrote :

Oh, true, completely forgot about upload. Can't seem to find any place where I could even begin to upload a >2 gig file.

Revision history for this message
In , Guillaume Parent (gparent) wrote :

You do it locally, with a .html page saved on your disk (or a local webserver) that has some sort of upload system on it. Unfortunately the only upload system I know is Uber-Uploader, and I've never checked if it handles 2 GB files.

Revision history for this message
In , Mime (mime) wrote :

(In reply to comment #104)
> Oh, true, completely forgot about upload. Can't seem to find any place where I
> could even begin to upload >2gig file.

You can upload multi-GB files on YouTube (through regular POST, not flash upload): http://www.youtube.com/my_videos_upload?nobeta

Cheers from the YT team - we hope this can be fixed soon obviously :P

For reference, to make large files:
Linux:
dd if=/dev/zero of=4gbfile bs=1024 count=4194304
Windows:
fsutil file createnew d:\temp\4gbfile.txt 4294967296

Revision history for this message
In , Guillaume Parent (gparent) wrote :

That page specifically states that the files are to be up to 2 GB.

Revision history for this message
In , Mime (mime) wrote :

(In reply to comment #107)
> That page specifically states that the files are to be up to 2 GB.

So, if you try testing with a file larger than 2GB, yes, it will be marked as 'too big'. But the POST should upload completely (that is, if this bug were fixed :P)

If you need an additional partner account to test with, let me know (those are up to 20GB - email me directly). Although, I do see that you guys have the http://www.youtube.com/firefox channel - maybe you can use that one for testing? (obviously making the test videos just private ones)

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

(In reply to comment #106)
> For reference, to make large files:
> Linux:
> dd if=/dev/zero of=4gbfile bs=1024 count=4194304

You really want to use:
dd if=/dev/zero of=/tmp/4gbfile bs=1024 seek=4194304 count=1
faster, and doesn't actually require 4 GB of disk space :)
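
The seek= variant is fast because the file comes out sparse: only the final 1 KiB block is actually written, and the filesystem stores a hole for the rest. An equivalent sketch in C++ using POSIX ftruncate (assuming a filesystem that supports holes; the path is illustrative):

  #define _FILE_OFFSET_BITS 64   // make off_t 64-bit on 32-bit platforms
  #include <cstdio>
  #include <fcntl.h>
  #include <unistd.h>

  int main() {
      // 4 GiB + 1 KiB, the same end size as the dd invocation above
      int fd = open("/tmp/4gbfile", O_WRONLY | O_CREAT | O_TRUNC, 0644);
      if (fd < 0) { perror("open"); return 1; }
      off_t size = (off_t)4194304 * 1024 + 1024;
      if (ftruncate(fd, size) != 0) { perror("ftruncate"); return 1; }
      close(fd);
      return 0;
  }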

Changed in xulrunner:
importance: Unknown → Medium
Revision history for this message
ruediix@gmail.com (ruedii) wrote :

Unfixable in Firefox 3.0: that branch is discontinued and the fix was not implemented until 3.5.3, thus no patch is available for people insisting on staying with 3.0.x for some reason.

However, Hardy and later have 3.6.x in their security updates for Firefox 3.0. This fixes this bug as well as several security issues.

Fix:
Use the security update that switches to the newer 3.6.x version on Hardy and later.
Use upgrade packages from Mozilla.org or other sources for releases older than Hardy.

This is generally a good idea anyway, as older releases of Firefox have notable security bugs that are fixed in the latest releases.

Changed in firefox-3.0 (Ubuntu):
status: Triaged → Fix Released
Revision history for this message
ruediix@gmail.com (ruedii) wrote :

I found further data; I am uncertain of the upload status. However, all the related download bugs included here are fixed.

I have downloaded a very large number of 4.7GB files by HTTP in Firefox since this release. However, my biggest uploads have never hit the 4GB limit yet.

I get mixed reports on upload; there may or may not be remaining issues with it. The standard workaround is to use an external program for file-specific transfers when handling uploads over 4GB. However, this isn't nearly as easy with forms.

Much advanced server software has limitations of 2GB or 4GB. This usually isn't related to the HTTP server software itself, but to the additional programs and libraries used by PHP or other CGI.

Since many servers don't support the upload of large files by HTTP, this could be an issue. You need to verify that it can function.

Here is an outline of a test to verify the presence of the bug (a minimal server sketch follows the list):

1. Set up, on your controlled home network, loopback device, or VM, an HTTP server verified to handle large files that does not use external programs for any file transfer handling.

2. Set it up to have a forms-based page that simply tells the server to upload the specific file from the client, without having any external programs handle the file. (External programs may handle the filename and the forms that process it and relay it back to the server.)

3. Try to upload a large file to said local server using Firefox.

4. Check the file in the server's directory to see if it uploaded properly.
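
As a rough starting point for step 1, here is a minimal sketch (assumptions: POSIX sockets, loopback port 8080, a single connection, no real HTTP handling) that prints the Content-Length the client announces and counts the body bytes that actually arrive; with a truncating client, the announced value itself will already be the size modulo 2^32:

  // large-upload-check.cpp - minimal single-shot POST listener (illustrative only)
  #include <arpa/inet.h>
  #include <netinet/in.h>
  #include <sys/socket.h>
  #include <unistd.h>
  #include <cinttypes>
  #include <cstdio>
  #include <cstring>
  #include <string>

  int main() {
      int srv = socket(AF_INET, SOCK_STREAM, 0);
      int on = 1;
      setsockopt(srv, SOL_SOCKET, SO_REUSEADDR, &on, sizeof(on));
      sockaddr_in a{};
      a.sin_family = AF_INET;
      a.sin_addr.s_addr = htonl(INADDR_LOOPBACK);
      a.sin_port = htons(8080);                     // arbitrary test port
      if (bind(srv, (sockaddr *)&a, sizeof(a)) || listen(srv, 1)) {
          perror("bind/listen");
          return 1;
      }
      int c = accept(srv, nullptr, nullptr);
      if (c < 0) { perror("accept"); return 1; }

      // Read until the blank line that ends the request headers.
      std::string req;
      char buf[65536];
      size_t hdrEnd;
      for (;;) {
          ssize_t n = read(c, buf, sizeof(buf));
          if (n <= 0) return 1;
          req.append(buf, (size_t)n);
          hdrEnd = req.find("\r\n\r\n");
          if (hdrEnd != std::string::npos) break;
      }
      // The announced Content-Length is the interesting number: a client
      // with this bug announces the size modulo 2^32, not the real size.
      uint64_t announced = 0;
      const char *cl = strstr(req.c_str(), "Content-Length:");
      if (cl) sscanf(cl, "Content-Length: %" SCNu64, &announced);
      printf("announced Content-Length: %" PRIu64 "\n", announced);

      // Drain the announced body and count what actually arrives.
      uint64_t got = req.size() - (hdrEnd + 4);
      while (got < announced) {
          ssize_t n = read(c, buf, sizeof(buf));
          if (n <= 0) break;                        // client gave up early
          got += (uint64_t)n;
      }
      printf("body bytes received:      %" PRIu64 "\n", got);

      const char resp[] = "HTTP/1.0 200 OK\r\nContent-Length: 2\r\n\r\nok";
      write(c, resp, sizeof(resp) - 1);
      close(c);
      close(srv);
      return 0;
  }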

Revision history for this message
Ulli Horlacher (framstag) wrote : Re: [Bug 235200] Re: cannot upload files greater than 4 GB

On Thu 2011-04-07 (14:28), <email address hidden> wrote:

> I found further data, I am uncertain of the upload status. However,
> all related download bugs included are fixed.
>
> I have downloaded a very large number of 4.7GB by HTTP in firefox files since this release.
> However, my biggest uploads have never hit the 4GB limit yet.

Download was never a problem with firefox, but upload (HTTP POST) was.
And the bug is still there with firefox 3.6.16 (tested on Ubuntu 8.04.4).

> issues with upload. The standard workaround is to use an external
> program for file-specific transfers when handling uploads over 4GB.

This workaround does not work for users who are not allowed to install
software. Besides, "fixing bugs" by substituting the software with
other software is a bad idea.

> Many server advanced server software has limitations of 2GB or 4GB.
> This usually isn't related to the HTTP server software itself, but the
> additional programs and libraries used by the PHP or other CGI .
>
> Since many servers don't support the upload of large files by HTTP, this
> could be an issue. You need to verify that it can function.

I have written my own webserver for this kind of file transfer, which
is already an ubuntu package; see
http://packages.ubuntu.com/lucid/fex

> Here is an outline for test of verification of presence of bug.
> 1. Set up on your controlled home network, loopback device, or VM, an HTTP server verified to handle large files that does not utilize external programs for any file transfer handling.
>
> 2. Set it up to have a Forms based page that simply tells the server to
> upload the specific file from the client without having any external
> programs handle the file. (External programs may handle the filename
> and forms to process them and relay them back to the server.)
>
> 3. Try to upload of large file to said local server utilizing Firefox.
>
> 4. Check the file in directory of server, to see if it uploaded
> properly.

I have tested it with a 2 GB file, and in the fexserver log I see no
connect at all. Uploads of files < 2 GB are ok:

CONNECT:8080 2011-04-18 11:58:28 diaspora.rus.uni-stuttgart.de 129.69.13.139 [26471_0]
POST /fup HTTP/1.1
User-Agent: Mozilla/5.0 (X11; U; Linux i686; en-US; rv:1.9.2.16) Gecko/20110323 Ubuntu/8.04 (hardy) Firefox/3.6.16
Referer: http://fex.rus.uni-stuttgart.de:8080/fup
Content-Length: 2,098,221

You can test it by yourself with:
http://fex.rus.uni-stuttgart.de:8080/fup?skey=57e5ae9d103e4f19622bf4322f1a8609

--
Ullrich Horlacher Server- und Arbeitsplatzsysteme
Rechenzentrum E-Mail: <email address hidden>
Universitaet Stuttgart Tel: ++49-711-685-65868
Allmandring 30 Fax: ++49-711-682357
70550 Stuttgart (Germany) WWW: http://www.rus.uni-stuttgart.de/

Revision history for this message
In , Silviu Marin-Caea (silviumc) wrote :

So what's up with this bug?

Firefox 15 still truncates downloads to 4 GB.

Chromium downloaded all 4.2 GB. I'm not trolling here, just noting this to show that it wasn't a server or filesystem/OS problem (openSUSE 12.3 Factory).

Revision history for this message
In , Josh Aas (joshmoz) wrote :

This might work in Firefox 18 now that bug 784912 has been fixed.

Revision history for this message
In , Josh Aas (joshmoz) wrote :

Also probably relevant that bug 215450 was fixed recently.

Revision history for this message
In , Christian Biesinger (cbiesinger) wrote :

Guys, downloads have been working for a long time now. If they're not, please do file a new bug with steps to reproduce.

Since uploads are also fixed, I'm resolving this bug.

Changed in xulrunner:
status: Confirmed → Fix Released