I had a similar problem with files of about 6 MB. I looked into PHP's file-read limits and found that the download code relies on readfile(), which can fail on large files on some servers. I located the download code and replaced it with a function that reads the file in chunks, which works around the readfile() limit.
It took ages to work out, but it seems to have worked for me, and I would be interested to hear if it works for others too...
In the file download.php, just before
function get_remote_file($url) {
(approximately line 33), insert:
function readfile_chunked($filename, $retbytes = true) {
    $chunksize = 1 * (1024 * 1024); // how many bytes per chunk
    $buffer = '';
    $cnt = 0;
    $handle = fopen($filename, 'rb');
    if ($handle === false) {
        return false;
    }
    while (!feof($handle)) {
        $buffer = fread($handle, $chunksize);
        echo $buffer; // send this chunk to the client
        ob_flush();
        flush();
        if ($retbytes) {
            $cnt += strlen($buffer);
        }
    }
    $status = fclose($handle);
    if ($retbytes && $status) {
        return $cnt; // return number of bytes delivered, like readfile() does
    }
    return $status;
}
Also, to call the alternative function, within
function get_file_data($file_path) {
replace
@readfile($file_path);
with
@readfile_chunked($file_path);
The readfile_chunked() function is not originally my own work, but was adapted from
http://ch2.php.net/manual/en/function.readfile.php#54295
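If you want to convince yourself the chunked approach behaves as expected before patching download.php, the core loop can be exercised on its own. The sketch below is not from the plugin; the sample file, its 3 MB size, and the byte counter are made-up values purely to make it runnable:

```php
<?php
// Minimal, self-contained sketch of the chunked-read idea behind
// readfile_chunked(): read a file in fixed-size pieces instead of
// loading it into memory all at once. The temp file here stands in
// for a real download.
$path = tempnam(sys_get_temp_dir(), 'dl');
file_put_contents($path, str_repeat('x', 3 * 1024 * 1024)); // 3 MB sample

$chunksize = 1024 * 1024; // 1 MB per chunk, as in readfile_chunked()
$sent = 0;
$handle = fopen($path, 'rb');
while (!feof($handle)) {
    $buffer = fread($handle, $chunksize);
    // echo $buffer;  // in a real download this chunk goes to the client
    $sent += strlen($buffer);
    flush();
}
fclose($handle);
unlink($path);

echo "sent $sent bytes\n"; // → sent 3145728 bytes
```

In a real download you would also send Content-Type and Content-Length headers before the loop, then echo each chunk instead of just counting it.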