Slow computer, large PHP script
Gah, I hate my computer. I just ran a PHP script in the DOS shell and it took 2.6 hours. Why? Because it listed every file and its size on a friend's FTP. I was bored and decided to test out my FTP 'skillz'. The flat-file I used to store the data is nearly a meg, 974,570 bytes to be exact, and has 15,830 lines of data. The time before that, it stopped about halfway through and returned a size of -1 for the rest of the files, which pissed me off. Oh well. Do you think there is a way to speed this up:
<?php
// Allow up to 24 hours of run time
$t = 3600 * 24;
set_time_limit($t);
error_reporting(E_ALL);
$start = time();
$to_add = Array();
$ftps2check[] = Array(
    // The following values (minus FTPN) have been changed for safety reasons
    "FTPN" => "Gigs of Greatness",
    "HOST" => "site.ftp.com",
    "PORT" => "****",
    "USER" => "****",
    "PASS" => "****"
);
foreach ($ftps2check as $ftp) {
    // Skip this FTP if we connected less than 70 seconds ago
    $fp = fopen($ftp["FTPN"]."-lastcon.txt", "r");
    if ($fp) {
        $last = trim(fread($fp, filesize($ftp["FTPN"]."-lastcon.txt")));
        fclose($fp);
        $now = time();
        if (($now - $last) < 70) {
            echo "Not yet for <b>".$ftp["FTPN"]."</b><br>";
            exit;
        }
    }
    $conn_id = ftp_connect($ftp["HOST"], $ftp["PORT"]);
    $login_result = ftp_login($conn_id, $ftp["USER"], $ftp["PASS"]);
    // check connection
    if ((!$conn_id) || (!$login_result)) {
        die("FTP connection has failed!");
    }
    echo "Listing for ".$ftp["FTPN"]."\n";
    // record connection time :)
    $fp = fopen($ftp["FTPN"]."-lastcon.txt", "w");
    fwrite($fp, time());
    fclose($fp);
    // start the listing file fresh ("w" mode truncates it)
    $fp = fopen($ftp['FTPN']."-list.new.txt", "w");
    fclose($fp);
    echo "Working.....\n";
    Browse($conn_id, "/");
}
echo "Done";
ftp_close($conn_id);
$end = time();
$total = $end - $start;
// Convert the elapsed seconds into the largest sensible unit
$time = $total;
$units = Array(" s", " m", " h", " d", " w");
$steps = Array(60, 60, 24, 7); // s->m, m->h, h->d, d->w
$blurg = 0;
while ($blurg < 4 && $time >= $steps[$blurg]) {
    $time = $time / $steps[$blurg];
    $blurg++;
}
$mins = $time.$units[$blurg];
echo "\nTime Spent: ".$mins."\n";
$fp = fopen("Gigs of Greatness-time.txt", "w");
fwrite($fp, $total);
fclose($fp);
exit;

function Browse($c, $d) {
    global $ftp, $to_add;
    $load_file = $ftp['FTPN']."-list.new.txt";
    $list = ftp_nlist($c, $d);
    foreach ($list as $e) {
        if (substr($d, -1) != '/') $d .= "/";   // from "this" to "this/"
        if (substr($e, 0, 1) == '/') $e = substr($e, 1); // from "/this" to "this"
        $p = $d.$e;
        // note: ftp_size() returns -1 for directories and on failure
        $s = GetFriendlySize(ftp_size($c, $p));
        if (@ftp_chdir($c, $p)) {
            // Directory: log it, then recurse into it
            if (substr($e, -1) != '/') $e .= "/";
            if (substr($p, -1) != '/') $p .= "/";
            $fp = fopen($load_file, "a");
            fwrite($fp, $p."\n");
            fclose($fp);
            ftp_chdir($c, $d);
            Browse($c, $p);
        } else {
            // File: record its name and size
            $to_add[$ftp['FTPN']][] = " ".$e."\t".$s."\n";
            $fp = fopen($load_file, "a");
            fwrite($fp, "\t".$e."\t".$s."\n");
            fclose($fp);
        }
    }
}

function GetFriendlySize($s) {
    // Unit boundaries are powers of 1024
    if ($s <= 1024) return $s." bytes";
    else if ($s <= 1048576) $s = ($s / 1024)." kilobytes";
    else if ($s <= 1073741824) $s = ($s / 1048576)." megabytes";
    else if ($s <= 1099511627776) $s = ($s / 1073741824)." gigabytes";
    else $s = ($s / 1099511627776)." terabytes";
    return $s;
}
?>
To see the flat-file go [url=http://www.logsdon.org/xjamesx/ftp/Gigs of Greatness-list.new.txt]here[/url]
EDIT: Just so you know, those AREN'T all mp3s... right
[James Logsdon]
Mark Hensler posted this at 08:29 — 17th December 2003.
He has: 4,048 posts
Joined: Aug 2000
Hmmm... I don't think so. But I didn't examine it more closely than a single read-through. It's your basic recursive function for directory browsing, and there's not much you can do to optimize that.
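One thing that might be worth testing, though: each entry currently costs an ftp_size() plus an ftp_chdir() on top of the ftp_nlist(), so most of those 2.6 hours are probably server round-trips. ftp_rawlist() returns names and sizes in a single LIST command per directory. A rough, untested sketch, reusing your GetFriendlySize() (the parsing assumes UNIX-style LIST output, which not every server sends):
<?php
// Sketch: walk a tree with one LIST round-trip per directory.
// Assumes UNIX-style lines: perms links owner group size month day time name
function BrowseRaw($c, $d) {
    $lines = ftp_rawlist($c, $d);
    if (!is_array($lines)) return;
    foreach ($lines as $line) {
        $parts = preg_split('/\s+/', $line, 9); // 9th field is the name
        if (count($parts) < 9) continue;
        $name = $parts[8];
        $size = $parts[4];
        if (substr($line, 0, 1) == 'd') {
            // Directory: print it, then recurse (no ftp_chdir needed)
            echo $d.$name."/\n";
            BrowseRaw($c, $d.$name."/");
        } else {
            // File: size came back in the same LIST line, no ftp_size() call
            echo "\t".$name."\t".GetFriendlySize($size)."\n";
        }
    }
}
?>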
Of course, you could venture out into the world of forking (link). But I've never been there, and I honestly am not sure whether you would see any performance increase, because I'd assume each fork would need its own FTP connection.
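If you want to experiment anyway, a minimal sketch with PHP's pcntl extension (CLI only) might look like the following; one child per top-level directory is just an assumption, and each child opens its own connection, reusing your $ftp array and Browse() function:
<?php
// Sketch: fork one worker per top-level directory.
// FTP connections can't be shared across processes, so each child opens its own.
$dirs = ftp_nlist($conn_id, "/");
$pids = Array();
foreach ($dirs as $dir) {
    $dir = ltrim($dir, "/");
    $pid = pcntl_fork();
    if ($pid == -1) {
        die("fork failed");
    } else if ($pid == 0) {
        // Child: fresh connection, walk one subtree, then exit
        $c = ftp_connect($ftp["HOST"], $ftp["PORT"]);
        ftp_login($c, $ftp["USER"], $ftp["PASS"]);
        Browse($c, "/".$dir."/");
        ftp_close($c);
        exit(0);
    }
    $pids[] = $pid; // parent: remember the child and move on
}
// Parent waits for every child to finish
foreach ($pids as $pid) {
    pcntl_waitpid($pid, $status);
}
?>
Whether that buys anything depends entirely on the server allowing parallel logins.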
You obviously have FTP access. So can I assume you also have Telnet/SSH access?
Not the prettiest thing, I know. But it gives you the idea...
#!/bin/sh
# Recursively print directories, then the files in each with their sizes

indent=3
dir=`pwd`
tab=$indent

if [ $# -ge 1 ]; then
    dir=$1
fi
if [ $# -ge 2 ]; then
    tab=$2
fi

next=`expr $tab + $indent`

# Recurse into each subdirectory (-p marks them with a trailing '/')
ls -p $dir | grep '/' | grep -v '@' | while read i; do
    echo "$dir/$i" $tab
    ./list_mp3.sh "$dir/$i" $next
done

# Print the plain files in this directory, indented $tab spaces
ls -lhp $dir | grep -v '/' | awk '
{
    if (NR > 1) {
        printf "%'$tab's %sB\t%s\n", " ", $5, $9
    }
}
'
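To try it, you'd save it as list_mp3.sh (it calls itself by that name, so run it from the directory it lives in), make it executable, and point it at the top of the tree; the path below is just an example:
chmod +x list_mp3.sh
./list_mp3.sh /path/to/files > listing.txt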
Mark Hensler
If there is no answer on Google, then there is no question.
necrotic posted this at 17:09 — 17th December 2003.
He has: 296 posts
Joined: May 2002
Wish I had shell access, but I don't. This is running on my friend's computer, and he doesn't want to give out shell access, unfortunately. Thanks anyway, Mark. I'll check out forking, but it probably won't help, since the server only allows one connection per IP every 60 seconds.
[James Logsdon]