Question on files
Hi,
I am using PHP to build a website related to biology. The site uses PHP code to communicate with external programs (not written in PHP, but various biological packages). These programs write their output to temporary files, which I then read using fread() or file_get_contents().
My problem is that these functions seem to have a restriction on the size of the file they can read. Is there another way to read big files (around 20-30MB, for example)?
Please note that there is no other way for me to do this, because these programs write their output to a text file by default, so I can't use PHP's system commands and read the output on the fly. The output file is created first, and then I read it into a string and parse it for what I need.
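Roughly what I am doing now looks like this (the program name and file path below are just placeholders for my real ones):

<?php
// The external package is launched and writes its results to a text file.
$outputFile = '/tmp/analysis_output.txt';
exec('some_bio_tool -in input.fasta -out ' . escapeshellarg($outputFile));

// The whole result is then pulled into one string and parsed.
// With 20-30MB output files this is where things seem to break.
$data = file_get_contents($outputFile);
// ... parse $data ...
?>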
Busy posted this at 08:42 — 2nd May 2006.
He has: 6,151 posts
Joined: May 2001
I think the limit is 8MB (PHP's default memory_limit setting).
You can change the limit if you have access to the php.ini file; on shared hosting, usually only the host can edit it.
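If your host hasn't locked the setting, you may also be able to raise it at runtime instead of editing php.ini (the 64M value below is just an example):

<?php
// Raise the memory limit for this script only; this has no effect if the
// host has disabled ini_set() or locked down memory_limit.
ini_set('memory_limit', '64M');
?>

On Apache with mod_php you could also try a php_value memory_limit line in an .htaccess file, again only if the host allows it.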
Your bandwidth must be massive
CptAwesome posted this at 16:30 — 4th May 2006.
He has: 370 posts
Joined: Dec 2004
Your other alternative is to do partial reads and free the memory as you go, so you never hit the 8MB max. You can use fgets().
see: http://ca.php.net/manual/en/function.fgets.php
Example 1. Reading a file line by line
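A rough sketch of the idea (the file name is just a placeholder):

<?php
// Open the big output file and read it one line at a time,
// so only the current line is held in memory.
$handle = fopen('/tmp/analysis_output.txt', 'r');
if ($handle) {
    while (($line = fgets($handle, 4096)) !== false) {
        // parse $line here instead of loading the whole file at once
    }
    fclose($handle);
}
?>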