anyone know how to save a url request to a txt file? - through cgi, I want to access another page and save its HTML to a file

They have: 40 posts

Joined: Apr 1999

I wondered if anybody has any examples of how to (through Perl) have the program access another page (say yahoo.com, for example) and save the actual HTML to a file.

For the project I am doing, I can do all my manipulation with the HTML file already saved, but now I need to do it on the fly.

Thanks

They have: 2 posts

Joined: May 1999

I believe you are referring to the LWP module, examples of which can be found in the excellent Perl Cookbook.

I believe the examples are even downloadable from:
http://perl.oreilly.com/

They have: 5,633 posts

Joined: Jan 1970

Example code:

use LWP::Simple;

# get() returns the page body as a string, or undef on failure
my $page = get("http://www.yahoo.com");
die "Couldn't fetch the page\n" unless defined $page;

# ">>" appends; use ">" instead to overwrite the file each time
open(F, ">>file.txt") or die "Can't open file.txt: $!";
print F $page;
close F;

This will gather Yahoo!'s main page and save it to file.txt.
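(By the way, LWP::Simple also provides getstore(), which does the fetch and save in one call. A minimal sketch; the file name here is just an example:)

use LWP::Simple;

# getstore() fetches the URL and writes the body straight to disk,
# returning the HTTP status code of the response
my $status = getstore("http://www.yahoo.com", "file.txt");
die "Fetch failed with status $status\n" unless is_success($status);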

Cheers!
Federico

They have: 40 posts

Joined: Apr 1999

Thank you for the responses!

They have: 40 posts

Joined: Apr 1999

Actually, I wanted to use this as an exercise in Perl. I just started reading up on it and I am very new to this! I was trying to pull stock tickers from Yahoo. I was able to write the script and changed permissions to 755, but it errored out. Yet when I telnet to my server and type "perl (program name).pl", it works?

You said there were some better ways. Could you please expand a little on that? You can email me directly or post it on the forum.

They have: 1,587 posts

Joined: Mar 1999

To take this a step further: how could one specify which parts of the page to save and which parts not to save to a txt file? That way the exact info wanted could be saved without all the other garbage.

Thanks in advance!


They have: 40 posts

Joined: Apr 1999

fairhousing-

I parsed the .htm file (which is now a .txt file) until I found the phrase that sits just above the stock line, then saved the stock line. It could probably be done much quicker and cleaner; like I said, I am just starting to learn Perl!

They have: 32 posts

Joined: Jun 1999

keithl, what's the source for this CGI file? It sounds interesting and I would like to check it out.

-steve

They have: 40 posts

Joined: Apr 1999

Here is the source. Please keep in mind that I am a complete novice at this, so don't laugh too hard!

#!/usr/bin/perl
use LWP::Simple;

# Fetch the quote page and save the raw HTML locally
$ft = get("http://finance.yahoo.com/q?s=hhs&d=v1");
open(F, ">>hhs.htm");
print F $ft;
close F;

# Re-open the saved file and scan for the marker row
$t = "hhs.htm";
open(FILE, $t);
my $att = 0;

while ($att == 0)
{
    $line = <FILE>;
    last unless defined $line;    # stop at end of file if the marker never shows up
    chomp($line);
    if ($line eq "<th nowrap>More Info</th>")
    {
        $att = 1;
    }
}

# Skip five lines to land on the line holding the quote
$line = <FILE>;
$line = <FILE>;
$line = <FILE>;
$line = <FILE>;
$line = <FILE>;
print $line;
close(FILE);

# Append the quote line to stock.txt
open(QUOTE, ">>stock.txt");
print QUOTE $line;
close(QUOTE);

Would anybody have any idea why this works when I execute the script through telnet and NOT when I execute it through the browser? (I set permissions on ALL files to 755.)

He has: 150 posts

Joined: Apr 1999

keithl: Try this modification.

...
open(FILE, $t);
my $att = 0;

# the header (plus blank line) must go out before any other output
print "Content-Type: text/html\n\n";

while ($att == 0)
{
    $line = <FILE>;
...

They have: 5,633 posts

Joined: Jan 1970

You must send the correct content headers before printing anything to a browser.
Add this line: print "Content-type: text/html\n\n";

From the command line (telnet) no headers are needed, but when the web server runs the script as a CGI it expects a valid header block, ending in a blank line, before any other output; that is why it works in one place and not the other.

Something that caught my attention is the repetition of $line=<FILE>; -- what are you trying to accomplish with this?

They have: 1,587 posts

Joined: Mar 1999

I'll try this question again.

How exactly would I pinpoint just the info I want saved to a text file, instead of the whole page? For example, take the header on my site, "birmingham alabama jobs..." at birminghamnet.com. How would I specify the header as the only thing I want saved to a txt file?

Thanks in advance!


They have: 40 posts

Joined: Apr 1999

This is the only way (I know of right now) to advance the pointer to where I want it to point (I am SURE there is a better way!).
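Maybe a loop would do the same skip more cleanly? Something like this (just a guess on my part):

# read the next five lines; after the loop, $line holds
# the fifth one, which is the quote line
for (1 .. 5) {
    $line = <FILE>;
}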

I just wanted to thank everybody for
a) not laughing too hard!
b) your suggestions

They have: 5,633 posts

Joined: Jan 1970

FH,

You would have to parse the fetched page before writing it to the local file. To do that, you'd have to use a couple of regular expressions.

You can read more on regular expressions at http://www.perl.com/CPAN.
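For example, something along these lines. (Just a sketch: the <title> pattern and the file name header.txt are made up; you would match whatever markup surrounds the text you actually want.)

use LWP::Simple;

my $page = get("http://www.birminghamnet.com");

# capture whatever sits between the <title> tags; adjust the
# pattern to match the markup around the text you want to keep
if ($page =~ m{<title>(.*?)</title>}is) {
    open(OUT, ">>header.txt");
    print OUT "$1\n";
    close OUT;
}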
