It is possible that the site is rejecting the request because it does not
come from a browser. You can mimic a browser by sending a browser-like
User-Agent header with the request.
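As a minimal sketch, this can be done with a stream context passed to
file_get_contents(). The URL and the User-Agent string below are
placeholders, not values from this thread:

```php
<?php
// Fetch a URL while presenting a browser-like User-Agent header.
// Assumes allow_url_fopen is enabled; the UA string is illustrative.
function fetch_as_browser($url)
{
    $context = stream_context_create(array(
        'http' => array(
            'method' => 'GET',
            // A typical browser User-Agent string; substitute your own.
            'header' => "User-Agent: Mozilla/5.0 (X11; Linux i686; rv:1.8) "
                      . "Gecko/20060321 Firefox/1.5\r\n",
        ),
    ));
    return file_get_contents($url, false, $context);
}

// Example (performs a real HTTP request):
// $html = fetch_as_browser('http://www.example.com/');
```

If the server is only checking for the presence of a plausible User-Agent,
this is usually enough; some sites also inspect Accept or Referer headers,
which can be appended to the same header string.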
I am also going to take the liberty of suggesting an alternative means to
the same end. If you are on a *nix system you can run the wget command to
connect to the site and retrieve the data. With the right options, wget
can fake the user agent and send the output to stdout. Then, using output
buffering, you can capture that output into a string.
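A hedged sketch of that approach, assuming wget is installed and on the
PATH (the User-Agent string and example URL are placeholders):

```php
<?php
// Capture a command's stdout into a string: passthru() writes the
// command's output directly, and output buffering lets us intercept it.
function capture_command($cmd)
{
    ob_start();
    passthru($cmd);
    return ob_get_clean();
}

// Retrieve a URL via wget with a spoofed user agent.
// -q suppresses wget's own messages; -O - writes the page to stdout.
function fetch_with_wget($url)
{
    $ua  = 'Mozilla/5.0 (X11; Linux i686; rv:1.8) Gecko/20060321 Firefox/1.5';
    $cmd = 'wget -q -O - --user-agent=' . escapeshellarg($ua)
         . ' ' . escapeshellarg($url);
    return capture_command($cmd);
}

// Example (requires wget and network access):
// $html = fetch_with_wget('http://www.example.com/');
```

Note the escapeshellarg() calls: anything interpolated into a shell
command should be escaped, especially if the URL comes from user input.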
Senior Web Developer
WPT Enterprises, Inc.
5700 Wilshire Blvd., Suite 350
Los Angeles, CA 90036
Confidentiality Notice: This e-mail transmission (and/or the attachments
accompanying it) may contain confidential information belonging to the
sender which is protected. The information is intended
only for the use of the intended recipient. If you are not the intended
recipient, you are hereby notified that any disclosure, copying,
distribution or taking of any action in reliance on the contents of this
information is prohibited. If you have received this transmission in
error, please notify the sender by reply e-mail and destroy all copies
of this transmission.
From: Mike Dunlop
Sent: Monday, March 27, 2006 2:15 PM
Subject: [PHP] file_get_contents / url wrappers
If I am trying to read the contents of a URL into a string and I am
getting 403 errors, yet the page loads perfectly through a normal
browser, do you think the site is looking at the HTTP request headers
and determining it's not a browser, and thus blocking access? If
something like that is happening, does anyone know what headers I
should put into a request to simulate a browser?
error msg :: <...> failed to open stream: HTTP request failed! HTTP/
1.1 403 Forbidden in <...>
Any info is much appreciated.
Thanks - MD
Director of Technology Development
[ e ] firstname.lastname@example.org
[ p ] 323.644.7808