Hello all :-),
I am not an expert with Linux, so I am looking for some pointers on
how to solve a problem I am facing. Any comments, suggestions, or
solutions would be very helpful. I have searched this list and found
one solution pretty close to what I am looking for, but it uses
iptables and I was hoping for something simpler.

The problem: I want to log the URLs requested by a browser on a Linux
system to a file.

What I have explored: I installed the Live HTTP Headers extension for
Firefox; it does log URLs, but it does not seem to do so continuously
to a file. You have to press the save-all button each time, and I
need a continuous logging mechanism. I have also read up on Apache
and other web servers, which would do the trick once installed, but I
want my script to be standalone and to depend as little as possible
on a server or module being pre-installed. A proxy server could be a
solution: could anyone please point me to a really small
implementation that I can then customize to just log requests and
otherwise leave the HTTP connections alone?
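
(To make it concrete, below is the kind of rough sketch I have in
mind, in Python only because that is what I know best; the port
number, the log path, and the GET-only handling are all placeholders
and assumptions of mine, not anything standard.)

    #!/usr/bin/env python3
    # Rough sketch of a logging HTTP proxy -- not production code.
    # The port (8080) and the log path are placeholders I made up.
    # Only plain GET over HTTP is handled; no HTTPS/CONNECT support.
    import http.server
    import urllib.request

    LOGFILE = "/tmp/urls.log"  # placeholder path

    class LoggingProxy(http.server.BaseHTTPRequestHandler):
        def do_GET(self):
            # With the browser configured to use this proxy,
            # self.path holds the full URL being requested.
            with open(LOGFILE, "a") as log:
                log.write(self.path + "\n")
            try:
                # Fetch the page ourselves and relay the body back.
                with urllib.request.urlopen(self.path) as resp:
                    body = resp.read()
                self.send_response(resp.status)
                self.send_header("Content-Length", str(len(body)))
                self.end_headers()
                self.wfile.write(body)
            except Exception:
                self.send_error(502, "proxy fetch failed")

    http.server.HTTPServer(("127.0.0.1", 8080),
                           LoggingProxy).serve_forever()

Running that with python3 and pointing Firefox's HTTP proxy setting
at localhost port 8080 should append every requested URL to the log
file as it goes by. Is there an existing tool this small that already
does this properly?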

The best solution for me, as I see it (feel free to correct me), is a
small script, or a command-line utility I may not know of yet, that
will simply dump each connection/URL to a file. I could use tcpdump
or ethereal (Wireshark), but that seems like a clumsy solution
compared to a one-liner or a small script along the lines of the
sketch below.
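
(Again, only to illustrate the shape of script I am imagining: the
crude sketch below sniffs raw IPv4 frames and prints anything that
looks like an HTTP request line or Host: header. It is Linux-only,
needs root, does no port filtering or TCP reassembly, and is purely
my own guess at an approach, not a recommendation.)

    #!/usr/bin/env python3
    # Crude sketch: sniff IPv4 frames, print HTTP request lines.
    # Linux-only (AF_PACKET) and must be run as root.
    # The full URL is the Host: header plus the path on the GET line.
    import socket

    ETH_P_IP = 0x0800  # IPv4 ethertype
    s = socket.socket(socket.AF_PACKET, socket.SOCK_RAW,
                      socket.ntohs(ETH_P_IP))

    while True:
        frame = s.recv(65535)
        if b"HTTP/1." not in frame:
            continue  # skip frames with no HTTP-looking text
        for line in frame.split(b"\r\n"):
            if line.startswith((b"GET ", b"POST ", b"Host:")):
                print(line.decode("ascii", "replace"))

Redirecting that to a file would give me the continuous log I am
after, at least for plain HTTP, but I suspect there is a cleaner way.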

I would definitely appreciate any help, people!
Thanks in advance,
-A