Web server that redirects a request to various other web servers
How can I take a request to my web server, turn it into, perhaps, several requests to web servers on other hosts, wait until one or more of those web servers obtains a response, and then return it to the client that initiated the request?
It sounds like you are asking this:
[client] --> [your webserver] ----> [multitude of servers]
- The client initiates a request to your web server.
- Your web server issues the request to one or more remote web servers.
- When at least one of the remote web servers answers, or when a timeout expires, your web server sends a response back to the client (a minimal sketch of this pattern follows below).
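For concreteness, that middle box would boil down to something like this first-answer-wins fan-out, using only the standard library (concurrent.futures plus urllib.request; the links further down reference the older Python 2 httplib module instead). The backend URLs and the timeout value are placeholder assumptions, not anything from the question:

    # Fan out the same request to several backends and return whichever
    # answers first, or give up once the overall timeout expires.
    import concurrent.futures
    import urllib.request

    BACKENDS = [
        "http://backend-a.example.com/resource",   # placeholder hosts
        "http://backend-b.example.com/resource",
    ]
    TIMEOUT = 5.0  # seconds to wait before giving up entirely

    def fetch(url):
        # One backend request; raises on connection errors or HTTP errors.
        with urllib.request.urlopen(url, timeout=TIMEOUT) as resp:
            return resp.read()

    def first_response(urls):
        pool = concurrent.futures.ThreadPoolExecutor(max_workers=len(urls))
        futures = [pool.submit(fetch, url) for url in urls]
        try:
            # as_completed yields futures as they finish; keep the first success.
            for fut in concurrent.futures.as_completed(futures, timeout=TIMEOUT):
                if fut.exception() is None:
                    return fut.result()
        except concurrent.futures.TimeoutError:
            pass  # deadline expired before any backend succeeded
        finally:
            # Don't block on the slower backends; their threads finish in the background.
            pool.shutdown(wait=False)
        return None  # every backend failed or timed out

    if __name__ == "__main__":
        body = first_response(BACKENDS)
        print(body if body is not None else "no backend answered in time")

A real front end would call first_response from whatever code handles the incoming client request.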
If that is what you are asking, the scope is broad. I.e., what is the implementation sitting in between? Is this a Python question? :)
If the multitude of servers on the backend are all the same and controlled by you, then the server sitting in front is going to be a load balancer or a proxy server, à la HAProxy, nginx, Apache, Varnish, etc.
If you are talking about setting up a site as a partial proxy, or about web scraping or data capture, that is something else entirely.
In either case, the question/use case is too vague as posted.
Edit: in response to the comment.
That's not particularly difficult. You need to set up a Python daemon process that listens on a web port. Depending on how you want the file-reading behaviour to work, you can either read the file in at startup and keep it in memory, or you can take the performance hit and read the file on each attempt.
Initiating connections to the remote sites is then just a matter of constructing the requests and calling the appropriate Python library, as sketched below.
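As a rough sketch of those two pieces, assuming Python 3's http.server and http.client (the modern names for the SimpleHTTPServer and httplib modules linked below), and a made-up backends.txt file and listen port: the host list is read once at startup and kept in memory, and each incoming GET is forwarded to the backends in turn.

    import http.client
    import http.server

    LISTEN_PORT = 8000            # made-up port
    HOSTS_FILE = "backends.txt"   # made-up file: one "host:port" per line

    def load_hosts(path):
        with open(path) as fh:
            return [line.strip() for line in fh if line.strip()]

    # Read the file once at startup and hold it in memory; calling
    # load_hosts() inside do_GET instead would pick up changes to the file
    # without a restart, at the cost of a disk read per request.
    HOSTS = load_hosts(HOSTS_FILE)

    def fetch_from(host, path):
        # Construct and issue one request with http.client (httplib in Python 2).
        conn = http.client.HTTPConnection(host, timeout=5)
        try:
            conn.request("GET", path)
            resp = conn.getresponse()
            if resp.status == 200:
                return resp.read()
        except (OSError, http.client.HTTPException):
            pass  # unreachable or broken backend: fall through to the next one
        finally:
            conn.close()
        return None

    class ForwardingHandler(http.server.BaseHTTPRequestHandler):
        def do_GET(self):
            # Try each backend in turn and relay the first successful answer.
            for host in HOSTS:
                body = fetch_from(host, self.path)
                if body is not None:
                    self.send_response(200)
                    self.end_headers()
                    self.wfile.write(body)
                    return
            self.send_error(502, "No backend produced a response")

    if __name__ == "__main__":
        http.server.HTTPServer(("", LISTEN_PORT), ForwardingHandler).serve_forever()

This single-threaded server is only for illustration; the links below and the HAProxy/nginx suggestion at the end are the production-grade options.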
To create a Python web server:
http://docs.python.org/2/library/simplehttpserver.html
To initiate connections to remote sites:
Using the Python HTTP libraries: http://docs.python.org/2/library/httplib.html
Building a Python web client: http://python.about.com/od/networkingwithpython/ss/beg_web_client_all.htm
Python web scraping: http://scrapy.org/
What you are describing is not the ideal way to perform this task; you would be better off with HAProxy/nginx in that regard. However, given the requirement that it be Python, the resource links above should point you in the right direction.