SFK Commands


Section 6. Networking
fromnet | ftp | ftpserv | httpserv | ip | netlog | ping | pingdiff | tcpdump | udpdump | udpsend | web | wget |


Command: web
sfk web [options] url [options]
sfk filter ... +tweb [options]

   call an http:// URL and print output to terminal,
   or pass output to further commands for processing.

   sfk ... +web requires a url parameter.
   sfk ... +tweb gets the url(s) from a previous command.

   options
      -user=u      and -pw=p set http basic authentication.
                   you may also use global options -webuser, -webpw.
                   note that passwords are not encrypted on transfer,
                   except when using SFK Plus with HTTPS connections.
      -nodump      do not print reply data.
      -proxy       hostname:port of a proxy server. from within a company
                   network, it is often required to connect through proxies.
                   alternatively, set the environment variable SFK_PROXY :
                     set SFK_PROXY=myproxyhost:8000
                   to find out what proxy your browser is using, see
                   - Firefox: tools/options/advanced/network/settings
                   - IE: tools/internet options/connections/lan settings
      -timeout=n     wait up to n msec for connection or data.
                     the default is blocking access, i.e. connect stops
                     only after the operating system default timeout,
                     and reading data may block endlessly.
      -webtimeout=n  same, but can be given as global option
                     for a multi command chain.
      -delay=n     wait n msec after each request.
      -weblimit=n  set download size limit to n mb
      -status[=s]  add a status line after reply data, optionally
                   prefixed by string s, which supports slash patterns
                   like \n or \t. on command chaining, fields are
                   separated by tabs, otherwise by blanks.
      -noerr       print no error message
      -quiet       do not print status line in case of -nodump
      -headers     print sent and received http headers
      -header x    or -head adds custom header x to http requests, like
                   -header "Accept-Language: de,en-US;q=0.7,en;q=0.3"
                   multiple header lines can be given. default headers
                   with the same name are replaced.
      -request x   or -req specifies the whole HTTP request, like
                   -req "POST / HTTP/1.1
                         Host: localhost
                         Connection: close
                         
                         var1=123&var2=456
                         "
                   this can only be used within a script file.
                   to create an example script for editing, type:
                      sfk batch webreq.bat
      -reqfromvar a  take the request from variable a. it must contain
                   exact data, like the empty CRLF line after the GET
                   header.
      -showreq     print full URL, may also use -status
      -verbose     show the current proxy settings, if any
      -noclose     do not send "Connection: close" header.
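
      for example, the -user, -pw and -status options above might be
      combined like this (host, credentials and path are just
      placeholders for illustration):
         sfk web -user=admin -pw=secret -status ".100/admin/info.xml"
      this calls the page with http basic authentication and appends
      a status line after the reply data.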
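
      similarly, to avoid endless blocking on a slow host, a timeout
      might be given (the value and address are just examples):
         sfk web -timeout=3000 ".100/getStatus.xml"
      which waits at most 3000 msec for connection or data.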

   automatic name expansions
      http:// is added automatically. short IPs like .100 are
      expanded, e.g. to 192.168.1.100, depending on your subnet.

   quoted multi line parameters are supported in scripts
      using full trim. type "sfk script" for details.

   limitations
      - by default sfk web reads up to 10 mbytes of data.
        use -weblimit=n to change this to n mbytes.
      - if binary data is found, binary codes are stripped
        on output to terminal.
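
      for example, to allow larger downloads (size and address are
      just placeholders):
         sfk web -weblimit=50 ".100/bigdata.xml"
      reads up to 50 mbytes from the given url.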

   aliases
      cweb  call the web quickly without any output,
            same as web -nodump -quiet.
      tweb  same as web but states explicitly
            that it expects text input from the command chain.
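
      for example (the address is just a placeholder):
         sfk cweb ".100/restart.cgi"
      calls the page without printing reply data or a status line.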

   HTTPS support
      SSL/TLS connections are supported with SFK Plus.
      read more under:
         stahlworks.com/sfkplus

   return codes for chaining
      0 = ok    >0 = any error
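
      for example, assuming the return code is also passed to the
      shell as exit code, a windows .bat file might check it like
      this (the address is just a placeholder):
         sfk cweb ".100/getStatus.xml"
         if errorlevel 1 echo web request failed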

   see also
      sfk wfilt    download web text and filter it directly
      sfk wget     download file from http URL
      sfk view     GUI tool to search and filter text from
                   an http URL interactively
      curl         powerful web request and download tool

   web reference
      http://stahlworks.com/sfk-web

   more in the SFK Book
      the SFK Book contains a 60 page tutorial, including
      an HTTP automation example with detailed explanations.
      type "sfk book" for details.

   examples
      sfk web .100/getStatus.xml
         calls, for example, http://192.168.1.100/getStatus.xml
         and prints the xml reply to terminal

      sfk web 192.168.1.200/zones.xml +filter -+status
         calls http://192.168.1.200/zones.xml and extracts
         all lines containing "status".
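
      the -headers option from above can be added to such calls,
      for example:

      sfk web -headers .100/getStatus.xml
         like the first example, but also prints the sent and
         received http headers, which can help when a request
         does not behave as expected.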

      sfk web .100 +xex "_<head>**</head>_"
         gets the main page from .100 and extracts the html head tag.

      sfk filter ips.txt -form "$col1/xml/status.xml"
       +tweb -nodump
         calls many different urls based on a table of ips.
         option -nodump does not print the full result data
         but only a single status line.

      --- scripting example: ---
      +setvar error=""
      +setvar uptime=""
      +web -maxwait=2000 -noerr -status=:status:
         ".250/info.xml"
         +xex "_:status:*\tERR
               _[setvar error][part2][endvar]_"
              "_*
               _[setvar uptime][part2][endvar]_"
      +if -var "#(error) <> "
         stop -var 5 "no access (#(error))"
      +getvar
      --- scripting example end ---
         try to read the xml value "uptime" from info.xml
         on the local IP .250 and show it via +getvar.
         if there is no connection or an HTTP error occurs,
         stop instead with the text "no access".