COPYING

COPYRIGHT AND PERMISSION NOTICE

Copyright (C) 2001-2008 by Kjetil Jacobsen
Copyright (C) 2001-2008 by Markus F.X.J. Oberhumer

All rights reserved.

Permission to use, copy, modify, and distribute this software for any purpose with or without fee is hereby granted, provided that the above copyright notice and this permission notice appear in all copies.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT OF THIRD PARTY RIGHTS. IN NO EVENT SHALL THE AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.

Except as contained in this notice, the name of a copyright holder shall not be used in advertising or otherwise to promote the sale, use or other dealings in this Software without prior written authorization of the copyright holder.

doc/pycurl.html (PycURL Documentation)

pycurl — A Python interface to the cURL library

The pycurl package is a Python interface to libcurl (http://curl.haxx.se/libcurl/). pycurl has been successfully built and tested with Python versions from 2.2 to the current 2.5.x releases.

libcurl is a client-side URL transfer library supporting FTP, FTPS, HTTP, HTTPS, GOPHER, TELNET, DICT, FILE and LDAP. libcurl also supports HTTPS certificates, HTTP POST, HTTP PUT, FTP uploads, proxies, cookies, basic authentication, file transfer resume of FTP sessions, HTTP proxy tunneling and more.

All the functionality provided by libcurl can be used through the pycurl interface. The following subsections describe how to use the pycurl interface and assume familiarity with how libcurl works. For information on how libcurl works, please consult the curl library web pages (http://curl.haxx.se/libcurl/c/).


Module Functionality

pycurl.global_init(option) -> None

option is one of the constants pycurl.GLOBAL_SSL, pycurl.GLOBAL_WIN32, pycurl.GLOBAL_ALL, pycurl.GLOBAL_NOTHING, pycurl.GLOBAL_DEFAULT. Corresponds to curl_global_init() in libcurl.

pycurl.global_cleanup() -> None

Corresponds to curl_global_cleanup() in libcurl.
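
Example usage (a minimal sketch wrapping a single transfer in explicit global initialization and cleanup; the URL is only a placeholder):

import pycurl

pycurl.global_init(pycurl.GLOBAL_DEFAULT)
try:
    c = pycurl.Curl()
    c.setopt(pycurl.URL, "http://curl.haxx.se")
    c.perform()
    c.close()
finally:
    pycurl.global_cleanup()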

pycurl.version

This is a string with version information on libcurl, corresponding to curl_version() in libcurl.

Example usage:

>>> import pycurl
>>> pycurl.version
'libcurl/7.12.3 OpenSSL/0.9.7e zlib/1.2.2.1 libidn/0.5.12'
pycurl.version_info() -> Tuple

Corresponds to curl_version_info() in libcurl. Returns a tuple of information which is similar to the curl_version_info_data struct returned by curl_version_info() in libcurl.

Example usage:

>>> import pycurl
>>> pycurl.version_info()
(2, '7.12.3', 461827, 'i586-pc-linux-gnu', 1565, 'OpenSSL/0.9.7e', 9465951,
'1.2.2.1', ('ftp', 'gopher', 'telnet', 'dict', 'ldap', 'http', 'file',
'https', 'ftps'), None, 0, '0.5.12')
pycurl.Curl() -> Curl object

This function creates a new Curl object which corresponds to a CURL handle in libcurl. Curl objects automatically set CURLOPT_VERBOSE to 0, CURLOPT_NOPROGRESS to 1, provide a default CURLOPT_USERAGENT and set up CURLOPT_ERRORBUFFER to point to a private error buffer.
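
Example usage (a minimal sketch which overrides some of the defaults mentioned above; the user agent string is an arbitrary placeholder):

import StringIO
import pycurl

b = StringIO.StringIO()
c = pycurl.Curl()
# Override the defaults that pycurl.Curl() sets up
c.setopt(pycurl.VERBOSE, 0)
c.setopt(pycurl.USERAGENT, "my-app/1.0")
c.setopt(pycurl.URL, "http://curl.haxx.se")
c.setopt(pycurl.WRITEFUNCTION, b.write)
c.perform()
c.close()
print b.getvalue()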

pycurl.CurlMulti() -> CurlMulti object

This function creates a new CurlMulti object which corresponds to a CURLM handle in libcurl.

pycurl.CurlShare() -> CurlShare object

This function creates a new CurlShare object which corresponds to a CURLSH handle in libcurl. A CurlShare object is what you pass as an argument to the SHARE option on Curl objects.



doc/curlmultiobject.html (PycURL: CurlMulti Objects)

CurlMulti Object

CurlMulti objects have the following methods:

close() -> None

Corresponds to curl_multi_cleanup() in libcurl. This method is called automatically by pycurl when there are no longer any references to the CurlMulti object, but it can also be called explicitly.

perform() -> tuple of status and the number of active Curl objects

Corresponds to curl_multi_perform() in libcurl.

add_handle(Curl object) -> None

Corresponds to curl_multi_add_handle() in libcurl. This method adds an existing and valid Curl object to the CurlMulti object.

IMPORTANT NOTE: add_handle does not implicitly add a Python reference to the Curl object (and thus does not increase the reference count on the Curl object).

remove_handle(Curl object) -> None

Corresponds to curl_multi_remove_handle() in libcurl. This method removes an existing and valid Curl object from the CurlMulti object.

IMPORTANT NOTE: remove_handle does not implicitly remove a Python reference from the Curl object (and thus does not decrease the reference count on the Curl object).
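
Because neither add_handle nor remove_handle manages Python references, the caller typically keeps its own references to the Curl objects for as long as they are attached. A short sketch (the handles list is an ordinary Python attribute, not part of the pycurl API):

import pycurl

m = pycurl.CurlMulti()
m.handles = []                      # keep our own references
for url in ("http://curl.haxx.se", "http://www.python.org"):
    c = pycurl.Curl()
    c.setopt(pycurl.URL, url)
    m.add_handle(c)
    m.handles.append(c)             # prevents premature garbage collection
# ... perform the transfers here (see the examples below) ...
for c in m.handles:
    m.remove_handle(c)
    c.close()
m.close()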

fdset() -> tuple of three lists with the active file descriptors: readable, writable, and exceptional.

Corresponds to curl_multi_fdset() in libcurl. This method extracts the file descriptor information from a CurlMulti object. The returned lists can be used with the select module to poll for events.

Example usage:

import pycurl
import select
c = pycurl.Curl()
c.setopt(pycurl.URL, "http://curl.haxx.se")
m = pycurl.CurlMulti()
m.add_handle(c)
while 1:
    ret, num_handles = m.perform()
    if ret != pycurl.E_CALL_MULTI_PERFORM: break
while num_handles:
    apply(select.select, m.fdset() + (1,))
    while 1:
        ret, num_handles = m.perform()
        if ret != pycurl.E_CALL_MULTI_PERFORM: break
select(timeout) -> number of ready file descriptors or -1 on timeout

This is a convenience function which simplifies the combined use of fdset() and the select module.

Example usage:

import pycurl
c = pycurl.Curl()
c.setopt(pycurl.URL, "http://curl.haxx.se")
m = pycurl.CurlMulti()
m.add_handle(c)
while 1:
    ret, num_handles = m.perform()
    if ret != pycurl.E_CALL_MULTI_PERFORM: break
while num_handles:
    ret = m.select(1.0)
    if ret == -1:  continue
    while 1:
        ret, num_handles = m.perform()
        if ret != pycurl.E_CALL_MULTI_PERFORM: break
info_read([max]) -> number of queued messages, a list of successful objects, a list of failed objects

Corresponds to the curl_multi_info_read() function in libcurl. This method extracts at most max messages from the multi stack and returns them in two lists. The first list contains the handles which completed successfully; the second list contains a <curl object, curl error number, curl error message> tuple for each failed curl object. The number of messages remaining in the queue after this call is also returned.
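
Example usage (a sketch assuming pycurl has been imported and m is a CurlMulti object on which perform() has been called):

while 1:
    num_q, ok_list, err_list = m.info_read()
    for c in ok_list:
        print "Success:", c.getinfo(pycurl.EFFECTIVE_URL)
    for c, errno, errmsg in err_list:
        print "Failed:", errno, errmsg
    if num_q == 0:
        break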



doc/curlobject.html (PycURL: Curl Objects)

Curl Object

Curl objects have the following methods:

close() -> None

Corresponds to curl_easy_cleanup in libcurl. This method is called automatically by pycurl when there are no longer any references to the Curl object, but it can also be called explicitly.

perform() -> None

Corresponds to curl_easy_perform in libcurl.

reset() -> None

Corresponds to curl_easy_reset in libcurl.
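
Example usage (a small sketch that reuses a single handle for several transfers, clearing the options between them):

import pycurl

c = pycurl.Curl()
for url in ("http://curl.haxx.se", "http://www.python.org"):
    c.setopt(pycurl.URL, url)
    c.perform()
    c.reset()   # clear the options; the handle itself remains usable
c.close()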

setopt(option, value) -> None

Corresponds to curl_easy_setopt in libcurl, where option is specified with the CURLOPT_* constants in libcurl, except that the CURLOPT_ prefix has been removed. The type of value depends on the option, and can be a string, an integer, a long integer, a file object, a list, or a function.

Example usage:

import pycurl
c = pycurl.Curl()
c.setopt(pycurl.URL, "http://www.python.org/")
c.setopt(pycurl.HTTPHEADER, ["Accept:"])
import StringIO
b = StringIO.StringIO()
c.setopt(pycurl.WRITEFUNCTION, b.write)
c.setopt(pycurl.FOLLOWLOCATION, 1)
c.setopt(pycurl.MAXREDIRS, 5)
c.perform()
print b.getvalue()
...
getinfo(option) -> Result

Corresponds to curl_easy_getinfo in libcurl, where option is the same as the CURLINFO_* constants in libcurl, except that the CURLINFO_ prefix has been removed. Result contains an integer, float or string, depending on which option is given. The getinfo method should not be called unless perform has been called and finished.

Example usage:

import pycurl
c = pycurl.Curl()
c.setopt(pycurl.URL, "http://sf.net")
c.setopt(pycurl.FOLLOWLOCATION, 1)
c.perform()
print c.getinfo(pycurl.HTTP_CODE), c.getinfo(pycurl.EFFECTIVE_URL)
...
--> 200 "http://sourceforge.net/"
errstr() -> String

Returns the internal libcurl error buffer of this handle as a string.
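
Example usage (a sketch; the host name is a deliberately unresolvable placeholder used to provoke an error):

import pycurl

c = pycurl.Curl()
c.setopt(pycurl.URL, "http://nonexistent.example.invalid/")
try:
    c.perform()
except pycurl.error:
    print "Transfer failed:", c.errstr()
c.close()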



doc/callbacks.html (PycURL: Callbacks)

Callbacks

For more fine-grained control, libcurl allows a number of callbacks to be associated with each connection. In pycurl, callbacks are defined using the setopt() method for Curl objects with options WRITEFUNCTION, READFUNCTION, HEADERFUNCTION, PROGRESSFUNCTION, IOCTLFUNCTION, or DEBUGFUNCTION. These options correspond to the libcurl options with CURLOPT_* prefix removed. A callback in pycurl must be either a regular Python function, a class method or an extension type function.

There are some limitations to some of the options which can be used concurrently with the pycurl callbacks compared to the libcurl callbacks. This is to allow different callback functions to be associated with different Curl objects. More specifically, WRITEDATA cannot be used with WRITEFUNCTION, READDATA cannot be used with READFUNCTION, WRITEHEADER cannot be used with HEADERFUNCTION, PROGRESSDATA cannot be used with PROGRESSFUNCTION, IOCTLDATA cannot be used with IOCTLFUNCTION, and DEBUGDATA cannot be used with DEBUGFUNCTION. In practice, these limitations can be overcome by making the callback function a class instance method and using instance attributes to store per-object data, such as the files used in the callbacks.
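
For example, per-object data can be kept in instance attributes (a short sketch, not taken from the pycurl distribution):

    import pycurl

    class Response:
        def __init__(self):
            self.contents = ""
        def body_callback(self, buf):
            # Accumulate body data on the instance instead of using WRITEDATA
            self.contents = self.contents + buf

    r = Response()
    c = pycurl.Curl()
    c.setopt(pycurl.URL, "http://curl.haxx.se")
    c.setopt(pycurl.WRITEFUNCTION, r.body_callback)
    c.perform()
    c.close()
    print len(r.contents), "bytes received"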

The signature of each callback used in pycurl is as follows:

WRITEFUNCTION(string) -> number of characters written

READFUNCTION(number of characters to read) -> string

HEADERFUNCTION(string) -> number of characters written

PROGRESSFUNCTION(download total, downloaded, upload total, uploaded) -> status

DEBUGFUNCTION(debug message type, debug message string) -> None

IOCTLFUNCTION(ioctl cmd) -> status


Example: Callbacks for document header and body

This example prints the header data to stderr and the body data to stdout. Also note that neither callback returns the number of bytes written. For WRITEFUNCTION and HEADERFUNCTION callbacks, returning None implies that all bytes were written.

    import pycurl

    ## Callback function invoked when body data is ready
    def body(buf):
        # Print body data to stdout
        import sys
        sys.stdout.write(buf)
        # Returning None implies that all bytes were written

    ## Callback function invoked when header data is ready
    def header(buf):
        # Print header data to stderr
        import sys
        sys.stderr.write(buf)
        # Returning None implies that all bytes were written

    c = pycurl.Curl()
    c.setopt(pycurl.URL, "http://www.python.org/")
    c.setopt(pycurl.WRITEFUNCTION, body)
    c.setopt(pycurl.HEADERFUNCTION, header)
    c.perform()

Example: Download/upload progress callback

This example shows how to use the progress callback. When downloading a document, the arguments related to uploads are zero, and vice versa.

    import pycurl

    ## Callback function invoked when download/upload has progress
    def progress(download_t, download_d, upload_t, upload_d):
        print "Total to download", download_t
        print "Total downloaded", download_d
        print "Total to upload", upload_t
        print "Total uploaded", upload_d

    c = pycurl.Curl()
    c.setopt(c.URL, "http://slashdot.org/")
    c.setopt(c.NOPROGRESS, 0)
    c.setopt(c.PROGRESSFUNCTION, progress)
    c.perform()

Example: Debug callbacks

This example shows how to use the debug callback. The debug message type is an integer indicating the type of debug message. The VERBOSE option must be enabled for this callback to be invoked.

    import pycurl

    def test(debug_type, debug_msg):
        print "debug(%d): %s" % (debug_type, debug_msg)

    c = pycurl.Curl()
    c.setopt(pycurl.URL, "http://curl.haxx.se/")
    c.setopt(pycurl.VERBOSE, 1)
    c.setopt(pycurl.DEBUGFUNCTION, test)
    c.perform()
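
Example: Read callback for an HTTP POST

This example is a sketch only (it is not part of the distribution; see 'examples/file_upload.py' and 'tests/test_post3.py' for complete programs). The request body is supplied through READFUNCTION, and POSTFIELDSIZE tells libcurl how much data to expect. The data is assumed to fit into a single read request.

    import pycurl

    POSTDATA = "field1=value1&field2=value2"

    class DataProvider:
        def __init__(self, data):
            self.data = data
            self.finished = 0
        def read_callback(self, size):
            # libcurl asks for at most 'size' bytes; return "" when done
            assert len(self.data) <= size
            if self.finished:
                return ""
            self.finished = 1
            return self.data

    d = DataProvider(POSTDATA)
    c = pycurl.Curl()
    c.setopt(pycurl.URL, "http://www.contactor.se/~dast/postit.cgi")
    c.setopt(pycurl.POST, 1)
    c.setopt(pycurl.POSTFIELDSIZE, len(POSTDATA))
    c.setopt(pycurl.READFUNCTION, d.read_callback)
    c.setopt(pycurl.VERBOSE, 1)
    c.perform()
    c.close()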

Other examples

The pycurl distribution also contains a number of test scripts and examples which show how to use the various callbacks in libcurl. For instance, the file 'examples/file_upload.py' in the distribution contains example code for using READFUNCTION, 'tests/test_cb.py' shows WRITEFUNCTION and HEADERFUNCTION, 'tests/test_debug.py' shows DEBUGFUNCTION, and 'tests/test_getinfo.py' shows PROGRESSFUNCTION.



doc/curlshareobject.html (PycURL: CurlShare Objects)

CurlShare Object

CurlShare objects have the following methods:

setopt(option, value) -> None

Corresponds to curl_share_setopt in libcurl, where option is specified with the CURLSHOPT_* constants in libcurl, except that the CURLSHOPT_ prefix has been changed to SH_. Currently, value must be either LOCK_DATA_COOKIE or LOCK_DATA_DNS.

Example usage:

import pycurl
curl = pycurl.Curl()
s = pycurl.CurlShare()
s.setopt(pycurl.SH_SHARE, pycurl.LOCK_DATA_COOKIE)
s.setopt(pycurl.SH_SHARE, pycurl.LOCK_DATA_DNS)
curl.setopt(pycurl.URL, 'http://curl.haxx.se')
curl.setopt(pycurl.SHARE, s)
curl.perform()
curl.close()


tests/test_internals.py000066600000012751150501000730011317 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_internals.py,v 1.17 2003/05/01 16:48:54 mfx Exp $ # # a simple self-test # try: # need Python 2.2 or better for garbage collection from gc import get_objects import gc del get_objects gc.enable() except ImportError: gc = None import copy, os, sys from StringIO import StringIO try: import cPickle except ImportError: cPickle = None try: import pickle except ImportError: pickle = None # update sys.path when running in the build directory from util import get_sys_path sys.path = get_sys_path() import pycurl from pycurl import Curl, CurlMulti class opts: verbose = 1 if "-q" in sys.argv: opts.verbose = opts.verbose - 1 print "Python", sys.version print "PycURL %s (compiled against 0x%x)" % (pycurl.version, pycurl.COMPILE_LIBCURL_VERSION_NUM) print "PycURL version info", pycurl.version_info() print " %s, compiled %s" % (pycurl.__file__, pycurl.COMPILE_DATE) # /*********************************************************************** # // test misc # ************************************************************************/ if 1: c = Curl() assert c.URL is pycurl.URL del c # /*********************************************************************** # // test handles # ************************************************************************/ # remove an invalid handle: this should fail if 1: m = CurlMulti() c = Curl() try: m.remove_handle(c) except pycurl.error: pass else: assert 0, "internal error" del m, c # remove an invalid but closed handle if 1: m = CurlMulti() c = Curl() c.close() m.remove_handle(c) del m, c # add a closed handle: this should fail if 1: m = CurlMulti() c = Curl() c.close() try: m.add_handle(c) except pycurl.error: pass else: assert 0, "internal error" m.close() del m, c # add a handle twice: this should fail if 1: m = CurlMulti() c = Curl() m.add_handle(c) try: m.add_handle(c) except pycurl.error: pass else: assert 0, "internal error" del m, c # add a handle on multiple stacks: this should fail if 1: m1 = CurlMulti() m2 = CurlMulti() c = Curl() m1.add_handle(c) try: m2.add_handle(c) except pycurl.error: pass else: assert 0, "internal error" del m1, m2, c # move a handle if 1: m1 = CurlMulti() m2 = CurlMulti() c = Curl() m1.add_handle(c) m1.remove_handle(c) m2.add_handle(c) del m1, m2, c # /*********************************************************************** # // test copying and pickling - copying and pickling of # // instances of Curl and CurlMulti is not allowed # ************************************************************************/ if 1 and copy: c = Curl() m = CurlMulti() try: copy.copy(c) except copy.Error: pass else: assert 0, "internal error - copying should fail" try: copy.copy(m) except copy.Error: pass else: assert 0, "internal error - copying should fail" if 1 and pickle: c = Curl() m = CurlMulti() fp = StringIO() p = pickle.Pickler(fp, 1) try: p.dump(c) except pickle.PicklingError: pass else: assert 0, "internal error - pickling should fail" try: p.dump(m) except pickle.PicklingError: pass else: assert 0, "internal error - pickling should fail" del c, m, fp, p if 1 and cPickle: c = Curl() m = CurlMulti() fp = StringIO() p = cPickle.Pickler(fp, 1) try: p.dump(c) except cPickle.PicklingError: pass else: assert 0, "internal error - pickling should fail" try: p.dump(m) except cPickle.PicklingError: pass else: assert 0, "internal error - pickling should fail" del c, m, fp, p # 
/*********************************************************************** # // test refcounts # ************************************************************************/ # basic check of reference counting (use a memory checker like valgrind) if 1: c = Curl() m = CurlMulti() m.add_handle(c) del m m = CurlMulti() c.close() del m, c # basic check of cyclic garbage collection if 1 and gc: gc.collect() c = Curl() c.m = CurlMulti() c.m.add_handle(c) # create some nasty cyclic references c.c = c c.c.c1 = c c.c.c2 = c c.c.c3 = c.c c.c.c4 = c.m c.m.c = c c.m.m = c.m c.m.c = c # delete gc.collect() flags = gc.DEBUG_COLLECTABLE | gc.DEBUG_UNCOLLECTABLE | gc.DEBUG_OBJECTS if opts.verbose >= 1: flags = flags | gc.DEBUG_STATS gc.set_debug(flags) gc.collect() ##print gc.get_referrers(c) ##print gc.get_objects() if opts.verbose >= 1: print "Tracked objects:", len(gc.get_objects()) # The `del' below should delete these 4 objects: # Curl + internal dict, CurlMulti + internal dict del c gc.collect() if opts.verbose >= 1: print "Tracked objects:", len(gc.get_objects()) if 1: # Ensure that the refcounting error in "reset" is fixed: for i in xrange(100000): c = Curl() c.reset() # /*********************************************************************** # // done # ************************************************************************/ print "All tests passed." tests/test.py000066600000003641150501000730007236 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test.py,v 1.17 2007/04/10 13:25:17 kjetilja Exp $ import sys, threading, time import pycurl # We should ignore SIGPIPE when using pycurl.NOSIGNAL - see # the libcurl tutorial for more info. try: import signal from signal import SIGPIPE, SIG_IGN signal.signal(signal.SIGPIPE, signal.SIG_IGN) except ImportError: pass class Test(threading.Thread): def __init__(self, url, ofile): threading.Thread.__init__(self) self.curl = pycurl.Curl() self.curl.setopt(pycurl.URL, url) self.curl.setopt(pycurl.WRITEDATA, ofile) self.curl.setopt(pycurl.FOLLOWLOCATION, 1) self.curl.setopt(pycurl.MAXREDIRS, 5) self.curl.setopt(pycurl.NOSIGNAL, 1) def run(self): self.curl.perform() self.curl.close() sys.stdout.write(".") sys.stdout.flush() # Read list of URIs from file specified on commandline try: urls = open(sys.argv[1]).readlines() except IndexError: # No file was specified, show usage string print "Usage: %s " % sys.argv[0] raise SystemExit # Initialize thread array and the file number threads = [] fileno = 0 # Start one thread per URI in parallel t1 = time.time() for url in urls: f = open(str(fileno), "wb") t = Test(url.rstrip(), f) t.start() threads.append((t, f)) fileno = fileno + 1 # Wait for all threads to finish for thread, file in threads: thread.join() file.close() t2 = time.time() print "\n** Multithreading, %d seconds elapsed for %d uris" % (int(t2-t1), len(urls)) # Start one thread per URI in sequence fileno = 0 t1 = time.time() for url in urls: f = open(str(fileno), "wb") t = Test(url.rstrip(), f) t.start() fileno = fileno + 1 t.join() f.close() t2 = time.time() print "\n** Singlethreading, %d seconds elapsed for %d uris" % (int(t2-t1), len(urls)) tests/test_multi3.py000066600000004024150501000730010527 0ustar00#! 
/usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_multi3.py,v 1.13 2005/03/11 13:24:45 kjetilja Exp $ # same as test_multi2.py, but enforce some debugging and strange API-calls import os, sys try: from cStringIO import StringIO except ImportError: from StringIO import StringIO import pycurl urls = ( "http://curl.haxx.se", "http://www.python.org", "http://pycurl.sourceforge.net", "http://pycurl.sourceforge.net/THIS_HANDLE_IS_CLOSED", ) # init m = pycurl.CurlMulti() m.handles = [] for url in urls: c = pycurl.Curl() # save info in standard Python attributes c.url = url c.body = StringIO() c.http_code = -1 c.debug = 0 m.handles.append(c) # pycurl API calls c.setopt(c.URL, c.url) c.setopt(c.WRITEFUNCTION, c.body.write) m.add_handle(c) # debug - close a handle if 1: c = m.handles[3] c.debug = 1 c.close() # get data num_handles = len(m.handles) while num_handles: while 1: ret, num_handles = m.perform() if ret != pycurl.E_CALL_MULTI_PERFORM: break # currently no more I/O is pending, could do something in the meantime # (display a progress bar, etc.) m.select(1.0) # close handles for c in m.handles: # save info in standard Python attributes try: c.http_code = c.getinfo(c.HTTP_CODE) except pycurl.error: # handle already closed - see debug above assert c.debug c.http_code = -1 # pycurl API calls if 0: m.remove_handle(c) c.close() elif 0: # in the C API this is the wrong calling order, but pycurl # handles this automatically c.close() m.remove_handle(c) else: # actually, remove_handle is called automatically on close c.close() m.close() # print result for c in m.handles: data = c.body.getvalue() if 0: print "**********", c.url, "**********" print data else: print "%-53s http_code %3d, %6d bytes" % (c.url, c.http_code, len(data)) tests/test_multi5.py000066600000002700150501000730010530 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_multi5.py,v 1.12 2005/03/11 13:24:45 kjetilja Exp $ import sys, select, time import pycurl c1 = pycurl.Curl() c2 = pycurl.Curl() c3 = pycurl.Curl() c1.setopt(c1.URL, "http://www.python.org") c2.setopt(c2.URL, "http://curl.haxx.se") c3.setopt(c3.URL, "http://slashdot.org") c1.body = open("doc1", "wb") c2.body = open("doc2", "wb") c3.body = open("doc3", "wb") c1.setopt(c1.WRITEFUNCTION, c1.body.write) c2.setopt(c2.WRITEFUNCTION, c2.body.write) c3.setopt(c3.WRITEFUNCTION, c3.body.write) m = pycurl.CurlMulti() m.add_handle(c1) m.add_handle(c2) m.add_handle(c3) # Number of seconds to wait for a timeout to happen SELECT_TIMEOUT = 1.0 # Stir the state machine into action while 1: ret, num_handles = m.perform() if ret != pycurl.E_CALL_MULTI_PERFORM: break # Keep going until all the connections have terminated while num_handles: # The select method uses fdset internally to determine which file descriptors # to check. m.select(SELECT_TIMEOUT) while 1: ret, num_handles = m.perform() if ret != pycurl.E_CALL_MULTI_PERFORM: break # Cleanup m.remove_handle(c3) m.remove_handle(c2) m.remove_handle(c1) m.close() c1.body.close() c2.body.close() c3.body.close() c1.close() c2.close() c3.close() print "http://www.python.org is in file doc1" print "http://curl.haxx.se is in file doc2" print "http://slashdot.org is in file doc3" tests/test_stringio.py000066600000000731150501000730011151 0ustar00#! 
/usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_stringio.py,v 1.6 2003/04/21 18:46:11 mfx Exp $ import sys try: from cStringIO import StringIO except ImportError: from StringIO import StringIO import pycurl url = "http://curl.haxx.se/dev/" print "Testing", pycurl.version body = StringIO() c = pycurl.Curl() c.setopt(c.URL, url) c.setopt(c.WRITEFUNCTION, body.write) c.perform() c.close() contents = body.getvalue() print contents tests/test_multi_socket.py000066600000003604150501000730012017 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_multi_socket.py,v 1.1 2006/11/10 15:03:05 kjetilja Exp $ import os, sys try: from cStringIO import StringIO except ImportError: from StringIO import StringIO import pycurl urls = ( "http://curl.haxx.se", "http://www.python.org", "http://pycurl.sourceforge.net", ) # Read list of URIs from file specified on commandline try: urls = open(sys.argv[1], "rb").readlines() except IndexError: # No file was specified pass # timer callback def timer(msecs): print 'Timer callback msecs:', msecs # socket callback def socket(event, socket, multi, data): print event, socket, multi, data # multi.assign(socket, timer) # init m = pycurl.CurlMulti() m.setopt(pycurl.M_PIPELINING, 1) m.setopt(pycurl.M_TIMERFUNCTION, timer) m.setopt(pycurl.M_SOCKETFUNCTION, socket) m.handles = [] for url in urls: c = pycurl.Curl() # save info in standard Python attributes c.url = url c.body = StringIO() c.http_code = -1 m.handles.append(c) # pycurl API calls c.setopt(c.URL, c.url) c.setopt(c.WRITEFUNCTION, c.body.write) m.add_handle(c) # get data num_handles = len(m.handles) while num_handles: while 1: ret, num_handles = m.socket_all() if ret != pycurl.E_CALL_MULTI_PERFORM: break # currently no more I/O is pending, could do something in the meantime # (display a progress bar, etc.) m.select(1.0) # close handles for c in m.handles: # save info in standard Python attributes c.http_code = c.getinfo(c.HTTP_CODE) # pycurl API calls m.remove_handle(c) c.close() m.close() # print result for c in m.handles: data = c.body.getvalue() if 0: print "**********", c.url, "**********" print data else: print "%-53s http_code %3d, %6d bytes" % (c.url, c.http_code, len(data)) tests/test_share.py000066600000001312150501000730010411 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_share.py,v 1.1 2006/06/13 19:04:44 kjetilja Exp $ import sys import pycurl import threading print >>sys.stderr, 'Testing', pycurl.version class Test(threading.Thread): def __init__(self, share): threading.Thread.__init__(self) self.curl = pycurl.Curl() self.curl.setopt(pycurl.URL, 'http://curl.haxx.se') self.curl.setopt(pycurl.SHARE, share) def run(self): self.curl.perform() self.curl.close() s = pycurl.CurlShare() s.setopt(pycurl.SH_SHARE, pycurl.LOCK_DATA_COOKIE) s.setopt(pycurl.SH_SHARE, pycurl.LOCK_DATA_DNS) t1 = Test(s) t2 = Test(s) t1.start() t2.start() del s tests/test_memleak.py000066600000002146150501000730010730 0ustar00#! 
/usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_memleak.py,v 1.4 2003/05/01 16:48:54 mfx Exp $ # # just a simple self-test # need Python 2.2 or better for garbage collection # import gc, pycurl, sys gc.enable() print "Python", sys.version print "PycURL %s (compiled against 0x%x)" % (pycurl.version, pycurl.COMPILE_LIBCURL_VERSION_NUM) ##print "PycURL version info", pycurl.version_info() print " %s, compiled %s" % (pycurl.__file__, pycurl.COMPILE_DATE) gc.collect() flags = gc.DEBUG_COLLECTABLE | gc.DEBUG_UNCOLLECTABLE | gc.DEBUG_OBJECTS if 1: flags = flags | gc.DEBUG_STATS gc.set_debug(flags) gc.collect() print "Tracked objects:", len(gc.get_objects()) multi = pycurl.CurlMulti() t = [] for a in range(100): curl = pycurl.Curl() multi.add_handle(curl) t.append(curl) print "Tracked objects:", len(gc.get_objects()) for curl in t: curl.close() multi.remove_handle(curl) print "Tracked objects:", len(gc.get_objects()) del curl del t del multi print "Tracked objects:", len(gc.get_objects()) gc.collect() print "Tracked objects:", len(gc.get_objects()) tests/util.pyc000066600000001774150501000730007404 0ustar00 s<>c@s(ddkZddkZddZdS(iNc CsI|djo ti}n|}yddkl}Wntj o|SXd}|o|d}n|}d|tid f}x|titi fD]}|pqntii |d}xxdd |d |fD]_}tii tii ||}tii |o%||jo|i d |q=qqWqW|S( Ni(t get_platformtis%s-%sitbuildtlibslib.i(tNonetsystpathtdistutils.utilRt ImportErrortversiontostcurdirtpardirtjointnormpathtisdirtinsert( tpRtp0tplattplat_specifiertprefixtdtsubdirtdir((s1/builddir/build/BUILD/pycurl-7.19.0/tests/util.pyt get_sys_path s0  !  (R RRR(((s1/builddir/build/BUILD/pycurl-7.19.0/tests/util.pytstests/test_post2.py000066600000001027150501000730010361 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_post2.py,v 1.13 2005/03/03 10:00:40 kjetilja Exp $ import pycurl pf = [('field1', 'this is a test using httppost & stuff'), ('field2', (pycurl.FORM_FILE, 'test_post.py', pycurl.FORM_FILE, 'test_post2.py')), ('field3', (pycurl.FORM_CONTENTS, 'this is wei\000rd, but null-bytes are okay')) ] c = pycurl.Curl() c.setopt(c.URL, 'http://www.contactor.se/~dast/postit.cgi') c.setopt(c.HTTPPOST, pf) c.setopt(c.VERBOSE, 1) c.perform() c.close() tests/test_socketopen.py000066600000000670150501000730011467 0ustar00import pycurl import StringIO import socket def socketopen(family, socktype, protocol): print family, socktype, protocol s = socket.socket(family, socktype, protocol) s.setsockopt(socket.SOL_SOCKET, socket.SO_KEEPALIVE, 1) return s sio = StringIO.StringIO() c = pycurl.Curl() c.setopt(pycurl.OPENSOCKETFUNCTION, socketopen) c.setopt(pycurl.URL, 'http://camvine.com') c.setopt(pycurl.WRITEFUNCTION, sio.write) c.perform() tests/test_multi_vs_thread.py000066600000013215150501000730012505 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_multi_vs_thread.py,v 1.16 2005/04/12 03:39:01 mfx Exp $ import os, sys, time from threading import Thread, RLock try: from cStringIO import StringIO except ImportError: from StringIO import StringIO import pycurl # We should ignore SIGPIPE when using pycurl.NOSIGNAL - see # the libcurl tutorial for more info. try: import signal from signal import SIGPIPE, SIG_IGN signal.signal(signal.SIGPIPE, signal.SIG_IGN) except ImportError: pass # The conclusion is: the multi interface is fastest! 
NUM_PAGES = 30 NUM_THREADS = 10 assert NUM_PAGES % NUM_THREADS == 0 ##URL = "http://pycurl.sourceforge.net/tests/testgetvars.php?%d" URL = "http://pycurl.sourceforge.net/tests/teststaticpage.html?%d" # # util # class Curl: def __init__(self, url): self.url = url self.body = StringIO() self.http_code = -1 # pycurl API calls self._curl = pycurl.Curl() self._curl.setopt(pycurl.URL, self.url) self._curl.setopt(pycurl.WRITEFUNCTION, self.body.write) self._curl.setopt(pycurl.NOSIGNAL, 1) def perform(self): self._curl.perform() def close(self): self.http_code = self._curl.getinfo(pycurl.HTTP_CODE) self._curl.close() def print_result(items): return # DO NOTHING # for c in items: data = c.body.getvalue() if 0: print "**********", c.url, "**********" print data elif 1: print "%-60s %3d %6d" % (c.url, c.http_code, len(data)) ### ### 1) multi ### def test_multi(): clock1 = time.time() # init handles = [] m = pycurl.CurlMulti() for i in range(NUM_PAGES): c = Curl(URL %i) m.add_handle(c._curl) handles.append(c) clock2 = time.time() # stir state machine into action while 1: ret, num_handles = m.perform() if ret != pycurl.E_CALL_MULTI_PERFORM: break # get data while num_handles: m.select(1.0) while 1: ret, num_handles = m.perform() if ret != pycurl.E_CALL_MULTI_PERFORM: break clock3 = time.time() # close handles for c in handles: c.close() m.close() clock4 = time.time() print "multi interface: %d pages: perform %5.2f secs, total %5.2f secs" % (NUM_PAGES, clock3 - clock2, clock4 - clock1) # print result print_result(handles) ### ### 2) thread ### class Test(Thread): def __init__(self, lock=None): Thread.__init__(self) self.lock = lock self.items = [] def run(self): if self.lock: self.lock.acquire() self.lock.release() for c in self.items: c.perform() def test_threads(lock=None): clock1 = time.time() # create and start threads, but block them if lock: lock.acquire() # init (FIXME - this is ugly) threads = [] handles = [] t = None for i in range(NUM_PAGES): if i % (NUM_PAGES / NUM_THREADS) == 0: t = Test(lock) if lock: t.start() threads.append(t) c = Curl(URL % i) t.items.append(c) handles.append(c) assert len(handles) == NUM_PAGES assert len(threads) == NUM_THREADS clock2 = time.time() # if lock: # release lock to let the blocked threads run lock.release() else: # start threads for t in threads: t.start() # wait for threads to finish for t in threads: t.join() clock3 = time.time() # close handles for c in handles: c.close() clock4 = time.time() if lock: print "thread interface [lock]: %d pages: perform %5.2f secs, total %5.2f secs" % (NUM_PAGES, clock3 - clock2, clock4 - clock1) else: print "thread interface: %d pages: perform %5.2f secs, total %5.2f secs" % (NUM_PAGES, clock3 - clock2, clock4 - clock1) # print result print_result(handles) ### ### 3) thread - threads grab curl objects on demand from a shared pool ### class TestPool(Thread): def __init__(self, lock, pool): Thread.__init__(self) self.lock = lock self.pool = pool def run(self): while 1: self.lock.acquire() c = None if self.pool: c = self.pool.pop() self.lock.release() if c is None: break c.perform() def test_thread_pool(lock): clock1 = time.time() # init handles = [] for i in range(NUM_PAGES): c = Curl(URL %i) handles.append(c) # create and start threads, but block them lock.acquire() threads = [] pool = handles[:] # shallow copy of the list, shared for pop() for i in range(NUM_THREADS): t = TestPool(lock, pool) t.start() threads.append(t) assert len(pool) == NUM_PAGES assert len(threads) == NUM_THREADS clock2 = time.time() # release lock to 
let the blocked threads run lock.release() # wait for threads to finish for t in threads: t.join() clock3 = time.time() # close handles for c in handles: c.close() clock4 = time.time() print "thread interface [pool]: %d pages: perform %5.2f secs, total %5.2f secs" % (NUM_PAGES, clock3 - clock2, clock4 - clock1) # print result print_result(handles) lock = RLock() if 1: test_multi() test_threads() test_threads(lock) test_thread_pool(lock) else: test_thread_pool(lock) test_threads(lock) test_threads() test_multi() tests/test_xmlrpc.py000066600000001350150501000730010616 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_xmlrpc.py,v 1.7 2003/04/21 18:46:11 mfx Exp $ ## XML-RPC lib included in python2.2 import xmlrpclib import pycurl # Header fields passed in request xmlrpc_header = [ "User-Agent: PycURL XML-RPC Test", "Content-Type: text/xml" ] # XML-RPC request template xmlrpc_template = """ %s%s """ # Engage c = pycurl.Curl() c.setopt(c.URL, 'http://betty.userland.com/RPC2') c.setopt(c.POST, 1) c.setopt(c.HTTPHEADER, xmlrpc_header) c.setopt(c.POSTFIELDS, xmlrpc_template % ("examples.getStateName", xmlrpclib.dumps((5,)))) print 'Response from http://betty.userland.com/' c.perform() c.close() tests/test_multi4.py000066600000002570150501000730010534 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_multi4.py,v 1.14 2005/03/11 13:24:45 kjetilja Exp $ import sys, select, time import pycurl c1 = pycurl.Curl() c2 = pycurl.Curl() c3 = pycurl.Curl() c1.setopt(c1.URL, "http://www.python.org") c2.setopt(c2.URL, "http://curl.haxx.se") c3.setopt(c3.URL, "http://slashdot.org") c1.body = open("doc1", "wb") c2.body = open("doc2", "wb") c3.body = open("doc3", "wb") c1.setopt(c1.WRITEFUNCTION, c1.body.write) c2.setopt(c2.WRITEFUNCTION, c2.body.write) c3.setopt(c3.WRITEFUNCTION, c3.body.write) m = pycurl.CurlMulti() m.add_handle(c1) m.add_handle(c2) m.add_handle(c3) # Number of seconds to wait for a timeout to happen SELECT_TIMEOUT = 1.0 # Stir the state machine into action while 1: ret, num_handles = m.perform() if ret != pycurl.E_CALL_MULTI_PERFORM: break # Keep going until all the connections have terminated while num_handles: apply(select.select, m.fdset() + (SELECT_TIMEOUT,)) while 1: ret, num_handles = m.perform() if ret != pycurl.E_CALL_MULTI_PERFORM: break # Cleanup m.remove_handle(c3) m.remove_handle(c2) m.remove_handle(c1) m.close() c1.body.close() c2.body.close() c3.body.close() c1.close() c2.close() c3.close() print "http://www.python.org is in file doc1" print "http://curl.haxx.se is in file doc2" print "http://slashdot.org is in file doc3" tests/test_ftp.py000066600000000441150501000730010102 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_ftp.py,v 1.1 2006/08/24 07:36:03 kjetilja Exp $ import pycurl c = pycurl.Curl() c.setopt(c.URL, 'ftp://ftp.sunet.se/') c.setopt(c.FTP_USE_EPSV, 1) c.setopt(c.QUOTE, ['cwd pub', 'type i']) c.perform() c.close() tests/test_post.py000066600000001115150501000730010275 0ustar00#! 
/usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_post.py,v 1.9 2003/04/21 18:46:11 mfx Exp $ import urllib import pycurl # simple pf = {'field1': 'value1'} # multiple fields pf = {'field1':'value1', 'field2':'value2 with blanks', 'field3':'value3'} # multiple fields with & in field pf = {'field1':'value1', 'field2':'value2 with blanks and & chars', 'field3':'value3'} c = pycurl.Curl() c.setopt(c.URL, 'http://pycurl.sourceforge.net/tests/testpostvars.php') c.setopt(c.POSTFIELDS, urllib.urlencode(pf)) c.setopt(c.VERBOSE, 1) c.perform() c.close() tests/test_gtk.py000066600000005255150501000730010106 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_gtk.py,v 1.24 2005/03/30 12:05:50 kjetilja Exp $ import sys, threading import pycurl import pygtk pygtk.require('2.0') import gtk # We should ignore SIGPIPE when using pycurl.NOSIGNAL - see # the libcurl tutorial for more info. try: import signal from signal import SIGPIPE, SIG_IGN signal.signal(signal.SIGPIPE, signal.SIG_IGN) except ImportError: pass class ProgressBar: def __init__(self, uri): self.round = 0.0 win = gtk.Window(gtk.WINDOW_TOPLEVEL) win.set_title("PycURL progress") win.show() vbox = gtk.VBox(spacing=5) vbox.set_border_width(10) win.add(vbox) win.set_default_size(200, 20) vbox.show() label = gtk.Label("Downloading %s" % uri) label.set_alignment(0, 0.5) vbox.pack_start(label) label.show() pbar = gtk.ProgressBar() pbar.show() self.pbar = pbar vbox.pack_start(pbar) win.connect("destroy", self.close_app) def progress(self, download_t, download_d, upload_t, upload_d): if download_t == 0: self.round = self.round + 0.1 if self.round >= 1.0: self.round = 0.0 else: self.round = float(download_d) / float(download_t) gtk.threads_enter() self.pbar.set_fraction(self.round) gtk.threads_leave() def mainloop(self): gtk.threads_enter() gtk.main() gtk.threads_leave() def close_app(self, *args): args[0].destroy() gtk.main_quit() class Test(threading.Thread): def __init__(self, url, target_file, progress): threading.Thread.__init__(self) self.target_file = target_file self.progress = progress self.curl = pycurl.Curl() self.curl.setopt(pycurl.URL, url) self.curl.setopt(pycurl.WRITEDATA, self.target_file) self.curl.setopt(pycurl.FOLLOWLOCATION, 1) self.curl.setopt(pycurl.NOPROGRESS, 0) self.curl.setopt(pycurl.PROGRESSFUNCTION, self.progress) self.curl.setopt(pycurl.MAXREDIRS, 5) self.curl.setopt(pycurl.NOSIGNAL, 1) def run(self): self.curl.perform() self.curl.close() self.target_file.close() self.progress(1.0, 1.0, 0, 0) # Check command line args if len(sys.argv) < 3: print "Usage: %s " % sys.argv[0] raise SystemExit # Make a progress bar window p = ProgressBar(sys.argv[1]) # Start thread for fetching url Test(sys.argv[1], open(sys.argv[2], 'wb'), p.progress).start() # Enter the GTK mainloop gtk.threads_init() try: p.mainloop() except KeyboardInterrupt: pass tests/test_cb.py000066600000001265150501000730007702 0ustar00#! 
/usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_cb.py,v 1.14 2003/04/21 18:46:10 mfx Exp $ import sys import pycurl ## Callback function invoked when body data is ready def body(buf): # Print body data to stdout sys.stdout.write(buf) ## Callback function invoked when header data is ready def header(buf): # Print header data to stderr sys.stderr.write(buf) c = pycurl.Curl() c.setopt(pycurl.URL, 'http://www.python.org/') c.setopt(pycurl.WRITEFUNCTION, body) c.setopt(pycurl.HEADERFUNCTION, header) c.setopt(pycurl.FOLLOWLOCATION, 1) c.setopt(pycurl.MAXREDIRS, 5) c.perform() c.setopt(pycurl.URL, 'http://curl.haxx.se/') c.perform() c.close() tests/test_multi_timer.py000066600000003325150501000730011647 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_multi_timer.py,v 1.1 2006/11/10 12:25:29 kjetilja Exp $ import os, sys try: from cStringIO import StringIO except ImportError: from StringIO import StringIO import pycurl urls = ( "http://curl.haxx.se", "http://www.python.org", "http://pycurl.sourceforge.net", ) # Read list of URIs from file specified on commandline try: urls = open(sys.argv[1], "rb").readlines() except IndexError: # No file was specified pass # timer callback def timer(msecs): print 'Timer callback msecs:', msecs # init m = pycurl.CurlMulti() m.setopt(pycurl.M_PIPELINING, 1) m.setopt(pycurl.M_TIMERFUNCTION, timer) m.handles = [] for url in urls: c = pycurl.Curl() # save info in standard Python attributes c.url = url c.body = StringIO() c.http_code = -1 m.handles.append(c) # pycurl API calls c.setopt(c.URL, c.url) c.setopt(c.WRITEFUNCTION, c.body.write) m.add_handle(c) # get data num_handles = len(m.handles) while num_handles: while 1: ret, num_handles = m.perform() if ret != pycurl.E_CALL_MULTI_PERFORM: break # currently no more I/O is pending, could do something in the meantime # (display a progress bar, etc.) m.select(1.0) # close handles for c in m.handles: # save info in standard Python attributes c.http_code = c.getinfo(c.HTTP_CODE) # pycurl API calls m.remove_handle(c) c.close() m.close() # print result for c in m.handles: data = c.body.getvalue() if 0: print "**********", c.url, "**********" print data else: print "%-53s http_code %3d, %6d bytes" % (c.url, c.http_code, len(data)) tests/test_multi6.py000066600000003000150501000730010523 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_multi6.py,v 1.6 2005/03/11 13:24:45 kjetilja Exp $ import sys, select, time import pycurl c1 = pycurl.Curl() c2 = pycurl.Curl() c3 = pycurl.Curl() c1.setopt(c1.URL, "http://www.python.org") c2.setopt(c2.URL, "http://curl.haxx.se") c3.setopt(c3.URL, "http://slashdot.org") c1.body = open("doc1", "wb") c2.body = open("doc2", "wb") c3.body = open("doc3", "wb") c1.setopt(c1.WRITEFUNCTION, c1.body.write) c2.setopt(c2.WRITEFUNCTION, c2.body.write) c3.setopt(c3.WRITEFUNCTION, c3.body.write) m = pycurl.CurlMulti() m.add_handle(c1) m.add_handle(c2) m.add_handle(c3) # Number of seconds to wait for a timeout to happen SELECT_TIMEOUT = 1.0 # Stir the state machine into action while 1: ret, num_handles = m.perform() if ret != pycurl.E_CALL_MULTI_PERFORM: break # Keep going until all the connections have terminated while num_handles: # The select method uses fdset internally to determine which file descriptors # to check. 
m.select(SELECT_TIMEOUT) while 1: ret, num_handles = m.perform() # Print the message, if any print m.info_read(1) if ret != pycurl.E_CALL_MULTI_PERFORM: break # Cleanup m.remove_handle(c3) m.remove_handle(c2) m.remove_handle(c1) m.close() c1.body.close() c2.body.close() c3.body.close() c1.close() c2.close() c3.close() print "http://www.python.org is in file doc1" print "http://curl.haxx.se is in file doc2" print "http://slashdot.org is in file doc3" tests/test_getinfo.py000066600000002613150501000730010747 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_getinfo.py,v 1.18 2003/05/01 19:35:01 mfx Exp $ import time import pycurl ## Callback function invoked when progress information is updated def progress(download_t, download_d, upload_t, upload_d): print "Total to download %d bytes, have %d bytes so far" % \ (download_t, download_d) url = "http://www.cnn.com" print "Starting downloading", url print f = open("body", "wb") h = open("header", "wb") c = pycurl.Curl() c.setopt(c.URL, url) c.setopt(c.WRITEDATA, f) c.setopt(c.NOPROGRESS, 0) c.setopt(c.PROGRESSFUNCTION, progress) c.setopt(c.FOLLOWLOCATION, 1) c.setopt(c.MAXREDIRS, 5) c.setopt(c.WRITEHEADER, h) c.setopt(c.OPT_FILETIME, 1) c.perform() print print "HTTP-code:", c.getinfo(c.HTTP_CODE) print "Total-time:", c.getinfo(c.TOTAL_TIME) print "Download speed: %.2f bytes/second" % c.getinfo(c.SPEED_DOWNLOAD) print "Document size: %d bytes" % c.getinfo(c.SIZE_DOWNLOAD) print "Effective URL:", c.getinfo(c.EFFECTIVE_URL) print "Content-type:", c.getinfo(c.CONTENT_TYPE) print "Namelookup-time:", c.getinfo(c.NAMELOOKUP_TIME) print "Redirect-time:", c.getinfo(c.REDIRECT_TIME) print "Redirect-count:", c.getinfo(c.REDIRECT_COUNT) epoch = c.getinfo(c.INFO_FILETIME) print "Filetime: %d (%s)" % (epoch, time.ctime(epoch)) print print "Header is in file 'header', body is in file 'body'" c.close() f.close() h.close() tests/test_debug.py000066600000000524150501000730010401 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_debug.py,v 1.6 2003/04/21 18:46:10 mfx Exp $ import pycurl def test(t, b): print "debug(%d): %s" % (t, b) c = pycurl.Curl() c.setopt(pycurl.URL, 'http://curl.haxx.se/') c.setopt(pycurl.VERBOSE, 1) c.setopt(pycurl.DEBUGFUNCTION, test) c.perform() c.close() tests/test_multi_socket_select.py000066600000004617150501000730013363 0ustar00#! 
/usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_multi_socket_select.py,v 1.1 2008/06/11 18:11:46 kjetilja Exp $ import os, sys try: from cStringIO import StringIO except ImportError: from StringIO import StringIO import pycurl import select sockets = set() timeout = 0 urls = ( "http://curl.haxx.se", "http://www.python.org", "http://pycurl.sourceforge.net", ) # Read list of URIs from file specified on commandline try: urls = open(sys.argv[1], "rb").readlines() except IndexError: # No file was specified pass # timer callback def timer(msecs): global timeout timeout = msecs print 'Timer callback msecs:', msecs # socket callback def socket(event, socket, multi, data): if event == pycurl.POLL_REMOVE: print "Remove Socket %d"%socket sockets.remove(socket) else: if socket not in sockets: print "Add socket %d"%socket sockets.add(socket) print event, socket, multi, data # init m = pycurl.CurlMulti() m.setopt(pycurl.M_PIPELINING, 1) m.setopt(pycurl.M_TIMERFUNCTION, timer) m.setopt(pycurl.M_SOCKETFUNCTION, socket) m.handles = [] for url in urls: c = pycurl.Curl() # save info in standard Python attributes c.url = url c.body = StringIO() c.http_code = -1 m.handles.append(c) # pycurl API calls c.setopt(c.URL, c.url) c.setopt(c.WRITEFUNCTION, c.body.write) m.add_handle(c) # get data num_handles = len(m.handles) while (pycurl.E_CALL_MULTI_PERFORM==m.socket_all()[0]): pass timeout = m.timeout() while True: (rr, wr, er) = select.select(sockets,sockets,sockets,timeout/1000.0) socketSet = set(rr+wr+er) if socketSet: for s in socketSet: while True: (ret,running) = m.socket_action(s,0) if ret!=pycurl.E_CALL_MULTI_PERFORM: break else: (ret,running) = m.socket_action(pycurl.SOCKET_TIMEOUT,0) if running==0: break # close handles for c in m.handles: # save info in standard Python attributes c.http_code = c.getinfo(c.HTTP_CODE) # pycurl API calls m.remove_handle(c) c.close() m.close() # print result for c in m.handles: data = c.body.getvalue() if 0: print "**********", c.url, "**********" print data else: print "%-53s http_code %3d, %6d bytes" % (c.url, c.http_code, len(data)) tests/util.py000066600000001665150501000730007240 0ustar00# -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: util.py,v 1.4 2003/04/21 18:46:11 mfx Exp $ import os, sys # # prepare sys.path in case we are still in the build directory # see also: distutils/command/build.py (build_platlib) # def get_sys_path(p=None): if p is None: p = sys.path p = p[:] try: from distutils.util import get_platform except ImportError: return p p0 = "" if p: p0 = p[0] # plat = get_platform() plat_specifier = "%s-%s" % (plat, sys.version[:3]) ##print plat, plat_specifier # for prefix in (p0, os.curdir, os.pardir,): if not prefix: continue d = os.path.join(prefix, "build") for subdir in ("lib", "lib." + plat_specifier, "lib." + plat): dir = os.path.normpath(os.path.join(d, subdir)) if os.path.isdir(dir): if dir not in p: p.insert(1, dir) # return p tests/test_multi.py000066600000001244150501000730010445 0ustar00#! 
/usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_multi.py,v 1.10 2005/03/11 13:24:45 kjetilja Exp $ import pycurl m = pycurl.CurlMulti() m.handles = [] c1 = pycurl.Curl() c2 = pycurl.Curl() c1.setopt(c1.URL, 'http://curl.haxx.se') c2.setopt(c2.URL, 'http://cnn.com') c2.setopt(c2.FOLLOWLOCATION, 1) m.add_handle(c1) m.add_handle(c2) m.handles.append(c1) m.handles.append(c2) num_handles = len(m.handles) while num_handles: while 1: ret, num_handles = m.perform() if ret != pycurl.E_CALL_MULTI_PERFORM: break m.select(1.0) m.remove_handle(c2) m.remove_handle(c1) del m.handles m.close() c1.close() c2.close() tests/test_multi2.py000066600000003322150501000730010526 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_multi2.py,v 1.15 2007/04/10 13:26:45 kjetilja Exp $ import os, sys try: from cStringIO import StringIO except ImportError: from StringIO import StringIO import pycurl urls = ( "http://curl.haxx.se", "http://www.python.org", "http://pycurl.sourceforge.net", "http://pycurl.sourceforge.net/tests/403_FORBIDDEN", # that actually exists ;-) "http://pycurl.sourceforge.net/tests/404_NOT_FOUND", ) # Read list of URIs from file specified on commandline try: urls = open(sys.argv[1], "rb").readlines() except IndexError: # No file was specified pass # init m = pycurl.CurlMulti() m.handles = [] for url in urls: c = pycurl.Curl() # save info in standard Python attributes c.url = url.rstrip() c.body = StringIO() c.http_code = -1 m.handles.append(c) # pycurl API calls c.setopt(c.URL, c.url) c.setopt(c.WRITEFUNCTION, c.body.write) m.add_handle(c) # get data num_handles = len(m.handles) while num_handles: while 1: ret, num_handles = m.perform() if ret != pycurl.E_CALL_MULTI_PERFORM: break # currently no more I/O is pending, could do something in the meantime # (display a progress bar, etc.) m.select(1.0) # close handles for c in m.handles: # save info in standard Python attributes c.http_code = c.getinfo(c.HTTP_CODE) # pycurl API calls m.remove_handle(c) c.close() m.close() # print result for c in m.handles: data = c.body.getvalue() if 0: print "**********", c.url, "**********" print data else: print "%-53s http_code %3d, %6d bytes" % (c.url, c.http_code, len(data)) tests/test_post3.py000066600000001444150501000730010365 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # $Id: test_post3.py,v 1.1 2004/06/21 11:24:18 kjetilja Exp $ import urllib POSTSTRING = urllib.urlencode({'field1':'value1', 'field2':'value2 with blanks', 'field3':'value3'}) class test: def __init__(self): self.finished = False def read_cb(self, size): assert len(POSTSTRING) <= size if not self.finished: self.finished = True return POSTSTRING else: # Nothing more to read return "" import pycurl c = pycurl.Curl() t = test() c.setopt(c.URL, 'http://pycurl.sourceforge.net/tests/testpostvars.php') c.setopt(c.POST, 1) c.setopt(c.POSTFIELDSIZE, len(POSTSTRING)) c.setopt(c.READFUNCTION, t.read_cb) c.setopt(c.VERBOSE, 1) c.perform() c.close() ChangeLog000066600000070326150501000730006321 0ustar00Version 7.19.0 [requires libcurl-7.19.0 or better] -------------- * Added CURLFILE, ADDRESS_SCOPE and ISSUERCERT options, as well as the APPCONNECT_TIME info. * Added PRIMARY_IP info (patch by Yuhui H ). * Added support for curl_easy_reset through a new 'reset' method on curl objects (patch by Nick Pilon ). * Added support for OPENSOCKET callbacks. See 'tests/test_opensocket.py' for example usage (patch by Thomas Hunger ). 
Version 7.18.2 -------------- * Added REDIRECT_URL info and M_MAXCONNECTS option (patch by Yuhui H ). * Added socket_action() method to CurlMulti objects. See 'tests/test_multi_socket_select.py' for example usage (patch by Yuhui H ). * Added AUTOREFERER option. * Allow resetting some list operations (HTTPHEADER, QUOTE, POSTQUOTE, PREQUOTE) by passing an empty list to setopt (patch by Jim Patterson). Version 7.18.1 -------------- * Added POST301, SSH_HOST_PUBLIC_KEY_MD5, COPYPOSTFIELDS and PROXY_TRANSFER_MODE options. * Check for static libs in setup.py to better detect whether libcurl was linked with OpenSSL or GNUTLS. * PycURL is now dual licensed under the LGPL and a license similar to the cURL license (an MIT/X derivative). Version 7.16.4 -------------- * Allow any callable object as the callback function. This change comes handy when you would like to use objects which are callable but are not functions or methods, for example those objects created by the functions in the functools module (patch by Daniel Pena Arteaga ). * Added NEW_DIRECTORY_PERMS and NEW_FILE_PERMS options. Version 7.16.2.1 ---------------- * Added IOCMD_NOP and IOCMD_RESTARTREAD for ioctl callback handling (patch by Mark Eichin). * Use Py_ssize_t where appropriate for Python 2.5 and 64-bit compatibility. This fixes the problem reported by Aaron Hill, where the exception "pycurl.error: (2, '')" is thrown when calling setopt(pycurl.POSTFIELDS,...) on 64-bit platforms. Version 7.16.2 -------------- * Added options HTTP_TRANSFER_DECODING, HTTP_CONTENT_DECODING, TIMEOUT_MS, CONNECTTIMEOUT_MS from libcurl 7.16.2. * Right-strip URLs read from files in the test scripts to avoid sending requests with '\n' at the end. Version 7.16.1 -------------- * Added constants for all libcurl (error) return codes. They are named the same as the macro constants in curl.h but prefixed with E_ instead of CURLE. Return codes for the multi API are prefixed with M_ instead of CURLM. * Added CURLOPT_FTP_SSL_CCC, CURLOPT_SSH_PUBLIC_KEYFILE, CURLOPT_SSH_PRIVATE_KEYFILE, CURLOPT_SSH_AUTH_TYPES. * Removed CLOSEPOLICY and friends since this option is now deprecated in libcurl. * Set the _use_datetime attribute on the CURLTransport class to unbreak xmlrpc_curl.py on Python 2.5. Version 7.16.0 [no public release] -------------- * Added CURLOPT_SSL_SESSIONID_CACHE. * Removed SOURCE_* options since they are no longer supported by libcurl. Version 7.15.5.1 ---------------- * Added test for basic ftp usage (tests/test_ftp.py). * Fix broken ssl mutex lock function when using GNU TLS (Debian bug #380156, fix by Bastian Kleineidam) Version 7.15.5 -------------- * Added CURLOPT_FTP_ALTERNATIVE_TO_USER, CURLOPT_MAX_SEND_SPEED_LARGE, and CURLOPT_MAX_RECV_SPEED_LARGE. Version 7.15.4.2 ---------------- * Use SSL locking callbacks, fixes random crashes for multithreaded SSL connections (patch by Jayne ). Version 7.15.4.1 ---------------- * Fixed compilation problem with C compilers not allowing declarations in the middle of code blocks (patch by K.S.Sreeram ). * Fixed bug in curl_multi_fdset wrapping, max_fd < 0 is not an error (patch by K.S.Sreeram ). Version 7.15.4 -------------- * Added support for libcurl shares, patch from Victor Lascurain . See the file tests/test_share.py for example usage. * Added support for CURLINFO_FTP_ENTRY_PATH. Version 7.15.2 -------------- * Added CURLOPT_CONNECT_ONLY, CURLINFO_LASTSOCKET, CURLOPT_LOCALPORT and CURLOPT_LOCALPORTRANGE. 
Version 7.15.1 -------------- 2006-01-31 Kjetil Jacobsen * Fixed memory leak for getinfo calls that return a list as result. Patch by Paul Pacheco. Version 7.15.0 -------------- 2005-10-18 Kjetil Jacobsen * Added CURLOPT_FTP_SKIP_PASV_IP. Version 7.14.1 -------------- 2005-09-05 Kjetil Jacobsen * Added CURLOPT_IGNORE_CONTENT_LENGTH, CURLOPT_COOKIELIST as COOKIELIST and CURLINFO_COOKIELIST as INFO_COOKIELIST. Version 7.14.0 -------------- 2005-05-18 Kjetil Jacobsen * Added missing information returned from the info() method in the high-level interface. * Added the FORM_FILENAME option to the CURLFORM API with HTTPPOST. Version 7.13.2 -------------- 2005-03-30 Kjetil Jacobsen * Unbreak tests/test_gtk.py and require pygtk >= 2.0. 2005-03-15 Kjetil Jacobsen * Cleaned up several of the examples. 2005-03-11 Kjetil Jacobsen * WARNING: multi.select() now requires the previously optional timeout parameter. Updated the tests and examples to reflect this change. If the timeout is not set, select could block infinitely and cause problems for the internal timeout handling in the multi stack. The problem was identified by . Version 7.13.1 -------------- 2005-03-04 Kjetil Jacobsen * Use METH_NOARGS where appropriate. 2005-03-03 Kjetil Jacobsen * Added support for CURLFORM API with HTTPPOST: Supports a a tuple with pairs of options and values instead of just supporting string contents. See tests/test_post2.py for example usage. Options are FORM_CONTENTS, FORM_FILE and FORM_CONTENTTYPE, corresponding to the CURLFORM_* options, and values are strings. 2005-02-13 Markus F.X.J. Oberhumer * Read callbacks (pycurl.READFUNCTION) can now return pycurl.READFUNC_ABORT to immediately abort the current transfer. * The INFILESIZE, MAXFILESIZE, POSTFIELDSIZE and RESUME_FROM options now automatically use the largefile version to handle files > 2GB. * Added missing pycurl.PORT constant. Version 7.13.0 -------------- 2005-02-10 Kjetil Jacobsen * Added file_upload.py to examples, shows how to upload a file. * Added CURLOPT_IOCTLFUNCTION/DATA. * Added options from libcurl 7.13.0: FTP_ACCOUNT, SOURCE_URL, SOURCE_QUOTE. * Obsoleted options: SOURCE_HOST, SOURCE_PATH, SOURCE_PORT, PASV_HOST. Version 7.12.3 -------------- 2004-12-22 Markus F.X.J. Oberhumer * Added CURLINFO_NUM_CONNECTS and CURLINFO_SSL_ENGINES. * Added some other missing constants. * Updated pycurl.version_info() to return a 12-tuple instead of a 9-tuple. Version 7.12.2 -------------- 2004-10-15 Kjetil Jacobsen * Added CURLOPT_FTPSSLAUTH (and CURLFTPAUTH_*). * Added CURLINFO_OS_ERRNO. 2004-08-17 Kjetil Jacobsen * Use LONG_LONG instead of PY_LONG_LONG to make pycurl compile on Python versions < 2.3 (fix from Domenico Andreoli ). Version 7.12.1 -------------- 2004-08-02 Kjetil Jacobsen * Added INFOTYPE_SSL_DATA_IN/OUT. 2004-07-16 Markus F.X.J. Oberhumer * WARNING: removed deprecated PROXY_, TIMECOND_ and non-prefixed INFOTYPE constant names. See ChangeLog entry 2003-06-10. 2004-06-21 Kjetil Jacobsen * Added test program for HTTP post using the read callback (see tests/test_post3.py for details). * Use the new CURL_READFUNC_ABORT return code where appropriate to avoid hanging in perform() when read callbacks are used. * Added support for libcurl 7.12.1 CURLOPT features: SOURCE_HOST, SOURCE_USERPWD, SOURCE_PATH, SOURCE_PORT, PASV_HOST, SOURCE_PREQUOTE, SOURCE_POSTQUOTE. 2004-06-08 Markus F.X.J. Oberhumer * Setting CURLOPT_POSTFIELDS now allows binary data and automatically sets CURLOPT_POSTFIELDSIZE for you. 
If you really want a different size you have to manually set POSTFIELDSIZE after setting POSTFIELDS. (Based on a patch by Martin Muenstermann). 2004-06-05 Markus F.X.J. Oberhumer * Added stricter checks within the callback handlers. * Unify the behaviour of int and long parameters where appropriate. Version 7.12 ------------ 2004-05-18 Kjetil Jacobsen * WARNING: To simplify code maintenance pycurl now requires libcurl 7.11.2 and Python 2.2 or newer to work. * GC support is now always enabled. Version 7.11.3 -------------- 2004-04-30 Kjetil Jacobsen * Do not use the deprecated curl_formparse function. API CHANGE: HTTPPOST now takes a list of tuples where each tuple contains a form name and a form value, both strings (see test/test_post2.py for example usage). * Found a possible reference count bug in the multithreading code which may have contributed to the long-standing GC segfault which has haunted pycurl. Fingers crossed. Version 7.11.2 -------------- 2004-04-21 Kjetil Jacobsen * Added support for libcurl 7.11.2 CURLOPT features: CURLOPT_TCP_NODELAY. 2004-03-25 Kjetil Jacobsen * Store Python longs in off_t with PyLong_AsLongLong instead of PyLong_AsLong. Should make the options which deal with large files behave a little better. Note that this requires the long long support in Python 2.2 or newer to work properly. Version 7.11.1 -------------- 2004-03-16 Kjetil Jacobsen * WARNING: Removed support for the PASSWDFUNCTION callback, which is no longer supported by libcurl. 2004-03-15 Kjetil Jacobsen * Added support for libcurl 7.11.1 CURLOPT features: CURLOPT_POSTFIELDSIZE_LARGE. Version 7.11.0 -------------- 2004-02-11 Kjetil Jacobsen * Added support for libcurl 7.11.0 CURLOPT features: INFILESIZE_LARGE, RESUME_FROM_LARGE, MAXFILESIZE_LARGE and FTP_SSL. * Circular garbage collection support can now be enabled or disabled by passing the '--use-gc=[1|0]' parameter to setup.py when building pycurl. * HTTP_VERSION options are known as CURL_HTTP_VERSION_NONE, CURL_HTTP_VERSION_1_0, CURL_HTTP_VERSION_1_1 and CURL_HTTP_VERSION_LAST. 2003-11-16 Markus F.X.J. Oberhumer * Added support for these new libcurl 7.11.0 features: CURLOPT_NETRC_FILE. Version 7.10.8 -------------- 2003-11-04 Markus F.X.J. Oberhumer * Added support for these new libcurl 7.10.8 features: CURLOPT_FTP_RESPONSE_TIMEOUT, CURLOPT_IPRESOLVE, CURLOPT_MAXFILESIZE, CURLINFO_HTTPAUTH_AVAIL, CURLINFO_PROXYAUTH_AVAIL, CURL_IPRESOLVE_* constants. * Added support for these new libcurl 7.10.7 features: CURLOPT_FTP_CREATE_MISSING_DIRS, CURLOPT_PROXYAUTH, CURLINFO_HTTP_CONNECTCODE. 2003-10-28 Kjetil Jacobsen * Added missing CURLOPT_ENCODING option (patch by Martijn Boerwinkel ) Version 7.10.6 -------------- 2003-07-29 Markus F.X.J. Oberhumer * Started working on support for CURLOPT_SSL_CTX_FUNCTION and CURLOPT_SSL_CTX_DATA (libcurl-7.10.6) - not yet finished. 2003-06-10 Markus F.X.J. Oberhumer * Added support for CURLOPT_HTTPAUTH (libcurl-7.10.6), including the new HTTPAUTH_BASIC, HTTPAUTH_DIGEST, HTTPAUTH_GSSNEGOTIATE and HTTPAUTH_NTML constants. * Some constants were renamed for consistency: All curl_infotype constants are now prefixed with "INFOTYPE_", all curl_proxytype constants are prefixed with "PROXYTYPE_" instead of "PROXY_", and all curl_TimeCond constants are now prefixed with "TIMECONDITION_" instead of "TIMECOND_". (The old names are still available but will get removed in a future release.) * WARNING: Removed the deprecated pycurl.init() and pycurl.multi_init() names - use pycurl.Curl() and pycurl.CurlMulti() instead. 
* WARNING: Removed the deprecated Curl.cleanup() and CurlMulti.cleanup() methods - use Curl.close() and CurlMulti.close() instead. Version 7.10.5 -------------- 2003-05-15 Markus F.X.J. Oberhumer * Added support for CURLOPT_FTP_USE_EPRT (libcurl-7.10.5). * Documentation updates. 2003-05-07 Eric S. Raymond * Lifted all HTML docs to clean XHTML, verified by tidy. 2003-05-02 Markus F.X.J. Oberhumer * Fixed some `int' vs. `long' mismatches that affected 64-bit systems. * Fixed wrong pycurl.CAPATH constant. 2003-05-01 Markus F.X.J. Oberhumer * Added new method Curl.errstr() which returns the internal libcurl error buffer string of the handle. Version 7.10.4.2 ---------------- 2003-04-15 Markus F.X.J. Oberhumer * Allow compilation against the libcurl-7.10.3 release - some recent Linux distributions (e.g. Mandrake 9.1) ship with 7.10.3, and apart from the new CURLOPT_UNRESTRICTED_AUTH option there is no need that we require libcurl-7.10.4. Version 7.10.4 -------------- 2003-04-01 Kjetil Jacobsen * Markus added CURLOPT_UNRESTRICTED_AUTH (libcurl-7.10.4). 2003-02-25 Kjetil Jacobsen * Fixed some broken test code and removed the fileupload test since it didn't work properly. 2003-01-28 Kjetil Jacobsen * Some documentation updates by Markus and me. 2003-01-22 Kjetil Jacobsen * API CHANGE: the CurlMulti.info_read() method now returns a separate array with handles that failed. Each entry in this array is a tuple with (curl object, error number, error message). This addition makes it simpler to do error checking of individual curl objects when using the multi interface. Version 7.10.3 -------------- 2003-01-13 Kjetil Jacobsen * PycURL memory usage has been reduced. 2003-01-10 Kjetil Jacobsen * Added 'examples/retriever-multi.py' which shows how to retrieve a set of URLs concurrently using the multi interface. 2003-01-09 Kjetil Jacobsen * Added support for CURLOPT_HTTP200ALIASES. 2002-11-22 Kjetil Jacobsen * Updated pycurl documentation in the 'doc' directory. 2002-11-21 Kjetil Jacobsen * Updated and improved 'examples/curl.py'. * Added 'tests/test_multi6.py' which shows how to use the info_read method with CurlMulti. 2002-11-19 Kjetil Jacobsen * Added new method CurlMulti.info_read(). Version 7.10.2 -------------- 2002-11-14 Kjetil Jacobsen * Free options set with setopt after cleanup is called, as cleanup assumes that options are still valid when invoked. This fixes the bug with COOKIEJAR reported by Bastiaan Naber . 2002-11-06 Markus F.X.J. Oberhumer * Install documentation under /usr/share/doc instead of /usr/doc. Also, start shipping the (unfinished) HTML docs and some basic test scripts. 2002-10-30 Markus F.X.J. Oberhumer * API CHANGE: For integral values, Curl.getinfo() now returns a Python-int instead of a Python-long. Version 7.10.1 -------------- 2002-10-03 Markus F.X.J. Oberhumer * Added new module-level function version_info() from libcurl-7.10. Version 7.10 ------------ 2002-09-13 Kjetil Jacobsen * Added commandline options to setup.py for specifying the path to 'curl-config' (non-windows) and the curl installation directory (windows). See the 'INSTALL' file for details. * Added CURLOPT_ENCODING, CURLOPT_NOSIGNAL and CURLOPT_BUFFERSIZE from libcurl-7.10 (by Markus Oberhumer). Version 7.9.8.4 --------------- 2002-08-28 Kjetil Jacobsen * Added a simple web-browser example based on gtkhtml and pycurl. See the file 'examples/gtkhtml_demo.py' for details. The example requires a working installation of gnome-python with gtkhtml bindings enabled (pass --with-gtkhtml to gnome-python configure). 
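The 7.10.4 entry above changes CurlMulti.info_read() to report failed handles separately, each as a (curl object, error number, error message) tuple. A minimal sketch of per-handle error checking with the multi interface, along the lines of examples/retriever-multi.py (the URLs are invented for illustration; this is not code from the distribution):

import pycurl

m = pycurl.CurlMulti()
handles = []
for url in ('http://curl.haxx.se', 'http://www.python.org'):
    c = pycurl.Curl()
    c.setopt(c.URL, url)
    c.setopt(c.WRITEFUNCTION, lambda data: None)   # discard the body
    m.add_handle(c)
    handles.append(c)

# Drive the transfers to completion.
num_handles = len(handles)
while num_handles:
    while 1:
        ret, num_handles = m.perform()
        if ret != pycurl.E_CALL_MULTI_PERFORM:
            break
    m.select(1.0)

# info_read() returns (queued messages, successful handles, failed handles);
# each failure is a (curl object, errno, errmsg) tuple.
num_q, ok_list, err_list = m.info_read()
for c in ok_list:
    print "OK:    ", c.getinfo(pycurl.EFFECTIVE_URL)
for c, errno, errmsg in err_list:
    print "Failed:", errno, errmsg

for c in handles:
    m.remove_handle(c)
    c.close()
m.close()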
2002-08-14 Kjetil Jacobsen * Added new method 'select' on CurlMulti objects. Example usage in 'tests/test_multi5.py'. This method is just an optimization of the combined use of fdset and select. 2002-08-12 Kjetil Jacobsen * Added support for curl_multi_fdset. See the file 'tests/test_multi4.py' for example usage. Contributed by Conrad Steenberg . * perform() on multi objects now returns a tuple (result, number of handles) like the libcurl interface does. 2002-08-08 Kjetil Jacobsen * Added the 'sfquery' script which retrieves a SourceForge XML export object for a given project. See the file 'examples/sfquery.py' for details and usage. 'sfquery' was contributed by Eric S. Raymond . 2002-07-20 Markus F.X.J. Oberhumer * API enhancements: added Curl() and CurlMulti() as aliases for init() and multi_init(), and added close() methods as aliases for the cleanup() methods. The new names much better match the actual intended use of the objects, and they also nicely correspond to Python's file object. * Also, all constants for Curl.setopt() and Curl.getinfo() are now visible from within Curl objects. All changes are fully backward-compatible. Version 7.9.8.3 --------------- 2002-07-16 Markus F.X.J. Oberhumer * Under Python 2.2 or better, Curl and CurlMulti objects now automatically participate in cyclic garbarge collection (using the gc module). Version 7.9.8.2 --------------- 2002-07-05 Markus F.X.J. Oberhumer * Curl and CurlMulti objects now support standard Python attributes. See tests/test_multi2.py for an example. 2002-07-02 Kjetil Jacobsen * Added support for the multi-interface. Version 7.9.8.1 --------------- 2002-06-25 Markus F.X.J. Oberhumer * Fixed a couple of `int' vs. `size_t' mismatches in callbacks and Py_BuildValue() calls. 2002-06-25 Kjetil Jacobsen * Use 'double' type instead of 'size_t' for progress callbacks (by Conrad Steenberg ). Also cleaned up some other type mismatches in the callback interfaces. 2002-06-24 Kjetil Jacobsen * Added example code on how to upload a file using HTTPPOST in pycurl (code by Amit Mongia ). See the file 'test_fileupload.py' for details. Version 7.9.8 ------------- 2002-06-24 Kjetil Jacobsen * Resolved some build problems on Windows (by Markus Oberhumer). 2002-06-19 Kjetil Jacobsen * Added CURLOPT_CAPATH. * Added option constants for CURLOPT_NETRC: CURL_NETRC_OPTIONAL, CURL_NETRC_IGNORED and CURL_NETRC_REQUIRED. * Added option constants for CURLOPT_TIMECONDITION: TIMECOND_IFMODSINCE and TIMECOND_IFUNMODSINCE. * Added an simple example crawler, which downloads documents listed in a file with a configurable number of worker threads. See the file 'crawler.py' in the 'tests' directory for details. * Removed the redundant 'test_xmlrpc2.py' test script. * Disallow recursive callback invocations (by Markus Oberhumer). 2002-06-18 Kjetil Jacobsen * Made some changes to setup.py which should fix the build problems on RedHat 7.3 (suggested by Benji ). * Use CURLOPT_READDATA instead of CURLOPT_INFILE, and CURLOPT_WRITEDATA instead of CURLOPT_FILE. Also fixed some reference counting bugs with file objects. * CURLOPT_FILETIME and CURLINFO_FILETIME had a namespace clash which caused them not to work. Use OPT_FILETIME for setopt() and INFO_FILETIME for getinfo(). See example usage in 'test_getinfo.py' for details. Version 7.9.7 ------------- 2002-05-20 Kjetil Jacobsen * New versioning scheme. Pycurl now has the same version number as the libcurl version it was built with. The pycurl version number thus indicates which version of libcurl is required to run. 
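The 'select' method on CurlMulti objects (2002-08-14 above) is described as an optimization of the combined use of fdset and select. A rough sketch of the manual pattern it replaces, assuming a CurlMulti object m that already has handles attached (this is not code from the distribution):

import select

# Ask libcurl which sockets it is currently using ...
read_fds, write_fds, exc_fds = m.fdset()
# ... and wait until one of them is ready, or one second has passed.
# m.select(1.0) performs essentially these two steps internally.
if read_fds or write_fds or exc_fds:
    select.select(read_fds, write_fds, exc_fds, 1.0)

# Afterwards, drive the transfers again.
while 1:
    ret, num_handles = m.perform()
    if ret != pycurl.E_CALL_MULTI_PERFORM:
        break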
2002-05-17 Kjetil Jacobsen * Added CURLINFO_REDIRECT_TIME and CURLINFO_REDIRECT_COUNT. 2002-04-27 Kjetil Jacobsen * Fixed potential memory leak and thread race (by Markus Oberhumer). Version 0.4.9 ------------- 2002-04-15 Kjetil Jacobsen * Added CURLOPT_DEBUGFUNCTION to allow debug callbacks to be specified (see the file 'test_debug.py' for details on how to use debug callbacks). * Added CURLOPT_DNS_USE_GLOBAL_CACHE and CURLOPT_DNS_CACHE_TIMEOUT. * Fixed a segfault when finalizing curl objects in Python 1.5.2. * Now requires libcurl 7.9.6 or greater. 2002-04-12 Kjetil Jacobsen * Added 'test_post2.py' file which is another example on how to issue POST requests. 2002-04-11 Markus F.X.J. Oberhumer * Added the 'test_post.py' file which demonstrates the use of POST requests. Version 0.4.8 ------------- 2002-03-07 Kjetil Jacobsen * Added CURLOPT_PREQUOTE. * Now requires libcurl 7.9.5 or greater. * Other minor code cleanups and bugfixes. 2002-03-05 Kjetil Jacobsen * Do not allow WRITEFUNCTION and WRITEHEADER on the same handle. Version 0.4.7 ------------- 2002-02-27 Kjetil Jacobsen * Abort callback if the thread state of the calling thread cannot be determined. * Check that the installed version of libcurl matches the requirements of pycurl. 2002-02-26 Kjetil Jacobsen * Clarence Garnder found a bug where string arguments to setopt sometimes were prematurely deallocated, this should now be fixed. 2002-02-21 Kjetil Jacobsen * Added the 'xmlrpc_curl.py' file which implements a transport for xmlrpclib (xmlrpclib is part of Python 2.2). * Added CURLINFO_CONTENT_TYPE. * Added CURLOPT_SSLCERTTYPE, CURLOPT_SSLKEY, CURLOPT_SSLKEYTYPE, CURLOPT_SSLKEYPASSWD, CURLOPT_SSLENGINE and CURLOPT_SSLENGINE_DEFAULT. * When thrown, the pycurl.error exception is now a tuple consisting of the curl error code and the error message. * Now requires libcurl 7.9.4 or greater. 2002-02-19 Kjetil Jacobsen * Fixed docstring for getopt() function. 2001-12-18 Kjetil Jacobsen * Updated the INSTALL information for Win32. 2001-12-12 Kjetil Jacobsen * Added missing link flag to make pycurl build on MacOS X (by Matt King ). 2001-12-06 Kjetil Jacobsen * Added CURLINFO_STARTTRANSFER_TIME and CURLOPT_FTP_USE_EPSV from libcurl 7.9.2. 2001-12-01 Markus F.X.J. Oberhumer * Added the 'test_stringio.py' file which demonstrates the use of StringIO objects as callback. 2001-12-01 Markus F.X.J. Oberhumer * setup.py: Do not remove entries from a list while iterating over it. 2001-11-29 Kjetil Jacobsen * Added code in setup.py to install on Windows. Requires some manual configuration (by Tino Lange ). 2001-11-27 Kjetil Jacobsen * Improved detection of where libcurl is installed in setup.py. Should make it easier to install pycurl when libcurl is not located in regular lib/include paths. 2001-11-05 Kjetil Jacobsen * Some of the newer options to setopt were missing, this should now be fixed. 2001-11-04 Kjetil Jacobsen * Exception handling has been improved and should no longer throw spurious exceptions (by Markus F.X.J. Oberhumer ). 2001-10-15 Kjetil Jacobsen * Refactored the test_gtk.py script to avoid global variables. 2001-10-12 Kjetil Jacobsen * Added module docstrings, terse perhaps, but better than nothing. * Added the 'basicfirst.py' file which is a Python version of the corresponding Perl script by Daniel. * PycURL now works properly under Python 1.5 and 1.6 (by Markus F.X.J. Oberhumer ). * Allow C-functions and Python methods as callbacks (by Markus F.X.J. Oberhumer ). 
* Allow None as success result of write, header and progress callback invocations (by Markus F.X.J. Oberhumer ). * Added the 'basicfirst2.py' file which demonstrates the use of a class method as callback instead of just a function. 2001-08-21 Kjetil Jacobsen * Cleaned up the script with GNOME/PycURL integration. 2001-08-20 Kjetil Jacobsen * Added another test script for shipping XML-RPC requests which uses py-xmlrpc to encode the arguments (tests/test_xmlrpc2.py). 2001-08-20 Kjetil Jacobsen * Added test script for using PycURL and GNOME (tests/test_gtk.py). 2001-08-20 Kjetil Jacobsen * Added test script for using XML-RPC (tests/test_xmlrpc.py). * Added more comments to the test sources. 2001-08-06 Kjetil Jacobsen * Renamed module namespace to pycurl instead of curl. 2001-08-06 Kjetil Jacobsen * Set CURLOPT_VERBOSE to 0 by default. 2001-06-29 Kjetil Jacobsen * Updated INSTALL, curl version 7.8 or greater is now mandatory to use pycurl. 2001-06-13 Kjetil Jacobsen * Set NOPROGRESS to 1 by default. 2001-06-07 Kjetil Jacobsen * Added global_init/cleanup. 2001-06-06 Kjetil Jacobsen * Added HEADER/PROGRESSFUNCTION callbacks (see files in tests/). * Added PASSWDFUNCTION callback (untested). * Added READFUNCTION callback (untested). 2001-06-05 Kjetil Jacobsen * WRITEFUNCTION callbacks now work (see tests/test_cb.py for details). * Preliminary distutils installation. * Added CLOSEPOLICY constants to module namespace. 2001-06-04 Kjetil Jacobsen * Return -1 on error from Python callback in WRITEFUNCTION callback. 2001-06-01 Kjetil Jacobsen * Moved source to src and tests to tests directory. 2001-05-31 Kjetil Jacobsen * Added better type checking for setopt. 2001-05-30 Kjetil Jacobsen * Moved code to sourceforge. * Added getinfo support. # vi:ts=8:et README000066600000001000150501000730005406 0ustar00License ------- Copyright (C) 2001-2008 by Kjetil Jacobsen Copyright (C) 2001-2008 by Markus F.X.J. Oberhumer All rights reserved. PycURL is dual licensed under the LGPL and an MIT/X derivative license based on the cURL license. A full copy of the LGPL license is included in the file COPYING. A full copy of the MIT/X derivative license is included in the file COPYING2. You can redistribute and/or modify PycURL according to the terms of either license. COPYING000066600000063636150501000730005610 0ustar00 GNU LESSER GENERAL PUBLIC LICENSE Version 2.1, February 1999 Copyright (C) 1991, 1999 Free Software Foundation, Inc. 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA Everyone is permitted to copy and distribute verbatim copies of this license document, but changing it is not allowed. [This is the first released version of the Lesser GPL. It also counts as the successor of the GNU Library Public License, version 2, hence the version number 2.1.] Preamble The licenses for most software are designed to take away your freedom to share and change it. By contrast, the GNU General Public Licenses are intended to guarantee your freedom to share and change free software--to make sure the software is free for all its users. This license, the Lesser General Public License, applies to some specially designated software packages--typically libraries--of the Free Software Foundation and other authors who decide to use it. You can use it too, but we suggest you first think carefully about whether this license or the ordinary General Public License is the better strategy to use in any particular case, based on the explanations below. When we speak of free software, we are referring to freedom of use, not price. 
Our General Public Licenses are designed to make sure that you have the freedom to distribute copies of free software (and charge for this service if you wish); that you receive source code or can get it if you want it; that you can change the software and use pieces of it in new free programs; and that you are informed that you can do these things. To protect your rights, we need to make restrictions that forbid distributors to deny you these rights or to ask you to surrender these rights. These restrictions translate to certain responsibilities for you if you distribute copies of the library or if you modify it. For example, if you distribute copies of the library, whether gratis or for a fee, you must give the recipients all the rights that we gave you. You must make sure that they, too, receive or can get the source code. If you link other code with the library, you must provide complete object files to the recipients, so that they can relink them with the library after making changes to the library and recompiling it. And you must show them these terms so they know their rights. We protect your rights with a two-step method: (1) we copyright the library, and (2) we offer you this license, which gives you legal permission to copy, distribute and/or modify the library. To protect each distributor, we want to make it very clear that there is no warranty for the free library. Also, if the library is modified by someone else and passed on, the recipients should know that what they have is not the original version, so that the original author's reputation will not be affected by problems that might be introduced by others. Finally, software patents pose a constant threat to the existence of any free program. We wish to make sure that a company cannot effectively restrict the users of a free program by obtaining a restrictive license from a patent holder. Therefore, we insist that any patent license obtained for a version of the library must be consistent with the full freedom of use specified in this license. Most GNU software, including some libraries, is covered by the ordinary GNU General Public License. This license, the GNU Lesser General Public License, applies to certain designated libraries, and is quite different from the ordinary General Public License. We use this license for certain libraries in order to permit linking those libraries into non-free programs. When a program is linked with a library, whether statically or using a shared library, the combination of the two is legally speaking a combined work, a derivative of the original library. The ordinary General Public License therefore permits such linking only if the entire combination fits its criteria of freedom. The Lesser General Public License permits more lax criteria for linking other code with the library. We call this license the "Lesser" General Public License because it does Less to protect the user's freedom than the ordinary General Public License. It also provides other free software developers Less of an advantage over competing non-free programs. These disadvantages are the reason we use the ordinary General Public License for many libraries. However, the Lesser license provides advantages in certain special circumstances. For example, on rare occasions, there may be a special need to encourage the widest possible use of a certain library, so that it becomes a de-facto standard. To achieve this, non-free programs must be allowed to use the library. 
A more frequent case is that a free library does the same job as widely used non-free libraries. In this case, there is little to gain by limiting the free library to free software only, so we use the Lesser General Public License. In other cases, permission to use a particular library in non-free programs enables a greater number of people to use a large body of free software. For example, permission to use the GNU C Library in non-free programs enables many more people to use the whole GNU operating system, as well as its variant, the GNU/Linux operating system. Although the Lesser General Public License is Less protective of the users' freedom, it does ensure that the user of a program that is linked with the Library has the freedom and the wherewithal to run that program using a modified version of the Library. The precise terms and conditions for copying, distribution and modification follow. Pay close attention to the difference between a "work based on the library" and a "work that uses the library". The former contains code derived from the library, whereas the latter must be combined with the library in order to run. GNU LESSER GENERAL PUBLIC LICENSE TERMS AND CONDITIONS FOR COPYING, DISTRIBUTION AND MODIFICATION 0. This License Agreement applies to any software library or other program which contains a notice placed by the copyright holder or other authorized party saying it may be distributed under the terms of this Lesser General Public License (also called "this License"). Each licensee is addressed as "you". A "library" means a collection of software functions and/or data prepared so as to be conveniently linked with application programs (which use some of those functions and data) to form executables. The "Library", below, refers to any such software library or work which has been distributed under these terms. A "work based on the Library" means either the Library or any derivative work under copyright law: that is to say, a work containing the Library or a portion of it, either verbatim or with modifications and/or translated straightforwardly into another language. (Hereinafter, translation is included without limitation in the term "modification".) "Source code" for a work means the preferred form of the work for making modifications to it. For a library, complete source code means all the source code for all modules it contains, plus any associated interface definition files, plus the scripts used to control compilation and installation of the library. Activities other than copying, distribution and modification are not covered by this License; they are outside its scope. The act of running a program using the Library is not restricted, and output from such a program is covered only if its contents constitute a work based on the Library (independent of the use of the Library in a tool for writing it). Whether that is true depends on what the Library does and what the program that uses the Library does. 1. You may copy and distribute verbatim copies of the Library's complete source code as you receive it, in any medium, provided that you conspicuously and appropriately publish on each copy an appropriate copyright notice and disclaimer of warranty; keep intact all the notices that refer to this License and to the absence of any warranty; and distribute a copy of this License along with the Library. You may charge a fee for the physical act of transferring a copy, and you may at your option offer warranty protection in exchange for a fee. 2. 
You may modify your copy or copies of the Library or any portion of it, thus forming a work based on the Library, and copy and distribute such modifications or work under the terms of Section 1 above, provided that you also meet all of these conditions: a) The modified work must itself be a software library. b) You must cause the files modified to carry prominent notices stating that you changed the files and the date of any change. c) You must cause the whole of the work to be licensed at no charge to all third parties under the terms of this License. d) If a facility in the modified Library refers to a function or a table of data to be supplied by an application program that uses the facility, other than as an argument passed when the facility is invoked, then you must make a good faith effort to ensure that, in the event an application does not supply such function or table, the facility still operates, and performs whatever part of its purpose remains meaningful. (For example, a function in a library to compute square roots has a purpose that is entirely well-defined independent of the application. Therefore, Subsection 2d requires that any application-supplied function or table used by this function must be optional: if the application does not supply it, the square root function must still compute square roots.) These requirements apply to the modified work as a whole. If identifiable sections of that work are not derived from the Library, and can be reasonably considered independent and separate works in themselves, then this License, and its terms, do not apply to those sections when you distribute them as separate works. But when you distribute the same sections as part of a whole which is a work based on the Library, the distribution of the whole must be on the terms of this License, whose permissions for other licensees extend to the entire whole, and thus to each and every part regardless of who wrote it. Thus, it is not the intent of this section to claim rights or contest your rights to work written entirely by you; rather, the intent is to exercise the right to control the distribution of derivative or collective works based on the Library. In addition, mere aggregation of another work not based on the Library with the Library (or with a work based on the Library) on a volume of a storage or distribution medium does not bring the other work under the scope of this License. 3. You may opt to apply the terms of the ordinary GNU General Public License instead of this License to a given copy of the Library. To do this, you must alter all the notices that refer to this License, so that they refer to the ordinary GNU General Public License, version 2, instead of to this License. (If a newer version than version 2 of the ordinary GNU General Public License has appeared, then you can specify that version instead if you wish.) Do not make any other change in these notices. Once this change is made in a given copy, it is irreversible for that copy, so the ordinary GNU General Public License applies to all subsequent copies and derivative works made from that copy. This option is useful when you wish to copy part of the code of the Library into a program that is not a library. 4. 
You may copy and distribute the Library (or a portion or derivative of it, under Section 2) in object code or executable form under the terms of Sections 1 and 2 above provided that you accompany it with the complete corresponding machine-readable source code, which must be distributed under the terms of Sections 1 and 2 above on a medium customarily used for software interchange. If distribution of object code is made by offering access to copy from a designated place, then offering equivalent access to copy the source code from the same place satisfies the requirement to distribute the source code, even though third parties are not compelled to copy the source along with the object code. 5. A program that contains no derivative of any portion of the Library, but is designed to work with the Library by being compiled or linked with it, is called a "work that uses the Library". Such a work, in isolation, is not a derivative work of the Library, and therefore falls outside the scope of this License. However, linking a "work that uses the Library" with the Library creates an executable that is a derivative of the Library (because it contains portions of the Library), rather than a "work that uses the library". The executable is therefore covered by this License. Section 6 states terms for distribution of such executables. When a "work that uses the Library" uses material from a header file that is part of the Library, the object code for the work may be a derivative work of the Library even though the source code is not. Whether this is true is especially significant if the work can be linked without the Library, or if the work is itself a library. The threshold for this to be true is not precisely defined by law. If such an object file uses only numerical parameters, data structure layouts and accessors, and small macros and small inline functions (ten lines or less in length), then the use of the object file is unrestricted, regardless of whether it is legally a derivative work. (Executables containing this object code plus portions of the Library will still fall under Section 6.) Otherwise, if the work is a derivative of the Library, you may distribute the object code for the work under the terms of Section 6. Any executables containing that work also fall under Section 6, whether or not they are linked directly with the Library itself. 6. As an exception to the Sections above, you may also combine or link a "work that uses the Library" with the Library to produce a work containing portions of the Library, and distribute that work under terms of your choice, provided that the terms permit modification of the work for the customer's own use and reverse engineering for debugging such modifications. You must give prominent notice with each copy of the work that the Library is used in it and that the Library and its use are covered by this License. You must supply a copy of this License. If the work during execution displays copyright notices, you must include the copyright notice for the Library among them, as well as a reference directing the user to the copy of this License. 
Also, you must do one of these things: a) Accompany the work with the complete corresponding machine-readable source code for the Library including whatever changes were used in the work (which must be distributed under Sections 1 and 2 above); and, if the work is an executable linked with the Library, with the complete machine-readable "work that uses the Library", as object code and/or source code, so that the user can modify the Library and then relink to produce a modified executable containing the modified Library. (It is understood that the user who changes the contents of definitions files in the Library will not necessarily be able to recompile the application to use the modified definitions.) b) Use a suitable shared library mechanism for linking with the Library. A suitable mechanism is one that (1) uses at run time a copy of the library already present on the user's computer system, rather than copying library functions into the executable, and (2) will operate properly with a modified version of the library, if the user installs one, as long as the modified version is interface-compatible with the version that the work was made with. c) Accompany the work with a written offer, valid for at least three years, to give the same user the materials specified in Subsection 6a, above, for a charge no more than the cost of performing this distribution. d) If distribution of the work is made by offering access to copy from a designated place, offer equivalent access to copy the above specified materials from the same place. e) Verify that the user has already received a copy of these materials or that you have already sent this user a copy. For an executable, the required form of the "work that uses the Library" must include any data and utility programs needed for reproducing the executable from it. However, as a special exception, the materials to be distributed need not include anything that is normally distributed (in either source or binary form) with the major components (compiler, kernel, and so on) of the operating system on which the executable runs, unless that component itself accompanies the executable. It may happen that this requirement contradicts the license restrictions of other proprietary libraries that do not normally accompany the operating system. Such a contradiction means you cannot use both them and the Library together in an executable that you distribute. 7. You may place library facilities that are a work based on the Library side-by-side in a single library together with other library facilities not covered by this License, and distribute such a combined library, provided that the separate distribution of the work based on the Library and of the other library facilities is otherwise permitted, and provided that you do these two things: a) Accompany the combined library with a copy of the same work based on the Library, uncombined with any other library facilities. This must be distributed under the terms of the Sections above. b) Give prominent notice with the combined library of the fact that part of it is a work based on the Library, and explaining where to find the accompanying uncombined form of the same work. 8. You may not copy, modify, sublicense, link with, or distribute the Library except as expressly provided under this License. Any attempt otherwise to copy, modify, sublicense, link with, or distribute the Library is void, and will automatically terminate your rights under this License. 
However, parties who have received copies, or rights, from you under this License will not have their licenses terminated so long as such parties remain in full compliance. 9. You are not required to accept this License, since you have not signed it. However, nothing else grants you permission to modify or distribute the Library or its derivative works. These actions are prohibited by law if you do not accept this License. Therefore, by modifying or distributing the Library (or any work based on the Library), you indicate your acceptance of this License to do so, and all its terms and conditions for copying, distributing or modifying the Library or works based on it. 10. Each time you redistribute the Library (or any work based on the Library), the recipient automatically receives a license from the original licensor to copy, distribute, link with or modify the Library subject to these terms and conditions. You may not impose any further restrictions on the recipients' exercise of the rights granted herein. You are not responsible for enforcing compliance by third parties with this License. 11. If, as a consequence of a court judgment or allegation of patent infringement or for any other reason (not limited to patent issues), conditions are imposed on you (whether by court order, agreement or otherwise) that contradict the conditions of this License, they do not excuse you from the conditions of this License. If you cannot distribute so as to satisfy simultaneously your obligations under this License and any other pertinent obligations, then as a consequence you may not distribute the Library at all. For example, if a patent license would not permit royalty-free redistribution of the Library by all those who receive copies directly or indirectly through you, then the only way you could satisfy both it and this License would be to refrain entirely from distribution of the Library. If any portion of this section is held invalid or unenforceable under any particular circumstance, the balance of the section is intended to apply, and the section as a whole is intended to apply in other circumstances. It is not the purpose of this section to induce you to infringe any patents or other property right claims or to contest validity of any such claims; this section has the sole purpose of protecting the integrity of the free software distribution system which is implemented by public license practices. Many people have made generous contributions to the wide range of software distributed through that system in reliance on consistent application of that system; it is up to the author/donor to decide if he or she is willing to distribute software through any other system and a licensee cannot impose that choice. This section is intended to make thoroughly clear what is believed to be a consequence of the rest of this License. 12. If the distribution and/or use of the Library is restricted in certain countries either by patents or by copyrighted interfaces, the original copyright holder who places the Library under this License may add an explicit geographical distribution limitation excluding those countries, so that distribution is permitted only in or among countries not thus excluded. In such case, this License incorporates the limitation as if written in the body of this License. 13. The Free Software Foundation may publish revised and/or new versions of the Lesser General Public License from time to time. 
Such new versions will be similar in spirit to the present version, but may differ in detail to address new problems or concerns. Each version is given a distinguishing version number. If the Library specifies a version number of this License which applies to it and "any later version", you have the option of following the terms and conditions either of that version or of any later version published by the Free Software Foundation. If the Library does not specify a license version number, you may choose any version ever published by the Free Software Foundation. 14. If you wish to incorporate parts of the Library into other free programs whose distribution conditions are incompatible with these, write to the author to ask for permission. For software which is copyrighted by the Free Software Foundation, write to the Free Software Foundation; we sometimes make exceptions for this. Our decision will be guided by the two goals of preserving the free status of all derivatives of our free software and of promoting the sharing and reuse of software generally. NO WARRANTY 15. BECAUSE THE LIBRARY IS LICENSED FREE OF CHARGE, THERE IS NO WARRANTY FOR THE LIBRARY, TO THE EXTENT PERMITTED BY APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT HOLDERS AND/OR OTHER PARTIES PROVIDE THE LIBRARY "AS IS" WITHOUT WARRANTY OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE LIBRARY IS WITH YOU. SHOULD THE LIBRARY PROVE DEFECTIVE, YOU ASSUME THE COST OF ALL NECESSARY SERVICING, REPAIR OR CORRECTION. 16. IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MAY MODIFY AND/OR REDISTRIBUTE THE LIBRARY AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE USE OR INABILITY TO USE THE LIBRARY (INCLUDING BUT NOT LIMITED TO LOSS OF DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD PARTIES OR A FAILURE OF THE LIBRARY TO OPERATE WITH ANY OTHER SOFTWARE), EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF SUCH DAMAGES. END OF TERMS AND CONDITIONS How to Apply These Terms to Your New Libraries If you develop a new library, and you want it to be of the greatest possible use to the public, we recommend making it free software that everyone can redistribute and change. You can do so by permitting redistribution under these terms (or, alternatively, under the terms of the ordinary General Public License). To apply these terms, attach the following notices to the library. It is safest to attach them to the start of each source file to most effectively convey the exclusion of warranty; and each file should have at least the "copyright" line and a pointer to where the full notice is found. Copyright (C) This library is free software; you can redistribute it and/or modify it under the terms of the GNU Lesser General Public License as published by the Free Software Foundation; either version 2.1 of the License, or (at your option) any later version. This library is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the GNU Lesser General Public License for more details. 
You should have received a copy of the GNU Lesser General Public
License along with this library; if not, write to the Free Software
Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA

Also add information on how to contact you by electronic and paper mail.

You should also get your employer (if you work as a programmer) or your
school, if any, to sign a "copyright disclaimer" for the library, if
necessary. Here is a sample; alter the names:

  Yoyodyne, Inc., hereby disclaims all copyright interest in the
  library `Frob' (a library for tweaking knobs) written by
  James Random Hacker.

  , 1 April 1990
  Ty Coon, President of Vice

That's all there is to it!

examples/file_upload.py

#! /usr/bin/env python
# -*- coding: iso-8859-1 -*-
# vi:ts=4:et
# $Id: file_upload.py,v 1.5 2005/02/13 08:53:13 mfx Exp $

import os, sys
import pycurl

# Class which holds a file reference and the read callback
class FileReader:
    def __init__(self, fp):
        self.fp = fp
    def read_callback(self, size):
        return self.fp.read(size)

# Check commandline arguments
if len(sys.argv) < 3:
    print "Usage: %s " % sys.argv[0]
    raise SystemExit
url = sys.argv[1]
filename = sys.argv[2]

if not os.path.exists(filename):
    print "Error: the file '%s' does not exist" % filename
    raise SystemExit

# Initialize pycurl
c = pycurl.Curl()
c.setopt(pycurl.URL, url)
c.setopt(pycurl.UPLOAD, 1)

# Two versions with the same semantics here, but the filereader version
# is useful when you have to process the data which is read before returning
if 1:
    c.setopt(pycurl.READFUNCTION, FileReader(open(filename, 'rb')).read_callback)
else:
    c.setopt(pycurl.READFUNCTION, open(filename, 'rb').read)

# Set size of file to be uploaded.
filesize = os.path.getsize(filename)
c.setopt(pycurl.INFILESIZE, filesize)

# Start transfer
print 'Uploading file %s to url %s' % (filename, url)
c.perform()
c.close()

examples/retriever.py

#! /usr/bin/env python
# -*- coding: iso-8859-1 -*-
# vi:ts=4:et
# $Id: retriever.py,v 1.19 2005/07/28 11:04:13 mfx Exp $
#
# Usage: python retriever.py [<# of
#        concurrent connections>]
#

import sys, threading, Queue
import pycurl

# We should ignore SIGPIPE when using pycurl.NOSIGNAL - see
# the libcurl tutorial for more info.
try:
    import signal
    from signal import SIGPIPE, SIG_IGN
    signal.signal(signal.SIGPIPE, signal.SIG_IGN)
except ImportError:
    pass

# Get args
num_conn = 10
try:
    if sys.argv[1] == "-":
        urls = sys.stdin.readlines()
    else:
        urls = open(sys.argv[1]).readlines()
    if len(sys.argv) >= 3:
        num_conn = int(sys.argv[2])
except:
    print "Usage: %s [<# of concurrent connections>]" % sys.argv[0]
    raise SystemExit

# Make a queue with (url, filename) tuples
queue = Queue.Queue()
for url in urls:
    url = url.strip()
    if not url or url[0] == "#":
        continue
    filename = "doc_%03d.dat" % (len(queue.queue) + 1)
    queue.put((url, filename))

# Check args
assert queue.queue, "no URLs given"
num_urls = len(queue.queue)
num_conn = min(num_conn, num_urls)
assert 1 <= num_conn <= 10000, "invalid number of concurrent connections"
print "PycURL %s (compiled against 0x%x)" % (pycurl.version, pycurl.COMPILE_LIBCURL_VERSION_NUM)
print "----- Getting", num_urls, "URLs using", num_conn, "connections -----"

class WorkerThread(threading.Thread):
    def __init__(self, queue):
        threading.Thread.__init__(self)
        self.queue = queue

    def run(self):
        while 1:
            try:
                url, filename = self.queue.get_nowait()
            except Queue.Empty:
                raise SystemExit
            fp = open(filename, "wb")
            curl = pycurl.Curl()
            curl.setopt(pycurl.URL, url)
            curl.setopt(pycurl.FOLLOWLOCATION, 1)
            curl.setopt(pycurl.MAXREDIRS, 5)
            curl.setopt(pycurl.CONNECTTIMEOUT, 30)
            curl.setopt(pycurl.TIMEOUT, 300)
            curl.setopt(pycurl.NOSIGNAL, 1)
            curl.setopt(pycurl.WRITEDATA, fp)
            try:
                curl.perform()
            except:
                import traceback
                traceback.print_exc(file=sys.stderr)
                sys.stderr.flush()
            curl.close()
            fp.close()
            sys.stdout.write(".")
            sys.stdout.flush()

# Start a bunch of threads
threads = []
for dummy in range(num_conn):
    t = WorkerThread(queue)
    t.start()
    threads.append(t)

# Wait for all threads to finish
for thread in threads:
    thread.join()

examples/retriever-multi.py

#! /usr/bin/env python
# -*- coding: iso-8859-1 -*-
# vi:ts=4:et
# $Id: retriever-multi.py,v 1.29 2005/07/28 11:04:13 mfx Exp $
#
# Usage: python retriever-multi.py [<# of
#        concurrent connections>]
#

import sys
import pycurl

# We should ignore SIGPIPE when using pycurl.NOSIGNAL - see
# the libcurl tutorial for more info.
try: import signal from signal import SIGPIPE, SIG_IGN signal.signal(signal.SIGPIPE, signal.SIG_IGN) except ImportError: pass # Get args num_conn = 10 try: if sys.argv[1] == "-": urls = sys.stdin.readlines() else: urls = open(sys.argv[1]).readlines() if len(sys.argv) >= 3: num_conn = int(sys.argv[2]) except: print "Usage: %s [<# of concurrent connections>]" % sys.argv[0] raise SystemExit # Make a queue with (url, filename) tuples queue = [] for url in urls: url = url.strip() if not url or url[0] == "#": continue filename = "doc_%03d.dat" % (len(queue) + 1) queue.append((url, filename)) # Check args assert queue, "no URLs given" num_urls = len(queue) num_conn = min(num_conn, num_urls) assert 1 <= num_conn <= 10000, "invalid number of concurrent connections" print "PycURL %s (compiled against 0x%x)" % (pycurl.version, pycurl.COMPILE_LIBCURL_VERSION_NUM) print "----- Getting", num_urls, "URLs using", num_conn, "connections -----" # Pre-allocate a list of curl objects m = pycurl.CurlMulti() m.handles = [] for i in range(num_conn): c = pycurl.Curl() c.fp = None c.setopt(pycurl.FOLLOWLOCATION, 1) c.setopt(pycurl.MAXREDIRS, 5) c.setopt(pycurl.CONNECTTIMEOUT, 30) c.setopt(pycurl.TIMEOUT, 300) c.setopt(pycurl.NOSIGNAL, 1) m.handles.append(c) # Main loop freelist = m.handles[:] num_processed = 0 while num_processed < num_urls: # If there is an url to process and a free curl object, add to multi stack while queue and freelist: url, filename = queue.pop(0) c = freelist.pop() c.fp = open(filename, "wb") c.setopt(pycurl.URL, url) c.setopt(pycurl.WRITEDATA, c.fp) m.add_handle(c) # store some info c.filename = filename c.url = url # Run the internal curl state machine for the multi stack while 1: ret, num_handles = m.perform() if ret != pycurl.E_CALL_MULTI_PERFORM: break # Check for curl objects which have terminated, and add them to the freelist while 1: num_q, ok_list, err_list = m.info_read() for c in ok_list: c.fp.close() c.fp = None m.remove_handle(c) print "Success:", c.filename, c.url, c.getinfo(pycurl.EFFECTIVE_URL) freelist.append(c) for c, errno, errmsg in err_list: c.fp.close() c.fp = None m.remove_handle(c) print "Failed: ", c.filename, c.url, errno, errmsg freelist.append(c) num_processed = num_processed + len(ok_list) + len(err_list) if num_q == 0: break # Currently no more I/O is pending, could do something in the meantime # (display a progress bar, etc.). # We just call select() to sleep until some more data is available. m.select(1.0) # Cleanup for c in m.handles: if c.fp is not None: c.fp.close() c.fp = None c.close() m.close() examples/sfquery.py000066600000004641150501000730010432 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # # sfquery -- Source Forge query script using the ClientCGI high-level interface # # Retrieves a SourceForge XML export object for a given project. # Specify the *numeric* project ID. the user name, and the password, # as arguments. If you have a valid ~/.netrc entry for sourceforge.net, # you can just give the project ID. # # By Eric S. Raymond, August 2002. All rites reversed. import os, sys, netrc import curl class SourceForgeUserSession(curl.Curl): # SourceForge-specific methods. Sensitive to changes in site design. def login(self, name, password): "Establish a login session." self.post("account/login.php", (("form_loginname", name), ("form_pw", password), ("return_to", ""), ("stay_in_ssl", "1"), ("login", "Login With SSL"))) def logout(self): "Log out of SourceForge." 
self.get("account/logout.php") def fetch_xml(self, numid): self.get("export/xml_export.php?group_id=%s" % numid) if __name__ == "__main__": if len(sys.argv) == 1: project_id = '28236' # PyCurl project ID else: project_id = sys.argv[1] # Try to grab authenticators out of your .netrc try: auth = netrc.netrc().authenticators("sourceforge.net") name, account, password = auth except: if len(sys.argv) < 4: print "Usage: %s " % sys.argv[0] raise SystemExit name = sys.argv[2] password = sys.argv[3] session = SourceForgeUserSession("https://sourceforge.net/") session.set_verbosity(0) session.login(name, password) # Login could fail. if session.answered("Invalid Password or User Name"): sys.stderr.write("Login/password not accepted (%d bytes)\n" % len(session.body())) sys.exit(1) # We'll see this if we get the right thing. elif session.answered("Personal Page For: " + name): session.fetch_xml(project_id) sys.stdout.write(session.body()) session.logout() sys.exit(0) # Or maybe SourceForge has changed its site design so our check strings # are no longer valid. else: sys.stderr.write("Unexpected page (%d bytes)\n"%len(session.body())) sys.exit(1) examples/linksys.py000066600000052174150501000730010434 0ustar00#! /usr/bin/env python # -*- coding: iso-8859-1 -*- # vi:ts=4:et # # linksys.py -- program settings on a Linkys router # # This tool is designed to help you recover from the occasional episodes # of catatonia that afflict Linksys boxes. It allows you to batch-program # them rather than manually entering values to the Web interface. Commands # are taken from the command line first, then standard input. # # The somewhat spotty coverage of status queries is because I only did the # ones that were either (a) easy, or (b) necessary. If you want to know the # status of the box, look at the web interface. # # This code has been tested against the following hardware: # # Hardware Firmware # ---------- --------------------- # BEFW11S4v2 1.44.2.1, Dec 20 2002 # # The code is, of course, sensitive to changes in the names of CGI pages # and field names. # # Note: to make the no-arguments form work, you'll need to have the following # entry in your ~/.netrc file. If you have changed the router IP address or # name/password, modify accordingly. # # machine 192.168.1.1 # login "" # password admin # # By Eric S. Raymond, August April 2003. All rites reversed. import sys, re, copy, curl, exceptions class LinksysError(exceptions.Exception): def __init__(self, *args): self.args = args class LinksysSession: months = 'Jan|Feb|Mar|Apr|May|Jun|Jul|Aug|Sep|Oct|Nov|Dec' WAN_CONNECT_AUTO = '1' WAN_CONNECT_STATIC = '2' WAN_CONNECT_PPOE = '3' WAN_CONNECT_RAS = '4' WAN_CONNECT_PPTP = '5' WAN_CONNECT_HEARTBEAT = '6' # Substrings to check for on each page load. # This may enable us to detect when a firmware change has hosed us. check_strings = { "": "basic setup functions", "Passwd.htm": "For security reasons,", "DHCP.html": "You can configure the router to act as a DHCP", "Log.html": "There are some log settings and lists in this page.", "Forward.htm":"Port forwarding can be used to set up public services", } def __init__(self): self.actions = [] self.host = "http://192.168.1.1" self.verbosity = False self.pagecache = {} def set_verbosity(self, flag): self.verbosity = flag # This is not a performance hack -- we need the page cache to do # sanity checks at configure time. 
def cache_load(self, page): if page not in self.pagecache: fetch = curl.Curl(self.host) fetch.set_verbosity(self.verbosity) fetch.get(page) self.pagecache[page] = fetch.body() if fetch.answered("401"): raise LinksysError("authorization failure.", True) elif not fetch.answered(LinksysSession.check_strings[page]): del self.pagecache[page] raise LinksysError("check string for page %s missing!" % os.path.join(self.host, page), False) fetch.close() def cache_flush(self): self.pagecache = {} # Primitives def screen_scrape(self, page, template): self.cache_load(page) match = re.compile(template).search(self.pagecache[page]) if match: result = match.group(1) else: result = None return result def get_MAC_address(self, page, prefix): return self.screen_scrape("", prefix+r":[^M]*\(MAC Address: *([^)]*)") def set_flag(page, flag, value): if value: self.actions.append(page, flag, "1") else: self.actions.append(page, flag, "0") def set_IP_address(self, page, cgi, role, ip): ind = 0 for octet in ip.split("."): self.actions.append(("", "F1", role + `ind+1`, octet)) ind += 1 # Scrape configuration data off the main page def get_firmware_version(self): # This is fragile. There is no distinguishing tag before the firmware # version, so we have to key off the pattern of the version number. # Our model is ">1.44.2.1, Dec 20 2002<" return self.screen_scrape("", ">([0-9.v]*, (" + \ LinksysSession.months + ")[^<]*)<", ) def get_LAN_MAC(self): return self.get_MAC_address("", r"LAN IP Address") def get_Wireless_MAC(self): return self.get_MAC_address("", r"Wireless") def get_WAN_MAC(self): return self.get_MAC_address("", r"WAN Connection Type") # Set configuration data on the main page def set_host_name(self, name): self.actions.append(("", "hostName", name)) def set_domain_name(self, name): self.actions.append(("", "DomainName", name)) def set_LAN_IP(self, ip): self.set_IP_address("", "ipAddr", ip) def set_LAN_netmask(self, ip): if not ip.startswith("255.255.255."): raise ValueError lastquad = ip.split(".")[-1] if lastquad not in ("0", "128", "192", "240", "252"): raise ValueError self.actions.append("", "netMask", lastquad) def set_wireless(self, flag): self.set_flag("", "wirelessStatus") def set_SSID(self, ssid): self.actions.append(("", "wirelessESSID", ssid)) def set_SSID_broadcast(self, flag): self.set_flag("", "broadcastSSID") def set_channel(self, channel): self.actions.append(("", "wirelessChannel", channel)) def set_WEP(self, flag): self.set_flag("", "WepType") # FIXME: Add support for setting WEP keys def set_connection_type(self, type): self.actions.append(("", "WANConnectionType", type)) def set_WAN_IP(self, ip): self.set_IP_address("", "aliasIP", ip) def set_WAN_netmask(self, ip): self.set_IP_address("", "aliasMaskIP", ip) def set_WAN_gateway_address(self, ip): self.set_IP_address("", "routerIP", ip) def set_DNS_server(self, index, ip): self.set_IP_address("", "dns" + "ABC"[index], ip) # Set configuration data on the password page def set_password(self, str): self.actions.append("Passwd.htm","sysPasswd", str) self.actions.append("Passwd.htm","sysPasswdConfirm", str) def set_UPnP(self, flag): self.set_flag("Passwd.htm", "UPnP_Work") def reset(self): self.actions.append("Passwd.htm", "FactoryDefaults") # DHCP features def set_DHCP(self, flag): if flag: self.actions.append("DHCP.htm","dhcpStatus","Enable") else: self.actions.append("DHCP.htm","dhcpStatus","Disable") def set_DHCP_starting_IP(self, val): self.actions.append("DHCP.htm","dhcpS4", str(val)) def set_DHCP_users(self, val): 
self.actions.append("DHCP.htm","dhcpLen", str(val)) def set_DHCP_lease_time(self, val): self.actions.append("DHCP.htm","leaseTime", str(val)) def set_DHCP_DNS_server(self, index, ip): self.set_IP_address("DHCP.htm", "dns" + "ABC"[index], ip) # FIXME: add support for setting WINS key # Logging features def set_logging(self, flag): if flag: self.actions.append("Log.htm", "rLog", "Enable") else: self.actions.append("Log.htm", "rLog", "Disable") def set_log_address(self, val): self.actions.append("DHCP.htm","trapAddr3", str(val)) # The AOL parental control flag is not supported by design. # FIXME: add Filters and other advanced features def configure(self): "Write configuration changes to the Linksys." if self.actions: fields = [] self.cache_flush() for (page, field, value) in self.actions: self.cache_load(page) if self.pagecache[page].find(field) == -1: print >>sys.stderr, "linksys: field %s not found where expected in page %s!" % (field, os.path.join(self.host, page)) continue else: fields.append((field, value)) # Clearing the action list before fieldsping is deliberate. # Otherwise we could get permanently wedged by a 401. self.actions = [] transaction = curl.Curl(self.host) transaction.set_verbosity(self.verbosity) transaction.get("Gozila.cgi", tuple(fields)) transaction.close() if __name__ == "__main__": import os, cmd class LinksysInterpreter(cmd.Cmd): """Interpret commands to perform LinkSys programming actions.""" def __init__(self): cmd.Cmd.__init__(self) self.session = LinksysSession() if os.isatty(0): import readline print "Type ? or `help' for help." self.prompt = self.session.host + ": " else: self.prompt = "" print "Bar1" def flag_command(self, func, line): if line.strip() in ("on", "enable", "yes"): func(True) elif line.strip() in ("off", "disable", "no"): func(False) else: print >>sys.stderr, "linksys: unknown switch value" return 0 def do_connect(self, line): newhost = line.strip() if newhost: self.session.host = newhost self.session.cache_flush() self.prompt = self.session.host + ": " else: print self.session.host return 0 def help_connect(self): print "Usage: connect []" print "Connect to a Linksys by name or IP address." print "If no argument is given, print the current host." def do_status(self, line): self.session.cache_load("") if "" in self.session.pagecache: print "Firmware:", self.session.get_firmware_version() print "LAN MAC:", self.session.get_LAN_MAC() print "Wireless MAC:", self.session.get_Wireless_MAC() print "WAN MAC:", self.session.get_WAN_MAC() print "." return 0 def help_status(self): print "Usage: status" print "The status command shows the status of the Linksys." print "It is mainly useful as a sanity check to make sure" print "the box is responding correctly." def do_verbose(self, line): self.flag_command(self.session.set_verbosity, line) def help_verbose(self): print "Usage: verbose {on|off|enable|disable|yes|no}" print "Enables display of HTTP requests." def do_host(self, line): self.session.set_host_name(line) return 0 def help_host(self): print "Usage: host " print "Sets the Host field to be queried by the ISP." def do_domain(self, line): print "Usage: host " self.session.set_domain_name(line) return 0 def help_domain(self): print "Sets the Domain field to be queried by the ISP." def do_lan_address(self, line): self.session.set_LAN_IP(line) return 0 def help_lan_address(self): print "Usage: lan_address " print "Sets the LAN IP address." 
        def do_lan_netmask(self, line):
            self.session.set_LAN_netmask(line)
            return 0

        def help_lan_netmask(self):
            print "Usage: lan_netmask <subnet-mask>"
            print "Sets the LAN subnetwork mask."

        def do_wireless(self, line):
            self.flag_command(self.session.set_wireless, line)
            return 0

        def help_wireless(self):
            print "Usage: wireless {on|off|enable|disable|yes|no}"
            print "Switch to enable or disable wireless features."

        def do_ssid(self, line):
            self.session.set_SSID(line)
            return 0

        def help_ssid(self):
            print "Usage: ssid <string>"
            print "Sets the SSID used to control wireless access."

        def do_ssid_broadcast(self, line):
            self.flag_command(self.session.set_SSID_broadcast, line)
            return 0

        def help_ssid_broadcast(self):
            print "Usage: ssid_broadcast {on|off|enable|disable|yes|no}"
            print "Switch to enable or disable SSID broadcast."

        def do_channel(self, line):
            self.session.set_channel(line)
            return 0

        def help_channel(self):
            print "Usage: channel <number>"
            print "Sets the wireless channel."

        def do_wep(self, line):
            self.flag_command(self.session.set_WEP, line)
            return 0

        def help_wep(self):
            print "Usage: wep {on|off|enable|disable|yes|no}"
            print "Switch to enable or disable WEP security."

        def do_wan_type(self, line):
            try:
                type = eval("LinksysSession.WAN_CONNECT_" + line.strip().upper())
                self.session.set_connection_type(type)
            except (AttributeError, SyntaxError):
                print >>sys.stderr, "linksys: unknown connection type."
            return 0

        def help_wan_type(self):
            print "Usage: wan_type {auto|static|ppoe|ras|pptp|heartbeat}"
            print "Set the WAN connection type."

        def do_wan_address(self, line):
            self.session.set_WAN_IP(line)
            return 0

        def help_wan_address(self):
            print "Usage: wan_address <ip-address>"
            print "Sets the WAN IP address."

        def do_wan_netmask(self, line):
            self.session.set_WAN_netmask(line)
            return 0

        def help_wan_netmask(self):
            print "Usage: wan_netmask <subnet-mask>"
            print "Sets the WAN subnetwork mask."

        def do_wan_gateway(self, line):
            self.session.set_WAN_gateway_address(line)
            return 0

        def help_wan_gateway(self):
            print "Usage: wan_gateway <ip-address>"
            print "Sets the WAN gateway address."

        def do_dns(self, line):
            (index, address) = line.split()
            if index in ("1", "2", "3"):
                self.session.set_DNS_server(int(index) - 1, address)
            else:
                print >>sys.stderr, "linksys: server index out of bounds."
            return 0

        def help_dns(self):
            print "Usage: dns {1|2|3} <ip-address>"
            print "Sets a primary, secondary, or tertiary DNS server address."

        def do_password(self, line):
            self.session.set_password(line)
            return 0

        def help_password(self):
            print "Usage: password <string>"
            print "Sets the router password."

        def do_upnp(self, line):
            self.flag_command(self.session.set_UPnP, line)
            return 0

        def help_upnp(self):
            print "Usage: upnp {on|off|enable|disable|yes|no}"
            print "Switch to enable or disable Universal Plug and Play."

        def do_reset(self, line):
            self.session.reset()

        def help_reset(self):
            print "Usage: reset"
            print "Reset Linksys settings to factory defaults."

        def do_dhcp(self, line):
            self.flag_command(self.session.set_DHCP, line)

        def help_dhcp(self):
            print "Usage: dhcp {on|off|enable|disable|yes|no}"
            print "Switch to enable or disable DHCP features."

        def do_dhcp_start(self, line):
            self.session.set_DHCP_starting_IP(line)

        def help_dhcp_start(self):
            print "Usage: dhcp_start <ip-address>"
            print "Set the start address of the DHCP pool."

        def do_dhcp_users(self, line):
            self.session.set_DHCP_users(line)

        def help_dhcp_users(self):
            print "Usage: dhcp_users <number>"
            print "Set number of address slots to allocate in the DHCP pool."
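        # Illustrative batch invocation (not part of the original script; the
        # host and values below are hypothetical): commands can be given as
        # shell arguments and the queued changes written with an explicit
        # `configure', as described in help_introduction below, e.g.
        #
        #   linksys.py "ssid HomeNet" "channel 6" "wireless on" "configure"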
        def do_dhcp_lease(self, line):
            self.session.set_DHCP_lease_time(line)

        def help_dhcp_lease(self):
            print "Usage: dhcp_lease <minutes>"
            print "Set the lease time for addresses in the DHCP pool."

        def do_dhcp_dns(self, line):
            (index, address) = line.split()
            if index in ("1", "2", "3"):
                self.session.set_DHCP_DNS_server(int(index) - 1, address)
            else:
                print >>sys.stderr, "linksys: server index out of bounds."
            return 0

        def help_dhcp_dns(self):
            print "Usage: dhcp_dns {1|2|3} <ip-address>"
            print "Sets primary, secondary, or tertiary DNS server address."

        def do_logging(self, line):
            self.flag_command(self.session.set_logging, line)

        def help_logging(self):
            print "Usage: logging {on|off|enable|disable|yes|no}"
            print "Switch to enable or disable session logging."

        def do_log_address(self, line):
            self.session.set_log_address(line)

        def help_log_address(self):
            print "Usage: log_address <last-quad>"
            print "Set the last quad of the address to which to log."

        def do_configure(self, line):
            self.session.configure()
            return 0

        def help_configure(self):
            print "Usage: configure"
            print "Writes the configuration to the Linksys."

        def do_cache(self, line):
            print self.session.pagecache

        def help_cache(self):
            print "Usage: cache"
            print "Display the page cache."

        def do_quit(self, line):
            return 1

        def help_quit(self):
            print "The quit command ends your linksys session without"
            print "writing configuration changes to the Linksys."

        def do_EOF(self, line):
            print ""
            self.session.configure()
            return 1

        def help_EOF(self):
            print "The EOF command writes the configuration to the linksys"
            print "and ends your session."

        def default(self, line):
            """Pass the command through to be executed by the shell."""
            os.system(line)
            return 0

        def help_help(self):
            print "On-line help is available through this command."
            print "? is a convenience alias for help."

        def help_introduction(self):
            print """\
This program supports changing the settings on Linksys blue-box routers.  This
capability may come in handy when they freeze up and have to be reset.  Though
it can be used interactively (and will command-prompt when standard input is a
terminal), it is really designed to be used in batch mode.  Commands are taken
from the command line first, then standard input.

By default, it is assumed that the Linksys is at http://192.168.1.1, the
default LAN address.  You can connect to a different name or IP address with
the 'connect' command.  Note that your .netrc must contain correct
user/password credentials for the router.  The entry corresponding to the
defaults is:

machine 192.168.1.1
    login ""
    password admin

Most commands queue up changes but don't actually send them to the Linksys.
You can force pending changes to be written with 'configure'.  Otherwise, they
will be shipped to the Linksys at the end of the session (e.g. when the
program running in batch mode encounters end-of-file or you type a control-D).
If you end the session with `quit', pending changes will be discarded.

For more help, read the topics 'wan', 'lan', and 'wireless'."""

        def help_lan(self):
            print """\
The `lan_address' and `lan_netmask' commands let you set the IP location of
the Linksys on your LAN, or inside.  Normally you'll want to leave these
untouched."""

        def help_wan(self):
            print """\
The WAN commands become significant if you are using the BEFSR41 or any of
the other Linksys boxes designed as DSL or cable-modem gateways.  You will
need to use `wan_type' to declare how you expect to get your address.
If your ISP has issued you a static address, you'll need to use the
`wan_address', `wan_netmask', and `wan_gateway' commands to set the address
of the router as seen from the WAN, the outside.  In this case you will also
need to use the `dns' command to declare which remote servers your DNS
requests should be forwarded to.

Some ISPs may require you to set host and domain for use with
dynamic-address allocation."""

        def help_wireless(self):
            print """\
The channel, ssid, ssid_broadcast, wep, and wireless commands control
wireless routing."""

        def help_switches(self):
            print "Switches may be turned on with 'on', 'enable', or 'yes'."
            print "Switches may be turned off with 'off', 'disable', or 'no'."
            print "Switch commands include: wireless, ssid_broadcast."

        def help_addresses(self):
            print "An address argument must be a valid IP address;"
            print "four decimal numbers separated by dots, each"
            print "between 0 and 255."

        def emptyline(self):
            pass

    interpreter = LinksysInterpreter()
    for arg in sys.argv[1:]:
        interpreter.onecmd(arg)
    fatal = False
    while not fatal:
        try:
            interpreter.cmdloop()
            fatal = True
        except LinksysError, (message, fatal):
            print "linksys:", message

# The following sets edit modes for GNU EMACS
# Local Variables:
# mode:python
# End:

examples/basicfirst.py

#! /usr/bin/env python
# -*- coding: iso-8859-1 -*-
# vi:ts=4:et
# $Id: basicfirst.py,v 1.5 2005/02/11 11:09:11 mfx Exp $

import sys
import pycurl

class Test:
    def __init__(self):
        self.contents = ''

    def body_callback(self, buf):
        self.contents = self.contents + buf

print >>sys.stderr, 'Testing', pycurl.version

t = Test()
c = pycurl.Curl()
c.setopt(c.URL, 'http://curl.haxx.se/dev/')
c.setopt(c.WRITEFUNCTION, t.body_callback)
c.perform()
c.close()

print t.contents

examples/xmlrpc_curl.py

#! /usr/bin/env python
# -*- coding: iso-8859-1 -*-
# vi:ts=4:et
# $Id: xmlrpc_curl.py,v 1.13 2007/03/04 19:26:59 kjetilja Exp $

# We should ignore SIGPIPE when using pycurl.NOSIGNAL - see
# the libcurl tutorial for more info.
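# (Added note, not in the original comment: with NOSIGNAL set, libcurl does
# not handle signals itself, so a peer closing the connection can otherwise
# deliver SIGPIPE, whose default action terminates the process.)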
try:
    import signal
    from signal import SIGPIPE, SIG_IGN
    signal.signal(signal.SIGPIPE, signal.SIG_IGN)
except ImportError:
    pass

try:
    from cStringIO import StringIO
except ImportError:
    from StringIO import StringIO

import xmlrpclib, pycurl

class CURLTransport(xmlrpclib.Transport):
    """Handles a cURL HTTP transaction to an XML-RPC server."""

    xmlrpc_h = [ "Content-Type: text/xml" ]

    def __init__(self, username=None, password=None):
        self.c = pycurl.Curl()
        self.c.setopt(pycurl.POST, 1)
        self.c.setopt(pycurl.NOSIGNAL, 1)
        self.c.setopt(pycurl.CONNECTTIMEOUT, 30)
        self.c.setopt(pycurl.HTTPHEADER, self.xmlrpc_h)
        if username != None and password != None:
            self.c.setopt(pycurl.USERPWD, '%s:%s' % (username, password))
        self._use_datetime = False

    def request(self, host, handler, request_body, verbose=0):
        b = StringIO()
        self.c.setopt(pycurl.URL, 'http://%s%s' % (host, handler))
        self.c.setopt(pycurl.POSTFIELDS, request_body)
        self.c.setopt(pycurl.WRITEFUNCTION, b.write)
        self.c.setopt(pycurl.VERBOSE, verbose)
        self.verbose = verbose
        try:
            self.c.perform()
        except pycurl.error, v:
            raise xmlrpclib.ProtocolError(
                host + handler,
                v[0], v[1], None
                )
        b.seek(0)
        return self.parse_response(b)

if __name__ == "__main__":
    ## Test
    server = xmlrpclib.ServerProxy("http://betty.userland.com",
                                   transport=CURLTransport())
    print server
    try:
        print server.examples.getStateName(41)
    except xmlrpclib.Error, v:
        print "ERROR", v

TODO

If you want to hack on pycurl, here's our list of unresolved issues:

NEW FEATURES/IMPROVEMENTS:

* Callback handling for stream seek.

* Add docs to the high-level interface.

DEFICIENCIES:

* Using certain invalid options, it may be possible to cause a crash.  This
  is un-Pythonic behaviour, but you have to draw a line somewhere between
  efficiency (and feature completeness) and safety.  There _are_ quite a
  number of internal error checks, but tracking and catching all possible
  (deliberate) misuses is not a goal (and probably impossible anyway, due
  to the complexity of libcurl).
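On the last point: a hard crash is, by definition, not catchable from Python,
but the misuses and transfer failures that the internal checks do detect
surface as a pycurl.error exception (bad argument types typically raise
TypeError instead).  A minimal defensive sketch, assuming nothing beyond the
standard pycurl calls shown elsewhere in this distribution; the URL is only a
placeholder:

    import pycurl

    c = pycurl.Curl()
    try:
        c.setopt(pycurl.URL, 'http://curl.haxx.se/')  # placeholder URL
        c.perform()    # transfer failures raise pycurl.error
    except pycurl.error, v:
        print 'pycurl error:', v
    c.close()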