
================================================
Agents for Job Execution and Communication Tasks
================================================

  ($Id$)

  >>> config = '''
  ... controller(names=['core.sample'])
  ... scheduler(name='core')
  ... logger(name='default', standard=30)
  ... '''
  >>> from cybertools.agent.main import setup
  >>> master = setup(config)
  Starting agent application...
  Using controllers core.sample.


Crawler
=======

The agent uses Twisted's cooperative multitasking model.
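Cooperative multitasking means that tasks voluntarily hand control back to a scheduler instead of being preempted. The following is a minimal conceptual sketch using plain Python generators; it is an illustration of the idea only, not Twisted's reactor, and the names `crawler_task` and `run_cooperatively` are hypothetical:

```python
# Minimal sketch of cooperative multitasking with generators.
# Illustrates the concept only; this is NOT Twisted's implementation.

def crawler_task(name, steps):
    """A task that yields control back after each unit of work."""
    for step in range(steps):
        yield (name, step)  # hand control back to the scheduler

def run_cooperatively(tasks):
    """Round-robin scheduler: advance each task one step at a time."""
    log = []
    while tasks:
        for task in list(tasks):
            try:
                log.append(next(task))
            except StopIteration:
                tasks.remove(task)
    return log

schedule = run_cooperatively([crawler_task('a', 2), crawler_task('b', 2)])
# The tasks are interleaved rather than run to completion one after another:
print(schedule)  # [('a', 0), ('b', 0), ('a', 1), ('b', 1)]
```

Because no task blocks, a single thread can interleave many jobs; this is the model the Twisted reactor applies to the agent's jobs below.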

Crawler is the base class for all derived crawlers, such as the filesystem
crawler and the mail crawler. The SampleCrawler returns a deferred whose
callback has already been fired, so it returns immediately.

A crawl job returns a deferred that must be supplied with a callback method
(and, in most cases, an errback method as well).
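
The calling convention can be sketched as follows. `SimpleDeferred` is a hypothetical, strongly simplified stand-in for Twisted's `Deferred`, shown only to illustrate how an already-fired deferred (like the SampleCrawler's) runs its callback at once:

```python
# Strongly simplified stand-in for Twisted's Deferred, for illustration only.

class SimpleDeferred:
    def __init__(self):
        self.callbacks = []   # callbacks waiting for the result
        self.result = None
        self.fired = False

    def addCallback(self, callback):
        if self.fired:
            # Already fired (like the SampleCrawler's deferred):
            # the callback runs immediately.
            self.result = callback(self.result)
        else:
            self.callbacks.append(callback)
        return self

    def callback(self, result):
        """Fire the deferred, running any callbacks added so far."""
        self.fired = True
        self.result = result
        for cb in self.callbacks:
            self.result = cb(self.result)

collected = []
d = SimpleDeferred()
d.callback([])                            # fire first, as SampleCrawler does
d.addCallback(lambda r: collected.append(r) or r)
print(collected)                          # the callback ran at once: [[]]
```

In real Twisted code one would use `twisted.internet.defer.Deferred`, which additionally chains errbacks for error handling; the sketch above omits that side of the chain.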

We create the sample crawler via the master's controller. The sample
controller provides a simple method for this purpose.

  >>> controller = master.controllers[0]
  >>> controller.createAgent('crawl.sample', 'crawler01')

In the next step we request the start of a job, again via the controller.

  >>> controller.enterJob('sample', 'crawler01')

The job is not executed immediately; we first have to hand control over to
the Twisted reactor.

  >>> from cybertools.agent.tests import tester
  >>> tester.iterate()
  SampleCrawler is collecting.
  Job 00001 completed; result: [];