
Commit 0dce283
Merge pull request scrapy#893 from kmike/less-ads
[MRG] DOC simplify extension docs
2 parents 44cbbec + e435b3e

File tree: 4 files changed, +13 −18 lines

docs/topics/downloader-middleware.rst
Lines changed: 2 additions & 2 deletions

@@ -51,8 +51,8 @@ particular setting. See each middleware documentation for more info.
 Writing your own downloader middleware
 ======================================
 
-Writing your own downloader middleware is easy. Each middleware component is a
-single Python class that defines one or more of the following methods:
+Each middleware component is a Python class that defines one or
+more of the following methods:
 
 .. module:: scrapy.contrib.downloadermiddleware

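The simplified paragraph in the diff above says a downloader middleware is a Python class defining one or more of the middleware methods, such as ``process_request`` and ``process_response``. A minimal sketch of that pattern follows; it is not part of this commit. The ``CustomHeaderMiddleware`` name is invented, and plain dicts stand in for Scrapy's Request/Response objects so the sketch runs without Scrapy installed:

```python
# Sketch of a downloader middleware. In real Scrapy, process_request and
# process_response receive Request/Response objects from the framework;
# here a dict with a "headers" key stands in for the request.

class CustomHeaderMiddleware:
    """Adds a custom header to every outgoing request."""

    def process_request(self, request, spider):
        # Returning None tells the framework to continue processing
        # this request normally.
        request["headers"]["X-Custom"] = "1"
        return None

    def process_response(self, request, response, spider):
        # Returning the response passes it on to the next middleware.
        return response


mw = CustomHeaderMiddleware()
req = {"headers": {}}
mw.process_request(req, spider=None)
print(req["headers"])  # -> {'X-Custom': '1'}
```

In real Scrapy, returning a Response or Request object from ``process_request`` instead of ``None`` short-circuits the download; the sketch only shows the pass-through case.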
docs/topics/extensions.rst
Lines changed: 8 additions & 12 deletions

@@ -5,7 +5,7 @@ Extensions
 ==========
 
 The extensions framework provides a mechanism for inserting your own
-custom functionality into Scrapy.
+custom functionality into Scrapy.
 
 Extensions are just regular classes that are instantiated at Scrapy startup,
 when extensions are initialized.

@@ -75,14 +75,10 @@ included in the :setting:`EXTENSIONS_BASE` setting) you must set its order to
 Writing your own extension
 ==========================
 
-Writing your own extension is easy. Each extension is a single Python class
-which doesn't need to implement any particular method.
-
-The main entry point for a Scrapy extension (this also includes middlewares and
-pipelines) is the ``from_crawler`` class method which receives a
-``Crawler`` instance which is the main object controlling the Scrapy crawler.
-Through that object you can access settings, signals, stats, and also control
-the crawler behaviour, if your extension needs to such thing.
+Each extension is a Python class. The main entry point for a Scrapy extension
+(this also includes middlewares and pipelines) is the ``from_crawler``
+class method which receives a ``Crawler`` instance. Through the Crawler object
+you can access settings, signals, stats, and also control the crawling behaviour.
 
 Typically, extensions connect to :ref:`signals <topics-signals>` and perform
 tasks triggered by them.

@@ -133,7 +129,7 @@ Here is the code of such extension::
         crawler.signals.connect(ext.spider_closed, signal=signals.spider_closed)
         crawler.signals.connect(ext.item_scraped, signal=signals.item_scraped)
 
-        # return the extension object
+        # return the extension object
 
         return ext
 
     def spider_opened(self, spider):

@@ -183,12 +179,12 @@ Telnet console extension
 ~~~~~~~~~~~~~~~~~~~~~~~~
 
 .. module:: scrapy.telnet
-   :synopsis: Telnet console
+   :synopsis: Telnet console
 
 .. class:: scrapy.telnet.TelnetConsole
 
    Provides a telnet console for getting into a Python interpreter inside the
-   currently running Scrapy process, which can be very useful for debugging.
+   currently running Scrapy process, which can be very useful for debugging.
 
 The telnet console must be enabled by the :setting:`TELNETCONSOLE_ENABLED`
 setting, and the server will listen in the port specified in

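The rewritten extension docs above center on ``from_crawler``: a class method that receives a ``Crawler`` instance and returns the extension object. A minimal sketch of that entry point follows; it is not from this commit. The ``SpiderOpenedExtension`` name is invented, and a tiny ``FakeCrawler`` class (carrying only a stats dict) stands in for Scrapy's real Crawler so the sketch runs on its own:

```python
# Sketch of the from_crawler entry point. A real extension would also call
# crawler.signals.connect(...) here, as the docs' own example does; this
# sketch only shows pulling state off the crawler and returning the object.

class SpiderOpenedExtension:
    def __init__(self, stats):
        self.stats = stats

    @classmethod
    def from_crawler(cls, crawler):
        # Take whatever the extension needs (settings, stats, signals)
        # from the crawler, then return the extension object.
        ext = cls(crawler.stats)
        return ext

    def spider_opened(self, spider):
        # Handler that would be connected to the spider_opened signal.
        self.stats["spiders_opened"] = self.stats.get("spiders_opened", 0) + 1


class FakeCrawler:
    # Minimal stand-in for scrapy.crawler.Crawler: just carries stats.
    def __init__(self):
        self.stats = {}


crawler = FakeCrawler()
ext = SpiderOpenedExtension.from_crawler(crawler)
ext.spider_opened(spider="example")
print(crawler.stats)  # -> {'spiders_opened': 1}
```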
docs/topics/item-pipeline.rst
Lines changed: 1 addition & 2 deletions

@@ -23,8 +23,7 @@ Typical use for item pipelines are:
 Writing your own item pipeline
 ==============================
 
-Writing your own item pipeline is easy. Each item pipeline component is a
-single Python class that must implement the following method:
+Each item pipeline component is a Python class that must implement the following method:
 
 .. method:: process_item(self, item, spider)

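The pipeline docs above reduce to one contract: a class with a ``process_item(self, item, spider)`` method. A minimal sketch of a component honoring that contract follows; it is not from this commit. ``PricePipeline`` and the item keys are invented, and a plain ``ValueError`` stands in for Scrapy's ``DropItem`` exception so the sketch has no dependencies:

```python
# Sketch of an item pipeline component. process_item must return the item
# (possibly modified) to pass it to later pipeline stages, or raise to
# discard it (DropItem in real Scrapy; ValueError here as a stand-in).

class PricePipeline:
    VAT = 1.2  # illustrative multiplier

    def process_item(self, item, spider):
        if item.get("price") is None:
            raise ValueError("missing price")
        item["price_with_vat"] = round(item["price"] * self.VAT, 2)
        return item


pipe = PricePipeline()
item = pipe.process_item({"name": "book", "price": 10.0}, spider=None)
print(item["price_with_vat"])  # -> 12.0
```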
docs/topics/spider-middleware.rst
Lines changed: 2 additions & 2 deletions

@@ -52,8 +52,8 @@ particular setting. See each middleware documentation for more info.
 Writing your own spider middleware
 ==================================
 
-Writing your own spider middleware is easy. Each middleware component is a
-single Python class that defines one or more of the following methods:
+Each middleware component is a Python class that defines one or more of the
+following methods:
 
 .. module:: scrapy.contrib.spidermiddleware

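Like the downloader-middleware text, the spider-middleware paragraph above describes a class defining one or more hook methods. A minimal sketch of one such hook, ``process_spider_output``, follows; it is not from this commit. The ``DropEmptySpiderMiddleware`` name is invented, and plain dicts stand in for scraped items so the sketch runs without Scrapy:

```python
# Sketch of a spider middleware. process_spider_output receives the result
# iterable produced by the spider for a response and must return an
# iterable; this one filters out empty (falsy) results.

class DropEmptySpiderMiddleware:
    def process_spider_output(self, response, result, spider):
        for item in result:
            if item:  # drop empty/falsy items
                yield item


mw = DropEmptySpiderMiddleware()
scraped = [{"a": 1}, {}, {"b": 2}]
out = list(mw.process_spider_output(response=None, result=scraped, spider=None))
print(out)  # -> [{'a': 1}, {'b': 2}]
```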