
Commit e122c56

Merge pull request scrapy#1842 from nyov/nyov/docs
[MRG+1] Update documentation links
2 parents: 9f4fe5d + 5876b9a

26 files changed: +68, -67 lines

docs/contributing.rst

Lines changed: 2 additions & 2 deletions
@@ -120,7 +120,7 @@ Scrapy Contrib
 ==============
 
 Scrapy contrib shares a similar rationale as Django contrib, which is explained
-in `this post <http://jacobian.org/writing/what-is-django-contrib/>`_. If you
+in `this post <https://jacobian.org/writing/what-is-django-contrib/>`_. If you
 are working on a new functionality, please follow that rationale to decide
 whether it should be a Scrapy contrib. If unsure, you can ask in
 `scrapy-users`_.
@@ -189,7 +189,7 @@ And their unit-tests are in::
 
 .. _issue tracker: https://github.com/scrapy/scrapy/issues
 .. _scrapy-users: https://groups.google.com/forum/#!forum/scrapy-users
-.. _Twisted unit-testing framework: http://twistedmatrix.com/documents/current/core/development/policy/test-standard.html
+.. _Twisted unit-testing framework: https://twistedmatrix.com/documents/current/core/development/policy/test-standard.html
 .. _AUTHORS: https://github.com/scrapy/scrapy/blob/master/AUTHORS
 .. _tests/: https://github.com/scrapy/scrapy/tree/master/tests
 .. _open issues: https://github.com/scrapy/scrapy/issues

docs/faq.rst

Lines changed: 7 additions & 7 deletions
@@ -77,8 +77,8 @@ Scrapy crashes with: ImportError: No module named win32api
 
 You need to install `pywin32`_ because of `this Twisted bug`_.
 
-.. _pywin32: http://sourceforge.net/projects/pywin32/
-.. _this Twisted bug: http://twistedmatrix.com/trac/ticket/3707
+.. _pywin32: https://sourceforge.net/projects/pywin32/
+.. _this Twisted bug: https://twistedmatrix.com/trac/ticket/3707
 
 How can I simulate a user login in my spider?
 ---------------------------------------------
@@ -123,7 +123,7 @@ Why does Scrapy download pages in English instead of my native language?
 Try changing the default `Accept-Language`_ request header by overriding the
 :setting:`DEFAULT_REQUEST_HEADERS` setting.
 
-.. _Accept-Language: http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.4
+.. _Accept-Language: https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.4
 
 Where can I find some example Scrapy projects?
 ----------------------------------------------
@@ -282,7 +282,7 @@ I'm scraping a XML document and my XPath selector doesn't return any items
 
 You may need to remove namespaces. See :ref:`removing-namespaces`.
 
-.. _user agents: http://en.wikipedia.org/wiki/User_agent
-.. _LIFO: http://en.wikipedia.org/wiki/LIFO
-.. _DFO order: http://en.wikipedia.org/wiki/Depth-first_search
-.. _BFO order: http://en.wikipedia.org/wiki/Breadth-first_search
+.. _user agents: https://en.wikipedia.org/wiki/User_agent
+.. _LIFO: https://en.wikipedia.org/wiki/LIFO
+.. _DFO order: https://en.wikipedia.org/wiki/Depth-first_search
+.. _BFO order: https://en.wikipedia.org/wiki/Breadth-first_search
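
For reference, the :setting:`DEFAULT_REQUEST_HEADERS` override mentioned in the second hunk is a plain dict in the project's settings module; a minimal sketch, with an illustrative header value:

    # settings.py of a Scrapy project (illustrative values)
    DEFAULT_REQUEST_HEADERS = {
        'Accept-Language': 'en',  # ask servers for English pages
    }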

docs/intro/install.rst

Lines changed: 4 additions & 4 deletions
@@ -74,7 +74,7 @@ Windows
 Be sure you download the architecture (win32 or amd64) that matches your system
 
 * *(Only required for Python<2.7.9)* Install `pip`_ from
-https://pip.pypa.io/en/latest/installing.html
+https://pip.pypa.io/en/latest/installing/
 
 Now open a Command prompt to check ``pip`` is installed correctly::
 
@@ -171,9 +171,9 @@ After any of these workarounds you should be able to install Scrapy::
 pip install Scrapy
 
 .. _Python: https://www.python.org/
-.. _pip: https://pip.pypa.io/en/latest/installing.html
-.. _easy_install: http://pypi.python.org/pypi/setuptools
-.. _Control Panel: http://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/sysdm_advancd_environmnt_addchange_variable.mspx
+.. _pip: https://pip.pypa.io/en/latest/installing/
+.. _easy_install: https://pypi.python.org/pypi/setuptools
+.. _Control Panel: https://www.microsoft.com/resources/documentation/windows/xp/all/proddocs/en-us/sysdm_advancd_environmnt_addchange_variable.mspx
 .. _lxml: http://lxml.de/
 .. _OpenSSL: https://pypi.python.org/pypi/pyOpenSSL
 .. _setuptools: https://pypi.python.org/pypi/setuptools

docs/intro/overview.rst

Lines changed: 3 additions & 3 deletions
@@ -170,7 +170,7 @@ your code in Scrapy projects and `join the community`_. Thanks for your
 interest!
 
 .. _join the community: http://scrapy.org/community/
-.. _web scraping: http://en.wikipedia.org/wiki/Web_scraping
-.. _Amazon Associates Web Services: http://aws.amazon.com/associates/
-.. _Amazon S3: http://aws.amazon.com/s3/
+.. _web scraping: https://en.wikipedia.org/wiki/Web_scraping
+.. _Amazon Associates Web Services: https://affiliate-program.amazon.com/gp/advertising/api/detail/main.html
+.. _Amazon S3: https://aws.amazon.com/s3/
 .. _Sitemaps: http://www.sitemaps.org

docs/intro/tutorial.rst

Lines changed: 4 additions & 4 deletions
@@ -7,7 +7,7 @@ Scrapy Tutorial
 In this tutorial, we'll assume that Scrapy is already installed on your system.
 If that's not the case, see :ref:`intro-install`.
 
-We are going to use `Open directory project (dmoz) <http://www.dmoz.org/>`_ as
+We are going to use `Open directory project (dmoz) <https://www.dmoz.org/>`_ as
 our example domain to scrape.
 
 This tutorial will walk you through these tasks:
@@ -191,8 +191,8 @@ based on `XPath`_ or `CSS`_ expressions called :ref:`Scrapy Selectors
 <topics-selectors>`. For more information about selectors and other extraction
 mechanisms see the :ref:`Selectors documentation <topics-selectors>`.
 
-.. _XPath: http://www.w3.org/TR/xpath
-.. _CSS: http://www.w3.org/TR/selectors
+.. _XPath: https://www.w3.org/TR/xpath
+.. _CSS: https://www.w3.org/TR/selectors
 
 Here are some examples of XPath expressions and their meanings:
 
@@ -544,5 +544,5 @@ Then, we recommend you continue by playing with an example project (see
 :ref:`intro-examples`), and then continue with the section
 :ref:`section-basics`.
 
-.. _JSON: http://en.wikipedia.org/wiki/JSON
+.. _JSON: https://en.wikipedia.org/wiki/JSON
 .. _dirbot: https://github.com/scrapy/dirbot

docs/news.rst

Lines changed: 5 additions & 4 deletions
@@ -403,10 +403,11 @@ Outsourced packages
 | | :ref:`topics-deploy`) |
 +-------------------------------------+-------------------------------------+
 | scrapy.contrib.djangoitem | `scrapy-djangoitem <https://github. |
-| | com/scrapy/scrapy-djangoitem>`_ |
+| | com/scrapy-plugins/scrapy-djangoite |
+| | m>`_ |
 +-------------------------------------+-------------------------------------+
 | scrapy.webservice | `scrapy-jsonrpc <https://github.com |
-| | /scrapy/scrapy-jsonrpc>`_ |
+| | /scrapy-plugins/scrapy-jsonrpc>`_ |
 +-------------------------------------+-------------------------------------+
 
 `scrapy.contrib_exp` and `scrapy.contrib` dissolutions
@@ -1186,7 +1187,7 @@ Scrapy changes:
 - nested items now fully supported in JSON and JSONLines exporters
 - added :reqmeta:`cookiejar` Request meta key to support multiple cookie sessions per spider
 - decoupled encoding detection code to `w3lib.encoding`_, and ported Scrapy code to use that module
-- dropped support for Python 2.5. See http://blog.scrapinghub.com/2012/02/27/scrapy-0-15-dropping-support-for-python-2-5/
+- dropped support for Python 2.5. See https://blog.scrapinghub.com/2012/02/27/scrapy-0-15-dropping-support-for-python-2-5/
 - dropped support for Twisted 2.5
 - added :setting:`REFERER_ENABLED` setting, to control referer middleware
 - changed default user agent to: ``Scrapy/VERSION (+http://scrapy.org)``
@@ -1535,7 +1536,7 @@ First release of Scrapy.
 
 
 .. _AJAX crawleable urls: https://developers.google.com/webmasters/ajax-crawling/docs/getting-started?csw=1
-.. _chunked transfer encoding: http://en.wikipedia.org/wiki/Chunked_transfer_encoding
+.. _chunked transfer encoding: https://en.wikipedia.org/wiki/Chunked_transfer_encoding
 .. _w3lib: https://github.com/scrapy/w3lib
 .. _scrapely: https://github.com/scrapy/scrapely
 .. _marshal: https://docs.python.org/2/library/marshal.html

docs/topics/api.rst

Lines changed: 1 addition & 1 deletion
@@ -271,4 +271,4 @@ class (which they all inherit from).
 Close the given spider. After this is called, no more specific stats
 can be accessed or collected.
 
-.. _reactor: http://twistedmatrix.com/documents/current/core/howto/reactor-basics.html
+.. _reactor: https://twistedmatrix.com/documents/current/core/howto/reactor-basics.html

docs/topics/architecture.rst

Lines changed: 3 additions & 3 deletions
@@ -125,8 +125,8 @@ links:
 * `Twisted - hello, asynchronous programming`_
 * `Twisted Introduction - Krondo`_
 
-.. _Twisted: http://twistedmatrix.com/trac/
-.. _Introduction to Deferreds in Twisted: http://twistedmatrix.com/documents/current/core/howto/defer-intro.html
+.. _Twisted: https://twistedmatrix.com/trac/
+.. _Introduction to Deferreds in Twisted: https://twistedmatrix.com/documents/current/core/howto/defer-intro.html
 .. _Twisted - hello, asynchronous programming: http://jessenoller.com/2009/02/11/twisted-hello-asynchronous-programming/
-.. _Twisted Introduction - Krondo: http://krondo.com/blog/?page_id=1327/
+.. _Twisted Introduction - Krondo: http://krondo.com/an-introduction-to-asynchronous-programming-and-twisted/

docs/topics/djangoitem.rst

Lines changed: 1 addition & 1 deletion
@@ -10,4 +10,4 @@ DjangoItem has been moved into a separate project.
 
 It is hosted at:
 
-https://github.com/scrapy/scrapy-djangoitem
+https://github.com/scrapy-plugins/scrapy-djangoitem

docs/topics/downloader-middleware.rst

Lines changed: 7 additions & 7 deletions
@@ -300,7 +300,7 @@ HttpAuthMiddleware
 
 # .. rest of the spider code omitted ...
 
-.. _Basic access authentication: http://en.wikipedia.org/wiki/Basic_access_authentication
+.. _Basic access authentication: https://en.wikipedia.org/wiki/Basic_access_authentication
 
 
 HttpCacheMiddleware
@@ -390,9 +390,9 @@ what is implemented:
 
 what is missing:
 
-* `Pragma: no-cache` support http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9.1
-* `Vary` header support http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html#sec13.6
-* Invalidation after updates or deletes http://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html#sec13.10
+* `Pragma: no-cache` support https://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.9.1
+* `Vary` header support https://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html#sec13.6
+* Invalidation after updates or deletes https://www.w3.org/Protocols/rfc2616/rfc2616-sec13.html#sec13.10
 * ... probably others ..
 
 In order to use this policy, set:
@@ -464,7 +464,7 @@ In order to use this storage backend:
 * set :setting:`HTTPCACHE_STORAGE` to ``scrapy.extensions.httpcache.LeveldbCacheStorage``
 * install `LevelDB python bindings`_ like ``pip install leveldb``
 
-.. _LevelDB: http://code.google.com/p/leveldb/
+.. _LevelDB: https://github.com/google/leveldb
 .. _leveldb python bindings: https://pypi.python.org/pypi/leveldb
 
 
@@ -964,6 +964,6 @@ Default: ``"latin-1"``
 The default encoding for proxy authentication on :class:`HttpProxyMiddleware`.
 
 
-.. _DBM: http://en.wikipedia.org/wiki/Dbm
+.. _DBM: https://en.wikipedia.org/wiki/Dbm
 .. _anydbm: https://docs.python.org/2/library/anydbm.html
-.. _chunked transfer encoding: http://en.wikipedia.org/wiki/Chunked_transfer_encoding
+.. _chunked transfer encoding: https://en.wikipedia.org/wiki/Chunked_transfer_encoding
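
For reference, the spider snippet whose tail ("# .. rest of the spider code omitted ...") appears in the first hunk relies on HttpAuthMiddleware picking up per-spider credentials; a minimal sketch, with hypothetical names and credentials:

    from scrapy.spiders import Spider

    class IntranetSpider(Spider):
        name = 'intranet'                       # hypothetical spider name
        http_user = 'someuser'                  # read by HttpAuthMiddleware
        http_pass = 'somepass'                  # and sent as Basic auth
        start_urls = ['http://intranet.example.com/']

        def parse(self, response):
            # .. rest of the spider code omitted ...
            pass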

docs/topics/email.rst

Lines changed: 1 addition & 1 deletion
@@ -15,7 +15,7 @@ simple API for sending attachments and it's very easy to configure, with a few
 :ref:`settings <topics-email-settings>`.
 
 .. _smtplib: https://docs.python.org/2/library/smtplib.html
-.. _Twisted non-blocking IO: http://twistedmatrix.com/documents/current/core/howto/defer-intro.html
+.. _Twisted non-blocking IO: https://twistedmatrix.com/documents/current/core/howto/defer-intro.html
 
 Quick example
 =============
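
The "Quick example" heading visible as context here introduces ``scrapy.mail.MailSender``; a minimal sketch of that kind of usage (the addresses are placeholders):

    from scrapy.mail import MailSender

    # MailSender.from_settings(settings) can be used instead to pick up the MAIL_* settings
    mailer = MailSender()
    mailer.send(to=['someone@example.com'],
                subject='Some subject',
                body='Some body',
                cc=['another@example.com'])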

docs/topics/extensions.rst

Lines changed: 3 additions & 3 deletions
@@ -21,7 +21,7 @@ avoid collision with existing (and future) extensions. For example, a
 hypothetic extension to handle `Google Sitemaps`_ would use settings like
 `GOOGLESITEMAP_ENABLED`, `GOOGLESITEMAP_DEPTH`, and so on.
 
-.. _Google Sitemaps: http://en.wikipedia.org/wiki/Sitemaps
+.. _Google Sitemaps: https://en.wikipedia.org/wiki/Sitemaps
 
 Loading & activating extensions
 ===============================
@@ -355,8 +355,8 @@ There are at least two ways to send Scrapy the `SIGQUIT`_ signal:
 
 kill -QUIT <pid>
 
-.. _SIGUSR2: http://en.wikipedia.org/wiki/SIGUSR1_and_SIGUSR2
-.. _SIGQUIT: http://en.wikipedia.org/wiki/SIGQUIT
+.. _SIGUSR2: https://en.wikipedia.org/wiki/SIGUSR1_and_SIGUSR2
+.. _SIGQUIT: https://en.wikipedia.org/wiki/SIGQUIT
 
 Debugger extension
 ~~~~~~~~~~~~~~~~~~

docs/topics/feed-exports.rst

Lines changed: 2 additions & 2 deletions
@@ -330,7 +330,7 @@ format in :setting:`FEED_EXPORTERS`. E.g., to disable the built-in CSV exporter
 'csv': None,
 }
 
-.. _URI: http://en.wikipedia.org/wiki/Uniform_Resource_Identifier
-.. _Amazon S3: http://aws.amazon.com/s3/
+.. _URI: https://en.wikipedia.org/wiki/Uniform_Resource_Identifier
+.. _Amazon S3: https://aws.amazon.com/s3/
 .. _boto: https://github.com/boto/boto
 .. _botocore: https://github.com/boto/botocore
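
The hunk above only shows the tail of the :setting:`FEED_EXPORTERS` snippet being patched around; the complete setting it belongs to is roughly:

    # settings.py -- disable the built-in CSV exporter by mapping its format to None
    FEED_EXPORTERS = {
        'csv': None,
    }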

docs/topics/firebug.rst

Lines changed: 1 addition & 1 deletion
@@ -164,4 +164,4 @@ elements.
 or tags which Therefer in page HTML
 sources may on Firebug inspects the live DOM
 
-.. _has been shut down by Google: http://searchenginewatch.com/sew/news/2096661/google-directory-shut
+.. _has been shut down by Google: https://searchenginewatch.com/sew/news/2096661/google-directory-shut

docs/topics/item-pipeline.rst

Lines changed: 2 additions & 2 deletions
@@ -160,8 +160,8 @@ method and how to clean up the resources properly.
 self.db[self.collection_name].insert(dict(item))
 return item
 
-.. _MongoDB: http://www.mongodb.org/
-.. _pymongo: http://api.mongodb.org/python/current/
+.. _MongoDB: https://www.mongodb.org/
+.. _pymongo: https://api.mongodb.org/python/current/
 
 Duplicates filter
 -----------------
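
The two code lines of context in this hunk are the tail of an item pipeline that writes items to MongoDB via pymongo; a self-contained sketch of what such a pipeline typically looks like (the MONGO_URI and MONGO_DATABASE setting names are illustrative):

    import pymongo

    class MongoPipeline(object):
        collection_name = 'scrapy_items'

        def __init__(self, mongo_uri, mongo_db):
            self.mongo_uri = mongo_uri
            self.mongo_db = mongo_db

        @classmethod
        def from_crawler(cls, crawler):
            return cls(
                mongo_uri=crawler.settings.get('MONGO_URI'),
                mongo_db=crawler.settings.get('MONGO_DATABASE', 'items'),
            )

        def open_spider(self, spider):
            self.client = pymongo.MongoClient(self.mongo_uri)
            self.db = self.client[self.mongo_db]

        def close_spider(self, spider):
            # clean up the connection when the spider finishes
            self.client.close()

        def process_item(self, item, spider):
            self.db[self.collection_name].insert(dict(item))
            return item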

docs/topics/media-pipeline.rst

Lines changed: 4 additions & 4 deletions
@@ -143,7 +143,7 @@ Supported Storage
 File system is currently the only officially supported storage, but there is
 also (undocumented) support for storing files in `Amazon S3`_.
 
-.. _Amazon S3: http://aws.amazon.com/s3/
+.. _Amazon S3: https://aws.amazon.com/s3/
 
 File system storage
 -------------------
@@ -223,7 +223,7 @@ Where:
 
 * ``<image_id>`` is the `SHA1 hash`_ of the image url
 
-.. _SHA1 hash: http://en.wikipedia.org/wiki/SHA_hash_functions
+.. _SHA1 hash: https://en.wikipedia.org/wiki/SHA_hash_functions
 
 Example of image files stored using ``small`` and ``big`` thumbnail names::
 
@@ -390,5 +390,5 @@ above::
 item['image_paths'] = image_paths
 return item
 
-.. _Twisted Failure: http://twistedmatrix.com/documents/current/api/twisted.python.failure.Failure.html
-.. _MD5 hash: http://en.wikipedia.org/wiki/MD5
+.. _Twisted Failure: https://twistedmatrix.com/documents/current/api/twisted.python.failure.Failure.html
+.. _MD5 hash: https://en.wikipedia.org/wiki/MD5
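
The last hunk's context lines come from an ``item_completed`` override on an images pipeline; a sketch of how such an override is usually written (the class and field names are illustrative):

    from scrapy.exceptions import DropItem
    from scrapy.pipelines.images import ImagesPipeline

    class MyImagesPipeline(ImagesPipeline):
        def item_completed(self, results, item, info):
            # keep the storage paths of the successfully downloaded images
            image_paths = [x['path'] for ok, x in results if ok]
            if not image_paths:
                raise DropItem("Item contains no images")
            item['image_paths'] = image_paths
            return item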

docs/topics/practices.rst

Lines changed: 1 addition & 1 deletion
@@ -251,5 +251,5 @@ If you are still unable to prevent your bot getting banned, consider contacting
 .. _ProxyMesh: http://proxymesh.com/
 .. _Google cache: http://www.googleguide.com/cached_pages.html
 .. _testspiders: https://github.com/scrapinghub/testspiders
-.. _Twisted Reactor Overview: http://twistedmatrix.com/documents/current/core/howto/reactor-basics.html
+.. _Twisted Reactor Overview: https://twistedmatrix.com/documents/current/core/howto/reactor-basics.html
 .. _Crawlera: http://scrapinghub.com/crawlera

docs/topics/request-response.rst

Lines changed: 1 addition & 1 deletion
@@ -621,4 +621,4 @@ XmlResponse objects
 adds encoding auto-discovering support by looking into the XML declaration
 line. See :attr:`TextResponse.encoding`.
 
-.. _Twisted Failure: http://twistedmatrix.com/documents/current/api/twisted.python.failure.Failure.html
+.. _Twisted Failure: https://twistedmatrix.com/documents/current/api/twisted.python.failure.Failure.html

docs/topics/selectors.rst

Lines changed: 5 additions & 5 deletions
@@ -40,8 +40,8 @@ For a complete reference of the selectors API see
 .. _lxml: http://lxml.de/
 .. _ElementTree: https://docs.python.org/2/library/xml.etree.elementtree.html
 .. _cssselect: https://pypi.python.org/pypi/cssselect/
-.. _XPath: http://www.w3.org/TR/xpath
-.. _CSS: http://www.w3.org/TR/selectors
+.. _XPath: https://www.w3.org/TR/xpath
+.. _CSS: https://www.w3.org/TR/selectors
 
 
 Using selectors
@@ -281,7 +281,7 @@ Another common case would be to extract all direct ``<p>`` children::
 For more details about relative XPaths see the `Location Paths`_ section in the
 XPath specification.
 
-.. _Location Paths: http://www.w3.org/TR/xpath#location-paths
+.. _Location Paths: https://www.w3.org/TR/xpath#location-paths
 
 Using EXSLT extensions
 ----------------------
@@ -439,7 +439,7 @@ you may want to take a look first at this `XPath tutorial`_.
 
 
 .. _`XPath tutorial`: http://www.zvon.org/comp/r/tut-XPath_1.html
-.. _`this post from ScrapingHub's blog`: http://blog.scrapinghub.com/2014/07/17/xpath-tips-from-the-web-scraping-trenches/
+.. _`this post from ScrapingHub's blog`: https://blog.scrapinghub.com/2014/07/17/xpath-tips-from-the-web-scraping-trenches/
 
 
 Using text nodes in a condition
@@ -481,7 +481,7 @@ But using the ``.`` to mean the node, works::
 >>> sel.xpath("//a[contains(., 'Next Page')]").extract()
 [u'<a href="#">Click here to go to the <strong>Next Page</strong></a>']
 
-.. _`XPath string function`: http://www.w3.org/TR/xpath/#section-String-Functions
+.. _`XPath string function`: https://www.w3.org/TR/xpath/#section-String-Functions
 
 Beware of the difference between //node[1] and (//node)[1]
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
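
To make the last hunk's ``contains(., 'Next Page')`` context reproducible on its own, here is a small self-contained sketch using ``scrapy.selector.Selector`` (the HTML body is invented for illustration):

    from scrapy.selector import Selector

    body = ('<html><body>'
            '<a href="#">Click here to go to the <strong>Next Page</strong></a>'
            '</body></html>')
    sel = Selector(text=body)

    # contains(., ...) tests the string value of the whole <a> node, so it still
    # matches even though "Next Page" sits inside a child <strong> element
    print(sel.xpath("//a[contains(., 'Next Page')]").extract())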

docs/topics/settings.rst

Lines changed: 3 additions & 3 deletions
@@ -1202,6 +1202,6 @@ case to see how to enable and use them.
 .. settingslist::
 
 
-.. _Amazon web services: http://aws.amazon.com/
-.. _breadth-first order: http://en.wikipedia.org/wiki/Breadth-first_search
-.. _depth-first order: http://en.wikipedia.org/wiki/Depth-first_search
+.. _Amazon web services: https://aws.amazon.com/
+.. _breadth-first order: https://en.wikipedia.org/wiki/Breadth-first_search
+.. _depth-first order: https://en.wikipedia.org/wiki/Depth-first_search

docs/topics/shell.rst

Lines changed: 1 addition & 1 deletion
@@ -138,7 +138,7 @@ Example of shell session
 ========================
 
 Here's an example of a typical shell session where we start by scraping the
-http://scrapy.org page, and then proceed to scrape the http://reddit.com
+http://scrapy.org page, and then proceed to scrape the https://reddit.com
 page. Finally, we modify the (Reddit) request method to POST and re-fetch it
 getting an error. We end the session by typing Ctrl-D (in Unix systems) or
 Ctrl-Z in Windows.

docs/topics/signals.rst

Lines changed: 2 additions & 2 deletions
@@ -22,7 +22,7 @@ Deferred signal handlers
 Some signals support returning `Twisted deferreds`_ from their handlers, see
 the :ref:`topics-signals-ref` below to know which ones.
 
-.. _Twisted deferreds: http://twistedmatrix.com/documents/current/core/howto/defer.html
+.. _Twisted deferreds: https://twistedmatrix.com/documents/current/core/howto/defer.html
 
 .. _topics-signals-ref:
 
@@ -258,4 +258,4 @@ response_downloaded
 :param spider: the spider for which the response is intended
 :type spider: :class:`~scrapy.spiders.Spider` object
 
-.. _Failure: http://twistedmatrix.com/documents/current/api/twisted.python.failure.Failure.html
+.. _Failure: https://twistedmatrix.com/documents/current/api/twisted.python.failure.Failure.html

docs/topics/spider-middleware.rst

Lines changed: 2 additions & 2 deletions
@@ -211,7 +211,7 @@ HttpErrorMiddleware
 According to the `HTTP standard`_, successful responses are those whose
 status codes are in the 200-300 range.
 
-.. _HTTP standard: http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
+.. _HTTP standard: https://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
 
 If you still want to process response codes outside that range, you can
 specify which response codes the spider is able to handle using the
@@ -238,7 +238,7 @@ responses, unless you really know what you're doing.
 
 For more information see: `HTTP Status Code Definitions`_.
 
-.. _HTTP Status Code Definitions: http://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
+.. _HTTP Status Code Definitions: https://www.w3.org/Protocols/rfc2616/rfc2616-sec10.html
 
 HttpErrorMiddleware settings
 ~~~~~~~~~~~~~~~~~~~~~~~~~~~~
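
The first hunk's context cuts off where HttpErrorMiddleware explains how a spider opts in to non-2xx responses; in Scrapy this is done with the ``handle_httpstatus_list`` spider attribute (or the ``HTTPERROR_ALLOWED_CODES`` setting). A minimal sketch with illustrative values:

    from scrapy.spiders import Spider

    class TolerantSpider(Spider):
        name = 'tolerant'                     # hypothetical spider name
        handle_httpstatus_list = [404, 500]   # let these reach the callback too
        start_urls = ['http://www.example.com/missing-page']

        def parse(self, response):
            # response.status is available even for codes HttpErrorMiddleware
            # would normally filter out
            self.logger.info('Got status %s for %s', response.status, response.url)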

docs/topics/spiders.rst

Lines changed: 1 addition & 1 deletion
@@ -735,5 +735,5 @@ Combine SitemapSpider with other sources of urls::
 .. _Sitemaps: http://www.sitemaps.org
 .. _Sitemap index files: http://www.sitemaps.org/protocol.html#index
 .. _robots.txt: http://www.robotstxt.org/
-.. _TLD: http://en.wikipedia.org/wiki/Top-level_domain
+.. _TLD: https://en.wikipedia.org/wiki/Top-level_domain
 .. _Scrapyd documentation: http://scrapyd.readthedocs.org/en/latest/
