
Commit 6137dd9

Fix documentation for the S3_STORE_ACL (now settings.FILES_STORE_S3_ACL) setting: it has nothing to do with feed exporters.
1 parent: 164f300

4 files changed: +13 additions, −16 deletions
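
The rename in practice: a minimal sketch of a project's settings.py with the files pipeline enabled. The bucket name and credentials are placeholders; ITEM_PIPELINES, FILES_STORE, and the AWS settings are standard Scrapy settings, and only FILES_STORE_S3_ACL is new in this commit.

# settings.py: hedged sketch, placeholder bucket and credentials
ITEM_PIPELINES = {'scrapy.pipelines.files.FilesPipeline': 1}

AWS_ACCESS_KEY_ID = '...'       # placeholder
AWS_SECRET_ACCESS_KEY = '...'   # placeholder
FILES_STORE = 's3://example-bucket/files/'

# Renamed from S3_STORE_ACL by this commit; 'private' is the default.
FILES_STORE_S3_ACL = 'public-read'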

docs/topics/feed-exports.rst

Lines changed: 0 additions & 3 deletions
@@ -185,9 +185,6 @@ passed through the following settings:
 * :setting:`AWS_ACCESS_KEY_ID`
 * :setting:`AWS_SECRET_ACCESS_KEY`
 
-Default access policy for uploaded files is ``private``, it can be changed
-(for example, to ``public-read``) via :setting:`S3_STORE_ACL`.
-
 .. _topics-feed-storage-stdout:
 
 Standard output

docs/topics/settings.rst

Lines changed: 9 additions & 9 deletions
@@ -607,6 +607,15 @@ For more information See the :ref:`extensions user guide <topics-extensions>`
 and the :ref:`list of available extensions <topics-extensions-ref>`.
 
 
+.. setting:: FILES_STORE_S3_ACL
+
+FILES_STORE_S3_ACL
+------------------
+
+Default: ``'private'``
+
+S3-specific access control policy (ACL) for S3 files store.
+
 .. setting:: ITEM_PIPELINES
 
 ITEM_PIPELINES
@@ -926,15 +935,6 @@ If enabled, Scrapy will respect robots.txt policies. For more information see
 this option is enabled by default in settings.py file generated
 by ``scrapy startproject`` command.
 
-.. setting:: S3_STORE_ACL
-
-S3_STORE_ACL
-------------
-
-Default: ``'private'``
-
-S3-specific access control policy (ACL) for uploaded files.
-
 .. setting:: SCHEDULER
 
 SCHEDULER

scrapy/pipelines/files.py

Lines changed: 2 additions & 2 deletions
@@ -82,7 +82,7 @@ class S3FilesStore(object):
     AWS_ACCESS_KEY_ID = None
     AWS_SECRET_ACCESS_KEY = None
 
-    POLICY = 'private'  # Overriden from settings.S3_STORE_ACL in
+    POLICY = 'private'  # Overriden from settings.FILES_STORE_S3_ACL in
                         # FilesPipeline.from_settings.
     HEADERS = {
         'Cache-Control': 'max-age=172800',
@@ -233,7 +233,7 @@ def from_settings(cls, settings):
         s3store = cls.STORE_SCHEMES['s3']
         s3store.AWS_ACCESS_KEY_ID = settings['AWS_ACCESS_KEY_ID']
         s3store.AWS_SECRET_ACCESS_KEY = settings['AWS_SECRET_ACCESS_KEY']
-        s3store.POLICY = settings['S3_STORE_ACL']
+        s3store.POLICY = settings['FILES_STORE_S3_ACL']
 
         cls.FILES_URLS_FIELD = settings.get('FILES_URLS_FIELD', cls.DEFAULT_FILES_URLS_FIELD)
         cls.FILES_RESULT_FIELD = settings.get('FILES_RESULT_FIELD', cls.DEFAULT_FILES_RESULT_FIELD)
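
Note the pattern the second hunk relies on: from_settings assigns the configured value to a class attribute of the store, so every S3FilesStore instance created afterwards sees the same policy. A stripped-down illustration of that pattern (not Scrapy's actual code):

# Hedged sketch of the class-attribute override used above.
class DummyStore(object):
    POLICY = 'private'  # class-level default

def configure(settings):
    # Mutate the class, not an instance, so the value applies to
    # every store created later; mirrors FilesPipeline.from_settings.
    DummyStore.POLICY = settings.get('FILES_STORE_S3_ACL', 'private')

configure({'FILES_STORE_S3_ACL': 'public-read'})
assert DummyStore().POLICY == 'public-read'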

scrapy/settings/default_settings.py

Lines changed: 2 additions & 2 deletions
@@ -156,6 +156,8 @@
     'pickle': 'scrapy.exporters.PickleItemExporter',
 }
 
+FILES_STORE_S3_ACL = 'private'
+
 HTTPCACHE_ENABLED = False
 HTTPCACHE_DIR = 'httpcache'
 HTTPCACHE_IGNORE_MISSING = False
@@ -231,8 +233,6 @@
 SCHEDULER_DISK_QUEUE = 'scrapy.squeues.PickleLifoDiskQueue'
 SCHEDULER_MEMORY_QUEUE = 'scrapy.squeues.LifoMemoryQueue'
 
-S3_STORE_ACL = 'private'
-
 SPIDER_LOADER_CLASS = 'scrapy.spiderloader.SpiderLoader'
 
 SPIDER_MIDDLEWARES = {}
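
A quick sanity check one might run against a checkout that includes this commit, assuming Settings() still loads the defaults module (as it does in this codebase, where missing settings read as None):

from scrapy.settings import Settings

settings = Settings()
assert settings['FILES_STORE_S3_ACL'] == 'private'  # new name, same default
assert settings['S3_STORE_ACL'] is None             # old name no longer defined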
