Posted over 3 years ago by Elda Di Matteo
Hi,
I'm trying to deploy a Scrapy spider, but I get the following error when I run shub deploy:
(venv) (base) macbook@macbooks-MacBook-Pro gmaps % shub deploy
Packing version 1.0
Deploying to Scrapy Cloud project "566510"
Deploy log last 30 lines:
---> [Warning] Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
---> Running in 8f3fdff05c9e
Removing intermediate container 8f3fdff05c9e
---> 984c311877b2
Step 12/12 : ENV PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
---> [Warning] Your kernel does not support swap limit capabilities or the cgroup is not mounted. Memory limited without swap.
---> Running in e77c21d190e7
Removing intermediate container e77c21d190e7
---> 2d8edfd8dec0
Successfully built 2d8edfd8dec0
Successfully tagged i.scrapinghub.com/kumo_project/566510:8
Step 1/3 : FROM alpine:3.5
---> f80194ae2e0c
Step 2/3 : ADD kumo-entrypoint /kumo-entrypoint
---> Using cache
---> fbd31fea70b1
Step 3/3 : RUN chmod +x /kumo-entrypoint
---> Using cache
---> 3565017f646e
Successfully built 3565017f646e
Successfully tagged kumo-entrypoint:latest
Entrypoint container is created successfully
>>> Checking python dependencies
Collecting pip<20.0,>=9.0.3
WARNING: There're some errors when doing pip-check:
Could not find a version that satisfies the requirement pip<20.0,>=9.0.3 (from versions: )
No matching distribution found for pip<20.0,>=9.0.3
{"message": "Dependencies check exit code: 1", "details": "Pip checks failed, please fix the conflicts", "error": "requirements_error"}
{"status": "error", "message": "Requirements error"}
Deploy log location: /var/folders/z0/0f3bfffx6xx146gbvw_0wcmc0000gn/T/shub_deploy_4pbnv7n0.log
Error: Deploy failed: b'{"status": "error", "message": "Requirements error"}'
I'm using:
Python 3.9
Scrapy 2.5.1
scrapy-splash 0.8.0
pip 19.3.1
and the scrapinghub.yml file contains this:
project: 566510
How can I fix this?
Thanks!
0 Votes
nestor posted over 3 years ago Admin Best Answer
Hi,
Your project is running on a very old stack ("hworker:20160708"); you need to redeploy it with an updated stack.
Follow the instructions here: https://support.zyte.com/support/solutions/articles/22000200402-changing-the-deploy-environment-with-scrapy-cloud-stacks
You'll need to modify your scrapinghub.yml file to be something like this:
project: xxxxxx
stack: scrapy:x.x
Available stacks can be found here: https://github.com/scrapinghub/scrapinghub-stack-scrapy/tags
0 Votes
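For illustration, with the project ID from the question and a current stack tag, the whole scrapinghub.yml could look like the following (the scrapy:2.5 tag is just an example taken from that tags list, and the requirements file name is an assumption; adjust both to your setup):

project: 566510
stack: scrapy:2.5
requirements:
  file: requirements.txt

The requirements block is optional, but since the spider depends on scrapy-splash, referencing a requirements.txt this way lets Scrapy Cloud install that package at deploy time.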
4 Comments
Elda Di Matteo posted over 3 years ago
Oh, OK, thanks again!
It is working now!
I really appreciate your help!
0 Votes
nestor posted over 3 years ago Admin
In your scrapinghub.yml, change "stacks" to "stack" (no quotes).
Also, there is no 2.5.1 stack; you need to use one of the stack tags defined in this list: https://github.com/scrapinghub/scrapinghub-stack-scrapy/tags (the scrapy:2.5 stack uses Scrapy 2.5.1).
So it should be:
stack: scrapy:2.5
To see the stack used on each deploy, click on any of the deployments in the "Code & Deploys" section of your project.
Also, your organization was created before the date specified in the article I linked, so it is using that stack by default.
0 Votes
Elda Di Matteo posted over 3 years ago
Hi,
Thanks for your reply!
How do you know that my project is running on a very old stack?
I added this to the yml file:
https://monosnap.com/file/Wa42eD8RfeXvWOmXO4pg1eGX9js0XK
but now I'm getting this error:
(venv) (base) macbook@macbooks-MacBook-Pro gmaps % shub deploy
Traceback (most recent call last):
File "/Users/macbook/PycharmProjects/gmaps/venv/bin/shub", line 8, in <module>
sys.exit(cli())
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/click/core.py", line 764, in __call__
return self.main(*args, **kwargs)
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/click/core.py", line 717, in main
rv = self.invoke(ctx)
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/click/core.py", line 1137, in invoke
return _process_result(sub_ctx.command.invoke(sub_ctx))
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/click/core.py", line 956, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/click/core.py", line 555, in invoke
return callback(*args, **kwargs)
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/shub/deploy.py", line 70, in cli
conf, image = load_shub_config(), None
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/shub/config.py", line 507, in load_shub_config
conf.load_file(closest_sh_yml)
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/shub/config.py", line 133, in load_file
self.load(f)
File "/Users/macbook/PycharmProjects/gmaps/venv/lib/python3.9/site-packages/shub/config.py", line 87, in load
option_conf.update(yaml_option_conf)
ValueError: dictionary update sequence element #0 has length 1; 2 is required
Any idea?
Thanks!
0 Votes
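A note on the ValueError above: the traceback ends in shub's config.py at option_conf.update(yaml_option_conf), which is the error dict.update() raises when it is handed a plain string instead of a mapping. Assuming the file contained stacks: scrapy:2.5 (the plural key nestor points out above), shub would receive the string "scrapy:2.5" where it apparently expects a mapping of deploy targets to stacks, and the minimal Python sketch below reproduces the same failure:

# Hypothetical reproduction of the crash in the traceback above.
# Assumption: scrapinghub.yml contained `stacks: scrapy:2.5` (plural key),
# so shub got a string where it expected a mapping.
option_conf = {}                   # stands in for shub's per-option dict
yaml_option_conf = "scrapy:2.5"    # what YAML yields for `stacks: scrapy:2.5`

try:
    # dict.update() on a string iterates its characters and tries to unpack
    # each one as a (key, value) pair, which fails on the first character.
    option_conf.update(yaml_option_conf)
except ValueError as exc:
    print(exc)  # dictionary update sequence element #0 has length 1; 2 is required

Renaming the key to the singular stack, whose value is just a string, avoids that code path, which is presumably why nestor's one-word change fixes the crash.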