Why does Celery become synchronous and blocking after deploying to the server?

101 views | 4 replies
Author: lzjunika
I recently put together a small application:
Web service: gunicorn + Flask
Background service: Celery
The main job, file conversion and upload, is time-consuming, so it runs as a Celery background task.
Local (MacBook):
Tasks run asynchronously as expected, automatically spread across multiple workers with no blocking. This is the desired behavior:
[2023-08-29 16:37:14,279: INFO/MainProcess] Task pss.api.offce.put_content_to_obs[54e1286f-74c4-48d4-98e5-99937c65714c] received
[2023-08-29 16:37:14,417: INFO/MainProcess] Task pss.api.offce.put_content_to_obs[5b4d53fc-afc2-4e50-9b9c-905d5eddddde] received
[2023-08-29 16:37:14,486: INFO/MainProcess] Task pss.api.offce.put_content_to_obs[8fb95642-6900-4aaa-b666-11f98e3a0eea] received
[2023-08-29 16:37:14,531: INFO/MainProcess] Task pss.api.offce.put_content_to_obs[4df12aa2-458c-4946-8aad-3ed25e68c5e0] received
[2023-08-29 16:37:14,583: INFO/MainProcess] Task pss.api.offce.put_content_to_obs[93192a4d-7569-44c7-a09b-7035ea331901] received
[2023-08-29 16:37:14,618: INFO/MainProcess] Task pss.api.offce.put_content_to_obs[6897f139-bc7a-4b9f-aab8-22c6d7a07a85] received
[2023-08-29 16:37:14,660: INFO/MainProcess] Task pss.api.offce.put_content_to_obs[d301e702-accd-44b2-b85e-2f7d3c3a4e4f] received
[2023-08-29 16:37:14,690: WARNING/ForkPoolWorker-8] requestId:
[2023-08-29 16:37:14,693: WARNING/ForkPoolWorker-8] 0000018A4070950C5A0294A2CECAB8DF
[2023-08-29 16:37:14,701: WARNING/ForkPoolWorker-8] [2023-08-29 16:37:14,701] WARNING in offce: obs_upload_file:OK
[2023-08-29 16:37:14,701: WARNING/ForkPoolWorker-8] obs_upload_file:OK
[2023-08-29 16:37:14,702: WARNING/ForkPoolWorker-8] test_1.png
[2023-08-29 16:37:14,736: INFO/MainProcess] Task pss.api.offce.put_content_to_obs[42c63363-9528-4f59-9e21-b2816907141f] received
[2023-08-29 16:37:14,737: INFO/ForkPoolWorker-8] Task pss.api.offce.put_content_to_obs[54e1286f-74c4-48d4-98e5-99937c65714c] succeeded in 0.4250246670001161s: True
[2023-08-29 16:37:14,755: WARNING/ForkPoolWorker-1] requestId:
[2023-08-29 16:37:14,756: WARNING/ForkPoolWorker-1] 0000018A407095555502052E5A386783
[2023-08-29 16:37:14,763: WARNING/ForkPoolWorker-1] [2023-08-29 16:37:14,761] WARNING in offce: obs_upload_file:OK
[2023-08-29 16:37:14,761: WARNING/ForkPoolWorker-1] obs_upload_file:OK
[2023-08-29 16:37:14,767: WARNING/ForkPoolWorker-1] test_2.png
[2023-08-29 16:37:14,785: INFO/ForkPoolWorker-1] Task pss.api.offce.put_content_to_obs[5b4d53fc-afc2-4e50-9b9c-905d5eddddde] succeeded in 0.3451121250000142s: True
[2023-08-29 16:37:14,788: INFO/MainProcess] Task pss.api.offce.put_content_to_obs[835c3945-2396-4490-94d0-421298d1813f] received
[2023-08-29 16:37:14,890: WARNING/ForkPoolWorker-2] requestId:
[2023-08-29 16:37:14,891: WARNING/ForkPoolWorker-2] 0000018A407095E8540AE00FC334E409
[2023-08-29 16:37:14,892: WARNING/ForkPoolWorker-2] [2023-08-29 16:37:14,892] WARNING in offce: obs_upload_file:OK
[2023-08-29 16:37:14,892: WARNING/ForkPoolWorker-2] obs_upload_file:OK
[2023-08-29 16:37:14,893: WARNING/ForkPoolWorker-2] test_3.png
[2023-08-29 16:37:14,895: INFO/ForkPoolWorker-2] Task pss.api.offce.put_content_to_obs[8fb95642-6900-4aaa-b666-11f98e3a0eea] succeeded in 0.3848593749999054s: True
Server (CentOS 7.6):
Tasks only run synchronously: a single fixed worker handles everything and the service blocks:
[2023-08-29 16:25:58,664: INFO/MainProcess] Task pss.api.offce.put_content_to_obs[873c3f38-98b4-47cc-98e8-6f65a58c3269] received
[2023-08-29 16:25:58,733: WARNING/ForkPoolWorker-7] requestId:
[2023-08-29 16:25:58,734: WARNING/ForkPoolWorker-7] 0000018A406644C054084BB9021C6A9B
[2023-08-29 16:25:58,734: WARNING/ForkPoolWorker-7] [2023-08-29 16:25:58,734] WARNING in offce: obs_upload_file:OK
[2023-08-29 16:25:58,734: WARNING/ForkPoolWorker-7] obs_upload_file:OK
[2023-08-29 16:25:58,734: WARNING/ForkPoolWorker-7] test_8.png
[2023-08-29 16:25:58,735: INFO/ForkPoolWorker-7] Task pss.api.offce.put_content_to_obs[873c3f38-98b4-47cc-98e8-6f65a58c3269] succeeded in 0.07009824365377426s: True
[2023-08-29 16:26:00,287: INFO/MainProcess] Task pss.api.offce.put_content_to_obs[7a4868f2-305f-4f6b-992c-6ea0791f3427] received
[2023-08-29 16:26:00,370: WARNING/ForkPoolWorker-7] requestId:
[2023-08-29 16:26:00,370: WARNING/ForkPoolWorker-7] 0000018A40664B17550A56827D6506B2
[2023-08-29 16:26:00,371: WARNING/ForkPoolWorker-7] [2023-08-29 16:26:00,371] WARNING in offce: obs_upload_file:OK
[2023-08-29 16:26:00,371: WARNING/ForkPoolWorker-7] obs_upload_file:OK
[2023-08-29 16:26:00,371: WARNING/ForkPoolWorker-7] test_9.png
[2023-08-29 16:26:00,372: INFO/ForkPoolWorker-7] Task pss.api.offce.put_content_to_obs[7a4868f2-305f-4f6b-992c-6ea0791f3427] succeeded in 0.08343333378434181s: True
As the logs above show:
Locally it runs normally with three workers (ForkPoolWorker-8, ForkPoolWorker-1, and ForkPoolWorker-2) processing in parallel, and the service does not block.
On the remote server it only runs synchronously, with one fixed worker (ForkPoolWorker-7), and the service blocks.
Is some configuration wrong on the remote side? The startup banner on the server also looks the same as the local one:
-------------- celery@xg-003 v5.2.7 (dawn-chorus)
--- ***** -----
-- ******* ---- Linux-3.10.0-1062.1.2.el7.x86_64-x86_64-with-centos-7.7.1908-Core 2023-08-29 15:48:54
- *** --- * ---
- ** ---------- [config]
- ** ---------- .> app:         pss:0x7f348eaec160
- ** ---------- .> transport:   redis://***:6379/6
- ** ---------- .> results:     redis://***:6379/6
- *** --- * --- .> concurrency: 8 (prefork)
-- ******* ---- .> task events: OFF (enable -E to monitor tasks in this worker)
--- ***** -----
-------------- [queues]
                .> celery           exchange=celery(direct) key=celery
Could anyone advise whether something is misconfigured?
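For what it's worth, two Celery settings that often affect how prefork workers share out tasks are the prefetch multiplier and late acknowledgement. A configuration sketch; the setting names are real Celery options, the values shown are just one common choice for long-running tasks:

```python
# celeryconfig.py (or app.conf.update(...))
worker_prefetch_multiplier = 1   # don't let one worker reserve a whole batch of tasks
task_acks_late = True            # acknowledge after the task finishes, not on receipt

# A similar effect can be had at startup with the fair-scheduling optimization:
#   celery -A pss worker -O fair --concurrency=8
```

With the default prefetch (4 x concurrency), one child process can reserve several queued tasks at once, which can look like serialized execution from the outside.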


kkk9
Is Redis working properly?
lzjunika (OP)
Everything runs fine. Locally, since it is asynchronous, tasks finish very quickly.
On the server it just never runs asynchronously. I switched the worker pool from the default prefork to eventlet and gevent; neither helped.
Has anyone run into this?
lzjunika (OP)
@kkk9 Redis is fine; I even switched it to a local instance.
purensong
I think it's a parameter issue.