
Celery worker in Docker won't get the correct message broker


I'm creating a Flask service using the app factory pattern, and I need Celery to run asynchronous tasks. I'm also using Docker and docker-compose to containerize and run everything. My project structure looks like this:

server
 |
 +-- manage.py
 +-- docker-compose.yml
 +-- requirements.txt
 +-- Dockerfile
 |
 +-- project
     |
     +-- __init__.py
     +-- api
         |
         +-- tasks.py

My tasks.py file looks like this:

from project import celery_app

@celery_app.task
def celery_check(test):
    print(test)
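
For context, a task registered this way would normally be queued from the Flask side with .delay() (or .apply_async()); a minimal sketch, assuming a hypothetical /ping route in project/api/views.py:

# Hypothetical route added to project/api/views.py, where twist_blueprint
# is already defined; the /ping path is an assumption for illustration.
from project.api.tasks import celery_check

@twist_blueprint.route('/ping')
def ping():
    # .delay() puts the task on the broker instead of running it inline
    celery_check.delay('hello from flask')
    return 'queued', 202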

I run it through manage.py, like this:

# manage.py

from flask_script import Manager
from project import create_app

app = create_app()
manager = Manager(app)

if __name__ == '__main__':
    manager.run()

My __init__.py looks like this:

# project/__init__.py

import os
import json
from flask_mongoalchemy import MongoAlchemy
from flask_cas import CAS
from flask import Flask
from itsdangerous import JSONWebSignatureSerializer as JWT
from flask_httpauth import HTTPTokenAuth
from celery import Celery

# instantiate the database and CAS
db = MongoAlchemy()
cas = CAS()

# Auth stuff (ReplaceMe is replaced below in create_app())
jwt = JWT("ReplaceMe")
auth = HTTPTokenAuth('Bearer')
celery_app = Celery(__name__, broker=os.environ.get("CELERY_BROKER_URL"))


def create_app():
    # instantiate the app
    app = Flask(__name__, template_folder='client/templates', static_folder='client/static')

    # set config
    app_settings = os.getenv('APP_SETTINGS')
    app.config.from_object(app_settings)

    # Send new static files every time if debug is enabled
    if app.debug:
        app.config['SEND_FILE_MAX_AGE_DEFAULT'] = 0

    # Get the secret keys
    parse_secret(app.config['CONFIG_FILE'], app)

    celery_app.conf.update(app.config)
    print(celery_app.conf)

    # set up extensions
    db.init_app(app)
    cas.init_app(app)
    # Replace the secret key with the app's
    jwt.secret_key = app.config["SECRET_KEY"]

    parse_config(app.config['CONFIG_FILE'])

    # register blueprints
    from project.api.views import twist_blueprint
    app.register_blueprint(twist_blueprint)

    return app
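
Note that the broker is resolved from the environment once, at import time, when Celery(__name__, broker=...) runs; celery_app.conf.update(app.config) only affects processes that actually call create_app(). A minimal sketch of making that override explicit inside create_app(), assuming the Flask config carries the same CELERY_* keys as the environment:

# Hypothetical lines for create_app(), after app.config is loaded:
# re-point the already-created Celery instance at the configured broker
# and result backend, keeping the import-time values as fallbacks.
celery_app.conf.broker_url = app.config.get(
    'CELERY_BROKER_URL', celery_app.conf.broker_url)
celery_app.conf.result_backend = app.config.get(
    'CELERY_RESULT_BACKEND', celery_app.conf.result_backend)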

In my docker-compose file I start a worker and define some environment variables, like this:

version: '2.1'

services:
  twist-service:
    container_name: twist-service
    build: .
    volumes:
      - '.:/usr/src/app'
    ports:
      - 5001:5000 # expose ports - HOST:CONTAINER
    environment:
      - APP_SETTINGS=project.config.DevelopmentConfig
      - DATABASE_NAME_TESTING=testing
      - DATABASE_NAME_DEV=dev
      - DATABASE_URL=twist-database
      - CONFIG_FILE=./project/default_config.json
      - MONGO_PASSWORD=user
      - CELERY_RESULT_BACKEND=redis://redis:6379
      - CELERY_BROKER_URL=redis://redis:6379/0
      - MONGO_PORT=27017
    depends_on:
      - celery
      - twist-database
  celery:
    container_name: celery
    build: .
    command: celery -A project.api.tasks --loglevel=debug worker
    volumes:
      - '.:/usr/src/app'
  twist-database:
    image: mongo:latest
    container_name: "twist-database"
    environment:
      - MONGO_DATA_DIR=/data/db
      - MONGO_USER=mongo
    volumes:
      - /data/db
    ports:
      - 27017:27017  # expose ports - HOST:CONTAINER
    command: mongod
  redis:
    image: "redis:alpine"
    command: redis-server
    volumes:
      - '/redis'
    ports:
      - '6379:6379'

However, when I run the docker-compose file and the containers come up, I end up with this in the celery worker's log:

[2017-07-20 16:53:06,721: ERROR/MainProcess] consumer: Cannot connect to amqp://guest:**@127.0.0.1:5672//: [Errno 111] Connection refused.

This means the worker is ignoring the redis configuration set when Celery was created and is trying to use RabbitMQ instead. I've already tried changing project.api.tasks to project and to project.celery_app, to no avail.
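
The fallback is easier to see in isolation: the worker container only executes the module-level code in project/__init__.py, so the broker it connects to is exactly whatever os.environ.get('CELERY_BROKER_URL') returns inside that container. A minimal sketch of that resolution, assuming the variable is unset there:

# Standalone sketch of the module-level wiring from project/__init__.py
import os
from celery import Celery

broker_url = os.environ.get('CELERY_BROKER_URL')  # None when the variable is unset
celery_app = Celery(__name__, broker=broker_url)

# With broker=None, Celery falls back to its default amqp transport on
# localhost:5672, which matches the "Connection refused" error above.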

2 Answers

  • 1

    It looks to me like the celery service should also be given the CELERY_RESULT_BACKEND and CELERY_BROKER_URL environment variables; see the compose sketch after these answers.

  • 0

    You need to link your docker services together. The most direct mechanism is to add a networks section to your docker-compose file.
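
A minimal docker-compose sketch of the first answer's suggestion, assuming the celery service just needs the same redis settings that twist-service already defines (only that service is shown; the depends_on entry is an extra assumption):

  # Hypothetical celery service with the broker/backend variables added
  celery:
    container_name: celery
    build: .
    command: celery -A project.api.tasks --loglevel=debug worker
    volumes:
      - '.:/usr/src/app'
    environment:
      - CELERY_RESULT_BACKEND=redis://redis:6379
      - CELERY_BROKER_URL=redis://redis:6379/0
    depends_on:
      - redis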
