Multiple Rasa containers with docker-compose (part 2)

Dockerfiles for our other containers

Let's start with the action servers...

The Rasa action server normally runs on port 5055. If we wanted to start two action servers directly on the same machine, we would need to give each instance a different port, but since they will run in separate containers, there's no need. If we want to reach them from the host, though, we do have to expose them on different host ports. We can do that in the docker-compose.yml file.
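
For example, the mapping for the first action server might look something like this in docker-compose.yml (host port on the left of the colon, container port on the right):

  action_server_a:
    ports:
      - 5056:5055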

cd into the chatbot_a/actions folder and create this Dockerfile:

FROM sscudder/rasabase

# Set up the pt_BR locale so the actions handle accented text correctly
RUN apt-get update && apt-get -y install locales locales-all
RUN locale-gen pt_BR.UTF-8
ENV LANG pt_BR.UTF-8
ENV LANGUAGE pt_BR:pt
ENV LC_ALL pt_BR.UTF-8

# Copy the action code and its config into the image
COPY /actions/actions.ini /actions/actions.ini
COPY /actions/actions.py /actions/actions.py

WORKDIR /actions

The Dockerfile from chatbot_b/actions will be exactly the same.

Moving on to the Dockerfile for chatbot_a:

FROM sscudder/rasabase

RUN mkdir chatbot_a

COPY ./chatbot_a /chatbot_a

WORKDIR /chatbot_a

And chatbot_b:

FROM sscudder/rasabase

RUN mkdir chatbot_b

COPY ./chatbot_b /chatbot_b

WORKDIR /chatbot_b

Putting it all together with docker-compose

Next we'll wire all of our containers together in docker-compose.yml.

The docker-compose.yml has three sections: the version, then a services section where we define all of our containers, and finally a networks section.
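
As a bare skeleton (the full file comes further down), it looks something like this:

version: '3.0'

services:
  # ... one entry per container ...

networks:
  # ... the shared virtual network ...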

For each container (or "service"), we give it a name, then say where the build context and Dockerfile are, which ports we want to expose, how host folders should be mounted inside the container, what command to run when the container comes up, and (very important) which virtual network the container should join. We'll use a network called "chatbot" (very original!), and every container that needs to see the others has to be on that same network.

For an action server it's like this:

services:

  # =============================== actions A =================================
  # Actions for chatbot_a
  action_server_a:
    build:
      context: ./chatbot_a
      dockerfile: ./actions/Dockerfile
    ports:
      - 5056:5055
    volumes:
      - ./chatbot_a/actions:/app/actions
    command: bash -c "rasa run actions --actions actions -vv"
    networks:
      - chatbot

action_server_a is the name of the service. (The name of the folder we are in is prepended and the instance number is appended, so if you run docker-compose ps afterwards you'll see the container is called app_action_server_a_1.)

The ports section shows which ports in the "real" world are mapped to which ports in the container. In this case, port 5056 on the host machine is mapped to port 5055 (the default action server port) in the container. This lets us reach the action servers from the host on different ports if need be.

In the action server for chatbot_b, use a host port different from 5056 to avoid conflicts, such as 5057.
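
Once everything is up, a quick sanity check from the host is to hit each action server's health endpoint (assuming the rasa-sdk version in the base image serves /health, which recent versions do):

curl http://localhost:5056/health   # action_server_a
curl http://localhost:5057/health   # action_server_b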

In the volumes section, the folder chatbot_a/actions is mapped to /app/actions in the container.

The command section tells Docker what to run when the container starts.

The networks section has to point to a common network that all the containers will join, and it uses the same name as the top-level networks section of docker-compose.yml.

Now that we know more or less how the services need to be set up, create this docker-compose.yml file in the app directory:

version: '3.0'
services:

  # =============================== chatbot A ===================================
  # Rasa image for chatbot_a
  rasa_a:
    build:
      context: .
      dockerfile: ./chatbot_a/Dockerfile
    ports:
      - 5007:5005
    volumes:
      - ./chatbot_a:/chatbot_a
      - ./models_a:/chatbot_a/models
    command: bash -c "ls -ls models && rasa run -m models -vv --endpoints endpoints.yml --enable-api"
    networks:
      - chatbot
    depends_on:
      - action_server_a

  # =============================== actions A =================================
  # Actions for chatbot_a
  action_server_a:
    build:
      context: ./chatbot_a
      dockerfile: ./actions/Dockerfile
    ports:
      - 5056:5055
    volumes:
      - ./chatbot_a/actions:/app/actions
    command: bash -c "rasa run actions --actions actions -vv"
    networks:
      - chatbot

  # =============================== chatbot B ===================================
  # Rasa image for chatbot_b
  rasa_b:
    build:
      context: .
      dockerfile: ./chatbot_b/Dockerfile
    ports:
      - 5008:5005
    volumes:
      - ./chatbot_b:/chatbot_b
      - ./models_b:/chatbot_b/models
    command: bash -c "ls -ls models && rasa run -m models -vv --endpoints endpoints.yml --enable-api"
    networks:
      - chatbot
    depends_on:
      - action_server_b

  # =============================== actions B =================================
  # Actions for chatbot_b
  action_server_b:
    build:
      context: ./chatbot_b
      dockerfile: ./actions/Dockerfile
    ports:
      - 5057:5055
    volumes:
      - ./chatbot_b/actions:/app/actions
    command: bash -c "rasa run actions --actions actions -vv"
    networks:
      - chatbot


  # =============================== mongoDB =================================
  # Responsible for storing all the data.
  mongo:
    image: mongo
    container_name: "mongo"
    ports:
      - 17017:27017
    volumes:
      - ./mongo-volume:/data/db
    networks:
      - chatbot

networks:
  chatbot:
    driver: bridge
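
With the file saved, build and start everything from the app directory (the host ports 5008 and 5057 for chatbot_b are just my picks; any free ports will do):

docker-compose build
docker-compose up -d
docker-compose ps

docker-compose ps should list the two Rasa services, the two action servers (with the app_ prefix and _1 suffix mentioned above) and the mongo container.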

Don't forget that the chatbots shouldn't use the same tracker store!

Imagine how a chatbot would track conversations if events from two different domains were all mixed together... Rasa uses a sender_id to track conversations, so if the same sender_id turned up in both chatbots, chaos would ensue.

To get around this, just change the database for each chatbot in the endpoints.yml file like below:

endpoints.yml for chatbot_a:

...

tracker_store:
   type: mongod
   url: mongodb://mongo:27017
   db: rasa_a

...

endpoints.yml for chatbot_b:

...

tracker_store:
   type: mongod
   url: mongodb://mongo:27017
   db: rasa_b

...
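
To confirm that each chatbot really is writing to its own database, talk to both bots and then list the databases inside the mongo container (newer mongo images ship mongosh; older ones use the mongo shell instead):

docker exec -it mongo mongosh --quiet --eval "db.adminCommand('listDatabases')"

Both rasa_a and rasa_b should show up once each bot has handled at least one message.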
