No module named 'dotenv' inside Docker container when building monorepo project with Turborepo

I am encountering an issue while building a monorepo project that uses Turborepo. The problem arises when attempting to build a Docker image for a Python app named "sync-sheet." The error I receive during the Docker build is as follows:

Traceback (most recent call last):
  File "/app/services/sheet-sync/main.py", line 1, in <module>
    from src.reads.supplier_info import build as build_suppliers
  File "/app/services/sheet-sync/src/reads/supplier_info.py", line 1, in <module>
    from src.reads.read_data_from_sheet import read_data_from_sheet
  File "/app/services/sheet-sync/src/reads/read_data_from_sheet.py", line 1, in <module>
    from src.config.google_config import sheets
  File "/app/services/sheet-sync/src/config/google_config.py", line 2, in <module>
    from dotenv import load_dotenv
ModuleNotFoundError: No module named 'dotenv'
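
For context, the failing import sits at the top of src/config/google_config.py. Only line 2 (the dotenv import) is confirmed by the traceback above; the rest of the snippet below is just a sketch of how the file starts, with placeholder names:

# src/config/google_config.py (sketch -- only the dotenv import on line 2 is
# confirmed by the traceback; the variable names are placeholders)
import os
from dotenv import load_dotenv

load_dotenv()  # reads values from the environment file (googleSheetENV)

SPREADSHEET_ID = os.getenv("SPREADSHEET_ID")  # placeholder variable name
# `sheets` (the client imported by read_data_from_sheet.py) is built further
# down in the real file and is omitted here.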

Here is the project structure:

.
├── README.md
├── docker-compose.yml
├── node_modules
├── package.json
├── apps
│   └── sync-sheet
│       ├── Dockerfile
│       ├── Makefile
│       ├── creds.json
│       ├── googleSheetENV
│       ├── main.py
│       ├── package.json
│       ├── requirements.txt
│       └── src
├── packages
├── tsconfig.json
├── turbo.json
└── yarn.lock

The Dockerfile for the "sync-sheet" app:

FROM node:19.9.0-alpine AS base

# adding apk deps to avoid node-gyp related errors and some other stuff. adds turborepo globally
RUN apk add -f --update --no-cache --virtual .gyp nano bash libc6-compat g++ \
    && yarn global add turbo \
    && apk del .gyp


RUN apk --no-cache add make python3 py3-pip


#############################################
FROM base AS pruned
WORKDIR /app
ARG APP

COPY . .

# see https://turbo.build/repo/docs/reference/command-line-reference#turbo-prune---scopetarget
RUN turbo prune --scope=sheet_sync --docker

#############################################
FROM base AS installer
WORKDIR /app
ARG APP

COPY --from=pruned /app/out/json/ .
COPY --from=pruned /app/out/yarn.lock /app/yarn.lock

# Forces the layer to recreate if the app's package.json changes
COPY services/sheet-sync/package.json /app/services/sheet-sync/package.json

# see https://github.com/moby/buildkit/blob/master/frontend/dockerfile/docs/reference.md#run---mounttypecache
RUN \
    --mount=type=cache,target=/usr/local/share/.cache/yarn/v6,sharing=locked \
    yarn --prefer-offline --frozen-lockfile

COPY --from=pruned /app/out/full/ .
COPY turbo.json turbo.json


# For example: `--filter=frontend^...` means all of frontend's dependencies will be built, but not the frontend app itself (which we don't need to do for dev environment)
RUN turbo run build --no-cache --filter=sheet_sync^...

# re-running yarn ensures that dependencies between workspaces are linked correctly
RUN \
    --mount=type=cache,target=/usr/local/share/.cache/yarn/v6,sharing=locked \
    yarn --prefer-offline --frozen-lockfile



#############################################
FROM base AS runner
WORKDIR /app


COPY --from=installer /app .

CMD yarn workspace sheet_sync start

The requirements.txt file for the "sync-sheet" app:

# ... (other dependencies)
python-dotenv==1.0.0

Makefile:

.PHONY: test
test:
	pytest -svv -s

.PHONY: build
build:
	python3 -m pip install -U -r requirements.txt

.PHONY: debug
debug:
	python3 -m pdb main.py

.PHONY: run
run:
	python3 main.py

The package.json inside the "sheet-sync" app:

{
    "name": "sheet_sync",
    "version": "0.0.1",
    "description": "",
    "main": "index.js",
    "author": "Roni Jack Vituli",
    "license": "MIT",
    "scripts": {
      "start": "make run",
      "debug": "make debug",
      "build": "make build",
    }
}

Despite listing 'python-dotenv' in requirements.txt, and the package importing correctly in my local development environment, I still hit the "No module named 'dotenv'" error when building the Docker container. Any insight into how to resolve this would be greatly appreciated!
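
For what it's worth, here is a small diagnostic I would drop next to main.py and run inside the container to see which interpreter is used and whether python-dotenv is visible to it (the script name and the docker invocation are just examples, not part of the repo):

# check_env.py -- hypothetical diagnostic script, not part of the repo.
# Example run (image name is a placeholder):
#   docker run --rm <image> python3 /app/services/sheet-sync/check_env.py
import importlib.util
import sys

print("interpreter:", sys.executable)
print("sys.path:")
for path in sys.path:
    print("   ", path)

spec = importlib.util.find_spec("dotenv")
print("dotenv found at:", spec.origin if spec else "NOT FOUND")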
