Issue
Suppose I wrote a docker-compose.dev.yml file to set up the development environment of a Flask project (a web application) using Docker. In docker-compose.dev.yml I have defined two services: one for the database and one that runs the Flask application in debug mode, which lets me make hot changes without having to recreate/restart the containers (a minimal sketch of such a file is shown at the end of this question). This makes it very easy for everyone on the development team to use the same development environment. However, there is a problem: while developing an application it is obviously necessary to install libraries and to list them in the requirements.txt file (in the case of Python). For this I only see two alternatives when using a Docker development environment:
- Enter the console of the container where the Flask application is running and use the pip install ... and pip freeze > requirements.txt commands.
- Manually write the dependencies to the requirements.txt file and rebuild the containers.
The first option is a bit laborious, while the second is a bit "dirty". Is there a more suitable option than these two alternatives?
Edit: I don't know if I'm asking something that doesn't make sense, but I'd appreciate it if someone could give me some guidance on what I'm trying to accomplish.
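For reference, a minimal sketch of the kind of docker-compose.dev.yml described in the question might look like the following; the Postgres image, service names, port, paths and the app module name are assumptions made for illustration only:
services:
  db:
    image: postgres:15              # assumed database engine
    environment:
      POSTGRES_PASSWORD: example    # placeholder credentials
  web:
    build: .
    working_dir: /code
    command: flask --app app run --debug --host 0.0.0.0
    volumes:
      - .:/code                     # bind mount enables hot reloading
    ports:
      - "5000:5000"
    depends_on:
      - db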
Solution
For something like this I use a multi-stage Docker build (one Dockerfile with several build stages).
Disclaimer: The examples below are not tested. Please consider them a mere description written in pseudo code ;)
As a very simple example, this approach could look like this:
# Make sure all layers are based on the same python version.
FROM python:3.10-slim-buster as base
# The actual dev/test image.
# This is where you can install additional dev/test requirements.
FROM base as test
COPY ./requirements_test.txt /code/requirements_test.txt
RUN python -m pip install --no-cache-dir --upgrade -r /code/requirements_test.txt
ENTRYPOINT ["python"]
# Assuming you run tests using pytest.
CMD ["-m", "pytest", "..."]
# The actual production image.
FROM base as runtime
COPY ./requirements.txt /code/requirements.txt
RUN python -m pip install --no-cache-dir --upgrade -r /code/requirements.txt
ENTRYPOINT ["python"]
# Assuming you want to run main.py as a script.
CMD ["/path/to/main.py"]
With a requirements.txt like this (just an example):
requests
With a requirements_test.txt like this (just an example):
-r requirements.txt
pytest
In your docker-compose.yml file you only need to pass the --target of the multi-stage Dockerfile (in this example: test or runtime), like this (not complete):
services:
  service:
    build:
      context: .
      dockerfile: ./Dockerfile
      target: runtime # or test for running tests
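For completeness, the same stages can also be selected with a plain docker build; the image tags below (myapp:test, myapp:latest) are made-up placeholders:
# Build only the test stage and run it (executes pytest via the stage's ENTRYPOINT/CMD).
docker build --target test -t myapp:test .
docker run --rm myapp:test
# Build the production stage.
docker build --target runtime -t myapp:latest .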
A final thought: As I mentioned in my comment, a much better approach for dealing with such dependency requirements might be using tools like poetry or pip-tools - or whatever else is out there.
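To sketch the pip-tools route (the file names requirements.in and requirements_test.in are assumptions, not part of the original answer): top-level dependencies go into .in files, and pip-compile generates the pinned requirements files that the Dockerfile above installs from.
# Declare only the top-level dependencies.
echo "requests" > requirements.in
printf -- "-c requirements.txt\npytest\n" > requirements_test.in
# Generate fully pinned requirements files.
pip install pip-tools
pip-compile requirements.in -o requirements.txt
pip-compile requirements_test.in -o requirements_test.txt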
Update 2022-05-23:
As mentioned in the comment, for the sake of completeness and because this approach might be close to a possible solution (as requested in the question):
An example for a fire-and-forget approach could look like this - assuming the container has a specific name (<container_name>):
# This requires the file 'requirements_dev.txt' to be mounted into the container as a volume.
docker exec -it <container_name> python -m pip install --upgrade -r requirements_dev.txt
This command simply installs new dependencies into the running container.
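If the newly installed packages should also end up in the requirements file on the host, the frozen list can be written back from the running container (a sketch, assuming the same container name):
# Write the container's installed packages back to the host's requirements file.
docker exec <container_name> python -m pip freeze > requirements.txt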
Answered By - rocksteady