SQL script in my Dockerfile is skipped when I attach the deployment to a persistent volume

I create a new image based on the "mcr.microsoft.com/mssql/server" image, and within the Dockerfile I run a script that creates a new database with some tables and seeded data.

FROM mcr.microsoft.com/mssql/server
USER root

# CreateDb
COPY ./CreateDatabaseSchema.sql /opt/scripts/

ENV ACCEPT_EULA=true
ENV MSSQL_SA_PASSWORD=myP@ssword#1

# Create database
RUN /opt/mssql/bin/sqlservr & sleep 60; /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P ${MSSQL_SA_PASSWORD} -d master -i /opt/scripts/CreateDatabaseSchema.sql

I can see the database created by my script if I don't attach a persistent volume, and I DO NOT see the new database if I do attach one. I checked the logs and don't see any error; it looks like the system skips processing that file. What might cause the environment to skip processing the SQL script defined in the Dockerfile?

thanks,

Austin



Solution 1:[1]

The problem with using a persistent volume is that mounting it at /var/opt/mssql hides the data that was baked into the image at build time, so the database created by the Dockerfile's RUN step is no longer visible. The database has to be created after the volume is mounted, i.e. at container startup rather than at image build time.

volumeMounts:
  - name: mssql-data   # must match a volume defined in the pod spec; the name is illustrative
    mountPath: /var/opt/mssql
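One way to do that, as a minimal sketch rather than a drop-in fix: move the sqlcmd call out of the build-time RUN instruction and into an entrypoint script that runs every time the container starts, after the volume is already mounted. The file name entrypoint.sh is an assumption, and the SQL script should be idempotent (e.g. guarded with IF DB_ID('MyDatabase') IS NULL) so re-running it against an already-populated volume is harmless.

entrypoint.sh:

#!/bin/bash
# Start SQL Server in the background.
/opt/mssql/bin/sqlservr &
pid=$!

# Wait (up to ~60s) until SQL Server accepts connections, then apply the schema.
for i in $(seq 1 60); do
    if /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P "$MSSQL_SA_PASSWORD" -Q "SELECT 1" > /dev/null 2>&1; then
        /opt/mssql-tools/bin/sqlcmd -S localhost -U SA -P "$MSSQL_SA_PASSWORD" -d master -i /opt/scripts/CreateDatabaseSchema.sql
        break
    fi
    sleep 1
done

# Keep SQL Server in the foreground so the container stays up.
wait $pid

In the Dockerfile, the build-time RUN ... sqlcmd ... step would then be replaced with:

COPY ./entrypoint.sh /opt/scripts/entrypoint.sh
RUN chmod +x /opt/scripts/entrypoint.sh
CMD ["/bin/bash", "/opt/scripts/entrypoint.sh"]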

Solution 2:[2]

You can use a docker-compose.yml file together with the Dockerfile; the two work in tandem.

version: '3.9'

services:
  mysqlserver:
    build:
      context: ..
      dockerfile: Dockerfile
    restart: always
    volumes:
      - make/my/db/persistent:/var/opt/mssql

Then you can run it with:

docker-compose -f docker-compose.yml up
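If you prefer a Docker-managed named volume over a host path, a variant might look like this (the volume name mssql-data is illustrative). One detail worth knowing: Docker copies the image's existing content at /var/opt/mssql into an empty named volume on first use, whereas a host-path bind mount or a Kubernetes persistent volume simply hides it, which is why the startup-time approach from Solution 1 is the more portable fix.

version: '3.9'

services:
  mysqlserver:
    build:
      context: ..
      dockerfile: Dockerfile
    restart: always
    volumes:
      - mssql-data:/var/opt/mssql

volumes:
  mssql-data: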

Have fun

Sources

This article follows the attribution requirements of Stack Overflow and is licensed under CC BY-SA 3.0.

Source: Stack Overflow

Solution Source
Solution 1
Solution 2