First steps for configuring SQL Server with Docker

Sometimes our working habits go through interesting changes, whether forced by real needs or simply driven by the desire to explore new ways of doing things.  In the past, I used to install SQL Server Developer Edition directly on my laptop.  Over time I also had to add other databases, such as PostgreSQL or MySQL, mainly for proofs of concept.  I realized how much time and how many resources I was spending on the conventional process of installing new database versions from scratch, which also consumed valuable resources on my laptop.  At that point, I was convinced it was time to replace my old way of working with databases in my development environment.

The right combination (at least for me) is a mix of Docker and a dedicated place for the volumes where I want my databases to persist.  In the case of SQL Server, we are going to use the SQL Server on Linux images.

For this article we are going to create volumes, mainly to preserve our data.  Remember that the data inside the database should not live only inside the Docker container.  Why? For several reasons, mainly the volatile nature of containers.  Taking this into account, we will create volumes that persist our databases into target directories; this way, even if we destroy the containers accidentally or deliberately, we can recreate them and recover our databases from the previously configured volumes.

Prerequisites

You need the latest version of Docker Desktop.  You can find more information about how to install it at the following link:

https://docs.docker.com/desktop/windows/install/

Follow the instructions in the previous link, which include enabling the WSL 2 backend. My setup uses Windows 11.

Configuring our SQL Server through Docker container

At this point, we are ready to start.  The first thing to do is locate the image repository for SQL Server.  You can find the available images at the following link:

https://hub.docker.com/_/microsoft-mssql-server
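If you want to download the image in advance, you can pull it explicitly; here I use the 2022 tag that we will reference later in the docker run command:

docker pull mcr.microsoft.com/mssql/server:2022-latest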

Once you have picked the desired version, prepare the directories in Windows that will act as volumes and where the databases will be stored.  In my case, I created the directories data, log, and secrets under the path D:/SqlVolume/.
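As a quick sketch (assuming the D:\SqlVolume root used above), the three directories can be created from PowerShell in one command:

New-Item -ItemType Directory -Force -Path D:\SqlVolume\data, D:\SqlVolume\log, D:\SqlVolume\secrets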

Docker commands can be executed through the Docker Desktop application or, as in my case, from PowerShell.  For this purpose, open PowerShell with administrator permissions and run a command to verify that Docker is healthy and ready to start.
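For example, either of the following commands is a quick way to confirm that the Docker engine is responding (the exact output depends on your installation):

docker version
docker info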

At this point, I will share, and then explain section by section, the docker command for creating a new container running SQL Server.

docker run --name sql2022data -e "ACCEPT_EULA=Y" -e "SA_PASSWORD=YourStrongPa$$w0rdHere" -p 1433:1433 -v D:/SqlVolume/data:/var/opt/mssql/data/ -v D:/SqlVolume/log:/var/opt/mssql/log/ -v D:/SqlVolume/secrets:/var/opt/mssql/secrets/ -d mcr.microsoft.com/mssql/server:2022-latest

After executing the previous command, you can open Docker Desktop and verify that the container is running as expected.

Additionally, if you go to PowerShell and type docker ps, you can list the existing containers and their status.
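Beyond docker ps, a useful complementary check is to look at the container logs, where the SQL Server startup messages appear:

docker ps --filter "name=sql2022data"   # show only our container
docker logs sql2022data                  # SQL Server startup messages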

Let me break the docker command into pieces and explain each one.

docker run: It allows you to run a command in a new container. 

--name: This parameter sets the name of the new container.

-e: This parameter allows us to set environment variables.  In our case, we are setting two variables: one to confirm acceptance of the End-User License Agreement and the other to define the password for the sa (system administrator) login.

-p: This parameter publishes a port; for simplicity, we are binding container port 1433 (the SQL Server default port) to the same TCP port on our host machine.

-v: With this parameter, we mount a volume, linking the data/log/secrets folders we prepared earlier in Windows to the corresponding SQL Server directories inside the container (remember that we are using a Linux SQL Server image).

-d: Finally, this parameter indicates that the container will run detached, in the background, without being attached to any input or output stream.
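As a side note, the same docker run command can be split across lines in PowerShell with the backtick continuation character; using single quotes around the environment variables also keeps PowerShell from trying to expand the $ characters in the password:

docker run --name sql2022data `
    -e 'ACCEPT_EULA=Y' `
    -e 'SA_PASSWORD=YourStrongPa$$w0rdHere' `
    -p 1433:1433 `
    -v D:/SqlVolume/data:/var/opt/mssql/data/ `
    -v D:/SqlVolume/log:/var/opt/mssql/log/ `
    -v D:/SqlVolume/secrets:/var/opt/mssql/secrets/ `
    -d mcr.microsoft.com/mssql/server:2022-latest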

Connecting to our MS SQL Server on Linux for Docker Engine

At this point, we are ready to connect to the new SQL Server instance running through Docker.  Open SSMS and click the Connect button.  In Server name, simply type a period (.); the port is not required because we are using the default port (1433), otherwise you would append the port you defined.  Fill in the Login (sa) and the password that we used in the docker run command.
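If you prefer the command line, or just want a quick connectivity test before opening SSMS, something like the following also works, assuming the sqlcmd utility is installed on your host (replace the password with the one you chose):

sqlcmd -S localhost,1433 -U sa -P 'YourStrongPa$$w0rdHere' -Q "SELECT @@VERSION"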

We have our SQL Server ready, and we are already connected.  What else do we need? Well, to complete this article, I would like to show you how to restore a database from a backup.

Restoring a backup into our new MS SQL Server on Linux for Docker Engine

What is the difference between restoring a database on SQL Server running in Docker and on a conventional SQL Server installation? In reality there is no difference, but in practical terms you can run into some small issues.

We start by downloading the backup WideWorldImporters-Full.bak from the GitHub repository.  The next step consists of copying this backup file to the data directory; in our case, the volume is defined as D:/SqlVolume/data:/var/opt/mssql/data/.
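Because the data directory is bind-mounted, copying the file into D:\SqlVolume\data on the host is enough; alternatively, docker cp achieves the same result.  For example, assuming the backup was downloaded to your current directory:

Copy-Item .\WideWorldImporters-Full.bak D:\SqlVolume\data\
# or, equivalently
docker cp .\WideWorldImporters-Full.bak sql2022data:/var/opt/mssql/data/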

Open a new query in SSMS and execute this T-SQL statement to identify the details of the backup.

RESTORE FILELISTONLY FROM DISK=N'/var/opt/mssql/data/WideWorldImporters-Full.bak'
GO 

As you can observe, the output shows a group of physical names pointing to a specific (Windows) directory.  It is recommended that you adjust the mdf, ndf, and ldf files to the directories already defined inside the container (remember we have a Linux container).

The following statement allows us to restore the backup.

RESTORE DATABASE WideWorldImporters FROM DISK=N'/var/opt/mssql/data/WideWorldImporters-Full.bak' WITH REPLACE,
MOVE 'WWI_Primary' to '/var/opt/mssql/data/WideWorldImporters.mdf',  
MOVE 'WWI_UserData' to '/var/opt/mssql/data/WideWorldImporters_UserData.ndf',  
MOVE 'WWI_Log' to '/var/opt/mssql/log/WideWorldImporters.ldf',  
MOVE 'WWI_InMemory_Data_1' to '/var/opt/mssql/data/WideWorldImporters_InMemory_Data_1'
GO

And here is the issue that I found.  The restore will probably fail with an operating-system-level error when SQL Server tries to access the backup and create the database files.

After some googling, I found a workaround that worked.  It consists of two small changes.  The first one is related to the permissions of the copied backup file, which had the root user as its owner.  So, we are going to change the owner to mssql with this command:

docker exec -u 0 sql2022data bash -c "chown mssql /var/opt/mssql/data/WideWorldImporters-Full.bak"

This command connects to the sql2022data container as root (-u 0), starts bash, and supplies a command, in this case chown, which allows us to change the owner of a file or directory.  The second step of the workaround consists of executing a couple of bash commands inside the container, specifically the creation of new empty files that match the names and extensions of the files contained in the backup and listed previously with the FILELISTONLY command.

We have to enter bash mode inside the container:

docker exec -it sql2022data /bin/bash

You will notice that the prompt shows the user as mssql, and the string after the @ is the container ID that you can also see in Docker.  The next commands will create the empty files.
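For illustration, and assuming the physical names match the MOVE targets used in the RESTORE statement above, the commands inside the container would look like this:

touch /var/opt/mssql/data/WideWorldImporters.mdf
touch /var/opt/mssql/data/WideWorldImporters_UserData.ndf
touch /var/opt/mssql/log/WideWorldImporters.ldf
# the in-memory filegroup target is a directory container; create it too if the restore still complains
mkdir -p /var/opt/mssql/data/WideWorldImporters_InMemory_Data_1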

Note: Remember to create the log file in the log directory.

Once you have completed this step, you can finally retry the restore, and it should work as expected.

Finally, we have our database restored successfully.

I hope this article is useful for you, considering the great benefit associated with the use of Docker, especially for quickly setting up a new development environment.  Finally, it is worth recalling that, thanks to our volumes, even if we destroy the container, we preserve our databases for future containers and images.  Happy coding!
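As an illustration of that last point, destroying and recreating the container while keeping the data would look like this (the second step simply repeats the original docker run command shown earlier):

docker rm -f sql2022data
# re-run the same docker run command from earlier; the new container reattaches to the
# existing D:/SqlVolume directories, so the databases (including WideWorldImporters)
# should be available again once SQL Server starts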
