Testing with Docker: A Guide to Test Containers

Ever found yourself struggling with inconsistent test environments or spending hours setting up databases for testing? Docker test containers might be the solution you've been looking for. In this post, we'll explore how to use test containers to make your testing process more reliable and efficient.
What are Test Containers?
Test containers are lightweight, throwaway instances of your test dependencies (like databases, message queues, or other services) that can be quickly spun up for testing and torn down afterward. They're particularly useful for integration testing, where you need to test your application's interaction with external services.
Why Use Test Containers?
Consistency: Every developer and CI pipeline gets the exact same environment
Isolation: Each test suite can have its own fresh instance of dependencies
Speed: Containers start up quickly and can be parallelized
Cleanup: No leftover test data or state between runs
Getting Started
Let's create a simple example using a Node.js application with PostgreSQL. We'll use the testcontainers npm package.
First, install the necessary dependencies:
npm install --save-dev testcontainers jest
Here's our example application code:
// db.js
const { Pool } = require('pg');

class UserRepository {
  constructor(connectionString) {
    this.pool = new Pool({ connectionString });
  }

  async createUser(name, email) {
    const result = await this.pool.query(
      'INSERT INTO users(name, email) VALUES($1, $2) RETURNING id',
      [name, email]
    );
    return result.rows[0].id;
  }

  async getUserById(id) {
    const result = await this.pool.query(
      'SELECT * FROM users WHERE id = $1',
      [id]
    );
    return result.rows[0];
  }
}

module.exports = UserRepository;
Now, let's write our test using test containers:
// db.test.js
const { GenericContainer } = require('testcontainers');
const UserRepository = require('./db');

describe('UserRepository', () => {
  let container;
  let userRepository;

  beforeAll(async () => {
    // Start PostgreSQL container
    container = await new GenericContainer('postgres:13')
      .withExposedPorts(5432)
      .withEnv('POSTGRES_USER', 'test')
      .withEnv('POSTGRES_PASSWORD', 'test')
      .withEnv('POSTGRES_DB', 'testdb')
      .start();

    // Create connection string from the mapped host and port
    const connectionString = `postgresql://test:test@${container.getHost()}:${container.getMappedPort(5432)}/testdb`;
    userRepository = new UserRepository(connectionString);

    // Initialize database schema
    await userRepository.pool.query(`
      CREATE TABLE users (
        id SERIAL PRIMARY KEY,
        name VARCHAR(100),
        email VARCHAR(100)
      )
    `);
  });

  afterAll(async () => {
    await userRepository.pool.end();
    await container.stop();
  });

  it('should create and retrieve a user', async () => {
    const userId = await userRepository.createUser('John Doe', 'john@example.com');
    const user = await userRepository.getUserById(userId);
    expect(user).toEqual({
      id: userId,
      name: 'John Doe',
      email: 'john@example.com'
    });
  });
});
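The beforeAll/afterAll wiring above tends to be repeated in every suite. One way to cut the duplication is a tiny lifecycle helper; this is an illustrative sketch (the name `containerLifecycle` and its shape are my own, not part of the testcontainers API):

```javascript
// Wraps a start/stop pair so suites can share one setup convention.
// `start` should return the started resource; `stop` receives it back.
function containerLifecycle(start, stop) {
  let resource;
  return {
    setup: async () => {
      resource = await start();
      return resource;
    },
    teardown: async () => {
      if (resource) await stop(resource);
    },
  };
}

module.exports = containerLifecycle;
```

A suite would then call `lc.setup()` in beforeAll and `lc.teardown()` in afterAll, and the stop logic runs even if several suites share the helper.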
Best Practices
- Container Management: Always clean up your containers after tests:

afterAll(async () => {
  await container.stop();
});
- Performance Optimization
Use container reuse when possible
Run containers in parallel for different test suites
Consider using container networking for complex scenarios
- Configuration Management: Store container configuration in a central place:

// testConfig.js
module.exports = {
  postgres: {
    image: 'postgres:13',
    user: 'test',
    password: 'test',
    database: 'testdb'
  }
};

Advanced Scenarios
Custom Images
Sometimes you need a specialized container. Here's how to build one:

# Dockerfile.test
FROM postgres:13
COPY ./init.sql /docker-entrypoint-initdb.d/
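The `init.sql` script copied into the image above is not shown; as an illustration only, a minimal version mirroring the schema the tests create by hand earlier might look like this:

```sql
-- init.sql (hypothetical): baked into the custom image so the
-- database comes up with the users table already in place.
CREATE TABLE users (
  id SERIAL PRIMARY KEY,
  name VARCHAR(100),
  email VARCHAR(100)
);
```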
// Build the image from the Dockerfile, then start a container from it
const builtImage = await GenericContainer.fromDockerfile(__dirname, 'Dockerfile.test').build();
const container = await builtImage.start();
Multiple Containers
For microservices testing:
const { GenericContainer, Network } = require('testcontainers');

describe('Microservice Integration', () => {
  let pgContainer;
  let redisContainer;
  let network;

  beforeAll(async () => {
    network = await new Network().start();
    pgContainer = await new GenericContainer('postgres:13')
      .withNetwork(network)
      .start();
    redisContainer = await new GenericContainer('redis:6')
      .withNetwork(network)
      .start();
  });

  afterAll(async () => {
    await pgContainer.stop();
    await redisContainer.stop();
    await network.stop();
  });
});
Common Pitfalls and Solutions
Container Cleanup: Always use afterAll hooks properly
Resource Management: Monitor memory usage in CI environments
Network Issues: Handle connection retries for slow-starting services
Data Persistence: Be careful with volume mounts in test containers
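On the network-issues point: a container can report "started" before the service inside it accepts connections. A small retry helper smooths this over; this is a generic sketch (the names `waitFor`, `retries`, and `delayMs` are illustrative, not a testcontainers API):

```javascript
// Retries an async check until it resolves, with a fixed delay between
// attempts. Rethrows the last error once the retry budget is spent.
async function waitFor(check, { retries = 5, delayMs = 100 } = {}) {
  let lastError;
  for (let attempt = 0; attempt < retries; attempt++) {
    try {
      return await check();
    } catch (err) {
      lastError = err;
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError;
}

module.exports = waitFor;
```

In the PostgreSQL suite above, this could guard schema setup with something like `await waitFor(() => userRepository.pool.query('SELECT 1'))`.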
Integration with CI/CD
Here's an example GitHub Actions workflow:
name: Test
on: [push]
jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v2
      - name: Use Node.js
        uses: actions/setup-node@v2
        with:
          node-version: '14.x'
      - run: npm install
      - run: npm test
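One detail worth watching in CI: Jest runs test files in parallel by default, so each suite may spin up its own containers at the same time, which can exhaust memory on a small runner. If that bites, serializing the suites is a reasonable first fix; a sketch of the relevant `package.json` script (assuming Jest, as in the install step earlier):

```json
{
  "scripts": {
    "test": "jest --runInBand"
  }
}
```

The trade-off is slower wall-clock time in exchange for one container set alive at a time.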
Conclusion
Test containers provide a powerful way to manage test dependencies and ensure consistent testing environments. They're especially valuable for teams working on applications with complex infrastructure requirements.
Remember these key takeaways:
Use test containers for reliable, isolated testing environments
Always clean up containers after tests
Consider performance implications in CI/CD pipelines
Maintain good practices around configuration management
The next time you're setting up a test environment, consider if test containers could make your life easier. They might just be the testing solution you've been looking for!
Want to learn more about Docker and testing? Follow me for more articles on software development best practices!