A slow database can cripple your application. Here are proven strategies to optimize database performance.
Create indexes on frequently queried columns:
```sql
-- Single-column index
CREATE INDEX idx_users_email ON users(email);

-- Composite index (column order matters!)
CREATE INDEX idx_orders_user_date ON orders(user_id, created_at);

-- Partial index for specific conditions (PostgreSQL)
CREATE INDEX idx_active_users ON users(email)
WHERE status = 'active';
```
Write efficient queries:
```sql
-- ❌ Bad: N+1 problem
SELECT * FROM posts WHERE user_id = ?;
-- Then for each post:
SELECT * FROM comments WHERE post_id = ?;

-- ✅ Good: single query with a JOIN
SELECT p.*, c.*
FROM posts p
LEFT JOIN comments c ON p.id = c.post_id
WHERE p.user_id = ?;
```
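The same fix applies in application code: fetch all related rows in one round trip and group them in memory instead of querying per post. A minimal sketch with an in-memory stand-in for the database (the data and `fetchCommentsForPosts` helper are illustrative, not a real driver API):

```javascript
// Hypothetical in-memory stand-in for a comments table.
const commentsTable = [
  { id: 1, post_id: 1, body: 'First!' },
  { id: 2, post_id: 1, body: 'Nice post' },
  { id: 3, post_id: 2, body: 'Thanks' },
]

// One query for all posts' comments (WHERE post_id IN (...)),
// instead of one query per post.
function fetchCommentsForPosts(postIds) {
  return commentsTable.filter((c) => postIds.includes(c.post_id))
}

// Group the single result set by post_id in memory.
function groupByPost(comments) {
  const byPost = new Map()
  for (const c of comments) {
    if (!byPost.has(c.post_id)) byPost.set(c.post_id, [])
    byPost.get(c.post_id).push(c)
  }
  return byPost
}

const grouped = groupByPost(fetchCommentsForPosts([1, 2]))
console.log(grouped.get(1).length) // 2
```

This is the same trick libraries like DataLoader use: one batched query, then in-memory grouping.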
Analyze query execution plans:
```sql
EXPLAIN ANALYZE
SELECT u.name, COUNT(o.id) AS order_count
FROM users u
LEFT JOIN orders o ON u.id = o.user_id
WHERE u.created_at > '2025-01-01'
GROUP BY u.id;
```
Sometimes redundancy improves performance:
```sql
-- Instead of joining and counting on every read
ALTER TABLE posts ADD COLUMN comment_count INT DEFAULT 0;

-- Keep it current with a trigger (MySQL syntax shown;
-- PostgreSQL requires a separate trigger function)
CREATE TRIGGER update_comment_count
AFTER INSERT ON comments
FOR EACH ROW
UPDATE posts SET comment_count = comment_count + 1
WHERE id = NEW.post_id;
```
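When triggers aren't an option, the same counter can be maintained at write time in application code. A self-contained in-memory sketch of the idea (all names illustrative):

```javascript
// Illustrative sketch: keep comment_count on the post up to date
// at write time, instead of running COUNT(*) on every read.
const posts = new Map([[1, { id: 1, comment_count: 0 }]])
const comments = []

function addComment(postId, body) {
  comments.push({ post_id: postId, body })
  // Mirror of the trigger: increment the denormalized counter.
  posts.get(postId).comment_count += 1
}

addComment(1, 'First!')
addComment(1, 'Nice post')
console.log(posts.get(1).comment_count) // 2
```

The trade-off is the usual one for denormalization: faster reads in exchange for extra work (and a consistency risk) on every write.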
Reuse database connections:
```javascript
// Node.js with pg (node-postgres) pooling
const { Pool } = require('pg')

const pool = new Pool({
  max: 20,                      // maximum connections in the pool
  idleTimeoutMillis: 30000,     // close idle connections after 30s
  connectionTimeoutMillis: 2000 // fail fast if no connection is free
})

// Use the pool for queries; rows are on result.rows
const result = await pool.query('SELECT * FROM users')
```
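Under the hood, a pool hands out a small fixed set of reusable connections rather than opening a fresh one per query. A toy sketch of that idea (not the real pg-pool internals; a real pool queues callers instead of throwing when exhausted):

```javascript
// Toy resource pool: caps concurrent "connections" and reuses idle ones.
class TinyPool {
  constructor(createFn, max) {
    this.createFn = createFn
    this.max = max
    this.idle = []
    this.size = 0
  }
  acquire() {
    if (this.idle.length > 0) return this.idle.pop() // reuse an idle connection
    if (this.size < this.max) {
      this.size += 1
      return this.createFn() // open a new one, up to the cap
    }
    throw new Error('pool exhausted') // real pools queue the caller instead
  }
  release(conn) {
    this.idle.push(conn) // return to the idle list for reuse
  }
}

let opened = 0
const pool = new TinyPool(() => ({ id: ++opened }), 2)
const a = pool.acquire()
pool.release(a)
const b = pool.acquire() // reuses a: no second connection is opened
console.log(opened) // 1
```

The win is that connection setup (TCP handshake, auth) is paid once per pooled connection, not once per query.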
Implement multi-level caching:
```javascript
// Redis cache-aside example
async function getUser(id) {
  // Check the cache first
  const cached = await redis.get(`user:${id}`)
  if (cached) return JSON.parse(cached)

  // Cache miss: query the database
  const user = await db.query('SELECT * FROM users WHERE id = ?', [id])

  // Cache the result for 1 hour
  await redis.setex(`user:${id}`, 3600, JSON.stringify(user))
  return user
}
```
Process data in chunks:
```sql
-- One multi-row insert instead of individual inserts
INSERT INTO logs (message, created_at) VALUES
  ('Message 1', NOW()),
  ('Message 2', NOW()),
  ('Message 3', NOW());

-- Batch updates with CASE
UPDATE products
SET price = CASE
  WHEN id = 1 THEN 99.99
  WHEN id = 2 THEN 149.99
  WHEN id = 3 THEN 199.99
END
WHERE id IN (1, 2, 3);
```
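On the application side, batching usually means slicing a large array into fixed-size chunks and issuing one multi-row statement per chunk. A small sketch (the batch size of 500 is an arbitrary illustration; real limits depend on your database's max statement size and parameter count):

```javascript
// Split an array into chunks of at most `size` elements,
// so each chunk becomes one multi-row INSERT.
function chunk(rows, size) {
  const batches = []
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size))
  }
  return batches
}

const logs = Array.from({ length: 1200 }, (_, i) => `Message ${i + 1}`)
const batches = chunk(logs, 500)
console.log(batches.length) // 3 (500 + 500 + 200)
```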
Track slow queries:
```sql
-- PostgreSQL: log queries slower than 1s
ALTER SYSTEM SET log_min_duration_statement = 1000;
SELECT pg_reload_conf(); -- apply without a restart

-- MySQL: enable the slow query log
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;
```
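You can also track slow queries from the application side by timing each call and recording anything over a threshold. A sketch with a stubbed query function standing in for a real driver (all names illustrative):

```javascript
// Wrap a query function so calls slower than `thresholdMs` get recorded.
function withSlowLog(queryFn, thresholdMs, slowLog) {
  return async (sql, params) => {
    const start = Date.now()
    const result = await queryFn(sql, params)
    const elapsed = Date.now() - start
    if (elapsed >= thresholdMs) slowLog.push({ sql, elapsed })
    return result
  }
}

// Stubbed query that just waits, standing in for a real driver call.
const fakeQuery = (sql) =>
  new Promise((resolve) =>
    setTimeout(() => resolve({ sql }), sql.includes('slow') ? 30 : 0))

const slowLog = []
const query = withSlowLog(fakeQuery, 20, slowLog)

async function main() {
  await query('SELECT 1')          // fast: not logged
  await query('SELECT slow_stuff') // ~30ms: logged
  console.log(slowLog.length) // 1
}
const done = main()
```

This complements the database-side logs: the wrapper also captures time spent waiting for a pooled connection, which the server never sees.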
Database optimization is iterative: measure current performance, identify the bottleneck, apply one optimization, and measure again. Remember: premature optimization is the root of all evil, but ignoring performance is just as deadly.