

Database Optimization: Strategies for Better Performance


June 3, 2025
2 min read
Wesam Abousaid
English
Tags: database, performance, backend, optimization, sql

Database Optimization Strategies

A slow database can cripple your application. Here are proven strategies to optimize database performance.

1. Indexing Fundamentals

Create indexes on frequently queried columns:

-- Single column index
CREATE INDEX idx_users_email ON users(email);

-- Composite index (order matters!)
CREATE INDEX idx_orders_user_date ON orders(user_id, created_at);

-- Partial index for specific conditions
CREATE INDEX idx_active_users ON users(email) 
WHERE status = 'active';
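The effect of an index can be checked end to end. Here is a minimal, runnable sketch using Python's built-in sqlite3 module as a stand-in engine (the table and index names mirror the SQL above; the same idea applies to PostgreSQL or MySQL via their EXPLAIN output):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, email TEXT, status TEXT)")
conn.executemany(
    "INSERT INTO users (email, status) VALUES (?, ?)",
    [(f"user{i}@example.com", "active") for i in range(1000)],
)

# Before indexing: the planner falls back to a full table scan
plan_before = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = ?",
    ("user42@example.com",),
).fetchall()  # detail column reads e.g. "SCAN users"

conn.execute("CREATE INDEX idx_users_email ON users(email)")

# After indexing: the planner searches the index instead of scanning
plan_after = conn.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM users WHERE email = ?",
    ("user42@example.com",),
).fetchall()  # detail column mentions idx_users_email
```

SQLite also supports the partial-index form (`CREATE INDEX ... WHERE status = 'active'`) shown above.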

2. Query Optimization

Write efficient queries:

-- ❌ Bad: N+1 problem
SELECT * FROM posts;
-- Then for each post:
SELECT * FROM comments WHERE post_id = ?;

-- ✅ Good: single query with a JOIN
SELECT p.*, c.*
FROM posts p
LEFT JOIN comments c ON p.id = c.post_id;
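The cost of N+1 shows up directly in the number of round trips. A small sketch with SQLite (tables mirror the posts/comments schema above) counts queries issued each way:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE posts (id INTEGER PRIMARY KEY, title TEXT);
CREATE TABLE comments (id INTEGER PRIMARY KEY, post_id INTEGER, body TEXT);
INSERT INTO posts (title) VALUES ('a'), ('b'), ('c');
INSERT INTO comments (post_id, body) VALUES (1, 'x'), (1, 'y'), (3, 'z');
""")

# N+1: one query for the posts, then one more per post
queries = 0
posts = conn.execute("SELECT id, title FROM posts").fetchall()
queries += 1
for post_id, _ in posts:
    conn.execute("SELECT body FROM comments WHERE post_id = ?", (post_id,)).fetchall()
    queries += 1
print(queries)  # 4 (1 + one per post); grows linearly with the data

# JOIN: everything in a single round trip
rows = conn.execute("""
    SELECT p.id, p.title, c.body
    FROM posts p LEFT JOIN comments c ON p.id = c.post_id
""").fetchall()
print(len(rows))  # 4 rows: two for post 1, one each for posts 2 and 3
```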

3. Use EXPLAIN

Analyze query execution plans:

EXPLAIN ANALYZE
SELECT u.name, COUNT(o.id) as order_count
FROM users u
LEFT JOIN orders o ON u.id = o.user_id
WHERE u.created_at > '2025-01-01'
GROUP BY u.id;

4. Denormalization When Needed

Sometimes redundancy improves performance:

-- Instead of joining every time
ALTER TABLE posts ADD COLUMN comment_count INT DEFAULT 0;

-- Keep it current with a trigger (MySQL syntax shown;
-- PostgreSQL requires a trigger function instead)
CREATE TRIGGER update_comment_count
AFTER INSERT ON comments
FOR EACH ROW
UPDATE posts SET comment_count = comment_count + 1
WHERE id = NEW.post_id;
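The same counter-maintaining trigger can be exercised end to end; this sketch uses SQLite, whose trigger syntax wraps the body in BEGIN ... END rather than the MySQL form above:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE posts (id INTEGER PRIMARY KEY, comment_count INT DEFAULT 0);
CREATE TABLE comments (id INTEGER PRIMARY KEY, post_id INTEGER);

-- SQLite trigger body goes inside BEGIN ... END
CREATE TRIGGER update_comment_count
AFTER INSERT ON comments
BEGIN
    UPDATE posts SET comment_count = comment_count + 1
    WHERE id = NEW.post_id;
END;

INSERT INTO posts (id) VALUES (1);
INSERT INTO comments (post_id) VALUES (1), (1), (1);
""")

count = conn.execute("SELECT comment_count FROM posts WHERE id = 1").fetchone()[0]
print(count)  # 3: the denormalized counter tracks the inserts automatically
```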

5. Connection Pooling

Reuse database connections:

// Node.js with pg-pool
const { Pool } = require('pg')

const pool = new Pool({
  max: 20, // Maximum connections
  idleTimeoutMillis: 30000,
  connectionTimeoutMillis: 2000,
})

// Use pool for queries
const result = await pool.query('SELECT * FROM users')
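Under the hood, a pool is just a bounded set of open connections handed out and returned. This toy Python version (illustrative only; in production you would use a real pool like pg-pool above, or psycopg_pool/HikariCP) makes the mechanics visible:

```python
import queue
import sqlite3

class SimplePool:
    """Toy connection pool: hand out idle connections, take them back."""

    def __init__(self, size):
        self._idle = queue.Queue()
        for _ in range(size):  # open all connections up front
            self._idle.put(sqlite3.connect(":memory:", check_same_thread=False))

    def acquire(self, timeout=2.0):
        # Blocks (up to timeout) when every connection is busy,
        # mirroring connectionTimeoutMillis in the pg-pool config above
        return self._idle.get(timeout=timeout)

    def release(self, conn):
        self._idle.put(conn)

pool = SimplePool(size=2)
conn = pool.acquire()
result = conn.execute("SELECT 1").fetchone()
pool.release(conn)  # the connection is reused, not closed
print(result)  # (1,)
```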

6. Caching Strategies

Implement multi-level caching:

// Redis caching example
async function getUser(id) {
  // Check cache first
  const cached = await redis.get(`user:${id}`)
  if (cached) return JSON.parse(cached)
  
  // Query database
  const user = await db.query('SELECT * FROM users WHERE id = ?', [id])
  
  // Cache for 1 hour
  await redis.setex(`user:${id}`, 3600, JSON.stringify(user))
  return user
}
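The same cache-aside pattern works with any store. A runnable sketch substituting an in-process dict for Redis (the `load` callback and `db_calls` counter are illustrative names, not part of any library):

```python
import time

cache = {}  # key -> (expires_at, value); stand-in for Redis

def cache_aside(key, ttl, load):
    """Return the cached value if still fresh; otherwise load and cache it."""
    entry = cache.get(key)
    if entry and entry[0] > time.monotonic():
        return entry[1], True           # cache hit
    value = load()                      # cache miss: go to the database
    cache[key] = (time.monotonic() + ttl, value)
    return value, False

db_calls = 0
def load_user():
    global db_calls
    db_calls += 1                       # simulate a database round trip
    return {"id": 1, "name": "Ada"}

user, hit1 = cache_aside("user:1", ttl=3600, load=load_user)
print(hit1, db_calls)  # False 1 -- first call misses and queries the database
user, hit2 = cache_aside("user:1", ttl=3600, load=load_user)
print(hit2, db_calls)  # True 1 -- second call is served from the cache
```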

7. Batch Operations

Process data in chunks:

-- Instead of individual inserts
INSERT INTO logs (message, created_at) VALUES
  ('Message 1', NOW()),
  ('Message 2', NOW()),
  ('Message 3', NOW());

-- For updates
UPDATE products
SET price = CASE
  WHEN id = 1 THEN 99.99
  WHEN id = 2 THEN 149.99
  WHEN id = 3 THEN 199.99
END
WHERE id IN (1, 2, 3);
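For large datasets, the multi-row insert above is usually wrapped in a chunking loop so no single statement grows unbounded. A sketch with sqlite3's executemany (the helper name and chunk size are illustrative):

```python
import sqlite3

def insert_in_chunks(conn, messages, chunk_size=500):
    """Insert rows in fixed-size batches instead of one statement per row."""
    for start in range(0, len(messages), chunk_size):
        chunk = messages[start:start + chunk_size]
        conn.executemany(
            "INSERT INTO logs (message, created_at) VALUES (?, datetime('now'))",
            [(m,) for m in chunk],
        )
        conn.commit()  # one commit per batch, not per row

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE logs (id INTEGER PRIMARY KEY, message TEXT, created_at TEXT)")
insert_in_chunks(conn, [f"Message {i}" for i in range(1234)], chunk_size=500)

total = conn.execute("SELECT COUNT(*) FROM logs").fetchone()[0]
print(total)  # 1234 rows inserted in three batches (500 + 500 + 234)
```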

8. Monitor and Analyze

Track slow queries:

-- PostgreSQL: log queries slower than 1s
ALTER SYSTEM SET log_min_duration_statement = 1000;
SELECT pg_reload_conf(); -- apply the new setting

-- MySQL: Check slow query log
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;
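Application-side timing complements the server logs. A hypothetical `timed_query` wrapper (sketch only; the threshold is deliberately low for illustration) records anything over the limit:

```python
import sqlite3
import time

SLOW_MS = 1000.0  # matches the 1s threshold configured above
slow_log = []

def timed_query(conn, sql, params=()):
    """Run a query and record it if it exceeds the slow-query threshold."""
    start = time.perf_counter()
    rows = conn.execute(sql, params).fetchall()
    elapsed_ms = (time.perf_counter() - start) * 1000
    if elapsed_ms > SLOW_MS:
        slow_log.append((sql, round(elapsed_ms, 2)))  # candidate for optimization
    return rows

conn = sqlite3.connect(":memory:")
rows = timed_query(conn, "SELECT 1")
```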

Conclusion

Database optimization is iterative. Start by measuring current performance, identify bottlenecks, apply optimizations, and measure again. Remember: premature optimization may be the root of all evil, but ignoring performance altogether can be just as costly.
