RateLimit in Nginx with Redis

Gokul K
4 min read · Aug 14, 2023


Using central caching, especially with Redis, in conjunction with NGINX offers several benefits that enhance the performance, scalability, and resilience of your web applications. Here are some reasons why you should consider using central caching with NGINX and Redis:

  1. Faster Response Times: Caching frequently requested content at the edge (closest to the user) reduces the load on your backend servers. Users receive content straight from the cache rather than from the origin, resulting in faster response times and a smoother user experience.
  2. Reduced Server Load: Caching static or semi-static content like images, CSS files, and JavaScript resources allows your backend servers to handle fewer requests, freeing up resources to handle more dynamic and personalized requests.
  3. Scalability: Central caching helps you scale your infrastructure efficiently. By serving cached content from Redis, you can handle a larger number of users without adding extra load to your backend servers. This is particularly valuable during traffic spikes.
  4. Less Database Load: Caching database queries or results reduces the load on your database server. This is particularly beneficial for read-heavy workloads, as you can serve cached data without hitting the database unnecessarily.
  5. Distributed Caching: Redis supports distributed caching, which means you can have multiple Redis instances spread across different servers or even different data centers. This enhances redundancy and resilience, reducing the risk of a single point of failure.
  6. Customization and Personalization: Redis allows you to implement more advanced caching strategies, such as user-specific caching or caching based on user roles. This enables you to deliver personalized content efficiently without putting additional strain on your backend.
  7. Rate Limiting and Request Control: Redis can be used for rate limiting and request control, the focus of this article, preventing abuse or excessive requests from specific clients or IP addresses.
  8. Expiration and Invalidation: Redis provides mechanisms to set expiration times for cached items. This ensures that outdated or irrelevant content is automatically purged from the cache, keeping the cache fresh and up-to-date (see the short sketch after this list).
  9. Session Management: Redis can also be used to store and manage user sessions, enhancing your application’s session management capabilities.
  10. Microservices and API Caching: In a microservices architecture, Redis can be used to cache API responses, reducing the load on downstream services and improving overall system performance.
  11. Content Distribution: You can use Redis to store content generated by background jobs or worker processes, making it accessible to all your servers and reducing the need for repeated processing.
  12. Geo-Replication and Disaster Recovery: Redis supports geo-replication, allowing you to replicate cached data across different geographical locations. This provides a disaster recovery solution and improves availability.
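
To make item 8 concrete, here is a minimal sketch of cache-with-expiry using the lua-resty-redis client, the same library the rate-limiting example later in this article relies on. The key name, the placeholder host, and the 60-second TTL are illustrative assumptions, not part of any particular setup:

-- Cache a rendered page in Redis with an automatic expiry.
local redis = require "resty.redis"
local red = redis:new()
red:set_timeout(1000) -- 1 second

local ok, err = red:connect("redis_server_ip", 6379) -- placeholder host, default port
if not ok then
    ngx.log(ngx.ERR, "failed to connect to Redis: ", err)
    return
end

local rendered_html = "<html>...</html>" -- hypothetical page body
-- SET with EX stores the value and schedules automatic eviction,
-- so stale entries purge themselves without manual invalidation.
local ok, err = red:set("cache:homepage", rendered_html, "EX", 60)
if not ok then
    ngx.log(ngx.ERR, "failed to cache page: ", err)
end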

In summary, central caching with Redis and NGINX offers numerous advantages, including faster response times, reduced server load, enhanced scalability, and improved resilience. It’s a versatile solution that can improve the overall performance and user experience of your web applications.

Implementing rate limiting with NGINX and Redis caching is a powerful way to control and manage incoming requests to your web server. This approach allows you to prevent abuse or overload of your server by limiting the number of requests from a specific client or IP address within a certain time frame. Here’s a general outline of how you can set up rate limiting using NGINX and Redis caching:

Install NGINX and Redis: Ensure that you have NGINX and Redis installed on your server. For the Lua-based example below you also need the lua-nginx-module and the lua-resty-redis client library, neither of which ships with stock NGINX. The simplest route is to install OpenResty, an NGINX distribution that bundles both; alternatively, compile NGINX yourself and add the Lua module with the --add-module build option.

Configure Redis: Configure Redis to act as the shared store for your rate-limiting counters. Settings such as the address Redis binds to, the port it listens on, and its memory limits live in the Redis configuration file (redis.conf).
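
For illustration, here are a few redis.conf directives relevant to this use case; the values are examples to adapt, not recommendations:

# redis.conf (illustrative excerpt)
bind 10.0.0.5                  # example address that NGINX will connect to
port 6379                      # default Redis port
maxmemory 256mb                # cap memory used by counters and cached data
maxmemory-policy allkeys-lru   # evict least-recently-used keys when full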

NGINX Configuration: In your NGINX configuration file (usually located at /etc/nginx/nginx.conf or /etc/nginx/sites-available/default), define the limit_req_zone and limit_req directives to enable rate limiting:

http {
    # ...
    limit_req_zone $binary_remote_addr zone=ratelimit:10m rate=10r/s;

    server {
        # ...

        location / {
            limit_req zone=ratelimit burst=20 nodelay;
            # Your proxy_pass and other settings
        }
    }
}

In this example, you’re defining a 10-megabyte shared-memory zone named ratelimit, keyed on the client address, with a limit of 10 requests per second. One megabyte can track roughly 16,000 client states, so 10 MB covers about 160,000 addresses. The burst=20 nodelay pair lets short spikes of up to 20 extra requests through immediately while rejecting anything beyond that. Adjust these values according to your needs.
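
By default, NGINX rejects rate-limited requests with a 503 status. Since the Redis-backed limiter below answers with 429, you may want the two to match; the standard limit_req_status directive does this:

location / {
    limit_req zone=ratelimit burst=20 nodelay;
    limit_req_status 429;   # respond with 429 instead of the default 503
}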

Implement Redis Caching: You can use the lua-nginx-module (bundled with OpenResty) together with the lua-resty-redis client to keep the rate-limiting counters in Redis. Here's how you might integrate that with the limit_req setup using an access_by_lua_block:

http {
    # ...

    server {
        # ...

        location / {
            limit_req zone=ratelimit burst=20 nodelay;
            access_by_lua_block {
                -- Connect to Redis through the lua-resty-redis client
                local redis = require "resty.redis"
                local red = redis:new()

                red:set_timeout(1000) -- 1 second

                -- Replace host/port with your Redis server's address
                local ok, err = red:connect("redis_server_ip", 6379)
                if not ok then
                    ngx.log(ngx.ERR, "failed to connect to Redis: ", err)
                    return ngx.exit(ngx.HTTP_INTERNAL_SERVER_ERROR)
                end

                -- One counter per client IP
                local key = "ratelimit:" .. ngx.var.binary_remote_addr
                local count, err = red:incr(key)
                if not count then
                    ngx.log(ngx.ERR, "failed to increment counter: ", err)
                    return ngx.exit(ngx.HTTP_INTERNAL_SERVER_ERROR)
                end

                -- First request in a window: start a 1-second expiry,
                -- so the counter resets instead of blocking the IP forever
                if count == 1 then
                    red:expire(key, 1)
                end

                -- Return the connection to the pool before deciding
                local ok, err = red:set_keepalive(10000, 100)
                if not ok then
                    ngx.log(ngx.ERR, "failed to set keepalive: ", err)
                end

                if count > 10 then
                    return ngx.exit(ngx.HTTP_TOO_MANY_REQUESTS)
                end
            }
            # Your proxy_pass and other settings
        }
    }
}

In this example, Lua scripting is used to interact with Redis. The script increments a per-IP request counter and sets a one-second expiry on the first hit, so each window resets automatically instead of blocking a client forever. If the count exceeds the limit (10 requests per second here), the request is rejected with an HTTP 429 “Too Many Requests” response.
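
One caveat: INCR followed by EXPIRE is two separate commands, so a failure between them could leave a counter that never expires. If that matters, the pair can be made atomic with a server-side Redis script via EVAL. A minimal sketch, assuming the same connected red object as above (the key prefix and one-second window are illustrative):

-- Atomically increment the counter and set its expiry in one round trip.
local script = [[
    local current = redis.call("INCR", KEYS[1])
    if current == 1 then
        redis.call("EXPIRE", KEYS[1], ARGV[1])
    end
    return current
]]
local count, err = red:eval(script, 1, "ratelimit:" .. ngx.var.binary_remote_addr, 1)
if not count then
    ngx.log(ngx.ERR, "failed to run rate-limit script: ", err)
    return ngx.exit(ngx.HTTP_INTERNAL_SERVER_ERROR)
end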

Testing and Adjustments: Test your configuration thoroughly to ensure that rate limiting is working as expected. Adjust the rate limit, burst, and other parameters according to your application’s needs.

Remember to monitor your NGINX server and Redis instance to ensure everything is working smoothly. This setup can significantly enhance the security and performance of your web application by preventing abuse and overload.
