One of the common challenges in Shopify app development is handling API rate limits. When developing an app locally, you might not notice this issue at first, mostly because the test dataset is small and you don’t have many products and orders in your test store. However, as soon as you deploy the app to production and real merchants start using it, you’ll eventually start noticing 429 Too Many Requests errors in your error monitoring software.
This has been my experience with Shopify app development as well. When I was just starting, I used naive and inefficient approaches to deal with rate limiting. But as my apps required making a lot of API calls to fetch and update products and metafields, it quickly became a significant issue for me. I had to adapt my approach and look for more efficient ways to handle API rate limiting.
NOTE: The GraphQL examples below use the Shopify GraphQL gem. You can read more about it in my article Less painful way to work with Shopify GraphQL API in Ruby.
Strategy #1: Simple retry
The simplest approach is to make API calls until you reach the limit, then wait for a few seconds and retry. For example:
shop.with_shopify_session do
  order = ShopifyAPI::Order.find(id: "...")
  # do something with order
rescue ShopifyAPI::Errors::HttpResponseError => error
  if error.code == 429
    sleep 2
    retry
  else
    raise error
  end
end
This is probably the most popular way of dealing with rate limits, and it’s absolutely fine to start development with this method. However, it fails hard when you begin operating at scale.
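One small refinement: Shopify’s 429 responses include a Retry-After header telling you how long to wait, so you can sleep for exactly that duration instead of a fixed two seconds. A sketch, assuming headers are exposed as a hash of lowercased names (as in the shopify_api gem); retry_after_seconds is a hypothetical helper:

```ruby
# Hypothetical helper: read the Retry-After value (in seconds) from the
# headers of a 429 response, falling back to a default when it's missing.
def retry_after_seconds(headers, default: 2.0)
  raw = headers["retry-after"]
  raw = raw.first if raw.is_a?(Array) # some clients wrap header values in arrays
  raw ? Float(raw) : default
end

retry_after_seconds({ "retry-after" => ["4.0"] }) # => 4.0
retry_after_seconds({})                           # => 2.0
```

In the rescue branch above, `sleep retry_after_seconds(error.response.headers)` would then replace the fixed `sleep 2`.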
Strategy #2: Background jobs backoff
When you need to make a lot of parallel API calls, the simple retry strategy stops working. Here are some real-life cases from my experience:
- Updating prices for 300k products
- Adding metafields to 30k products
- Processing product/order webhooks that require API calls
In any of these cases, you’ll need to introduce background jobs to make processing efficient and fast. The simple retry strategy won’t work well anymore.
Imagine you have 30 background workers making Shopify API calls in parallel, quickly using all available API call credits. Then, they all hit the rate limit and pause to sleep for a few seconds without the ability to process any other jobs. That doesn’t seem efficient or scalable. While your background workers are sleeping, other jobs are piling up in the queue, and your app is not processing them.
To overcome this, we can use backoff to pause all API calls for a specific shop when its API quota is exhausted. This will allow background workers to process other stores’ jobs until API credits refill.
First, we need to add an api_backoff_until field to the Shop model. We’ll use that field to store the time when workers can continue making API calls:
# db migration
add_column :shops, :api_backoff_until, :datetime
Next, we need to add an interface to the Shop model for dealing with backoff. We have to use separate methods for the REST API and GraphQL since they treat rate limits differently. Both #check_rest_rate_limit and #check_graphql_rate_limit methods are called after making requests to the API since we receive rate limit information in the API response.
# app/models/concerns/api_backoff.rb
module ApiBackoff
  extend ActiveSupport::Concern

  REST_CREDITS_THRESHOLD = 10
  # The bucket size is 40 and 400 for Shopify Plus.
  # The leak rate is 2/second and 20/second for Shopify Plus.
  # So 20 seconds should be enough to fully restore bucket
  # size for both standard and Shopify Plus stores.
  REST_WAIT_TIME = 20.seconds

  GRAPHQL_CREDITS_THRESHOLD = 100

  # Used to check if API requests should be paused
  def api_backoff?
    return false unless api_backoff_until
    api_backoff_until > Time.current
  end

  def check_rest_rate_limit
    if ShopifyAPI.credit_left < REST_CREDITS_THRESHOLD
      update!(api_backoff_until: REST_WAIT_TIME.from_now)
    end
  end

  def check_graphql_rate_limit(response)
    return unless response.points_maxed?(threshold: GRAPHQL_CREDITS_THRESHOLD)

    backoff_seconds = (response.points_limit - GRAPHQL_CREDITS_THRESHOLD) / response.points_restore_rate
    update!(api_backoff_until: backoff_seconds.seconds.from_now)
  end
end
# app/models/shop.rb
class Shop < ApplicationRecord
  include ApiBackoff
  # ...
end
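To sanity-check the backoff formula: on a standard plan, the GraphQL bucket holds 2,000 points and restores at 100 points/second, so backing off from the 100-point threshold to a full bucket takes (2000 - 100) / 100 = 19 seconds. A standalone sketch of the same calculation (plan limits as documented by Shopify):

```ruby
# Standalone version of the backoff calculation in #check_graphql_rate_limit.
def backoff_seconds(points_limit:, restore_rate:, threshold: 100)
  (points_limit - threshold) / restore_rate
end

backoff_seconds(points_limit: 2_000, restore_rate: 100)    # standard plan: 19 seconds
backoff_seconds(points_limit: 20_000, restore_rate: 1_000) # Shopify Plus: 19 seconds
```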
Now we can use it in background jobs to pause API calls. When Shop#api_backoff? is active, we want to reschedule the job and let it wait until the end of the backoff period.
# app/jobs/rest_api_job.rb
class RestApiJob < ApplicationJob
  def perform(shop)
    if shop.api_backoff?
      self.class.set(wait_until: shop.api_backoff_until).perform_later(*arguments)
    else
      shop.with_shopify_session do
        order = ShopifyAPI::Order.find(id: "...")
        # do something with order
      end
      shop.check_rest_rate_limit
    end
  end
end
# app/jobs/graphql_api_job.rb
class GraphqlApiJob < ApplicationJob
  def perform(shop)
    if shop.api_backoff?
      self.class.set(wait_until: shop.api_backoff_until).perform_later(*arguments)
    else
      shop.with_shopify_session do
        response = GetProduct.call(id: "...")
        product = response.data
        # do something with product
        shop.check_graphql_rate_limit(response)
      end
    end
  end
end
This approach has been serving me well over the past few years, and I continue using it in my Platmart Price Editor app. However, it has some inefficiencies too:
- We’re not taking the GraphQL query cost into consideration before making an API request. The only safeguard we have is the GRAPHQL_CREDITS_THRESHOLD constant.
- This setup relies on the database to check if it’s safe to make the call. When working with many parallel workers, by the time we fetch the Shop record from the database, the value could be outdated, and we might still face rate limit errors.
Strategy #3: Sidekiq Enterprise rate limiters
There’s another way to deal with rate limits that doesn’t have the downsides of the backoff strategy. Sidekiq Enterprise offers a rate limiting feature, which is very handy for Shopify apps. It supports both “leaky bucket” and “points-based” rate limiters, so we can use it for both Shopify REST and GraphQL APIs.
Instead of relying on API responses, Sidekiq limiters track the currently available API quota and store it in Redis. This allows background jobs to check if there are enough points before making an API request, thus preventing rate limit errors.
To use it with the Shopify REST API, we need to define a leaky limiter first. For this type of limiter, we’ll need to provide the bucket size and the time required to refill the bucket. These parameters will differ depending on the Shopify plan. We also need to set wait_timeout: 0 to ensure that jobs won’t sleep while waiting for the bucket to refill and will reschedule themselves for a later time instead.
class Shop < ApplicationRecord
  # ...

  def rest_rate_limiter
    shop_key = "shopify-#{id}-rest"
    bucket_size = (shopify_plan_name == "shopify_plus" ? 400 : 40)
    leak_rate =
      case shopify_plan_name
      when "shopify_plus" then 20
      when "advanced" then 4
      else 2
      end
    time_to_refill = bucket_size / leak_rate

    Sidekiq::Limiter.leaky(
      shop_key,
      bucket_size,
      time_to_refill,
      ttl: 1.day,
      wait_timeout: 0
    )
  end
end
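To double-check the limiter parameters: a standard store’s bucket of 40 credits drains at 2/second (20 seconds to refill), an Advanced store’s at 4/second (10 seconds), and a Plus store’s 400-credit bucket at 20/second (20 seconds). A standalone version of the same refill math, using the plan values from the model above:

```ruby
# Standalone version of the bucket/refill calculation in #rest_rate_limiter.
def rest_refill_seconds(plan_name)
  bucket_size = (plan_name == "shopify_plus" ? 400 : 40)
  leak_rate =
    case plan_name
    when "shopify_plus" then 20
    when "advanced" then 4
    else 2
    end
  bucket_size / leak_rate
end

rest_refill_seconds("basic")        # => 20
rest_refill_seconds("advanced")     # => 10
rest_refill_seconds("shopify_plus") # => 20
```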
Now we can use this limiter to make calls to the Shopify REST API. If Sidekiq hits the limit defined for this shop, it will raise a Sidekiq::Limiter::OverLimit error and will schedule a retry for this job automatically.
# app/jobs/rest_api_job.rb
class RestApiJob < ApplicationJob
  def perform(shop)
    shop.rest_rate_limiter.within_limit do
      shop.with_shopify_session do
        order = ShopifyAPI::Order.find(id: "...")
        # do something with order
      end
    end
  end
end
For the Shopify GraphQL API, we need to use a “points-based” rate limiter. To define it, we need to provide the bucket size and the time required to fully refill the bucket. The bucket size depends on the Shopify plan and is much higher for Shopify Plus merchants.
class Shop < ApplicationRecord
  # ...

  def graphql_rate_limiter
    shop_key = "shopify-points-#{id}-graphql"
    bucket_size =
      case shopify_plan_name
      when "shopify_plus" then 20_000
      when "advanced" then 4_000
      else 2_000
      end
    time_to_refill = 20 # seconds

    Sidekiq::Limiter.points(shop_key, bucket_size, time_to_refill)
  end
end
To use this limiter, we need to provide an estimated number of points required for each query or mutation. This estimate will be used to check if we have enough points in the bucket before making an API request. Since the estimate might not be accurate, after making the request, we need to pass the actual query cost to the limiter so it can update the bucket accordingly.
# app/jobs/graphql_api_job.rb
class GraphqlApiJob < ApplicationJob
  def perform(shop)
    shop.graphql_rate_limiter.within_limit(estimate: 10) do |limiter|
      shop.with_shopify_session do
        response = GetProduct.call(id: "...")
        product = response.data
        # do something with product
        limiter.points_used(response.query_cost)
      end
    end
  end
end
I’m using this approach in my Platmart Swatches app. It works great, and Sidekiq handles all the complexity related to calculating the current bucket size very well. The only real downside is that the rate limiting feature requires a Sidekiq Enterprise license, which is pretty expensive for small apps ($229/mo for 100 background threads).
Strategy #4: Shopify API bulk operations
The last strategy I’d like to mention is avoiding the rate limiting issue altogether. Shopify offers bulk operations that can be used to both read and mutate data in the API. Using bulk operations, you don’t have to worry about pagination and rate limits at all. Basically, you’re delegating the heavy work of loading all data or making updates to Shopify servers. Your job is to initiate these operations and handle the results once they’re done.
The workflow is as follows:
- Creating a bulk operation with the query that you want to run.
- Waiting for the operation to complete by either polling its status or subscribing to a webhook.
- Once done, downloading the JSONL file with results and processing it.
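For step 3, the result is a JSONL file: one JSON object per line, where nested records (e.g. variants under a product) are not nested at all but reference their parent via a __parentId field. A minimal processing sketch (the sample GIDs in the test are illustrative):

```ruby
require "json"

# Parse a Shopify bulk operation JSONL result.
# Returns top-level nodes plus a hash mapping parent GID => child nodes,
# reassembled via the "__parentId" field on each child line.
def parse_bulk_result(jsonl)
  nodes = []
  children = Hash.new { |hash, key| hash[key] = [] }
  jsonl.each_line do |line|
    node = JSON.parse(line)
    if (parent_id = node["__parentId"])
      children[parent_id] << node
    else
      nodes << node
    end
  end
  [nodes, children]
end
```

In production you’d stream the downloaded file line by line instead of loading it into memory, since bulk results can be very large.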
For mutations, it’s more complex. To create a bulk operation, you have to upload a file with mutation parameters to Shopify servers first.
I won’t be sharing a full example of how to use bulk operations here, as it’s a pretty complex topic that deserves a separate article. You can learn more about bulk operations in the official documentation.
Bulk operations can be a lifesaver when you need to load a lot of data from Shopify (e.g. all orders or all products) or when you need to import large chunks of data quickly. We started using bulk operations in our new app, Platmart Size Charts, for loading product data and setting metafields. It was hard to set up, but it works great now.
Main downsides of bulk operations:
- They are pretty complex. It’s difficult to set them up (especially mutations), and you have to design your app in a particular way, with async data loading and real-time status updates.
- You can run only one bulk operation of each type at a time. This is not a problem for our apps but might be a limitation for some apps at a higher scale.
- Not all GraphQL mutations are supported, which might be a problem if you’re working with orders, for example.
Summary
Out of all the described options, I prefer Sidekiq Enterprise rate limiters and bulk operations. Sidekiq limiters are really easy to deal with. It’s a set-and-forget solution that just works. All we need to do is wrap our API calls with a limiter block. However, the cost is quite high ($229/mo for a Sidekiq Enterprise license). If Shopify could offer a similar limiter as part of the official libraries (for both Ruby and JS), it would be very beneficial for many app developers.
The bulk operations API is complex, but once you learn it and have boilerplate code in place, it works pretty well. It’s crucial for loading large amounts of data or making bulk edits. I hope Shopify will keep improving it and eventually add support for more bulk mutations. Since it’s a difficult API to get started with, it would be great to see more guides on how to use it, with real use case examples and the boilerplate code required to use it in production.
How do you handle rate limits in Shopify apps? Let’s discuss it on X/Twitter.