Dealing with API Rate Limiting


Every time you work with an API you need to consider rate limiting to keep your app performant


5 min read


I have been building and consuming APIs, and along the way I kept running into rate-limit issues, so I decided to build an app that consumes the GitHub REST API without authentication.

💡
The GitHub REST API is limited to 60 requests per hour for unauthenticated requests.

I wanted to sharpen my skills at dealing with rate limiting, so I pushed myself to create an app that gets you the top 3 languages used across your last 50 created/updated repositories.

Project Link:
GitHub Top - get the most used 3 languages from your repos

What might surprise you is that to do this, you have to fetch all the repositories
and then fetch each repo's languages, which adds up to 51 requests if you have 50 or more repos.

So how can we use less of the request quota, so we can serve more users instead of waiting an hour for the quota to refill? That is what this article will cover.

This article covers the topic from two perspectives, so feel free to read only the one that applies to you.

Consuming API

As a front-end developer, this section is the important one for you.

So let's see how we make a request in JS. I used Axios, but the same applies to fetch:

const res = await axios.get(
`https://api.github.com/users/${user}/repos?sort=created&per_page=50&type=owner`);
const data = res.data;
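Before firing more requests, you can also check how much quota is left: GitHub reports it in the `X-RateLimit-*` response headers (with Axios these appear lowercased in `res.headers`). A small sketch of reading them — the helper name here is my own, not part of any library:

```typescript
// GitHub exposes quota state in response headers such as
// x-ratelimit-remaining (requests left this window) and
// x-ratelimit-reset (epoch seconds when the quota refills).
function remainingQuota(headers: Record<string, string | undefined>): number {
  // If the header is missing, assume no quota information and treat it as 0.
  return Number(headers["x-ratelimit-remaining"] ?? 0);
}

// e.g. if (remainingQuota(res.headers) < 2) { /* serve from cache instead */ }
```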

This returns the repositories of a specific user. That is a single request, no worries there. But then we want to loop over each repo, take its languages URL, and request it to get the data, which is an object of key-value pairs of the form {lang: size}.

So let's start with the first approach that comes to mind:

const repos_lang = ['https://api.github.com/repos/Mahmoudgalalz/ghtop/languages',...]
const data: TLang[] = [];
repos_lang.forEach((lang_url) => {
    axios
        .get(lang_url)
        .then((response) => {
            const langs: object = response.data;
            Object.entries(langs).forEach(([key, value]: [string, number]) => {
                data.push({ name: key, fileSize: value });
            });
        })
        .catch((error) => {
            console.log(error);
        });
});

Each iteration requests a repo's languages URL, which returns an object of key-value data,
and our array holds 50 repos.

This means we will create 50 individual requests and we might hit the limits.
So how can we run them together asynchronously? This is where Promise.all() comes in.

Promise.all() takes an array of promises and resolves them together; every request we create is a promise that needs to be resolved.

const repos_lang = ['https://api.github.com/repos/Mahmoudgalalz/ghtop/languages',...]
const langsPromises = repos_lang.map(lang=> {
    return axios.get(lang)
})
await Promise.all(langsPromises);

This code does the job and avoids a lot of handshakes with the API: thanks to connection keep-alive, it reuses the connection that already completed the handshake instead of opening a new one per request, and doesn't consume more quota. Surprisingly, in my testing it consumed fewer than 10 quota units to fetch 50 repos' languages.

But the downside of this code is that Promise.all() rejects as soon as any single request fails,
which can be handled by attaching a catch to each request:

const repos_lang = ['https://api.github.com/repos/Mahmoudgalalz/ghtop/languages',...]
const langsPromises = repos_lang.map(lang=> {
    return axios.get(lang)
})
await Promise.all(langsPromises.map(p =>
    p.catch(err => {
        console.log(err)
        return null // a failed request resolves to null instead of rejecting the batch
    })
));

But you can avoid this boilerplate and use Promise.allSettled(), which does the same thing as the code above without any extra code:

const repos_lang = ['https://api.github.com/repos/Mahmoudgalalz/ghtop/languages',...]
const langsPromises = repos_lang.map(lang=> {
    return axios.get(lang)
})
await Promise.allSettled(langsPromises);
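Promise.allSettled() resolves to an array of `{ status, value }` / `{ status, reason }` objects, so you still need a small step to pull out the successful responses. A sketch of that unwrapping (generic, not tied to Axios):

```typescript
// Await every promise, log failures, and keep only the fulfilled values.
async function settleAll<T>(promises: Promise<T>[]): Promise<T[]> {
  const results = await Promise.allSettled(promises);
  const fulfilled: T[] = [];
  for (const r of results) {
    if (r.status === "fulfilled") {
      fulfilled.push(r.value);
    } else {
      console.log(r.reason); // one failed request no longer loses the whole batch
    }
  }
  return fulfilled;
}
```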

That covers how to make the requests asynchronously so they complete extremely quickly and use less quota.

We can also add a rate limiter on the front end, since we want to avoid sending useless requests to the server; the Bottleneck library can do this for us:

import Bottleneck from 'bottleneck';

const limiter = new Bottleneck({
    minTime: 60, // minimum time between requests, in ms
    maxConcurrent: 52, // maximum number of concurrent requests
});

function scheduleRequest(endpoint) {
  return limiter.schedule(() => {
    return axios.get(endpoint);
  });
}
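If you'd rather not pull in a dependency, the core of what Bottleneck's `minTime` does can be sketched in a few lines: serialize start times so at least `minTime` ms pass between scheduled calls. This is a simplified sketch of the idea (no `maxConcurrent` handling), not Bottleneck's actual implementation:

```typescript
// Returns a schedule() function that spaces out task start times by minTime ms.
function createLimiter(minTime: number) {
  let nextSlot = 0; // timestamp reserved for the next task to start at
  return async function schedule<T>(task: () => Promise<T>): Promise<T> {
    const now = Date.now();
    const startAt = Math.max(now, nextSlot);
    nextSlot = startAt + minTime; // reserve the slot after this one
    await new Promise((resolve) => setTimeout(resolve, startAt - now));
    return task();
  };
}
```

Usage mirrors the Bottleneck version: `const schedule = createLimiter(60);` then `schedule(() => axios.get(endpoint))`.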

Caching

💡
What if you are building a social media app with feeds? Should every time the user opens the feed trigger a database query to fetch the feeds all over again?

That case is quite different, but some feed items don't change and remain static, so you don't have to fetch them from the database again. Doing so adds usage cost, and if your database returns the entire feeds table, the result set can be enormous, which leads to a less performant app.

There are a lot of ways to deal with this kind of system, but let's stick to our case:

we need to make more than one request within the same hour while we still have quota, and at the same time we need access to the results of recent requests we have already made.

My solution is to cache this data; whichever approach you pick, the point is to have access to the data without making another request to the server.

This depends on many factors: does this data change over time?
Are there static parts that will never change?

I use localStorage to store the data I got from the server, after extracting what I need from it. Each cached request is deleted once it passes its expiry of 2 days, since staleness makes no big difference in our case; no one writes 1k LOC in 2 days 😂

async function fetchData(user: string): Promise<TTop[]> {
    if (!cache(user)) {
      const res = await axios.get(`https://api.github.com/users/${user}/repos?sort=created&per_page=50&type=owner`);
      const data = res.data;
      const filtered = await filterData(data);
      cache(user, Array.from(Object.entries(filtered)));
      return filtered;
    } else {
      //@ts-ignore
      return cache(user)
    }
 }

function cache(handle: string, data?: object) {
    if (isCached(handle)) {
        if (data) {
            // write path: re-cache only when the stored entry has expired
            if (expired(handle)) {
                return cacheData(handle, data);
            }
        } else {
            // read path: return the stored entry
            const stored = localStorage.getItem(handle) || '{}';
            return getCachedData(stored);
        }
    } else if (data) {
        // first write for this handle
        return cacheData(handle, data);
    }
}
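The cache() function above leans on helpers (cacheData, getCachedData, expired, isCached) that aren't shown in the article. A minimal sketch of the TTL part, with the storage behind an interface so it isn't tied to the browser — all names and signatures here are my own assumptions, not the actual project code:

```typescript
// Any localStorage-like key-value store.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
  removeItem(key: string): void;
}

const TTL_MS = 2 * 24 * 60 * 60 * 1000; // 2 days, matching the expiry above

// Store a value together with its expiry timestamp.
function cacheData(store: KVStore, key: string, value: unknown): void {
  store.setItem(key, JSON.stringify({ value, expiry: Date.now() + TTL_MS }));
}

// Read a value back; evict it and return null if it expired (or was never set).
function getCachedData(store: KVStore, key: string): unknown {
  const raw = store.getItem(key);
  if (!raw) return null;
  const record = JSON.parse(raw) as { value: unknown; expiry: number };
  if (Date.now() > record.expiry) {
    store.removeItem(key);
    return null;
  }
  return record.value;
}
```

In the browser you would pass `window.localStorage` as the store.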

This code checks whether we already have a cached copy before spending quota: if the user's data was cached recently, we serve it from localStorage; otherwise we make the request, and once the quota is exhausted the user has to wait for it to refill.

Recap:

  • You have learned how to use promises to make your app faster.

  • You have learned how caching works. There are many more ways to cache, like CDNs or designing a read-only database for the most frequently called requests; these will be covered in another article about how to design a rate limiter and a scalable backend service.

Resources

https://towardsdev.com/how-does-javascript-promise-work-under-the-hood-24fe991761f

https://www.sohamkamani.com/javascript/localstorage-with-ttl-expiry/

https://developer.mozilla.org/en-US/docs/Web/JavaScript/Guide/Using_promises

Did you find this article valuable?

Support Mahmoud Galal by becoming a sponsor. Any amount is appreciated!
