Since CodeRabbit launched a couple of months ago, it has received an enthusiastic response and hundreds of sign-ups. CodeRabbit has been installed in over 1300 GitHub organizations and typically reviews more than 2000 pull requests per day. Furthermore, usage continues to flourish; we are experiencing healthy week-over-week growth.
While this rapid growth is encouraging, we've encountered challenges with
OpenAI's stringent rate limits, particularly for the newer
gpt-4 model that
powers CodeRabbit. In this blog post, we will delve into the details of OpenAI
rate limits and explain how we leveraged
FluxNinja's Aperture load management platform to
ensure a reliable experience as we continue to grow our user base.