Rate-Limit Your Node.js API in Mongo

Update: After a request by Jason Humphrey, I’ve released this implementation as a standalone NPM module: mongo-throttle.

I needed to build a rate-limiting middleware for the new Narro public API, and I was inspired to make the database do my heavy lifting. In Narro’s case, that’s MongoDB.

Expiring Records From MongoDB

Mongo has a useful feature called a TTL index.

TTL collections make it possible to store data in MongoDB and have the mongod automatically remove data after a specified number of seconds or at a specific clock time.

You can tell Mongo to remove data for you! We will use this to remove expired request counts from our rate-limiting check. There are a couple important things to note about this feature:

  • As an index, it is set when the index is created, not per document. If you want to change the expiry window later, you’ll have to modify the index manually.
  • The index-specific field, expireAfterSeconds, is in seconds. Unlike most other timestamps in your JavaScript code, don’t divide this value by 1000.
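
As a quick sketch of that second point (the ten-minute window here is just an example value, matching the config used below):

```javascript
// Most JavaScript timestamps and timers use milliseconds...
var ttlMilliseconds = 10 * 60 * 1000; // ten minutes

// ...but expireAfterSeconds (and Mongoose's `expires`) expect plain seconds
var expireAfterSeconds = ttlMilliseconds / 1000;

console.log(expireAfterSeconds); // 600

// In the mongo shell, the equivalent index would look something like:
//   db.throttles.createIndex({ createdAt: 1 }, { expireAfterSeconds: 600 })
```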

Throttle Model

First, let’s build our model to store in our rate-limiting collection. Here we define our expires TTL index on our createdAt field (it only takes one date field to expire a record from the collection). We also define a max number of requests per IP address, with the address validated against an IPv4-specific regex.

/**
 * A rate-limiting Throttle record, by IP address
 * models/throttle.js
 */
var Throttle,
    mongoose = require('mongoose'),
    config = require('../config'),
    Schema = mongoose.Schema;

Throttle = new Schema({
    createdAt: {
        type: Date,
        required: true,
        default: Date.now,
        expires: config.rateLimit.ttl // (60 * 10), ten minutes
    },
    ip: {
        type: String,
        required: true,
        trim: true,
        match: /^(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)\.(25[0-5]|2[0-4][0-9]|[01]?[0-9][0-9]?)$/
    },
    hits: {
        type: Number,
        default: 1,
        required: true,
        max: config.rateLimit.max, // 600
        min: 0
    }
});

// Note: the `expires` option on createdAt above already creates this TTL
// index; declaring it here as well is redundant but documents the intent.
Throttle.index({ createdAt: 1 }, { expireAfterSeconds: config.rateLimit.ttl });
module.exports = mongoose.model('Throttle', Throttle);

Throttler Middleware

I’m using Express here, so I’m going to write this as a middleware library. All we want to do is find or create a Throttle record for the requesting IP and increment its hit count. Upon reaching the max, we can truncate the request chain immediately. The benefit we get from defining our model above is never having to reset records or remove them from the collection!

// Module dependencies
var config = require('../config'),
    Throttle = require('../models/throttle');

/**
 * Check for request limit on the requesting IP
 *
 * @access public
 * @param {object} request Express-style request
 * @param {object} response Express-style response
 * @param {function} next Express-style next callback
 */
module.exports = function(request, response, next) {
    'use strict';
    var ip = request.headers['x-forwarded-for'] ||
        request.connection.remoteAddress ||
        request.socket.remoteAddress ||
        (request.connection.socket && request.connection.socket.remoteAddress);

    // x-forwarded-for may hold a comma-separated chain of proxies;
    // the first entry is the original client
    ip = (ip || '').split(',')[0].trim();

    Throttle
        .findOneAndUpdate({ip: ip},
            { $inc: { hits: 1 } },
            // `new: true` returns the updated document, so `hits` is current
            { new: true, upsert: false })
        .exec(function(error, throttle) {
            if (error) {
                response.statusCode = 500;
                return next(error);
            } else if (!throttle) {
                throttle = new Throttle({
                    createdAt: new Date(),
                    ip: ip
                });
                throttle.save(function(error, throttle) {
                    if (error) {
                        response.statusCode = 500;
                        return next(error);
                    } else if (!throttle) {
                        response.statusCode = 500;
                        return response.json({
                            errors: [
                                {message: 'Error checking rate limit'}
                            ]
                        });
                    }

                    respondWithThrottle(request, response, next, throttle);
                });
            } else {
                respondWithThrottle(request, response, next, throttle);
            }
        });

    function respondWithThrottle(request, response, next, throttle) {
        // milliseconds remaining until Mongo's TTL expires this record
        var timeUntilReset = (config.rateLimit.ttl * 1000) -
                    (new Date().getTime() - throttle.createdAt.getTime()),
            remaining = Math.max(0, (config.rateLimit.max - throttle.hits));

        response.set('X-Rate-Limit-Limit', config.rateLimit.max);
        response.set('X-Rate-Limit-Remaining', remaining);
        response.set('X-Rate-Limit-Reset', timeUntilReset);
        request.throttle = throttle;
        if (throttle.hits < config.rateLimit.max) {
            return next();
        } else {
            response.statusCode = 429;
            return response.json({
                errors: [
                    {message: 'Rate Limit reached. Please wait and try again.'}
                ]
            });
        }
    }
};

Throttling In Use

Once we have our middleware in place, we can simply drop it into Express’s request-handling chain and appropriately rate-limit our clients.

var fs = require('fs'),
    throttler = require('../lib/throttler'),
    pkg = JSON.parse(fs.readFileSync('./package.json'));

// I'll assume you've defined your app instance
app.get('/api', throttler, function(req, res) {
    res.jsonp({
        meta: {
            version: pkg.version,
            name: pkg.name
        }
    });
});

In practice, I placed the throttler middleware ahead of things like authentication. If you wanted to rate-limit on something like an API key or authenticated user record, you could do so by placing authentication ahead of rate-limiting and changing the ip field on the Throttle model to something like a user ID or API key.
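
As a sketch of that variation, picking the throttle key could look like the helper below. Note that throttleKey, request.user, and the x-api-key header are my assumptions about your auth layer, not part of the middleware above (and the model’s ip regex would need to be relaxed for non-IP keys):

```javascript
// Hypothetical key selector: prefer an authenticated identity,
// fall back to the requesting IP address
function throttleKey(request) {
    return (request.user && request.user.id) ||
        request.headers['x-api-key'] ||
        request.headers['x-forwarded-for'] ||
        request.connection.remoteAddress;
}

// e.g. Throttle.findOneAndUpdate({ key: throttleKey(request) }, ...)
```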

Databases Doing Dirty-work

Eric did a great thing in the past two weeks with his implementation of self-calculating MySQL tables. In short, he wrote a table definition that updates itself on the hour, recalculating its own columns and records by determining the newly accrued data and then summing and saving rows for each of our customers. Think of it as a preemptive cache that only has as much overhead as what has accrued in the last hour, with the added benefit of being entirely contained within our MySQL table definitions.

It’s reminded me of the old adage about letting the database do the work for you. There’s usually a way to get the information collated and keyed just the way you want it, but it will take more forethought in your query design. And you more than likely won’t be able to use your shiny ORM.

Inspired by Eric’s approach, I started researching some specialty methods for MongoDB. I use Mongo as the datastore for the main service (out of a few micro-services) on Narro. MongoDB doesn’t have the job scheduling Eric employed for calculating time-series data, but it does have auto-expiry of records. I wonder what we could do with this? How about building a rate-limiting service that auto-expires request counts!

Instead of Writing

A list of things I did yesterday instead of writing a thoughtful piece here:

Timestamp - Going to a Party

I need to be there at 8, and it should take ten minutes to drive there. I request an Uber at 7:40. On the map displayed within their application, the cars disappear as I make the request. They were lies! I wait five minutes for the driver that accepted. The driver is lost and he calls me for directions. I wait another ten minutes. The driver was directed to the wrong location by his Uber application. I give up and walk the block to where he is, because he is still lost. He is driving an old truck. He drives me to the wrong entrance to the building, again only because he was directed there by Uber itself. Luckily, I make it inside.

$6.50


On the way home, I leave the building and can hail a cab immediately. The driver goes quickly, directly to my door. The car is electrically-powered, as mandated by the city government.

$8.70

Swiss Army Side Project

I was thinking today of how there’s a benefit to having a tiny side project. I hope most people do. I have quite a few, but one in particular has been useful to me. I have broken out Narro into microservices from the beginning, and one of those is the podcast feed generator service.

Originally, it was written in Node.js, which was chosen for speed of development. Then, I went through a phase of learning Go and seeing server-side code in that light. So the podcast generator was re-written in Go. It also gave me the opportunity to write a podcast generation library in Go: gopod. Now, I’m learning languages again and have begun to eye the service again. I’m thinking of Rust or Lisp at this point. As I’ve written this service multiple times, it gives me the ability to compare languages and libraries as they tackle the exact same task. It’s a valuable perspective.

Unique Humans.txt Files

I was making robots.txt and humans.txt files for Narro recently, and I wanted to find a few unique examples. I was looking for something that included more than the boilerplate code from humanstxt.org. I think the humans.txt file should be a place for a bit of expression, and rigid structure should be avoided. Please send me any others, but here are the interesting ones I found:

Accursed With a Couple Customers

I’ve been seeing a trend at some of the startup companies I’ve worked for. It tends to happen that a prolific and available customer drives the majority of revenue or traffic. That’s all well and good, but what usually happens is that this one (or two) customers start making decisions in their own best interest. Who can blame them?

The original path that would lead to many more customers for the startup is abandoned for this one customer group. It’s hard to break the cycle. You tell yourself the customer in hand is worth more than all the customers in the bush. Every article out there is telling you how lucky you are to have paying customers. You must be doing something right!

You’re actually digging yourself a hole. The hole leads to the product your one customer group wants, not your original vision for the industry/product. You’ll have to decide if you want to court this one customer/group or go after the real reason you started this. I know many reasons venture capitalists can give for pivoting the company to your existing customer base. All I can say is that I haven’t seen it work out well. In each case of my experience, it has led to the once-promising startup becoming a lap-dog to the hand that feeds it. You lose employees that are frustrated without the original driving idea. And you tie the company ever-closer to this one customer.

Anywhere I just used the word customer, you could just as easily substitute investor with the same effect.

Thoughts After App Release

In the first week of the available iOS app, the Narro community nearly doubled in size.

I was happily surprised! So far, every feature I have built for Narro has been a direct result of a) some idea I had for myself, or b) some request made by an existing user. The iOS app was no exception. As such, I was expecting mostly extant users to download Narro on iOS.

I have thought before about building things for potential Narro users. I think this proves that a feature can both satisfy current customers and attract new, unknown, markets. Here’s to happy discoveries.

Thoughts While Waiting for App Review

I just pressed the button to submit Narro for iOS into the App Store. After 12 revisions, 3 weeks of testing, and 15 external beta testers, I think it’s ready to go.

I’ve worked on teams submitting apps into closed platforms (iOS, Android, Blackberry), but this is my first app submission alone. As I settle in for the inevitable waiting period while Big Apple looks over my code, here are some thoughts:

  • The documentation isn’t as nice as you may imagine.
  • It is so cool to have a hand in something that is in the hands of so many people.
  • Waiting in lines for attention just to be told you don’t smell right is no fun.
  • Since it will be reviewed, it’s nice to feel like you’re not able to fuck things up too badly.
  • Getting approval and asking for forgiveness sucks.
  • You have only one environment to work in, but only one to worry about.
  • But not really, because iOS has legacy software and hardware, too.

There are several parallel considerations between a solid web app and a solid native app. I always enjoy optimizing page load for my web apps, and so it was this time. To optimize the app size of Narro for iOS, I used dynamically-generated images for the onboarding tutorial. I didn’t have to include any images at all!

Update 2015-11-08

Got rejected on this first submission:

Apps or metadata that mentions the name of any other mobile platform will be rejected.

Argh. Remove, recompile, resubmit.

Increased Speed and Urgency

From Jack Dorsey’s re-introduction as CEO of Twitter today:

It seems strange to desire increased urgency on your team. This brings me back to some thoughts I’ve been writing down about different types of fuel powering your work. I need to compile them all together.

A sense of urgency or stress is definitely on my list, but I also have it earmarked as one that rapidly depletes you in the end game.

Timestamp - Commuting to Work

I wake up between 6:30 and 7:30. This can happen before an alarm or after sleeping straight through two. It depends more on the day before than the night before.

Wiping sleep from my eyes, I take a shower. Feed the cats while I shave and slip in my contacts over the bathroom sink. I put on clothes, usually pants and a t-shirt and then gather things into my pockets. These include: house keys, bike-lock keys, credit card, CTA card, any cash I have left, and my phone. After opening the window Marybeth has closed the night before, I put on my shoes by the door.

I place my bluetooth earphone into my ear and pair it to my phone as I walk down the stairs to the lobby. Before I’m in the second staircase down to the garage, I’ve started playing a podcast. I unlock my bike hanging in the closet and walk it out of the depths and onto the surface streets.

I hop onto the bike and ride West on Randolph, obeying traffic laws at intersections. I have a fixed gear bike, so I have to concentrate on maintaining control as I speed down the hill. You have to keep your eyes peeled - I once found a crumpled dollar bill right in the middle of the Michigan & Randolph intersection.

I eventually reach the other side of the highway, usually after a few drivers yelling at me that bicyclists should use the sidewalk. Sometimes I ride along the river and take Kinzie heading West and the people there are less furious.

Once I’m in the West Loop, my focus shifts to avoiding the potholes. I’ve gotten three flat tires in as many weeks. I ride past the building construction for Google’s new headquarters and dozens of meat-packing plants and a few bakeries. Crossing the Metra line, I arrive at our office. It’s not our office but a co-working space where we rent out a whole floor. There are five to six other companies in the building on any given day.

I carry my bike up the back stairs and punch in the key-code on an old mechanical lock. Setting my bike down and wiping sweat from my face, I tap the keyboard paired to my iMac 5k and I’m logged in.

Memory Leaks Using Canvas in Node

At ThreadMeUp, we do much of our image manipulation and generation using HTML5 Canvas objects. This allows us to build some interesting tech, like mirroring client-side interactions with the canvas onto a Node server representation.

Recently, we ran into a problem where concurrent or repetitive canvas manipulations on the server would produce huge memory leaks. This has happened for us while using the Fabric.js library for abstraction as well as with the bare node-canvas package.

Either way, the behavior was the same. We saw canvas instances created, used correctly, and released, but never removed from actual memory. After a few large images were placed and moved around within the context, memory would climb to above 2 gigabytes.

After trying explicit calls with delete and setting values to null, this is what finally did the trick:

var canvas = fabric.createCanvasForNode(); // or however you create a context
// ... canvas manipulation
// place images, etc.
canvas.clear();
canvas.dispose(); // release the underlying node-canvas resources
// Now your garbage collection will reclaim the memory
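
To keep that cleanup from being skipped when a manipulation throws, one pattern is a try/finally wrapper. This is a generic sketch; createContext stands in for whatever factory you use (fabric.createCanvasForNode, new Canvas, etc.):

```javascript
// Run `work` against a fresh canvas and always release it afterwards,
// even if `work` throws
function withCanvas(createContext, work) {
    var canvas = createContext();
    try {
        return work(canvas);
    } finally {
        canvas.clear();
        canvas.dispose();
    }
}
```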

New Hires are a Valuable Resource

We’ve brought on two engineers in the last four weeks, and we’re aiming to keep up the pace. I’m constantly reminded of how valuable new hires can be. The person can be senior, green, front-end, back-end, local, or remote. All that needs to happen is open communication and clear on-boarding. The new member’s point of view will take care of the rest.

To me, one of the best things a new hire can do is expose confusion or weak points in the code. For example, part of your on-boarding should be orientation with the data models. If the new member gets confused on the model relationships, you should be listening with all your heart for exactly how he or she is tripping up. It’s the current team’s fault, not the new member’s. The same process applies to deployment practices, issue tracking, even team management.

The fresh eyes that any new team member brings are essential to improving both your group and your product. Humans adapt way too quickly to any problem brought into their lap - you should be eager for new blood to jolt you awake. The inevitable, though, will always occur - acclimation. You should encourage that new engineer to raise any and all concerns during the on-boarding process.

Bad Graphs in the Wild

Is this intentionally misleading?

I recently used this example as a source of interview questions, so I thought it would be worthwhile to write it down.

At the grocery store yesterday, I found this laminated graph in front of the egg refrigerators. I laughed out loud and immediately snapped a photo. There were other people around me and they seemed confused by my photography - hopefully they didn’t think the graph was helpful! I know the photo is only black and white above, but the printing was only black ink on white paper anyway.

I think this “informative” data is intentionally misleading.

  • Why is there no Y-axis unit? Is the unit “Cost?”
  • What is the baseline for the Y-axis? One dollar? 0%? One half of a Euro?
  • Is the unit spacing on the Y-axis linear or logarithmic or exponential or something else?
  • Why are the months on the X-axis unevenly spaced?
  • What is the relative price change in other goods? Maybe all food has risen in price the same relative amount.
  • Avian influenza is mentioned. When did that affect the price of eggs? There is no time of inflection.
  • Does this data belong to this year?
  • It is now nearly the end of August. What has the price been like the last 3 months?
  • What is the source of this data?

It was a fantastic source of questions for potential data visualization engineers today!

On Reading Fine Structure

I’ve sent these thoughts & instructions to friends more than once, so I should probably write them down.

From my Reading Lists: Fine Structure, by Sam Hughes

This is the content of my latest email recommendation for this series/book/collection-of-blog-posts.


This is the science fiction I was talking about during Friday’s party. It is about super heroes, hard science, time travel, universal scale and storytelling devices. I love it - read it three times.

I have some suggestions on how to read Fine Structure.

The first time reading it, do so without reading any comments. Trust me, there are pieces of information you will regret reading if you do so during your first pass. Upon a second reading of the series, you really should read the comments, as there is a great deal of thought process revealed by the writer in there.

My one request is that, if you do start reading it, you tell me. I would love to hear your thoughts! I’ll give mine.

Ah, and if you like this, the author has written other science fiction (smaller than this series) at qntm.org/fiction.

Scripting Ruby with no Internet

As I sit, I’m riding on the commuter rail as it creaks and staggers its way North. I intended to write some thoughts down, but got distracted in the hassle of touching a new Jekyll post.

So, I wrote a little Ruby script, jpost, for creating a new Jekyll draft or post from the command line.

#!/usr/bin/env ruby

# jpost
# A script to create a jekyll post with provided title & current date
# and open it with current $EDITOR
# usage within Jekyll directory:
#       $ jpost My Title Goes Here [--draft]

first = ARGV[0]
last = ARGV[ARGV.count - 1]
flag = '--draft'
name = '_drafts/'

if first == flag
    ARGV.shift(1)
elsif last == flag
    ARGV.pop
else
    name = '_posts/'
end

name += Time.now.to_s.split(' ')[0] + '-'
name += ARGV.join('-').downcase.gsub(/[^\w-]/, '')
name += '.md'

# Double-quoted so Ruby interprets the \n escapes
front_matter = "---\nlayout: post\ntitle: #{ARGV.join(' ')}\n---\n"

# Write the file directly instead of shelling out to echo,
# which would print the \n sequences literally
File.write(name, front_matter)
exec(ENV['EDITOR'] || 'vi', name)

When I worked as a photojournalist, there was a concept of chimping: glancing at the camera LCD immediately after each photograph taken. Most photographers viewed it as a cheapening of the craft - many extolled the benefits of photographing with old film Nikons simply because you were not tempted to check the camera incessantly.

It’s interesting that I feel somewhat the same about programming without the aid of sites like Stack Overflow. As I wrote this Jekyll posting script, I loved the empowering feeling as I paged through the native Ruby man pages. Having not written Ruby in a few months, it was reassuring to know that I could find my own way without kowtowing to search results.

What Is a Timestamp Post?

I was reading a short post by Max Fenton about reading books in 2015 and I had an idea to timestamp my current experiences. Partly to highlight idiosyncrasies that I notice now, partly to find them later on.

Timestamp - Chicago Tech Meetup

I leave work just before 6pm, walking about a mile to an office in River North. I stand outside the front door of the building with a small group of other attendees, waiting for someone upstairs to unlock the door remotely.

Once inside, we find that we are the only attendees yet to arrive. There are large tubs of beer, cooling on ice, and Gatorade dispensers full of vodka-lemonade placed on a ping-pong table in the center of the room. There are about 20 warm pizzas, piled in a corner. On each wall of the room hangs a 60” television monitor cycling through various graphs displaying current users of the company website and phone application. As the evening advances, the lines tick upward.

I pour two drinks and start talking to a man older than myself. When I say I start talking to him, I mean that I stand and listen to him deliver his life story to me, inching closer to my face over time. He is (was?) an infrastructure engineer for various tel-com companies from years past, and pushes up his large glasses with a finger each time he mentions one by name. He is at least 20 years older than me.

Eventually, a young woman comes over to tell us that there is pizza and drinks available, and the man begins to tell her about his experience with ageism in recent years. She explains that she is a talent recruiter for the agency organizing the event. She would be happy to refer him to one of her colleagues. I tell the recruiter my position at my company and that I am here looking for hires, which makes her face light up. The man furrows his brow at me and looks disdainfully at my t-shirt and shorts.

The CEO of the office stands up on a chair and yells for attention. A large crowd has gathered by now, sticker name-tags with written names on nearly all of them. The CEO gives a speech about how his company is hiring, how great their vision for aggregated articles will be for the future, and encourages anyone to apply for a job. He emphasizes both the ping-pong table and their multiple platform-native applications as selling points for future success.

Eventually, I talk more with the recruiter and schedule a meeting with her and a colleague at our own office the following day. I also meet a web developer starting his own cryptocurrency. It is better than the 595 other cryptocurrencies vying for attention and usage online, and he has traveled to this meetup in Chicago from Michigan to convince other developers.

I meet multiple graduate students building their own versions of popular applications, but as case studies. One woman is interviewing attendees as part of her thesis, which includes building debugging software for which she has received grant money. I talk with her and another iOS developer until the event organizers herd us to the elevators. We exchange email addresses, written on paper or typed into phone applications, and I send them messages the following day.

Contribute to Open Source as a Code Test


I counted up the number of open source code projects I actively used or extended today. Just in the last twenty hours, I tallied over 50 libraries and applications and resources.

When I’m evaluating potential hires onto our team, I’m looking at things like previous work experience. I’m also actively looking for proof of someone giving back to the development community.

I usually ask candidates to try their hand at a coding project - their choice from a couple cases. One involves a node middleware library, the other involves data visualization of drone strikes. After long thought, I’m adding to that list of choices.

When evaluating potential team members, I’ll also accept someone identifying and fixing a bug or adding a feature in an open source project. I think more companies should do this as well.

I’ve seen many sites dedicated to laying out test questions and fizz-buzz problems for hiring managers. Imagine how much would be fixed if, for just half of candidates, we instead asked them to tackle open issues on public, popular repositories.

Cubic Mapping of Text

For the past several months, I’ve taken daily notes. Previously, I scribbled and jotted thoughts into several paperback notebooks, but now I type daily into my iOS/OSX Notes application. Each day begins with a new note, date stamped at the top.

Recently, I found a method to export these notes as delectable plain text. I’m thinking of performing serious analysis on them - most importantly topic mapping. I want to create something I can poke at and watch grow.

I was thinking of mapping the extracted topics in relation to one another. I’ve seen mappings of one dimension - a list of the topics. Mapping two dimensions of a topic analysis usually leaves a graph of topic density over time (when was this topic mentioned?). I was thinking a three-dimensional mapping would be interesting.

As I see it, a useful three-dimensional mapping would give me the ability to rotate the graph in space and gain understanding of a range of topics, those dimensions being:

  • Time, dates topic was mentioned
  • Relation, similar topics lie closer in this dimension
  • Specificity, a topic may be mentioned in passing or expounded upon in this dimension
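
A guess at what a single point in that space might look like, just to make the dimensions concrete (the field names and scales are entirely hypothetical):

```javascript
// Hypothetical point in the three-dimensional topic map
var topicPoint = {
    topic: 'rate-limiting',
    time: new Date(2015, 7, 1),  // when the topic was mentioned
    relation: 0.72,              // closeness to neighboring topics
    specificity: 0.4             // in passing (low) vs. expounded upon (high)
};

console.log(topicPoint.topic + ' @ ' + topicPoint.time.toDateString());
```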

I think a mapping like this would let the author traverse the notes taken and topics mentioned in a way that would show outliers or trends in thought very easily. At least worth a shot.