Or watch on Vimeo
Your API is a long-term promise.
I promise that my API's actions, and how they behave, will work as described in the documentation.
This promise is hard to uphold. Time isn’t explicitly defined in that agreement. The API creator may think they can change their API whenever it’s necessary. The API consumer probably thinks the API will remain the same forever.
These two assumptions directly oppose each other.
APIs that want to iterate and improve their product could email all of their consumers, asking them to update their integrations. But what if those companies don’t have full-time software engineers on staff? What if they have a 6-month turnaround on their engineering process? What if they don’t have anyone listening to the email address attached to their API account?
APIs that want to create a perfectly stable platform could write their API once, and never make any changes. But what if you need to deprecate a feature completely? What if your website grows and evolves, leaving your API as a confusing, inconsistent facet of your product? What if you notice a glaring oversight in your original design?
In the real world, it's more complex. I've been working on a frequently changing API for many years now, and I'd like to share some of the techniques I've learned to achieve long-term stability for your API.
I’m going to expand on all of these topics in future posts. Follow me, or check back later for more updates!
If you are working with JavaScript, there’s a good chance that you have a ton of promises or callbacks nested over and over again. Promises helped me clean up the numerous callbacks, but coroutines really took it to the next level. Coroutines allow you to remove callbacks entirely, and write asynchronous code that looks completely synchronous. In a couple of quick steps, I’ll show you how to simplify your promise-based code by converting to coroutines.
Note: This article briefly talks about generators. If you would like a more thorough description, check out my article on generators!
Here’s an example using only promises. I’ve made it a little complex to really show off how powerful coroutines are. Throughout the rest of this post I’ll walk you through the conversion process.
Note: The request method performs an HTTP GET request on a URL, and returns a Promise.
function GET() {
    // Make an HTTP GET request to http://www.dashron.com
    return request('http://www.dashron.com')
        .then(function (json) {
            // parse the response
            json = JSON.parse(json);
            // Request a couple more web pages in response to the first request
            return Promise.all([
                request('http://www.dashron.com/' + json.urls[0]),
                request('http://www.dashron.com/' + json.urls[1])
            ])
                .then(function (pages) {
                    // Build the response object
                    return {
                        main: json,
                        one: pages[0],
                        two: pages[1]
                    };
                });
        })
        .catch(function (error) {
            // Handle errors
            console.log(error);
        });
}

// When the GET request is complete, log the response, which is a combination of all responses
GET().then(function (response) {
    console.log(response);
});
First you need to create a coroutine. I’ve written a library (roads-coroutine) that helps you build coroutines. This library exposes a function, which takes a generator function as its only parameter and returns a coroutine.
var coroutine = require('roads-coroutine');

var GET = coroutine(function* GET() {
    // ... Removed for brevity
});

GET().then(function (response) {
    console.log(response);
});
Find all your promises, and throw yield directly in front (without removing anything!). yield is a keyword that can only be used in generators, which means it can only appear in the body of the generator itself, not inside nested callback functions. In the example below, it was added in front of the request chain. When everything is done, yield acts like an asynchronous equals sign: it will wait until the promise is resolved, and pass the result to the left. If the promise is rejected, yield will throw the appropriate error.
var coroutine = require('roads-coroutine');

var GET = coroutine(function* GET() {
    // Make an HTTP GET request to http://www.dashron.com
    return yield request('http://www.dashron.com')
        .then(function (json) {
            // parse the response
            json = JSON.parse(json);
            // Request a couple more web pages in response to the first request
            return Promise.all([
                request('http://www.dashron.com/' + json.urls[0]),
                request('http://www.dashron.com/' + json.urls[1])
            ])
                .then(function (pages) {
                    // Build the response object
                    return {
                        main: json,
                        one: pages[0],
                        two: pages[1]
                    };
                });
        })
        .catch(function (error) {
            // Handle errors
            console.log(error);
        });
});

// When the GET request is complete, log the response, which is a combination of all responses
GET().then(function (response) {
    console.log(response);
});
Now that yield handles your promise functions for you (by returning the result, and throwing the rejection), you can just use normal variables and try/catch. In the final example, I remove all of the then and catch statements, and replace them with variables and a try/catch.
var GET = coroutine(function* GET() {
    try {
        // Make an HTTP GET request to http://www.dashron.com
        var json = yield request('http://www.dashron.com');
        // parse the response
        json = JSON.parse(json);
        // Request a couple more web pages in response to the first request
        var pages = yield Promise.all([
            request('http://www.dashron.com/' + json.urls[0]),
            request('http://www.dashron.com/' + json.urls[1])
        ]);
        return {
            main: json,
            one: pages[0],
            two: pages[1]
        };
    } catch (error) {
        // Handle errors
        console.log(error);
    }
});

// When the GET request is complete, log the response, which is a combination of all responses
GET().then(function (response) {
    console.log(response);
});
Notice all of the nesting is gone. Instead of making more requests in the then of the first promise, yield will handle the waiting for you. The above code looks synchronous, and is much easier to read.

Aesthetics aside, this solves one major headache with promises: if you ever forget a catch, your error will be ignored and lost forever (or caught by the hard-to-manage unhandledRejection handler). With coroutines, your exceptions will be thrown as expected, and can be processed as you see fit.
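For the curious, the machinery behind a coroutine library is surprisingly small. Here is a rough sketch of how such a runner could drive a generator, resolving each yielded promise before resuming. This is an illustration of the technique, not roads-coroutine's actual source:

```javascript
function coroutine(generatorFn) {
    return function () {
        var iterator = generatorFn.apply(this, arguments);

        return new Promise(function (resolve, reject) {
            function step(method, value) {
                var result;
                try {
                    // Resume the generator, sending in the resolved value
                    // (or throwing the rejection back into it)
                    result = iterator[method](value);
                } catch (err) {
                    return reject(err);
                }

                if (result.done) {
                    return resolve(result.value);
                }

                // Wait for the yielded promise, then take another step
                Promise.resolve(result.value).then(
                    function (val) { step('next', val); },
                    function (err) { step('throw', err); }
                );
            }

            step('next', undefined);
        });
    };
}
```

Because rejected promises are thrown back into the generator, a plain try/catch inside your generator catches asynchronous errors, which is exactly what makes the final example above work.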
ECMAScript 7 is adding two new keywords to support this feature natively: async functions instead of generators, and the await keyword instead of yield.
async function GET() {
    console.log(await request("http://dashron.com"));
}
There are some minor differences which make this system better (such as order of operations), but we have a while until it will be available for use. In the meanwhile, keep using roads-coroutine!
Generators were introduced in ES6, and are available on a number of platforms. While I have not used generators in the browser yet, I use them heavily in server-side io.js.
In my opinion, the number one reason to use generators is to clean up asynchronous code. Generators can also be used to create array-like objects, but their interactions with promises are incredibly powerful. This article will explain generators; a future article will explain how it applies to cleaning up asynchronous code. For now, I want to take you through the unique ways in which generators differ from normal functions.
First things first, here is a quick overview of how generators work. Some of this might not make sense yet, so take a quick glance and then read the full tutorial below.
Example 1
function* doStuff(value) {
    var foo = yield value;
    return foo;
}
Some notes about the generator:

- It is declared using function* (with an asterisk).
- It can contain one or more yield statements.
- Its body does not execute until you call the next() method of the iterator.
- Each yield will also pause execution of the generator until the iterator allows it to continue via the next() method.
- You can yield or return your value. This value will be part of the object returned by the iterator's next method.

Example 2
var iterator = doStuff("banana");
var result = iterator.next();

while (!result.done) {
    result = iterator.next();
}
Some notes about the iterator returned by a generator:

- It has a method, next(), which will resume execution of the generator until it hits the next yield statement, or the function has completed its execution.
- If you want to send a value back into the generator (e.g. "banana"), provide it as a parameter to next() on the iterator. It will be the return value of a yield statement. This is optional.
- next() returns an object with two properties, value and done.
- value contains the current value of the iterator. In this case, the yielded value.
- done will be true if the function has completed execution.
- If you want yield to throw an exception instead of returning a value, your iterator can use the throw() method.

Generators are different from normal functions in four ways:

- They have an asterisk (*) next to the function keyword (e.g. function* doStuff()). This defines the function as a generator, instead of a normal function.
- They can contain yield statements (e.g. var x = yield foo();).
- Calling one does not immediately execute its body; it returns an iterator instead.
- They start out paused, and only run when the iterator tells them to.
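The iterator's throw() method deserves a quick demonstration: the exception surfaces inside the generator, at the paused yield, where a normal try/catch can handle it.

```javascript
function* guarded() {
    try {
        yield "waiting";
    } catch (err) {
        console.log("caught: " + err.message);
    }
}

var it = guarded();
it.next(); // run up to the yield
it.throw(new Error("something broke")); // logs "caught: something broke"
```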
Before we go into why or how we use a yield statement, let's just talk about the syntax. The following example is a fairly basic line of code. We will compare that line to one with a yield statement.
Example 3
result = encodeURIComponent("http://www.dashron.com");
As you are probably aware, the above code is executed in two easy steps:

1. The assignment operator (=) requires a value on the right, so encodeURIComponent is called with a parameter.
2. The assignment operator assigns the return value of encodeURIComponent into the variable, result.

So, what happens if you add a yield statement?
Example 4
result = yield encodeURIComponent("http://www.dashron.com");
At this level, yield acts a bit like an assignment operator:

1. The assignment operator requires a value on the right; here, that value is yield encodeURIComponent("http://www.dashron.com").
2. The yield statement also requires a value on the right, so encodeURIComponent("http://www.dashron.com") is executed with the string parameter.
3. yield takes the return value of encodeURIComponent(), performs a little bit of magic (more on this later), and passes a value to the assignment operator.
4. The assignment operator assigns the result of the yield statement into the variable, result.

Note: Unlike the assignment operator, yield does not need a variable to its left. Like a function, you can use parentheses to interact with the return value in place. For example, the following is valid:
Example 5
result = (yield encodeURIComponent("http://www.dashron.com")).length;
So what can yield do? A lot, actually. It's a little complicated, so let's go over it step by step.

yield pauses your function, and allows you to resume execution at any time. I want to get that out of the way first, because it's not something you see outside of generators. In fact, you don't even need to use yield; generators always start out paused. To see how this works, let's check out a generator example without any yield statements:
Example 6
function* doStuff() {
    return "Noses on dowels";
}

var result = doStuff();
var nextResult = result.next();
In Example 6, result does NOT equal "Noses on dowels". result contains an iterator. This object is the "remote control" of your generator. It has a single method, next(). Every time you call next() on your iterator, the function will execute up until: (1) it encounters a yield statement; or (2) the function has finished execution. Here, result contains your iterator, and nextResult contains information about the current iteration.
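Running Example 6 and inspecting nextResult makes this concrete: the value you returned only shows up inside the object that next() hands back.

```javascript
function* doStuff() {
    return "Noses on dowels";
}

var result = doStuff();
var nextResult = result.next();

console.log(nextResult.value); // "Noses on dowels"
console.log(nextResult.done);  // true
```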
Now let's add a couple of yield statements into the mix:
Example 7
function* doStuff() {
    var catchphrase = yield "Didja get that thing I sent you";
    var finalphrase = yield catchphrase;
    return finalphrase;
}

var result = doStuff();
var nextResult = result.next().value;
var secondResult = result.next("Blackwatch Plaid");
var finalResult = result.next("Happy Cake Oven");
Each time you call next(), it executes part of the doStuff() function. Let's break down Example 7 into each call to next().

The first next()

Any time you call next() it behaves identically, except for the first and last time. Let's walk through each next() call in order, starting with var nextResult = result.next();. This call will execute the code shown in Example 7.1.
Example 7.1
yield "Didja get that thing I sent you";
Notice that the code to the left of the yield statement (var catchphrase =) is not shown in Example 7.1, because it is not executed at this time. That's because the yield statement pauses execution before it can happen! You must interact with your iterator to continue to the rest of the code. So let's review the second next() call, var secondResult = result.next("Blackwatch Plaid");. This call will execute the code shown in Example 7.2.

The second next()
Example 7.2
var catchphrase = yield
yield catchphrase;
The first line of code in Example 7.2 needs to assign a value to the variable catchphrase. The assignment operator is expecting a value from the yield statement, and this value is provided by the iterator's next() method. Example 7.2's code is executed when you call result.next("Blackwatch Plaid");, so yield returns "Blackwatch Plaid".
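Logging each call from Example 7 shows the values flowing in both directions: out through value, and back in through next()'s parameter.

```javascript
function* doStuff() {
    var catchphrase = yield "Didja get that thing I sent you";
    var finalphrase = yield catchphrase;
    return finalphrase;
}

var result = doStuff();

console.log(result.next());                   // { value: "Didja get that thing I sent you", done: false }
console.log(result.next("Blackwatch Plaid")); // { value: "Blackwatch Plaid", done: false }
console.log(result.next("Happy Cake Oven"));  // { value: "Happy Cake Oven", done: true }
```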
Example 7.2 above is important, and worth re-reading. This is the standard behavior of an iterator's next() method. Every time you call next(), a chunk of your generator will be executed, until there is no code left to run. Once there is no code left, calling next() again won't accomplish anything, so you need to keep track of one more piece of information: the done property.

Example 7.3 demonstrates the final code in this generator's execution.
Example 7.3
var finalphrase = yield
return finalphrase;
This contains everything that is executed between the final yield and return statements. In Example 7, this code is run the third time next() is called. Calling next() a fourth time is not terribly useful: it will come back with done set to true and no value, without executing any code. To make sure you don't call next() unnecessarily, you need to keep an eye on the return values of next(). Each time next() is called, it returns an object with two properties:
- value: This depends on the execution. If this is not the final next() call, it will contain the yielded value. If this is the final next() call, it will contain the returned value.
- done: true if the generator has completed execution, false otherwise.
otherwise.So if done
is true, you should stop calling next()
.
Example 7 did not make use of the done property because it wasn't necessary. done is used most commonly in more complex code, so let's jump into our final example.
Example 8
function* getTen() {
    for (var i = 0; i < 10; i++) {
        yield i;
    }
}

var gen = getTen();
Notice that the generator in Example 8 only has one visible yield statement. This does not mean that the function execution will only be paused once. Because the yield is inside a for loop, each iteration of the loop will reach the yield and pause execution. This specific function will pause execution 10 times, sending out a number each time (0 through 9).
To properly execute the generator you will need to call the next() method many times. I'm lazy, and I don't want to copy the next() call over and over again. Instead, we can throw next() into a loop and check the return value each time. next() returns the object mentioned above (with Example 7.3), so you should watch its done property. As long as it evaluates to false, we can continue to call this iterator's next() method.
Example 9
var progress = gen.next();

while (!progress.done) {
    console.log(progress.value);
    progress = gen.next();
}
And now we’re done! Your generator will be processed completely, hitting every yield
statement until the function is complete. But what does this have to do with asynchronous code and callbacks? I will be writing more on that in the near future, so check back soon!
It took me way too long to figure out how to get S3 CORS headers working, so here are my notes.
<?xml version="1.0" encoding="UTF-8"?>
<CORSConfiguration xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
    <CORSRule>
        <AllowedOrigin>*</AllowedOrigin>
        <AllowedMethod>GET</AllowedMethod>
        <MaxAgeSeconds>3000</MaxAgeSeconds>
    </CORSRule>
</CORSConfiguration>
- Request the object directly via [bucket].s3.amazonaws.com/[object].
- If you are loading the object through an HTML tag, that tag needs the attribute crossorigin="anonymous". Read more here.

Check out MDN for more information about CORS headers.
At API Dublin I spoke about Vimeo’s upload API, and how we rebuilt it from the ground up.
Yesterday I launched Bifocals.js, a node library for handling http responses.
It was my first big launch, and I learned a ton from it. Before I go into that, let's look at some numbers.
That visit duration is abysmal. Clearly the docs need to be improved.
I received the best discussion via Facebook, and then Hacker News. No one initially knew what the hell my library did. So I wrote up a new description, which will be added to the docs later.
Bifocals makes it incredibly easy to split your web page up into tiny chunks. It might not be immediately obvious why this is useful, but it becomes slightly more clear with an understanding of the JavaScript event queue.
Any time an HTTP request hits the server, it puts your callback into a queue. Node processes this queue in order. Every time you handle a callback for an HTTP request, a database request, or any socket or I/O in general, it uses this queue. Additionally, process.nextTick will add functions onto this queue.

If each of your callbacks operates more quickly, it should improve request times across your application. Each time a callback completes, it releases control to another callback. With a mix of pages that render at different speeds, faster requests should get out of the way while longer requests are still handling I/O.
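The effect of the queue is easy to see in isolation. This generic Node illustration (not Bifocals-specific) shows a quick callback finishing ahead of a slow one, even though both were queued together:

```javascript
var order = [];

// A slow page and a fast page, "rendering" concurrently
setTimeout(function () { order.push('slow page rendered'); }, 50);
setTimeout(function () { order.push('fast page rendered'); }, 0);

setTimeout(function () {
    console.log(order); // the fast page got out of the way first
}, 100);
```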
Bifocals not only allows your view functions to be smaller, but also allows them to operate out of order. All of your views can start performing I/O at the same time, and no matter which one finishes first, Bifocals will render the final output accurately.

When all of these techniques are put together, it should theoretically create faster overall page speeds across the entire site. (I hope to have real stats soon.)
If you don’t need these benefits, bifocals offers two additional nice features.
Hopefully this is a little more clear. Thanks to everyone who had comments, I have some cool new features coming soon.