Pre-ES6 Generators

You’re hopefully convinced now that generators are a very important addition to the async programming toolbox. But they’re new syntax in ES6, which means you can’t just polyfill them as you can Promises (which are just a new API). So what can we do to bring generators to our browser JS if we don’t have the luxury of ignoring pre-ES6 browsers?

For all new syntax extensions in ES6, there are tools — the most common term for them is transpilers, for trans-compilers — which can take your ES6 syntax and transform it into equivalent (but obviously uglier!) pre-ES6 code. So, generators can be transpiled into code that will have the same behavior but work in ES5 and below.

But how? The “magic” of yield doesn’t obviously sound like code that’s easy to transpile. We actually hinted at a solution in our earlier discussion of closure-based iterators.
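As a quick refresher, here’s a sketch of that closure-based iterator idea from earlier (the particular number sequence is just an illustrative stand-in): the "suspended" state survives between next() calls because it lives in the enclosing scope, not on a call stack.

```javascript
// a closure-based "iterator": `nextVal` persists between
// calls because it lives in the enclosing function scope
var something = (function(){
    var nextVal;

    return {
        next: function(){
            if (nextVal === undefined) {
                nextVal = 1;
            }
            else {
                nextVal = (3 * nextVal) + 6;
            }

            return { done: false, value: nextVal };
        }
    };
})();

something.next().value;     // 1
something.next().value;     // 9
something.next().value;     // 33
```

That same trick — closure holding state across calls — is exactly what we’ll lean on to emulate a generator’s suspension.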

Manual Transformation

Before we discuss the transpilers, let’s derive how manual transpilation would work in the case of generators. This isn’t just an academic exercise, because doing so will actually help further reinforce how they work.

Consider:

```js
// `request(..)` is a Promise-aware Ajax utility

function *foo(url) {
    try {
        console.log( "requesting:", url );
        var val = yield request( url );
        console.log( val );
    }
    catch (err) {
        console.log( "Oops:", err );
        return false;
    }
}

var it = foo( "http://some.url.1" );
```

The first thing to observe is that we’ll still need a normal foo() function that can be called, and it will still need to return an iterator. So, let’s sketch out the non-generator transformation:

```js
function foo(url) {
    // ..

    // make and return an iterator
    return {
        next: function(v) {
            // ..
        },
        throw: function(e) {
            // ..
        }
    };
}

var it = foo( "http://some.url.1" );
```

The next thing to observe is that a generator does its “magic” by suspending its scope/state, but we can emulate that with function closure (see the Scope & Closures title of this series). To understand how to write such code, we’ll first annotate different parts of our generator with state values:

```js
// `request(..)` is a Promise-aware Ajax utility

function *foo(url) {
    // STATE *1*

    try {
        console.log( "requesting:", url );
        var TMP1 = request( url );

        // STATE *2*
        var val = yield TMP1;
        console.log( val );
    }
    catch (err) {
        // STATE *3*
        console.log( "Oops:", err );
        return false;
    }
}
```

Note: For more accurate illustration, we split up the val = yield request.. statement into two parts, using the temporary TMP1 variable. request(..) happens in state *1*, and the assignment of its completion value to val happens in state *2*. We’ll get rid of that intermediate TMP1 when we convert the code to its non-generator equivalent.

In other words, *1* is the beginning state, *2* is the state if the request(..) succeeds, and *3* is the state if the request(..) fails. You can probably imagine how any extra yield steps would just be encoded as extra states.
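To make that concrete, here’s a hypothetical two-yield generator annotated the same way, with request(..) stubbed as a trivial promise-returning function purely so the sketch is self-contained. Each additional yield just introduces one more success state (error states, if we had a try..catch, would be additional):

```javascript
// hypothetical stub standing in for the Promise-aware
// Ajax utility, just so this sketch is self-contained
function request(url) {
    return Promise.resolve( "response for: " + url );
}

function *bar(url) {
    // STATE *1*
    var res1 = yield request( url );

    // STATE *2*: first request succeeded
    var res2 = yield request( url + "/next" );

    // STATE *3*: second request succeeded
    return res1 + " | " + res2;
}
```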

Back to our transpiled generator, let’s define a variable state in the closure we can use to keep track of the state:

```js
function foo(url) {
    // manage generator state
    var state;

    // ..
}
```

Now, let’s define an inner function called process(..) inside the closure which handles each state, using a switch statement:

```js
// `request(..)` is a Promise-aware Ajax utility

function foo(url) {
    // manage generator state
    var state;

    // generator-wide variable declarations
    var val;

    function process(v) {
        switch (state) {
            case 1:
                console.log( "requesting:", url );
                return request( url );
            case 2:
                val = v;
                console.log( val );
                return;
            case 3:
                var err = v;
                console.log( "Oops:", err );
                return false;
        }
    }

    // ..
}
```

Each state in our generator is represented by its own case in the switch statement. process(..) will be called each time we need to process a new state. We’ll come back to how that works in just a moment.

For any generator-wide variable declarations (val), we move those to a var declaration outside of process(..) so they can survive multiple calls to process(..). But the “block scoped” err variable is only needed for the *3* state, so we leave it in place.

In state *1*, instead of yield request(..), we did return request(..). In terminal state *2*, there was no explicit return, so we just do a return; which is the same as return undefined. In terminal state *3*, there was a return false, so we preserve that.

Now we need to define the code in the iterator functions so they call process(..) appropriately:

```js
function foo(url) {
    // manage generator state
    var state;

    // generator-wide variable declarations
    var val;

    function process(v) {
        switch (state) {
            case 1:
                console.log( "requesting:", url );
                return request( url );
            case 2:
                val = v;
                console.log( val );
                return;
            case 3:
                var err = v;
                console.log( "Oops:", err );
                return false;
        }
    }

    // make and return an iterator
    return {
        next: function(v) {
            // initial state
            if (!state) {
                state = 1;
                return {
                    done: false,
                    value: process()
                };
            }
            // yield resumed successfully
            else if (state == 1) {
                state = 2;
                return {
                    done: true,
                    value: process( v )
                };
            }
            // generator already completed
            else {
                return {
                    done: true,
                    value: undefined
                };
            }
        },
        "throw": function(e) {
            // the only explicit error handling is in
            // state *1*
            if (state == 1) {
                state = 3;
                return {
                    done: true,
                    value: process( e )
                };
            }
            // otherwise, an error won't be handled,
            // so just throw it right back out
            else {
                throw e;
            }
        }
    };
}
```

How does this code work?

  1. The first call to the iterator’s next() moves the generator from the uninitialized state to state 1, and then calls process() to handle that state. The return value from request(..), which is the promise for the Ajax response, is returned as the value property from the next() call.
  2. If the Ajax request succeeds, the second call to next(..) should send in the Ajax response value, which moves our state to 2. process(..) is again called (this time with the passed-in Ajax response value), and the value property returned from next(..) will be undefined.
  3. However, if the Ajax request fails, throw(..) should be called with the error, which moves the state from 1 to 3 (instead of 2). Again process(..) is called, this time with the error value. That case returns false, which is set as the value property returned from the throw(..) call.

From the outside — that is, interacting only with the iterator — this foo(..) normal function works pretty much the same as the *foo(..) generator would have worked. So we’ve effectively “transpiled” our ES6 generator to pre-ES6 compatibility!

We could then manually instantiate our generator and control its iterator — calling var it = foo("..") and it.next(..) and such — or better, we could pass it to our previously defined run(..) utility as run(foo,"..").
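For reference, a minimal run(..)-style utility only needs the next(..)/throw(..) iterator interface that our hand-transpiled foo(..) provides. This is a simplified sketch, not the more complete version defined earlier in the text:

```javascript
// minimal sketch of a run(..)-style utility: drives any
// iterator whose yielded values are promises, using only
// the next(..)/throw(..) interface
function run(gen) {
    var args = [].slice.call( arguments, 1 );
    var it = gen.apply( null, args );

    function step(result) {
        if (result.done) {
            return result.value;
        }

        // wait on the yielded promise, then resume the
        // iterator with its success value or its error
        return Promise.resolve( result.value ).then(
            function(v){ return step( it.next( v ) ); },
            function(e){ return step( it.throw( e ) ); }
        );
    }

    return Promise.resolve().then( function(){
        return step( it.next() );
    } );
}
```

Calling run( foo, "http://some.url.1" ) would then drive either the real generator or our hand-transpiled equivalent to completion, since both expose the same iterator interface.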

Automatic Transpilation

The preceding exercise of manually deriving a transformation of our ES6 generator to a pre-ES6 equivalent teaches us how generators work conceptually. But that transformation was really intricate and very non-portable to other generators in our code. It would be quite impractical to do this work by hand, and would completely negate the benefit of generators.

But luckily, several tools already exist that can automatically convert ES6 generators to things like what we derived in the previous section. Not only do they do the heavy lifting work for us, but they also handle several complications that we glossed over.
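One such complication is a yield inside a loop (a hypothetical case our manual version never had to face), where the state machine must re-enter the same "suspended" state over and over. Compare a loop-yielding generator with a hand-rolled pre-ES6 equivalent:

```javascript
// hypothetical: a yield inside a loop means the generator
// suspends and resumes at the same spot repeatedly
function *countTo(n) {
    for (var i = 1; i <= n; i++) {
        yield i;
    }
}

// a hand-rolled pre-ES6 equivalent has to keep `i` in the
// closure and re-enter the "suspended in the loop" state
// on every next() call
function countToManual(n) {
    var i = 0;

    return {
        next: function(){
            i++;
            if (i <= n) {
                return { done: false, value: i };
            }
            return { done: true, value: undefined };
        }
    };
}
```

This one was easy to untangle by hand, but loop state mixed with try..catch and multiple yields gets hairy fast, which is precisely the bookkeeping these tools automate.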

One such tool is regenerator (https://facebook.github.io/regenerator/), from the smart folks at Facebook.

If we use regenerator to transpile our previous generator, here’s the code produced (at the time of this writing):

```js
// `request(..)` is a Promise-aware Ajax utility

var foo = regeneratorRuntime.mark(function foo(url) {
    var val;

    return regeneratorRuntime.wrap(function foo$(context$1$0) {
        while (1) switch (context$1$0.prev = context$1$0.next) {
        case 0:
            context$1$0.prev = 0;
            console.log( "requesting:", url );
            context$1$0.next = 4;
            return request( url );
        case 4:
            val = context$1$0.sent;
            console.log( val );
            context$1$0.next = 12;
            break;
        case 8:
            context$1$0.prev = 8;
            context$1$0.t0 = context$1$0.catch(0);
            console.log("Oops:", context$1$0.t0);
            return context$1$0.abrupt("return", false);
        case 12:
        case "end":
            return context$1$0.stop();
        }
    }, foo, this, [[0, 8]]);
});
```

There are some obvious similarities here to our manual derivation, such as the switch / case statements, and we even see val hoisted out to the enclosing function scope just as we did.

Of course, one trade-off is that regenerator’s transpilation requires a helper library, regeneratorRuntime, which holds all the reusable logic for managing a general generator / iterator. A lot of that boilerplate looks different than our version, but even then the concepts can be seen, like context$1$0.next = 4 keeping track of the next state for the generator.

The main takeaway is that generators are not restricted to only being useful in ES6+ environments. Once you understand the concepts, you can employ them throughout your code, and use tools to transform the code to be compatible with older environments.

This is more work than just using a Promise API polyfill for pre-ES6 Promises, but the effort is totally worth it, because generators are so much better at expressing async flow control in a reason-able, sensible, synchronous-looking, sequential fashion.

Once you get hooked on generators, you’ll never want to go back to the hell of async spaghetti callbacks!