Recipe quick takes: Claire Ptak’s Egg-Yolk Chocolate-Chip Cookies

Well, that backlog I mentioned in the last post is still there, but I made something this weekend, and since I didn’t actually take many photos, I figured I might as well write about it while it’s still reasonably within reach of memory.

I’m a sucker for a new recipe for chocolate-chip cookies (or brownies — I did the rye brownie recipe from the same source a few months ago). This past week I received a big order (more than 6 kg) from Chocosphere, all various Valrhona products, mostly repackaged from their commercial bakery line: most of these products are sold in 2 kg or 3 kg packages, which was far more commitment than I was interested in, but Chocosphere repackages them into home-baker-friendly 1 kg bags. Among them were Valrhona’s 60% cacao chocolate chips (“chips noires”). By US standards you’d probably call them “mini chips” — they aren’t as big as the standard Nestlé/Hershey/Ghirardelli/Guittard cookie chips — but I figured I might as well find a cookie recipe to try them in, and ran across the “Egg Yolk Chocolate Chip Cookies” in Claire Ptak’s The Violet Bakery Cookbook (Ten Speed Press, 2015; p. 140). She attributes the basic idea — changing the texture of the cookie by using only egg yolks instead of whole eggs — to French pastry chef Pierre Hermé.

I don’t have a photo of the mise en place here. The ingredients are the ones you’d expect, but this recipe is formulated for “even” proportions in metric measures: 250 g softened unsalted butter, 200 g light brown sugar, 100 g granulated sugar, ½ tsp vanilla extract, 3 egg yolks, 325 g all-purpose flour (I’d bet you could use pastry flour here, since the recipe was probably developed for soft wheat), 1¼ tsp kosher salt, ¾ tsp baking soda, and 250 g dark chocolate chips (“or broken-up bar of your favorite chocolate” — not many chocolates come in 250-gram bars!). The recipe proceeds by a modified creaming method: ingredients are combined in the normal order, in a stand mixer, but only until just mixed; Ptak specifically cautions that “you are not aiming for light and fluffy here”. The recipe yields about 1175 g of dough, which for the stated yield of 16 cookies works out to 70–75 g of dough per cookie.

The second departure from the usual method is that the dough is portioned and frozen completely before baking. I found experimentally that a #20 disher — the most common size; if you only have one disher, it’s probably the one you think of as “the ice-cream scoop with the sweeper thing” — leveled off against the side of the bowl yields almost exactly 70 g per portion (a few were as low as 65 g and some as high as 80 g, but I’m picky). These are placed in a parchment-lined baking pan — spacing doesn’t matter — covered in plastic wrap, and frozen for at least an hour (I left mine in overnight). To bake, the solid balls are placed on a parchment-lined cookie sheet, spaced widely, and allowed to thaw slightly (5–10 minutes) before baking in a preheated 355°F (180°C) oven for 18–20 minutes. (I don’t know who did that unthinking Fahrenheit conversion: most people don’t have oven controls graduated that finely, and even for those of us who do, oven temperatures cycle within a range of 20 F° or more anyway. I usually convert 350°F as 175°C, but 180°C isn’t materially different. Still, my oven has a digital display, so I set it to 355° rather than 350°.)

Cookies, baked and unbaked, on cookie sheets
The photo above shows one sheet of fully baked cookies on the left, and another sheet of still-frozen dough balls on the right. (I ate one of the cookies the previous night, before it was frozen, which is why there’s one missing — it was the last of the batch and a bit runty at only 55 g; probably could have gotten ten more grams for a nearly-full serving with less sampling.)

Single cookie on a plate
The cookies were allowed to cool completely on a wire rack. Above is a close-up view of a single cookie on a plate.

Cookie broken in two to show texture
I broke the sample cookie in two to demonstrate the depth and texture. You can see that this recipe is a good match for “mini” chips like the Valrhona ones I used: larger chips would have been less evenly distributed through the dough, but in this style every bite of cookie has some chocolate in it. (Click on the image to see the full resolution: you can see how, even 10 hours after baking, the interior of the cookie is still visibly moist. These cookies are very buttery!)

Overall, these cookies have an extremely tender mouthfeel, very different from the crisp texture of the Default Recipe, but still quite enjoyable — they’re almost like a shortbread with chocolate chips, but moister. The flavor has none of the toffee notes of the Default Recipe, either, belying this recipe’s much higher ratio of brown to white sugar; that must be a result of the much lower baking temperature (both oven and dough temperature). Starting from frozen dough definitely keeps the cookies from spreading quite so much, giving them a good thickness — I make the Default Recipe with 50-gram portions, and these are about the same diameter despite having half again as much dough. Highly recommended. I’m not sure I’ll share any of these with my work colleagues.

Nutrition

In the absence of nutrition data for the Valrhona chips, I substituted the readily available Ghirardelli 60% dark chocolate baking chips for this computation, although they are of the larger “standard American chocolate chip” size.
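The arithmetic behind the table is nothing fancy: per-100 g nutrient figures for each ingredient, scaled by the recipe quantities and divided by the 16 servings. Here’s a minimal sketch of the approach — the per-100 g calorie figures are approximate, generic label/USDA-style values for illustration, not the exact ones I used:

```python
# Approximate calories per 100 g (generic label-style figures; illustrative only)
CAL_PER_100G = {
    "butter": 717, "brown sugar": 380, "granulated sugar": 387,
    "egg yolk": 322, "flour": 364, "chocolate chips": 480,
}

# Recipe quantities in grams (3 yolks at roughly 17 g each)
GRAMS = {
    "butter": 250, "brown sugar": 200, "granulated sugar": 100,
    "egg yolk": 51, "flour": 325, "chocolate chips": 250,
}

SERVINGS = 16
total_kcal = sum(CAL_PER_100G[k] * g / 100 for k, g in GRAMS.items())
print(round(total_kcal / SERVINGS))  # ~343: within a few percent of the 351 below
```

The same ledger, run once per nutrient, produces the whole table.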

Nutrition Facts
Serving size: 1 cookie (70 g dough, uncooked weight)
Servings per recipe: 16
Amount per serving
Calories 351 (Calories from fat 171)
% Daily Value
Total Fat 19 g 30%
 Saturated Fat 12 g 59%
Trans Fat 0 g
Cholesterol 68 mg 19%
Sodium 145 mg 6%
Total Carbohydrate 42 g 14%
 Dietary fiber 2 g 7%
 Sugars 26 g
Protein 4 g 9%
Vitamin A 10%
Vitamin C 0%
Calcium 0%
Iron 12%

Still here

I have worked up quite a backlog of posts that need to get written. I was hoping to make some progress on that this weekend, but that didn’t pan out. Neither did the 55-mile bike ride I was planning on doing. (It’s hard to do stuff in the morning if you don’t wake up in the morning.) I got distracted by Jean Yang’s blog post “The Genius Fallacy”, and felt like I ought to respond in some way to it (or to the things it reminded me of), but in the end I couldn’t figure out what actual point I wanted to make, so that didn’t happen either. (That’s how most of my projects founder, to be honest: they start with an idea, or more often a snippet of imagined dialogue, but don’t manage to develop enough to actually be worth sitting down at the keyboard and turning into writing.)

Here’s the other stuff on my agenda:

  • A short write-up on a brownie recipe I didn’t much care for
  • An essay on some questions related to gender, inspired in part by the Worldcon 75 program
  • A few photo collections from a day and a half of post-Worldcon walking around Helsinki
  • Another recipe write-up on some chocolate-chip cookies I haven’t actually made yet

More musings on commuter rail

Earlier this week, I tweeted this:

There are two largely independent backstories to this tweet. The first is that I’m going back to Helsinki next week to attend the 75th World Science Fiction Convention; when I was in Helsinki last March and April, I was inspired to write a whole lot about that city’s excellent transit system (see post 1 and post 2). The second is the current Commonwealth Ave. overpass reconstruction project in Boston, which was projected to have some deleterious effects on my commute, and which made me take a more serious look at the possibility of taking commuter rail into work — at least for the duration of the construction. I ultimately decided that paying $22.50 a day plus an extra hour and a half of my time was not worth it, given that my car commute is out of peak hours and costs about the same when you factor in parking, tolls, and fuel. But it made me think about the state’s current level of (dis)investment in public transportation infrastructure and what it would take to get me out of the car on those days when schedule or weather doesn’t allow for a bike commute (which is more than half the year). I concluded that commuter rail would have to offer sufficiently frequent service, even at the hours I work, and get me from Framingham to Kendall Square in less than 45 minutes — not as good as my car commute, 35 minutes parking space to parking space, but at least in the same ballpark, and if implemented properly it would be significantly less variable.

How could you do that, given that the current Framingham-to-South Station run is scheduled to take 49 minutes, and then there’s the Red Line beyond that? The answer, as it turns out, is pretty simple: Electric Multiple Units, or EMUs — a standard passenger rail technology throughout the world, which (when combined with the appropriate investments in track, overhead electrification, and high-level platforms at stations) can significantly reduce travel times by accelerating much faster than conventional locomotive-hauled trains, especially the diesel locomotives currently used throughout the MBTA commuter rail network. Helsinki has such a system (actually the only commuter-rail network in Finland — the rest of the country isn’t dense enough to support it), which clearly demonstrates that a cold climate in a maritime city is no obstacle to successful implementation. Helsinki’s system provides service on multiple lines from the central business district to the airport — a distance similar to my commute — every fifteen minutes. Helsinki uses a customized cold-weather version of the Stadler FLIRT for most of their services, and I know that a number of US transit agencies have ordered FLIRT equipment for their own commuter rail services, so I looked up the performance details and sat down with a simplified line diagram and a calculator to figure out what that service would look like.

The FLIRT is typically configured for a maximum speed of 160 km/h (99 mi/h). At a typical acceleration of 1.02 m/s² (depending on configuration, this can vary from 0.8 to 1.2 m/s²), reaching that speed takes 43.5 seconds and about six tenths of a mile. (Actually, I chose that acceleration value to make it work out to exactly 0.6 mile, or 965 m!) I’m assuming that the entire Framingham–Worcester Line is rated for 99 mi/h. (It’s not, but remember, we’re what-ifing an investment in better service, and that would involve electrification, trackbed improvements, new platforms, and possibly some grade-crossing improvements or eliminations.) I also assume that there’s a “terminal zone” between South Station and the future West Station where speeds are limited by interlockings (junctions with other lines and the switches leading into South Station). I assume that the train can accelerate and decelerate at the same rate, and that this would be done in practice (probably not), just because it makes the math come out easier. Finally, I assume an average dwell time of 30 seconds at each station — since I don’t take the commuter rail right now, I don’t know whether that’s optimistic or pessimistic.
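If you want to check the arithmetic, the station-to-station running time under these assumptions is elementary kinematics: accelerate at a constant rate, cruise at line speed if the gap between stations is long enough, brake at the same rate, and add the dwell. A quick sketch — the constants are my assumptions above, not spec-sheet values:

```python
import math

V_MAX = 160 * 1000 / 3600  # 160 km/h in m/s (about 44.4 m/s)
ACCEL = 1.02               # m/s^2; braking assumed equal
DWELL = 30.0               # seconds per station stop

def run_time(dist_m):
    """Stop-to-stop time in seconds for one segment, including one dwell."""
    ramp = V_MAX ** 2 / ACCEL          # distance spent accelerating plus braking
    if dist_m >= ramp:
        t = 2 * V_MAX / ACCEL + (dist_m - ramp) / V_MAX
    else:
        t = 2 * math.sqrt(dist_m / ACCEL)  # short hop: never reaches line speed
    return t + DWELL

print(run_time(3000) / 60)  # a 3 km station spacing: about 2.4 minutes
```

Sum that over the line diagram’s station spacings (with a speed cap through the terminal zone) and you get the schedule below.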

So what does this schedule look like? Well, consider, for comparison’s sake, the current MBTA train 552, which leaves Worcester Union Station at 8:00 AM and arrives at South Station at 9:06, for a scheduled travel time of one hour and six minutes. This train runs express from Worcester to Yawkey, so it has only two station stops aside from the termini — and it creates a huge gap in the schedule for everyone else, because the Framingham–Worcester Line is only two tracks, and there’s no way for an express to pass a local train making an intermediate station stop. Now compare that with the following schedule, making all station stops:

Worcester 8:00
Grafton 8:04
Westborough 8:06
Southborough 8:11
Ashland 8:13
Framingham 8:16
West Natick 8:18
Natick Center 8:20
Wellesley Square 8:22
Wellesley Hills 8:24
Wellesley Farms 8:26
Auburndale 8:28
West Newton 8:29
Newtonville 8:30
Boston Landing 8:33
(West Station) 8:34
Yawkey 8:36
Back Bay 8:38
South Station 8:41

Change ends at South Station and the same trainset leaves for Worcester at 9:00. What’s more, you can start a second trainset at Framingham, also at 8:00, and it gets to South Station at 8:27, so it can become the 8:45 outbound. (In the future, of course, you’ve also converted the Grand Junction branch and it gets Framingham residents a one-seat ride to Kendall in 25 minutes!) Repeat the same pattern every half hour from 6 AM to 11 PM, and you’ve made an enormous improvement in regional mobility and given thousands of people a practical reason to get out of their cars and onto the train. It takes, I think, four trainsets to run this service, not counting spares shared with other lines.
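The trainset count is just cycle time against headway: 41 minutes each way per the schedule above, plus a 19-minute turnaround at each end, against a train every half hour. A back-of-the-envelope check of the half-hourly Worcester pattern, not an operations plan:

```python
import math

run = 41         # Worcester to South Station, minutes (8:00 -> 8:41)
turnaround = 19  # layover before heading back out (8:41 -> 9:00)
headway = 30     # one departure every half hour

cycle = 2 * (run + turnaround)     # full round trip: 120 minutes
print(math.ceil(cycle / headway))  # -> 4 trainsets
```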

Well, it was a good dream, anyway. We all know that something this useful has absolutely no chance of ever making it through the MBTA bureaucracy or Beacon Hill. Numbers available on request if you want to check my math.


Administrivia: service interruption

Hi, folks. Just wanted to note that I’ve been blogging less this summer by intention, due to a combination of increased travel, the regular summer bike commute (finally!), and decreased baking (thanks to increased weight :-( ). Expect regular service to resume sometime in September or October. Hopefully some photos from my trip to Worldcon 75 in Helsinki in mid-August will make it up before then.


“Mycroft’s Delight” Revisited

This gallery contains 15 photos.

I’ve made Diane Duane’s “Mycroft’s Delight” a couple of times in the past (I wrote it up back in January, 2016) but there were some changes I wanted to make — in particular, getting rid of the tropical-oil-laden Nutella used …


Other people’s recipes: Fritz Knipschildt’s Chocolate–Peanut Butter Cookies

I should have gone out for a long bike ride today, but instead I’m writing about some chocolate-chip peanut-butter cookies I made on Saturday. At least that means I’m very nearly caught up with my posting backlog (well, except for Reykjavik). This recipe comes from Danish chocolatier Fritz Knipschildt’s Chocopologie (Houghton Mifflin Harcourt, 2015), co-written with Mary Goodbody. The name of the book comes from Knipschildt’s line of confections and former bakery-cafe in Norwalk, Connecticut, and while all of the recipes either feature chocolate or are intended to accompany chocolate, the book is rather more on the bakery side than confectionery. I’ve made a brownie recipe from this book before, but this is the first of his drop-cookie recipes I’ve tried. He calls them “chocolate–peanut butter cookies” (p. 25), but I’d say “chocolate chip–peanut butter” would be more accurate.

Mise en place
Let’s start as usual with the mise en place. This book is unfortunately one of those that includes only volumetric measurements; I used nutrition labels and Harold McGee to determine measurements by mass for the ingredients where it matters. Clockwise from top left: 85 g of unsalted butter, 128 g of smooth peanut butter, one large egg, ½ cup (120 ml) of vegetable oil, pure vanilla extract (½ tbl is used here), 240 g of all-purpose flour, 120 g of confectioners’ sugar, a 10 oz (280 g) bag of mini semi-sweet chocolate chips, a small bowl of leaveners and salt, and 120 g of light brown sugar. The small bowl contains ½ tsp each of baking powder, baking soda, and salt.

Egg-oil emulsion
While the butter, peanut butter, and sugars are being creamed together in the stand mixer, the egg and oil are whisked together to form an emulsion.

Creamy batter before adding dry ingredients
The egg-oil emulsion is then stirred into the creamed butters and sugars until fully incorporated. The remaining dry ingredients (except chips) are stirred together and then slowly added to the wet ingredients just until fully combined.

Peanut butter batter before folding in chips
Once the peanut-butter dough comes together (and you could stop here and have a pretty decent peanut-butter cookie, or perhaps mix in chopped peanuts to complete the effect), it will be quite stiff. The mini chocolate chips are then folded in by hand with a rubber spatula. Knipschildt calls for 1¾ cups of mini chips, but I figured that a 12 oz bag is usually “2 cups” (whether it actually is or not), so a 10 oz bag like the Ghirardelli chips I was using was probably close enough to the right amount, and this is not a part of the recipe where proportions matter quite so much. At this point I had some other things to do, and packed the dough into a small mixing bowl, covered it with plastic wrap, and left it in the fridge for several hours. The recipe doesn’t call for resting like this, but many cookie doughs benefit from the extra time.

Dough after refrigerating several hours
After resting, the dough is even firmer, yet still rather crumbly. I measured the overall yield of this recipe as 1080 g (it might even be closer to 1100 g if you don’t taste-test any of the dough while preparing it), which for the stated yield of 22–24 cookies suggests a portion size of about 50 g. I settled on a #40 disher, which gave me somewhat smaller 45 g portions, rather than the 55–60 g portions I got from a #30 disher.

45-gram dough balls, squashed flat, on cookie sheet
Because the dough had been in the refrigerator and was quite stiff, I squashed the dough balls flat by hand before baking — if you bake the cookies straight off, without chilling, this is probably unnecessary. You can easily fit twelve on a standard cookie sheet, as these cookies don’t spread much. They go into a 350°F (175°C) oven for 12 minutes — mine never got “golden brown” as the recipe calls for, but they were definitely done all the same.

Baked cookies cooling on rack
Fully baked, the cookies don’t look all that different from before baking. They need to cool on the cookie sheet (set on a wire rack) for a few minutes to allow the starches to set; otherwise, they will fall apart when you try to move them. Once set, they can be transferred to a wire rack.

Full batch of cookies continuing to cool
My overall impression (having eaten a few of these by now) is that they are, like most peanut-butter cookies, quite tender, almost shortbread-like in their crumbliness. I would have preferred something a bit more on the moist and chewy side, and with more chocolate flavor (that last defect might be due to the short-dated chocolate chips I used). They’re still not bad, and I’ll be bringing them in to work to ensure that I don’t eat them all.

Nutrition

Nutrition Facts
Serving size: 1 cookie (45 g dough, before cooking)
Servings per recipe: about 24
Amount per serving
Calories 232 (Calories from fat 123)
% Daily Value
Total Fat 14 g 21%
 Saturated Fat 5 g 23%
Trans Fat 0 g
Cholesterol 15 mg 5%
Sodium 109 mg 5%
Total Carbohydrate 27 g 9%
 Dietary fiber 2 g 6%
 Sugars 16 g
Protein 4 g 8%
Vitamin A 2%
Vitamin C 0%
Calcium 1%
Iron 5%

Other people’s recipes: Claire Ptak’s Rye Chocolate Brownies

Here in my home office, in front of the bookcase to the left of my desk, I used to have a very large pile of cookbooks waiting to be scanned for interesting recipes and ultimately shelved in the kitchen bookcase with the other cookbooks. That pile is now down to just four — which means I have a lot of new(ish) cookbooks that I am slowly starting to search when I’m looking to make something. Among them was Claire Ptak’s The Violet Bakery Cookbook (Ten Speed Press, 2015). Ptak is a Californian who now lives in London, where she owns a bakery, and her cookbook is another entry in the growing list of English cookbooks crossing the pond to North America. This has its good points (yay! more recipes with flour measured by weight!) but also some downsides (hmmm, I don’t have a dish with anything like those dimensions — or, as in yesterday’s recipe for korvapuusti, where TF do I find fresh yeast?!). One of the recipes that immediately intrigued me was “Rye Chocolate Brownies” (p. 153); it’s unusual to see rye used in baked goods aside from bread, and rye bread in this country nearly always has caraway in it, which I hate, so I don’t normally even keep rye flour on hand. In the headnote to this recipe, Ptak credits Chad Robertson of San Francisco’s Tartine with the idea of using rye and chocolate together; Violet’s brownies were originally made with spelt flour.

Mise en place
Of course we always start with a mise en place. Clockwise from bottom left: 300 g of Valrhona Caraïbe, chopped into rough chunks for melting; 150 g of unsalted butter; 50 g of cocoa powder (I used Dutch-process after noting that the recipe does not use baking soda for leavening); 200 g of light brown sugar and 200 g of granulated sugar (Ptak calls for “unrefined” sugar but doesn’t say anywhere what she actually means by that — my view is that sugar is only unrefined when it’s still inside the cane); 200 g whole rye flour; pure vanilla extract (1 tbl is used); 1 tsp salt; ½ tsp baking powder; and four eggs (as close as I could come to the 200 g that is called for with the eggs in my fridge — the recipe calls for “medium” eggs, but I know that egg sizing is not the same in Britain and the US).

Chocolate and butter melting in double boiler
The recipe proceeds along familiar lines, if you followed my “Browniefest” series from a couple of years ago; I forget what I called this particular method back then, but it’s a lot like making a genoise, except much denser (and without the careful folding). Numerous brownie recipes follow this same procedure, starting with melting the fats (chocolate and butter) together in a double boiler or microwave. I used the double boiler in this case, just because it’s a bit slower and easier to monitor. The melted fats should be allowed to cool a bit before they are used.

Dry ingredient mixture
All the dry ingredients (rye flour, salt, cocoa powder, baking powder) are just whisked together until well mixed.

Egg foam
The eggs, sugars, and vanilla are whipped together in a stand mixer until the mixture is light in color and has expanded significantly in volume. The melted chocolate-butter mixture is then drizzled in, with the mixer running, followed by the dry ingredients, mixed just until they are combined. I’d actually suggest taking the bowl off the mixer and folding in the dry ingredients by hand — it’s not what Ptak calls for, nor what I did this time, but it makes it much easier to ensure you don’t overmix the batter.

Brownie batter in mixer
The finished batter is quite viscous and sticky. Ptak says to pour it into a prepared, parchment-lined 8×12 baking pan — I suppose 20×30 cm may be a common size in English kitchens, but I don’t have anything like that. The closest I could come is a quarter-sheet pan, which is just about eight inches wide, but enough longer than a foot that I was a bit uncertain whether it would work or not. (Of course, the standard baking pans for brownies and other bar cookies on this side of the pond are 8×8, 9×9, and 9×13 inches — the 9×13 is very close to the volume of two 8×8 pans, so it’s common to halve or double recipes intended for these pans.)
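That substitution rule of thumb is easy to sanity-check: bar-cookie recipes scale roughly with pan area, assuming similar batter depth. A quick check of the numbers, with pan dimensions in inches:

```python
def scale_factor(target, source):
    """Ratio of pan areas: multiply recipe quantities by this when
    moving a bar-cookie recipe between pans of similar depth."""
    return (target[0] * target[1]) / (source[0] * source[1])

print(scale_factor((9, 13), (8, 8)))   # 9x13 vs 8x8: ~1.83, close to doubling
print(scale_factor((9, 13), (8, 12)))  # 9x13 vs the 8x12 called for: ~1.22
```

So the two-8×8s-equal-one-9×13 rule is about 9% off, which is well within what a brownie will forgive.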

Brownie batter in a parchment-lined quarter-sheet pan
As we all know, the thing these days is putting salty and sweet together. After spreading the rather stiff batter onto the parchment (while holding the parchment to keep it from sliding around the pan!), a teaspoon of Maldon sea salt is sprinkled over the top, and the brownies are baked in a 355°F (180°C) oven for 20–25 minutes. I took mine out after 21 minutes, but they probably could have stood the whole 25. (And 5 F° is really excess precision; your typical home oven is unlikely to maintain better than a 25 F° range around the set point, and many are much, much worse.)

Finished pan of brownies cooling on rack
After baking, the brownies must sit in the pan on a wire rack until completely cool. I made sure there was a bit of parchment overhanging one side to ease depanning.

Brownies after portioning
Using the parchment “sling” helps to avoid a multi-flip extraction, which keeps the crinkly surface from being crushed. This recipe — unlike nearly every other brownie recipe I’ve ever tried — actually calls for reasonable (bakery-size) portions, with a specified yield of twelve. That’s vastly easier to achieve than the 18 or 24 brownies many recipes allegedly get from a 9×13 pan, so even before trying one, this recipe rises above my expectations. When passing brownies around at work, however, I found it useful to cut these portions in half, because some people look at a normal bakery serving of brownie and think “I couldn’t possibly eat that much”. (Perhaps that’s how they stay so thin. If that’s the price you have to pay, I’d rather eat brownies, thanks.)

Close-up of single brownie
Seriously. Who can say “no” to that, who is of sound mind and not gluten-intolerant or vegan? These brownies are amazing, and everyone at work loved them, even the salt-hater (after she carefully brushed the Maldon flakes off the top of her serving). This recipe is at least as good as my previous favorite, King Arthur Flour’s whole-wheat double-chocolate brownies, with fewer ingredients and an easier prep.

Single brownie, edge on

Nutrition

Nutrition Facts
Serving size: one 2⅔″ × 3″ rectangle
Servings per recipe: 12
Amount per serving
Calories 461 (Calories from fat 202)
% Daily Value
Total Fat 23 g 35%
 Saturated Fat 14 g 68%
Trans Fat 0 g
Cholesterol 82 mg 27%
Sodium 428 mg 18%
Total Carbohydrate 49 g 20%
 Dietary fiber 8 g 31%
 Sugars 31 g
Protein 7 g 15%
Vitamin A 8%
Vitamin C 0%
Calcium 4%
Iron 7%

Other people’s recipes: Korvapuusti

This gallery contains 22 photos.

Those of you who follow this blog for the recipe walkthroughs are in luck, because I’m finally getting some more new recipes done. This first one was done back in April, after I got home from my trip to Finland, …


A day trip to Turku

This gallery contains 46 photos.

This is nearly the final post about my March–April, 2017, trip to Finland. I should have some Reykjavik pictures to post, if I can find the time and energy to edit them all (and remember what they were), but this …


Interlude: a better way of choosing presidential electors?

In a bitterly contested U.S. presidential election, like the one last year, the perceived unfairness of the Electoral College — the system of indirect democracy we use for electing presidents — often comes up. Every state is entitled to choose a number of electors equal to its combined representation in the House of Representatives and the Senate; this has the effect of giving voters in small states approximately three times the voting power of voters in California. There are, on the other hand, many, many more people in California, so maybe it balances out.

If you actually believe in democracy, you probably think the chief executive ought to be chosen by direct election — preferably using a ranked-choice voting system like STV (Single Transferable Vote). But to enact such a change would require a constitutional amendment, and the small states — those with artificially boosted representation in the Electoral College — have an effective veto on such changes, thanks to the supermajorities required in both houses of Congress and among the 50 state legislatures. So people have looked at alternative ways of choosing electors that wouldn’t require a constitutional amendment. One such is the National Popular Vote Interstate Compact, which — if adopted by states representing at least 270 electoral votes, and assuming no faithless electors — would give the presidency to the winner of the popular vote, by casting a majority of electors for whoever that was. Currently, the NPV compact is still far from its goal of a majority of Electoral College seats — unsurprisingly, the large states have ratified the compact and the small states mostly haven’t. It also highlights the sort of collective-action problem inherent in fixing presidential voting: if the legislature of a member state saw partisan advantage in switching their vote, they could simply do so, by ordinary state law, and leave everyone else in the lurch.

Naturally, the question arises whether it would be possible to have everyone’s votes count while maintaining the unfair advantage of the small states. One way to do this — which would also require collective action, since it doesn’t benefit the large states to enact it if the small states refuse to go along — would be to apportion each state’s electors in accordance with the popular vote in that state. There are ways to do this which would be tolerably democratic, and there are ways to do it which are very undemocratic:

  1. You could randomly assign every voter to an “electoral district”, and give the winner of each district one elector. This only works if it’s truly random, and would be difficult to implement given how elections are implemented in most states (it’s assumed that everyone at the same polling place gets the same ballot).
  2. You could use any of a number of proportional representation systems to assign electors to candidates.
  3. You could do what Maine and Nebraska do already, and have separate electors for each Congressional District plus two at-large electors who, like Senators, represent the state-wide winner.

It should be clear that, so long as gerrymandering is permitted, option 3 is Very Bad: essentially it means that whoever controls the state legislature determines the outcome of the presidential election, but with a veneer of democracy that hides the essential corruptness of the system. Better for the legislature to just decide who the state will be voting for, as in the Old Days. So I’m focusing on option 2.

One of the common ways of apportioning representatives in a system of proportional representation is the “d’Hondt count”. It’s mathematically equivalent to what is known as “Jefferson’s method”, which Thomas Jefferson used to propose the (ultimately enacted) first apportionment of Congress after the 1790 Census. It’s not the system used for Congressional apportionment today (that’s the “method of equal proportions”), but it is popular around the world for legislative elections. I implemented a script that takes as input a CSV file with the state-by-state popular vote in a presidential election and outputs the results of apportioning each state’s electors using this method. With a small modification, it’s possible to subtract out the “small state bonus” (two electors per state) and see whether that actually has an impact on the outcome. I then created data files representing the popular vote from the last five presidential elections (using a variety of sources), to see how things would have turned out if we had done it this way (source and data files on GitHub).
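The core of the script is just the d’Hondt highest-averages loop; everything else is CSV plumbing. Here’s a minimal sketch of that loop — the function and variable names are my own for illustration, not the script’s:

```python
import heapq
from collections import defaultdict

def dhondt(votes, seats):
    """Apportion `seats` electors among candidates by the d'Hondt
    (Jefferson) method: each seat goes to the candidate with the
    largest quotient votes / (seats already won + 1)."""
    won = defaultdict(int)
    heap = [(-v, name) for name, v in votes.items()]  # max-heap on quotient
    heapq.heapify(heap)
    for _ in range(seats):
        _, name = heapq.heappop(heap)
        won[name] += 1
        heapq.heappush(heap, (-votes[name] / (won[name] + 1), name))
    return dict(won)
```

Run it once per state with that state’s elector count (or the count minus two, for the no-bonus variant) and sum the results nationally.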

2000
  Actual outcome: After a long court battle, ending in the Supreme Court, George W. Bush is declared the winner in Florida and therefore wins the presidency. Bush/Cheney 271, Gore/Lieberman 266.
  d’Hondt count: Even assuming the post-Bush v. Gore tally in Florida, no candidate receives a majority; Bush wins the House of Representatives 28–17 with four delegations tied. The Senate being tied 50–50, outgoing vice president Al Gore could have cast the tie-breaking vote for his running-mate and Senate colleague Joe Lieberman. Bush/Cheney 267, Gore/Lieberman 268, Nader/LaDuke 3.
  d’Hondt without bonus (219 to win): No difference in the outcome. Bush/Cheney 217, Gore/Lieberman 216, Nader/LaDuke 3.

2004
  Actual outcome: Bush/Cheney 286, Kerry/Edwards 251.
  d’Hondt count: Bush/Cheney 280, Kerry/Edwards 258.
  d’Hondt without bonus: Bush/Cheney 227, Kerry/Edwards 209.

2008
  Actual outcome: Obama/Biden 365, McCain/Palin 173.
  d’Hondt count: Obama/Biden 289, McCain/Palin 249.
  d’Hondt without bonus: Obama/Biden 236, McCain/Palin 200.

2012
  Actual outcome: Obama/Biden 332, Romney/Ryan 206.
  d’Hondt count: Obama/Biden 274, Romney/Ryan 264.
  d’Hondt without bonus: Obama/Biden 225, Romney/Ryan 211.

2016
  Actual outcome: Trump/Pence 304, Clinton/Kaine 227, Sanders/Warren 1, Kasich/Fiorina 1, Paul/Pence 1, Powell/Cantwell 1, Powell/Collins 1, Powell/Warren 1, Spotted Eagle/LaDuke 1.
  d’Hondt count: No candidate receives a majority, and the presidency is decided by the House of Representatives, 33–16–1, for Trump. Three faithless electors for third-party candidates could give either candidate 270 EV and an outright win. (The Senate votes 52–48 for Vice President Pence.) Clinton/Kaine 267, Trump/Pence 267, Johnson/Weld 2, Stein/Baraka 1, McMullin/Finn 1.
  d’Hondt without bonus: No change in outcome. Clinton/Kaine 218, Trump/Pence 214, Johnson/Weld 2, Stein/Baraka 1, McMullin/Finn 1.

You’ll notice that only in the hotly contested 2000 and 2016 elections would third-party candidates have received electors under this scheme. We can recompute the assignment of electors without third-party candidates, and it turns out that the results are indeed different. In 2016, without the third-party vote, but with the current “small state bonus”, Trump and Pence win a bare majority (270 EV); if the bonus is removed, Clinton and Kaine win a two-EV majority (220 EV to 216 EV). In 2000 with the third-party vote removed, the “no bonus” scenario sends the election to the House, but the current-law scenario gives a bare majority to Gore and Lieberman.
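With a function like the sketch above, the no-third-parties recomputation is just a filter before apportioning; `state_votes` and `state_seats` here are placeholders for the per-state data, not names from my script:

```python
# illustrative: keep only the two major-party tickets, then re-apportion
major = {name: v for name, v in state_votes.items()
         if name in ("Clinton/Kaine", "Trump/Pence")}
result = dhondt(major, state_seats)  # per state, then summed as before
```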

In the title of this piece, I questioned whether this would be a better way to choose electors. Having actually worked out the results in a number of important recent cases, I have to conclude that it would not be a significant improvement over the existing system, and that we are better off demanding a true popular vote (preferably by preference voting). About the only positive thing I can say about the scheme I’m suggesting above is that it would make it much clearer that nearly all of the country is actually some shade of “purple” — run the scripts and you’ll see just how few states give all of their electors to a single candidate when they are allocated proportionally.

I would gladly accept data files from additional presidential elections by GitHub pull request.
