End of the new-car hunt

At the end of last month, I had to replace my eighth-generation Honda Accord V6, which was coming off lease on December 6. I had narrowed the choices down to two hybrids with very similar technology: the brand-new Accord Hybrid and the Ford Fusion Hybrid. Both cars had fairly limited availability, although the Ford had a few years of history on the road, not to mention much better lease rates. The Fusion Hybrid has a horrible selection of interior colors (in keeping with Henry Ford’s famous dictum, they are all black on the inside, cloth or leather, unless you get one that’s black outside and red-and-black inside); the Accord Hybrid has a good set of colors, except that the cars are essentially hand-built and Honda is only releasing a couple dozen of them to dealerships each month, so you’re pretty much stuck with whatever the dealer can get. On the plus side for Honda, it’s a lot easier to go from one car to another if you keep the same lessor, and you can transfer the registration; more significantly for me, Honda service is available at reasonable hours (i.e., on weekend afternoons), whereas Ford dealers close early.

I got to test-drive both the Fusion Hybrid (albeit in SE trim, not the Titanium that I would have wanted) and the Accord Hybrid, although unfortunately not on the same day. I thought the Honda was just slightly smoother and noticeably quieter; the shifts from EV to hybrid to regenerative braking were less perceptible; and the interior fit and finish was just nicer than the Ford’s. The only issues were the terrible leasing deal on the Accord, and the fact that the dealer only had one: it was white (not a color I would ever choose), and it was the “Touring” trim, which has several extra-cost features I would not have paid for. But in the end, the comfort of the Accord Hybrid won out, and that’s what I got, on the same sort of three-year lease I have had on all my cars since 1999. I probably won’t use the navigation system much (I prefer to use the one in my phone, which allows me to look up radio tower locations using a browser) but it does have all the fancy technology features Honda offers (except, inexplicably, HD Radio, which Honda only sells on one trim of the Odyssey minivan). It’s been an interesting experience, looking at everything from serious sports cars to hybrids, diesels, and entry-level German and Japanese luxury nameplates.

The implied interest rate on the Honda lease works out to about 2.8% APR, which is comparable to their regular financing, so I concluded that the Accord Hybrid’s much higher lease payment must be the result of overly pessimistic residual-value assumptions. The lease paperwork seems to bear this out; the residual value on the new car is only $1000 more than the old car’s, even though the new car lists for $5000 more than the old one. I can only understand this as Honda’s way of avoiding getting burned if this new Accord Hybrid turns out as poorly as the old one did and its value at lease-end turns out to be no higher than that of a comparably equipped conventional I-4 Accord sedan. But this car has a California warranty (10 years/100,000 miles) on the electrical components, so after three years, if Honda has done the engineering as well as it appears, there should still be a substantial “hybrid premium” left on the car. If it turns out well, then I’ll probably get back most of those higher payments when I turn in the car. (Closed-end auto leases have a guaranteed purchase price at the end, so you can either buy the car for yourself, or the dealer can buy the car from you and share the profit from selling it if the resale price is higher than the residual value on the lease. Obviously, if the resale price is lower than the residual, then you just turn the car in and the lessor takes the loss. Thus, it is in the interest of the automakers’ financing arms to make the residual value as accurate as possible, but the car companies depend on leasing deals to sell cars, so they often provide various kinds of subsidies to make leases cheaper.) I’m guessing, since the two cars are very similar, that Ford Credit is assuming a much higher residual value than Honda Finance is, and I know from reading Ford’s financial statements that they do have a subsidy mechanism whereby Ford the parent company assumes some of the residual-value risk on leases.
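
For anyone who wants to check that kind of arithmetic themselves: a closed-end lease payment breaks down into a depreciation portion and a finance portion, and the implied APR is roughly 2400 times the money factor. Here is a minimal Python sketch of backing the rate out of the payment; the numbers are hypothetical, for illustration only, not my actual lease terms.

# Back an implied APR out of a closed-end lease payment.
# All numbers below are hypothetical, not the terms of my actual lease.
def implied_apr(net_cap_cost, residual, term_months, monthly_payment):
    """monthly_payment = depreciation portion + finance portion, where
    depreciation = (net_cap_cost - residual) / term_months and
    finance = (net_cap_cost + residual) * money_factor; APR is about money_factor * 2400."""
    depreciation = (net_cap_cost - residual) / term_months
    money_factor = (monthly_payment - depreciation) / (net_cap_cost + residual)
    return money_factor * 2400

# Example: $32,000 capitalized cost, $18,000 residual, 36 months, $450/month.
print(round(implied_apr(32000, 18000, 36, 450), 2))  # prints 2.93, i.e. about 2.9% APR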

I picked up the car on Wednesday evening after work — my salesman, Dennis Young at Boch Honda, stayed late to help complete the process — and I finally got home and ate dinner around 10 PM. Because I picked the car up after the normal close of business, I had to go back today (Sunday) to pick up my new registration and get an inspection sticker (something I couldn’t have done at any of the Ford dealers around here since they aren’t open for service on Sunday afternoon). Since I was transferring the registration, I had seven days’ grace period to get insurance and inspections taken care of, but it only took a day to straighten out the insurance — when I spoke to my agent, they had not yet received the paperwork from the dealer, so I let them photocopy the window sticker — and make the transfer official.

At this point I’ve only had the new car for a few days, so I can’t offer any meaningful thoughts about driving it. Since I had a V6 for so long, it will take me a while to get used to the way the hybrid power train behaves, and in particular the different sound the car makes. It will also take a while to find the weak points of the electronics and fancy safety features — I already don’t like the radio, and the adaptive cruise control requires a bit too much following distance for Massachusetts highways — but I’ll report back after a few months of winter driving with my impressions.


Markets and other human institutions (quotation)

I thought the following quotation from Cosma Shalizi, one of my favorites, seemed appropriate in light of my previous post on Evangelii Gaudium:

There is a fundamental level at which Marx’s nightmare vision is right: capitalism, the market system, whatever you want to call it, is a product of humanity, but each and every one of us confronts it as an autonomous and deeply alien force. Its ends, to the limited and debatable extent that it can even be understood as having them, are simply inhuman. The ideology of the market tells us that we face not something inhuman but superhuman, tells us to embrace our inner zombie cyborg and lose ourselves in the dance. One doesn’t know whether to laugh or cry or run screaming.

But, and this is I think something Marx did not sufficiently appreciate, human beings confront all the structures which emerge from our massed interactions in this way. A bureaucracy, or even a thoroughly democratic polity of which one is a citizen, can feel, can be, just as much of a cold monster as the market. We have no choice but to live among these alien powers which we create, and to try to direct them to human ends. It is beyond us, it is even beyond all of us, to find “a human measure, intelligible to all, chosen by all”, which says how everyone should go. What we can do is try to find the specific ways in which these powers we have conjured up are hurting us, and use them to check each other, or deflect them into better paths.

Cosma Shalizi, “In Soviet Union, Optimization Problem Solves You”, Three-Toed Sloth (2012-05-30)


Some thoughts from a nonbeliever on Evangelii Gaudium

One of the big bits of international news this week was the release by the Roman Catholic Church of Pope Francis’s first major official statement, Evangelii Gaudium (translated into English as “The Joy of the Gospel”). This “apostolic exhortation” set off a bit of a firestorm among economics bloggers, and I follow enough of them to have seen the commentary and been intrigued enough to download the whole thing and spend an hour or so scanning through it. Although I would now describe my orientation as “agnostic humanist” (and if I felt the need for organized religion in my life I’d probably join a Unitarian congregation), I grew up Catholic and went to Catholic schools during the primacy of John Paul II, so I have at least some perspective on this document. I will intentionally avoid cherry-picking actual quotations from Evangelii Gaudium; I certainly could find plenty of Francis’s words that I could agree with, but it seems that the blogosphere is full of people doing exactly that, and there is little value in my doing so as well. It is a long but not particularly difficult read, and if you’re interested, please go to vatican.va and get a copy of the text.

What drew the most obvious attention in my media exposure was the Pope’s attack on the modern idolatry of mammon and markets (cunningly timed to be announced right before Thanksgiving). This can only be seen as a direct rebuke to a group of (primarily American) Catholic intellectuals who have been carrying water for the reactionary right on economics issues, and their public reaction to the exhortation does make one question (somewhat uncharitably, I admit) whether they worship the Christian Trinity of Father, Son, and Holy Spirit, or have replaced it with the Neocon Trinity of Hayek, Mises, and Ronald Reagan. But while Francis may have worded his critique more strongly than his predecessors, the principled opposition of the Church to seeing human beings as mere means to greater profit has a long history. I think one black mark on the service of Pope John Paul II was his willingness to make common cause with the right wing on what we now call “hot-button social issues” — abortion, contraception, and individual self-determination more generally — in service of his opposition to the atheistic materialism of the left, particularly in the Communist bloc (as then was), when perhaps the Church should have attended more closely to its social mission, solidarity with the poor, oppressed, and downtrodden in “free” as well as socialist countries. Francis’s statement, and indeed his actions since assuming the primacy, make one hopeful that he is actually determined to right the Church’s course in this matter.

Towards the end of the document, after all the controversial economics bits and the tutorial on homily preparation, Francis reiterates existing Church policy on abortion and the ordination of women. I do not doubt his sincerity on either point, but I expect that the latter will come in time, if not during Pope Francis’s life, then later in this century. He almost gets there, in his insistence that priests are the mechanism by which the Sacraments are delivered, but are not supposed to be people of power in their own right. There was no mention of priestly celibacy, which is understood differently in the western and eastern branches of Catholicism, and it seemed like the pontiff, in his insistence on an inculturated Church, was signaling some flexibility on that matter: perhaps the “New Evangelization” might involve the development of new Rites of the Church that are more culturally appropriate for Catholic communities in the developing world, and some of them might allow married priests as the Eastern Rite churches do now. But all that is just speculation.

I mentioned abortion above, an issue on which the Pope admits no flexibility. I can have some respect for this position, even though I disagree with it profoundly, because it is a reasonable conclusion to make given the Church’s “human dignity” agenda, with which I am quite sympathetic. (I disagree with the Church’s insistence that human dignity logically entails opposition to abortion and contraception, and I think a different — not necessarily better, but different — way of understanding “dignity” would result in the opposite conclusion.) There is no mention that I saw of capital punishment, another area in which the Church and its right-wing Protestant fellow travelers disagree (and rightly, in my view).

The matter of “inculturation” ties closely with one of Francis’s other pressing concerns, that of decentralization. The difficulty of reorienting such an enormous organization — remember, the Catholic Church is not only the world’s largest religion, but also an enormous business enterprise, one of the world’s largest providers of education and health care — must have made a significant impression on Francis as he began his papacy. It is said that he is serious about making significant reforms of the Church’s central administrative organ, the Roman Curia, and cleaning up the scandal-ridden, money-laundering Vatican Bank, but Evangelii Gaudium does not discuss these matters, except obliquely in his plea for a more decentralized Church and a suggestion that more authority ought to be given to the national and regional councils of bishops, a position which some more knowledgeable commentators say is at odds with the Pope’s two most recent predecessors.

There is also some hope for an end to the “wafer wars”, where otherwise faithful Catholics have been denied Communion — the entire point of the Mass — for having done something that the Church, or at least the local bishops, disagree with. This has been a particular issue in the American church, with bishops ordering that Communion be denied to Catholic lawmakers who do not vote to impose the Church’s moral teaching on the people of their counties, states, and country. But the Pope combines his teaching on this issue with his insistence elsewhere that faithful Catholics, and not just the clergy, must make their moral views heard in the political process. I hope, in a democratic society, that we can come to some accommodation that respects the right of all people, not just members of a particular religion, to have their moral views heard and considered, but not taken as determinative for all of society.

I’ve already mentioned the reactions — both laudatory and angry — from economists. Radio bloviator Rush Limbaugh ripped into the Pope on Wednesday, probably egged on by those right-wing fundamentalist economists. To get a broader range of opinion, I spent some time searching news and blog sites for commentary from other people, and especially people of other religions. Unfortunately, I was not able to find very much. There was a statement from a Jewish organization (I don’t even recall which one or in which country) thanking the Pope for reiterating the Church’s commitment to dialogue and openness with Jews, but I also saw numerous commentators claiming that the Pope’s statement that the Old Covenant was still valid and thus Jews were worthy of the Church’s respect, even as it continues to try to convert them, was a MAJOR DOCTRINAL CHANGE. (I’ll admit that this was not something we covered in religion classes in high school, so I have no conception of where the Catholic Church officially falls on this spectrum. Christian groups disagree: some claim that the New Covenant was an add-on to the Old and both apply to all the faithful; some say, as Francis appears to have done, that the Old Covenant still applies to Jews but not to Christians; and some, including the critical commentators I mentioned, say that the Old Covenant was entirely superseded and Jews, having “rejected” Jesus, are categorically excluded from the People of God until they convert.) There was also some commentary about Francis’s statements about Islam, but not any from actual Muslims (at least not that I was patient enough to dig out of the fifteenth page of Google results), most of it profoundly negative and ill-informed (“OMG THE POPE SAYS ISLAM IS A RELIGION OF PEACE DOESN’T HE KNOW THEY ARE ALL RUTHLESS KILLERS AND EAT KITTENS FOR BREAKFAST?”).

A few of the Protestant commentators seemed to be at least welcoming of the statement, noting that the Pope’s emphasis on making “joyful evangelization” a fundamental and daily part of the Church’s practice reflects what their churches do all the time. However, there were some typically sour notes as well, including one memorable blog post that I saw which could best be summarized as “BREAKING: BISHOP OF ROME STILL ANTICHRIST, MORE DETAILS AT ELEVEN”, as well as the somewhat less histrionic “Well, he still believes in justification by works, so nothing he says is to be trusted by faithful Christians.” (As an aside, the doctrine of sola fide seems to me to account for a great deal of the harm done by American Protestant churches in the name of Christianity. How is it that European Protestant churches, which share the doctrine, don’t seem to give rise to the same sort of social harm?) A similar blog post took a more condescending tone, addressing the Pope directly and instructing him in the Bible study that would correct his (non-Protestant and therefore erroneous) ways.

One other major note of criticism that I’ve heard has been from issue-advocacy groups that are primarily concerned with clerical sex abuse. It is somewhat disappointing that, having already included a long tutorial on homily preparation, Francis did not also choose to offer instruction to his fellow bishops on this matter, seeing as how the numerous scandals around the world have been one of the major causes driving the faithful away from the Church, and the Pope presumably wants to reverse that. (Of course, it’s not just the sex-abuse scandals but also garden-variety hypocrisy that turns many people off. Others, like me, simply no longer find it plausible to believe what the Church claims with certainty to be true, which is not something that this document is going to reverse, no matter how joyfully Catholics from the Pope on down commit to evangelize.)

A few notes about the document itself. According to Vatican officials, it was written by the Pope himself, in his native language, Spanish. When it was first released, several translations into living languages were made available, but notably, there was no official Latin translation. A few commentators spouted the usual nonsense about how Latin was so much more precise and it would have been better if the official Latin text was released first; other more serious commentators looked for possible translation errors by comparing the various official translations with the original Spanish. One particularly odd note from both mainstream press reporting and bloggers was that nobody seemed to agree on how long the document was. Some said 40,000 words, and others said 50,000; some said 66 pages and others said 88 pages. I can’t speak to the word count (which does not seem to me to be particularly salient anyway), but I downloaded the official English-language text directly from the Vatican (they don’t appear to be using a CDN) and it is unquestionably a 224-page PDF. Did those “88 page” commentators look at the HTML version and try to print it out from their browsers? The Vatican translators have used American spelling but British punctuation conventions throughout the English-language version of the document. I’d be curious if any Spanish-language dialect expert can detect signs of the Pope’s home dialect in the (presumably edited) published Spanish text.

All in all, I like what this document has to say, at least in the parts I have read, and even where I disagree with it, it’s hard to find fault with Pope Francis’s apparent sincerity. There’s no chance that the Church could get me back — that’s water long since passed under the bridge — but if the Pope succeeds in changing the orientation of the Church in the way he has set out to do, it will still be an enormous improvement over the institution that his predecessors left for him.


On misrepresenting IQ

There’s been a lot of talk in my Twitter timeline about Boris Johnson and IQ lately. Apparently the Mayor of London thinks that some people really are more worthy than others. This controversy spilled over into the comments of a Language Log post, and commenter Ran Ari-Gur said the following (which I can’t verify):

So it’s not that IQ is an expression of some normally-distributed variable, with just the mean and standard deviation being arbitrarily assigned certain values (100 and 15); rather, it’s that IQ is an expression of some variable of unknown distribution, with all percentiles being arbitrarily assigned values according to what they would be if the variable were normally distributed with μ = 100 and σ = 15.

Think about that for a moment. Lots of people (including, most notably, Herrnstein and Murray in the title of their book) either claim or implicitly assume not only that Spearman’s g is something that is actually real (Gould’s “fallacy of reification”), but that it’s actually a normally distributed random variable. Is it? What evidence do we have for that? Evidence that’s actually independent of these IQ tests that are normally distributed by construction? I haven’t studied the literature, so I honestly don’t know the answer to this.
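
To make the construction concrete, here’s a minimal Python sketch of what Ran Ari-Gur describes, as I understand it: take each raw score’s percentile rank in the norming sample and assign it whatever value a normal distribution with μ = 100 and σ = 15 would put at that percentile. Nothing in this procedure requires the underlying trait to be normally distributed; the normal shape of IQ comes out because we put it in.

from statistics import NormalDist

def iq_scores(raw_scores):
    """Assign 'IQ' values by percentile rank, forcing a N(100, 15) shape by construction."""
    norm = NormalDist(mu=100, sigma=15)
    ranked = sorted(raw_scores)
    n = len(raw_scores)
    # rank-based percentile for each score (ties share a rank), kept strictly inside (0, 1)
    return [round(norm.inv_cdf((ranked.index(s) + 0.5) / n), 1) for s in raw_scores]

# A deliberately skewed toy "norming sample" of raw test scores:
print(iq_scores([3, 5, 5, 6, 7, 9, 14, 30, 55]))
# The results are spread around 100 like a normal curve, whatever the raw distribution looked like.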


Thanksgiving menu: results

No, I’m not going to post photos of everything we ate at Thanksgiving dinner, since I didn’t take any. But I do want to report on how things turned out (see previous post: “Menu planning for Thanksgiving“).

First off, it is immensely easier cooking dinner for three people, even (or maybe especially) when it’s a huge and complex meal with multiple side dishes and tight scheduling restrictions to get everything done at the same time. For one thing, there are no guests who can’t (within the bounds of politeness) be shooed out of the kitchen. Even the dog was well-behaved! We made the same amount of food for three as I would have made for six (with perhaps the addition of one more side dish for guests who don’t like spinach even with a cup of cream and half a stick of butter).

The process started a week and a half ago, when I made turkey stock. I didn’t follow any particular recipe, but having done it before I had a pretty good idea how to do it: roast turkey wings (would have loved to get some turkey backs and necks but those are never available before Thanksgiving when you actually need them!) and then simmer in water with aromatic vegetables (carrots, celery, and onion), herbs, and spices for three hours or so. This ended up making about two quarts of stock, which contained so much gelatin that it completely solidified when cooled. I stuck it in the freezer for safety, and took it out on Tuesday so that it would be thawed by the time I needed it. (It wasn’t, and took about 15 minutes in the microwave at 50% power to melt enough of the ice to get it back to the proper consistency.) The turkey wings I ended up buying frozen from Mayflower Poultry, which was a bit of a disappointment — I could have gotten the same thing from Stop & Shop.

Also on Tuesday, I made the Cook’s Illustrated cranberry chutney with pear, lemon, and rosemary. This recipe makes an enormous quantity of chutney, far more than six people, never mind three, could eat. (I think I coded it in my calorie-counting app as 18 servings, and even that may be an overestimate.) Usually my mother makes the cranberry sauce, but I was inspired by seeing a TV show last week (don’t recall whether it was ATK or Cook’s Country TV) with a related ginger chutney recipe.

Stepping back a few days, on Saturday, I made a trip into town to stop by MF Dulock for some sausage and salt pork. They didn’t have any salt pork, but they did have a lovely-looking garlic and thyme sausage, which the staff was ready to strip from its casing for me. The sausage goes into the stuffing (ATK “Sausage and Fennel Stuffing”), but in the meantime I stuck it in the freezer. I went on to Savenor’s in Cambridge, still looking for salt pork, and was rather disappointed that the guy working the butcher counter didn’t even know what salt pork was. Eventually I managed to find some house-made salt pork at Formaggio Kitchen (where I also picked up cheese, salami, and some sweets that I haven’t finished yet). I also prepared a timeline, showing exactly when everything needed to be going in or out of the oven to have the turkey and sides all ready for serving at 5:00.

I picked up my turkey at Whole Foods on Sunday, put it in a keep-cold bag, and drove directly to my parents’ house to stick it in their refrigerator. I also bought all the other ingredients I expected to need, with the exception of butter (folks always have plenty and my mother always buys unsalted) and marjoram (which Whole Foods didn’t have). Of course, I managed to forget to buy cheesecloth (turned out I didn’t need more) and cranberry juice (for home use) and had to go back after work on Monday for those. I had ordered a fresh free-range organic turkey from Whole Foods, which was shipped in from Pennsylvania; it turned out that I could have had one from Vermont right out of the case at the store, and there was also a local turkey farm I could have gone to right here in Framingham, both of which might have been better choices (from the standpoint of greenhouse-gas emissions if nothing else). The bird was about 13 1/2 pounds. I left my mother with instructions to dry-brine the turkey on Tuesday night, so I wouldn’t have to fight getaway-evening traffic to do it myself.

On Wednesday, I sliced the salt pork for the turkey (“Old-Fashioned Stuffed Turkey“, Cook’s Illustrated 11/2009) while it was still partially frozen. However, I noticed that the recipe recommended using salt pork that was about 50% lean, which clearly must be made with pork belly rather than fatback like the salt pork that I bought from Formaggio. It doesn’t seem to have made much of a difference, other than having more fat to separate off after deglazing the pan. One other issue with using Formaggio’s salt pork was that it still had the skin on it, which had to be removed in order to slice it.

Thanksgiving morning did not start out so well. I was late getting up, forgot to take my medication, and then got stuck in a traffic tie-up behind a three-car accident on Route 128 heading down to my parents’. My original timeline required me to be there at 11:30, and I didn’t arrive until about 11:45, so I pushed the whole thing back an hour. On the plus side, I was able to enlist my mother as sous-chef, which made the preparation vastly easier, since I didn’t have to do any of the vegetable and herb chopping myself. (Their kitchen is easily big enough for two people to work without bumping into each other. I couldn’t even dream of doing this in my condo’s tiny galley kitchen.) I ended up putting the turkey into the oven about an hour and a quarter late, but without the pressure of guests needing to drive back home, this did not stress me out unduly.

One issue that I never did manage to resolve was the disagreement between the supposedly complementary turkey and gravy recipes. The turkey recipe says to take the drippings out at the end of the low-temperature stage of cooking, whereas the gravy recipe says to leave them in until the end (since we’re deglazing the roasting pan). I stuck with the process described in the gravy recipe, which starts by putting chopped aromatics in the bottom of the roasting pan. By the end of the high-heat phase of roasting, these vegetables are nearly black, and add a wonderful deep brown color to the resulting gravy. I also followed the full broth-making protocol for the stuffing, even though I already had a proper turkey stock, and used butter rather than schmaltz for the roux; these also contributed to a better, deeper gravy than I had previously managed. However, the broth lost a lot of water in the cooking process, so I had to add an additional three cups of turkey stock (after the four that had been used in the broth originally) to make up the expected volume of broth for the gravy.

We made one major change to the stuffing recipe, which was adding about one additional cup of my turkey stock (beyond the amount called for in the recipe), so that the completed stuffing would have more of a bread-pudding texture to it rather than the distinct cubes of bread that we had experienced with this recipe before. After all this, I was still left with about a cup of turkey stock, so I never needed to break out the emergency-backup chicken broth.

I delayed starting the spinach until much later than I should have, so everything else ended up waiting on the creamed spinach (The Silver Spoon‘s “Spinaci alla crema“), but the end result was excellent. It is still a shock to watch more than two pounds of spinach cook down into just four servings, after being drained and squeezed of excess liquid. (And no, the spinach wasn’t boiled: it was cooked “in just the water clinging to the leaves after rinsing”. Spinach is mostly water, and as soon as you get it hot, its cell walls burst, releasing that water into the pan. We used a tall stockpot, and added more spinach in at the top as it cooked down, until all two and a quarter pounds had cooked down to a layer a few inches thick.) I should figure out a way to cut this recipe down so I can do it myself. (I should also try the other two recipes I found, which I had rejected because they contained onions, although this turned out to have been unnecessary.)

Everything turned out excellent, but there was so much food that no one went back for seconds. For leftovers, I took home about a pound of turkey breast, and left my parents with one whole breast, both wings, one leg quarter, and about a pound of miscellaneous trimmings. It’s important to use a proper boning knife for cutting up a turkey (or other poultry); my parents used to use a huge slicer, which is just about the worst possible knife for the job. The boning knife is just the tool for removing the wings and legs intact, and taking each breast off the carcass whole, so that it can then be sliced with an electric slicer. (This is just about the only reason for anyone to own an electric knife — otherwise they would be pretty much useless.) Cook’s Country‘s garlic mashed potato recipe makes an enormous amount — it starts with four pounds of potatoes — so there is always several meals’ worth left over. We also had two thirds of the stuffing left, most of the cranberry chutney (none of us had more than a quarter of a cup), and about a serving and a half of the creamed spinach.

One final note: the yeasted cornbread was very interesting — it tasted almost like a cross between a corn muffin and a dinner roll — but the recipe made far too much: we had three pieces left over in the basket on the table, plus half of the 13×9 pan. It sure looks like this recipe could be cut in half easily; one suspects the author of being afraid of requiring only half a packet of yeast. (Of course, my yeast doesn’t come in “packets” so I’m not remotely concerned about this.) It might have been nice to add some additional flavor to the bread, perhaps some seasonal herbs, and to my northern taste, it could have used a bit of sweetening as well.


Playlist serendipity

On Twitter just now, I noted “Sometimes it feels like an inspired segue, but I know it’s really only the output of a pseudorandom number generator.” Here’s what I was hearing:

  1. Patty Larkin, “Hotel Monte Vista” (Regrooving the Dream)
  2. Bruce Springsteen, “Radio Nowhere” (Magic)
  3. Joan Osborne, “Running Out of Time” (Righteous Love)
  4. The Alarm, “Sold Me Down the River” (The Best of The Alarm)
  5. Indigo Girls, “Virginia Woolf” (1200 Curfews)
  6. [skipped a John Hiatt song that wasn’t compatible with the mood]
  7. Grey Eye Glances, “Days to Dust” (Eventide)

Now time to head downstairs for some quality time on the stationary bike…


Menu planning for Thanksgiving

[Photo: T-day cookbooks]
Two Thursdays from now is probably the last Thanksgiving in a while that I will drive down to my parents’ house and take the lead in preparing the big family meal. Once they move away, I’ll either be flying out just for the holiday, or we will all travel to some other location, so I won’t be in a position to make Thanksgiving dinner again. (Any attractive, athletic younger people who have a thing for homely, overweight older guys are totally welcome to help remedy that!) I’m still working out the menu plan, and when I posted a photo of a big pile of cookbooks on Twitter a few minutes ago, it immediately occurred to me that I should have made a blog post instead. So here is where I’m at:

I make a standard turkey stock from frozen turkey wings a few weeks ahead of time, which is used in the gravy and the stuffing. The bird itself will continue to be the Test Kitchen’s dry-salted and barded bird (“Old Fashioned Stuffed Turkey”, Cook’s Illustrated, 11/2009) with the Web-only companion recipes for sausage and fennel stuffing and turkey gravy. This recipe gives a juicy, well-seasoned bird with a crisp, golden skin, but does require some advance preparation. Photos to come on the day appointed.

Usually my mother will take care of vegetables, dinner rolls, and pie. I generally don’t like the traditional Thanksgiving vegetables (carrots, parsnips, Brussels sprouts) so we’ll do something like spinach, which I actually do like. I’m looking at three different creamed spinach recipes (normally I wouldn’t do creamed spinach, but it’s better for sharing than my usual sauteed spinach with onions and peppers, and fits with the traditional richness of the day):

  • “Creamed spinach with nutmeg and parmesan”, from Christopher Kimball’s Yellow Farmhouse Cookbook, p. 189. (3 lb spinach, 1 pt cream, onion)
  • “Spinaci alla crema”, from The Silver Spoon, p. 568. (2 1/4 lb spinach, 1 cup cream)
  • Alice Waters’ creamed spinach, from her The Art of Simple Food, p. 312 (1 lb spinach, 1/3 cup cream, onion)

All of the recipes are made with nutmeg and parmesan, so I’m not sure why Kimball feels the need to call it out. Kimball’s notes claim that other recipes are not rich enough (!) and leave a bitter taste, hence the 3:1 ratio of spinach to cream (the others are 4.5:1 and 6:1, respectively). I’m not sure I believe this, but it might be worth a try. The principal open question for me is whether I’ll be able to use one of the recipes with onion in it; some family members are anti-onion, but I don’t know if they’ll be having dinner with us or would eat creamed spinach anyway. I’m pretty sure one pound of spinach is not enough (I can eat half of that just on my own!) so if I made Waters’ recipe I’d probably have to double it.

The other thing I’d like to do is to dispense with the dinner rolls and have cornbread instead. I’ve found numerous possibilities, and I’m not sure which will be best:

  • The Test Kitchen has both Northern- and Southern-style cornbread recipes (The New Best Recipe, pp. 693–695), which are fairly traditional.
  • King Arthur Flour’s Whole-Grain Baking has a recipe (p. 556) that uses buttermilk and honey, and also includes some odd flour. (Whole-wheat pastry flour? Whole corn meal? Where am I going to get those without mail-ordering them from the KAF store?)
  • Greg Patent’s Baking in America has a very old-fashioned (as in, before chemical leaveners) recipe (p. 118) for a yeasted cornbread that sounds interesting, although it makes a much larger quantity than any of the other recipes (which are all done in a 9×9 square pan or a 9-inch cast-iron skillet).
  • Alton Brown’s I’m Just Here for More Food has a skillet cornbread recipe (p. 118) which, oddly enough, uses all-purpose flour, unlike traditional Southern cornbread which has no wheat flour. Brown’s Good Eats cornbread recipe uses creamed corn, which makes it unsuitable for this time of year (since corn has been out of season for a couple of months now).

Any comments from people who have tried these recipes are welcome. Hopefully other people can benefit from the experience, and I’ll update once I’ve made a final decision.

UPDATE (2013-11-18): We’re going to do the onion-free Italian version of creamed spinach, and I’m probably going to do the leavened cornbread (although, having done the research, I’m feeling mildly inclined to try some of the others at home now).


LISA’13 recap

The first week in November I went to the 2013 Large Installation System Administration conference, one of the Usenix Association‘s two annual flagship conferences (the other being the summer Annual Technical Conference, which is part of what they call “Federated Conferences Week”).

I arrived in Washington on Saturday, November 2, and discovered that DCA’s Terminal A, where JetBlue is located, is a very long walk from Terminal B, where the MetroRail station is located. Like the MBTA, WMATA surcharges customers for every trip made with non-RFID fare media, so I bought a SmarTrip card, figuring I’d be making multiple MetroRail trips during the week. It took about an hour and a half to get from B6 baggage claim to the Marriott Wardman Park Hotel, the overpriced conference hotel, where I checked in and picked up my conference materials. The conference rate at the hotel was $259/night plus tax; I checked our business travel Web site and found no better deal in the neighborhood. (There is only one other hotel nearby, an Omni, and there’s a lot of benefit to being in or at least very near to the conference hotel.)

On Sunday I attended the configuration management workshop, along with about 30 other people, including (it seemed) most of the bcfg2 community. The consensus evaluation was that these workshops, which have been held annually for the past several years, always cover the same topics and never really produce anything, and that this year’s was more of the same. CM is still not a solved problem, and a lot of the issues are a result of there being too many different models and not enough standardization. (Microsoft’s Desired State Configuration was mentioned as one major effort in this direction, for Windows systems only, of course.) Orchestration was a major topic, and there were a few new systems mentioned, including Ansible and Salt. The organizers canvassed attendees prior to the conference for questions to discuss, and some of the questions I sent in were brought up. I felt that the scribe for the “small group” segment of the workshop had missed the point of nearly everything I said. Sunday evening I had dinner with two people from the workshop, a woman from CMU central IT and (I think) a guy from Alberta. (Apologies in advance for not remembering anyone’s names!)

On Monday and Tuesday I had no conference-related activities planned, and did radio stuff instead. Scott drove down from Rochester and joined me for those two days, and returned home on Wednesday after covering the oral argument in a home-town Supreme Court case (Town of Greece v. Galloway). On Monday, we visited the WFED transmitter site in Wheaton, and the WBQH transmitter site in Silver Spring, and then had lunch at a Proper Jewish Deli (Max’s in Wheaton) before heading back into town to see the new NPR network studios on North Capitol St. and the new WAMU studios on Connecticut Ave. in the Van Ness neighborhood. (The NPR studios are the home of Morning Edition, All Things Considered, the NPR hourly newscasts, Tell Me More, and formerly Talk of the Nation. The WAMU studios are the home of the local NPR news/talk station and the Diane Rehm show; they also run an all-bluegrass service heard on translators in Maryland and NoVa.) On Monday evening we had dinner with a former colleague of Scott’s who is a producer for Morning Edition.

Tuesday morning started in Friendship Heights at the WMAL/WRQX (Cumulus) studios on Jenifer St. From there we went over to the WJLA/WRQX/WASH tower to see WRQX’s transmitter, and then up into a very ritzy part of Maryland just outside the Beltway where the WMAL (630) transmitter site is located. (We didn’t see the WMAL-FM (105.9) transmitter site as that station, which is licensed to Woodbridge, Virginia, transmits from the other side of the Potomac.) WMAL has the best AM signal in the market, but that doesn’t mean much these days, particularly in sprawling Washington, which has no full-market signals at all, AM or FM. We then made our way over to the studios of Hubbard’s WTOP-FM and WFED, where we shared some laughs at the expense of CBS Radio’s hapless WNEW-FM (an attempted competitor to WTOP which regularly shows up in the Washington ratings behind Fredericksburg and Baltimore stations). Of course, the Hubbard folks can afford to laugh when they own the #1 station in the market, which also happens to be the top-billing station in the entire country. (WFED doesn’t do much in the ratings, but its very weird format is designed to appeal to managers at federal contractors, a particularly lucrative, if only-in-Washington, market.) Tuesday evening we had dinner with Trip Ericson and his girlfriend. Pictures for all of the broadcast stuff to come some day, I hope, on The Archives @ BostonRadio.org.

The conference proper started on Wednesday, with a keynote address by Jason Hoffman of Joyent. He talked about the convergence of storage and compute power, and also at some length about the process of building a platform (SmartOS, an OpenSolaris/Illumos derivative) and a company based on that platform at the same time. (Joyent is primarily a platform-as-a-service or “cloud” provider, competing with EC2 on performance and storage services.) I then went to an Invited Talk by Mandi Walls of Opscode (the company behind Chef) entitled “Our Jobs Are Evolving: Can We Keep Up?”, which was mostly about how sysadmins can provide “strategic value” to companies, and how to make that case to corporate management. (She was also a big believer in changing jobs frequently, as I recall, and suggested that everyone should go to a job interview at least once a year. At least I think that was her.)

I spent the lunch break in the vendor exhibition; I talked with the guy from Teradactyl who isn’t Kris Webb. He was plugging his new storage system and gave me a white paper on it that he had done for another client. I also stopped by the Qualstar booth, where they were demonstrating a modular LTO-5/6 tape library that can hold more than 100 tapes in just a quarter of a rack. (It’s “modular” in that you can stack them vertically; there’s an elevator mechanism that allows tapes to move between modules. Each module can hold four drives and two import/export modules.) This prompted a visit and a follow-up email from the annoying sales guy from Cambridge Computer.

After lunch, I went to the paper and report session to see some guys from Brown University talk about PyModules, a hack similar to our .software kluge, which is designed for HPC-type uses where different researchers may require multiple versions of multiple software packages optimized for multiple CPU types (paper here). I left the session after that paper and went to see an invited talk by Ariel Tseitlin of Netflix, about how they inject random (synthetic) failures into their production network to validate their fault-tolerance and fault-recovery designs. This can range from shutting down individual VMs to simulating the partitioning of an entire EC2 availability zone. They call this system the “Simian Army”, after the first and best-known of its mechanisms, “Chaos Monkey”. After this, I attended the Lightning Talks session, in which people are invited to talk for five minutes or less on any relevant topic. (This used to be the “WIP” — “Work in Progress” — session.) I had hoped to talk about building storage servers, but the session ended before I could volunteer. In doing this, I skipped seeing a talk about NSA surveillance by Bruce Schneier, which turned out to be no great loss as he wasn’t physically present and gave the talk by phone. (I had also already seen Cindy Cohn’s talk for Big Data a few weeks previously.)

On Wednesday evening, I went to the LOPSA membership BoF, which was not very interesting or enlightening. LOPSA wants more members so it can arrange for better member benefits, but there’s still a huge marketing gap between what LOPSA is and what would induce the target audience to join. One useful bit of information: the registration fee at LOPSA-East (a smaller-scale and cheaper event, held in the spring in New Jersey, that some people in our group might want to attend) includes a free LOPSA membership. Many of the tutorials on the LISA program are also given, by the same instructors, at LOPSA-East. When I got back to my room, I found that my home network had gone down.

Thursday morning began with a change in schedule, as the originally scheduled presenter for the morning plenary session had an emergency and was unable to attend. In her place, Brendan Gregg of Joyent did a presentation about Flame Graphs, a new visualization for aggregating counted stack traces, which had originally been scheduled for a shorter session later in the day. I then went to a session in the papers track. The first presentation had the guys from Argonne talking about the challenges of managing a private cloud, particularly when it’s necessary to shed load or shut down users’ VMs for hardware maintenance. They described a system called Poncho which allows users to make SLA-type annotations to their VMs, including a notification URL that administrators can use to inform users of an impending shutdown, so that they (the clients) can checkpoint their state or remove a system from job dispatching (paper here). The second paper presented experiences diagnosing performance problems on a very-large-scale (11,520-disk) GPFS cluster, which has since been shut down. The third paper, which won the best-paper award, talked about block-level filesystem synchronization, which might have been interesting were it not for the fact that it required (Linux-only, natch) kernel modifications. ZFS can do the same thing with snapshots; their selling point was that they didn’t need snapshots. It’s not clear to me how resilient such a system would be in the face of incomplete transfers, and I haven’t read the paper, but they did have some nice performance graphs. I spent Thursday lunch in the exhibit hall once more, and talked to a few other vendors, though nothing came of it.

For the first session after lunch, I saw two Invited Talks. The first was by David Thaw, a lawyer and law professor with a CS degree, who talked about an ongoing study examining the relative efficacy of two different strands of security regulations. One type, exemplified by Gramm-Leach-Bliley, HIPAA, and Sarbanes-Oxley, requires regulated entities to come up with a reasoned information security policy, disclose it to relevant market participants, and periodically revise this policy as new threats and new best practices become known. The other type, exemplified by numerous state data-breach laws like 201 CMR 17.00 in Massachusetts, functions effectively as a command: “Encrypt this data”. Thaw’s analysis of reported data breaches shows that entities which were subject to the self-regulatory kind of law had a lower rate of incidents than those that were only subject to directive-type legislation. Furthermore, the directive-type laws are generally written by lawyers who know nothing about technology and either specify too much (such as requiring a specific cipher, mode, and key length), or more often, specify too little, thereby offering little actual security. (For example, some state data-breach laws would be satisfied if the sensitive data were encrypted with 56-bit DES in ECB mode, which is known to be well within the reach of an attacker today.)

The second talk was by Sandra Bittner, who manages information security at the Palo Verde nuclear plant in Arizona, the largest nuclear power facility in the country and the only one not located near a source of water. She discussed the challenges of security in the SCADA realm, for a heavily regulated facility where changes can only be made every 18 months, and where it takes a minimum of five years for any change in control systems to be fully deployed across all three units of the station. I skipped out during the question period and spent some time talking to David Parter (UW-Madison) about the challenges of IT management, and finding competent people to do this job, in the campus environment. He spoke very strongly in favor of a model where the CIO is a dean-level position, primarily engaged in long-term planning, and a deputy CIO is responsible for day-to-day management of the university’s IT organizations.

It was about this time that the idea began to crystallize for me that our operation today resembles HPC, rather than corporate IT, far more than it did fifteen years ago when I started attending LISA. This is due in part to changes in our operation (e.g., OpenStack, large high-performance file servers, BYOD for students, etc.) but much more so to changes in the corporate IT environment: essentially no new company will ever build its own infrastructure again, unless there are regulatory requirements forcing it to keep critical business data and applications in-house. This idea would come up a few more times throughout the remainder of the conference.

My last slot on Thursday was again split between two sessions; first, a report by Marc Merlin of Google about the horrifying way they switched their OS platform from very old Red Hat to more recent Debian. (Summary: Linux is just an elaborate bootloader for Google, and they use an in-house analogue to rsync for copying system images around. They first minimized the size of the Red Hat image they were using, and then slowly replaced the old Red Hat packages with new Debian packages — which they converted to RPM and then installed, one by one, in their golden OS image — until the system was all Debian. Then they got rid of the RPM infrastructure to force their internal developers to build Debian packages instead of RPMs. An interesting approach that, like most Google “solutions”, is practical only for Google.) The second talk was by Matt Provost of Weta Digital, Peter Jackson’s CGI firm in New Zealand. The talk, entitled “Drifting into Fragility”, sounded interesting from the abstract, but the speaker had an unfortunate tendency to drone, and I was unable to keep my attention on the content of the talk.

Thursday dinner is always the conference reception, and this year’s (held in the hotel exhibit hall where the vendor exhibition had been just a few hours previously) was lamer than usual. (The food, from the hotel’s catering department, is pretty much always awful; the “nice” things are invariably ones I absolutely will not eat, and the rest consists of salad and fatty carbohydrates, like pizza and fries, that I should not have eaten but did anyway.) I did get to spend some time chatting with Brendan Gregg about DTrace and performance issues. (And throughout the whole conference, I never managed to speak with David Blank-Edelman at all, although I did deliver greetings from everyone to Tom Limoncelli, and we chatted a bit about the culture shift in moving from Google, where he was responsible for engineering one small component of a single service, to Stack Exchange, where his responsibility covers an entire software stack.)

Thursday night was dominated by the Google vendor BoF, which was held in a much-too-small room partitioned off from the exhibit hall, a few hours after dinner was over. I ran in to get my free ice cream and ran back out, having no desire to spend any more time packed like a sardine in a very loud room full of Google recruiters. I was not the only person to do so. (I and a number of other people complained about the over-abundance of vendor BoFs. The unfortunate reality, however, is that these events — which suck the life out of the BoF track — pay for about half the cost of the conference, so it’s very difficult for Usenix to cut back on them without substantially increasing the registration fee.) I did attend the very late “small infrastructures” BoF, which was run by Matt Simmons from Northeastern, where we again discussed the idea that small-scale data centers are “legacy infrastructures”, and those of us who run our own are thus “legacy sysadmins”, and should expect very poor job prospects going forward unless we learn to do something else, or else resign ourselves to being part of a low-value “platform provider”. There was some general discussion of this proposition, and no universal agreement. One guy at the BoF was an admin for a small infrastructure, as part of a very small team, for the largest online ad network in Sweden.

Friday morning, I went to Dan Kaminsky’s talk about the future of security, or the security of the future, or some such. One interesting viewpoint shared by both Kaminsky and the Google speaker at the closing plenary was the notion that virtual machines are not necessarily a particularly good idea. Real processors are Hardware, and Hardware comes with software-design requirements of the form “don’t do this, or something undesirable will happen”. Kaminsky’s view is that processor vendors are in the performance business, not the security business, and given the existence of Intel errata like the one allowing unprivileged processes to reprogram the processor microcode, it seems unlikely that any currently existing security container, whether hypervisor- or OS-based, is safe from malicious programs escaping. Kaminsky has much more confidence, oddly enough, in asm.js, and notes that all of the JavaScript security issues in browsers have been the result of native-code interfaces and not the core JavaScript functionality. asm.js simplifies this even further, and can be effectively optimized to within 50% of native-code execution speed.

I skipped the next session and went down to the National Building Museum, one of the DC museums I had never seen. It’s located in the old Pension Building, on Judiciary Square, which was built after the Civil War to house the Pension Bureau, which at the time accounted for fully a quarter of the entire federal budget. I took a guided tour concentrating on the history of the building itself, ate lunch at the cafe, and then saw about half of a special exhibit on the architectural history of Los Angeles. (There were about half a dozen special exhibits running at the time, so I could easily have spent all day, but I needed to rush back to the hotel for the afternoon sessions.) Mac-heads may be interested in two Invited Talks I skipped, “Managing Macs at Google Scale” and “OS X Hardening: Securing a Large Global Mac Fleet” (also at Google); the video and slides for these should be available in a few weeks, and I expect that both will be discussed at the BBLISA LISA-in-review meeting on Wednesday evening.

In the first afternoon session, I saw an Invited Talk by Zane Lackey at Etsy about what they learned while rolling out “HTTPS Everywhere” for their site. They did it in four stages: first for internal clients, then for external sellers, third for external buyers, and finally by disabling the unencrypted fallback option. Their original architecture was the common “SSL terminated at the load-balancer” (rolling eyes here as I’ve heard this story before), which resulted in a severe capacity restriction due to session-count limits on their load-balancer licenses; in order to meet user expectations, they had to terminate SSL on the actual servers, and then scale up their server pools significantly to account for the increased resource requirements of encryption. This session was followed by Jennifer Davis from Yahoo on “Building Large-Scale Services”, but I have no recollection of what she said.

The closing plenary address was given by Todd Underwood of Google. He made the provocative claim that system administration as a profession needs to be eliminated completely; operations is a job for automation and robots, and humans need to refocus their attention on running services, lest they be stuck in a low-value, low-skill, low-wage position as the people who wander data centers replacing servers when they fail. He also noted, almost in passing, that Google doesn’t see much point in VMs. (It should be obvious to any observer that, at Google’s scale, where services require anywhere from tens to tens of thousands of machines, there is never a need to multiplex disparate services onto a single machine, and the overhead of a VM hypervisor is simply wasted resources that would be better put into running whatever service is the machine’s business purpose.) He made several handwavy arguments that this was the future and not just something that only exists at Google scale. One thrust of his presentation was a justification of the SRE role, and a claim that it differs not just in scale but qualitatively from system administration. With so many current and former Googlers in the audience, he was speaking to something of a friendly crowd, and didn’t receive much in the way of serious questioning during the Q&A; it would be interesting to see how the many thousands of administrators in Windows shops, very few of whom attend LISA or any other Usenix conference, would respond to these claims.

Friday night after dinner I attended the usual “dead dog” party in the conference organizer’s suite. I was very tired and rather “socialed out”, so I don’t recall much of what was said. Lee Damon and I talked about a common acquaintance of ours with a cute UDel sophomore who was attending the conference for the first time. There were homemade brownies, bottles of water, cans of soda, bottles of whisk(e)y, and bags of chips, and all the Usual Suspects were there, except Parter, whom I don’t recall seeing again after speaking with him on Thursday. Adam Moskowitz (who is something of a cook but not a wine connoisseur) talked amusingly about cooking for Kirk and Eric (who definitely are wine connoisseurs).

On Saturday I flew home, and when I pulled in to my parking space, I was disturbed to see a 20-foot roll-off dumpster on the lawn next to my building. However, my doorbell was still lit, proving that my condo hadn’t burned to the ground. (In daylight, the container appeared to be full of shingles, suggesting that the association had replaced my building’s roof while I was gone.) I found my UPS feeping constantly, with the power to my home server and wireless access point shut off; I power-cycled the UPS and things came back to normal.

Abstracts for all presentations, plus (eventually) slides and video

Posted in Computing

An addendum to the findslowdisks DTrace hack

In the previous installment, I posted a script that will output a series of lines that look like this:

multipath/s33d7: 417.3 ms > 372.9 ms
multipath/s25d22: 682.9 ms > 372.9 ms
multipath/s29d5: 699.2 ms > 372.9 ms
multipath/s25d4: 1449.1 ms > 372.9 ms

every five seconds. If you leave it running for a while and DTrace doesn’t report any warnings, you’ll accumulate a whole bunch of these, and it may not be obvious which slow disks most urgently need replacement. Or at least, it wasn’t obvious to me (beyond the top one or two). So a way to summarize this data further is desirable.

My first step, as it usually is, was to paste the output into an Emacs buffer and then sort it with M-x sort-lines. Deleting the extraneous matter (shell prompts, blank lines, etc.) makes a somewhat more informative display, but not quite as summarized as I would like. So I hit upon using shell-command-on-region (M-|) with uniq, but of course the lines are already unique due to random variation in the reported service times, so a little bit more shell hacking would be necessary. I finally came up with the following pipeline for shell-command-on-region (which works for vi users too, although I have no idea what particular obscure key sequence the Left Coast Editor requires):

cut -d: -f1 | uniq -c | sort -n

This produces a nice summary of a minute’s worth of findslowdisks output:

   1 multipath/s29d22
   7 multipath/s25d18
   8 multipath/s21d7
   9 multipath/s29d21
   9 multipath/s29d5
  11 multipath/s25d12
  11 multipath/s25d4
  13 multipath/s33d7
  14 multipath/s25d22
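
One caveat: uniq -c only merges adjacent duplicate lines, which works here only because the buffer had already been sorted with M-x sort-lines. If you skip the Emacs sorting step, the pipeline needs its own sort in front of uniq; something like this should do, though it isn’t what I actually ran:

cut -d: -f1 | sort | uniq -c | sort -n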

My first (well, ok, third) DTrace hack

We have a storage server with a bunch of marginal disks in it. They aren’t bad enough to return read errors, and of course the SMART data claims they’re just fine, but the built-in error-recovery takes much too long (sometimes on the order of seconds), so I’ve been writing some DTrace scripts to try to identify the problem drives and get them replaced. This is my first real foray into DTrace, so it’s been a learning process.

From one of the DTrace gurus (I’ve forgotten which one), I got the idea of postprocessing DTrace output with awk to find drives whose maximum response time is more than six standard deviations away from the mean. I had to write another awk script to parse the output of gmultipath list and generate some code for the first script, translating disk units into the names we actually use. (I wouldn’t have had to do this if I had instrumented the gmultipath layer, but I understood the CAM peripheral driver for SCSI disks, /sys/cam/scsi/scsi_da.c, much better than I understood how requests are passed around GEOM, so that’s where I wrote my DTrace instrumentation.) Finally, it became clear that the SSDs in our storage servers were pulling the mean and standard deviation down too much, so I had to kluge the DTrace script to exclude those devices by hand. (I don’t think there’s a way to identify SSDs in pure DTrace code; if anyone has an idea how to do it, please let me know.)

I am by no means a DTrace expert. I’m not even an awk expert — I would probably have done better to write this code in Perl or Ruby, which would have made it much simpler and avoided the whole business of scripts-running-scripts-to-generate-scripts. (Or better yet, I could learn R and DTrace at the same time, joy!) But what I had seems to work, so I’m sharing it here.

The driver shell script, findslowdisks:

#!/bin/sh
# Build a throwaway awk script, then run the DTrace script and filter its
# output through that awk script.

temp=$(mktemp /tmp/sixsigma.awk.XXXXXX)
trap "rm -f $temp" 0 1 3 15

# The generated awk script begins with a BEGIN block mapping daNN consumer
# names to their multipath provider names (scraped from "gmultipath list"),
# followed by the six-sigma filter in sixsigma.awk.
{
        echo 'BEGIN {'
        gmultipath list | awk '
/Providers:/ { state = 1 }
/Consumers:/ { state = 2 }
/Name:/ && state == 1 { providername = $NF }
/Name:/ && state == 2 { printf("providers[\"%s\"] = \"%s\";\n", $NF, providername); }'
        echo '}'
        cat sixsigma.awk
} > $temp
# awk ignores SIGINT so that ^C stops dtrace (firing its END probe) while the
# final batch of output still gets filtered.
dtrace -s daslow2.d | { trap '' INT; awk -f $temp; }
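
To make the scripts-generating-scripts business a bit more concrete, the temporary file ends up as an awk program whose BEGIN block looks something like the following (the device and provider names here are made up purely for illustration), with the contents of sixsigma.awk appended after it:

BEGIN {
providers["da7"] = "multipath/s33d7";
providers["da22"] = "multipath/s25d4";
providers["da46"] = "multipath/s29d17";
}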

The DTrace script, daslow2.d:

#!/usr/sbin/dtrace -s

#pragma D option quiet
#pragma D option dynvarsize=2m

dtrace:::BEGIN
{
        /* ZeusRAM */
        ssd[6] = 1;
        ssd[54] = 1;
        ssd[102] = 1;
        ssd[126] = 1;
        /* flash */
        ssd[31] = 1;
        ssd[79] = 1;
        ssd[151] = 1;
        ssd[175] = 1;
}

/*
 * Record the start time of each I/O handed to the da(4) driver, keyed by
 * the bio pointer; the SSD unit numbers listed above are excluded.
 */
fbt::dastrategy:entry
/ssd[((struct cam_periph *)args[0]->bio_disk->d_drv1)->unit_number] == 0/
{
        start_time[args[0]] = timestamp;
}

/*
 * On completion, compute the service time in microseconds and feed the
 * running mean, standard deviation, and per-drive maximum aggregations.
 */
fbt::dadone:entry
/(this->bp = (struct bio *)args[1]->ccb_h.periph_priv.entries[1].ptr) && start_time[this->bp] /
{
        this->delta = (timestamp - start_time[this->bp]) / 1000;
        @sigma = stddev(this->delta);
        @mu = avg(this->delta);
        @tmax[args[0]->unit_number] = max(this->delta);
        start_time[this->bp] = 0;
}

/* Every five seconds, print the running statistics and the per-drive maxima, then reset the maxima. */
tick-5s
{
        printa("mu %@d\n", @mu);
        printa("sigma %@d\n", @sigma);
        printa("da%d tmax %@d\n", @tmax);
        printf("\n");
        clear(@tmax);
}

/* Reset the mean and standard deviation every minute so they track recent behavior. */
tick-60s
{
        clear(@mu);
        clear(@sigma);
}

dtrace:::END
{
        printa("mu %@d\n", @mu);
        printa("sigma %@d\n", @sigma);
        printa("da%d tmax %@d\n", @tmax);
}

This DTrace script generates output that looks like this:

mu 21655
sigma 59125
da0 tmax 0
da1 tmax 0
da73 tmax 72163
da7 tmax 75229
da72 tmax 80966
da98 tmax 84600
da96 tmax 92696
da80 tmax 92742
da23 tmax 95349
da83 tmax 95807
da27 tmax 97571
da95 tmax 99413
da76 tmax 101202
da8 tmax 104040
da22 tmax 104586
da59 tmax 106235
da50 tmax 106523
da99 tmax 107958
da53 tmax 108361
da81 tmax 108938
da49 tmax 109653
da36 tmax 110585
da52 tmax 111933
da97 tmax 111972
da94 tmax 113516
da100 tmax 115035
da11 tmax 116034
da82 tmax 116527
da56 tmax 117776
da29 tmax 118982
da34 tmax 119512
da26 tmax 120884
da74 tmax 122808
da84 tmax 123039
da58 tmax 123882
da87 tmax 124629
da25 tmax 125528
da47 tmax 127538
da70 tmax 128523
da55 tmax 134356
da101 tmax 137618
da35 tmax 137632
da10 tmax 139364
da60 tmax 141884
da33 tmax 144913
da32 tmax 153962
da48 tmax 162202
da13 tmax 170598
da86 tmax 173577
da9 tmax 178848
da90 tmax 183577
da61 tmax 185236
da85 tmax 190790
da14 tmax 192199
da38 tmax 204740
da68 tmax 217665
da69 tmax 218239
da12 tmax 230249
da64 tmax 233869
da92 tmax 235744
da44 tmax 243507
da20 tmax 250953
da63 tmax 267525
da42 tmax 267868
da19 tmax 292027
da18 tmax 293504
da37 tmax 301723
da45 tmax 304621
da28 tmax 308943
da91 tmax 309112
da39 tmax 324018
da62 tmax 329701
da93 tmax 339661
da16 tmax 341316
da17 tmax 345690
da15 tmax 349360
da41 tmax 351064
da67 tmax 355273
da40 tmax 376592
da66 tmax 386287
da88 tmax 426114
da89 tmax 426324
da21 tmax 500642
da30 tmax 502026
da75 tmax 734421
da65 tmax 1319797
da71 tmax 1347838
da46 tmax 1354558
da57 tmax 1525634
da51 tmax 2403745

…which is obviously not very human-readable, but you can look at it and see that it’s generating data that looks like it might be correct. sixsigma.awk is the tail end of the script (the head being a BEGIN block that is constructed by the shell script above):

$1 == "mu" { mu = $2 * 1.0; }
$1 == "sigma" { sigma = $2 * 1.0; }
$2 == "tmax" {
        tmax = $3 * 1.0;
        if (tmax > (mu + 6*sigma)) {
                printf("%s: %.1f ms > %.1f ms\n", providers[$1], tmax / 1000.0, 
                        (mu + 6*sigma) / 1000.0);
        }
}
/^$/    { print }
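
To put numbers on the six-sigma cutoff: with the sample figures above (mu ≈ 21655 and sigma ≈ 59125, both in microseconds because the D script divides the deltas by 1000), the threshold is 21655 + 6 × 59125 ≈ 376,400 µs, or about 376 ms, so only the dozen or so drives at the very bottom of that tmax list would be reported.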

The output, updated every five seconds, looks like this:

multipath/s29d17: 1688.0 ms > 547.1 ms
multipath/s29d22: 2847.9 ms > 547.1 ms
multipath/s25d12: 3381.1 ms > 547.1 ms

multipath/s25d22: 986.1 ms > 585.7 ms
multipath/s29d17: 1498.7 ms > 585.7 ms
multipath/s25d12: 2728.8 ms > 585.7 ms
multipath/s25d4: 2894.7 ms > 585.7 ms
multipath/s29d22: 3099.7 ms > 585.7 ms

multipath/s29d17: 719.5 ms > 651.1 ms
multipath/s25d18: 895.4 ms > 651.1 ms
multipath/s25d4: 1749.2 ms > 651.1 ms
multipath/s29d22: 1995.8 ms > 651.1 ms
multipath/s25d12: 3047.1 ms > 651.1 ms

multipath/s25d18: 580.4 ms > 579.1 ms
multipath/s25d4: 734.8 ms > 579.1 ms
multipath/s29d22: 1379.6 ms > 579.1 ms
multipath/s25d12: 2586.5 ms > 579.1 ms

multipath/s29d22: 3021.9 ms > 645.4 ms
multipath/s25d12: 3492.1 ms > 645.4 ms

This gives me a pretty good idea of which disks still need to be replaced.

Posted in Computing, FreeBSD