Chuck Thacker

Chuck Thacker died yesterday, and the world is poorer for it.

Chuck won both the Draper Prize and the Turing Award. He’s been described as “an engineer’s engineer”, epitomizing Antoine de Saint-Exupéry’s remark that “Perfection is achieved not when there is nothing more to add, but when there is nothing left to take away.” He established a track record of simple, beautiful, and economical designs that is exceedingly rare.

Over the last day I’ve been struggling with how to explain Chuck to non-hardware engineers.  He could achieve amazing results with fewer components than anyone else, and yet, after the fact, mere mortals could understand what he had done.  But he also understood the physics and technologies very well, and knew just where to apply the unusual part or custom design to make the entire project coalesce into a coherent whole. If you are a software developer, think of Chuck as someone like Niklaus Wirth, who invented Pascal. If you are an aviation buff, think of Chuck as someone like Kelly Johnson, who designed the SR-71. Chuck really was at that level.

I had the privilege to work directly with Chuck on three different computer system designs.  I was a coauthor on several papers with Chuck and coinventor on a networking patent, so I suppose my Thacker number is 1.

I first met Chuck Thacker when I was a summer intern at Xerox PARC in 1977.  We both joined Digital Equipment’s Systems Research Center, working for Bob Taylor, in 1984.  At SRC, Chuck led the design for the Firefly multiprocessor workstation.  I wrote the console software for the 68010 version, and designed the single and dual MicroVAX CPU modules. I wanted to add sound I/O to the Firefly, and Chuck helped me figure out how to do it by adding only three chips to the design for the display controller.

Later at SRC Chuck launched the idea of the “Nameless Thing”, which was to be a liquid-immersion-cooled computer built around an ECL gate array running at 200 MHz.  I worked on the first-level caches, to be built out of 1.2 nanosecond gallium arsenide static RAMs.   We had to rewrite the CAD tools to get sensible board layouts that could run at those speeds.

NT was never built because it was overtaken by the Digital Semiconductor Group’s design of the first Alpha processor. Chuck led a team of Digital research folks to build development systems for the Alpha chip.  The effort was credited with advancing Alpha’s time to market by about a year. At the time, Digital had a standard design for multiprocessor systems based on the “BI” bus.  The specification ran to over 300 pages.  Chuck was incredulous, and worked out a design for the Alpha Development Unit multiprocessor bus that was 19 pages long.  The Alpha EV-3 and EV-4 chips were very unusual in that they could be configured for either TTL or ECL signaling on the pins.  The ADU became an unrepentant ECL design.  Strict adherence to ECL transmission line signaling and a complete disregard for power consumption allowed for exceedingly fast yet low-noise signaling.  Chuck designed the bus and the memory system.  If I remember correctly, he commissioned Gigabit Logic to build custom address line drivers so that the memory would meet timing.  Dave Conroy designed the CPU module, and I designed the I/O module.  I recall that SRC built the chassis and ordered cables for the 400 amps of -4.5 volts from a local welding shop.  They asked, “What kind of welder needs 18 inch cables?”

I learned a tremendous amount from Chuck’s economy of design and from his ability to make hardware vs software tradeoffs to achieve simplicity.  I also learned that it was completely allowed to rewrite all the software tools to make them do what you want.

Chuck was a “flat rock engineer”, in his own words.  The reaction of such a person to a new project is to first rub two rocks together to make a flat working surface. He was a lifelong opponent of complexity, not only in hardware, but in software as well, remarking that unnecessarily complicated software was apt to collapse in a rubble of bits – a phrase I adopted as the title of this blog.

Chuck Thacker was unique, and I deeply mourn his passing.  Evidently he didn’t wish a memorial service, but I think the duty falls on all of us to edge our designs a little closer to simple, elegant, straightforward, and beautiful.


Bob Taylor

Robert W. Taylor died yesterday.  While working at ARPA, he funded the work that led to the Internet.  He managed the legendary Xerox PARC Computer Science Lab, where the Alto and the Ethernet were created. He won the National Academy of Engineering’s Draper Prize. You can read more about these things elsewhere.

Bob Taylor hired me, with my new PhD, into CSL.  Later, he hired me again, at the Digital Equipment Systems Research Center.  I learned not everything I know, but quite a lot of it, on his watch. Bob had the special genius of assembling groups of people who could invent the future.

At Xerox, the weekly group meetings were called Dealer, as in Dealer’s choice.  The speaker set the rules.  The culture was for the audience to do their level best to challenge the ideas.  Bob talked about civility, and about the necessity of “turning type one disagreements into type two disagreements”.  A type two disagreement is where each party understands and can explain the position of the other.

I was first exposed to CSL as a research intern while a graduate student. On either side of my office were Dave Gifford and Eric Schmidt. When I graduated, I turned down a couple of faculty appointments to stay at CSL. There was no place else that had the same concentration of talent and the freedom to build new things.  Both of those factors were the work of Taylor.  He felt his job was building the group and building the culture, then defending it from outside influence.

In 1984, corporate finally got the best of him and Taylor left to start the Systems Research Center at Digital Equipment.  I was number 24 to quit and follow him.  Against all odds, Taylor repeated his success and built another outstanding research group at Digital.  Occasionally, some dispute or other would arise, and folks would go complain to Bob.  He had a plaque on his wall: “Men more frequently need to be reminded than informed.”  Bob would gently remind us of the rules of disagreement.

It’s not well known, but Taylor was from Texas and a little bit of the Lone Star State followed him around.  One time, Dave Conroy and I had succeeded in getting a telephone audio interface running on our lab-built Firefly multiprocessor workstations, and mentioned it on our way out to lunch.  When we got back, we found Taylor had dialed in and left us a 30 second recording.  Dave and I knew this had to be preserved, but the test program we were using had no code to save the recording!  Eventually, we sent a kill signal to create a core dump file and fished the recording out of the debris.  Here’s Bob Taylor:

[audio: Bob Taylor’s 30 second recording]

Go Square!

I got a misaddressed email today, with a receipt for someone’s Square account.

At the bottom, there is a button “Not your receipt?”

When clicked, the page reads “Someone must have entered your email address” with an option to unlink it.  Easy and sensible.

This is by far the best handling of a misdirected email I’ve encountered.


Wikileaks Bait

One of the interesting developments in the 2016 electoral cycle is the use of offensive cyberespionage.  Wikileaks is publishing internal email from Hillary Clinton’s campaign, with the releases timed in an attempt to damage it.

Maybe this is the work of Russian spies, with Wikileaks an unwitting stooge, maybe not, but the case is quite interesting.

What should a campaign organization, or corporation, or government agency do?  Their emails may be next.

One possibility is to salt the email stream with really tempting tidbits suggesting illegal, immoral, or unethical behavior, but also put copies of these emails in escrow somewhere.  Then, when the tidbits come to light, you can derail the news cycle with a story about how your infosec team has pwned the leakers and trolled the media.

The technique will only work the first time, but even later, professional news organizations are not going to want to take the chance that their scoop is a plant.  That is how Dan Rather lost his job.

If the plants are subtly different, they could also be used to identify the leaker or access path.  (This was suggested in “The Hunt for Red October” by Tom Clancy, written in 1984, but the idea is surely older than that.)
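The subtly-different-plants idea can be sketched in a few lines. This is a toy illustration, not a real watermarking scheme: the function names are mine, and the trailing-space trick stands in for whatever invisible variation a real scheme would use. The point is only the flow of escrowing a fingerprint of each variant and matching a leak against it.

```python
import hashlib

def make_plants(base_text, recipients):
    """Create a subtly different copy of a planted email for each
    recipient, and escrow a fingerprint of each variant so a leaked
    copy can later be traced back to its access path."""
    escrow = {}
    for i, who in enumerate(recipients):
        # Trivial variation for illustration: i trailing spaces.  A real
        # scheme would vary wording or punctuation imperceptibly.
        variant = base_text + " " * i
        fingerprint = hashlib.sha256(variant.encode()).hexdigest()
        escrow[fingerprint] = who
    return escrow

def identify_leaker(leaked_text, escrow):
    """Match a leaked document against the escrowed fingerprints;
    returns the recipient of that variant, or None."""
    return escrow.get(hashlib.sha256(leaked_text.encode()).hexdigest())
```

The escrowed fingerprints also serve the derailing purpose above: they prove, after the fact, that the tidbit was a plant all along.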

More on point, it should be obvious at this point that email is not secret, nor is any electronic gadget secure.  [[ How do you identify the spook? She’s the one with a mechanical watch, because she doesn’t carry a phone. ]]

Until we get secure systems, and I’m not holding my breath, conspirators really shouldn’t write anything down.  In the alternative, their evil plans must be buried in a sea of equally plausible alternatives.


Stingray countermeasures

A Stingray is a cell tower lookalike device.  It broadcasts its presence, and nearby phones connect to the Stingray thinking it is a legitimate tower.  The Stingray can then log each phone or act as a man in the middle to intercept call metadata, text messages, or even call contents.

There are a number of public databases of legitimate cell towers, for example, http://opencellid.org.  Some databases are run by governments, such as the FCC license database, while others are crowdsourced.

It should be possible to modify a phone to connect only to towers which are legitimate, by checking the purported tower ID against a cached copy of the database for the local area.  A Stingray could, of course, use the ID of a real tower, but that would disrupt communications in the whole area. This might not prevent the Stingray from logging the presence of such a phone, since the Stingray could hear the protocol handshake with the legitimate tower.

It should also be possible for a phone to passively listen for tower broadcasts, and to compare the tower ID against the database.  An unknown ID might be a new legitimate tower or it might be a Stingray.

It is likely quite difficult to get at and modify the low-level radio software in a commercial smartphone, but there is a complete open source suite of cell infrastructure software at http://openbts.org.

That code could serve as a starting point for a software defined radio device for detecting and tracking Stingrays.  One could make a box with a red light on top which lights up when there is an unknown tower in the area.
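The detector logic itself is trivial once the radio side delivers tower IDs. A minimal sketch, with made-up tower IDs and a hypothetical flat string format; the real work is in the software-defined radio, not here:

```python
# A cached whitelist of legitimate towers for the local area, pulled
# from a public database such as opencellid.org.  The IDs below are
# invented for illustration (hypothetical "MCC-MNC-LAC-CellID" strings).
KNOWN_TOWERS = {"310-410-6100-12345", "310-410-6100-12346"}

def unknown_towers(observed_ids, known_towers):
    """Passively compare observed tower broadcasts against the cached
    database and return the IDs that are unknown.  Anything unknown
    might be a new legitimate tower, or it might be a Stingray --
    a non-empty result lights the red lamp."""
    return {tid for tid in observed_ids if tid not in known_towers}
```

The interesting engineering is keeping the cached database fresh for the local area, and distinguishing newly licensed towers from imposters.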

In some areas, use of Stingray devices requires a warrant, but this is not universal.  The courts have also determined that use of location data from legitimate cell towers does not require a warrant.

PIN Escrow

The FBI has dropped their request to require Apple to write code to unlock the terrorist iPhone.  Supposedly a third party offered a way in.  Yesterday the FBI said they did get in, so they no longer need Apple’s help.

For those whose first instinct is to distrust the government, this looks like the Justice Department realized they were going to lose in court and hastily discovered a way out. “Never mind”.  This preserves their option to try again later when public opinion and perhaps law would be more on their side.  I am a little reluctant to think Justice would outright lie to a federal judge, but it wouldn’t be the first time.

This morning on NPR there was a different sort of heartbreaking story.  A woman and her baby were murdered, and there might be evidence on the woman’s phone, but it can’t be unlocked.  So what to do?

My idea is “PIN Escrow”.  Everyone should have a letter written with a list of their accounts and online passwords, to be opened by someone in the event of death or disappearance.  Everyone should have a medical power of attorney and so forth as well, to give a family member or trusted friend the power to act for you in the event of a sudden disability.  Just add your smartphone PIN to the letter.

In the alternative, one could write an app that encrypts your PIN with the public key of an escrow service and sends it off.  This facility could even be built into the OS, with opt-in (or even opt-out, after a sufficient public debate), so it would automatically track changes.  The government could operate such a service, or it could be private.  There could be many such services.  Some could be offshore.  Some could use key-sharing for the private key, so PIN recovery could not be done in secret.
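The key-sharing idea can be illustrated with Shamir’s secret sharing: split a secret into n shares so that any k of them reconstruct it, but fewer reveal nothing. This is a toy sketch over a prime field, with parameters chosen by me for illustration; a real escrow service would share the service’s private key this way, not the PIN itself.

```python
import random

# Any prime larger than the secret works; 2**61 - 1 is a Mersenne prime.
PRIME = 2**61 - 1

def split_secret(secret, n, k):
    """Return n points on a random degree-(k-1) polynomial whose
    constant term is the secret.  Any k points determine the
    polynomial; fewer leave the secret information-theoretically hidden."""
    coeffs = [secret] + [random.randrange(PRIME) for _ in range(k - 1)]
    def f(x):
        return sum(c * pow(x, i, PRIME) for i, c in enumerate(coeffs)) % PRIME
    return [(x, f(x)) for x in range(1, n + 1)]

def recover_secret(shares):
    """Lagrange interpolation at x = 0 recovers the constant term,
    i.e. the secret, from any k shares."""
    total = 0
    for i, (xi, yi) in enumerate(shares):
        num, den = 1, 1
        for j, (xj, _) in enumerate(shares):
            if i != j:
                num = num * (-xj) % PRIME
                den = den * (xi - xj) % PRIME
        # pow(den, PRIME - 2, PRIME) is the modular inverse of den.
        total = (total + yi * num * pow(den, PRIME - 2, PRIME)) % PRIME
    return total
```

With, say, five trustees and a threshold of three, no single trustee, and no pair of colluding trustees, can recover a PIN in secret.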

Let’s leave it up to individuals whether they want someone to have the power to unlock their phone in the event of an emergency.

From a security perspective, a PIN escrow service would be a dangerous and attractive target, so such a thing would have to be well designed in order to be trustworthy.  It should be kept offline, with no network connection.  The private key should be in a hardware key module.  Several people would have to collude in order to unlock a key, and there ought to be hardware safeguards to prevent bulk PIN recovery.

This is not a general back door for government surveillance; it wouldn’t grant remote access to a phone.  It wouldn’t be useful for hacking into criminals’ or terrorists’ phones (if they are smart), but it might help in cases where the phone owner is the victim of tragedy or accident.

And if you change your mind about having your PIN escrowed?  Just change your PIN.


Apple v FBI

I’m beginning to build up a full head of steam.  The first step seems straightforward.  I’m going to write my congressman.  It may not have much effect, but if enough of us write, it might.

Here’s my letter to Massachusetts Senator Elizabeth Warren.  I’ll be sending similar letters to Sen. Ed. Markey and Rep. Katherine Clark.

2016, March 16

The Honorable Elizabeth Warren
317 Hart Senate Office Building
Washington, DC 20510

Dear Senator Warren:

I write about the Apple FBI affair.  Please oppose any attempt by government to weaken the security and privacy of all Americans by demanding security “backdoors” in our technology or to require the conscription of Americans or American companies to weaken their own security.

First, regarding backdoors. I hold a PhD in Electrical Engineering and have worked with computer systems and computer security for over 40 years.  I am coauthor of the well-regarded book on E-commerce systems “Designing Systems for Internet Commerce.”  In other words, I know quite a lot about this area.  There is simply no way to create a backdoor that does not also reduce the security of the system for everyone.

Second, speaking as an ordinary citizen, I do not know how the courts will rule on the government’s request to use the All Writs Act to compel Apple to write software to unlock the San Bernardino iPhone, but my own view is that the Constitution does not and should not allow it.

The government is being deliberately disingenuous when it claims this case is only about one terrorist’s phone. I have no sympathy for the killers, but the privacy and security of everyone is at risk should the government prevail.  Should that happen, I expect you to propose and support legislation that outlaws backdoors and forbids the conscription of individuals or companies into the government’s service.  This has happened before.  Congress passed the Privacy Protection Act of 1980, which corrected the overreach of government in Zurcher v. Stanford Daily.

Sincerely yours,

Lawrence C. Stewart

Smartphone Security

Zdziarski’s Blog of Things has an article about possible enhancements to iOS security, in the wake of the Apple vs FBI affair.

Another idea is one I’ve mentioned before: Duress Passwords.

If you are asked to unlock your phone, you could use a different finger, the duress finger, and the fingerprint sensor could appear to accept it, but erase the phone.  If you enter the duress password, the phone could erase itself or, perhaps, just start recording what is going on and uploading it to the cloud.

Another idea is Landmine Passwords.  These are passcodes whose purpose is to defeat brute force searches.  If you avoid landmines within Hamming distance one or two of the correct passcode, you would have little chance of hitting one while trying to enter the correct code, but any searcher would be very likely to hit one before hitting the correct passcode.
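Both ideas fit in a few lines of lock-screen logic. A sketch, assuming four-digit numeric passcodes and counting Hamming distance digit-wise; the function names and the "erase"/"unlock"/"reject" outcomes are mine:

```python
def hamming(a, b):
    """Digit-wise Hamming distance between equal-length passcodes."""
    return sum(x != y for x, y in zip(a, b))

def pick_landmines(correct, candidates, min_distance=3):
    """Keep only landmine candidates far (in Hamming distance) from the
    correct code, so a fat-fingered legitimate entry never trips one,
    while a brute-force search is likely to hit one early."""
    return [c for c in candidates if hamming(c, correct) >= min_distance]

def check_pin(entered, correct, duress, landmines):
    """Decide what the lock screen does with an entered passcode."""
    if entered == duress or entered in landmines:
        return "erase"    # appear to accept, then wipe (or start recording)
    if entered == correct:
        return "unlock"
    return "reject"
```

A duress fingerprint works the same way: one enrolled finger maps to "unlock", the other to "erase".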

The obvious missing feature

I think there are great opportunities for sensible people to make money doing usability analyses of web based systems.

Let me give some examples of well intentioned systems with the obvious feature left out.

Email addresses

I have a Capital One credit card, and in my user profile, there is a place to enter an email address so they can send me stuff.  (In another post I will rant about email addresses further.)  Recently I happened to log in to set up alerts for spending and so forth.  The email notifications were disabled because, they said, the email address I had entered had been refused.  Yet the address was actually correct.

This is not unheard of.  We had a crash a while back of our cloud email server, and we didn’t notice for hours, so it is possible mail bounced.

There was no way to tell the Capital One system “test it now please”.  Instead, I had to change the address to a different one.  This made them happy even without a test.  I suppose I could then change it back, but how much time do I have to spend working around a bad design?

Phone numbers

Many sites require phone numbers.  They have no uniform way of entry.  Some have free-form fields, but limited to exactly 10 characters.  Some forbid hyphens.  Some require hyphens.  Some have exactly three fields, for area code, exchange, and number.  Is it really that hard to parse a variety of formats?  Do they really think making me keypunch my number is helping their image?
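It really isn’t that hard. A minimal sketch for US numbers, which accepts any of the formats above (the function name is mine):

```python
import re

def normalize_us_phone(raw):
    """Accept a US phone number in any common format -- hyphens, dots,
    parentheses, spaces, optional +1 -- and return the bare ten digits,
    or None if it isn't a plausible number."""
    digits = re.sub(r"\D", "", raw)        # strip everything non-numeric
    if len(digits) == 11 and digits.startswith("1"):
        digits = digits[1:]                # drop the country code
    return digits if len(digits) == 10 else None
```

Store the ten digits, format them however you like on display, and let the user type whatever they want.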

Notifications

I have my bank account and credit cards set up to send me text notifications when there is activity. One bank only allows notifications for amounts above $100.  Why does that even make sense? They can handle small deposits, but they can’t handle sending a text for a $10 charge? At least the text on the page explains the limit.

A credit card company has the same feature, but allows texts for any transaction amount, except $0! If I want notifications on all transactions, what limit value should I use?  I telephoned, and the agent suggested $0.01.

I’m getting to be a curmudgeon when things like this offend me.


Notifications – unclear on the concept

This is a post about organizations trying to communicate with their customers but getting it wrong.

I have signed up for various notifications, typically by text or email.  Tragically, sometimes organizations manage to use these in a way that makes me think they are idiots.

  • I just received a text from my local library that a book I’ve had on hold forever has come in.  The problem is that I picked it up last night.
  • I got an email from my Honda dealer that my minivan is due for service – two days after the service was done, by them.
  • I get both emails and texts from Target that my store credit card payment due date is coming up — even though my balance is zero.

To me these seem like violations of a simple and obvious design principle:  don’t send a notification that is moot.  All it does is point out to your customer that your systems are broken.  And that means that your organization is clueless and really should not be trusted with my business.
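The principle amounts to one guard clause: consult the system of record at send time. A sketch covering the three examples above; the event names and state fields are hypothetical:

```python
def should_notify(event_type, state):
    """Check current state before sending a notification; a moot
    notification does more harm than silence."""
    if event_type == "hold_ready" and state.get("picked_up"):
        return False           # the book already left the library
    if event_type == "service_due" and state.get("service_completed"):
        return False           # the dealer itself did the service
    if event_type == "payment_due" and state.get("balance", 0) == 0:
        return False           # nothing is owed
    return True
```

The check is cheap; the cost of skipping it is a customer who concludes your systems can’t be trusted.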

Delay is also important.  I have my Bank of America profile set so that I get texts notifying me of ATM withdrawals.  I should get them when I make a withdrawal, but never at other times.  Often, these arrive within minutes, but sometimes they take 6 hours or so to arrive.  The immediate feedback ratchets up my confidence that I would find out immediately if fraudulent activity were to occur.  The delayed feedback?  It has the opposite effect.  I obviously cannot trust BofA systems to notify me of activity in a timely way.  Should I trust them for anything else?