ADB was pretty solid. A shared single-wire bus that allows multiple slaves by including an address in the message... but it also includes a protocol-level reset that's used regularly, so a single missed bit doesn't jam things up unrecoverably. When I think of how many fewer hours of my life I would have wasted had I2C learned this lesson...
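To make the "address in the message" part concrete, my recollection of the ADB command byte (worth checking against the spec; the field values here are from memory) is a 4-bit device address, a 2-bit command code, and a 2-bit register number, with the bus-wide reset being just command byte 0x00:

    # Rough sketch of the ADB command byte, from memory of the spec:
    # bits 7-4: device address, bits 3-2: command code, bits 1-0: register.
    TALK, LISTEN = 0b11, 0b10
    SEND_RESET = 0x00   # all-zero command byte: the bus-wide reset

    def adb_command(address, command, register):
        """Pack address/command/register into a single ADB command byte."""
        return ((address & 0xF) << 4) | ((command & 0x3) << 2) | (register & 0x3)

    poll_keyboard = adb_command(0x2, TALK, 0)   # "Talk register 0" at the default keyboard address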
The only thing was… it was very easy to destroy a device (a keyboard, for example) if you hot-plugged it. I lost a keyboard or two (well, my employer did) because I plugged in a device without first powering down the Mac. (Probably not the fault of the ADB protocol itself, though? I wonder if this is one of those cases where making some pins longer than others, so they make electrical contact first, could have prevented the problem.)
Oh is that why I2C seems to just hang/break whenever I use it for a DIY project? Does everyone just reset the bus all the time?
The I2C protocol is a distributed state machine. Each slave node needs to know which bit in the message sequence is being sent in order to match it against its own address. As a result, a single missed bit leaves a slave unable to recognize its own address, so it goes dark -- or, worse but rarer, it recognizes its address in the midst of data sent to another slave and responds inappropriately. Observing this in a real system indicates either a signal-integrity issue or a bug, but signal-integrity issues do happen, and this non-recoverable behavior is somewhat unique to I2C, and somewhat cursed: it turns what should be a transient error into a permanent one. In practice, most I2C devices have some mechanism for recovery (a sufficiently long pulse on the clock line, a set number of clock edges while the data line is held low, a dedicated reset pin), and for the few that don't, forcing a reset of the device through a high-side drive or similar works fine. But the fact that these mechanisms live outside the protocol rather than inside it is also cursed, and means that on an I2C bus of mixed devices there's no guarantee of a single recovery method acceptable to all of them.
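For the "clock it out" flavor of recovery, the usual trick (roughly what the NXP application notes suggest) is to take the pins back from the I2C peripheral, pulse SCL up to nine times until the stuck slave releases SDA, and then generate a STOP. A minimal MicroPython-flavored sketch, assuming an ESP32-style port where Pin.OPEN_DRAIN is available and reading the pin gives the actual line level; the pin numbers are made up:

    # Standard "clock out the stuck slave" I2C bus recovery, bit-banged.
    # Assumes open-drain GPIO with external pull-ups; value(1) releases the line.
    from machine import Pin
    import time

    SCL = Pin(22, Pin.OPEN_DRAIN, value=1)
    SDA = Pin(21, Pin.OPEN_DRAIN, value=1)

    def recover_bus(max_pulses=9):
        for _ in range(max_pulses):
            if SDA.value():          # slave has released the data line
                break
            SCL.value(0)
            time.sleep_us(5)         # ~100 kHz half-period
            SCL.value(1)
            time.sleep_us(5)
        # STOP condition: SDA goes low -> high while SCL is high
        SDA.value(0)
        time.sleep_us(5)
        SCL.value(1)
        time.sleep_us(5)
        SDA.value(1)
        time.sleep_us(5)
        return SDA.value() == 1      # True if the bus looks idle again

After that you reinitialize the hardware I2C controller on those pins. And some devices won't respond to this at all and still need their reset pin or a power cycle, which is exactly the mixed-bus problem above.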
Yes, but every device is slightly different and sometimes you need obnoxious hacks to even detect that there is a problem at all...
Has anyone calculated or measured the input lag of ADB vs other protocols such as PS/2 or USB? This is unfortunately hard to search because most references on the web to ADB are for the Android Debug Bridge.
From the numbers given, it seems like ~2ms to send a packet (my math may be off), which is quite good when compared with other contemporary/modern protocols (see: https://danluu.com/input-lag/ for examples)
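Back-of-envelope, using the ADB timings I remember (a 100 µs bit cell, ~800 µs attention pulse, ~70 µs sync, ~200 µs stop-to-start gap -- worth checking against the article's figures), a 2-byte Talk response comes out closer to 4 ms than 2 ms, depending on which parts of the transaction you count:

    # Rough ADB transaction-time estimate. Timing constants are from memory
    # of the spec, not from the article, so treat the result as approximate.
    ATTENTION_US     = 800   # host holds the bus low to start a transaction
    SYNC_US          = 70
    BIT_CELL_US      = 100   # every bit, including start/stop bits, is one cell
    STOP_TO_START_US = 200   # gap before the device answers (spec: 140-260 us)

    def talk_transaction_us(data_bytes=2):
        command  = 8 * BIT_CELL_US + BIT_CELL_US                             # command byte + stop bit
        response = BIT_CELL_US + 8 * data_bytes * BIT_CELL_US + BIT_CELL_US  # start + data + stop
        return ATTENTION_US + SYNC_US + command + STOP_TO_START_US + response

    print(talk_transaction_us(2) / 1000, "ms")   # ~3.8 ms for a keyboard's 2-byte register 0

Either way it's a small fraction of a 16.7 ms frame, so the comparison against the latencies in the danluu article still holds.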
Since the author mentions doing it, a note regarding retrobright: treated plastic seems to yellow again faster than plastic that was never treated. https://youtu.be/_n_WpjseCXA
Maybe just let your items show their age.
Honestly it'd be really cool to see some repro parts for these, like an upper case (even without the Apple logo).
I junked my old AE2 ages ago and finally got a replacement today. If I knew then what I know now I would've salvaged a bunch of stuff off of it. Oh well.
Steve Wozniak was supposedly involved in the design of ADB, but it's so hard to confirm that I'm starting to wonder if it's a myth. The closest to confirmation I can find is a reference to "the ADB, created by Steve Wozniak" on Bill Buxton's input-device timeline https://www.billbuxton.com/inputTimeline.html (which at least dates back to 2011 https://web.archive.org/web/20110410220530/http://billbuxton... ) but there's no citation to support it. Any ideas, anyone?
Woz was almost certainly there when ADB was created. It's documented that he was quite involved in the design of the Apple IIGS. And ADB was created for the IIGS (right?)
But that doesn't quite mean he was involved in the design of ADB itself. And we know he isn't listed on the patents.
It's okay. It's not as good as the SIO that came with the Atari 8-bit computers, but it's alright.