Consider the optic nerve and the optic tract.
Light strikes the retina and signals fire along the optic nerves, through the optic chiasm, through the optic tracts, and into the left and right thalamus en route to the visual cortex at the back of our brain. Before the signals reach the visual cortex, they first pass through the limbic system, that is, the brain's emotional center.
By the time our brain has started to gather the shape and color and symmetry of whatever we see, long before we have words for what our eyes have met, we already have an emotional reaction. The language cortex and prefrontal cortex are almost literally the last to find out what's going on.
Riot or revolution?
When we see violence in the street, the word that appears in our mind reveals our emotional position to that violence. If we see a "riot", our heart is with the establishment. If we see a "revolution", our heart is with the protestors.
The same general principle applies to all of our senses. We are emotional creatures first and only occasionally have fits of reason. There really is no such thing as "being reasonable." We rationalize our emotional state, but we are not actually rational.
π in base eleven: 3.16150702865a4
Today's date, numerically encoded according to US conventions: 3/16/15
The time this article posted (Mountain Daylight Time): 07:02
Woo Hoo! π day all over again.
If you casually ignore the first two digits of the Year of Our Lord. And ignore that I actually faked the publication date and time. And ignore that the Year of Our Lord is at best an approximation. And ignore that there is no 0 between 1BC and 1AD on The Number line of Our Lord. And if you ignore that at least 2/3rds of the world disagrees about the "Our Lord" part. In general you kinda have to overlook that everything about this exciting temporal milestone is layer upon layer of arbitrary human convention.
I mean, except for the ratio of a circle's circumference to its diameter and the corresponding conversion from a base ten representation of that ratio to the base eleven representation.
But by all means, don't let any of that stop you from celebrating this momentous occasion with a slice of pie.
π in base twelve: 3.184809493b918
π in base fifteen: 3.21cd1dc46c2b7
π in base sixteen: 3.243f6a8885a3
I'm actually more excited about τ in base eight day: 6.2207732504205. If you arbitrarily chose a point close to the international date line, you could almost celebrate THAT day on the solstice. Tau Day.
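For the curious, these digit strings are plain base conversion. Here's a small Ruby sketch that reproduces them (the function name is mine): compute π to generous precision, then repeatedly multiply the fractional part by the target base and peel off the integer part as the next digit.

```ruby
require 'bigdecimal'
require 'bigdecimal/math'

DIGIT_CHARS = ('0'..'9').to_a + ('a'..'z').to_a

# Render x in the given base with `places` fractional digits (truncated).
def to_base(x, base, places)
  int  = x.to_i
  frac = x - int
  out  = int.to_s(base) + '.'
  places.times do
    frac *= base                 # shift one digit left in the target base
    d = frac.to_i                # that digit is now the integer part
    out << DIGIT_CHARS[d]
    frac -= d
  end
  out
end

pi = BigMath.PI(40)              # 40 significant decimal digits is plenty here
puts to_base(pi, 11, 13)         # => 3.16150702865a4
puts to_base(pi * 2, 8, 13)      # => 6.2207732504205
```

The only non-arbitrary things in sight: the ratio itself, and the multiply-and-truncate loop.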
I rushed out to catch the bus for fear of missing it. There was only one other person waiting. I needn't have worried. There's a whole story in there about unnecessary fear. But that's not today's story.
I recognized the woman waiting at the bus stop. I've seen her fairly often on the bus. Our schedules are similar.
We're waiting together for the bus. Just the two of us. It's dark out. She's looking at her phone.
I take out my phone too. I put my phone away. Feels awkward.
Then I see a couple of men walking along the sidewalk in our direction. One of them looks her up and down.
And then. As he's passing us.
His. Eyes. Locked. On. Her. Face.
Too aggressive. I thought. The moment passed with them as they proceeded along, yet her gaze seemed to follow him.
Or she might have been looking down the street to see if the bus was coming.
That felt creepy.
I should ask her if that was creepy to break the tension.
That's what micro-aggression looks like, right? Would it help if I said something?
What if he'd actually stopped walking to talk to her? I think the unspoken social contract calls for me to intervene. Never mind the social contract; my gut was already preparing to step in if things escalated.
"Move along," I imagined saying to him.
"What? Is she your girlfriend?" He asked knowing the answer.
I imagined her awkward body language at me picking a fight with a stranger to protect her from the escalating micro-aggression. Was that fear that things would get out of hand? Or was it relief to not be standing alone at the bus stop?
I should ask her if that was creepy to break the tension.
This time she was a skilled martial artist. Her body language was angry at me for assuming she needed my protection.
I should ask her if that was creepy to break the tension.
But how is my impulse to talk to her any different than his stare? Would that break the tension or just pile on? Am I just looking for an excuse to talk to a beautiful woman? Am I competing for her favor?
This time things escalate. He's armed with a knife. I wake up briefly in the emergency room. Images of my young children playing at home, then interrupted by the sound in my wife's voice as she gets The Call. The joy on their faces melts to puzzled, worried looks as I fade to black.
This time I'm waiting at the bus stop with a man. The gut check is completely different. He's got this. It would be insulting to step in. None of my business, anyway.
This time I know she's transgender. This is an unexpected variation. She's beautiful. Did he know her before the operation? There's no hello nor even a nod nor raised eyebrow. Still my gut steps in to defend. "Move along."
This time she's ugly. How does this one play out? Does he even pause in his step? Was it about her beauty? Or was it the sense of power? As he gets further away I notice a subtle weave in his path. It's dark, but way too early to already be drunk. Was this stare the best he could do for a power trip? An angry reaction to being out on a Friday night with a friend instead of a date. If he does stare and then stop, does my gut step in? Or am I only interested in competing for the favor of a beautiful woman?
Why am I still thinking about this? I thought.
Should I ask her if that was creepy?
Was she creeped out by standing alone in the dark at the bus stop with me? I was watching him, not her. Could her face have been pleading to him for protection from me?
Time itself is an infinite scroll.
you are here
wish you were here
Hey design friends:
What images do you think of when I ask you about the invisible part of design... the lines and shapes and proportions that make your design hang together in a coherent way?
I'm writing an article to teach computer geeks about design and to explain why CSS sucks as a language for designers. I need some visual support to explain the invisible. Words aren't going to cut it. Although beautiful images about typeface design would work nicely. :-)
I've got a few examples here: two from architecture, one a study for a figure drawing. These are in the right direction, but I'd love images from many other design disciplines.
Overheard: "I'm just a web designer. I don't program or anything."
Here a web designer adopts the cultural bias which values programming above design. But the bias cuts both ways. Designers are not to be trusted with code and coders are not to be trusted with design.
HTML and CSS are unfortunate consequences of this bias. In an ideal world, HTML would be purely semantic and the look-and-feel would be handled completely by CSS. Except that world doesn't really exist: HTML gets littered with extra <div> elements to prop up the design, and CSS gets littered with duplicated paddings and margins (at the very least) to adjust and control the positions of elements on the page.
And so we have grown templating languages on the server side to try to manage the deficiencies in HTML and CSS in various ways. The menagerie of HTML templating languages is beyond imagination. For CSS we now have SASS and LESS and SCSS: basically templating languages for CSS.
What the server-side languages have in common is introducing Turing completeness for languages that are not themselves Turing complete. When one language doesn't do what you want, invent another language that can be compiled into the lesser language. This is how C begat C++ begat Java and C#, which... never mind, I've gone too far already.
You can see Conway's Law at work here. The programmers and designers are on separate teams and speak different languages. So architectural interfaces are created between the teams. Code goes on this side. Design goes on that side. Over time the architectural boundary between the teams accumulates a lot of kludge on either side to accommodate the inability of the teams to really communicate. And that boundary becomes a point of friction that slows down development and growth.
CSS is especially unfortunate. It is intended for design and it completely misses the mark right from the outset. Seriously. The heart of CSS from a design point of view is the box model. Let me say that again just so you really get the complete and painful irony. The language designed for designers jams all web design into a BOX model. Designers by nature want to think non-linearly and outside-the-box and the language they've been given confines them to a hierarchical tree of boxes. Seriously. So it's hobbled as a programming language and it's a cruel form of torture as a design language.
Allow me to introduce you to the Framework Adoption Antipattern. And with it I will share some software history that you youngin's might do well to learn.
The software industry is built around cycles of new adoption. The churn creates an artificial pressure to keep up with the latest and greatest. There's always a new hotness. There's also the illusion that this time around maybe we'll get started on the right foot. Maybe this new hotness will not lead us into a tangled mess.
For those of us who've been to more than one rodeo, it's depressing to watch history repeat itself in the new hotness, just like it did in the old and busted. The next wave is super tempting. Get ahead of the crowd and you can become the hot shot writing books or speaking at conferences. The early adopters always seem like the coolest kids on the block. Added bonus: you can ditch the tangled mess you're in and start fresh. But every revolution becomes the new establishment. Which is why we keep going in circles.
Advice for a young programmer
I know how this sounds to you. I'm just old and crotchety. I don't get it. I'm part of the establishment. This is a new world. What you're building is really going to change everything.
You're right. You are going to change everything. But you will also learn the truth in the cliché: the more things change, the more they stay the same. In five or ten years you will look back at what you've created and see some depressingly familiar tangles. And there will be another new hotness. Your once revolutionary new hotness will grow up to become the new old and busted.
This story is for the long term. As an industry we still don't know how to teach what we do. The only way to learn these lessons is to join a revolution and experience the transformation to establishment. This advice-disguised-as-a-story is for programmers starting their second rodeo.
Historical background on the path to MVC architecture for web apps
Sherman, set the WAYBAC machine to 1995. It was a momentous year. Three items launched with particular fanfare: the Internet was commercialized, Sun released Java, and Netscape released JavaScript (the Netscape Navigator browser itself had been released one year earlier). The first public announcements of PHP and Ruby were also in 1995. And the first working draft of XML came in 1996. All of these things in their respective communities were the new hotness. All of them are now the establishment. I'll also mention that Design Patterns was published in late 1994 'cos it comes into the story later.
At the time enterprise computing was dominated by two-tier, client-server architecture: a fat Windows client connecting to a fat database. Over the next few years web applications would be dominated by Perl CGI and Cold Fusion and its copycats: ASP, JSP, and PHP. Sun, IBM, Oracle, WebLogic, BEA, and others jumped on the new three-tier architecture. They were selling Java middleware in hopes of breaking Microsoft's grip on desktop computing. Instead of a fat Windows client, businesses could use the web browser that's installed with the OS and move their applications onto expensive servers.
By the turn of the century, Internet Explorer had nearly won the browser wars and Netscape had been bought by AOL. On the server side, Sun and friends were facing backlash against Enterprise JavaBeans (EJBs), and Microsoft started its push to move the ASP community to .NET. Sun began evangelizing the Model 2 architecture as the new hotness: separate the display of content from the business logic. It was a fashionable pitch at the time: CSS was promising similar benefits of separating design from markup.
Sun's Model 2 marketing and MVC
It was right at the turn of the century when our cultural wires got crossed and we started using the MVC pattern to describe web architecture. MVC was a profound innovation in object-oriented user interface design from Smalltalk-80. That dash eighty refers to 1980 so we're clear that the pattern was already twenty years old at the time. In fact, MVC is used in Design Patterns as an example to help explain what a design pattern is. This was a rare moment when the new hotness was consciously applying lessons from software history.
In the final days of 1999, JavaWorld published Understanding JavaServer Pages Model 2 architecture. In May of 2000, Craig McClanahan of Sun donated a reference implementation of Sun's Model 2 architecture to the Apache Software Foundation. Struts would become the de facto standard for Java web frameworks. No question it was a terrific improvement to apply the MVC pattern to web apps in contrast to the Cold-Fusion-JSP-ASP-PHP tag soup of Model 1. And yet, and yet....
In Sun's marketing and the hype around Struts, Model 2 was described as an architecture. In every explanation the MVC pattern was used to explain the architecture. And so the Model 2 architecture, the MVC pattern, and the Struts framework were all conceptually muddled in the Java community.
And then Rails was the new hotness
Another half-decade later when Rails burst onto the scene, MVC was taken for granted as the de facto best practice for web application architecture. A new generation of programmers were introduced to web applications and MVC-as-architecture and The Rails Way at the same time.
What's wrong with MVC and Model 2 for web applications?
MVC originally lived in a Smalltalk image, which is sorta like that virtual machine you have up in the cloud running your modern web applications. Only it was a lot less complicated. Importantly the M and the V and the C were all living in the same image. When messages were passed between the different components in an MVC pattern in Smalltalk, the messages didn't have far to go.
Model 2 by contrast was full of network latency because it grew up when Sun was trying to sell hardware into the enterprise. There were browsers on the desktop and there was middleware running java on expensive hardware, and then a database (probably Oracle) running on another bit of expensive hardware.
Web frameworks have been contending with two key pieces of friction for the past decade or so. On the client side there's the statelessness of HTTP and on the back end there's the object-relational mapping to get data back and forth from a pile of object-oriented business logic on the server into a pile of relational algebra in the database.
MVC in Smalltalk suffered from neither of those key problems. Data were persisted within the image right alongside the class definitions, and the View and Controller were in direct and very stateful communication.
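To make that concrete, here's a tiny in-process sketch in plain Ruby rather than Smalltalk (the class names are invented for illustration): when everything lives in one image, a model change reaches the view as an ordinary, stateful method call -- no HTTP, no serialization, no latency.

```ruby
# A tiny in-process MVC sketch. The names are invented for illustration;
# the point is that model and view communicate by direct, stateful
# method calls within one image -- no network hop in between.
class Counter                      # the Model
  attr_reader :value

  def initialize
    @value = 0
    @observers = []
  end

  def add_observer(view)
    @observers << view
  end

  def increment
    @value += 1
    @observers.each { |o| o.update(@value) }  # message passing, in memory
  end
end

class CounterView                  # the View
  attr_reader :last_rendered

  def update(value)                # invoked by the model on every change
    @last_rendered = "count: #{value}"
  end
end

# The Controller's job -- translating user input into model calls -- is
# played here by plain top-level code.
model = Counter.new
view  = CounterView.new
model.add_observer(view)
model.increment                    # the view re-renders immediately
```

Every arrow in the pattern diagram is a synchronous method call here; in Model 2, those same arrows cross process and network boundaries.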
Ever since the Model 2 architecture co-opted MVC, Model has come to mean some object-relational mapping, View is something from the menagerie of templating languages, and the Controller... Ahh the controller...
Controller as a term is meaningless. No, it's worse than that. Controller is actively destructive. I know exactly what a Controller is, and so do you. But my Controller is different from your Controller. We're using the same word to describe completely different things. The only common ground we have is that we know there's something between the Model and the View.
MVC is not an architecture and neither is your framework
MVC is a pattern. It's beautiful and full of wisdom. It's an exceptionally good example to teach the principle of separating concerns. But the co-opting of MVC into an architectural framework effectively blinded us to the principles and left us with software dogma. And such powerful dogma that the Rails revolutionaries embraced it wholesale even as their rhetoric railed against excessive ceremony and dogma in the Java community.
If you looked at a typical Rails app you'd think that MVC and ActiveRecord were the only design patterns you need. And as applications have grown from simple foundations in Rails into enterprise-sized beasts, we hear about developers reaching for Plain Old Ruby Objects to speed up their test suites. There's buzz about refactoring away from fat controllers and fat models. Rails apps have become most of what Rails originally opposed.
What's more insidious is the pervasive use of object inheritance in web frameworks. Design Patterns has been in print for almost two decades, and it itself summarized wisdom from the previous two decades of object-oriented design. A core principle espoused therein is to prefer composition to inheritance, and yet frameworks continue to recommend that their developers inherit. This and a database schema is all it takes:
class Post < ActiveRecord::Base; end
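For contrast, here's a hedged sketch of what the composition alternative can look like (plain Ruby; the record object and every name here are hypothetical, not Rails API): the domain object wraps a persistence object instead of inheriting from a persistence base class.

```ruby
require 'forwardable'

# Hypothetical sketch of composition over inheritance: Post is a plain
# Ruby object that wraps whatever persistence object you hand it, rather
# than inheriting from a persistence base class. All names are invented.
class Post
  extend Forwardable
  def_delegators :@record, :title, :body, :save  # persistence is delegated

  def initialize(record)
    @record = record
  end

  # Domain logic lives here and is testable without a database.
  def summary
    "#{title}: #{body[0, 40]}"
  end
end
```

Because Post only depends on the duck type of whatever it wraps, tests can hand it an in-memory stand-in -- which is exactly the pressure behind the Plain Old Ruby Objects buzz.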
Yep, Rails apps are a tangled mess. Let's switch to the New Hotness.
For a few weeks I've been experimenting with some new (to me) HTML5 APIs for multitouch events, device orientation, and device motion. I'm planning to work these into my turtle graphics implementations, but needed to understand what information these sensors provide.
A couple weeks ago I got around to looking at Tim Bray's post about sensor kinetics on Android devices. Fun, and just the nudge I needed to start visualizing the inputs from these sensors.
It was a good enough start. Fiddling around with the charts in real time helped a lot. But I'm much happier with tonight's milestone. Here is an interactive visualization of the deviceorientation and devicemotion APIs. The deviceorientation controls the position of three circles on the screen. The devicemotion adjusts the radius of the circles.
I'm trying to fix a bug in the popular history of computing.
We know Alan Turing and his role deciphering Enigma codes in WWII. But can you name anyone who secured the Allied communications? Why were the Axis unable to decipher our codes? The heroes who secured Allied communications were bound to secrecy while the history of computing was being written. Ironic: their success in keeping secrets has kept their role secret too.
Over the holidays in 2001 I met Sarah's extended family for the first time. I was introduced as a computer programmer to her grandfather, Ralph Miller. "What do you know about the Internet?" he asked like the opening question in an oral exam. Over the next couple hours I remember feeling like we were modems trying various ways to handshake. He was speaking the telecom jargon of an electrical engineer who'd been retired for 20 years. I was speaking with the limited telecom knowledge left over from configuring Ascend and Cisco routers with frame relay and ISDN lines five years earlier. It was hard to find common ground.
"Grandpa's telling Eric how he invented the Internet."
"Oh good. Maybe he'll be able to explain it to the rest of us."
In retrospect, I'm profoundly lucky to have had regular conversations over the past decade with one of the pioneers of the digital age. He didn't invent the Internet. It was more fundamental than that. Ralph worked at Bell Labs on the team which created the X System, as it was called at Bell Labs, known as SIGSALY when it was in service. The National Security Agency has heralded it as the start of the digital revolution. But it and its engineers need a promotion in the popular history.
In that first conversation with Ralph, there was one point that sticks in my mind. It was one of the few places where we had common language. He said "they brought in a hot-shot kid from MIT to try to break the code. What was his name?" It was a name I'd heard before. "Shannon. Claude Shannon." After a moment of reflection he added, "He never could break it."
I think what struck me most was his tone of voice. He completely lacked the sense of reverence I'd always heard from people talking about Claude Shannon. Here was my fiancée's grandpa describing one of the demigods of computing as a bright kid, a math whiz who'd nevertheless been beaten by a math problem. There were other names which Ralph revered: R. C. Mathes, R. K. Potter, and H. W. Dudley. But Claude Shannon was just a youngster in Ralph's eyes, and given too much credit as an individual for work that was created by a very high performing team.
The cypher used in SIGSALY was a one-time pad. Shannon ended up writing a proof that the one-time pad is unbreakable. Part of the reason Shannon's initial publications on cryptography and information theory were so complete is that he'd been involved in analyzing the most groundbreaking secret communications system of the day -- a system that would remain a tightly guarded military secret for another thirty years. The implementation and essential innovation came first. The groundbreaking theory came second. But the popular history of computing was written while the implementation was still under wraps.
On Friday, Talk of the Nation interviewed Jon Gertner about his new book The Idea Factory: Bell Labs and the Great Age of American Innovation. The chapter on Shannon is a perfect example of popular history missing this key part of the story. Although the rest of the world was taken by surprise by the insights in "Communication Theory of Secrecy Systems" and "A Mathematical Theory of Communication", neither Ralph nor his colleagues were. For them, Shannon had captured the common knowledge among the engineers involved in the project.
There are a few details which Gertner gets wrong about Pulse Code Modulation (PCM), by the way. On page 127 he writes "Shannon wasn't interested in helping with the complex implementation of PCM -- that was a job for the development engineers at Bell Labs, and would end up taking them more than a decade." On the contrary, a patent for PCM was filed in 1943 by Ralph and his assistant Badgley. From the outset of the project Bell Labs were looking for a way to combine the Vernam cypher, the one-time pad which had been devised for telegraph encryption, with H. W. Dudley's vocoder which could compress and synthesize speech. PCM was the inflection point. Analog to digital. Once the signal was digital it could be combined with a random key, the essential ingredient for unbreakable encryption. It's not so much that Shannon wasn't interested. That work had already been done by mere "development engineers".
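The arithmetic at that inflection point is simple enough to sketch. What follows is an illustrative toy, not SIGSALY's actual signal path (though the real system did famously quantize its vocoder channels to six levels): quantize samples to discrete levels, then add a random one-time key modulo the level count.

```ruby
LEVELS = 6  # SIGSALY famously used six quantization levels

# Map samples in 0.0..1.0 onto discrete levels (analog -> digital).
def quantize(samples, levels)
  samples.map { |s| (s * (levels - 1)).round }
end

# One-time-pad encryption: add the key digit-by-digit, modulo the level count.
def encrypt(digits, key, levels)
  digits.zip(key).map { |d, k| (d + k) % levels }
end

# Decryption subtracts the same key back out.
def decrypt(cipher, key, levels)
  cipher.zip(key).map { |c, k| (c - k) % levels }
end

digits = quantize([0.0, 0.2, 0.5, 0.9, 1.0], LEVELS)
key    = Array.new(digits.size) { rand(LEVELS) }  # truly random, used once
cipher = encrypt(digits, key, LEVELS)
decrypt(cipher, key, LEVELS) == digits            # => true
```

Without the key, each ciphertext digit is uniformly random -- the property Shannon later proved makes the one-time pad unbreakable.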
Here's the other interesting part. Ralph turned 105 in March. I'll have a chance to visit with him again this summer. Got any questions for him? Imagine you could talk to someone like Turing or von Neumann or Shannon. What would you want to know?
Here's a list of inventors and their patents related to SIGSALY. These are some of the unsung heroes.
| Inventors | Patent | Filed | Issued |
| --- | --- | --- | --- |
| A. A. Lundstron, L. G. Schimpf | 3,897,591 | 8/27/42 | 7/29/75 |
| A. E. Melhose | 3,891,799 | 9/27/44 | 6/24/75 |
| A. J. Busch | 3,968,454 | 9/27/44 | 7/6/76 |
| D. K. Gannett | 3,893,326 | 9/27/44 | 9/28/76 |
| D. K. Gannett | 3,924,075 | 3/20/47 | 12/2/75 |
| D. K. Gannett | 3,934,078 | 5/1/46 | 1/20/76 |
| D. K. Gannett | 3,944,744 | 5/10/45 | 3/16/76 |
| D. K. Gannett | 3,944,745 | 5/10/45 | 3/16/76 |
| D. K. Gannett | 3,953,677 | 5/10/45 | 4/27/76 |
| D. K. Gannett | 3,953,678 | 5/10/45 | 4/27/76 |
| D. K. Gannett | 3,965,297 | 5/1/46 | 6/22/76 |
| D. K. Gannett, A. C. Norwine | 3,983,327 | 7/9/45 | 9/28/76 |
| H. L. Barney | 3,193,626 | 12/29/44 | 7/6/65 |
| H. W. Dudley | 3,470,323 | 6/30/44 | 9/30/69 |
| H. W. Dudley | 3,985,958 | 12/18/41 | 10/12/76 |
| K. H. Davis, A. C. Norwine | 3,024,321 | 12/29/44 | 3/6/62 |
| L. G. Schimpf | 3,394,314 | 7/17/43 | 7/23/68 |
| M. E. Mohr | 3,076,146 | 12/27/45 | 1/29/63 |
| M. E. Mohr | 3,188,390 | 12/20/43 | 6/8/65 |
| N. D. Newby, H. E. Vaughan | 3,373,245 | 8/27/42 | 3/12/68 |
| R. C. Mathes | 3,967,066 | 9/24/41 | 6/29/76 |
| R. C. Mathes | 3,991,273 | 10/4/43 | 11/9/76 |
| R. H. Badgley, L. G. Schimpf | 3,405,362 | 12/20/43 | 10/8/68 |
| R. H. Badgley, R. L. Miller | 3,912,868 | 7/17/43 | 10/14/75 |
| R. K. Potter | 3,340,361 | 7/9/45 | 9/5/67 |
| R. K. Potter | 3,967,067 | 9/24/41 | 6/29/76 |
| R. L. Miller | 3,887,772 | 6/30/44 | 6/3/75 |
| R. L. Miller | 3,965,296 | 6/30/44 | 8/24/76 |
| R. L. Miller | 3,976,839 | 6/30/44 | 8/24/76 |
Typing on phones is a drag. Programming requires typing. Game over for programming on touch screens, right? What if programming didn't require typing? Here's a first shot, very incomplete.
Still too early to call this programming. If you squint and turn your head to the side you can see something like a function with no parameters. There are no loops, branches, or variables. But it does work on a touch device.
Try it on your phone: http://dobbse.net/turtle/wander