On Leaving W3C

Today is my last day at the World Wide Web Consortium (W3C). It’s been an amazing experience working there for the past almost-a-decade (9 years and 6 months). In the time-honored tradition of reflecting on your last job as you leave it, I thought I’d jot down a few notes. It’s hard to know what to say… building Web standards has been so much of my identity, so central to how I approached the world, for so long, that it’s hard to unpack it all.

It’s pretty common for people leaving the W3C Team (as we call W3C staff) to take a negative attitude toward W3C. That’s fair; it’s a political atmosphere, and most of us pour ourselves into the importance of the mission, so we accumulate a strong emotional charge, and leaving releases that potential energy. There’s also the frustration of living in the bureaucracy (and sometimes inertia) that comes from painstaking stewardship, and wishing it could work better. But I’m making a conscious effort to focus also on the positive aspects of W3C… not just what it’s doing wrong, but how well it’s doing many things. And there’s a lot of good there, more good than bad.

For me, there’s also the regret of things left undone.

But it’s always left undone. It’s time to turn the page.

Continue reading “On Leaving W3C”

Topic of Cancer

I’m now officially a cancer survivor! Achievement unlocked!

A couple weeks ago, on July 27th, during a routine colonoscopy, they found a mass in my ascending colon which turned out to have some cancer cells.

I immediately went to UNC Hospital, a world-class local teaching hospital, and they did a CT scan on me. There are no signs that the cancer has spread. I was asymptomatic, so they caught it very early. The only reason I did the colonoscopy is that there’s a history of colon cancer in my family.

Yesterday, I had surgery to remove my ascending colon (an operation they call a “right colectomy”). They used a robot (named da Vinci!) operated by their chief GI oncology surgeon, and made 5 small incisions: 4 on the left side of my belly to cut out that part of the right colon; and a slightly larger one below my belly to remove the tissue (ruining my bikini line).

Everything went fine (I made sure in advance that this was a good robot and not a killer robot that might pull a gun on me), and I’m recovering well. I walked three times today so far, and even drank some clear liquids. I’ll probably be back on my feet and at home sometime this weekend. Visitors are welcome!

There are very few long-term negative effects from this surgery, if any.

They still don’t know for certain what stage the cancer was at, or if it’s spread to my lymph nodes; they’ll be doing a biopsy on my removed colon and lymph nodes to determine if I have to do chemotherapy. As of right now, they are optimistic that it has not spread, and even if it has, the chemo for this kind of cancer is typically pretty mild. If it hasn’t spread (or “metastasized”), then I’m already cured by having the tumor removed. In either case, I’m going to recover quickly.

My Dad had colon cancer, and came through fine. My eldest sister also had colon cancer over a decade ago, and it had even metastasized, and her chemo went fine… and cancer treatments have greatly improved in the past few years.

So, nobody should worry. I didn’t mention it widely, because I didn’t want to cause needless grief to anyone until after the operation was done. Cancer is such a scary word, and I don’t think this is going to be as serious as it might otherwise sound.

I’ll be seeing a geneticist in the coming weeks to determine exactly what signature of cancer I have, so I know what I’m dealing with. And I want to give more information to my family, because this runs in our genes, and if I’d gotten a colonoscopy a few years ago, they could have removed the polyp in the early stages and I’d have never developed cancer. (And because I’m otherwise healthy, I probably wouldn’t have gotten the colonoscopy if I hadn’t had insurance, which I probably wouldn’t have had if Obamacare didn’t mandate it. Thanks, Obama!)

Yay, science!

Future Plans

So, the cliché here is for me to say that this has opened my eyes to the ephemerality and immediacy of life, and that I’m planning to make major decisions in my life that prioritize what I truly value, based on my experience with cancer.

But the fact is, I’ve already been doing that recently, and while the cancer underscores this, I’ve already been making big plans for the future. I’ll post soon about some exciting new projects I’m trying to get underway, things that are far outside my comfort zone for which I’ll need to transform myself (you know, in a not-cancerous sort of way). I’ve already reduced my hours at W3C to 50%, and I’m looking at changing my role and remaining time there; I love the mission of W3C, which I see as a valuable kind of public service, so no matter what, I’ll probably stay involved there in some capacity for the foreseeable future. But I feel myself pulled toward building software and social systems, not just specifications. Stay tuned for more soon!

I’m optimistic and excited, not just about leaving behind this speed bump of cancer, but also about new possibilities and new missions to change the world for the better in my own small ways.


Today (Friday, 26 August), I got the results of my biopsy from my oncologist, and I’m pleased to announce that I have no more colon cancer! The results were that the cancer was “well-differentiated, no activity in lymph nodes”, meaning that there was no metastasis, and I’m cured. This whole “adventure” emerged, played out, and concluded in just a month: I heard there was a tumor, was diagnosed with cancer, consulted an oncologist, had surgery, recovered, and got my cancer-free results all in 30 days. It felt much longer!

A Life in a Day, in 2024

I woke up startled; my glasses were ringing. I was late for a telcon… again. I’d stayed up working too late last night.

I slipped on my glasses and answered the call. Several faces popped up a few feet in front of my eyes. Okay, so it was a videocon… sigh. I muted and blanked my glasses, switched them to speakerphone, and placed them on the table, the lenses vibrating as speakers. I pulled on some clothes and rubbed my face awake, trotting into the bathroom with my glasses in my left hand. As I splashed some water on my face, I heard my name called from my glasses on the counter; “Doug, did you get in contact with them?”

“Specs, delay,” I told my glasses, and my phone agent told the other participants, politely, “Please wait for 10 seconds for a response.”

Drying my face quickly on a towel, I put my glasses back on, looked into the mirror, unblanked the camera and unmuted the mic, and replied, “Hey, folks, yes, sorry about that. I did talk to them, and they are pretty receptive to the idea. They have their own requirements, of course, but I’m confident that we can fold those into our own.” I noticed my own face in the display, broadcast from my camera’s view of my reflection in the mirror, and hastily straightened out my sleepy hair.

A few minutes later, when the topic had changed, I opened a link that someone dropped into the call, and started reading the document in the glasses’ display. With the limited space available, I scanned it in RSVP (rapid serial visual presentation) mode, but quickly found it too distracting from the simultaneous conversation, requiring too much concentration. So I muted and blanked again, and walked down the hall to my office. Ensconced in front of my big screen, I re-routed the call to use the screen’s videocamera and display.

On the screen, it was easier to scan the document at my leisure. I could easily shift my focus back to the conversation when needed, without losing my place in the document. I casually highlighted a few passages to follow up on later, and made a few notes. I did the same with another document linked from the telcon, and my browser told me that this page was linked to from a document I’d annotated several months before. I marked it to read and correlate my notes in depth after the call. One thing that stood out immediately was that both documents mentioned a particular book; I was pretty sure I’d bought a physical copy a couple of years before, and I stepped over to my bookshelves. I set my glasses camera on auto-scan, looking for the title via OCR, and on the third set of shelves, my glasses blinked on one particular spine; sure enough, I had a copy. I guess I could have simply ordered a digital version, but I already had the physical edition handy, and sometimes I preferred having a real book in my hands.

My stomach started grumbling before the call ended. I decided to go out to lunch. Throwing the book and one of my tablets into my bag, I asked my glasses to pick a restaurant for me. It scanned my list of favorites, and also looked for new restaurants with vegetarian food, seeking a nice balance between distance, ratings, and number of current patrons. “I’ve found a new food truck, with Indian-Mexican fusion. It’s rated 4.5, and there are several vegetarian options. Dave Cowles is eating there now. It’s a 7-minute drive. Is that okay, or should I keep looking?”

“Nope, sounds great. Call Dave, would you?” A map popped up, giving me an overview of the location, then faded away until it was needed. A symbol also popped up, indicating that my call to Dave had connected, on a private peer session.

“Hey, Doug, what’s up?”

“I was thinking of going to that food truck…”, I glanced up and to the right, and my glasses interpreted my eye gesture as a request for more context information, displaying the name of the restaurant, “… Curry Favor. You’re there now, right? Any good?”

“I just got here myself. Want me to stick around?”

“Yeah, I’ll be there in about 10 minutes.” I headed out the door, and unhooked my car’s charger before I jumped in. My glasses showed the next upcoming direction, and the car infographics; the car had a full charge. “Music”, I said, as I drove off; my car interface picked a playlist for me, a mix of my favorites I hadn’t heard in a while, and some new streaming stuff my music service thought I would like. As I got out of range of my house’s wifi, my glasses switched seamlessly to the car’s wifi. It was an easy drive, with my glasses displaying the optimal route and anticipating shifting traffic patterns and lights, but I still thought how nice it would be to buy one of the self-piloted cars. My car knew my destination from my glasses, and it alerted me that a parking spot had just opened up very near the food truck, so I confirmed and it reserved the spot; I’d pay an extra 50¢ to hold the spot until I arrived, but it was well worth it. My glasses read the veggie menu options aloud on demand, and I chose the naan burrito with palak paneer and chickpeas; my glasses placed my order in advance via text.

I pulled into my parking space, and my glasses blinked an icon telling me the sub-street sensor had registered my car’s presence. Great parking spot… I was right across the street from the food truck. I walked over to the benches where Dave sat. “Hey, Dave.”

We exchanged a few words, but my glasses told me my order was ready in a flash. I went to the window, and picked up my burrito; the account total came up in my view, and I picked a currency to pay it; I preferred to use my credit union’s digital currency, and was glad when the food truck’s agent accepted it. “Thanks, man,” I smiled at the cashier.

Dave and I hadn’t seen each other in a while, and we caught up over lunch. It turned out he was working on a cool new mapping project, and I grilled him for some details; it wasn’t my field, but it was interesting, and you never knew when a passing familiarity might come in handy. With his okay, my glasses recorded part of our conversation so I could make more detailed notes, and his glasses sent me some links to look at later. We finished our food quickly (it was tasty, so I left a quick positive review) and walked to a nearby coffee shop to continue the conversation. While we were talking, Dave recommended an app that I bought, and I also bought a song from the coffee shop that caught my ear from their in-house audio stream; Dave and the coffee shop each got a percentage of the sale. I learned that the coffee shop got an even bigger share of the song, because the musician had played at their shop and they’d negotiated a private contract, in exchange for promotion of her tour, which popped up in my display; that was okay, I liked supporting local businesses, and I filed away the tour dates in my calendar in case it was convenient for me to go to the show.

Dave went back to work, and I settled into the coffee shop to do some reading. First I read some of the book I’d brought, making sure to quickly glasses-scan the barcode so I could keep a log; I found several good pieces of information, which I highlighted and commented on; my glasses tracked my gaze to OCR the text for storage and anchoring, and I subvocalized the notes. I then followed up on the links from earlier; my agent had earned its rate, having found several important correlations between the documents and my notes, as well as highly-reputed annotations from others on various annotation repos, and I thought more about next steps. I followed a few quick links to solidify my intuition, but on one link, I got stopped abruptly at an ad-wall; for whatever reason, this site insisted I watch a 15-second video rather than just opting in to a deci-cent micropayment, as I usually did when browsing. I tolerated the video (unfortunately, if I took my glasses off while it played, the ad would know) only to find that the whole site was ad-based… intolerable, so I did some keyword searching to find an alternate site for the information.

Light reading and browsing was fine in a public place, but to get any real work done, I needed privacy. I strolled back to my car (my glasses reminding me where I’d parked) and I returned home. Back in my office, I put on some light music, and started coding. I started with a classic HTML-CSS-SVG-JS doc-app component framework on my local box, because I was old-school, and went mobile from there, adding annotated categories to words and phrases for meaning-extraction, customizing the triple-referenced structured API, dragging in a knowledge-base and vocabulary for the speech interface and translation engine, and establishing variable-usage contract terms with service providers (trying to optimize for low-cost usage when possible, and tweaking so the app would automatically switch service providers before it hit the next payment threshold… I’m cheap, and most of my users are too). I didn’t worry much about tweaking the good-enough library-default UI, since most users would barely or rarely see any layout, but rather would interact with the app through voice commands and questions, and see only microviews; I paid more attention to making sure that the agents would be able to correctly index and correlate the features and facts. Just as I was careful to separate style from content, I was careful to separate semantics from content. At some point, I reflected, AIs would get powerful enough so that information workers wouldn’t have such an easy time making a living; I wondered if we’d even need markup or APIs or standards at all, or if the whole infrastructure would be dynamic and ad-hoc. Maybe the work I was doing was making me obsolete. “‘Tis a consummation devoutly to be wished,” I thought to myself wryly.

I put the finishing touches on the app prototype, wrote some final tests, and ran through a manual scenario walk-through to pass the time while the test framework really put the app through its paces, spawning a few hundred thousand virtual unique concurrent users. Other than a few glitches to be polished up, it seemed to work well. I was pretty proud of this work; the app gave me real-time civic feedback, including drill-down visualization, on public policy statements, trawling news sites, social networks, and annotation servers for sentiment and fact-checking; it balanced opinion with cost-benefit risk-scenarios weighted by credibility and likelihood, and managed it all with voting records of representatives. It also tracked influence, whether by lobbying, donations, or inferred connections, and correlated company ownership chains and investments, to give a picture of who was pushing whose buttons, and it would work equally well for boycotting products based on company profiles as for holding politicians accountable. As part of the ClearGov Foundation’s online voting system, it stood a chance of reforming government, though it was getting more adoption in South America and Africa than it was in the US so far. Patience, patience…

Megan came home from work with dinner from a locavore kitchen; the front door camera alerted me to her approach, and I saw she had her hands full. “Open front door,” I told the house as I rose to help her. We ate in front of the wallscreen, watching some static, non-interactive comedy streams; we were both too tired to “play-along” with plots, character POV, or camera angles, and it wasn’t really our style anyway. I hadn’t gotten enough rest the night before, so I turned in early to read; the mattress turned off the bedside light when it sensed my heart-rate and breathing slow into sleep.

Note: This story of the Web and life in 2024 is clearly fictional; nobody would hire someone who’d worked in web standards to do real programming work.

Archived Link Thunderbird Extension

This week is our first Geek Week at W3C. The idea is to have a week where we improve our skills, collaborate with each other on prototypes and fun projects that help W3C, and come up for air from our everyday duties. I’m working on a few projects, some small and some larger.

One of my projects is to make a plugin for Thunderbird, my email client of choice, which exposes the Archived-At email message header field, normally hidden, as a hyperlink. This is useful for W3C work because we often discuss specific email messages during teleconferences (“telcons”), and we want to share links to (or otherwise find or browse to) the message in W3C’s email archives. It’s also handy when you are composing messages and want to drop in links referring to other emails. (I do way too much of both of these.)
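For context, the Archived-At header field is defined in RFC 5064, and its value is simply the archive URI wrapped in angle brackets. So the heart of the extension is just extracting that URI from the raw header value before rendering it as a link. Here’s a minimal sketch of that parsing step (the function name is my own, and the surrounding Thunderbird extension wiring is omitted):

```javascript
// Extract the URI from an Archived-At header value (RFC 5064).
// The value is a URI enclosed in angle brackets, possibly with
// surrounding whitespace or line folding from the raw message source.
function parseArchivedAt(headerValue) {
  if (!headerValue) return null;
  // Collapse any folding whitespace, then match the <…>-delimited URI.
  const match = headerValue.replace(/\s+/g, "").match(/^<([^<>]+)>$/);
  return match ? match[1] : null;
}

// Example: the kind of value found on W3C mailing-list messages.
const uri = parseArchivedAt(
  "<https://lists.w3.org/Archives/Public/www-svg/2011May/0001.html>"
);
// uri now holds the bare URL, ready to be wrapped in a hyperlink.
```

The extension itself then only needs to read the hidden header from the displayed message and render the returned URI as a clickable link.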

I’ve made extensions for Firefox before, but never for Thunderbird, so this was an interesting project for me.

Continue reading “Archived Link Thunderbird Extension”

Retain Accessibility Immediately

There has been a heated argument recently on the W3C Canvas API mailing list between accessibility advocates and browser vendors over a pretty tricky topic: should the Canvas API have graphical “objects” to make it more accessible, or should authors use SVG for that? I think it’s a false dichotomy, and I offer a proposal that suggests a way to improve the accessibility potential of the Canvas 2D API by defining how SVG and the Canvas 2D API can be used together.

This brings together some ideas I’ve had for a while, but with some new aspects. This is still a rough overview, but the technical details don’t seem too daunting to me.
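To make the general pattern concrete, here’s a toy sketch of one possible pairing: record each immediate-mode canvas drawing call in a retained-mode SVG mirror, so there are real, labelable objects for assistive technology to hang onto. Every name here is my own invention for illustration; none of this is the actual proposal’s API:

```javascript
// Illustrative sketch only: mirror immediate-mode drawing calls into a
// retained-mode SVG string that could back an accessible subtree.
class AccessibleCanvasRecorder {
  constructor() {
    this.shapes = [];
  }
  // Record a circle draw, with a label for assistive technology.
  circle(cx, cy, r, label) {
    this.shapes.push(
      `<circle cx="${cx}" cy="${cy}" r="${r}" aria-label="${label}"/>`
    );
  }
  // Record a rectangle draw, likewise labeled.
  rect(x, y, w, h, label) {
    this.shapes.push(
      `<rect x="${x}" y="${y}" width="${w}" height="${h}" aria-label="${label}"/>`
    );
  }
  // Emit the retained-mode mirror of everything drawn so far.
  toSVG() {
    return `<svg>${this.shapes.join("")}</svg>`;
  }
}

const rec = new AccessibleCanvasRecorder();
rec.circle(50, 50, 20, "Start node");
rec.rect(100, 40, 60, 20, "Edge label");
// rec.toSVG() now holds an SVG fragment mirroring the canvas scene,
// which could be inserted as the canvas element's fallback content.
```

In a real implementation, each recorder call would also issue the corresponding Canvas 2D drawing call, keeping the pixels and the accessible object tree in sync.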

Continue reading “Retain Accessibility Immediately”

The Timble

Anyone who has seen Tim Berners-Lee do any public speaking knows that he speaks very quickly. Too quickly, in fact, for non-native speakers, and some native speakers, to follow along. The words seem to tumble out of him, long after his mind has moved on to the next thing he’s planning to say, and the thing beyond that. W3C’s communications lead will frequently signal him to slow down, and Tim will step down to a slower-than-normal rate of speech and slowly build back up to his own “normal” auctioneer rate.

It’s not a coincidence that he’s one of the creators of the Web. From working with him at W3C these past few years, I’ve observed that his mind does seem to spin at a few cycles faster than the norm. He makes connections quickly, and even when I don’t agree with his conclusions, I admire his ability to grasp situations rapidly, and to revise his opinions progressively as he is given more information. He also shows a remarkably humanist take on topics, not just a technologist’s. The Web, for him, was always less about the technologies involved than about the goals that could be accomplished with those tools; technology is necessary but not sufficient, just a means to an end.

And Tim is impatient to get to that end. It’s reflected in his rate of speech. It’s clear from the way he moved on from the solved problem of HTML (including XHTML and HTML5, mere refinements on the basic approach), to the idea of linked open data. People laughed at the Semantic Web a decade ago, and now companies like Google, Yahoo, and Microsoft are scrambling to put their own stamp on it, and governments are deploying it. Once again, Tim was ahead of the game, leading the pack.

On the W3C staff, we laugh about how Tim (or “timbl”, his email shortname and IRC nick) types as quickly as he speaks, with a cornucopia of typos. Sorting out the jumble is left as an exercise to the reader.

Some people can understand the spoken word at an astonishing rate. I once called a blind colleague, who listens to his screen-reading software at treble-speed, and he impatiently told me to speak more quickly. If you’re a seeing (and hearing) person, and you get a chance to listen to a blind person use their screen reader, prepare to be blitzed and dumbfounded. Paragraphs roll by at a modulated buzz, and you’ll be lucky to pick up a word or two; menu navigation is a staccato of half-spoken stutters as familiar items are tripped through like a stone skipping across water. Tim doesn’t speak that fast, thankfully… he speaks just fast enough that you have to listen carefully.

That’s why some of us on the W3C staff have developed a new unit of measurement: the timble. 1 timble is the uppermost rate of speech at which a normal person can understand what’s being said in their native language. On average, I’d guess most people speak in the range of 0.5 to 0.7 timbles; screen-readers are often operated at 2 or even 3 timbles; southerners (I live in North Cackalacky, USA) speak at about 0.4 timbles.

I recently teased TimBL about the timble at dinner in Bilbao, Spain, after he’d given a wonderful presentation at a local Web conference at a very equitable 0.8 timbles. He graciously offered an alternate definition: speech at more than 1 timble is difficult to understand; speech below 1 timble is simply boring.

Getting In Touch

Last week, I published the first draft (and subsequent updates) of the Web Interface specification, which defines touch events. This is the first spec from the new W3C Web Events Working Group.

Peter-Paul Koch (PPK) gave it a positive initial review. Apparently, others thought it was news-worthy as well, because there were a few nice write-ups in various tech sites. Naturally, cnet’s Shank scooped it first (he has his ear to the ground), and it was fairly quickly picked up by macgasm, electronista, and Wired webmonkey.

I thought I’d go into a few of the high-level technical details and future plans for those who are interested.
Continue reading “Getting In Touch”