The Four Jedi (software architects) You Meet in Heaven

Millennium Falcon at Galaxy's Edge

*With apologies to Mitch Albom, of course. We recently got Disney+, and after I finished watching The Mandalorian, casting about for what else was worthwhile, I dove headlong into The Clone Wars animated series. It’s great! And somewhere deep in my subconscious, this analogy emerged.*

After writing about the paths to Senior Developer, I’ve been thinking about what comes after that. Specifically about what comes after that at TELUS Digital, because that’s where I’m working. Important caveat here:
these are my musings, and not official TELUS Digital policy.
It should also be noted that, if you survey the industry, the types of technical roles quickly bifurcate over and over like a career river-delta: tech leads, team leads, staff engineers, fellows, principals, architects, managers, etc. So as you talk about more senior technical roles, there’s a higher chance that you’re being very specific to your current organization. I asked on Twitter about one facet of this too, as I thought about tech leads vs architects.

That I ask this question probably reveals a fair amount about how things work at TELUS Digital. But it also made me think very specifically about the delta between those roles here, as I see it. I’ll talk about the tech lead role another time, but here I want to think about Software Architects, and the kinds of skills, focus and interests that make for good candidates.

Anakin Skywalker: The Solution Architect

Anakin Skywalker is confident and quick-thinking – surging headlong into trouble and thinking creatively on his feet to find a solution to the problem in front of him. This is the kind of architect I’m most comfortable speaking about, because it is how I self-identified when doing this kind of work. It is a broad rather than deep role – a good solution architect can think creatively and laterally about a problem and propose a variety of potential approaches to solve it. I’ve seen this role advertised in many places as a web architect as well. If you’re a developer who loves the challenge of working in an unknown space, who prefers prototyping to refactoring or incremental improvements, this is a great fit. A solution architect has an obligation to keep current on shifting trends – to be knowledgeable across a varied set of technologies, frameworks, languages, etc. More important than what you do know is a deep understanding of what you don’t know. A good solution architect will know to bring in other expert voices to stress-test and verify an idea. Because of this, solution architects also need pretty solid social skills to succeed, and should work more closely with developers on a day-to-day basis than the other architect types.

Of note: if you know the story of Star Wars, you’ll know that things do not end well for Anakin Skywalker, who falls to the dark side and becomes a force for evil. While this may be really stretching the metaphor, it is a useful thing for solution architects to think about too – too often we hear of the “Ivory Tower” or the “architecture astronaut”, labels you earn if you don’t stay connected to the work happening around you.

Mace Windu: The Domain Architect

Mace Windu is a master of lightsaber combat, wise and careful with words, thinking deeply about the practice of his craft. He is a specialist, through and through. Domain architecture is a path for deep specialization and knowledge; the role is also variously called a practice lead. It is a good role for someone who wants to think long and deeply about a particular topic to develop unparalleled expertise in it. Spending time learning from, and talking to, other experts in your domain is important in this role, to ensure that you do not end up so specialized in your particular company’s needs as to lose relevance. A key duty of this architect role is to impart knowledge to the people around you – both technical and not. More so than for a solution architect, ensuring that there is a succession plan – a clear apprentice to follow on – is a good practice. Some examples of domains of specialization: Network Security, Testing, Telemetry, Performance, Data, etc. This role is still one of software, and so spending time writing and refining best-practice examples, and pushing the state of the art forwards, are key outputs from a domain architect. Leading development of mission-critical implementations within your domain is also an important duty for this kind of architect.

Obi-Wan Kenobi: The Integration Architect

Obi-Wan Kenobi always saw the underlying truth of a scenario as a way to find the way through it – never hurried, always already where others were headed. He understood, holistically, what was happening around him. This is a role for understanding what lies beneath: how different tools, architectures, endpoints, etc work together. It demands an incredible ability to keep hugely complicated systems in mind and to anticipate side-effects and unexpected outcomes from new additions. I’ve only encountered this type of architect at very large companies that have very complex, often legacy systems – a smaller-scale operation is unlikely to require this kind of specialization. Interestingly, I’ve met several excellent integration architects who came into this from a business or systems analyst role, as well as from a developer role. Integration architects, in my experience, make and maintain the best diagrams, are excellent teachers to others, and are unparalleled in their ability to reduce complexity into more digestible chunks. If you’re a developer who likes working on edge-cases, who likes to find the weird exception to every rule, or who likes to think holistically about an entire system, this is an excellent role. It is very much a role of incremental operation of a whole system, rather than rapid prototyping or deep dives. It’s a great place to iterate on a particular system for a long time. If you want to have impact at scale through small incremental changes, this is exactly that type of role.

Yoda: Principal Architect

Yoda guides the entire Jedi Council with a clear vision of what a Jedi can be and should be, helping everyone stay on the path to that vision. Including a Principal Architect may seem something of a cheat, in that I would consider this an additional tier of seniority beyond the others – certainly where I work now it is – but it isn’t always. Sometimes, it is the only architect. This is largely a visionary role: looking forward towards what might be, what can be, and what must be for organizational success. It needs an architect who can think broadly about the range of opportunities to improve technology, and who can work with other leadership to translate technological vision into business relevance. At some places, this role may well simply be the Chief Technology Officer (CTO). Somewhere large enough, there may be multiple principal architects, spread across the vast swathes of organizational complexity. A key duty of this role is to support all the other architects as well – to back their decisions, to provide rigour in decision-making, and to ensure that everything rolls up towards a coherent vision of good.

Do these kinds of roles exist where you work? There are probably all sorts of variants of this. For TELUS in particular, the software architect path is generally for people who want to stay primarily individual contributors, rather than leads/managers. A path of bits, rather than heartbeats, as it were. One thing I wish I had more insight into is the key differences between something like a staff engineer and a software architect. Do places have both? My instinct is that what I’m calling a solution architect may be akin to the staff engineer elsewhere, but YMMV.

Making Money isn’t Enough: Considerations for a digital company in 2017

I’m starting a new company, with Steve Fisher. I can’t reveal just yet the first product we’re building, but don’t worry, there’ll be more on that in the not-so-distant future. This post is focused on deeper considerations about the company we’re building and why those matter.

In 2015, at The Design & Content Conference, I watched Sara Wachter-Boettcher give a career-altering talk on designing for kindness.

In 2016, this theme was built upon at the same conference by virtually every speaker, but I want to call out two in particular who’ve stuck with me:

Anil Dash’s talk “Storytellers” — particularly the unintended consequences of design decisions, and accountability at a personal and organizational level for them. The Alexa story, starting at 3:45, is such an excellent illustration.

Ron Bronson’s talk “Designing for Empathy: Context Matters”, largely about the idea of recognizing one’s own blinders and working through them to discover other contexts, other uses. The story about Pokemon Go/Ingress, starting at 6:35, is a solid example.

These three talks in particular have informed a pretty massive shift in my approaches to code, operations, design, business, etc. Positive shifts that also come with a set of concerns that I hadn’t been as focused on.

Concerns we will address:

Geophysical Portability

We’re going to have customers, at least to start, across North America. I also recognize that where your data resides has become more important than ever. For many companies and organizations, where their data is stored and the locations of the services they use matter. In BC (Canada!), government entities aren’t allowed to use US-soil services. Governments will use our product, and so we need to ensure that every piece of code we run and every piece of data we store resides in their country. So, from the get-go, we need to write software that is portable and that works with third-party-hosted libraries, tools or services that are similarly portable. It’s an interesting and exciting constraint.
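That constraint lends itself to a simple rule: resolve every storage or service endpoint through the tenant’s home region, and fail loudly rather than silently falling back across a border. Here’s a minimal sketch of the idea — the region names and endpoints are entirely hypothetical, not real services:

```python
# A sketch of the data-residency constraint: every tenant's storage and
# service calls get routed to providers on their own soil.
# Region names and endpoint URLs below are hypothetical placeholders.

REGION_SERVICES = {
    "ca": {"storage": "https://storage.ca.example.com",
           "search": "https://search.ca.example.com"},
    "us": {"storage": "https://storage.us.example.com",
           "search": "https://search.us.example.com"},
}

def endpoint_for(tenant_region, service):
    """Resolve a service endpoint inside the tenant's home region.

    Raises instead of falling back to another country: for a BC
    government customer, a silent US fallback is a policy breach.
    """
    try:
        return REGION_SERVICES[tenant_region][service]
    except KeyError:
        raise ValueError(
            f"No {service!r} endpoint hosted in region {tenant_region!r}"
        )
```

The important design choice is the hard failure: portability means being able to add a region to the map, never quietly routing around a missing one.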

Our Privilege

We’re two middle-aged, middle-class white dudes from Canada starting a tech company that is mostly self-funded. The privilege in that is staggering. Basically, we’re walking around with blinders on when we think about the product, the company. We need to do the work to see past our own biases and blinders. First step for us is to engage with a set of advisors who are significantly more diverse than we are, to provide alternative insights and voices in our planning. AND, most importantly, pay these advisors fairly for their time.

Protecting the Vulnerable

The product we’re building exists to help our customers. But it only took a few minutes to think of how someone could abuse our product to help them advance their less-wholesome objectives. Unintentionally so, but it would be easy to use this tool for bad. It is our responsibility to build in constraints, restrictions, community, etc that could prevent this. But, how far does that go? Not everyone is going to do things I like or agree with — but they’re not necessarily bad. Nothing is politically neutral, but do we want to have “neutral” software, run by an opinionated company? We’ve seen first hand the damage that can cause with companies like Twitter. We need to dig in and discover what tools, features, guidelines, legalese we can, and should, build into the product and company. It should be part of our job to build tech that protects the most vulnerable.

Data Security

We will be tracking lots of activity within our product. This is for growth targets, sales, insights and customer support. All good reasons. And we will require user accounts. For most use-cases that’s not an issue, but I can’t guarantee that’s true for everyone who might see a use of our product.

Some guidelines we’re working from:

  1. Data security must be baked in at every level.
  2. We hold our customers’ data in trust, and that data should only be used for our collective greater good.
  3. Our customers’ data remains theirs, and they should be able to remove it as needed.
  4. Data available online is vulnerable. It is our duty both to minimize that vulnerability and to act responsibly if it is ever targeted.
  5. Privacy doesn’t mean the same thing to everyone. We must support various levels of it.

Tools & Software

We’re building the first version of this software on our own, but hope to be able to hire engineering staff quickly. Cutting edge is less important than broadly known — for both hireability and learnability. Our team will leverage, and contribute back to, existing and potentially new open source projects. Where possible, we will prefer open-source tools over proprietary (particularly given our concerns about geophysical location). We’re building a web-app first, but I expect we’ll soon see use for both a mobile-native app for at least parts, and, perhaps less expectedly, a desktop app.

HR

We need things such as revenue, first, but I’ve learned from my previous companies that hiring happens faster than you’d expect, and having values and policy in place is essential.

Some things we’re thinking about:

  1. Both Steve and I have dogs, who we want to have around the office.
  2. We have kids, and we like to see and hang out with them.
  3. We like to travel.
  4. Be able to offer valuable perks like full parental leave, childcare and flexible hours.
  5. Encourage and support community involvement and personal passions.
  6. Collaboration and conversation is better than competition and direction.

But, I don’t want an office full of perks and free things. There’s some middle ground between “recognize your staff have lives and accommodate fully” and “your staff should be treated like adults”. Be passionate, stay passionate, engage and encourage passion, but leave work at work. A few years ago, I wrote a piece about finding other paths to success (“A million dollars isn’t cool, you know what’s cool?”), and the ethos behind that still resonates with me. I want to build that into this new company.

Utility

I’m confident we’ve identified a need and a niche that we can work in, but there’s something to the axiom “the greatest minds of my generation are figuring out how to make people click ads”. There are real problems that people can, and are, solving. Teams at Code for America & the USDS are solving real problems, for real people, using tech. We need to make sure we’re contributing too — if not directly through our product, then through our expertise, our staff, our community.

A Company for Good

Starting a company is hard. Most fail quickly, so adding additional constraints, or targets could feel like a fool’s game. I’d like a company that structurally contributes back to our community (see HR, above). And I’d like a company that isn’t just building a product and selling it. Existing isn’t enough. We need to put our effort, success, and privilege towards something more. We’re not totally sure where we’ll draw the line, but it’s something to figure out. It’s too easy to leave these decisions to later, and then later never comes. We need to start these as we start everything else.


These are not in order of importance, and they jostle for position in my head from day to day. As we move forward I will write more about how we embrace these challenges and opportunities: how each point becomes not only part of the product, but part of our new company, and more a part of ourselves as partners in this. And finally, I’ll talk openly about where compromises were made, where things are left to do, and how we’re changing.

“All we have to do is go to first principles and what you all know every single time you ship something, every single time you put something out in the world. You know what it looks like when it counts. You know what it looks like when you care about it, and all we have to prioritize, one simple thing, that the story we tell includes the most vulnerable people. If we do that much then we can be worthy of all the things that people say about the transformative potential that technology has on the world.”
Anil Dash at The Design & Content Conference

Stray thoughts about this week’s Computer Announcements

On Wednesday, Microsoft announced the Surface Studio:

Immediately followed by Apple announcing the MacBook Pro (2016) the very next day:

These computers show very much where each of these companies is right now. Having recently watched Steve Jobs, it also struck me how much these companies have changed over the past 20 years – in many ways, each becoming what the other one was.

There is nothing in the new MacBook Pro that still smacks of the old “Think Different” campaigns that Apple used to live by. They’re no longer the plucky underdogs targeting the people who think different — they’re the massively popular overlords targeting their audience: everyone.

Conversely, Microsoft (despite Windows’ continued dominance in corporate installs) has more or less lost the popular mindshare of personal computing. Everyone has Macs. And so their audience isn’t everyone anymore. It’s the people who aren’t being served by Macs.

So, 20 years ago, Apple targeted creatives — who were not well served by Microsoft. Today, the opposite is true — Apple has effectively ceded the “pro” space, the people who need all the things on their systems. Michael Tsai’s excellent evisceration of the new MacBook Pro says that better than I could. The Surface Studio, meanwhile, is explicitly aimed at creatives who need something “different” than what is usually available out there.

20-odd years ago, Apple was squarely targeting the 10% of people who were not being served by what Windows offered (nb: I’m not saying they didn’t want everyone using a Mac — they certainly did, and, indeed, their current success indicates their ideas were right). They targeted the designers, creatives and thought leaders who helped turn their brand around. What’s weird for everyone who’s been a long-time Mac user is that whereas previously there was a sense that “they” were the audience for Apple — that’s no longer true. Apple now has to appeal to the very general user, in many ways the lowest-common-denominator user — what used to be the domain of Microsoft. Given their volumes of devices sold, there’s zero financial incentive to add features that appeal to only a small subset of people anymore. Conversely (and this is a pretty big cognitive shift for everyone: consumers, tech media, probably the company itself), it is now important for Microsoft to do just that: they’ve ceded the mobile space. They appear to have recognized that desktops & even laptops are increasingly niche devices, and, on top of that, that they’re no longer the default choice for most shoppers. So, stealing directly from Apple’s playbook, it looks like Microsoft is asking people to Think Different these days. Gabe (from Penny Arcade) wrote a long puff-piece review of his involvement with the development of, and use of, the new Surface Studio that is incredibly compelling.

A last thought: I’m not arguing that Apple is in any way less innovative than it was. I feel like there’s a pretty direct line of continued hardware innovation from them. What’s dramatically different is where they’re innovating. They’re no longer pushing the bleeding edge for a particular subset of users — they’re finding ways of making the bleeding edge more palatable for the mainstream.

My guess: whereas for the past 3–4 years I pretty much only saw MacBooks & iMacs on coffee shop tables & office desks, over the next 3–4 we’re going to see an increasing number of Surfaces on both too, because suddenly the industrial design of Microsoft products has caught up with (surpassed?) Apple’s, and, with Windows 10, there’s a fundamentally different, but also very compelling, approach to modern desk(lap)top computing. I’m not sure we’ve had a time (maybe the early-to-mid 80s?) when multiple computing hardware and OS companies have been firing on all cylinders, creating such compelling competition for our dollars.

Software is eating the world like…

A river cuts through sand

This adage is well-known. Based on a Google search, it would appear that Marc Andreessen has made several comments on this topic since 2011, and it has become de facto accepted as truth. And, of course, bastardized to suit (e.g. mobile is eating the world, IoT is eating the world, etc.).

I’ve been thinking a fair bit about how “regular” companies interact with software. Over the past 20 years, I’ve worked with lots of companies as they slowly embraced technologies new to them, several years after I, as someone in the field, would have thought they were a necessity – so there’s a disconnect. And, to my horror, the way in which these accounting firms, lawyers, retail shops, etc. embrace the tech now available to them is really hodge-podge and often to their initial detriment. Quite consistently, it takes an initial failed dive to get the next version “right” – which isn’t necessarily bad. But maybe there are other ways.

That thought led me down the path that maybe earlier recognition of technology trends by the non-technical would be useful, if possible. And then I started to think about how that might surface. Which brought me full circle to this adage about software eating the world, which has an interesting corollary in “the future is already here, it’s just not evenly distributed”. So, yeah. Software is eating the world – just not evenly.

… And this is where this serious train of thought goes a little off the rails: If software is eating the world, but unevenly, what’s the best simile here, and why? And could that exercise actually be useful?

Software is eating the world like the Sarlacc:
Software is more of a trap that unsuspecting industries fall into without realizing it. They’re then digested slowly, painfully, over a long period of time. This feels like an apt metaphor for the transportation industry, which was rolling along and was suddenly confronted by Uber, Lyft, etc. – and is now in the midst of a slow, painful contraction and likely death.

Software is eating the world like a river:
This is another “slow” simile, but it has an inevitability factor about it, as well as a randomness: software, like water, eats the easiest path through, and it meanders, is uneven, and, importantly, is constantly digging deeper in paths where it has already been. Think about communications software, and this feels true to me: it wasn’t everywhere, it was channels, and it entrenched (literally?) quickly, and kept redoubling its victories in those spaces as it moved and opened new channels.

Software is eating the world like Galactus:
Galactus is the “world devourer”, a giant space entity who roams the universe and, when hungry, eats entire planets, destroying everything there and leaving nothing as he moves on. Apart from the immense disruption that occurs when software arrives at a new industry, I’m not sure how apt this is: software is mostly transformational, rather than destructive (although maybe telephone operators and transcription artists would beg to differ). But I like how strongly this simile conveys how radical the arrival of new software in an industry can be.

Software is eating the world like a vulture:
With this simile, the idea is that when software enters a new industry, that industry is actually already dead – it just doesn’t know it (maybe?). When design software started replacing the letter-setting and manual typesetting industries – were they already in decline? A lost art? This doesn’t feel like an appropriate simile in most cases to me.

Software is eating the world like a cheetah:
(Or any big hunting cat.) I quite like this simile, because it conveys intent, as well as the thrill of the hunt. When software enters an area, is it because there was an intent to disrupt/enter that space and remove it? The old “find a Linux command and build a web service around it” adage of Web 2.0? Although, the chase between a big cat and its prey implies an element of chance that doesn’t really seem to exist. Has software ever failed to eat an industry once it has started down the path?

Software is eating the world like Prometheus’ Eagle:
I feel like this is an apt simile for software’s relationship to the media industry. Media, like Prometheus, has done so much, but is now left out, chained to a rock, while software slowly pecks away at it every night, only for it to regenerate every morning and continue on its way. It implies an immortality to the media industry, which I feel is true, but it denies the transformational nature of software: the media is not regenerated, unchanged, each morning. It is, rather, transformed. But this constant pecking and picking at an industry that is suffering, left exposed, rings somewhat true.

Software is eating the world like…:
Like what? Probably in different ways, in different industries. Software is a many-headed hydra with innumerable ways of attacking a new enemy. But it is important to understand both that software is inevitable in whatever your industry is (to pervert an idiom: the only things certain in life are taxes, death & software eating you), and also that, in each of the original stories above, there are survivors, heroes, escapees and even beauty. Most yoga professionals I’ve met hate all the work that takes them away from their passion – and resent technology for it. But I’ve also learned that I can write software that lessens their time away from their passion, and that reduces resentment, because it can still bring joy and delight. Software that reduces grunt-work without eliminating the whole job is often appreciated. Bringing software into an industry, into an office, into a single person’s life in a way that yes, perhaps does make what they used to do redundant, but also, yes, provides them new tools to do more, or do differently, in a newly rewarding way is a good thing, not something to fear.

Thinking about Twitter


I like Twitter a lot. I’ve been using it for nearly a decade now. It’s an indispensable part of my professional and personal social life. Chances are, you’re reading this after seeing a link on Twitter. But Twitter is having some problems – some of which I think are real, some less so. And because punditry is fun, here are a few thoughts about Twitter, product opportunities and the issues at hand.

Scale & ongoing growth

I’m going to be upfront about this: I don’t think continued growth is always necessary, or even a good thing. Twitter is big. Not Facebook big, but that’s probably ok. There’s lots of commentary about the lack of user growth being essentially a death-knell for Twitter. That’s possibly true – but it becomes true if you say it’s true. Having run a company, I knew early on that there was a sort of ideal size for what I was trying to do with the company. We had the opportunity to grow much beyond that size, and resisted, because it changes the nature of the company.

Twitter’s like that. I’m not advocating that it regress in size, but I will say that as it has grown, there was a tipping point – for me it was in mid-2013 – when the signal-to-noise ratio became unwieldy and I needed to change how I used Twitter. So – what if there are only 300 million users? That’s still roughly the size of America’s population. And certainly some people don’t believe in incessant growth of the national user base, or they’d be clamouring for more immigrants, right?

Part of Twitter’s problem, as it relates to users, is that in the modern free-app world, the only way to make money is advertising, and advertising demands ever-more eyeballs to make money – which in turn seems to drive down the value of each eyeball, so…huh.

But maybe Twitter could spend more time examining how to monetize its own users more. Also: I should preface this by stating I don’t actually know how Twitter makes money, besides selling data access and advertising. They may already do some of this. Looking back at the original WhatsApp model, they charged $1/year per user. What if Twitter did that? Or provided a free model, which works as is, but charged $1/month to power users? Or significantly more for corporate or verified users? Or, like Facebook, charged business users for reach within their own network (maybe they do this already?) when they post? There’s likely a tonne of money to be made just by re-examining access, stats, limits, etc. on existing users. And I strongly suspect that Wall St would be much quieter about user growth if per-user revenue and overall revenue were growing.

Of course, implementing all of this would certainly cost them users. Would celebrities still tweet if it cost them $$$ to reach all of their followers? Who knows. But looking at experiments like YouTube Red, and other paid-access social networking, there’s clearly money being left on the table.

Changing the product itself

There’s lots of complaints about ongoing Twitter changes:

  • Moments suck
  • I don’t like Hearts, I like Stars!
  • Quote? Retweet?
  • Longer Tweets?
  • Longer DMs?

Etc. Essentially every change to Twitter has brought much derision. But Twitter’s own UI and feature set have been undergoing more or less constant tweaks and refinement since it started. A major difference now is that since Twitter cut off its own developer network at the knees, these changes have been top-down, rather than bottom-up – so they feel imposed, rather than organically grown. But even still, the Twitter community has managed to build cultural tools, with tweetstorms and fun quoted-tweet rabbit-holes and other items. And even when I don’t like an individual change, I love the feeling that Twitter is a growing, changing organism willing to explore fundamental changes to itself. While there’s a lot of worry about changing the “core” of Twitter – this ongoing change, to me, is “core” Twitter. It’s been changing since the day I joined. But! I do, wholeheartedly, believe that Twitter needs to find a way to re-embrace its own community. It will be very hard, nigh impossible, to win back the trust of developers at this point. But, with a massive leadership change underway, there is a window to do so.

Abuse and dealing with it

I’m writing this as a white, cis male – so let it be known that I don’t experience this much. But friends do, and I’m certainly well aware it’s out there. And lots of much smarter people are working really hard to confront this. But, strangely, Twitter doesn’t seem to be doing much. So, here are some of my own thoughts:

  • Where’s the “Akismet” for Twitter? A user-subscribable service to auto-filter inbound responses? Most of my mis-directed mentions are Scots angry at their local TV station – it’s pretty easy for machines to learn that this isn’t relevant to me or a recent tweet. So – quarantine them for me. In-app, give me a spot to review them if I want. But otherwise, just let me ignore them.
  • Network-rating? When someone I don’t know/follow joins a conversation, one of the first things I do before I decide to respond is check: a) are they an “egg”? (I ignore all eggs – why this isn’t a feature in and of itself is beyond me); b) what’s their following/follower ratio?; c) what’s their tweet count?; d) do we have common networks of followers/followees?; e) what’s the content of their recent tweets? Now, I don’t actually do all of the above every time. But you know what? A robot sure could. It could give me a score based on these (or other) criteria, and let me decide what sort of score would let their tweet come through or not.
  • Banning is, for me, a very problematic tool (free speech and all). But blocking should be easier and work better. And yes, should probably have better in-network tool too (ie, if this person is blocked by X% of people I follow, then…)
  • A large part of the problems I read about stem from piling-on across networks. I don’t have a great solution here – even if you don’t see the abuse, it could still be public and thus affect others, and even get back to you. Tools like Slack’s purported public-broadcast channels could be interesting here: if your account is “public-broadcast” only, then no one can mention your account (or something) – people you follow could still DM you, possibly even mention you. That might reduce direct abuse, but not sub-tweeting (and the sundry offline extensions that, TBH, I don’t even know where to begin with).
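The network-rating idea above really could be a robot. Here is a minimal sketch of that scoring heuristic; to be clear, everything in it is hypothetical: the field names, weights, and thresholds are illustrative, not any real Twitter API.

```python
# A hypothetical scoring heuristic for deciding whether a reply from an
# unknown account should surface. All field names, weights, and
# thresholds are illustrative placeholders, not a real Twitter API.

def network_rating(account, my_network):
    score = 0

    # a) default-avatar ("egg") accounts are heavily penalized
    if account["has_default_avatar"]:
        score -= 50

    # b) following/follower ratio: following thousands while followed
    #    by almost no one looks bot-like
    ratio = account["following"] / max(account["followers"], 1)
    if ratio > 10:
        score -= 20
    elif ratio < 2:
        score += 10

    # c) tweet count: brand-new or empty accounts are suspect
    if account["tweet_count"] < 10:
        score -= 15

    # d) shared network: mutual connections are a strong trust signal
    mutuals = len(set(account["followers_sample"]) & my_network)
    score += min(mutuals * 5, 25)

    return score

def should_surface(account, my_network, threshold=0):
    """Each user picks the threshold for what reaches their mentions."""
    return network_rating(account, my_network) >= threshold
```

The per-user `threshold` is the point: the service does the scoring, but the user decides how aggressive the quarantine is. Check e) — the content of recent tweets — is deliberately omitted here; it would be a text classifier layered on top.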

Other Thoughts

  • As above: Letting users decide how interactive Twitter is for them is potentially useful: Twitter for large-follower users is already essentially broadcast – why not create an explicit channel/toolset for broadcast vs conversation?
  • As I tweeted last night: Liking/Favouriting is fine, but I’d really like two additional tools. First, a way to “mark” a tweet with a single emoji – the stats about which emoji I use for what could surface all sorts of interesting data. Second, a way to “react” to a tweet – again, a single/short emoji would work well here, like Slack reactions – which would go back to the tweeter, but perhaps not as a public tweet; perhaps, similar to poll responses, it would show up in notifications. But maybe this, again, would be available as public data, so that when I look at a tweet I could see the various reactions to it – just not directly in the timeline.
  • I, personally, love the idea of long-tweets, or tweets with embedded stories. What if every Medium article I posted would auto-embed in my related tweet? Great! Feels like a win for everyone. I was unsold on embedded images/videos at first, but those are generally good. So why not longer text?
  • Media likes Twitter. Twitter likes Media. There’s probably a very interesting intersection here about the end of TV, channels-as-apps, Netflix or Twitch-style streaming…and Twitter. Periscope is definitely down this path. But how much money could there be if, say, I could pay monthly to ESPN to get access to a particular Twitter feed of ESPN programming, directly in my Twitter app? Or if I could make money by streaming through Twitter – particular tweets (or whole accounts) that people had to pay to see? (Beyond streaming, there’s simply an interesting Patreon-style model in here for supporting interesting writing, etc, directly in Twitter.)
  • Seriously Twitter: let go of app-control. Let other people build various use-case-specific or even general apps. Simply enforce feature-sets, so that as tools get added to Twitter, apps need to update too, if it applies to their use-case. Many now-common interactions grew out of app developers experimenting with how to display Twitter.
  • I’ve always thought of Facebook as AOL, Twitter as IRC. Both are, in their own way, walled gardens, but one is a super-controlled, curated, corporate walled garden, while the other is more of a slightly-off-kilter free-for-all that is still self-contained. Twitter almost feels like a utility, or a protocol, rather than an app to me – which probably explains why it has such a hard time dealing with Wall St – it is a fundamentally different tool than many of the other social-network competitors (who, increasingly, are just Facebook), and needs to find a way to revel in that.

So, Twitter. I love you, but you’re bringing me down. But we’re all in this together, so let’s figure this all out, so you can become a sustainable company and I can go on loving you.

Design & Content Conference: A sort-of review

So let’s get this out of the way up front: I have a lot of bias regarding the Design & Content Conference:
– My company was a sponsor of the conference.
– The host & organizer, Steve Fisher, is a good friend of mine.

I’m not going to review all the talks — the level was very good, with 2 exceptions — but I wanted to highlight some as I talk about the “structure” of talks and what I like and don’t. This conference, as I experienced it, consisted of 4 types of talks: The Story, The Essay, The Review & The Editorial. I’ll use example talks to illustrate each. It should be noted I’m not a public speaker (though I’ve been thinking a lot about giving it a shot), and it’s entirely likely I’m making up terms for actual styles that already exist and are described elsewhere, so… well. I’m already here, so let’s just dive in.

The Story

The Story talk is where the speaker guides you through one or more related tales, and uses these tales to illuminate her point. Sort of like fables of old, where the story contains important lessons. These are almost always my favourite talks, because I love stories and parables.

Sara Wachter-Boettcher’s talk, let’s just get this out of the way, was one of those really special, you-just-had-to-be-there, life-changing sort of experiences, which makes it hard to review in any way. As soon as video is up, go watch it. Better yet, find out where she’s next giving this talk and pay more or less whatever it takes to see this. Seriously. That good.

Sara’s talk consisted of 3 short, personal stories, all relating to the central theme of kindness. Her slides, as was common in story-style talks, were additive to the talk, but not required. Progressive Enhancement, if you will.

The Essay

The Essay talk is, to my mind, the most basic kind of talk. Think of your standard university essay with an intro, several key points, and a conclusion. That. They tended to be more technical, and, as a group, tended to use slides to explain points (or to repeat what they were saying). I feel like this might be sort of a “beginner’s” format, or a safety format. With the exception of Rebekah Cancino & Eileen Webb’s talks, which rose above, these talks were the least interesting to me. One, they tended to be drier; but two, I don’t really want to have to read lots of text on a slide, and information-dense slides were common amongst these. Really, nothing is more off-putting than a speaker half-turning their head to read the slide out loud to me.

The Review

James White’s talk, Design Renegade, should have been one of those god-awful masturbatory “portfolio review” talks that artsy designers seem to like to give. But. It wasn’t. It rose above, way above, thanks mostly to his unbridled energy, humour and pace. It reminded me very much of a talk by Aaron Draplin I saw a couple of years ago. Packing hundreds of slides into a rapid-fire cross-section of his professional and personal explorations, it was wonderful – although I suspect that how my inner 13-year-old, comics-obsessed self identified with it helped. I spoke to several others who were much less enamoured with the talk. Given that the portfolio-review talk is very much about the speaker, the success of these will likely be directly impacted by two things: how much you like the speaker’s work, and how much you like the speaker. Self-hagiography is a dangerous game.

The Editorial

This style I named because it straddled a line somewhere between the story and the essay. While the essay talks were very linearly structured, and stories were pretty loose, the editorial tended to talk around a particular point, and was more opinion than essay (which had a veil of objectivity), without the flourishes of the story. In particular, Denise Jacobs’ & Parker Mclean’s talks fell into this category. Denise was opining on how to be creative through banishing your inner critic (oh how this talk spoke to me!), while Parker was giving a whirlwind tour through how & why to make accessible content & design decisions. Denise’s slides tended to be illustrative of her points (often literally, with heavy use of photos), while Parker’s slides seemed, for the most part, to be punchlines, or accessories to his jokes. This style of talk greatly appeals to me intellectually: while it demonstrates clear expertise, it couches it in experience & opinion, so it comes across as less didactic than the essay-style talks.

Special Mention: The Click-bait

I want to take a moment to talk about Jared Spool’s talk, which was great (as his talks are — this was the fourth time I’ve seen him speak). Very much an Editorial-style talk, but he’s sort of the Buzzfeed of this — in a good way. Everything is set up with a slide that shows some obscure image, and Jared asks the audience what they think it means — “Look at this graph — You’ll never guess what happened next!” — over and over. Which…when I’m thinking back, really should have been annoying. And, perhaps it was. I don’t particularly like being asked an actual question that is really rhetorical. But. And it’s a really big BUT. It was awesome. I learned a lot about metrics (and how/when to ignore them/go beyond them), it was hugely entertaining, and a great way to close the conference.

In Toto

Really, it’s hard to believe that this was the first time this conference was put on — it was so smooth, so well run. Live captioning! Excellent, diverse, high-calibre topics! Tasty lunches! A party at Science World! The venue itself isn’t fantastic — no coffee allowed in the auditorium, and it was split across two levels with some odd narrow hallways — but the team at the venue clearly is great. Linking the conference to two other Vancouver events, the Style & Class meetup & Creative Mornings, was an interesting initiative to showcase Vancouver’s culture. I’m less convinced of the Creative Mornings inclusion (caveat: I didn’t attend Creative Mornings), only because the post-event chaos made the start of Day 2 a little weird.

When it’s announced next year, definitely grab a ticket. I hope I can still be involved somehow too.

Just-in-time disk

I feel like this is an ongoing complaint from me – but it’s really time for someone – some combination of corporations and likely-existing technologies – to solve the issue of mobile computing + storage.

Here’s my standard problem:

I have, currently on a Drobo in a corner of my home office, my Lightroom catalog. It’s just over 2.5 TB in size – I take a lot of photos. But! I’m not often at my home office. Most of the time, when I’m taking photos, I’ve got my phone, my iPad, my laptop. & I want to edit photos. Lightroom Mobile syncs mobile photos through the cloud to my main Lightroom catalog (on said Drobo).

But – my primary computer is my laptop. It’s got a pretty beefy 500GB drive. And, because it’s a laptop, it’s not always in the same place as my Drobo (it’s actually never plugged into my home Drobo – but it is sometimes plugged into one at my office). And because my laptop has a retina screen and the home iMac does not, the laptop is an infinitely superior experience for editing photos.

  • So how can I access all those photos, that catalog, on my laptop, when I’m editing?
  • How can I import photos, while travelling somewhere, and have those sync back to my primary photo repository?

When Adobe announced Lightroom Mobile‘s Sync feature, I was pretty excited – this felt like exactly what I wanted. I take photos on my phone, and, magically, they get pushed to my primary catalog. I can also selectively pull photos from that catalog for editing on my iPad (which is also a lovely editing experience). But!

  1. I have to be on my primary catalog computer to indicate which photos I want to push back
  2. You can only sync to 1 primary catalog
  3. You can only use this to push from mobile devices to a computer, not from 1 computer to another.

I also use Dropbox, and pay for 1TB of storage. This is great! I long ago set up the automatic backup of photos to Dropbox – however, as I discovered today, this only works if the camera-uploads folder is synced to my computer(s). Which means all those photos are also taking up precious space on my laptop. It turns out I have just over 60GB of photos in my camera-uploads folder – which, at 1/5 of the total storage on my laptop, was a nice gain to NOT sync. But now the automatic photo-upload feature doesn’t work. Which is dumb – because I don’t want ALL of my camera-upload photos to be synced locally. The last, say, 5GB? Sure. But not everything forever in history.
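The policy I want is simple enough to sketch: keep only the newest N bytes of a folder on local disk, and let everything older live cloud-only. This is a hypothetical illustration of that recency budget – Dropbox offers no such knob:

```python
# Sketch of a recency-based sync budget: keep only the newest files in a
# folder locally, up to a byte budget. Hypothetical policy, not a real
# Dropbox feature.

import os

def files_to_keep_local(folder: str, budget_bytes: int) -> set[str]:
    """Return the set of file paths to keep synced locally: newest first,
    until the local-storage budget is spent."""
    entries = []
    for name in os.listdir(folder):
        path = os.path.join(folder, name)
        if os.path.isfile(path):
            st = os.stat(path)
            entries.append((st.st_mtime, st.st_size, path))
    entries.sort(reverse=True)  # newest first
    keep, used = set(), 0
    for _, size, path in entries:
        if used + size > budget_bytes:
            break  # budget spent; everything older stays cloud-only
        keep.add(path)
        used += size
    return keep
```

With a 5GB budget, last month’s camera uploads stay on the laptop and everything older quietly becomes cloud-only – which is exactly the behaviour the automatic photo-upload feature currently can’t coexist with.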

So Dropbox doesn’t really solve this problem either.

 

So, here’s what I’m thinking should be possible:

The Adobe Solution

With Creative Cloud, I already am bought into their ecosystem. & Clearly, with how Lightroom Mobile works, they have a way of sending me “pointers” to a photo, without the actual file. So, let’s complete this circle, Adobe. Let me say where I want to keep my photos – this might be in your cloud, this might be a cloud service, this might be a drive attached to a particular device. Then let me open Lightroom anywhere – one of several computers (I currently regularly use 4 different traditional computing devices), mobile devices (where I regularly use 3 different devices). Let me plug in a camera or card, and add photos to my catalog, but edit where I am. If this means some back-and-forth syncing, so be it. Show me small “previews” of recent photos, perhaps a simple text-catalog of older ones until I request more. But stop making me think of how/where I’m adding photos and just let me work. For bonus points, let me easily/selectively share access to my catalog with friends & family.

The Apple Solution

Any third-party solution could be made easier if Apple let me configure and reserve a certain portion of my hard drive as a sort of cloud-storage scratch disk – so that when Adobe needed to download photos, it knew it could safely remove anything else in that space to give me access to what I need – sort of like how memory management currently works. By default, maybe take 10% of the computer’s hard drive – but again, let the user tweak these preferences.

Related, but different, would be a built-in Apple-y way of mapping cloud storage as a local folder or drive – indeed, I hope this is really the promise of iCloud – but it’s certainly not there yet. It works this way, for the most part, with Apple apps – Pages, Numbers, etc – where things are just stored there and we no longer care about “where” it is locally – but this needs to “just work” with other services, and needs to not be completely closed off (so that I could make “my” cloud be a device I control, like in the scenario above) (Also: this is never going to happen. Apple likes walled gardens). But in the same way Mail supports built-in Gmail configuration, I should be able to configure *any* cloud storage as just a place I can drag & drop to like any other folder – a little like how Dropbox works, but without the local copy of *everything*.

The Dropbox solution

This, to me, feels like it should be the easiest, but I could well be wrong. Dropbox, across its various storage tiers, now offers vastly more storage than most computers have (on the assumption that most computers are laptops, and new laptops in general seem to have <1TB of storage – particularly if you’ve gone solid state). And Dropbox helpfully offers “Selective Sync”, which prevents this silly duplication of files locally and on their servers for particular folders. But if you turn off sync for a folder, it can break useful Dropbox features, like automated Camera Uploads & the screenshot-sharing service. Which is dumb. Because I really do want all my photos backed up to Dropbox – but I don’t want to have to keep every last photo locally too in order to do that.

& again, in order to find what I want, I often don’t need the actual files. Some kind of index of files that integrates with services like Spotlight is all I need, so that when I do need a particular file that exists only on Dropbox, I can find it easily, download it and do what I want, before returning it to Dropbox and removing it from my local drive again.

Wrap up

I realize as I write this that what I am describing in many ways is a thin client. But, because connectivity is not ubiquitous, thin clients aren’t totally a fit – I need some local storage/access, but not all the time. For most common scenarios, I can predict when I need to download things locally, and when I can rely on the cloud to store things. And that’s really what I mean by “just-in-time” disks – it feels like there’s very little need for any content files to exist only locally, but they might need to occasionally. When cloud storage is so cheap as to be essentially free, and laptop hard drives are still so expensive, why do we keep pushing everything onto local disks? And when multi-device computing is so common, why is user-controlled sync across devices still so broken?

Identity: Solve this

  • The computer I use to edit my photos & videos is the home iMac. It’s where my Drobo is, it’s where I’ve installed Lightroom, it’s where we keep our iPhoto library. For various reasons, this Mac is tied to Leah’s iCloud account. Our Apple TVs and my iPhone’s Photostream are tied to my iCloud account. There’s currently no way I’m aware of to get photos from Lightroom into my Photostream without linking this computer to my account. Solve this.
  • We currently have in our possession 3 iPhones & 2 iPads. The old iPad 1 is more or less Liam’s portable gaming device – Leah uses it for recipes, but I’ve more or less moved to using the office’s iPad 3 for all my browsing/email needs. Because we download all our iOS apps through 1 account – my account, as a family account – all 5 iOS devices are tied to it. This somehow or other led to my Game Center account being on all devices. My Pocket Planes game, which I play on my iPhone, somehow overwrote Liam’s Pocket Planes game, which he plays on the iPad. There were tears. I’ve now created & installed Liam’s Game Center ID on the old iPad – but now I can’t play games with my account there. Solve this.
  • Liam is 7, highly literate, fairly computer savvy, and interested in gaming & creativity apps. He doesn’t understand passwords. He doesn’t really get working in a budget yet. I’d love to give him the ability to buy some apps regularly, without my intervention – or at least my direct intervention. I’d love to have a daily review of his App Store activity so I could, as a last resort, retroactively remove an app, but not have to provide up-front approval/password entry to our shared iOS account, thus removing any semblance of freedom. Solve this.
  • I’m currently working on a web app/social tool that will be used by kids as young as kindergarten, all the way through high school seniors. This means that many of the users of the app are pre-literate. However, we have to maintain identity. How do I create a secure login system for users that might not know how to spell their name, let alone have an email address or be able to remember a password? How can this then scale, linguistically and visually, as these users grow and become high-school seniors themselves? Solve this.
  • I try and use Stv as my online handle across all social networks. This means I often end up signing up for early access, alphas, betas, etc, just so that I have a chance to reserve this 3-letter handle. Many networks don’t let me have a handle that is only 3 letters. Some networks, like Twitter, can’t figure out that when someone from Scotland is addressing their local TV network (known as STV), they’re not addressing me. In the real world, I personally know 5 other people who go by the name of Steve. We are rarely confused for each other. Online identity & handles need to be as flexible. If I want people to address me as Stv online, that should be possible across all networks, regardless of how the network itself identifies me. Solve this.

Fixing “View All”

Lots of websites paginate their stories – we all know the so-called justifications for why – so I won’t expand on that argument. However, many also helpfully include a “view all” link at the footer of each page so that if you want to view the whole story on one page you can (aside: with read-later tools like Instapaper, or Reader mode in Safari, this might be moot now, but I still use it a fair bit).

Unfortunately, these “View All” links are totally, and totally unnecessarily, broken. Have a look at this recent Wired article: “3-D for your ears”. It’s split into 3 pages, and at the bottom is one of those links. But when I click on it, it reloads the entire page, and leaves me at the top, not where I was.

I understand their desire to do a whole-page refresh – in the game of page-view counts, one more view is presumably a good thing. But their internal CMS must also know how/where a page splits – as it provides a split option. And we have the ability in HTML to link to any arbitrary point on a page via an anchor link. So why not tag the bottom of each page with an anchor – something as basic as “page1”? The View All link could then jump me straight to where I left off via that anchor. You still get your extra page view, but you don’t annoy me in the process by making me find where I was in the now much larger content block.
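Concretely: since the CMS already knows its page boundaries, stitching the view-all page together with per-page anchors is a few lines of templating. A sketch of the idea (the data and URLs here are hypothetical, not Wired’s actual system):

```python
# Hypothetical sketch: a CMS that knows its page splits can emit the
# view-all page with an anchor at each boundary, and each paginated page's
# "View All" link can jump the reader past what they've already read.

def build_view_all(pages: list[str]) -> str:
    """Join page chunks, tagging each boundary with an id like 'page2'."""
    chunks = [pages[0]]
    for n, body in enumerate(pages[1:], start=2):
        chunks.append(f'<a id="page{n}"></a>\n{body}')
    return "\n".join(chunks)

def view_all_link(current_page: int) -> str:
    """The 'View All' link at the bottom of page N jumps to where
    page N+1's content begins on the combined page."""
    return f"/story/all#page{current_page + 1}"

article = build_view_all(
    ["<p>Part one.</p>", "<p>Part two.</p>", "<p>Part three.</p>"]
)
```

The publisher still gets its extra page view on `/story/all`; the reader lands at `#page2` instead of scrolling back down to find their place.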

I think the best solution for these “View All” links would be to simply load the rest of the content into the current page via an AJAX call – so the content simply stretches down. Less bandwidth for everyone involved, and a significantly more “refined” feeling in practice – but as I said, that doesn’t help the page-view-count game.

Because these view-all solutions are currently such an annoyance, I find myself clicking the “Reader” button in Safari on almost every site I go to that paginates its stories. But for much the same reason I was never big on reading sites in RSS readers, I don’t particularly like this interface. I like the content to be snuggled in the chrome it was intended to be in.

File structure

Sign on an RE office
Old-Fashioned Service, New-fangled Tech

I have always been super-anal about file structure & organization. I loathed big-bucket directories. I would carefully arrange applications into categorical subfolders, and my documents were all organized by project in subfolders. My music has always been organized by letter->artist->album, with the large letter folders being further sub-organized. The idea was that I would never be looking at any folder list with more than around 30-40 items. Not sure how or why this happened. Likely years and years of working in DOS with tiny monitors and hating having to paginate listings.

But I realized lately that I essentially *never* browse my filesystems anymore. If I’m looking for something, I search. And I don’t care where it is, because Spotlight search is so fast & efficient. We have historically had a pretty strict file structure for web files too – for software packages, this is still useful. But we used to do it for assets as well – and now, more and more, we’re rsyncing uploaded files to big-bucket CDN/cloud servers anyway, so our precious file-system organization is no longer useful. And because there’s often a software layer between where files are stored and browsing them, we can use software to categorize the display in a vastly more useful manner than folders – this is the glory of tagging & meta-data.

And so, after my external HDD died at home, taking my home music library with it, when I started to rebuild my iTunes library I did something that would have been anathema to me just a few years ago – I let iTunes organize the files internally, however it likes, & have set up watch folders for it to auto-import as I download from eMusic and whatnot. And it is very liberating. I haven’t pulled the trigger on it yet, but I might just do that for photos too, getting away from the year/month/day/file structure I’ve been keeping since 1996.

The future is scary and cool.

 
