Jul 31, 2025
As engineers move into senior roles (title dilution notwithstanding—“senior” here means many years of practice), their work becomes less and less about writing code all day. It’s more about judgment than logic. They’re shaping systems, guiding teams, making decisions that ripple through architecture, culture, and future projects. They’re solving fewer “how do I write this” problems and more “what’s the right thing to build and how do we build it responsibly” problems (note the alignment of this statement with LLM-assisted programming—a different topic for another time).
So it’s strange—honestly, absurd—that when interviewing for these roles, we still reach for the same tired approach, typically some combination of live coding, whiteboarding systems design, random technical questions, and/or rushed hypotheticals. As a proxy for real conversation, it’s a worn-out strategy for predicting whether someone is right for the role.
The Interview as Performance
For seasoned engineers, technical interviews can be a challenging format for learning anything useful. It’s a performance—a show of nerves and endurance rather than ability or judgment. When you ask someone to rebalance a binary tree or walk through a caching strategy in an interview, what you’re really asking is how well they can adapt to an artificial environment. Can they quickly decode what’s expected? Can they perform under pressure? Can they think out loud in the “right” way?
That might be helpful if you’re hiring stage actors, but for thoughtful, accomplished engineers who are used to solving real problems over weeks or months, it’s just noise. Worse, sometimes it’s a power move, especially when the candidate is very senior. In that situation, throwing down technical puzzles isn’t about assessment—it’s about control, and it’s toxic.
Why Do We Keep Doing It This Way?
Because it feels fair—or at least, it feels standardized. Everyone gets the same kinds of questions, the same time limit. It creates the illusion of objectivity. But fairness isn’t the same as insight. Just because something is easy to score doesn’t mean it’s meaningful.
There’s also inertia. We’ve done it this way for years, it’s how the big companies do it, and isn’t it reasonable to expect everyone to know the same things? But that’s bias, not purpose. At best, it reflects a narrow definition of engineering competence. At worst, it’s just lazy hazing.
It’s not easy to get off the stage and into the real world. It can be intimidating to really talk shop with someone you just met, especially a battle-hardened senior engineer. Asking questions where you don’t already know the answer takes vulnerability. The standard bag of tricks offers comfort and control.
Don’t Ask Trick Questions—Talk About the Work
Real signal comes from lived experience. What did this person build? Why that way? What broke? What would they do differently? What tradeoffs did they make? How did they define “done”?
If you listen with interest and curiosity, you’ll get more insight from a single war story than ten rounds of technical tests or whiteboard sessions. Ask what failed in production. Ask what tough call they had to make. Ask what they learned. If the conversation is honest, you’ll come away with a real sense of how they think and lead.
If your team isn’t equipped to have that conversation—if they need a script, actors and a theatre instead—then the problem isn’t the candidate.
Jun 28, 2018
Growing up in Kentucky, I fell in love with video games, one coin at a time, in arcades. There were so many great games that inspired me. Early on it was Space Invaders, Asteroids and Galaxian. I was hooked by the relentless pursuit of a new high score.

I turned over my first million points on Galaga, a game that would become a favorite, not too long after its arrival in our little town.

I was so into arcade games that, after being hired to paint the logo of our new local arcade (that’s me, on the left), every quarter earned from the job found its way back to the arcade within weeks. Of course, the owner of the arcade already knew that’s what would happen.

That jet-fueled teenage competitive streak transformed into a much richer gaming experience by way of D&D. I was already lost to story and plot, improvisation and character when, wet behind my elf ears, I started college. I majored in art and theatre, but spent more time playing D&D and fiddling with computers and coding than in class or acting in plays. I enjoyed being a DM — but I absolutely loved battling wits with other DMs as a player.

I began to think about how I might blend the excitement of twitchy arcade mechanics with something like world-building in D&D, and that deceptively simple thought was what began a life-long relationship with creating and programming video games, starting with the humble TRS-80. The TRS-80, by today’s standards, was an exceedingly modest machine, and perhaps not all that well-suited to video games. But I managed to pound out my first adventure game on it, complete with battle mechanics and stat bonuses, in BASIC. The programming constraints were unimaginable by today’s standards.

I left college with a year to go, dreaming of working in video games. I moved to Indianapolis, Indiana and looked for companies making games, a futile quest at the time. I decided to keep making games on my own while working various other odd jobs for a few years, then went to college again, this time to study mathematics and creative writing. It was an unusual combo — I’m pretty sure I was the only student there doing it, and my advisers didn’t quite know what to think. I’d been a math and puzzle geek since my first algebra class in 7th grade, which was profound and revelatory, almost a religious experience; on the other hand I loved to write, mostly short fiction, and had some talent for it.

I TA’d my last two semesters, including a new calculus class where the professor would use Mathematica in the lab four days a week, while I would work through problems with chalk in hand every Friday from 9 am to noon. I thought it would be a cakewalk — who would want to work problems on the board on a Friday morning? I couldn’t have been more wrong — not only did most of the class show up, but half the students from the same course in two other time slots started coming. It turned out that learning advanced calculus on a computer was not an easy thing, and the prof was so focused on the shiny goodness of graphing and playing with equations in software that there was never any time to practice.

After graduation I took a job at one of the biggest tech-like companies in Indy, Macmillan Publishing, developing reference books on programming, networking, and the newest technology on the block, the Internet.

I kept tinkering with games and graphics in my spare time, mainly on Windows. I can’t tell you the number of little games I wrote and programmed, though in all honesty most of them were more like tech demos. But with each new idea, OS version and language/compiler iteration (mostly C), I became more and more interested in graphics, eventually spending as much or more time finding ways to optimize rendering as programming game logic. I became obsessively interested in tools and 3D authoring and rendering, including a brief descent into the magnificent rabbit hole that was the Amiga (which, by then, was no longer even a supported platform).

At work, I moved up, and found myself producing video games in Macmillan’s small software division. We did mid-range and value PC titles and add-ons, and we had some real hits (and plenty of duds, too). I worked on some of the first early 3D games for the lower end of the PC market. I went to my first GDC, my first E3, then back again each year with a half-dozen programming/platform conferences in between. I got to meet stars like John Carmack and Sid Meier. I began to understand how the industry was evolving, what players valued, the different genres, game mechanics, gameplay.

I also played a ton of games on PC and consoles, and from Meridian 59 onward, was hooked on MMOs. A lot was going on both in gaming and with the Internet. Macmillan was willing to take some chances on new business models, and I was in the right place at the right time. We started a new business for distributing add-on levels for popular PC games; RealmX was a highly ambitious, very early attempt at a form of DLC, something now commonplace, but it failed spectacularly. We then created an even more ambitious web product called InformIT, which was arguably the first online collection of professional technical books on the Internet, including books on games. It survives to this day. I’ll never forget the weeks it took us to finalize the first-cut of the data model. By the time we were finished, we had hundreds of sheets of whiteboard paper wrapping every wall in the office.

But by then it was 1999, and I was ready to up my game.

My first job in Silicon Valley had nothing to do with games, but it was a foot in the door. I was hired to help relaunch a large hotel reservations website, both the content system and the server framework. Back then there was no Google infrastructure or AWS like there is today — you had to roll your own on top of other, relatively nascent, software. One of the most important things we did was to switch the back-end from Microsoft’s IIS to Apache — a decision prompted by the absurd reality that we needed two full-time, on-call engineers whose primary job was to reboot the server every four hours.

In six months we were done, and by that time I had turned back toward gaming, to a startup in Mountain View. Staccato Systems developed an audio subsystem for the PC that replaced a $27 wavetable chip on sound cards and also was used to create unique audio effects in PC games. I came in with a focus on helping the games side of the business and wound up coding applications to make the core technology accessible and usable by game developers including EA, Lucas and a few others. It was remarkable tech — physically-modeled, logically-controlled audio at a granular level. The engineers I worked with there were absolute geniuses (and there was no shortage of egos), although it never failed to amuse me that, at the end of the day, they were mostly hard-working hackers, like most people who do anything authentically novel in software. Staccato’s technology was first licensed by AC97 Codec manufacturers SigmaTel and then Analog Devices. It was sold to Analog Devices for $30M in 2001.

Around the same time as the acquisition, a whole new game market was starting to make waves — mobile games. I started programming feature phone games and eventually moved into smartphones, around 2007 when the iPhone landed. Companies I worked for, and helped lead, won awards. We brought dozens of titles to market, including high-profile mobile games like Guitar Hero Mobile, Duke Nukem Mobile and Prey Invasion. I started to get a little recognition. I spoke at GDC a couple of times. I was a gameplay programmer, a senior software engineer and engine architect, then a VP of Production, then a CTO. Through it all I was continually amazed by the talent and dedication in the industry, an industry that was going places it had never been!

These days I’m still working on games and tools, but I get to hop around a bit more from project to project. Not long ago I helped bring a wonderful children’s game to Unity/HTML5 and before that spent over a year working on a mobile casino game, right after a couple of years engineering a large framework for performing, essentially, extensive mobile CAD functions in Unity.
There’s almost always something new and exciting to do (right now it’s VR/AR/MR/XR — yes, the acronyms never end!), though there’s nothing like a great new stealth project, or prototype, or a new take on an old shader, or a fresh API. So much to do, so little time! I’m still in love, and I’m comforted by the thought that my best game projects are ahead of me.
Aug 27, 2016
Fabulous project and write-up by one of the nicest guys in the video game business. Not to be missed.
Dec 28, 2014
I’ve been reading Paul Graham’s essays for many years and almost always find something insightful. His latest post, Let the Other 95% of Great Programmers In, is no exception.
However, more great programmers will not help Silicon Valley.
Most US companies are based on a strongly-typed hierarchy whose evolutionary path is entropic and bureaucratic. This means shallow leadership, ineffective hiring practices and the inability to identify and reward greatness. A programmer cannot be a commodity if his or her value is dependent on this cluster-fuckery, and as a non-commodity he or she is indistinguishable.
I wish that Graham didn’t think of programmers as commodities to begin with. Maybe he doesn’t, but I don’t know how he could have written the essay otherwise.
Mar 22, 2014
Dear Spaghetti Coder,
I grasp that you can’t be bothered with declaring your allegiance, once and for all, to a particular brace style. I know you like to “mix it up”.
I understand your need to never delete anything and instead leave in long blocks of old, unusable, commented-out code. You never know when you might need it.
I realize that there is never time to make real comments in your code, particularly anywhere near your numerous long, difficult switch cases. It’s not your fault you had to hard-code all those strings.
I know you must be clever since you use so many inexplicable, often funny, variable names. You’re such a show off.
I see that you keep re-writing the same unoptimized, two-to-four-banger nested loop functions over and over again instead of wrapping them all into a single, elegant function. You like to flex your muscles.
I’m sorry that you’ve been hurt bad by the Tab key. You’re appropriately working out this issue in your code instead of paying for expensive therapy.
And I grok your single-minded desire to arrange your code in such an arbitrary fashion that we get to Treasure-Hunt our way through your gamified tome. It must give you endless hours of DungeonMaster-like pleasure.
Sincerely,
He Who Must Fix All Your Crap And Make It Actually Work
P.S. I think you should consider a move into management. No, really.
Jun 30, 2012
Draw Something has continued to do very well in the App Store and now we’re seeing more derivatives — apps and games with basic painting and sketching capabilities. Last weekend I had some fun playing around with a basic painting setup, just to see how much brush (pun intended) I’d have to go through to get to the painting picnic.
On iOS, there are really only a couple of ways to implement a painting app — Quartz or OpenGL ES (there’s a nice little walk-through using Quartz here, and Apple put together a cool little OGL example called GLPaint here).
It should be relatively clear that the OGL approach is cleaner, more flexible and a bit faster. But while GLPaint is a nice place to start, it’s not very app- or engine-friendly in the context of a full-fledged OGL app. Since the point of GLPaint’s example code is to demonstrate how to do basic painting, it has no need to consider the rest of your OGL surface’s render loop, nor does it concern itself with other important engine pieces, like your OGL redundancy state checker, sorting and synchronization issues between render and update calls, and most importantly, clearing the buffer.
That last bit really is most important because a nicely-performing painting app should never clear the buffer. Doing so will quickly slow things down to a slide-show. This should be obvious: In order to paint to the screen — whether you’re using GL_POINT_SPRITE_OES or rolling your own quads — you’ll need to draw a ton of sprites on-screen to get a continuous line of color and/or texture. If you clear every frame, you have to re-draw every frame, and voila, you’ll have molasses in less time than it takes to launch the simulator. If you don’t clear, you’re only drawing a handful of new sprites each frame.
The GLPaint example does this — it doesn’t clear the buffer. However in a real-world app, you must clear every frame in order for anything else — GUI elements/textures, mesh rendering, camera changes, etc. — to work. Hence the conundrum — you need a nice, normal clear/render loop but you also need a render-only call each time you want to paint.
Luckily there’s a straightforward solution: paint to a texture, then render that texture in your normal render loop. Setting up a texture to paint to is easy: just attach it to an FBO. For example, a full-screen texture buffer:
- (GLenum) CreateRenderTexture
{
    // Size the paint texture to the full screen.
    m_texW = [[UIScreen mainScreen] bounds].size.width;
    m_texH = [[UIScreen mainScreen] bounds].size.height;

    // Create an FBO and a texture to serve as its color attachment.
    glGenFramebuffers(1, &m_texFrameBuffer);
    glBindFramebuffer(GL_FRAMEBUFFER, m_texFrameBuffer);

    glGenTextures(1, &m_texTexture);
    glBindTexture(GL_TEXTURE_2D, m_texTexture);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, m_texW, m_texH,
                 0, GL_RGBA, GL_UNSIGNED_BYTE, NULL);

    // Attach the texture to the FBO and report whether it's complete.
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, m_texTexture, 0);
    return glCheckFramebufferStatus(GL_FRAMEBUFFER);
}
From there it’s simply a matter of drawing to the texture and only clearing the texture when you need to, e.g., by calling a function like this:
- (void) StartTextureRender:(BOOL)clear color:(COLOR)color
{
    // Redirect all subsequent drawing into the paint texture's FBO.
    glBindFramebuffer(GL_FRAMEBUFFER, m_texFrameBuffer);
    if (clear)
    {
        glClearColor(color.r, color.g, color.b, color.a);
        glClear(GL_COLOR_BUFFER_BIT);
    }
    glViewport(0, 0, m_texW, m_texH);
    // setup ortho matrix, render and client states here...
}
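The flip side of that call is drawing the painted texture back into the default framebuffer during the normal render pass. A minimal sketch of what that might look like is below, assuming a fixed-function (ES 1.x-style) pipeline like GLPaint's; the method name and the m_viewFrameBuffer handle are my own placeholders, not part of the code above.
// A sketch only: bind back to the on-screen framebuffer and draw the paint
// texture as a full-screen quad. m_viewFrameBuffer is assumed to be the
// view's own framebuffer object.
- (void) DrawPaintTexture
{
    glBindFramebuffer(GL_FRAMEBUFFER, m_viewFrameBuffer);
    glViewport(0, 0, m_texW, m_texH);

    // Identity matrices so the quad can be specified directly in clip space.
    glMatrixMode(GL_PROJECTION); glLoadIdentity();
    glMatrixMode(GL_MODELVIEW);  glLoadIdentity();

    static const GLfloat quad[] = { -1, -1,   1, -1,   -1, 1,   1, 1 };
    static const GLfloat uvs[]  = {  0,  0,   1,  0,    0, 1,   1, 1 };

    glEnable(GL_TEXTURE_2D);
    glBindTexture(GL_TEXTURE_2D, m_texTexture);

    glEnableClientState(GL_VERTEX_ARRAY);
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glVertexPointer(2, GL_FLOAT, 0, quad);
    glTexCoordPointer(2, GL_FLOAT, 0, uvs);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
}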
One of the cool things about Draw Something is that it records and replays your drawing. This is relatively straightforward functionality to implement, and GLPaint kinda-sorta does it as a nice bonus. However, their implementation is on the oddball side and a bit shy of readable, that-makes-sense-to-me production code. A clearer way to implement it is to do a standard 2D lerp between the current touch (as you move your finger on the screen) and the last touch, then record the time between finger-down and finger-up for playback later. For instance:
- (void) Draw:(float)x y:(float)y
{
    if (numVerts == MAX_PATH_VERTS) return;
    end = Vec2(x, y);
    dist = Vec2Dist(start, end);
    num = (int)dist;  // roughly one point sprite per pixel of movement
    if (num > 0)
    {
        startVert = numVerts;
        // clamp so we never run past the vertex buffer
        numVerts = MinInt(numVerts + num, MAX_PATH_VERTS);
        num = numVerts - startVert;
        for (int i = startVert; i < numVerts; i++)
        {
            Vec2Lerp(&verts[i], start, end, (i - startVert) / dist);
        }
        time += [Engine GetTimeElapsed];  // accumulate stroke time for replay
    }
    start = end;
    [self DrawRender];
}
Below is the entire class (note that several of the types are structs defined elsewhere in the engine, but you get the gist).
// INTERFACE
#define MAX_PATH_VERTS 20000

@interface Path : NSObject
{
@public
    VEC2     verts[MAX_PATH_VERTS];
    int      numVerts;
    int      startVert;
    VEC2     start;
    VEC2     end;
    VEC2     cur;
    Texture* texture;
    COLOR    color;
    float    size;
    float    tick;
    float    time;
    int      num;
    float    dist;
    BOOL     replaying;
    int      vertCount;
    int      curVert;
    int      endVert;
}

@property (nonatomic, readwrite) BOOL replaying;

- (id) initWithColorTextureSize:(COLOR)c texture:(Texture*)t size:(float)s;
- (void) DrawStart:(float)x y:(float)y;
- (void) Draw:(float)x y:(float)y;
- (BOOL) Replay;
- (void) ReplayStart;
@end
// IMPLEMENTATION
@implementation Path

@synthesize replaying;

- (id) initWithColorTextureSize:(COLOR)c texture:(Texture*)t size:(float)s
{
    if (!(self = [super init])) return nil;  // assignment, not comparison
    numVerts = 0;
    texture = t;
    color = c;
    size = s;
    return self;
}
- (void) DrawRender
{
    if (num > 0)
    {
        // Render just the newly-added verts as textured point sprites.
        glEnablePointSprite(GL_TRUE, size);
        glSetTexture(texture.index);
        glSetColor(color.r, color.g, color.b, color.a);
        glSetVertexPointerEx(&verts[0], sizeof(VEC2), 2);
        glDrawArrays(GL_POINTS, startVert, num);
    }
}
- (void) DrawStart:(float)x y:(float)y
{
    if (numVerts == MAX_PATH_VERTS) return;
    // Finger-down: seed the path with a single point.
    verts[0] = start = end = Vec2(x, y);
    numVerts = num = 1;
    startVert = 0;
    time = 0;
    [self DrawRender];
}
- (void) Draw:(float)x y:(float)y
{
    if (numVerts == MAX_PATH_VERTS) return;
    end = Vec2(x, y);
    dist = Vec2Dist(start, end);
    num = (int)dist;  // roughly one point sprite per pixel of movement
    if (num > 0)
    {
        startVert = numVerts;
        // clamp so we never run past the vertex buffer
        numVerts = MinInt(numVerts + num, MAX_PATH_VERTS);
        num = numVerts - startVert;
        for (int i = startVert; i < numVerts; i++)
        {
            Vec2Lerp(&verts[i], start, end, (i - startVert) / dist);
        }
        time += [Engine GetTimeElapsed];  // accumulate stroke time for replay
    }
    start = end;
    [self DrawRender];
}
- (BOOL) Replay
{
    if (replaying)
    {
        // Advance the playback clock, clamped to the recorded stroke time.
        tick = Min(tick + [Engine GetTimeElapsed], time);
        curVert = endVert;
        // Reveal verts in proportion to elapsed playback time.
        endVert = (int)Min(Lerp(0, numVerts, tick / time), numVerts);
        dist = Vec2Dist(start, end);
        end = verts[endVert];
        for (int i = startVert; i < endVert; i++)
        {
            Vec2Lerp(&verts[i], start, end, (i - startVert) / dist);
        }
        start = end;
        int count = endVert - curVert;  // verts revealed this frame
        if (count > 0)
        {
            glEnablePointSprite(GL_TRUE, size);
            glSetTexture(texture.index);
            glSetColor(color.r, color.g, color.b, color.a);
            glSetVertexPointerEx(&verts[0], sizeof(VEC2), 2);
            glDrawArrays(GL_POINTS, curVert, count);
        }
        replaying = (endVert != numVerts);
    }
    return replaying;
}
- (void) ReplayStart
{
    curVert = 0;
    endVert = 0;
    replaying = (curVert < numVerts);
    if (replaying)
    {
        tick = 0;
        time = Max(time, 0.001);  // guard against divide-by-zero in Replay
        start = verts[0];
        end = (numVerts > 1) ? verts[1] : verts[0];
    }
}
@end
An NSMutableArray of multiple instances of this class is kept by the caller; each instance is born on finger-down (where we set color, brush texture and size) and dies on finger-up. Replay is easy — essentially just a programmatic rendering of the verts that were previously recorded while iterating over the NSMutableArray, handled with a flag in the Render() function. Below is the basic idea.
- (void) TouchDown:(float)x y:(float)y
{
    if (curSize == -1) // we're erasing here
    {
        // Erase by clearing the paint texture to the background color.
        [Engine StartTextureRender:YES color:curBackColor];
    }
    else
    {
        [Engine StartTextureRender:NO color:curBackColor];
        // A new Path is born on finger-down with the current brush settings.
        curPath = [[Path alloc] initWithColorTextureSize:curColor
                                                 texture:curTexture
                                                    size:curSize];
        [paths addObject:curPath];
        [curPath release];
        [curPath DrawStart:x y:y];
    }
}
- (void) TouchMove:(float)x y:(float)y
{
    [Engine StartTextureRender:NO color:curBackColor];
    [curPath Draw:x y:y];
}
- (void) TouchUp:(float)x y:(float)y
{
    curPath = nil;
}
- (void) Render
{
    [Engine RenderToTexture];
    if (replaying)
    {
        [Engine StartTextureRender:NO color:curBackColor];
        if (![curPath Replay])
        {
            // Current path finished; move on to the next recorded path, if any.
            curPath = nil;
            for (Path* m in paths) { if (m.replaying) { curPath = m; break; } }
            replaying = (curPath != nil);
        }
    }
}
One of the cool things about using OGL for painting and sketching is that you can very easily change up the brush texture, for nice Photoshop-like texture brushes (care should be paid to how you set up the blending, however, due to pre-multiplied alpha on iOS). While it’s possible to do this with Quartz, it’s much easier to grok using OGL. And of course you can do silly/fun stuff like paint a background behind your 3D orc model (maybe there’s a game idea in there somewhere, hmm — ok, maybe not).
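For what it’s worth, the blend state that tends to work for pre-multiplied-alpha brush textures is sketched below; the helper name is my own, but the GL calls are standard. Because the texture’s color channels have already been scaled by alpha, the source blend factor is GL_ONE rather than GL_SRC_ALPHA.
// A minimal sketch (not the engine's actual helper): blend setup for
// pre-multiplied-alpha brush textures. The source color is already scaled
// by alpha, so use GL_ONE instead of GL_SRC_ALPHA for the source factor.
static void SetPremultipliedAlphaBlend(void)
{
    glEnable(GL_BLEND);
    glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA);
}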