I grew up in a little town in the western part of Kentucky, in a stereotypical farm economy just east of the breadbasket. My family was decidedly rural and we were farmers — soybeans, corn, tobacco, gardens, cows, horses, tractors, fences.
My dad wasn’t especially good at it — his reach often exceeded his grasp — but he managed to hold things together. Farming is as tough as you’d imagine (to quote a friend from high school who is still farming, “the American farmer is the biggest gambler on the planet”). I loved the pastures and creeks and woods, but I hated the all-consuming nature of farm work — it’s an endless amount of very hard manual labor, and I never warmed up to it.
Whether I liked it or not, I was stuck being a “farm kid”, but I had my coping strategies: reading, writing, music, games, electronics and eventually, programming. Books and stories were my first big bulwark against it all, and I was a manic reader of anything I could get my eyes on. In grade school I was a precocious, over-achieving book lover — bored to death and annoying to my teachers.
SRA was a weird bright spot. SRA cards were essentially the gamification of reading, the brainchild of Don Parker, who in the '50s successfully pitched the system to Science Research Associates Inc., a small Chicago-based publishing company that developed vocational and aptitude tests. (The SRA Reading Laboratory Kit was first published in 1957 and eventually wound up at McGraw-Hill in 1989, where it still lives today, with over 100 million served.)
SRA consisted of large boxes filled with color-coded cardboard sheets, each including a reading exercise. A student would work on a card independently of other students, then grade their own answers to multiple choice questions. Correct answers meant moving ahead until you leveled up to the next color.
The reading was short and dull, but it was a glorious grind through the colors as fast as possible. Its foundation was pure behaviorism — the pleasure of moving to a new color, or being sent to “the box” as a reward for some other good behavior.
This is a perfect example of a complicated, rather than complex, system, the epitome of gamification. The ostensible goal is to become better at reading comprehension, but the adaptive response that emerges is to become an “expert” at short-term, rote memorization. It has the effect of training the reader to navigate a complicated space in an ordered manner (“read, test, reward”) — an attention-limiting exercise that leaves little time to explore a richer, more complex space (for instance, “read, discuss, reply”).
For kids like me it was something new to combat boredom, but ultimately it did little to make me more capable as a young reader and thinker. It simply made me specialize, which is what all gamification systems do.
Unfortunately, much of what passes for a game these days, especially on mobile, is really a gamified app. These games sacrifice complexity for task completion, choice for selection, and meaning for mastery. Skill is purely a function of correct responses rather than the deeper learning that accompanies a well-designed risk/reward game mechanic. This is not unlike the problems inherent in Social Media, as discussed in this wonderful piece by Jordan Greenhall. Cherry-picking from the article:
In a truly complex environment, we are always empowered (and indeed often required) to generate novel (creative) actions in response to perceived circumstances. In other words, our field of choice is unbounded and, therefore, symmetric to the unbounded field potential of the complex system in which we are living. We are thus challenged to and trained to improve our responsive capacity to complex circumstances.
In a complicated environment, we are ultimately engaging in the very different mode of simply selecting the “right” or “best” action from a finite list. This is an optimization game, and while it can be extremely useful when competing in finite complicated environments (e.g., Chess) it is a capacity that is oblique to creative response. Therefore, again, the basic problem is that meaningful (and widespread) participation in this kind of platform is training our agency away from capacities that are truly adaptive and towards a narrow specialization for particular complicated games.
For the video game industry to mean something beyond profits, we’ve got to get better at generating complex gaming experiences. We must forgo the desire to slam together twitchy mechanics in favor of more thoughtful design that brings back the magic of “play”, that improves our players’ “agentic capacity”, as Greenhall puts it:
In the case of complexity, the optimal choice goes in a very different direction: to become responsive. Because complex systems change, and by definition change unexpectedly, the only “best” approach is to seek to maximize your agentic capacity in general. In complication, one specializes. In complexity, one becomes more generally capable.
In order to do this at a meaningful scale, we must first be able to rely on funding sources to give us the time we need to design and implement better games. This means that publishers have to be willing to share more risk than what is typical in the industry right now. Second, we need producers and designers to be more thoughtful and eschew the gamified design tropes that have arisen over the last decade.
If you’re making games for a living — whether you’re writing checks or cashing them — start challenging yourself to go back to the days of video game yore, when our industry looked a little more lovingly at quality. Ask yourself if a complicated ecosystem, already saturated with hundreds of thousands of gamified apps, needs another one.
Newzoo has mobile gaming on track for an estimated $70 billion in global revenues this year — 51% of the total game market. As a mobile game developer since the early '00s, I’m hardly surprised — but I am in awe. This is a big market.
iOS and Android are stronger than ever as the Coke-and-Pepsi rival platforms, publishing and distribution have re-structured themselves, and thousands of new developers have come and gone since the launch of the iPhone. We’re deep in the throes of a world we tried to imagine almost two decades ago, a world of wireless distribution over large-scale networks comprising a big chunk of the overall game market.
What’s left of the old retail world of physical boxes on shelves — much like video and music — are remnants of a bygone era, nostalgic, archaic, boutique. Video games don’t quite live and die by the graphics sword as they once did, either, by high-polygon counts or texture memory, and the heady days of big-dollar budgets, huge teams and deep pockets to fund them are fewer and farther between. We continue to have higher-performing GPUs, smaller footprints and new display technology. We’ve got plenty of devices and form factors, and we’re doing more at the shader level. In particular, VR/AR/MR/XR are bringing, albeit more slowly than anyone wants to admit, new ways to experience games.
But the fundamentals — the look, feel, depth, mechanics, gameplay — are less a function of technology than they are slaves to the markets that have evolved from the platforms. Big productions and high quality don’t compete nearly as well in this world. Yes, we continue to see small numbers of deeply designed, cinematic-quality games — and lest we forget, Steam has kept PC gaming alive — but they’re no longer the mainstream.
And there’s the rub, because the biggest problem we face is now our biggest market segment — mobile games. I’ve said before that the thorn in mobile gaming’s side is discovery, which is almost exclusively a function of the platforms, which are in turn a function of two things: Free-to-play (F2P), which allows players to play without paying (or with the illusion of not paying) and pushes developers to spend a large part of their time micro-managing monetization, and what I call Free-to-create (F2C), which is the low barrier to entry to game development. By themselves, F2P and F2C are beneficial, desirable, worthwhile. Together, they may be destroying what it means to be a game developer.
Like its grandfather the game demo, F2P works by enabling players to optimize for avoiding a false positive: It costs players nothing to try before they buy. If they’re in love, they’ll invest time and money; if not, they don’t feel quite so duped or dumped or disappointed. F2P is more sophisticated than dear old grandpa though and relies on a careful in-game IAP (In-App Purchase) plan. Developers must take care to give players the right combination of free and paid experiences unique to the game. Risk/reward mechanics, quantity and timing can be very tricky to nail, and add significant costs to development.
Poorly implemented IAP misses crucial opportunities to make money or, on the flip side, players feel ripped off or manipulated. More than ever, developers must be at the right place and at the right time to become profitable, but for the user, it’s a seemingly endless source of low-risk gaming potential.
F2C in the purest sense is a wonderful thing — anyone, with some effort, a little money and a reasonable amount of time can design, develop and deploy a game or app that functions like a game. The new developer gets to learn something exciting and rewarding (design and programming), gets the distinction of doing it, and dreams of having a hit.
The platform gets the benefit of massive amounts of content to sell, theoretically for every conceivable taste, desire or need that billions of potential players may have. This makes the platform more popular, draws new players, makes current players stickier and encourages developers to believe that they’re competing for consolidation. The opportunity cost is so low that, not only do novel games emerge from unlikely, would-be developers, the sheer size of the developer base greatly increases the chance of game content that more closely or clearly taps into current cultural trends and preferences. We’ve seen it many times already on both iOS and Android, despite the overwhelming number of games made with shallow or amateurish content.
Where’s the Value?
In the heyday of feature phone games (2001-07), mobile games were mostly P2P (Pay-to-play), developed by professional teams who sold them for a fixed cost — the retail model. This was also the case in the early days of smartphone games.
The charts were more volatile than they are today. There were far fewer games, and players voted with their money up-front. Developers were diligent, and an MVP was closer to a full-on release candidate than a beta release. They still had to do a solid job of writing advertising copy, producing image assets and providing support. They could also price their games higher and had a much better chance of being noticed on the deck. (Development was also more frustrating due to the number of different phones and operating systems and APIs — but that’s another, much longer, discussion).
Back then developers did their best to score coveted first-party deals with device manufacturers, but most either self-funded or had publishing partners who were ready and willing to fund them. The platforms were new and the distribution all-digital, but the traditional retail publisher-developer model worked much as it always had.
Things changed with the arrival of the iPhone. A handful of developers saw early on the potential for massive amounts of traffic via F2P, and their successes attracted more amateurs, which was only possible because of the low barrier to entry — F2C. The platforms were flooded with new games, a large number of which were half-conceived or half-implemented, or both. This quickly had the effect of cramming the digital shelves so full that P2P (with some exceptions, like Angry Birds and Infinity Blade) could not compete with F2P.
In fact there was a period — nearly all of 2010 — when an explosion of very good games that would have otherwise been P2P were free with very little IAP. This created a small group of lottery winners — developers who pulled massive traction, bolstered by the new mobile gaming press and incremental improvements to the platforms’ storefronts. These crucial early hits trained players to expect higher-value F2P, but when the smoke cleared developers still had to pay salaries — so they dove more deeply and cleverly into IAP.
Fast-forward to today, where we’re now at over a million active games across the iTunes App Store and Google Play, growing at a rate of hundreds of new game submissions per day, most of which are F2P or have an F2P option in addition to P2P. Practically speaking, it’s absurd. Imagine a million titles in a physical retail store — if you spent just one minute per title reading each description, you’d be trapped in the store for almost two years. (Further imagine that every game on the shelf is free — you can just take it home and try it.)
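For the skeptical, the back-of-the-envelope arithmetic behind that “almost two years” figure works out like this (a trivial sketch, nothing more):

```python
# One minute per title, for a million titles, read back-to-back with no sleep.
MINUTES_PER_TITLE = 1
TITLES = 1_000_000

total_minutes = MINUTES_PER_TITLE * TITLES
total_days = total_minutes / (60 * 24)   # minutes in a day
total_years = total_days / 365

print(f"{total_days:.0f} days, or about {total_years:.1f} years")
```

And that's with zero time budgeted for actually playing anything.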
The result of all of this is that, unlike the old retail model in which developers tended to focus on quality, successful mobile developers had to become masters of low budgets, test markets and market triage, frequent updates, daily engagement with players and of course, IAP. Otherwise they were just playing hit-game roulette. The bottom line is that we now have a mobile genome full of junk DNA whose value, if any, is not well understood. Discovery is still fundamentally broken, and while we have the requisite gaming genres and digital endcaps with featured titles, they’re mostly games whose survival depends on clever monetization models, not quality.
So, where’s the value? For players, it’s whatever is on those few endcaps fronting an endless aisle of other games they will never see (though the aisle serves an important function by creating the illusion of platform power, credibility and trust). It’s a dollar here and a dollar there, and the latest loot crate on sale. It’s not always easy to tell how much fun players are having, but it’s clear they are still under the illusion that mobile games are practically “free”, though “whales” — that small percentage of players who will rack up hundreds, even thousands of dollars in IAP — are ever-present.
In terms of quality, it’s still a race to the bottom, though we are seeing some improvement as the most successful publishers become bored with their large catalogues and over-optimistic about their successes. But publishers still rarely fund anything that isn’t high-profile IP or derivative, and you can’t blame them — the discovery problem and the immense cost of marketing a title into a featured slot or Top 3 list would make anyone risk-averse. (This is not uncommon in traditional games, either, but it’s far more profound in mobile.) For self-funded new developers, most still wind up as “one and done” — they make a single game, then close up shop when it’s immediately clear they will never recoup their costs.
For professional developers who have the stamina and resources to stay in the game, they’re largely reactive and resistant to re-investing profits into higher quality. They understand what they’re up against. To quote Trip Hawkins from a few years ago, “There really ought to be an institute for studying virtual economies… It’s about thinking about your game like you’re the merchandising manager at Bloomingdale’s. Once you have made a game that has good lifetime value, then you can afford to buy marketing.”
Trip is dead-on, but that kind of reality-check messes with our basic worldview as game developers. We cling to the idea that what we’re doing is magical and novel and creative. Meaning, Mastery, Skill, Flow, Risk, Reward and Story are our prophets, and Fun is our God, but IAP is the Boss and without him, there is no game.
Growing up in Kentucky, I fell in love with video games, one coin at a time, in arcades. There were so many great games that inspired me. Early on it was Space Invaders, Asteroids and Galaxian. I was hooked by the relentless pursuit of a new high score.
I turned over my first million points on Galaga, a game that would become a favorite, not too long after its arrival in our little town.
I was so into arcade games that, after being hired to paint the logo of our new local arcade (that’s me, on the left), every quarter earned from the job found its way back to the arcade within weeks. Of course, the owner of the arcade already knew that’s what would happen.
That jet-fueled teenage competitive streak transformed into a much richer gaming experience by way of D&D. I was already lost to story and plot, improvisation and character when, wet behind my elf ears, I started college. I majored in art and theatre, but spent more time playing D&D and fiddling with computers and coding than in class or acting in plays. I enjoyed being a DM — but I absolutely loved battling wits with other DMs as a player.
I began to think about how I might blend the excitement of twitchy arcade mechanics with something like world-building in D&D, and that deceptively simple thought began a life-long relationship with creating and programming video games, starting with the humble TRS-80. The TRS-80 was, by today’s standards, an exceedingly modest machine, and perhaps not all that well-suited to video games. But I managed to pound out my first adventure game on it in BASIC, complete with battle mechanics and stat bonuses. The programming constraints were unimaginable by today’s standards.
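The core of a battle mechanic like that is just a dice roll plus a stat bonus. Here’s a hypothetical sketch in Python (not the original BASIC, which is long gone) of the D&D-style pattern — every name and number here is illustrative, not from that game:

```python
import random

def ability_bonus(score: int) -> int:
    """Classic D&D-style modifier: +1 for every 2 points above 10."""
    return (score - 10) // 2

def attack(strength: int, armor_class: int, rng=random) -> bool:
    """Roll a d20, add the strength bonus, compare against armor class."""
    roll = rng.randint(1, 20)
    return roll + ability_bonus(strength) >= armor_class

# A fighter with STR 16 (+3 bonus) swings at AC 14: hits on a raw roll of 11+.
print(attack(strength=16, armor_class=14))
```

On the TRS-80 the same logic was a handful of `RND` calls and `IF` statements, which was exactly the appeal: a whole combat system you could hold in your head.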
I left college with a year to go, dreaming of working in video games. I moved to Indianapolis, Indiana and looked for companies making games, a futile quest at the time. I decided to keep making games on my own while working various other odd jobs for a few years, then went to college again, this time to study mathematics and creative writing. It was an unusual combo — I’m pretty sure I was the only student there doing it, and my advisers didn’t quite know what to think. I’d been a math and puzzle geek since my first algebra class in 7th grade, which was profound and revelatory, almost a religious experience; on the other hand I loved to write, mostly short fiction, and had some talent for it.
I TA’d my last two semesters, including a new calculus class where the professor would use Mathematica in the lab four days a week, while I would work through problems with chalk in hand every Friday from 9 am to noon. I thought it would be a cakewalk — who would want to work problems on the board on a Friday morning? I couldn’t have been more wrong — not only did most of the class show up, half the students from the same course in two other time slots started coming. It turned out that learning advanced calculus on a computer was not an easy thing, and the prof was so focused on the shiny goodness of graphing and playing with equations in software, there was never any time to practice.
After graduation I took a job at one of the biggest tech-like companies in Indy, Macmillan Publishing, developing reference books on programming, networking, and the newest technology on the block, the Internet.
I kept tinkering with games and graphics in my spare time, mainly on Windows. I can’t tell you the number of little games I wrote and programmed, though in all honesty most of them were more like tech demos. But with each new idea, OS version and language/compiler iteration (mostly C), I became more and more interested in graphics, eventually spending as much or more time finding ways to optimize rendering as programming game logic. I became obsessively interested in tools and 3D authoring and rendering, including a brief descent into the magnificent rabbit hole that was the Amiga (which, by then, was no longer even a supported platform).
At work, I moved up, and found myself producing video games in Macmillan’s small software division. We did mid-range and value PC titles and add-ons, and we had some real hits (and plenty of duds, too). I worked on some of the first early 3D games for the lower end of the PC market. I went to my first GDC, my first E3, then back again each year with a half-dozen programming/platform conferences in between. I got to meet stars like John Carmack and Sid Meier. I began to understand how the industry was evolving, what players valued, the different genres, game mechanics, gameplay.
I also played a ton of games on PC and consoles, and from Meridian 59 onward, was hooked on MMOs. A lot was going on both in gaming and with the Internet. Macmillan was willing to take some chances on new business models, and I was in the right place at the right time. We started a new business for distributing add-on levels for popular PC games; RealmX was a highly ambitious, very early attempt at a form of DLC, something now commonplace, but it failed spectacularly. We then created an even more ambitious web product called InformIT, which was arguably the first online collection of professional technical books on the Internet, including books on games. It survives to this day. I’ll never forget the weeks it took us to finalize the first-cut of the data model. By the time we were finished, we had hundreds of sheets of whiteboard paper wrapping every wall in the office.
But by then it was 1999, and I was ready to up my game.
My first job in Silicon Valley had nothing to do with games, but it was a foot in the door. I was hired to help relaunch a large hotel reservations website, both the content system and the server framework. Back then there was no Google infrastructure or AWS like there is today — you had to roll your own on top of other, relatively nascent, software. One of the most important things we did was to switch the back-end from Microsoft’s IIS to Apache — a decision prompted by the reality that paying two full-time, on-call engineers whose primary job was to reboot the server every four hours was utterly absurd.
In six months we were done, and by that time I had turned back toward gaming, to a startup in Mountain View. Staccato Systems developed an audio subsystem for the PC that replaced a $27 wavetable chip on sound cards and was also used to create unique audio effects in PC games. I came in with a focus on helping the games side of the business and wound up coding applications to make the core technology accessible and usable by game developers, including EA, Lucas and a few others. It was remarkable tech — physically-modeled, logically-controlled audio at a granular level. The engineers I worked with there were absolute geniuses (and there was no shortage of egos), although it never failed to amuse me that, at the end of the day, they were mostly hard-working hackers, like most people who do anything authentically novel in software. Staccato’s technology was first licensed by AC97 codec manufacturers SigmaTel and then Analog Devices, and the company was sold to Analog Devices for $30M in 2001.
Around the same time as the acquisition, a whole new game market was starting to make waves — mobile games. I started programming feature phone games and eventually moved into smartphones, around 2007 when the iPhone landed. Companies I worked for, and helped lead, won awards. We brought dozens of titles to market, including high-profile mobile games like Guitar Hero Mobile, Duke Nukem Mobile and Prey Invasion. I started to get a little recognition. I spoke at GDC a couple of times. I was a gameplay programmer, a senior software engineer and engine architect, then a VP of Production, then a CTO. Through it all I was continually amazed by the talent and dedication in the industry, an industry that was going places it had never been!
These days I’m still working on games and tools, but I get to hop around a bit more from project to project. Not long ago I helped bring a wonderful children’s game to Unity/HTML5 and before that spent over a year working on a mobile casino game, right after a couple of years engineering a large framework for performing, essentially, extensive mobile CAD functions in Unity.
There’s almost always something new and exciting to do (right now it’s VR/AR/MR/XR — yes, the acronyms never end!), though there’s nothing like a great new stealth project, or prototype, or a new take on an old shader, or a fresh API. So much to do, so little time! I’m still in love, and I’m comforted by the thought that my best game projects are ahead of me.
“Peaches and cream. You see, dear heart of hearts, everything will be peaches and cream.” She holds up a spoonful. “A toast — I propose a toast! To peaches, cows, and the freezing point of milk.” We tap-tap plastic tips together. I scrape my finger across the spoon-skin dugout part of my head. I drop the spoon, no sound to speak of unless I was a little dust mite or a needling meddler-fly pausing to throw up on an infinitesimal universe.
“Pooky — it’s gone,” I say to her, suddenly out of my mind.
“Well, just ask for some more, silly goob. I don’t know what I’m gonna do with you!” She spot-swipes a tiny piece of sex-starved peach from her chinny-chin-chin; oh yeah, she knows how to do just that but she doesn’t know how I could care less. And scrolls her eyes, her slow motion slot-machine eyes until two peaches rest for a moment, barely off the mark from each other, glow-glower-ing.
“Pooky-pooh — it’s gone.”
“Now what’s gone, ’cause it’s not the creamy-cream-cream I can see that now can’t I?” She fakes a paused grin then looks serious, then pauses, then seriously grins.
Grasping her hand I paint it along the back of my head over the concave lack-of-a-knob. “Well do you feel anything? The growth — do you feel it?”
And she gets to be astonished. For once in her life she gets to witness a miracle, a tap-tap-tap into something that isn’t sponsored by the next commercial after an evening’s corn-fed dinner-sofa diet. “Well, what does this mean?”
This does not make sense. I had a tumor there, a funny little sonofabitch thing that crumpled my curses up like newspaper (the obituary section) as I obsessed over how fast it takes rigor mortis to set in, all the while sucking pity from her and the whole family, me (me!), the attention-getter and newly sanctioned medicator, cramming in night-light walks along Kalimdor Road with no regard for splinters, bones, bears, cats, disease or losing myself in some fuzzy logic.
But I can feel the color returning to my face already, I don’t get off that easily. So I straighten, take stock of the room — the fine fakety-fake oak counter, the fuzzy green pool table (I wonder if felt dust mites are green), the sudden lack of ping-pong ball pops and the squelch of old video games, the fluorescent ceiling-tube flicker now swinging slow-mo like a cape around my neck, man, like a swoosh–
“Excuse me — well, I thought so!” A tasteless, familiar voice, behind me. Pivoting around in my seat (I still have motion balance, you know), it’s a surprise, it’s him. “Doctor? The thing, in my head! It’s gone!”
“Now that’s just not funny, I’m not here on business. My son and I are trying out the new ping-pong tables. He’s a student here, you know.” The doctor is animated and that’s odd, really odd but he cheshire-cats his words. He’s wearing a university sweatshirt and shorts. He beams, then contorts for a second and beams again. “Serendipitous, I tell you!”
“But, Doctor, Doc — Pooky, tell him. Tell him what’s happened. I’m the miracle man, not the dead man — not the endpoint.”
“Now, now, I’m not on call.”
She scrolls her eyes again, but sideways this time and I’m thinking, I should be concerned about that. “He’s off duty, give me a break, what do you want him to do, cut you open right here on cheap formica? Besides, I’d like to propose another toast — to the doctor, and–” She raises her spoon again and this time I can clearly see the raised plastic ridge from the factory mold on it. Her voice trails off to a murmur and a lactose hiccup.
“Look, son,” the doctor interjects, still grinning as if nothing happened, no miracle, no anomaly, an endpoint, back in the middle. “I came over here to see if you could help me and my boy out. I’m afraid we, well, actually, I did it. I hit the ball so hard it just–” He leans in, “Well, anyway, we still have a game to finish.”
“What?” I said I’m a miracle, an anomaly, not an endpoint, back in the middle.
“A smashed ball. It’s not exactly the first time, but — don’t know my own strength. Anyway, when I saw you over here I thought, ‘there’s my answer’, basta. I can’t exactly return it, but I expect that’s okay with you.”
“Return what?” I’m cold, I shouldn’t be — but it kicks in. Almost shivering.
“Why, the tumor, of course.”
“In your head — you know, the little ball in your head. It’s the perfect size. Lucky I have my instruments with me. We can use one of the tables in the lounge. It’ll only take a few minutes, and you’ll be doing me a great favor. I don’t want to quit — I’m winning the game right now — ah, yes, here it is, my saw! Funny I had it with me, and no need for anesthetic. Pooky-pooh can hold your hand, that always works in the movies. Actually I’d call it a miracle that you’re right here, right now. I’d call it a miracle–”
Fabulous project and write-up by one of the nicest guys in the video game business. Not to be missed.
A couple of months ago I found David Eagleman’s Sum: Forty Tales from the Afterlives. This is a fun, imaginative little book — highly recommended. In the spirit of the book, I couldn’t help but write #41 myself:
As far as anyone knows, there is no afterlife, since there is no death. You simply come into existence and never die. There are theories that you might eventually collapse or compress into nothingness, but in billions of years of known history, no one ever has.
You will see plenty of “death”, but none of it is real. Anyone who appears to die has already diverged from any number of shared realities with any number of other beings, who at the “time of death” only see an echo of the “dead person”. This always happens automatically — as you get older, you learn to recognize when you’re going to “shift” instead of “die”.
You don’t find all of this out until you leave your home planet — in this case, Earth. Everybody leaves their home planet on the first shift.
You will change forms many times. Most folks just change as needed, according to the environment in which they find themselves, in order to acclimate and fit in. It all happens involuntarily, like breathing air on Earth.
Though it certainly happens, it’s best if you don’t discover the truth about death too soon — for example, within your first decade or two if you were born on Earth. If you grow up without a firm grasp of the idea of mortality, you almost always wind up adopting a sort of depressingly destructive attitude about it all.
I was looking through some of my Disqus comments and was pleasantly surprised at some of my replies to various discussions. Like everyone else, I treat blog commenting as a mostly in-the-moment affair, and while I guess that quoting myself is an arrogant sort of thing to do, I believe that these quotes will make you think a bit, especially if you’re in a startup and/or the video game industry. Some light editing for context.
Is Google evil? Hell yes – it’s corporately impossible for them not to be at their scale. Apple is also evil at scale. Spotlight as an app-mining mechanism ultimately results in plenty of ads from apps, in addition to 80+% chatter from zombie apps. If Apple does evolve Spotlight into a full-on Google competitor (oh the irony, considering Jobs’s quote), their ability to hold off on ad-spam results is only possible because their revenue model doesn’t need/want it – yet. Privilege remains committed to the fantasy that the natural result of scale is diversification into non-core competencies through market consolidation/acquisition and wildly expensive internal development. The root of the root problem is that no large tech companies – certainly not Google or Apple – believe that their Scrooge McDuck money bins can ever be big enough.
Having traveled to Silicon Valley several times per year for two decades, lived there for seven years (99-06), and having seen my son’s experiences over the last three years since he moved there fresh out of college, I can say the fundamental SV milieu hasn’t changed much. I still grok it as a theme park. In fact, using religion as a metaphor, SV as a religious theme park hits home. It’s presumptuous, exploitative, shiny, kitschy, dogmatic and arrogantly opportunistic. And if you grok the concept of creating truly meaningful software out of nothing but your own mind and mettle, SV is like one of those big crazy Texas churches, except you may be the god that changes the world. SV is where art fucks science, creates a singularity, then rebrands it as a virgin birth and the second coming for the next generation congregation. Or something like that.
In my industry (video games), from my perspective as a developer, things are a bit different from the bubblicious milieu. It’s more like a drunken orgy inside a rocketship to the bottom, where 0.01% landowner-publishers are in slave-heaven with developer-unfriendly disty deals and mini fickle-finger-of-fate awards in lieu of cash. Apple and Google changed distribution forever. Absolutely no one has any real ideas about how to deal with the scale of the market and the ever non-presence of discovery. Customers have been taught to expect crap for free. The industry used to be cutthroat and hit-driven — the good old days! Now it’s just a big lottery.
In the gaming segment, big companies (publishers) and small companies (developers) have undergone a big relationship shift. Prior to the rise of mobile and social games and the F2P model, developers were valued as reliable sources of content that would have a direct impact on publisher success. Today the developer has much less real value to the publisher – discovery is so difficult that most publishers can only afford a very wide net to catch distribution deals. Since production costs have only risen, developers produce less compelling content. The race to the bottom is getting so big that the starting line is elbows-to-elbows with out-of-shape runners. Hence developers only help publishers be successful to the extent that they incrementally increase the probability of a hit game in which profits are shared equally.
Large-scale organizations (of all kinds) appear more and more like big collections of entropic vagaries whose operational tools are over-confidence, short-term accounting, obfuscation, denial, deflection, disinformation and so on. These are old tools that cannot hope to be of any real use up against cyber-attacks. Limiting organizational growth would by definition limit the impact of a single cyber-attack. Of course this is blasphemy to all modern economic systems. Sigh.
Something I’ve learned and am still learning is that communication is almost always about feelings and the needs behind them. If I’m mindful of this and realize that I’m co-authoring the story of the conversation, then I tend to listen much better and not lecture and analyze so much; if not, I’m just data without a soul, steamrolling everyone’s needs including my own.
The collection and storage of data seems impossible to stop, given the ubiquitous commercial nature of the Internet. Rabbit’s been out of the hat since ’94 or so and it’s far easier to re-use that rabbit than to create another hat. The bigger issue may be Peak Abstraction. We’re all leaves in various trees with chains of nodes dumping us into super-groups, on up a given tree until we hit its root node. When nodes contain too many sub-nodes to evaluate logically/meaningfully and leaves are far removed from their nodes, yet power enforces any sort of algorithmically-motivated action toward the leaves, we hit some pretty scary peaks. If one of those trees is government, the air will be damned thin up there.
Most engineers, artists, designers I know have always had side projects — it’s the special stuff they “want” to do away from the normal stuff they “need” to do. Sometimes the special is an off-shoot from the normal, often not. If the special becomes normal then maybe it becomes a “thing” whose fundamental bits are mostly immutable. Maybe it’s a needy thing. It needs to impress, it needs validation, it needs to generate value, it needs to function beyond the sparky neocortextual passion that first formed it. Once normalized, the full expression of the original vector is lost, or hard to compute. So on to the next project.
Productivity purely as a function of time makes some sense where it’s clear that time is inherent to product[ivity], e.g. manufacturing when quantity is the primary objective, or old-school QA. But it starts to break down past the short-term. In software I see it generally as a violent process standing in for trust, a red flag with a herring logo on it, beating in the breeze over management’s head. If the objective is to serve your time then time is who you serve. You are timetive, not productive.
Android developers, in particular, try to remember that Google is run by the best and led by super-geniuses, unlike those wannabes at Apple. They know this is true because, well, everybody knows it now. And they remember it when they have to use lousy development tools and do battle with the Eclipse IDE and slow, buggy emulators. They remember it when they’re struggling with an over-engineered, clunky, dubious API, debugging in a black box or on any of the dozens of test devices they had to buy, and they realize Google has much more important things to do than write documentation. And they know that Google could spend more time with device manufacturers to decrease platform fragmentation, but they trust that there’s a strategy in place that must be beyond their understanding. In all seriousness, I totally agree that Google has an enormous amount of talent and they are on a steady march to innovative user experiences in several areas. Neural network-based voice recognition is exciting. But they have a ton of housekeeping to do, too.
Except for retail, these models are a predictable response to market scale, and the gaming industry is more creative and sophisticated in their use of them due to its history as a hit-driven business. But the fundamental problem is ever-present: Quality doesn’t scale. The non-traditional market is massive and getting massive-er by the day. The game shelf is a mile long with a handful of endcaps. Funding a high-quality game is very risky since it cannot be done on the cheap. So quality is the first thing to go out the door – it’s intuitive (and may be a fallacy) to diversify instead. Rather than betting your budget on one high-fidelity game, the platforms ask that you create many low-fi games with minimally viable mechanics and art then invest in creative monetization and cross-promotion to keep re-leveraging your players across the catalogue. And it makes some sense until you realize it’s not quite sustainable because customer expectations scale, too – especially new users you’ve transformed into gamers.
I have mild OCD. I hate it when I’m meta-OCD and become OCD about my OCD as I seek to suppress rather than repress. Finding data specific to entrepreneurs as a class sounds tough. Looking at Type A’s, highly creative types and super-driven product people and engineer types, maybe successful execs, makes some sense to me. Deconstruct the entrepreneur into component sub-classes; at least that’s a direction in which to head. Qualitatively, my own experiences with other entrepreneurs suggest that they — especially the product and engineer types — are prone to depression and OCD, manic behavior, excessive hubris and definitely divorce. They are also prone to remarkable displays of kindness, honesty, purpose, courage and genius, qualities I observe somewhat less frequently in others.
In my business (video games), looking for a segment where you can become the first mover is a little analogous to implementing a new or under-adopted game mechanic so well that you become the definition of the category. Others will follow your idea but wish they could follow your execution. Rovio, for example — they weren’t the first mobile 2D physics game, but their product execution was first-rate and their market execution was prescient (continual engagement with players through lots of content updates — few were doing this on mobile at the time — rather than feature updates and new SKUs). Now they’re scaling and evolving and so far doing a good-to-excellent job of that. IMO all software companies should study the video game industry in preparation for the massive markets that are coming our way over the next decade — at that scale practically everything will become hit-driven and a measurement window of six months may be generous.
Somehow people convince themselves that there is never enough time, but it’s really not that hard to be responsive. The goodwill generated alone is worth the effort, and often there’s a business payoff — sometimes way down the line, but it happens to me not infrequently (give people time and they will surprise and delight). In my industry (gaming) we often work with external teams. I only get to meet these guys in person once a year at best (usually at an industry conference); otherwise the communication is project-focused email/phone/Skype. When someone reaches out to me for other types of help or connectivity, it’s an opportunity to put something good out into the universe. The way I look at it, we’re all on the same team. Practicing trust and reliability is good work. It’s a chance to show quality. It’s a happiness-inducer and life-extender.
As of March, we’re at 343,915 games in the App Store, at a rate of 130 new games per day.
Think of it: 343,915 games at your fingertips — if you spend just five minutes on each game for 12 hours per day, 7 days per week, it would only take you about six and a half years to try them all, and as you finish the last one, you’ll have over 300,000 new additions to check out! And that’s just games in the App Store — there are even more on Google Play!
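For the skeptics, the back-of-envelope math holds up. Here’s a quick sketch (the per-day release rate and totals are just the figures quoted above, nothing more official than that):

```python
# Back-of-envelope check of the App Store numbers quoted above.
GAMES = 343_915        # games in the App Store as of March
NEW_PER_DAY = 130      # new games released per day
MIN_PER_GAME = 5       # minutes spent trying each game
MIN_PER_DAY = 12 * 60  # playing 12 hours per day, 7 days per week

days = GAMES * MIN_PER_GAME / MIN_PER_DAY  # days to try every game
years = days / 365
new_games = days * NEW_PER_DAY             # games released while you played

print(f"{years:.1f} years to try them all")      # ~6.5 years
print(f"{new_games:,.0f} new games by then")     # ~310,000 — "over 300,000"
```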
Isn’t that just incredible? Isn’t Apple (and Google) amazing? I mean, they’ve really changed the landscape. We’re talking well over 300,000 shitty games developed by amateurs, and several hundred games that people actually play, developed by data analysts and suits who had, or raised, enough money to exploit the market.
Of course there are periodic hits that seemingly come from nowhere (you know, the Jobs-Woz garage developers who hit RPRT*), and that’s enough to keep the myth alive that anyone, with the mettle and motivation, can become a success!
Oh what a world of opportunity! All my ex-game developer friends were positively sick of making a living developing games anyway. Why, any one of them now has the chance to start anew and maybe even become successful enough to drop $30 million on a little privacy!
Apple and Google — Steve (RIP), Tim, Larry, Sergey, Eric, I don’t say this often enough: You guys are like gods really! Like good little warlords, you raped and pillaged software (especially games!) the world over. You figured out how to redistribute the wealth right up the ladder. You turned software distribution into a downright software exchange, reminding us of that great scene in the movie Trading Places:
Billy Ray: No thanks, guys, I already had breakfast this morning.
Mortimer Duke: This is not a *meal*, Valentine. We are here to TRY to explain to you what it is we do here.
Randolph Duke: We are ‘commodities brokers’, William. Now, what are commodities? Commodities are agricultural products… like coffee that you had for breakfast… wheat, which is used to make bread… pork bellies, which is used to make bacon, which you might find in a ‘bacon and lettuce and tomato’ sandwich.
[Billy Ray turns and gives a long look at the camera]
Randolph Duke: And then there are other commodities, like frozen orange juice… and GOLD. Though, of course, gold doesn’t grow on trees like oranges.
Randolph Duke: Clear so far?
Billy Ray: [nodding, smiling] Yeah.
Randolph Duke: Good, William! Now, some of our clients are speculating that the price of gold will rise in the future. And we have other clients who are speculating that the price of gold will fall. They place their orders with us, and we buy or sell their gold for them.
Mortimer Duke: Tell him the good part.
Randolph Duke: The good part, William, is that, no matter whether our clients make money or lose money, Duke & Duke get the commissions.
Mortimer Duke: Well? What do you think, Valentine?
Billy Ray: Sounds to me like you guys are a couple of bookies.
Randolph Duke: [chuckling, patting Billy Ray on the back] I told you he’d understand.
*Right Place, Right Time
Little much does skip past the face of a bear,
he weighs one of two endpoints to swat
to play, to chew, to fall, to lie.
Maybe stuck in prayer position does he look up
before he goes to sleep? Does he listen for something?
Lard belly feels good only for rest.
When he does wake does he then
at some moment think about flames or fire?
Quick sugar beef jerky for a large time.
When he does wake does he then
at some moment remember blue lines on 3rd-grade paper walls?
Crunchy wet middle of a cracked leaf bed.
When he does wake does he then
at some moment see gills feather their drops?
Salt rubbed scales between steel-post mob teeth.
When he dozes away, canoes then
at some moment see the paddle match his mechanical paw.
Mark bark big claw limit of the physics in the math.
Maybe stuck in player position does he look down
before waking up? Does he listen for something?
Hard jelly cracks like double-sided dreams.
Little much does skip past the face of a bear,
he weighs one of two endpoints to swat
to fall, to lie, to play, to chew.
Brad Feld, one of my favorite bloggers, wrote a little gem this week called Something New Is Fucked Up In My World Every Day. It’s an inspirational reminder that the way out of your problems is through, and that you don’t need to look far to discover how insignificant your significant woes may be. It also got me thinking — sometimes problems don’t look like problems.
Years ago my mother managed a facility for psychiatric patients who were hoping to eventually reintegrate into society. Once, on a visit home, I was chatting with her in her office and a guy knocked on the door, came in with a clipboard and a stack of papers, and proceeded to discuss medication schedules and patients with her.
He was dressed neatly, was very friendly and personable. He introduced himself to me and asked how my visit was going. He enthusiastically talked about how much he enjoyed working with my mother.
After he left I said, “Great staff, mom, he seems like a go-getter”, to which she replied, “He’s a patient — one of our most difficult”. Turns out that he was a well-adjusted, normal fellow most of the time (though delusional about his role there), but every couple of weeks he would have a big psychotic break for a day — he was sort of bipolar without the depression, with short, intense manic periods. Without medication, he was much worse.
Things aren’t always okay even when they seem okay, and any solution to a problem is susceptible to regression and entropy. Sometimes you don’t translate yourself through suffering as much as you scale its effects in some way — the operation is multiplicative, for better or worse. Very often suffering is recurrent, making a solution to a problem seem more like modulo than subtraction.
In the Eat Me If You Wish parable in Brad’s post, a man whose cave is full of demons makes them disappear by surrendering to their unknown wishes. The way I interpret this is that he renders his demons powerless by giving them his full attention — focus yields control, and suffering effectively becomes a choice.
I tend to think of suffering as more of a stream, a thread that runs in the background or is brought into focus. It doesn’t disappear — it’s clamped to some small epsilon and will never scale to zero. It’s a problem that doesn’t always look like a problem. Perhaps this makes suffering more about when than what, and when is something I can manage much better than what.