Why good ideas can feel very bad

"The beginning of a very bad idea often feels very good. And the beginning of a very good idea can feel very bad." As Scott Berkun reminded me on Monday, the boundary-pushing, edge-defining creative act rarely makes complete initial sense, not least because its contextual elements aren't yet fully understood. Does it fit in this box or that? How similar or different is it to prior thinking? It is evolutionary or revolutionary? What might it mean for me?

We are metaphorical machines, always, always, always describing one thing in terms of another, and any idea, before being put to the test, is what we say it is.

As for the test, whether a potential breakthrough belongs in a new mathematical theorem or on stage in your local theater, doubt is a requirement for discovery. Take courage.

In his writerly lecture about "the shapes of stories" posted by Maria Popova, Kurt Vonnegut reminds us that we're often just as clueless about the meaning of personal experience. Popova uses Vonnegut's thinking on Hamlet to illustrate the point. "Shakespeare," Vonnegut says, "told us the truth.... The truth is, we know so little about life, we don’t really know what the good news is and what the bad news is." 

Its corollary: We can't control circumstances, just what we think in the midst of them.

Of course, there's a fine line between belief-made and make-believe. And navigating it means starting with a healthy dose of both doubt and faith. The former so that we don't make fools of ourselves before the idea has been realized in act or shape, the latter so that - for the very best reasons - we can.

Wayne

Image: Geoff Oliver Bugbee

You promised me Mars, and all I got was Facebook

In a line reminiscent of Peter Thiel's statement that "We wanted flying cars, instead we got 140 characters," Buzz Aldrin peers from the digital pages of the November/December issue of Technology Review and says, "You promised me Mars colonies and all I got was Facebook."

Speaking only for myself, I'm well and truly sorry.

Can the biggest problems in energy, in higher education, in health care be solved? Of course. Can they be solved res publica? Different question.

As a glance at the number of fully funded and audacious Kickstarter projects can attest, individuals are driven to create, to invent. Still, a plasma thruster is not the Saturn V, and even though I have no doubt that inventors still have big plans for the future, crowdsourced innovation pales in comparison to the historic and nationally funded moon landings. But as Technology Review points out, mounting such a peacetime effort may simply have been a historic anomaly. Apollo consumed four percent of the federal budget in its heyday. With the top federal tax rate about half of what it was during the 1960s and enormous entitlement obligations to maintain, a new "Apollo" - figuratively speaking, of course; pick any worthwhile goal - would almost certainly be unaffordable.

Technology Review:

The Apollo program, which has become a metaphor for technology's capacity to solve big problems, met these criteria, but it is an irreproducible model for the future. This is not 1961: there is no galvanizing historical context akin to the Cold War, no likely politician who can heroize the difficult and dangerous, no body of engineers who yearn for the productive regimentation they had enjoyed in the military, and no popular faith in a science-fictional mythology such as exploring the solar system. Most of all, going to the moon was easy. It was only three days away. Arguably, it wasn't even solving much of a problem. We are left alone with our day, and the solutions of the future will be harder won. 

We don't lack for challenges. A billion people want electricity, millions are without clean water, the climate is changing, manufacturing is inefficient, traffic snarls cities, education is a luxury, and dementia or cancer will strike almost all of us if we live long enough. In this special package of stories, we examine these problems and introduce you to the indefatigable technologists who refuse to give up trying to solve them.

For a bracing read on just one of the many challenges the program faced, pick up "Digital Apollo." Working with almost nonexistent read/write memory, inventive engineers created fault-tolerant, intelligent software that could scan, track and respond to an external environment, all while assessing its own state of play and abandoning unneeded computation. On Aldrin's flight, that technology probably saved the landing attempt.

Still, Pontin, the editor in chief and publisher of Technology Review, is correct. For those of us who have been alive long enough to have watched televised men on the moon in rapt wonder, any comparison to the past, no matter how wonderfully accomplished, is inevitably romanticized. Quoting Auden, Pontin writes that "we are left alone with our day." Just as those technologists worked within the confines of primitive computing, there are still challenges to be met and Earth-sized problems to be solved that will require solutions specific to our own time.

Although written on a completely different and far more personal subject, Robin Robertson's line "We are drawn to edges, to our own/parapets and sea-walls" is apropos. One poetic valedictory deserves another, I guess.

Give the Technology Review issue a read. Writing on "The Crisis in Higher Education," one of my favorite writers, Nicholas Carr, is also featured. If the piece is anything like his tartly worded blog, Rough Type, it will be a pleasure.

Well done, Jason.

Wayne

Image: Technology Review

Borgs at the Bowling Alley

By way of capturing the prevailing narrative about the human relationship to digital technologies and the social web, I think Mark Changizi has come up with a catchy descriptive phrase: "Borgs at the bowling alley." Sure, the network keeps us in touch with far-flung friends and makes the morning commute quite a bit more manageable for many, including me, but it's just a tool.

Hold that thought.

Changizi reviews "You are Not a Gadget: A Manifesto," by one of my favorite technology contrarians, Jaron Lanier, the digital relative of Nicholas Carr and Sherry Turkle. Changizi:

In fact, what comes across most clearly in the book is Lanier’s distinct and unique individuality. He brims with novel ideas, from the origins of speech and music (he speculates that it connects to color signaling in cephalopods), to radical kinds of programming languages (without protocols), and to new ideas for virtual reality (e.g., altering our perceptions so that we experience life as billowy clouds). Although many of these ideas are not entirely crucial to his central thesis, they serve to illustrate that it is in individuals, not collectives, where we find the lion’s share of creativity. These novel ideas also serve to convince the reader to trust Lanier’s intuitions about where creativity comes from....

Here’s what, in my experience, people tend to tell themselves: Smart collectives result from liberal servings of self-organization and complexity. Why? Because the most brilliant collectives that exist—those found in biology, such as our bodies and brains built out of hundreds of billions of cells—are steeped with self-organization and complexity. And, the intuition continues, the Web also drips with self-organization and complexity. The Web therefore must be smart. And because the Web is growing and evolving over time, the Web must be getting ever smarter. Perhaps some day it will even become self-aware!

If I may agree with what I take to be Changizi's point: what a load.

This is the time of the year for reassessment and stock-taking, for quiet and contemplation. Consider the growing number of people questioning what, exactly, this sometimes uncomfortably close and expanding network is, as well as the stunted idea that the human imagination can be sourced to the collective rather than to individuals.

Susan Cain, would you please consider being an IdeaFestival participant? Write me at whall[at]ideafestival[dot]com

As much as I admire the self-organizing and adaptive skill of the ant colony or the bee hive, I have no desire to be an ant or bee. The human experience with its constant striving, its ability to hold out possible worlds for examination, its sad lassitude in the face of problems like climate change and the horrific sexual trafficking of girls and young women, its ability to conceptualize something better and (occasionally) act on that new conceptualization, is much, much more interesting.

Heavens, no. The Singularity is not near.

We've made great strides toward understanding the human brain. Neuro-imaging can show us our centers for self-restraint, obstinacy and that flicker of electrical activity that coincided with the raid on the kitchen fridge last night at 12:31 a.m. Correlative? Yes. Causative? No. We still don't know why a three-pound lump of matter should be capable of sifting the pros and cons of the late-night snack or Dickens' Great Expectations. And the very idea that x-number of exponential steps will lead us anywhere that we might predict now is, frankly, laughable. We don't even understand the full future impact of any single breakthrough. Yes of course, technology is rapidly expanding, perhaps, as Lanier might charitably suggest, at a dizzying rate. It's easy to be confused. But to think that said network will self-realize after reaching another gazillion connected gadgets or so is a category mistake. The network will not one day blink open its eyes and ask us what we've been doing all this time. And if it does, let's hope it's familiar with the idea of pity, the mercurial stepchild of grace, because it surely won't be familiar with its practice. We wonderfully frail and biological gadgets are ultimately moved and convicted by the heaving in our chests, by the strength in our arms, by the moving of our feet.

One person at a time.

Wayne

Image of Jaron Lanier: Some rights reserved (CC BY-NC-ND) by leeander

It takes a while to finally be you

You have to play a long time to be able to play like yourself – Miles Davis

One of the things that makes us unique is the ability to understand self-referential statements like the one above. Strictly speaking, it's a tautology, meaning it's necessarily true and just as meaningless. Scott Berkun uses it, however, to make the case for creative perspiration, because the biggest contributor to any ultimate success is just showing up.

We are all born with a gap between our ambitions and our abilities, and ambitions rise much faster than abilities can....

Many very talented people never develop their skills only because they can’t stand the feeling of this gap. They’re embarrassed and tortured by it. They expect to be great quickly and when they’re not they feel they’re a failure, despite their foolish comparisons to ghosts of their own invention.

So if you have any creative ambition, there are but two questions: What is it? And are you willing to work long enough to give it the best part of you?

Wayne

Image: Some rights reserved (CC BY-SA) by seventime


Nurture your inner psychopath?

Talk about lateral thinking.

In a Scientific American piece, John Horgan points out the unexpected connection made by Oxford University psychology professor Kevin Dutton in his book, The Wisdom of Psychopaths: What Saints, Spies and Serial Killers Can Teach Us about Success:

Next time you face a difficult situation, Dutton said, imagine what you’d do if you had no fear. 'Psychopath up!' Another way to become more psychopathic, Dutton suggested, might be to meditate. In a study that he calls 'Monks Versus Punks,' Dutton has carried out psychological tests of Buddhist monks and compared them to psychopaths. Like psychopaths, monks are often calm and decisive in the face of stress; free of anxiety, even in the face of death; and able to read others’ expressions accurately.

The big difference, Dutton said, is that monks are motivated by compassion for others, whereas psychopaths seek only their own pleasure.

If insight comes from making connections between widely differing things - and that is what the IdeaFestival does really, really well - then the idea of success as a single-minded shearing away of distraction, of emotion-free targeting, or of finding similarities between how a monk and a killer might size up success, certainly made me think.

On the whole, Dutton's connection is impressive. But perhaps - as Horgan suggests at the conclusion of his piece - linking a certain kind of ruthlessness to success, even if that ruthlessness is motivated by compassion, makes a rather more uncomfortable point. If contemporary society is already too self-absorbed, is maximizing personal freedom the right goal?

Scientific American's "'Dexter' and British Psychologist Ask: Who Wants to Be a Psychopath?" is here.

Horgan may be best known for his book, The End of Science.

Wayne

Image credit: Some rights reserved (CC BY-SA) by Gustavo da Cunha Pimenta and Showtime