
Cross-train your brain to do four things really well

A condensed version of this post originally appeared in the Accenture Careers blog.


Do you want to excel in your career, today and in our uncertain future? Below are the four not-so-simple areas of focus.

But first a cliché warning: Smarter people than I have trod this terrain and stepped in a few. I’m thinking specifically of this passage from David Foster Wallace’s famous commencement speech, This Is Water, where he began by talking about how higher learning should be about “teaching you how to think” …

If you’re like me as a student, you’ve never liked hearing this, and you tend to feel a bit insulted … The fact that you even got admitted to a college this good seems like proof that you already know how to think … [but this] cliché turns out not to be insulting at all, because the really significant education in thinking that we’re supposed to get in a place like this isn’t really about the capacity to think, but rather about the choice of what to think about.

Not that long ago you could graduate with many types of degrees, having absorbed facts and their interrelation, and expect to build a nice career. Naturally you would continue learning after graduation, but in that pre-digital era, when access to knowledge was constrained, learning was primarily absorbing new facts. Here I’m imagining my dad’s career …

Adopting Learning Skills For Today’s Work Environment

My late father was co-founder of a small-town accounting firm. As a CPA, he and his partners would turn to an imposing set of heavy, beige tax law books that filled to bursting the bookshelves his firm had built for them. To me, as a child, they reached so wide and high they appeared to hold the whole business up.

I wasn’t that far off. In that analog world, the folks closest to the facts got the job done. And got the raises. And the promotions.

What’s changed? For one, digitization – a democratization of knowledge. With sites like Google and Wikipedia, plus more specialized ones like The Markup, the world of information is in all of our pockets. What else has changed is complexity, and mass disruption. Our world is many times more complex than my father’s was, or yours.

It’s also clear that the pace of change will continue to accelerate.

How to cope? Here’s a comprehensive list. It’s the “Four Cs.” I discovered them in Yuval Noah Harari’s 21 Lessons for the 21st Century (originally sourced from this NEA white paper). They are 1.) Critical Thinking, 2.) Creativity, 3.) Collaboration and 4.) Communication.

1. Critical Thinking

Consider Accenture’s Problem-Solution Mapping (PSM). It’s a hybrid of several mental models, including the thinking processes of Eliyahu Goldratt’s Theory of Constraints. Here’s a video introduction:

It’s no accident that the first part of PSM is about stepping back and looking objectively at what we want to accomplish and why. It’s also about looking deeply into the problems identified to find root causes. Only once this is done do we construct hypotheses to test in pursuit of solutions.

Or consider the advice of Charlie Munger, best known as the behind-the-scenes half of the duo, with Warren Buffett, that built Berkshire Hathaway. In a speech, Munger put it this way:

You can’t really know anything if you just remember isolated facts and try and bang ’em back. If the facts don’t hang together on a latticework of theory, you don’t have them in a usable form.

You’ve got to have models in your head. And you’ve got to array your experience — both vicarious and direct — on this latticework of models.

Or finally, consider the words of a friend of mine, from a conversation we had this weekend. He’s a leader at a multi-billion-dollar e-retailer. He told me, “I’m advising my daughter to learn Python as a way to put her data science degree to work. In fact, we’re going to learn it together. But I’m also stressing the importance of learning how to define the right problems to solve, and finding a framework for their solution.” She’s got a smart father.
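For what it’s worth, here’s the flavor of the kind of first exercise that advice might lead to. This is purely a hypothetical sketch of mine, not something from that conversation; the file name, column names, and 90-day cutoff are all invented for illustration. The point is the ordering: the question (“which customers have gone quiet, and since when?”) gets framed before a single line of code is written.

```python
# Purely hypothetical sketch: frame the question first, then let the code answer it.
# Assumes a CSV named "orders.csv" with "customer_id" and "order_date" columns
# (both invented here for illustration).
import pandas as pd

def customers_gone_quiet(path: str, days: int = 90) -> pd.Series:
    """Return each lapsed customer's last order date, for customers quiet longer than `days`."""
    orders = pd.read_csv(path, parse_dates=["order_date"])
    last_order = orders.groupby("customer_id")["order_date"].max()
    cutoff = orders["order_date"].max() - pd.Timedelta(days=days)
    return last_order[last_order < cutoff]

# Usage (with hypothetical data):
# print(customers_gone_quiet("orders.csv").sort_values().head())
```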

2. Creativity

Once you’ve identified the best problems to solve, it often takes creativity to solve them. In their new book Pivot to the Future, Accenture colleagues Larry Downes, Omar Abbosh, and Paul Nunes remind us that automation will place greater intellectual demands on all of us:

We are not among those who think AI [artificial intelligence] will displace knowledge workers to a significant extent. We do, however, believe they will substantially alter the nature of work, and in a positive way. Today, too many jobs are boring and repetitive, leaving workers unmotivated or worse. AI technologies offer an opportunity to redesign work away from the mundane and toward tasks that require human reasoning, empathy, and creativity.

I agree with their belief that AI is not to be feared. We need to look at it as another tool for getting work done, and resist what scholars call “moral panic” over new technology. That means developing the qualities that machines will be less effective at “learning.” Human creativity is at the top of my list.

3. Collaboration

In her TED Talk and best-selling book, Reality Is Broken: Why Games Make Us Better and How They Can Change the World, game developer Jane McGonigal makes a case for how multi-player video games develop crucial learning skills.

Arguably, the highest on her list is collaboration. She reminds us how these games are rehearsals for the modern work world, where delegating, co-creating and supporting each other are crucial keys to team success.

At Accenture, collaboration is priced into every project. What’s more, our commitment to inclusion and diversity helps ensure a working environment of psychological safety – something that Google’s research has found to be one of the keys to a winning team.

4. Communication

Of the four, this may be the easiest to practice – but the hardest to get right! In their excellent book on business communication, Weekend Language, authors Andy Craig and Dave Yewman describe what happens to our storytelling abilities when the weekend is over and we’re back at the office:

We’re full of feature lists and ten-point plans, ‘high level’ terms and nonsense. As if that wasn’t bad enough, we beat the snot out of our audiences with 118-slide PowerPoint presentations chock-full of text.

Audience members typically don’t remember anything.

Their book trains readers to tell stories instead, and to tell them effectively to achieve business goals. If you’ve read their book, you may have noticed some of their recommendations at work here. I’ve peppered this post with anecdotes, metaphors, and other ways to (hopefully) bring my content to life for you.

How important is storytelling to Accenture careers? Consider this:

I work out of the Digital Hub in Accenture’s Chicago offices. And every month our Storytelling Club meets to isolate and strengthen these communication “muscles” in front of a live audience. It’s an invaluable resource.

Focus Your Thinking On the Four Cs

Now that you’ve learned “what to learn,” know that Accenture can help you. Consider joining us, learning from and working with some of the industry’s best and brightest. Find your fit with Accenture.

Wolves, Trains and Automobiles: The Domestication of A.I.

I’ve thought and read a lot about artificial intelligence (A.I.), particularly its potential threat to us, its human creators. I’m not much for doomsday theories, but I admit I was inclined to fear the worst. To put things at their most melodramatic, I worried we might be unwittingly creating our own eventual slave masters. But after further reading and thinking, I’ve reconsidered. Yes, A.I. will be everywhere in our future. But not as sinister job-killers and overlords. No, they will be extensions of us in a way I can only compare with that most beloved of domesticated creatures: the dog.

For you to follow my logic, you’ll need to remember two facts:

  1. Our advancement as a species from hunter-gatherers to complex civilizations would not be possible without domesticated plants and animals
  2. Our collective fear of technology is often wildly unfounded

Bear with me, but you’ll also probably need to recall these definitions:

  • Domestication: Taking existing plants or animals and breeding them to serve us. One example is selecting the most helpful plants and turning them into crops; Michael Pollan’s early book, The Botany of Desire: A Plant’s-Eye View of the World, will take you a long way toward seeing this process in action. As for animals, you may think of dogs as mere pets, but early in our evolution as humans we bred the wolf to help us hunt for meat, and to protect us from predators. Before domestication, we pre-humans hunted in packs, and so did the wolves … never the twain shall meet. After domestication, we ensured the more docile canines a better life, under the protection of our species and its burgeoning technologies (see definition below), and they delivered the goods for us by helping us thrive in hostile conditions. It was a symbiosis that turned our two packs into a single unit. No wonder the domesticated dog adores us so, and that we consider them man(kind)’s best friend.
  • Technology: Did you know the pencil was once considered technology? So was the alphabet. You may think of them merely as tools, but technology is any tool that is new. And our attitudes toward anything new always start with fear. Douglas Adams put it this way: “I’ve come up with a set of rules that describe our reactions to technologies: 1.) Anything that is in the world when you’re born is normal and ordinary and is just a natural part of the way the world works. 2.) Anything that’s invented between when you’re fifteen and thirty-five is new and exciting and revolutionary and you can probably get a career in it. 3.) Anything invented after you’re thirty-five is against the natural order of things.” Fear of technology, not surprisingly, spawned the first science fiction: Mary Shelley’s Frankenstein; or, The Modern Prometheus, a literal fever dream about a scientist’s hubris and the destruction it wrought upon him and the world. This fear has a name: moral panic. And it has created some pretty far-fetched urban myths.

In a Wall Street Journal piece, Women And Children First: Technology And Moral Panic, Genevieve Bell listed a few of these vintage myths. The first is about the advent of the electric light: “If you electrify homes you will make women and children … vulnerable. Predators will be able to tell if they are home because the light will be on, and you will be able to see them. So electricity is going to make women vulnerable … and children will be visible too and it will be predators, who seem to be lurking everywhere, who will attack.” And consider this even bigger hoot: “There was some wonderful stuff about [railway trains] too in the U.S., that women’s bodies were not designed to go at 50 miles an hour. Our uteruses would fly out of our bodies as they were accelerated to that speed.”

Sounds messy.

I don’t have to tell you about our modern moral panic surrounding A.I. Except there is a bit of reverse sexism going on, because this time it is male workers who are cast as the victims. Their work — whether purely intellectual or journeyman labor — will be eliminated. We’ll all be out on the street, presumably to be mowed down by self-driving cars and trucks.

The Chicken Littles had me for a while

So what changed? In the same week, I read two thought-provoking articles. One was in The New Yorker: The Mind-Expanding Ideas of Andy Clark. Its subtitle says it all: “The tools we use to help us think — from language to smartphones — may be part of thought itself.” This long piece describes Clark’s attempt to better understand what consciousness is, and what its boundaries are. In other words, where do we as thinking humans end and the world we perceive begin?

He comes to recognize that there is a reason we perceive the world through our five senses: our brains are built to keep us alive and able to reproduce. Nothing more. All the bonus tracks in our brain’s Greatest Hits playlist … making art, considering the cosmos, perceiving a future and a past … these are all artifacts of a consciousness that moves our limbs through space.

To some people, perception — the transmitting of all the sensory noise from the world — seemed the natural boundary between world and mind. Clark had already questioned this boundary with his theory of the extended mind. Then, in the early aughts, he heard about a theory of perception that seemed to him to describe how the mind, even as conventionally understood, did not stay passively distant from the world but reached out into it. It was called predictive processing.

Predictive processing starts with our bodies. For instance, we don’t move our arm when it’s at rest. We imagine it moving — predict its movement — and when our arm gets the memo it responds. Or not. If we are paralyzed, or that arm is currently in the jaws of a bear, it sends the bad news back to our brains. And so it goes.

In a similar way we project this feedback loop out into the world. But we are limited by our own sense of it.

Domestication of canines was such a game-changer because we suddenly had assistants with different senses and perceptions. Together humans and dogs became a Dynamic Duo … A prehistoric Batman and Robin. But Robin always knew who was the alpha in this relationship.

Right now another domestication is taking place. It’s not of a plant or an animal, but of a complicated digital application. If that seems a stretch, if grouping these three together (plants, animals and applications) seems odd, keep in mind that domesticating all of them means altering digital information.

All Life Is Digital

Plants and animals have DNA, or deoxyribonucleic acid. They are alive because they have genetic material. And guess what? It’s all digital. DNA encoding uses four bases: G, C, T, and A. These are four concrete values, expressed in the complex combinations that make us both living and able to pass along our “usness” to new generations. We’re definitely more complicated than the (currently) binary underpinnings of A.I. But as we’ve seen, A.I. is really showing us humans up in some important ways.
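To make that “digital” claim concrete, here’s a toy sketch of my own (not from any of the sources above): each of the four bases fits into two bits, so a strand of bases is, quite literally, a binary string. The particular bit assignments below are arbitrary, chosen just for illustration.

```python
# Toy illustration: each of the four DNA bases fits in two bits,
# so a sequence of bases is, quite literally, digital information.
BASE_TO_BITS = {"G": "00", "C": "01", "T": "10", "A": "11"}

def encode(sequence: str) -> str:
    """Encode a DNA sequence (e.g. 'GATTACA') as a string of bits."""
    return "".join(BASE_TO_BITS[base] for base in sequence.upper())

print(encode("GATTACA"))  # -> 00111010110111 (14 bits for 7 bases)
```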

They’re killing us humans at chess. And Jeopardy.

So: Will A.I. become conscious and take us over? Clark would say consciousness is beyond A.I.’s reach, because as impressive as its abilities to move through the world and perceive it are, even dogs have more of an advantage in the consciousness department. He would be backed up by no less than Nobel Prize-winning economist Daniel Kahneman, of Thinking, Fast and Slow fame. I got to hear him speak on this subject live, at a New Yorker TechFest, and I was impressed and relieved by how sanguine he was about the future of A.I.

Here’s where I need to bring in the other article, a much briefer one, from The Economist. Robots Can Assemble IKEA Furniture sounds pretty ominous. It’s a modern trope that assembling IKEA furniture is an unmanning intellectual test. But the article spoke more about A.I.’s limitations than its looming existential threats.

First, it took the robots a comparatively long time to complete the task at hand. In the companion piece to that article, we read that …

Machines excel at the sorts of abstract, cognitive tasks that, to people, signify intelligence—complex board games, say, or differential calculus. But they struggle with physical jobs, such as navigating a cluttered room, which are so simple that they hardly seem to count as intelligence at all. The IKEAbots are a case in point. It took a pair of them, pre-programmed by humans, more than 20 minutes to assemble a chair that a person could knock together in a fraction of the time.

Their struggles brought me back to how consciousness gradually materialized for our prehistoric ancestors. It arrived not in spite of our sensory experience of the world, but specifically because of it. If you doubt that, just consider the natural and clear way I described the arrival of consciousness a moment ago: I said it materialized. You understood this as a metaphor rooted in our perception of the material world.

This word, and others we use to describe concepts, play on our ability to feel things. Need another example? This is called a goddamn web page. What’s a page? What’s a web? They’re both things we can touch and experience with our carefully evolved senses.

And without these metaphors these paragraphs would not make sense.

Yes, our ancestors needed the necessary but not sufficient help of things like cooking, which enabled us to take in enough calories to grow and maintain our complex neural network, and the domestication of animals and plants that led us to agriculture and an escape from the limitations of nomadic hunter-gatherer tribes (I strongly recommend Guns, Germs and Steel: The Fates of Human Societies for more on this), but …

To gain consciousness, we also needed to feel things. And what do we call people who don’t feel feelings? Robots. “Soulless machines.”

Should A.I. nonetheless take over the world without ever evolving to feel, it’s unlikely they will be assembling their own IKEA chairs with alacrity. They’ll make us do it for them, because our predictive processing makes this type of task annoying but manageable. We can even get faster at it over time.

It’s All About The Feels

But worry not. Our enslavement won’t happen because — and I’m feeling pretty hubristic myself as I write this — we’re the feelers, the dreamers, the artists. Not A.I.

Before we domesticated dogs, we were limited in where in the world we could roam, and in the game we could hunt. After dogs, we progressed. We prospered. Dogs didn’t put us out of jobs, if you will; they took on the jobs they were better at, in our service. Inevitably, we found other ways to use our time, including becoming creatures closer to the humans we would recognize on the street today, or staring back at us in the mirror.

We are domesticating A.I. Never forget that.

And repeat after me: We have nothing to fear but moral panic itself.