I really don’t like inaugural poetry

Does that make me a bad person? Does it mean I agree with Eric Cantor?!

It’s not that I don’t like the poems themselves. Richard Blanco’s “One Today” has its moments, with its “millions of faces in morning mirrors” and its “plum blush of dusk”. But it reads a bit like a geo-politico-linguistic checklist: “One sun rose on us today, kindled over our shores/ peeking over the Smokies, greeting the faces/ of the Great Lakes, spreading a simple truth/ across the Great Plains, then charging across the Rockies”; “Hear: the doors we open for each other all day, saying: hello, shalom/ buon giorno, howdy, namaste, or buenos días in the language my mother taught me”.

This is not to say that Blanco’s multiculturalism, or, for that matter, President Obama’s, is at all insincere. But in an event as pompous and circumstantial as the presidential inauguration, decisions cannot help but be political. I couldn’t shake the feeling that the James Taylor-Kelly Clarkson-Beyonce triumvirate represented some sort of dream voting coalition.

And poetry, to its credit, is an art form that does not hold up well to political posturing. A great poem challenges the idea that those in power can control narratives. I think what we got yesterday could be more aptly called an inaugural story, which isn’t necessarily a bad thing. But I get the feeling that Blanco would’ve been happier writing about his own family, rather than gauzily trying to encapsulate all Americans’. That wouldn’t have been such a bad thing, either.

A different kind of cabinet diversity

Since before the new year, many have questioned the growing homogeneity of the Obama cabinet. The uproar really took hold on Tuesday, when The New York Times ran a picture of the current cabinet in the Oval Office with the president. Annie Lowrey questioned the cabinet’s “all-male look”; Valerie Jarrett, or rather her left leg, is the only visible representative of female influence.

Peekaboo!

Criticism of the lack of diversity in the new Obama cabinet is, for the most part, well-taken. The ranks of white men in power are only growing. Pending Senate confirmations, John Kerry will take over for Hillary Clinton as Secretary of State, Chuck Hagel for Leon Panetta as Secretary of Defense, and Jack Lew for Tim Geithner as Secretary of the Treasury. The disappearance of female and minority voices isn’t just bad politics, it’s bad policy, depriving President Obama of insight grounded in a range of different viewpoints. Ironically, it was Obama’s own unique viewpoint that made him such a transformational candidate in his first campaign. Or, as Margaret Carlson writes in Bloomberg View,

“Obama suffers from Groucho Marx syndrome: He favors those in the club he doesn’t belong to.”

But Michael Hirsh of National Journal describes a different kind of diversity being squashed in Obama’s second term. If Hagel and Kerry are confirmed, then former senators will be serving as President, Vice President, Secretary of State, and Secretary of Defense, all at the same time. (Of course, the first three posts were already filled by ex-senators, as Hillary Clinton left her seat to lead the State Department.) Hirsh writes that “any future ‘Obama doctrine’ will also be a Kerry-Hagel-Biden doctrine”, and that the president is likely to seek counsel in his former Senate mentors. This is a good thing- the more counselors, the better.

But I think it’s important to remember that cabinet members, like their bosses, aren’t just hired to think. They’re hired to lead. As one of three presidents elected directly from the Senate- John Kennedy and Warren Harding were the others- Obama has been dogged by criticism that he lacked the executive experience necessary for the White House. While some of that flak was probably just misplaced opposition to Obama in general, it hasn’t died out. Indeed, after Vice President Biden took the lead in 11th-hour fiscal cliff negotiations, the only praise offered to Obama in The Washington Post’s “On Leadership” column was that he stayed out of Biden’s way. Hopefully the legislative team atop the executive branch will be not only trenchant thinkers, but effective leaders.

Les Misérables and the way we talk about capitalism

[SPOILER ALERT]

For a two-hour-and-forty-minute movie, Les Misérables moves fast. Watching it was a bit like watching the Harry Potter movies after reading the books: there was so much to cram in- 17 years’ worth in Les Mis– that one couldn’t help but wish there had been more fleshing out. (I haven’t actually read Victor Hugo’s extraordinarily long original, but still.)

But that doesn’t mean everything’s given equal time. What’s glossed over can tell us almost as much as what’s included. I most wish we had been given more details of Jean Valjean’s life between his release from prison and his time as mayor and factory owner, not least because it would have meant more of Hugh Jackman singing. Valjean’s unlikely success catalyzes every major event in the movie. Because of his

“Stars, with our multitudes/ most of us belong in this movie/ but one of us doesn’t…”

place atop the corporate ladder, he delegates factory operations to his foreman in order to deal with a particular sort of regulator, namely the sort, middle-finger-to-bass-singers-edly portrayed by Russell Crowe, that hunts him as an ex-con who’s broken parole. Without the benevolent Valjean paying attention, the foreman fires Fantine (Anne Hathaway, awesome) for having a child out of wedlock, and she’s forced into prostitution to support her daughter. Yet Valjean’s wealth and status, which had led to Fantine’s tragic fall, let him provide a life of luxury for Cosette after her mother’s death.

In other words, capitalism plays an important, complex role in Les Mis, serving as both a motivator of tragedy and a savior from it. But we never find out how Valjean became so successful in the first place. In one scene, the just-released prisoner of 19 years steals silverware from a bishop, who lies to police to keep him from being re-incarcerated. In the next, Valjean is clean-shaven, powerful, and rich. That we don’t see Valjean’s rise is especially noticeable given the moving detail in which we see Fantine’s fall, which occurs just after it. This fall shows consumer capitalism at its very worst- Fantine reluctantly agrees to sell her hair, then a tooth, then her whole body. This culminates in “I Dreamed a Dream”, one of the best, saddest moments of the film.

Indeed, even though Valjean’s wealth allows him to avenge Fantine’s death by saving Cosette, capitalism in Les Mis is always presented in a negative light. The most enterprising characters in the movie are the innkeeper and his wife, hilariously portrayed by Sacha Baron Cohen and Helena Bonham Carter. (Aside from Russell Crowe, who, unlike this French magician, sucked, Les Mis is remarkably well-cast. Take a look at this picture showing the portrait of Cosette from the book and musical, Isabelle Allen as the young character in the film, and Amanda Seyfried as the older version.)

Nuts.

The Thénardiers milk their drunken customers for all they’re worth: “Charge ’em for the lice, extra for the mice/ two percent for looking in the mirror twice”. When Valjean comes to take Cosette out of their miserable custody, they do their best to get a good price for her, faking heartbreak though they can’t remember her name.

But just because Les Mis mocks capitalism doesn’t mean it presents an alternative. The working-class student revolutionaries all wind up dead except Marius, who in Eddie Redmayne’s rendition sounds like a well-trained Kermit the Frog, and who is saved from the barricade by Valjean. When Valjean himself dies, he goes to heaven to find the company singing the finale atop an enormous barricade: “Do you hear the people sing? Say, do you hear the distant drums?/ It is the future that they bring when tomorrow comes”. But tomorrow still hasn’t come. Heaven isn’t the fruits of revolution, but the glorious revolution itself.

The tragedy and triumph of private enterprise, together with the fact that no other system materializes, speak to today’s economy. Fantine’s story makes clear the need for a safety net to protect the most vulnerable among us, both in the workplace and out. But so does Valjean’s. Like the 2012 election, the film praises the self-made job creator to no end, and Valjean’s charity is the only force for good in a world of pimps, vast inequality, and Russell Crowe’s singing voice. But in Orléanist France as in our own time, not every ex-con can own a factory- a great many people who strike out on their own strike out. We don’t see Valjean’s rise, or how close he could’ve come to ruin. He remains otherworldly throughout the film, called a saint by Marius at his deathbed. Likewise, for all the talk of small businesses as engines of the economy, we rarely hear about the three out of four venture-backed startups that fail to return investors’ money. We should. Hearing about them would remind us that entrepreneurs don’t just need tax cuts and less regulation, but also some assurance that their risks are taken in good faith, and that someone will be there if they fail.

On bias

Question: What is the most biased mainstream news channel in the USA?

My copout answer:

Depends on what we mean by “biased”.

If we mean, “unbalanced representation of one major political party over the other”, then I’d say MSNBC, where there’s nary a Republican in sight (although I doubt they’re beating down the door trying to get in).

If we mean, “unbalanced questioning of representatives of one major political party over another”, then it’s gotta be Fox, which excels in pummeling its token liberals senseless.

But here’s the thing- everyone knows those two are biased, and they know we know. Yes, Fox still says “Fair and Balanced”, but sort of in the same way that Burger King still says “have it your way”. They’re not expecting you to come in there asking for a bespoke Whopper. It’s just a thing they say.

So if by “biased” we mean, “unbalanced reporting relative to perceived balance”, then CNN is the most biased. By branding themselves in the middle of this see-saw of crazy, they’ve somehow kept their status as “neutral”. The proof? CNN is what you see in airports. When you think waiting for a plane, think CNN.

But really, the biggest bias goes beyond what political party a network does or doesn’t support. It’s the extent to which they turn current events into political point-tallying and star-gazing. They’re like tabloids, only instead of the Kardashians, they have Barack Obama and Mitt What’s-His-Name. So they’re all biased- away from complex issues, towards (political) celebrity.

What “Shark Tank” is teaching me about entrepreneurship

As someone working towards founding a startup, and who studied something very different from startuppery in college, I’m always looking for places to learn the ropes. I’ve found a bunch of great free resources, like Quora, the hackers/founders and edtech meetups, and Skillshare. These outlets all require a bit of effort, which is great. But there’s also a resource that requires no effort, and that combines the best of the worst of TV, vaudeville, and venture capital. I’m referring, of course, to ABC’s “Shark Tank”.

As we’re reminded at the beginning of each episode, the sharks are five entrepreneurs-turned-investors. There’s Mark (Cuban), the goofy one, Daymond (John), the cool one, Kevin (O’Leary), the mean one, Barbara (Corcoran), the female one, and Robert (Herjavec), the dad. These five decide whether to invest in contestants’ companies, giving the entrepreneurs some healthy pummeling along the way. As on any good reality show, some lessons make themselves painfully clear through sheer repetition. (I’m thinking of Robert Irvine of “Restaurant: Impossible” yelling, “paint the walls white!” “cut your menu down to 10 items!” “get these Indian headdresses out of here!”) Here’s what I’ve learned so far:

1. Don’t overvalue your company. Almost every contestant comes in with an absurd valuation. This is probably because the show stipulates that contestants have to get a shark to invest the full asking amount, so they give themselves extra wiggle room on the shark’s equity stake, and thus the company’s value. Still, contestants get thrown out by the boatload because of ridiculous valuations. The looks Barbara, Robert, and Mark give to the peanut butter girls when they insist that

This is the face of the person that actually gave them the money.

they’re valuing themselves on future sales, not present sales, are priceless (6:08 in the video). Robert’s is the sort of look a dad gives when his daughter calls him a name he’s never heard before, and doesn’t know what it means, but doesn’t really care because she is not leaving this house in that.

2. Have sales. Related to #1. Later in the same episode, when the Mix Bikini bros waltzed in, it seemed like nothing could stop them, other than the fact that they forgot what equity they were asking for, they seemed slightly buzzed, and they had an aesthetically questionable idea to let girls mix and match bikini parts. (Did they learn nothing from Tri Mi Tank, which is not a Vietnamese city but a strap-swapping company that had sold 100 units before coming on the show?) But they also had no sales. Not so good.

3. Be awesome. Marian Cruz had spent 5 years making a very high-quality video about the TurboBaster, which does a lot of things to turkeys. She had no working prototype, no research about the size of her market, and no idea how much the product would cost to make. And yet Kevin Harrington, the sometimes-shark who

Tell me you don’t see it.

closely resembles an actual shark, gave her $35,000 for her whole company, leaving her a 2% royalty. Not exactly a sweetheart deal, but not bad for a video about a turkey baster. How did Marian Cruz do it? Because Marian Cruz is awesome.

4. Go on “Shark Tank”. The publicity helps. The “I Want to Draw a Cat for You” guy saw his sales go up 3000% after Mark Cuban gave him $25,000. Just to be clear, his business is drawing stick figures of cats.

So what can all this teach one (and by one, I mean me) about the road towards an internet startup? Not much, maybe. Numbers 1 and 2 don’t seem to apply. Online, a company can have value to investors without any revenue, and making a name for oneself comes before making sales- hell, Twitter still doesn’t have a business model. And even though Mark, Kevin, and Robert all made their fortunes in tech, you don’t see too many online startups on “Shark Tank”. Maybe the takeaway is to get publicity however you can, and to be awesome.

Did I mention I have a blog now?

Hello world

My name’s Jake, and I’m a grad student and startup person. I want to fix political debate and higher education. This is my blog.

I’ll be posting all sorts of cool stuff here. Original stuff. But I’m starting with stuff I wrote for SkilledUp, a New York edtech startup that specializes in online-learning, ass-kicking, and name-taking.

A History of Gamified Learning

“You have died of dysentery.” This phrase brings great nostalgia, and great sadness, to anyone whose life has been affected by The Oregon Trail, the PC game that has captivated young pioneers for decades. “What went wrong?”, we ask ourselves. “Should I have bought more provisions? Should I have left at a different time of year? Should I have tried to lug that entire bison?”

In The Oregon Trail, players learn about the hardships of pioneering in the context of a computer game.

EduGames: Where we’ve been

The Oregon Trail was more than just a ’90s pop-culture phenomenon. It was the first real fusion of education and video games. Indeed, the Oregon Trail itself serves as a nice metaphor for the history of high-tech educational games. The students are the pioneers, the games are the oxen, and the learning is Oregon. Or maybe the teachers are the pioneers, the students are the family members, and the games are the ammunition.

Okay, the metaphor isn’t perfect. But the history of learning games has truly been a journey into the unknown, whose ending has yet to be written. In this piece, we’ll trace this history from the teletype machine to the iPhone and beyond. This article draws on an excellent infographic by Knewton, a leading adaptive-learning startup.

The Oregon Trail may have started the revolution in PC-based learning games, but it wasn’t invented on a PC. In fact, when Don Rawitsch, Bill Heinemann, and Paul Dillenberger created the game in 1971, it was on a screen-less teletype machine connected to a mainframe computer. The original game used text prompts to cue players- for example, if a player chose to hunt, the computer printed out, “Type BANG.”

Three years later, The Oregon Trail became part of the software library of the Minnesota Educational Computing Consortium (MECC), a state-funded organization which would later become a private company. (Minnesota was known at the time as the “Silicon Valley of the Midwest”.) The MECC dominated the educational software market through the 1980s, thanks in no small part to its adoption of the Apple II computer.

But other innovators soon entered the educational game software world. One of the most notable was Brøderbund, which released Where in the World is Carmen Sandiego? in 1985. Carmen Sandiego introduced kids to a whole new way to learn geography, and a whole slew of humorously-named bad guys, like Joy Ryder and M.T. Pockets. Carmen Sandiego became so popular that it spawned its own children’s game show, starring Lynne Thigpen as the hard-nosed chief detective (“I’m the Chief. But you can call me…the Chief.”).

“Where in the World is Carmen Sandiego?” weaves geography education into a detective computer game.

What Carmen Sandiego and The Oregon Trail have in common is the creation of a virtual game world, in which the player learns while striving towards a distinct end goal- getting to Oregon or catching Carmen. In 1989, one game took the concept of world creation to new heights, while eliminating the clear-cut competition usually associated with games. It was called SimCity. 

The story of SimCity, like that of most groundbreaking games, is the story of a long shot that beat the odds. In the mid-80’s, when most videogames seemed to involve killing dinosaurs, jumping out of planes, or some combination of the two, creator Will Wright set his sights on what might be the world’s least-sexy activity: urban planning. The player acted as a city mayor, trying to save the city from traffic, a depleted water supply, or monsters. Gameplay could last forever, with new problems continuously arising. Despite its free-form structure and its seemingly boring subject, SimCity was a hit, and was used in over 10,000 classrooms. The game’s developers even got calls from the CIA and the Department of Defense about their innovative work in city-scale simulation. The release of SimCity 2000 in 1994 cemented the popularity of the game, and the importance of the “God Game” genre.

“SimCity” combines run-of-the-mill urban planning with the occasional alien attack to engage and teach players.

To be sure, many other styles of educational gaming software hit the market in these early days. Arcade-style PC games like Math Blasters, grade-specific games like Reader Rabbit, and computer-literacy games like Mario Teaches Typing also made a huge impact. But virtual-world games pointed towards the future of educational gaming.

The ability of games like SimCity and Civilization to simulate parallel, but not inconceivable, world histories is astonishing by any standard. But in the 1990s, neither of these games relied too heavily on the most important technological advancement of the day- the internet. The first major virtual learning world to make full use of the web was Whyville, created in 1999. Whyville is an expansive, avatar-based world, a bit akin to Second Life, but with a distinctly educational bent. The game is aimed at students ages 8-16, and encourages players to participate in all aspects of virtual society, from earning a living (using “clams” as currency) to serving as a city worker to writing for the game’s weekly newspaper, the Whyville Times. The game’s base of over 7 million players has enjoyed in-game concerts by the likes of the Jonas Brothers and Justin Bieber, and has learned about everything from science to business to the arts.

“Whyville” features a digital Greek Theater, where kids can learn from experts. Here, a noted singer/songwriter fields questions.

But virtual learning worlds would grow up fast. In 2002, The Serious Games Initiative was founded at the Woodrow Wilson International Center for Scholars, a think tank affiliated with the Smithsonian Institution. The initiative advocates the adoption of virtual games in both education and a host of other critical areas. Over the past decade, the group that has implemented serious games with the most enthusiasm has been the U.S. Army.

On the 4th of July, 2002, the Army released the first version of America’s Army for PC. The realistic first-person shooter gives users a semblance of the full deployment experience, complete with grenade launchers, humvees, and medical training. Although the Army acknowledges that the game was originally meant for recruitment and propaganda, it quickly became apparent that America’s Army was an ideal training tool for real-life soldiers. Soon, the game was being integrated into simulations that trained soldiers to operate ground-to-air missiles, tanks, and other weapons, sometimes serving as a 360-degree immersive virtual environment. The game has proven particularly useful for training small units, and has been updated over 26 times in the last decade, with versions for Xbox and mobile phones as well as desktops.

Army training programs make use of the true-to-life graphics of America’s Army to acclimate soldiers to combat conditions.

And America’s Army is far from the only training game used by the U.S. military. On June 15, 2012, the Army posted a call for proposals to build a new first-person shooter that builds on Virtual Battlespace 2, a training game developed with the help of the Marine Corps and the Australian Defence Force, among others. The fee offered to the successful contractor? $44.5 million over 5 years.

From Serious to Casual

So far, the games we’ve looked at make use of a few main features of gaming. First, like all games, they use some kind of points system, scoring players by the health of their passenger pioneers, the number of clams they’ve collected, or the number of enemy civilizations they’ve conquered. (SimCity is a notable exception, which helps explain why it was such a risky, revolutionary game.)

Second, they offer the player a simulation of a real-world atmosphere as a means of learning: The Oregon Trail lets you be a real pioneer, Carmen Sandiego a detective, SimCity a mayor, and America’s Army a soldier. Whyville simulates an entire world. The fidelity of the games becomes higher as the graphics improve- America’s Army does a far more convincing job of simulating its environment than The Oregon Trail- but the principle remains the same.

Third, these games provide some form of artificial intelligence. This feature is closely linked to simulation. In order for America’s Army to truly simulate warfare, enemy combatants have to be properly agile and hostile. Likewise (sort of), The Oregon Trail needs smart buffalo to give the young trailblazer something to shoot at.

Above all, these educational games offered a lush, immersive virtual world. Game graphics got more sophisticated with each technological advancement. But that would all change as the focus of educational games became more social, and more casual. Yet ironically enough, this social movement began, in a way, with Solitaire.

Is it weird to get nostalgic about the original Solitaire win screen?

Since 1990, Solitaire has come automatically installed on Microsoft Windows. Unlike the other ’90s-era games we’ve mentioned, Solitaire had no overarching narrative, and required no long-term commitment to the game. Solitaire was a smash hit, and has now been played by over 400 million people. The original release of the game was soon followed by PC games like FreeCell, Minesweeper, and Spider Solitaire. Suddenly, gaming was no longer for only the most “serious” consumers.

Casual games like Solitaire benefited from tech innovations not by becoming more immersive, but by becoming more social. In 2006, Apple announced that its iTunes store would offer cheap, social games for download to the iPod. The next year, Zynga began releasing casual social games for Facebook. Farmville would quickly become Zynga’s most popular offering, featuring low-fi graphics and a heavy dose of social interaction. Farmville players invite Facebook friends to be their neighbors, and ensure a good harvest by visiting their neighbors’ farms.

Farmville represents a landmark in the “gamification” of social networking. Another major landmark came with the 2009 launch of Foursquare, which gamified GPS, letting players earn badges and mayorships for visiting restaurants, bars, and businesses. It was only a matter of time before gamification moved to education.

Foursquare offers badges for "checking in" to various real-world places.

The most crucial lesson of gamification that education startups have taken to heart is the importance of badges. Video games reward players with badges for each progression they make, providing near-instant gratification. Likewise, successful ed-tech startups recognize that learners must be continuously made aware of their progress. Khan Academy, perhaps the most influential education startup of them all, uses heavy doses of game mechanics. In fact, KA President Shantanu Sinha wrote a guest column in the Huffington Post in February extolling the virtues of gamification in education. Khan students earn badges for the courses they take, and show off these badges on their profiles. Badges are divided into six types, with well over 100 in all. Students also earn “energy points,” although the metric for these might be a bit skewed: after one day and zero fully completed lessons, I had 1,267 points.

Students in Khan Academy can show off their energy points and badges to the world.
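For the curious, the points-and-badges mechanic is simple enough to sketch in a few lines of Python. The class, thresholds, and badge names below are all invented for illustration- they are not Khan Academy’s actual rules- but they capture the near-instant-gratification loop described above.

```python
# A minimal sketch of a points-and-badges system. All names and
# thresholds here are invented for illustration, not any real site's rules.

class Learner:
    def __init__(self, name):
        self.name = name
        self.energy_points = 0
        self.badges = []

    def complete_lesson(self, difficulty=1):
        # Points are awarded immediately -- near-instant gratification.
        self.energy_points += 100 * difficulty
        self._check_badges()

    def _check_badges(self):
        # Badges fire as soon as a point threshold is crossed.
        thresholds = [(500, "Getting Started"), (2000, "On a Roll")]
        for points, badge in thresholds:
            if self.energy_points >= points and badge not in self.badges:
                self.badges.append(badge)

learner = Learner("Jake")
for _ in range(6):
    learner.complete_lesson()

print(learner.energy_points)  # 600
print(learner.badges)         # ['Getting Started']
```

Note that the badge check runs after every lesson, not in a nightly batch- the whole point of the mechanic is that the reward arrives the moment you earn it.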

Likely influenced by Khan Academy, Codecademy, launched in late 2011, offers users a profile that purposefully resembles that of a video game, complete with points, badges, streaks, and a progress bar. These game elements have contributed to the site’s success; Codecademy reached one million users just five months after its launch, and received $10 million in Series B funding this June.

Learning on Codecademy can feel like playing a video game, if a video game taught you the basics of HTML.

The Future of Gamification in Education

As more and more people turn to multiple online sources for learning, they’ll accumulate more and more badges. Yet these badges are only truly valuable when they can be understood and compared by someone besides the badge-earner- say, an employer. To that end, Mozilla unveiled its Open Badges project in private beta last September. Open Badges will allow any site to issue a badge, and will enable badge earners to store their badges in an online “backpack” and display them to the public. The Mozilla project is backed by the MacArthur Foundation, and is already being used by NASA, Disney-Pixar, and others.
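To get a feel for the idea, here’s a rough Python sketch of a portable badge record, loosely modeled on the concept behind Open Badges. The field names and values are illustrative, not Mozilla’s actual spec.

```python
import json

# A rough sketch of a portable badge record, loosely modeled on the
# Open Badges idea of site-issued, publicly displayable badges.
# Field names and values are invented for illustration.

badge_assertion = {
    "recipient": "learner@example.com",
    "badge": {
        "name": "HTML Basics",
        "issuer": "example-learning-site.org",
        "criteria": "Completed the introductory HTML track",
    },
    "issued_on": "2013-01-22",
}

# Any site could emit records like this; a "backpack" service
# aggregates them so an employer can view them all in one place.
serialized = json.dumps(badge_assertion)
restored = json.loads(serialized)
print(restored["badge"]["name"])  # HTML Basics
```

The key design choice is that the record is plain data, issued by one site and readable by any other- which is what makes badges from different sources comparable in the first place.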

In a sense, the current explosion of points and badges is a continuation of the social scoring system used by traditional schools. Students compete for the best class rank, the best Latin honors, and the most cords at graduation. Universities compete with one another for a spot atop the leaderboards of college rankings. The problem is that these incentives aren’t always fun for everyone, and they can sometimes do more harm than good. Sinha, President of Khan Academy, puts it best:

“It’s almost like we felt we could scare students into studying by stamping them with bad grades, as if an ‘F’ is the scarlet letter to college admissions offices. Fear isn’t a particularly effective way to motivate someone.”

As educators continue to innovate and innovators continue to turn towards education, we’re likely to see more gaming aspects brought to the world of learning. One particular element that could well come to prominence is artificial intelligence. AI has long been the domain of video games, which pit the player against ‘the computer’, whether the virtual opponent takes the form of enemy combatants or a hapless bison. A less advanced form of AI exists in automatic grading technology like Scantron, which has become so commonplace in schools that I bet you didn’t know it’s actually the name of a company. While Scantron forms reduce all assessment to multiple-choice tests, future AI, developed with the insight of game makers, could evaluate complex tasks like those performed in video games. This would give teachers the freedom to dive into complex subject matter without having to worry about being overwhelmed by grading.
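To see how low the current bar is, Scantron-style grading amounts to comparing bubbles against a key. A toy Python sketch (with an invented answer key) makes the point:

```python
# Scantron-style grading reduces to comparing answers against a key.
# The answer key and responses below are invented for illustration.

answer_key = ["A", "C", "B", "D", "A"]

def grade(responses):
    # Count positions where the response matches the key, score as a fraction.
    correct = sum(1 for given, key in zip(responses, answer_key) if given == key)
    return correct / len(answer_key)

print(grade(["A", "C", "B", "A", "A"]))  # 0.8
```

Everything interesting about a student’s work- reasoning, process, partial credit- is invisible to this kind of comparison, which is exactly the gap that game-style AI evaluation would need to fill.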

A second element of gaming that could be brought to education in the future is collaborative competition. Too often, students learn subjects in “silos”- math from 9:00-10:00, biology from 10:00-11:00, history from 11:00-12:00, and so on. They take tests individually, and are chastised (often with good reason) for working together. But the world outside of the classroom doesn’t work like this. Professionals in every area, from business to design to teaching itself, must work with others to solve hard problems that bleed into many different areas. This kind of work isn’t going away- The Apollo Research Institute lists “transdisciplinarity” and “virtual collaboration” atop its list of skills that will be needed in the workplace in 2020. Video games draw on players’ competitive instincts to force them to work together, and education initiatives could do the same, all while harnessing the vast resources found at sites of learning both online and off.

Whatever the future looks like, we hope that education will be truly effective. We hope that elements of game psychology and mechanics will be used to reward lasting learning. And we hope it’ll be fun, too.

Originally posted here.