By fostering the original hacker culture centered on innovation and mad programming skills, we could have the next Steve Jobs. But can schools make the shift?
DefCon, the annual hacker gathering in Las Vegas, marked a first this year: a kids’ track. With legendary hacker and DefCon founder Jeff Moss, aka “The Dark Tangent,” on hand to welcome them, the kids—about 100 budding code crackers aged 8 to 16—fanned out to “classrooms” in the Rio hotel to attend demos and presentations, with a 10-year-old California girl’s zero-day find, a vulnerability in a mobile game, among the highlights.
Workshops covered picking locks—the school locker variety—Google hacking, and coding in Scratch, a programming language for children. In a session entitled “Meet the Feds,” kids were invited to press the flesh with agents from the Department of Homeland Security and the National Security Agency and chat with them about “intelligence gathering, cyber weapons, war strategy, and more.”
It’s an interesting convergence, with the global exploits of the hardcore hacker groups Anonymous and LulzSec in the background. But hacking—in various forms—is another way in which kids are engaging with technology, and one that’s evolving.
At University Laboratory High School in Urbana, IL, students would, on occasion, mess with the system, says Frances Jacobson Harris, Uni High’s librarian. In the mid ’90s, when the school began its technology ethics program, kids were inclined to get into school files and “poke around,” she says. There were no handheld devices in those days, nor were there any filters.
In one instance students acquired the root password to the school’s system. But this was done via a key catcher (a hardware device that records keystrokes), so it wasn’t hacking per se, says Jacobson Harris. With no harm done, the ensuing discussions centered on whether that outcome made the kids’ actions ethical, but didn’t address Black Hat vs. White Hat activities, she says, alluding to the two poles of hackerdom—criminal, malicious hacking on one end and benevolent activity on the other.
Still, those kids possessed a certain savvy in terms of what’s under the hood with technology, adds Jacobson Harris. And among them were genuine nerds, “Unix heads.” “Now kids use tech, but they’re much more consumers,” she says of the generation enthralled with Facebook and the iPad. “But the arc may be turning again.”
Evidence of that may be Neelam (pseudonyms used for students), a 14-year-old high school freshman, who says he’s into hacking for the fun of it and “doing what isn’t supposed to be done.” As for his specific exploits, “it depends on what you consider hacking,” says Neelam, pausing. “If it means voiding a warranty, then almost everything I do is considered hacking. I’ve jailbroken a few dozen iPods. I’ve also messed with Wiis and installed homebrew channels and such.”
Then there’s Jarrett. Like Neelam, the 13-year-old eighth grader is an avid gamer and reader. But aside from some modest hardware mods, Jarrett doesn’t consider himself a hacker, though he aspires to be one. Why? Beyond the excitement factor, he says, “hacking requires a lot of time to learn, so it’s a special skill.”
So what exactly is hacking?
As Neelam implies, hacking carries multiple meanings. In the most dispassionate definition—from the online manifesto “How to Become a Hacker” penned by renowned open source software advocate Eric S. Raymond—it’s “the practice of modifying the features of a system in order to accomplish a goal outside of the creator’s original purpose.” But thanks to the quasi-political “hacktivist” campaigns of Anonymous (and, to a lesser degree, the breaking-and-entering types and copyright violators derisively called “crackers”), hacking has acquired a bad name. Purists claim that the shared culture around innovation and mad programming skills that’s been around since the early days of computer technology has been overshadowed by the Black Hats, with hacking becoming synonymous with malicious and criminal activity.
It’s the original creative spirit and emphasis on skill building and inquiry that Chris Hoff is hoping to instill in the next generation. The senior director and security architect at computer networking company Juniper Networks, Hoff was on hand at DefCon Kids to demo Scratch. At a similar set of regional conferences that he runs, called HacKid, children and teens get hands-on time with a range of projects, including photography and building trebuchets, as well as basic circuitry and robotics. Food got hacked, too, as attendees picked apart that curious comestible, freeze-dried ice cream.
“It’s what you would have liked to have been exposed to at age eight,” says Hoff proudly about the opportunity to tinker and explore. It’s exercising the same curiosity as the Uni High kids, Neelam, and Jarrett, but with guidance.
“There’s nothing we can do at a mass-market level to dispel the stereotypes of hacking. The nature of a hack is all a function of the person who’s doing the activity,” says Hoff, a father of four. Take lock picking. “People react to that,” he says. “But at HacKid and DefCon Kids, the first thing they talked about were the ethical considerations. There were two rules: don’t pick any lock that’s not yours, and don’t pick any lock that you depend on.” After setting the groundwork with these guidelines, the kids really got into the physics of locks and, along with their parents, confronted assumptions about security and learned how to better secure their lockers at school.
Instructions for just about everything make their way to the Internet, says Hoff. “Would you, as a parent, want to educate your kids or do you want them going to YouTube and picking locks with no guidance?”
Youngsters introduced to programming at the October 2010 HacKid in Boston (fully half were girls) were able to see how the online applications they use every day are made from the same languages they were working with. And even those attendees with little technology experience, Hoff says, ended up “learning at a pace that’s commensurate with a digital native.”
That’s one golden outcome, but it raises the question of education’s role. In his 2010 book Program or Be Programmed, media expert Douglas Rushkoff cited the critical need for users, especially young people, to have a deeper knowledge of technology beyond simply using commercial applications. But are districts, hampered as they are by budget cuts that impact professional development, prepared to teach it?
Even in the presence of knowledgeable educators, schools are hard pressed to keep up, given scant support and the relentless pace of innovation and product cycles that govern relevancy around technology. Gabe, a high school sophomore and accomplished programmer with several applications in the App Store, reports that 80 percent of his middle school computer class was devoted to learning Microsoft Office—the 2002 version, at that. Lucky for him, his parents stepped in and got him into the Apple Developer Program. As for Neelam and Jarrett, they’re largely self-taught. “Google helps a lot,” says Neelam.
Columbia University professor Chris Wiggins calls this a “Sputnik moment” for computer science. Decades of development of the Internet, cloud computing, and open source tools have opened the path to innovation, and young adults are energized about careers in engineering. But will their primary and high school experiences help get them there? For his part, Chris Hoff is willing to partner with schools on regional HacKid programming, but even he doesn’t think that’s the solution. Technology education, perhaps, is ripe for a hack of its own.