WASHINGTON, Aug. 28, 2017 — In middle school, Junior Alvarado often struggled with multiplication and earned poor grades in math, so when he started his freshman year at Washington Leadership Academy, a charter high school in the nation's capital, he fretted that he would lag behind.
But his teachers used a computer to identify his weak spots, customize a learning plan just for him and coach him through it. This past week, as Alvarado started sophomore geometry, he was more confident in his skills.
“For me, personalized learning is having classes set at your level,” Alvarado, 15, said in between lessons. “They explain the problem step by step, it wouldn’t be as fast, it will be at your pace.”
As schools struggle to raise high school graduation rates and close the persistent achievement gap for minority and low-income students, many educators tout digital technology in the classroom as a way forward. But experts caution that this approach still needs more scrutiny and warn schools and parents against being overly reliant on computers.
The use of technology in schools is part of a broader concept of personalized learning that has been gaining popularity in recent years. It's a pedagogical philosophy centered on the interests and needs of each individual child as opposed to universal standards. Other features include flexible learning environments, customized education paths and letting students have a say in what and how they want to learn.
Under the Obama administration, the Education Department poured $500 million into personalized learning programs in 68 school districts serving close to a half million students in 13 states plus the District of Columbia. Large organizations such as the Bill and Melinda Gates Foundation have also invested heavily in digital tools and other student-centered practices.
The International Association for K-12 Online Learning estimates that up to 10 percent of all America’s public schools have adopted some form of personalized learning. Rhode Island plans to spend $2 million to become the first state to make instruction in every one of its schools individualized. Education Secretary Betsy DeVos also embraces personalized learning as part of her broader push for school choice.
Supporters say the traditional education model, in which a teacher lectures at the blackboard and then tests all students at the same time, is obsolete and doesn’t reflect the modern world.
“The economy needs kids who are creative problem solvers, who synthesize information, formulate and express a point of view,” said Rhode Island Education Commissioner Ken Wagner. “That’s the model we are trying to move toward.”
At Washington Leadership Academy, educators rely on software and data to track student progress and adapt teaching to enable students to master topics at their own speed.
Digital tool finds problem
This past week, sophomores used special computer programs to take diagnostic tests in math and reading, and teachers then used that data to develop individual learning plans. In English class, for example, students reading below grade level would be assigned the same books or articles as their peers, but complicated vocabulary in the text would be annotated on their screen.
“The digital tool tells us: We have a problem to fix with these kids right here and we can do it right then and there; we don’t have to wait for the problem to come to us,” said Joseph Webb, founding principal at the school, which opened last year.
Webb, dressed in a green T-shirt reading “super school builder,” greeted students Wednesday with high-fives, hugs, and humor. “Red boxers are not part of our uniform!” he shouted to one student, who responded by pulling up his pants.
The school serves some 200 predominantly African-American students from high-poverty and high-risk neighborhoods. Flags of prestigious universities hang from the ceiling and a “You are a leader” poster is taped to a classroom door. Based on a national assessment last year, the school ranked in the 96th percentile for improvement in math and in the 99th percentile in reading compared with schools whose students scored similarly at the beginning of the year.
It was one of 10 schools to win a $10 million grant in a national competition aimed at reinventing American high schools, funded by Laurene Powell Jobs, widow of Apple co-founder Steve Jobs.
Naia McNatt, a lively 15-year-old who hopes to become “the African-American and female Bill Gates,” remembers feeling so bored and unchallenged in fourth grade that she stopped doing homework and her grades slipped.
At the Academy, “I don’t get bored ‘cause I guess I am pushed so much,” said McNatt, a sophomore. “It makes you need to do more, you need to know more.”
In math class, McNatt quickly worked through quadratic equations on her laptop. When she finished, the system spat out additional, more challenging problems.
Her math teacher, Britney Wray, says that in her previous school she was torn between advanced learners and those who lagged significantly. She says often she wouldn’t know if a student was failing a specific unit until she started a new one.
In comparison, the academy’s technology now gives Wray instant feedback on which students need help and where. “We like to see the problem and fix the problem immediately,” she said.
Still, most researchers say it is too early to tell if personalized learning works better than traditional teaching.
A recent study by the RAND Corporation found that personalized learning produced modest improvements: a gain of about 3 percentile points in math and a smaller, statistically insignificant gain in reading compared with schools that used more traditional approaches. Some students also complained that collaboration with classmates suffered because everybody was working on a different task.
“I would not advise for everybody to drop what they are doing and adopt personalized learning,” said John Pane, a co-author of the report. “A more cautious approach is necessary.”
The new opportunities also pose new challenges. Pediatricians warn that too much screen time can come at the expense of face-to-face social interaction, hands-on exploration, and physical activity. Some studies have shown that students may learn better from books than from computer screens, while one found that keeping children away from screens for five days in a row improved their emotional intelligence.
Some teachers are skeptical. Marla Kilfoyle, executive director of the Badass Teachers Association, an education advocacy group, agrees that technology has its merits, but insists that no computer or software should ever replace the personal touch, motivation and inspiration teachers give their students.
“That interaction and that human element are very important when children learn,” Kilfoyle said. (VOA)
When a CIA-backed venture capital fund took an interest in Rana el Kaliouby’s face-scanning technology for detecting emotions, the computer scientist and her colleagues did some soul-searching — and then turned down the money.
“We’re not interested in applications where you’re spying on people,” said el Kaliouby, the CEO and co-founder of the Boston startup Affectiva. The company has trained its artificial intelligence systems to recognize if individuals are happy or sad, tired or angry, using a photographic repository of more than 6 million faces.
Recent advances in AI-powered computer vision have accelerated the race for self-driving cars and powered the increasingly sophisticated photo-tagging features found on Facebook and Google. But as these prying AI “eyes” find new applications in store checkout lines, police body cameras and war zones, the tech companies developing them are struggling to balance business opportunities with difficult moral decisions that could turn off customers or their own workers.
El Kaliouby said it’s not hard to imagine using real-time face recognition to pick up on dishonesty — or, in the hands of an authoritarian regime, to monitor reaction to political speech in order to root out dissent. But the small firm, which spun off from a Massachusetts Institute of Technology research lab, has set limits on what it will do.
The company has shunned “any security, airport, even lie-detection stuff,” el Kaliouby said. Instead, Affectiva has partnered with automakers trying to help tired-looking drivers stay awake, and with consumer brands that want to know whether people respond to a product with joy or disgust.
Such queasiness reflects new qualms about the capabilities and possible abuses of all-seeing, always-watching AI camera systems — even as authorities are growing more eager to use them.
In the immediate aftermath of Thursday’s deadly shooting at a newspaper in Annapolis, Maryland, police said they turned to face recognition to identify the uncooperative suspect. They did so by tapping a state database that includes mug shots of past arrestees and, more controversially, everyone who registered for a Maryland driver’s license.
Initial reports said police had turned to facial recognition because the suspect had damaged his fingerprints in an apparent attempt to avoid identification. That turned out to be incorrect; police said they used facial recognition because of delays in getting fingerprint identification.
In June, Orlando International Airport announced plans to require face-identification scans of passengers on all arriving and departing international flights by the end of this year. Several other U.S. airports have already been using such scans for some departing international flights.
Chinese firms and municipalities are already using intelligent cameras to shame jaywalkers in real time and to surveil ethnic minorities, subjecting some to detention and political indoctrination. Closer to home, the overhead cameras and sensors in Amazon’s new cashier-less store in Seattle aim to make shoplifting obsolete by tracking every item shoppers pick up and put back down.
Concerns over the technology can shake even the largest tech firms. Google, for instance, recently said it will exit a defense contract after employees protested the military application of the company’s AI technology. The work involved computer analysis of drone video footage from Iraq and other conflict zones.
Similar concerns about government contracts have stirred up internal discord at Amazon and Microsoft. Google has since published AI guidelines emphasizing uses that are “socially beneficial” and that avoid “unfair bias.”
Amazon, however, has so far deflected growing pressure from employees and privacy advocates to halt sales of Rekognition, a powerful face-recognition tool it sells to police departments and other government agencies.
Saying no to some work, of course, usually means someone else will do it. The drone-footage project involving Google, dubbed Project Maven, aimed to speed the job of looking for “patterns of life, things that are suspicious, indications of potential attacks,” said Robert Work, a former top Pentagon official who launched the project in 2017.
While it hurts to lose Google because they are “very, very good at it,” Work said, other companies will continue those efforts.
Commercial and government interest in computer vision has exploded since breakthroughs earlier in this decade using a brain-like “neural network” to recognize objects in images. Training computers to identify cats in YouTube videos was an early challenge in 2012. Now, Google has a smartphone app that can tell you which breed.
A major research meeting — the annual Conference on Computer Vision and Pattern Recognition, held in Salt Lake City in June — has transformed from a sleepy academic gathering of “nerdy people” to a gold rush business expo attracting big companies and government agencies, said Michael Brown, a computer scientist at Toronto’s York University and a conference organizer.
Brown said researchers have been offered high-paying jobs on the spot. But few of the thousands of technical papers submitted to the meeting address broader public concerns about privacy, bias or other ethical dilemmas. “We’re probably not having as much discussion as we should,” he said.
Not for police, government
Startups are forging their own paths. Brian Brackeen, the CEO of Miami-based facial recognition software company Kairos, has set a blanket policy against selling the technology to law enforcement or for government surveillance, arguing in a recent essay that it “opens the door for gross misconduct by the morally corrupt.”
Boston-based startup Neurala, by contrast, is building software for Motorola that will help police-worn body cameras find a person in a crowd based on what they’re wearing and what they look like. CEO Max Versace said that “AI is a mirror of the society,” so the company chooses only principled partners.