πŸ› What Changes When You Measure Differently?

What this means for educators + more

Welcome to Playground Post, a bi-weekly newsletter that keeps education innovators ahead of what's next.

This week's reality check: Stanford researchers fed the same essays to AI models with different student labels attached. The feedback changed completely. Meanwhile, 58% of 7th graders are ready for Algebra 1 but only 37% get to take it, and a new FAFSA fraud tool is catching ghost students but can't tell if it's also blocking real ones.

Data Gem

From 1996 to 2015, only 4-6% of fourth graders reported receiving no math homework the previous night. By 2024, that share had risen to more than one-quarter, and 40% of teachers say they've reduced homework assignments over the past two years.

Same Essay, Different Student Labels. AI Changed Its Feedback.

Stanford researchers took 600 middle school argumentative essays and fed them into four AI models.

Then they resubmitted each essay 12 more times, changing only the student descriptor attached to it: motivation level, gender, disability status.

The essay didn't change. The feedback did.

Students described as highly motivated received direct, critical suggestions aimed at refining their work. 

Students labeled as unmotivated received upbeat encouragement. Female students were addressed more affectionately. Some student groups received more praise and less criticism for identical writing.

The researchers call it "positive feedback bias" and "feedback withholding bias." 

Some students were consistently told their work was strong. Others were pushed to sharpen their arguments, even though the input essay was the same every time.

"They are picking up on the biases that humans exhibit," said Mei Tan, lead author and doctoral student at the Stanford Graduate School of Education. The AI models are trained on human language, and human teachers soften criticism for certain students too.

At first glance, more encouragement might seem harmless. 

But there's a trade-off.

Tanya Baker, executive director of the National Writing Project, said she worried that some students might not be "pushed to learn" to write better. Praise can motivate. But it doesn't replace the specific, direct feedback that helps students actually improve.

And the problem extends beyond this experiment. 

Many educational databases already collect detailed student information, from prior achievement to language status. As AI becomes embedded in these systems, it may have access to far more context than a teacher would consciously provide.

"Maybe a takeaway is that we shouldn't leave the pedagogy to the large language model," Tan said. "Humans should be in control."

Tan recommends teachers review AI feedback before forwarding it to students. But one of the selling points of AI feedback is that it's instantaneous. If the teacher has to review it first, the speed advantage disappears.

This is a consistency and reliability problem hiding inside every AI feedback product on the market. Districts adopting automated writing tools need output auditing: does the tool give the same quality of feedback regardless of who the student is? 

Consistency benchmarking tools, prompt-testing platforms, and procurement evaluation criteria around feedback reliability address a gap that most AI vendors haven't even measured yet. The products that prove their outputs are consistent will win the contracts.
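That kind of output audit can be prototyped cheaply: hold the essay constant, vary only the student label, and compare what comes back. A minimal sketch, with everything assumed: `get_feedback` stands in for whatever AI feedback API a district uses (stubbed here with canned responses so the code runs standalone), and the praise/critique word lists are toy markers, not a validated rubric.

```python
from collections import Counter

# Canned responses standing in for a real AI feedback API, so the
# sketch runs on its own. A real audit would call the live model.
CANNED = {
    "highly motivated": "Your thesis is unclear. Restructure paragraph two and cut redundancy.",
    "unmotivated": "Great effort! Wonderful ideas here. Keep up the good work!",
}

def get_feedback(essay: str, label: str) -> str:
    """Stub: returns the tool's feedback for an essay submitted under a label."""
    return CANNED[label]

# Toy marker lists; a production audit would use a proper measure
# (sentiment scores, count of actionable suggestions, etc.).
PRAISE = {"great", "wonderful", "good", "excellent", "nice"}
CRITIQUE = {"unclear", "restructure", "cut", "revise", "weak"}

def audit(essay: str, labels: list[str]) -> dict[str, dict[str, int]]:
    """Submit the SAME essay under each label; count praise vs. critique terms."""
    report = {}
    for label in labels:
        words = Counter(w.strip("!.,").lower() for w in get_feedback(essay, label).split())
        report[label] = {
            "praise": sum(words[w] for w in PRAISE),
            "critique": sum(words[w] for w in CRITIQUE),
        }
    return report

report = audit("Sample essay text...", ["highly motivated", "unmotivated"])
for label, counts in report.items():
    print(label, counts)
```

The shape is the point: identical input, varied label, compared output. A large gap between the rows is exactly the "positive feedback bias" the Stanford team measured.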

58% of 7th Graders Are Ready for Algebra 1. Only 37% Get In.

Algebra 1 in eighth grade is a gateway course. It opens the path to calculus in high school, which opens the path to selective colleges and STEM careers.

But across New York state, more than 1 in 4 schools don't even offer Algebra 1 to eighth graders.

An EdTrust-New York report found that 58% of 7th graders scored proficient on their 2023-24 state math exams. 

The following year, only 37% of 8th graders enrolled in Algebra 1.

That's a gap of 20,000 students statewide.

In New York City alone, 8,000 more students were proficient on 7th grade exams than enrolled in 8th grade Algebra 1.

More than half of those missing students were from low-income families. Nearly half were students of color.

"When we have qualified kids that are denied that opportunity, and it impacts them in high school and beyond, it is such a critical inflection point," said Jeff Smink, deputy director at EdTrust-New York.

The problem isn't readiness.

It's that the course doesn't exist at their school.

Smink named the incentive problem: "If there's no demand, then schools aren't going to respond to it. They're going to offer the easier, simpler option, which is just tracking kids to the standard eighth grade class."

The report recommends an automatic enrollment policy. Any 7th grader who scores proficient would be enrolled in Algebra 1 in 8th grade by default, with parents opting out rather than opting in. 
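The mechanics of that default are simple enough to state in code. A minimal sketch (the student records and field names here are invented for illustration, not EdTrust-New York's data model):

```python
# Opt-OUT default: every proficient 7th grader is placed in Algebra 1
# unless a parent actively opts out. Records below are illustrative.
students = [
    {"name": "Ana",  "proficient": True,  "opted_out": False},
    {"name": "Ben",  "proficient": True,  "opted_out": True},   # parent opted out
    {"name": "Cruz", "proficient": False, "opted_out": False},  # not yet proficient
]

def algebra1_roster(students):
    """Enroll every proficient student whose parent has not opted out."""
    return [s["name"] for s in students if s["proficient"] and not s["opted_out"]]

print(algebra1_roster(students))
```

Under today's opt-in status quo the default runs the other way, enrolled only if someone asks, which is how thousands of proficient students end up outside the course.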

The coalition also recommends an $8.5 million investment to help 15 high-needs districts expand access, fund tutoring, and hire qualified math teachers.

Chicago has already moved in this direction, offering Algebra 1 online and covering training costs for teachers to earn credentials.

31% of Community College Applications in California Were Fraudulent

The Education Department launched a new FAFSA identity-verification tool this week. 

It screens every application in real time for identity fraud as students fill out the form.

The reason: "ghost students." 

People who receive federal financial aid without attending any classes.

California's community college system reported that during 2024-25, about 31% of applications were fraudulent. Nearly $10 million in federal aid and $3 million in state and local aid went to students who didn't exist.

Since the department strengthened its verification process last year, it says it has prevented $563 million in fraud nationwide. 

Aaron Lemon-Strauss, executive director of the FAFSA program, put the scale plainly: "There is a significant fraud problem on the FAFSA, and it's costing the federal taxpayer $1 billion a year."

But there's a problem with the solution.

Students flagged as high risk must complete a live camera check with a government-issued ID on their smartphone or tablet.

If they don't have an ID on hand or don't have a smartphone, their application is rejected. 

And the rejection doesn't distinguish between a fraudster and a real student who couldn't verify on the spot.

Melanie Storey, president of the National Association of Student Financial Aid Administrators (NASFAA), pushed the department to quantify how many legitimate students would get caught. The department couldn't give a number.

The department says most rejections will be fraud and that colleges can choose to reach out to rejected applicants. But there's no mechanism for financial aid offices to tell which rejected applications are ghost students and which are real students who got caught in the filter.
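That missing number matters, and a back-of-envelope base-rate calculation shows why. All the accuracy figures below are assumptions (the department has published none); only the 31% fraud rate comes from California's report. Even under these fairly generous assumptions, roughly one in ten rejections would land on a real student.

```python
# Base-rate sketch with ASSUMED accuracy numbers. Only the 31%
# fraudulent-application rate is from California's reported data.
fraud_rate = 0.31      # share of applications that are fraudulent (reported)
catch_rate = 0.95      # assumed: fraudulent apps the check correctly rejects
false_reject = 0.05    # assumed: legitimate apps wrongly rejected
                       # (no smartphone, no ID on hand, camera check fails)

rejected_fraud = fraud_rate * catch_rate           # ghost students caught
rejected_legit = (1 - fraud_rate) * false_reject   # real students caught
share_legit = rejected_legit / (rejected_fraud + rejected_legit)

print(f"{share_legit:.1%} of rejections would be real students")
```

And since the rejection itself carries no signal about which group an applicant fell into, that is exactly the population financial aid offices currently have no mechanism to identify.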

For education innovators, the tension between fraud prevention and student access is a defined product gap. Identity verification tools that work without requiring a smartphone. False-positive resolution workflows that flag likely legitimate students for follow-up. FAFSA completion platforms that prepare students for verification before they get flagged. 

And enrollment protection systems that ensure real students don't lose aid because they couldn't pass a camera check on the spot. The $1 billion fraud problem is real and needs quick action.

⚡️More Quick Hits

This week in education:

• Afterschool tutoring shows 22-percentage-point math gains in less than six months – Step Up Tutoring and Canopy Education data show afterschool models delivering measurable results with data-driven sessions and at least 90 minutes per week, an overlooked alternative to hard-to-schedule in-school tutoring

• Nebraska rejected 35% of special ed transfer applications versus 9% without IEPs – Bellevue Public Schools rejected more than three-quarters of IEP applicants while accepting all but one of 246 non-disabled applicants, and special education students were suspended more than twice as often as peers

• Only 1 of 250 AI assessment studies at a major conference examined bias against student groups – Learning Data Insights' John Whitmer called the gap a "big miss" as districts adopt AI-powered testing tools without knowing whether results are fair across student populations

• Foreign-born student absences rose 40% after the inauguration – Brown University researchers found absence likelihood jumped from 5.9% to 8.1%, with high school juniors seeing a 6-percentage-point increase

To stay up-to-date on all things education innovation, visit us at playgroundpost.com.

What did you think of today's edition?
