Elevate App Review (2021) - Are Brain Training Apps Worth It?
Based on FromSergio's video on YouTube. If you like this content, support the original creator by watching, liking, and subscribing.
Brain training apps market broad cognitive benefits, including long-term memory and intelligence gains, but the transcript highlights major scientific skepticism about those claims.
Briefing
Brain training apps like Elevate, Peak, and Lumosity sell big promises—better memory, sharper intelligence, and even protection against Alzheimer’s—but the scientific record behind those claims remains thin. Major scientific critiques argue there’s no compelling evidence that these apps deliver the long-term cognitive benefits they advertise, and regulators have taken action against at least some companies for overstating results.
A 2014 letter from scientists said there was no compelling scientific evidence that brain training apps work as marketed, while also accusing some products of exploiting older adults’ fears about cognitive decline. A later review by psychologists in 2016 examined studies cited by these companies and argued that the research was cherry-picked—presenting isolated findings as if they were more common or more meaningful than the full body of evidence supports. Another 2018 study found short-term improvements in cognitive function, but it couldn’t establish lasting effects over the long run. Taken together, the evidence base doesn’t support confident claims about preventing brain aging or producing durable gains in intelligence or memory.
Against that backdrop, the transcript shifts from research to a hands-on test of Elevate. The reviewer used Elevate daily for 60 days, taking advantage of a 50% off offer for the first year rather than paying the full $40 per year upfront. Elevate’s structure is consistent: five timed games each day, grouped into four skill categories—reading, speaking, writing, and math. The app awards extra points for faster responses, offers dozens of game variations per category, and unlocks harder levels as performance improves. After each session, it provides a score and a progress graph, and it tracks overall performance against other users.
The experience is described as engaging and well-designed. The daily routine is enjoyable, and the reviewer's scores trend upward over time. But on the key question—whether cognitive function improves beyond the trained tasks themselves—the answer appears to be no. The reviewer notices no meaningful cognitive changes after 60 consecutive days, attributing the score increases to practice effects on the specific games rather than broader, long-term brain benefits.
The conclusion is conditional. Elevate may be worth it for people who want a fun, game-like app and enjoy daily practice. But it’s not a reliable substitute for evidence-based cognitive improvement, and the transcript criticizes the marketing for overselling outcomes. Renewal at full price is considered likely, not because of proven brain training benefits, but because the app is enjoyable to play. For users seeking results grounded in strong evidence, the transcript warns they may feel misled by the size of the claims.
Cornell Notes
Brain training apps such as Elevate, Peak, and Lumosity advertise sweeping benefits, but scientific critiques say the evidence for long-term cognitive gains is weak. Reviews and studies cited in the transcript point to a lack of compelling proof, possible cherry-picking of results, and findings limited to short-term improvements without clear durability. A 60-day personal trial of Elevate found rising scores and strong engagement, yet no obvious improvement in broader cognitive function—suggesting practice effects rather than lasting transfer. The takeaway: these apps may be worthwhile as entertaining daily games, but their marketing claims outpace the evidence for meaningful, long-term brain benefits.
- What kinds of claims do brain training apps make, and why do critics say those claims are hard to support?
- What evidence is cited for short-term benefits, and what's missing for long-term conclusions?
- How does Elevate's daily training work in practice?
- What did the 60-day Elevate trial show about performance and perceived cognitive change?
- When does the transcript suggest Elevate is "worth it"?
Review Questions
- Which critiques (2014 letter, 2016 review, 2018 study) are used to argue that brain training apps lack strong long-term evidence?
- What practice-effect explanation is offered for why scores improve in Elevate without clear cognitive transfer?
- How does Elevate’s structure (five timed games, four categories, scoring, progress graphs) shape what users can realistically expect?
Key Points
1. Brain training apps market broad cognitive benefits, including long-term memory and intelligence gains, but the transcript highlights major scientific skepticism about those claims.
2. A 2014 scientist letter says there's no compelling scientific evidence that these apps work as advertised and criticizes exploitation of older adults' anxiety.
3. A 2016 psychologist review argues companies cherry-pick studies to overstate how often benefits occur.
4. A 2018 study finds short-term cognitive improvements but cannot support long-term conclusions.
5. A 60-day Elevate trial reports rising scores and high engagement, yet no clear improvement in broader cognitive function.
6. The transcript distinguishes between enjoying a game-like app and expecting evidence-based cognitive training results.
7. Renewal is framed as likely for enjoyment, not because of proven cognitive enhancement.