How to Remember High Volumes of Information Quickly - 12 Principles
Based on Justin Sung's video on YouTube. If you like this content, support the original creators by watching, liking and subscribing to their content.
Briefing
Strong memory isn’t mainly about “trying harder”—it’s about encoding: how the brain organizes new information into a structure it can later retrieve. When information can’t be placed into a meaningful network, the brain treats it as waste and discards it, creating a cycle of forgetting and re-learning that burns hundreds of hours. The core message is that efficient learners stop fighting this process and instead make new material relevant on purpose, turning “irrelevant” inputs into usable connections.
The first principle is to stop forcing raw memorization. Encoding works best when the brain can see where something fits, like shelving a book according to a library's logic. When learners repeatedly push information into long-term memory without building that fit, they end up with what's described as "WTE memorization" (smashing information in without helping it connect). That approach doesn't just fail to store knowledge; it also produces "learning debt." If a learner postpones the hard work of making something relevant, their future self must pay that debt by re-learning the same material again and again, because it still doesn't belong anywhere.
To prevent that debt, the framework shifts to a two-phase learning rhythm: consuming and digesting. “Don’t overeat information” means digesting continuously—taking small chunks during lectures or reading, then doing quick synthesis (a mini mind map, a short summary, a few connections) before moving on. This creates a snowball effect: early understanding becomes anchor points that make later concepts easier to integrate. By contrast, consuming for long stretches and digesting all at once overloads working memory and makes integration feel overwhelming.
The next set of rules turns digestion into repeatable tactics. Learners are urged to simplify everything first, then compare it—either with known concepts or with other new ideas—to generate meaning through similarities and differences. After that comes connecting: mapping influences and implications so the information becomes part of a growing network. Grouping follows, where patterns of shared traits compress a complicated web into a more memorable structure. These steps are framed as a habit of “thinking hard,” not a one-time trick.
The method also emphasizes iteration. Encoding never ends because new details can reveal gaps or errors in earlier groupings, forcing learners to restructure their networks. Better analogies can accelerate the same process—if they stay comprehensive, simple, and accurate—because building an analogy requires simplifying, comparing, connecting, and grouping. Notes are treated as an offload system: writing externalizes the mental juggling, makes connections visible, and helps identify weak or disconnected “straggly” items without constant testing.
Finally, knowledge is treated as a hypothesis. Learners should challenge their own structures constantly so early mistakes don’t harden into rigid foundations. The overall takeaway is blunt: there’s no shortcut that replaces effort, but there is a reliable path—make information relevant, digest it in small cycles, and keep rebuilding the network as understanding deepens.
Cornell Notes
Efficient memory depends on encoding—how the brain organizes new information into a meaningful structure. When information can't be placed into a network, it gets discarded, leading to "learning debt," where future time is wasted re-learning material that never became relevant. The framework prevents that debt by digesting continuously: consume small chunks, then simplify, compare, connect, and group them into a growing mental network. Notes and better analogies support this process by offloading working memory and making connections visible. Because new information can expose gaps or mistakes, encoding is an ongoing cycle, and learners should treat their knowledge structures as hypotheses to be challenged and updated.
- What does "stop fighting your brain" mean in practice, and why does it matter for memory?
- How does "learning debt" form, and how does it trap learners over time?
- What does "don't overeat information" look like during lectures or reading?
- How do the tactics "simplify, compare, connect, group" work together?
- Why are notes treated as an "offload," and how can they reveal weaknesses without extra testing?
- What does it mean to "challenge your hypothesis constantly," and how does that prevent rigid knowledge?
Review Questions
- Which specific behaviors create learning debt, and what early “digesting” step would interrupt that cycle?
- In what order should a learner apply simplify, compare, connect, and group, and what does each step produce in the mind map?
- How can notes help identify weak knowledge areas even before any quiz or test?
Key Points
1. Encoding improves when new information is organized into a meaningful structure rather than forced into memory without relevance.
2. Avoid "learning debt" by doing the relevance work soon—turn irrelevant inputs into relevant connections during the learning session, not later.
3. Digest continuously: consume small chunks and synthesize immediately with mini summaries or mind-map fragments.
4. Use a repeatable thinking sequence—simplify, compare, connect, then group—every time new information arrives.
5. Train "thinking hard" until it becomes habitual; efficiency comes from doing the mental work, not from speed-reading or brute-force flashcards.
6. Keep restructuring: encoding is iterative, and new details can reveal gaps or errors in earlier networks.
7. Use notes as offload and diagnostic tools, and treat knowledge as hypotheses that must be challenged and updated.