From Personal Science Wiki
"Finally, we can reach MSE=11.1% with a modest 3-layer 16-unit LSTM model, a 28% improvement on the baseline, after a short training period." Many other apps have used machine learning to try to predict retention; neural-network approaches date back to at least 2013.
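As a quick sanity check on the quoted numbers: if an MSE of 11.1% represents a 28% improvement, the new error is 72% of the baseline, so the implied baseline MSE can be backed out directly (this arithmetic is an inference from the quote, not a figure reported by the source):

```python
# Back out the baseline implied by the quote: a 28% improvement
# means the LSTM's MSE is 72% of the baseline's MSE.
lstm_mse = 0.111
improvement = 0.28
baseline_mse = lstm_mse / (1 - improvement)
print(f"implied baseline MSE ~ {baseline_mse:.3f}")  # ~ 0.154
```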

Interference and ambiguity, and fighting them with distinguishers and constraints. One experiment tested the ability to review while on a treadmill: using the treadmill definitely, but only mildly, affected recall. "analysis of the ~50m response Mnemosyne dataset confirmed that there are meaningful hour-of-day and day-of-week effects"

Description of the algorithms behind the forgetting curve. "Textbook stuff is hard to understand by just grabbing some other person's deck, but basic foreign language vocabulary is much more context-free, so it's easy to do with the premade decks." "Doing the vocabulary stuff feels quite effortless now, more like playing a casual game than doing work. The SRS takes care of boring stuff like figuring out what needs reviewing and keeping track of the materials, and I just need to come up with weird mnemonics for stuff." Some material is "too conceptually complex for SRS to work well with."

"We roll the random module to decide whether or not the card is failed and we schedule this card accordingly (Intervals got multiplied by factor)." This is bad! A scheduler should use ML to predict recall instead.
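The quoted scheduler rolls a random number against the card's recall probability and multiplies the interval by an ease factor on success. A minimal sketch of that idea, paired with an exponential forgetting curve: the `stability` parameter, its growth rule, and the reset-to-one-day lapse handling are illustrative assumptions, not the algorithm of any particular SRS.

```python
import math
import random

def recall_probability(days_since_review: float, stability: float) -> float:
    """Exponential forgetting curve: P(recall) = exp(-t / S)."""
    return math.exp(-days_since_review / stability)

def simulate_card(n_reviews: int = 10, ease: float = 2.5, seed: int = 0) -> list:
    """Simulate one card's review history.

    Each review, roll a random number against the forgetting-curve
    recall probability. On success the interval is multiplied by the
    ease factor (the "Intervals got multiplied by factor" rule from
    the quote); on failure the card is reset to a one-day interval
    (an assumed, but common, lapse convention).
    """
    rng = random.Random(seed)
    interval = 1.0    # days until next review
    stability = 2.0   # hypothetical memory-stability parameter, in days
    history = []
    for _ in range(n_reviews):
        if rng.random() < recall_probability(interval, stability):
            interval *= ease
            stability *= ease   # assume each success strengthens the memory
        else:
            interval = 1.0      # lapse: start over at a short interval
        history.append(interval)
    return history
```

Running many such simulations is exactly the kind of coin-flip scheduling the note objects to: the pass/fail outcome is sampled rather than predicted from the learner's actual review history.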

"One of the benefits of spaced repetition is that it forces you to be more intentional about how you structure information and see connections between ideas." - Steven Jonas

Sleep affects memory, and so if sleep affects Anki fluency...