New preprint on incorporating human-like fleeting memory into transformer language models
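The announcement above does not describe the preprint's actual mechanism. As a purely illustrative sketch of what a "fleeting memory" bias could look like in self-attention, the toy function below (name, `decay` parameter, and linear-penalty form are all my assumptions, not the paper's method) down-weights older tokens with a recency penalty before the softmax:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def fleeting_attention(q, k, v, decay=0.1):
    """Causal attention with a linear recency penalty.

    The bias -decay * (i - j) makes attention to older keys fade with
    distance, a crude stand-in for 'fleeting memory'. This is an
    illustrative sketch, not the preprint's mechanism.
    """
    T, d = q.shape
    scores = q @ k.T / np.sqrt(d)
    i = np.arange(T)[:, None]
    j = np.arange(T)[None, :]
    scores = scores - decay * (i - j)            # older tokens penalized
    scores = np.where(j <= i, scores, -np.inf)   # causal mask
    return softmax(scores, axis=-1) @ v
```

With identical queries and keys, the resulting attention weights decrease monotonically with token age, so distant context fades rather than being cut off at a hard window boundary.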