DataLoader with multiple workers leaks memory

PyTorch Developer Podcast - A podcast by Edward Yang, Team PyTorch


Today I'm going to talk about a famous issue in PyTorch: DataLoader with num_workers > 0 causes a memory leak (https://github.com/pytorch/pytorch/issues/13246). This bug is a good opportunity to talk about Dataset/DataLoader design in PyTorch, fork and copy-on-write memory on Linux, and Python reference counting; you have to know about all of these things to understand why this bug occurs, and once you do, it also explains why the workarounds help.
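The interaction described above boils down to this: forked DataLoader workers share the parent's memory pages copy-on-write, but CPython writes a reference count into every object it merely reads, so touching a big Python list of metadata dirties (and thus copies) its pages in every worker. A commonly cited workaround from the issue thread is to store such metadata in a NumPy array instead of a Python list. Here is a minimal sketch of that idea; the class and variable names are illustrative, not from the issue:

```python
import numpy as np

# A Python list of many small objects: each access in a forked worker
# bumps a refcount, dirtying copy-on-write pages and inflating memory.
paths = [f"img_{i}.jpg" for i in range(1000)]

# Workaround sketch: pack the strings into one contiguous NumPy array.
# Indexing it materializes a fresh Python object on demand, so the
# backing buffer's pages are only read, never written.
packed = np.array(paths)

class FileDataset:
    """Minimal Dataset-like sketch (hypothetical, for illustration)."""
    def __init__(self, packed_paths):
        self.paths = packed_paths  # NumPy array, not a Python list

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        # str() builds a new object; no refcount is written into the
        # shared buffer, so copy-on-write pages stay clean.
        return str(self.paths[idx])

ds = FileDataset(packed)
print(len(ds), ds[0])  # → 1000 img_0.jpg
```

The same trick applies to lists of ints or floats (pack them into a numeric NumPy array), and it is why people report that converting dataset metadata to NumPy or pandas structures makes the apparent "leak" go away.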