Questionnaire

Tip (Experiments): For the questions here that ask you to explain what some function or class is, you should also complete your own code experiments.

  1. What is glob?
  2. How do you open an image with the Python imaging library?
  3. What does L.map do?
  4. What does Self do?
  5. What is L.val2idx?
  6. What methods do you need to implement to create your own Dataset?
  7. Why do we call convert when we open an image from Imagenette?
  8. What does ~ do? How is it useful for splitting training and validation sets?
  9. Does ~ work with the L or Tensor classes? What about NumPy arrays, Python lists, or pandas DataFrames? (A short experiment covering questions 8 and 9 follows this list.)
  10. What is ProcessPoolExecutor?
  11. How does L.range(self.ds) work?
  12. What is __iter__?
  13. What is first?
  14. What is permute? Why is it needed? (See the sketch after this list.)
  15. What is a recursive function? How does it help us define the parameters method?
  16. Write a recursive function that returns the first 20 items of the Fibonacci sequence.
  17. What is super?
  18. Why do subclasses of Module need to override forward instead of defining __call__?
  19. In ConvLayer, why does init depend on act?
  20. Why does Sequential need to call register_modules?
  21. Write a hook that prints the shape of every layer’s activations. (A starting point appears after this list.)
  22. What is “LogSumExp”?
  23. Why is log_softmax useful? (A worked sketch covering questions 22 and 23 follows this list.)
  24. What is GetAttr? How is it helpful for callbacks?
  25. Reimplement one of the callbacks in this chapter without inheriting from Callback or GetAttr.
  26. What does Learner.__call__ do?
  27. What is getattr? (Note the case difference to GetAttr!)
  28. Why is there a try block in fit?
  29. Why do we check for model.training in one_batch?
  30. What is store_attr?
  31. What is the purpose of TrackResults.before_epoch?
  32. What does model.cuda do? How does it work?
  33. Why do we need to check model.training in LRFinder and OneCycle?
  34. Use cosine annealing in OneCycle.
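
For questions 8 and 9, a quick experiment makes ~ concrete. This is a minimal sketch using plain PyTorch tensors (the is_valid rule below is made up for illustration); the same idea applies to an L of booleans when splitting a dataset:

```python
import torch

idxs = torch.arange(10)
is_valid = idxs < 2            # made-up rule: send the first two items to validation
print(~is_valid)               # ~ flips every element of a boolean tensor

# One mask yields both complementary splits
train, valid = idxs[~is_valid], idxs[is_valid]
print(train, valid)
```

Trying the same ~ on a plain Python list of booleans raises a TypeError, which is exactly the kind of result the experiment in question 9 should surface.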
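
For question 14, here is a tiny experiment (the random tensor stands in for a decoded image). PIL and NumPy store images as height × width × channels, while PyTorch’s convolutional layers expect channels × height × width; permute reorders the axes without copying the underlying data:

```python
import torch

img_hwc = torch.rand(64, 64, 3)      # height × width × channels, as PIL/NumPy deliver it
img_chw = img_hwc.permute(2, 0, 1)   # reorder axes to channels × height × width
print(img_chw.shape)                 # torch.Size([3, 64, 64])
```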
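
For question 21, one possible starting point uses PyTorch’s register_forward_hook. The model below is a throwaway example, chosen only so the printed shapes are easy to verify by hand:

```python
import torch
from torch import nn

def print_shape(module, inp, outp):
    # Forward hook: called after each module runs; report its output shape
    print(f"{module.__class__.__name__:>10}: {list(outp.shape)}")

model = nn.Sequential(
    nn.Conv2d(3, 8, 3, stride=2),    # 32×32 input → 8×15×15 output
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(8 * 15 * 15, 10),
)
hooks = [m.register_forward_hook(print_shape) for m in model]
model(torch.randn(1, 3, 32, 32))
for h in hooks: h.remove()           # remove hooks so they don't fire on every future call
```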
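
For questions 22 and 23, here is a sketch of the LogSumExp trick: log Σᵢ exp(xᵢ) = a + log Σᵢ exp(xᵢ − a), where a = max(x), so exp never sees a large positive argument. log_softmax is then just x minus LogSumExp of x, which is why it is more numerically stable than computing log(softmax(x)) directly:

```python
import torch

def logsumexp(x):
    # Subtract the max before exponentiating so exp cannot overflow,
    # then add it back afterward
    a = x.max(-1, keepdim=True)[0]
    return a + (x - a).exp().sum(-1, keepdim=True).log()

def log_softmax(x):
    return x - logsumexp(x)

x = torch.tensor([[0., 100., -100.]])   # exp(100.) overflows float32 if done naively
print(log_softmax(x))                   # finite, no overflow
print(torch.log_softmax(x, dim=-1))     # matches PyTorch's built-in
```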

Further Research

  1. Write resnet18 from scratch (refer back to the chapter on ResNets as needed), and train it with the Learner in this chapter.
  2. Implement a batchnorm layer from scratch and use it in your resnet18.
  3. Write a Mixup callback for use in this chapter.
  4. Add momentum to SGD. (A minimal sketch follows this list.)
  5. Pick a few features that you’re interested in from fastai (or any other library) and implement them with the code developed in this chapter.
  6. Pick a research paper that’s not yet implemented in fastai or PyTorch and implement it with the code developed in this chapter.
    • Port it over to fastai.
    • Submit a pull request to fastai, or create your own extension module and release it.
    • Hint: you may find it helpful to use nbdev to create and deploy your package.
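
For the momentum exercise in item 4, one possible shape for the optimizer is shown below. This is a self-contained sketch rather than the chapter’s exact SGD class, and mom is a hypothetical name for the momentum hyperparameter:

```python
import torch

class SGD:
    def __init__(self, params, lr, mom=0.9):
        self.params, self.lr, self.mom = list(params), lr, mom
        # One moving average per parameter, initialized to zeros
        self.avgs = [torch.zeros_like(p) for p in self.params]

    def step(self):
        for p, avg in zip(self.params, self.avgs):
            avg.mul_(self.mom).add_(p.grad)   # blend previous direction with new gradient
            p.data -= self.lr * avg

    def zero_grad(self):
        for p in self.params:
            if p.grad is not None: p.grad.zero_()
```

A handy sanity check: with mom=0 every update reduces to plain SGD, so results should match the unmodified optimizer.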
