Tutorial - Migrating from Catalyst
Incrementally adding fastai goodness to your Catalyst training
We’re going to use the MNIST training code from Catalyst’s README (as at August 2020), converted to a module.
To use it in fastai, we first convert the Catalyst dict into a DataLoaders object:
Catalyst’s CustomRunner flattens each input batch with view() inside the training loop; in fastai, the same reshape can live in a small callback created with the before_batch_cb decorator.
The Catalyst example also modifies the training loop to add metrics, but in fastai you can pass them directly to your Learner:
You can now fit your model. fastai supports many schedulers. We recommend using 1cycle:
learn.fit_one_cycle(1, 0.02)
As you can see, migrating from Catalyst allowed us to replace 17 lines of code (in CustomRunner) with just 3 lines, and doesn’t require you to change any of your existing data pipelines, optimizers, loss functions, models, etc. Once you’ve made this change, you can then benefit from fastai’s rich set of callbacks, transforms, visualizations, and so forth.
Changing the model
Instead of using callbacks, in this case you can also simply change the model. Here we pull the view() out of the training loop and into the model, using fastai’s Flatten layer:
We can now create a Learner and train without using any callbacks:
learn = Learner(data, model, loss_func=F.cross_entropy, opt_func=Adam, metrics=metrics)
| epoch | train_loss | valid_loss | accuracy | top_k_accuracy | time |
|---|---|---|---|---|---|
| 0 | 0.230051 | 0.292577 | 0.924800 | 0.995100 | 00:11 |