Tutorial - Migrating from Catalyst

    Incrementally adding fastai goodness to your Catalyst training

    We’re going to use the MNIST training code from Catalyst’s README (as at August 2020), converted to a module.

    To use it in fastai, we first convert the Catalyst dict into a DataLoaders object:

    @before_batch_cb

    The Catalyst example also modifies the training loop to add metrics, but you can pass these directly to your Learner in fastai:

    You can now fit your model. fastai supports many schedulers. We recommend using 1cycle:

    learn.fit_one_cycle(1, 0.02)

    As you can see, migrating from Catalyst allowed us to replace 17 lines of code (in CustomRunner) with just 3 lines, and doesn’t require you to change any of your existing data pipelines, optimizers, loss functions, models, etc. Once you’ve made this change, you can then benefit from fastai’s rich set of callbacks, transforms, visualizations, and so forth.

    Changing the model

    Instead of using callbacks, in this case you can also simply change the model. Here we pull the view() out of the training loop, and into the model, using fastai’s Flatten layer:

    We can now create a Learner and train without using any callbacks:

    learn = Learner(data, model, loss_func=F.cross_entropy, opt_func=Adam, metrics=metrics)
    | epoch | train_loss | valid_loss | accuracy | top_k_accuracy | time  |
    |-------|------------|------------|----------|----------------|-------|
    | 0     | 0.230051   | 0.292577   | 0.924800 | 0.995100       | 00:11 |