Last week's learnings from the Benelearn conference: how to compress your neural networks to fit into less memory.
Compressing your NN to 1/64 of its size without hurting your results too much just sounds like dark magic! Great talk by +Kilian Weinberger.
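For context, one way to get that kind of compression is the hashing trick from HashedNets (Chen et al., ICML 2015, co-authored by Weinberger): a large virtual weight matrix is backed by a much smaller pool of real parameters, with a hash function deciding which real parameter each virtual weight shares. A minimal sketch of the idea (the function name and layout are my own, not from the paper or the talk):

```python
import numpy as np

def hashed_weight_matrix(shared_params, rows, cols, seed=0):
    """Expand a small shared parameter vector into a virtual
    rows x cols weight matrix via a hash function (HashedNets idea).

    Each virtual position (i, j) is hashed to an index into the
    shared parameter vector, so many weights alias the same value.
    """
    idx = np.empty((rows, cols), dtype=np.int64)
    for i in range(rows):
        for j in range(cols):
            # Hash the (seed, i, j) triple into the shared pool.
            idx[i, j] = hash((seed, i, j)) % shared_params.size
    return shared_params[idx]

# 64x compression: a 64x64 virtual matrix backed by only 64 real parameters.
shared = np.random.randn(64)
W = hashed_weight_matrix(shared, 64, 64)
```

During training, gradients for all virtual weights that hash to the same slot are accumulated into that single shared parameter, which is how the network stays trainable at 1/64 of the memory.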