
ML

Generative Adversarial Network
·753 words·4 mins
A neural network is like a highly sophisticated, multi-layered calculator that learns from data. It consists of numerous “neurons” (tiny calculators) connected in layers, with each layer transforming its inputs to help the network make predictions or decisions.
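As a rough illustration of that layered structure (not code from the post), here is a minimal NumPy sketch of a two-layer network; the layer sizes, random weights, and ReLU activation are arbitrary choices for demonstration:

```python
import numpy as np

def relu(x):
    return np.maximum(0.0, x)

rng = np.random.default_rng(0)
# Two layers of "tiny calculators": weights, biases, and a nonlinearity.
W1, b1 = rng.normal(size=(4, 3)), np.zeros(3)   # 4 inputs -> 3 hidden units
W2, b2 = rng.normal(size=(3, 2)), np.zeros(2)   # 3 hidden units -> 2 outputs

x = rng.normal(size=4)      # one input example
h = relu(x @ W1 + b1)       # first layer transforms the input
y = h @ W2 + b2             # second layer produces the prediction
print(y)
```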
Variational-Auto-Encoder
·729 words·4 mins
The beauty of VAEs lies in their ability to generate new samples by randomly sampling vectors from a known region of the latent space and then passing them through the generator (decoder) part of the model.
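A minimal sketch of that generation step, assuming a trained decoder; here a random linear map stands in for the real decoder network, so the outputs are only placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
latent_dim, data_dim = 8, 32

# Stand-in for a trained VAE decoder (in practice a neural network trained
# jointly with the encoder); a random linear map is used here only so the
# snippet runs end to end.
W_dec = rng.normal(size=(latent_dim, data_dim))

def decoder(z):
    return np.tanh(z @ W_dec)

# Generation: sample latent vectors from the known prior (a standard normal)
# and push them through the decoder to get new samples.
z = rng.standard_normal((5, latent_dim))
samples = decoder(z)
print(samples.shape)   # (5, 32)
```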
Auto-Encoder
·545 words·3 mins
An autoencoder begins its journey by compressing input data into a lower-dimensional representation. It then endeavors to reconstruct the original input from this compressed representation.
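In code, the compress-then-reconstruct loop might look like the sketch below; the linear weights are untrained stand-ins, since a real autoencoder would learn them by minimizing the reconstruction error:

```python
import numpy as np

rng = np.random.default_rng(0)
input_dim, code_dim = 16, 4

# Untrained stand-in weights; a real autoencoder learns these by minimizing
# the reconstruction error computed below.
W_enc = rng.normal(size=(input_dim, code_dim))
W_dec = rng.normal(size=(code_dim, input_dim))

x = rng.normal(size=input_dim)
code = np.tanh(x @ W_enc)          # compress to a lower-dimensional code
x_hat = code @ W_dec               # attempt to reconstruct the original input
loss = np.mean((x - x_hat) ** 2)   # reconstruction error used during training
print(code.shape, round(loss, 3))
```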
Less is More Paper Review
·467 words·3 mins
Less is More: Parameter-Free Text Classification with Gzip offers a novel text classification method using gzip compression, eliminating manual parameter tuning.
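The core idea, compression-based distance plus a nearest-neighbour vote, can be sketched roughly as follows; this illustrates the general technique rather than the paper's exact implementation, and the toy training set is made up:

```python
import gzip

def clen(s: str) -> int:
    """Length of the gzip-compressed UTF-8 encoding of s."""
    return len(gzip.compress(s.encode("utf-8")))

def ncd(a: str, b: str) -> float:
    """Normalized compression distance between two strings."""
    ca, cb, cab = clen(a), clen(b), clen(a + " " + b)
    return (cab - min(ca, cb)) / max(ca, cb)

# Toy training data, made up purely for illustration.
train = [
    ("the cat sat on the mat", "animals"),
    ("dogs love long walks in the park", "animals"),
    ("stocks fell sharply in early trading", "finance"),
    ("the market rallied after strong earnings", "finance"),
]

def classify(text: str, k: int = 1) -> str:
    # k-nearest-neighbour vote using compression distance; no learned parameters.
    neighbours = sorted(train, key=lambda item: ncd(text, item[0]))[:k]
    labels = [label for _, label in neighbours]
    return max(set(labels), key=labels.count)

print(classify("the kitten sat near the mat"))   # likely "animals" on this toy data
```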
Infini-Attention Paper Review
·438 words·3 mins
Infini-Attention introduces a novel approach to scaling Transformer models for infinitely long inputs while managing memory and computation.
Softmax
·1713 words·9 mins
Softmax stands as a pivotal component in neural network architectures, offering a means to convert raw scores into interpretable probabilities.
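As a quick reference, a numerically stable softmax can be sketched as follows (the example logits are arbitrary):

```python
import numpy as np

def softmax(scores: np.ndarray) -> np.ndarray:
    """Convert raw scores (logits) into probabilities that sum to 1."""
    shifted = scores - np.max(scores)   # subtract the max for numerical stability
    exp = np.exp(shifted)
    return exp / exp.sum()

logits = np.array([2.0, 1.0, 0.1])
probs = softmax(logits)
print(probs, probs.sum())   # roughly [0.659, 0.242, 0.099], summing to 1.0
```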