Understanding CLIP (Contrastive Language-Image Pre-Training) — Visual Models from NLP

CLIP introduces a model that enables zero-shot learning on a new dataset (not just a new example) by using natural language supervision during pre-training. That is, to identify an object, you can provide the name or description of a class the model has never seen before. Traditionally, a computer vision model was trained … Read more
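As a rough illustration of how zero-shot classification works at inference time, the sketch below compares an image embedding against the embeddings of candidate text prompts and picks the closest one. The toy 3-d vectors and labels are hypothetical stand-ins; real CLIP encoders produce high-dimensional embeddings (e.g. 512-d) from actual images and prompts.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)

def zero_shot_classify(image_emb, text_embs, labels):
    """Return the label whose text embedding is most similar to the image embedding."""
    sims = [cosine(image_emb, t) for t in text_embs]
    return labels[max(range(len(sims)), key=sims.__getitem__)]

# Hypothetical toy embeddings for illustration only.
image_emb = [0.9, 0.1, 0.2]
text_embs = [[1.0, 0.0, 0.1],   # e.g. "a photo of a dog"
             [0.0, 1.0, 0.0]]   # e.g. "a photo of a cat"
print(zero_shot_classify(image_emb, text_embs, ["dog", "cat"]))  # → dog
```

Because the classes are defined purely by text, swapping in a new label list requires no retraining — that is the zero-shot property the post describes.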

Batch Normalisation

Before we describe what Batch Normalisation is, here are a few introductory terms. Internal covariate shift: Stochastic Gradient Descent uses a minibatch of inputs to train the parameters of a layer. The input to a layer is the output of the previous layer, so a change in the parameters of the previous layer causes a change … Read more

Distributed Machine Learning – Part 2: Architecture

Why Distributed Machine Learning? In the previous article we looked at how GPGPUs, ASICs, AWS's Inferentia, the new NVIDIA A100 chip and other hardware advances have dramatically improved the performance of machine learning training and inference. However, the growing volume of data and the increasing complexity of machine learning models require … Read more
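One common architectural pattern the series covers is data parallelism: each worker computes gradients on its own shard of the minibatch, and a parameter server (or all-reduce step) averages them before updating the model. The sketch below shows only the averaging step, with hypothetical gradients.

```python
def average_gradients(worker_grads):
    """Parameter-server step: average the gradients computed by each worker
    on its own shard of the minibatch (data parallelism)."""
    n_workers = len(worker_grads)
    n_params = len(worker_grads[0])
    return [sum(g[i] for g in worker_grads) / n_workers for i in range(n_params)]

# Two hypothetical workers, each holding a gradient for a 3-parameter model.
grads = [[0.2, -0.4, 0.6],
         [0.4,  0.0, 0.2]]
avg = average_gradients(grads)
```

Averaging makes the update equivalent to training on the full minibatch on one machine, which is why data parallelism scales without changing the model.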

CloudFormation for SageMaker instance

Amazon SageMaker helps data scientists and machine learning developers build, train and deploy machine learning models. It includes Jupyter notebooks for building and training models, as well as the SageMaker API for training and deploying models with a few lines of code. AWS CloudFormation provisions AWS resources using code, automating the provisioning and configuration of resources … Read more
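As a flavour of what such a template looks like, here is a minimal CloudFormation sketch that provisions a SageMaker notebook instance together with the IAM role it needs. The resource names and the instance type are illustrative choices, not prescriptions from the post.

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  NotebookInstance:
    Type: AWS::SageMaker::NotebookInstance
    Properties:
      InstanceType: ml.t2.medium        # small instance type for experimentation
      RoleArn: !GetAtt ExecutionRole.Arn
  ExecutionRole:
    Type: AWS::IAM::Role
    Properties:
      AssumeRolePolicyDocument:
        Version: '2012-10-17'
        Statement:
          - Effect: Allow
            Principal:
              Service: sagemaker.amazonaws.com
            Action: sts:AssumeRole
      ManagedPolicyArns:
        - arn:aws:iam::aws:policy/AmazonSageMakerFullAccess
```

Deploying this template creates both resources in one step and deleting the stack tears them down again, which is the reproducibility benefit the post describes.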

Statistics – Examples

T-Test: Problem: A swimming instructor wants to show that an athlete's swimming speed increases if the athlete performs some specific exercises before the swim. He undertakes an experiment with 16 participants and randomly assigns 8 participants to each team. For team A he recommends some common exercises, and for team B he recommends … Read more
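The comparison described above is an independent two-sample t-test. The sketch below computes the pooled t statistic by hand; the swim times are hypothetical numbers invented for illustration, not data from the post.

```python
import math

def two_sample_t(a, b):
    """Pooled two-sample t statistic for two independent groups."""
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)   # sample variance of group a
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)   # sample variance of group b
    sp2 = ((na - 1) * va + (nb - 1) * vb) / (na + nb - 2)  # pooled variance
    return (mb - ma) / math.sqrt(sp2 * (1 / na + 1 / nb))

# Hypothetical swim times in seconds (lower = faster) for the two teams of 8.
team_a = [61.2, 60.8, 62.0, 61.5, 60.9, 61.8, 61.1, 61.6]
team_b = [60.1, 59.8, 60.5, 60.0, 59.6, 60.3, 59.9, 60.2]
t = two_sample_t(team_a, team_b)
# compare |t| with the critical value of Student's t with na + nb - 2 = 14
# degrees of freedom to decide whether the difference is significant
```

With these illustrative numbers the statistic is strongly negative (team B is faster), so |t| exceeds the 5% critical value of about 2.145 for 14 degrees of freedom.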