Distillation of CLIP model and other experiments

CLIP is a model released by OpenAI earlier this year. It was trained to learn “visual concepts from natural language supervision” on more than 400 million image-text pairs using an impressive amount of compute (256 GPUs for 2 weeks).
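To make the image-text matching concrete, here is a minimal sketch of zero-shot scoring with OpenAI's open-source clip package. The ViT-B/32 checkpoint, the image path, and the candidate captions are illustrative assumptions, not details from the original post.

# Minimal sketch: score an image against a few candidate captions with CLIP.
# Assumes the open-source `clip` package (github.com/openai/CLIP) is installed;
# the checkpoint, image path, and captions below are illustrative only.
import torch
import clip
from PIL import Image

device = "cuda" if torch.cuda.is_available() else "cpu"
model, preprocess = clip.load("ViT-B/32", device=device)

image = preprocess(Image.open("example.jpg")).unsqueeze(0).to(device)
texts = clip.tokenize(["a photo of a dog", "a photo of a cat", "a diagram"]).to(device)

with torch.no_grad():
    # Encode both modalities into the shared embedding space.
    image_features = model.encode_image(image)
    text_features = model.encode_text(texts)

    # Cosine-similarity logits between the image and each candidate caption.
    logits_per_image, _ = model(image, texts)
    probs = logits_per_image.softmax(dim=-1).cpu().numpy()

print(probs)  # the caption with the highest probability best matches the image

Because both encoders project into the same embedding space, the same pretrained model can rank arbitrary captions against arbitrary images without task-specific fine-tuning, which is what makes it an attractive (if large) teacher for distillation.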

Source: tech.pic-collage.com/distillation-of-clip-model-and-other-experiments-f8394b7321ce

August 18, 2021