Hacker News

LarsDu88 | today at 4:39 PM

There have been a few very interesting JEPA publications from LeCun recently, particularly the LeJEPA paper, which claims to simplify a lot of the training headaches for that class of models.

JEPAs also strike me as a bit more akin to human intelligence: most children, for example, are very capable of locomotion and of making basic drawings, but incapable of pixel-level reconstructions of their mental images (!!).
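To make that concrete, here's a toy sketch of the JEPA idea of predicting in embedding space rather than pixel space. Everything here is hypothetical and drastically simplified (real JEPAs use vision transformers and an EMA target encoder; the linear maps and patch layout below are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "image": 16 patches, each an 8-dim pixel vector.
patches = rng.normal(size=(16, 8))

# Hypothetical linear stand-ins for the real networks (a sketch only).
W_enc = rng.normal(size=(8, 4)) * 0.1   # shared patch encoder -> 4-dim embedding
W_pred = rng.normal(size=(4, 4)) * 0.1  # predictor network

context_idx = np.arange(12)    # visible patches
target_idx = np.arange(12, 16) # masked patches whose *embeddings* we predict

# Encode the visible context, pool, and predict the target embedding.
context_emb = (patches[context_idx] @ W_enc).mean(axis=0)
pred = context_emb @ W_pred

# Targets come from the (stop-gradient) target encoder, also in embedding space.
target_emb = (patches[target_idx] @ W_enc).mean(axis=0)

# JEPA-style loss: distance in latent space, so the model never has to
# reconstruct pixel-level detail -- much like the child who can draw a
# house without being able to render it photographically.
loss = float(np.mean((pred - target_emb) ** 2))
print(loss >= 0.0)
```

The key design choice is that the loss lives in the 4-dim embedding space, not the 8-dim pixel space, so the model is free to discard low-level detail.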

One thing I want to point out is that LeCun-style label-free training techniques, such as JEAs like DINO and JEPAs, have been converging on the performance of models that require large amounts of labeled data.

Alexandr Wang is a billionaire who made his fortune through a data-labeling company, and he basically pushed LeCun out.

Overall this will be good for AI and good for open source.