Discussion about this post

Michael J. Goldrich:

AI becomes intelligent by forgetting more.

The compression principle explains why smaller, focused implementations often outperform massive general models.

If you're thinking about what this means for your team's AI strategy: https://vivander.substack.com/p/something-shifted-when-i-read-openais

Michael J. Goldrich:

The idea that intelligence is compression, not memorization, completely reframes how we should think about AI success.

Most orgs are still measuring AI like they measure databases: how much can it hold?

The real question is how well it forgets what doesn't matter.
