5 Comments
May 8, 2023Liked by Abhinav Upadhyay

Context can be compressed, much like how humans can only hold about 7 random items in working memory at any given time.

I think GPT-100 will have fewer parameters than GPT-5

Moore's law here applies to parameters, not transistors.

author

Yes. I agree that models will get smaller. But I believe these models lack the ability to creatively solve an unseen, unknown problem when it comes to programming. Sure, if you ask them to solve a problem they have seen thousands of times, in different forms, in GitHub data, they will be able to reproduce it. But when faced with a new problem, we can extrapolate all our knowledge and experience to create novel solutions; I doubt that's within reach here, because if it gets to that point, we might be very close to AGI.

At this point everyone is speculating about what the future holds. I'm optimistic that things will play out well for us.

May 8, 2023Liked by Abhinav Upadhyay

Such a brilliant compilation of ideas. AI is not expected to kill the programming industry but rather transform it.

author

Yes, that's how technological innovations have transformed this industry in the past.


Well written. I agree with this completely. I have a theory that AI will start to build packages for us to use, almost like an "AI Package Manager". It will generate its own documentation and make it easy for us to use. The token limit seems to be the biggest complication. Unless we're all going to start running $30,000 PCs to handle hundreds of thousands of tokens, I don't see them taking over the occupation anytime soon.
