AI musings

by Tim

There seems to be a spark there, where we are distilling petabytes of information down to mere gigabytes. An average of humanity made so small. I have a lot of questions: how can we use these models to work backwards, to cite their sources, to reflect on themselves? Can we chain input width to increase token length without the model going mad?

So many questions, so many projects, so little time and resources.