Small Models, Big Dreams: A Novice’s Misadventures in Tree of Thoughts Implementation

2024-09-19 11:00:06

Greetings, fellow AI enthusiasts and sympathetic onlookers! Pull up a chair and let me regale you with the tale of an aspiring researcher’s quest to implement Tree of Thoughts (ToT) reasoning on models so small, they could fit in the back pocket of your jeans. Spoiler alert: it involves more plot twists than a soap opera written by a malfunctioning GPT-2 model.

Picture this: armed with nothing but a trusty laptop that’s seen better days and a GPU that could barely run Minecraft, I set out to revolutionize AI reasoning. My grand plan? Implement Tree of Thoughts on Llama 3.1 8B. Because who needs 175 billion parameters when you can have… 8 billion?
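For anyone who hasn’t met ToT before, the core idea is a search over partial "thoughts": at each step you ask the model to propose several continuations, score them, and keep only the most promising branches. Here’s a minimal sketch of that loop; note that the `propose` and `evaluate` stubs, the toy spelling task, and all the names here are my own illustration — in a real setup both would be prompt calls to a model like Llama 3.1 8B.

```python
# A minimal, model-agnostic sketch of Tree of Thoughts (ToT) search.
# The LLM calls are stubbed out with toy functions; in a real setup,
# propose() and evaluate() would each be prompts sent to the model.

from typing import Callable, Optional

def tree_of_thoughts(
    root: str,
    propose: Callable[[str], list],    # generate candidate next thoughts
    evaluate: Callable[[str], float],  # score a partial solution
    is_solution: Callable[[str], bool],
    beam_width: int = 3,
    max_depth: int = 5,
) -> Optional[str]:
    """Breadth-first ToT: expand every frontier state, keep the top-k by score."""
    frontier = [root]
    for _ in range(max_depth):
        candidates = [c for state in frontier for c in propose(state)]
        for c in candidates:
            if is_solution(c):
                return c
        # Prune to the highest-scoring partial solutions (the "beam").
        frontier = sorted(candidates, key=evaluate, reverse=True)[:beam_width]
        if not frontier:
            return None
    return None

# Toy task: spell the word "tot" one letter at a time.
TARGET = "tot"
propose = lambda s: [s + ch for ch in "abot"]  # stands in for LLM proposals
evaluate = lambda s: sum(a == b for a, b in zip(s, TARGET)) - abs(len(s) - len(TARGET))

result = tree_of_thoughts("", propose, evaluate, lambda s: s == TARGET)
print(result)  # → tot
```

The expensive part in practice is that every `propose` and `evaluate` call is a full model invocation, which is exactly why I thought a pocket-sized model would make the whole thing tractable on my poor laptop.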

Undeterred by my laptop’s judgemental whirring, I began my search for the perfect small model. TinyLlama caught my eye first. “Tiny” was in the name, so it had to work, right?
