PacCam: Pacman controlled with your face | eieio.games

You chomp your mouth to move and turn your face to steer. You look…pretty dumb while playing it. At the end it tries to give you the dumbest possible gif of yourself (it tracks when your mouth is the most open, and builds the gif around that moment).

I use a couple of additional tools like Framer Motion, Radix, GIF.js, and styled-components. But nothing crazy. The game is largely Just Code.
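As an aside, the "dumbest possible gif" trick maps pretty directly onto GIF.js. Here's a rough sketch of one way it could work (my guess at the shape of it, not PacCam's actual code): capture small webcam snapshots along with the current mouth-openness score, then render a gif from a window of frames centered on the peak.

```ts
import GIF from "gif.js"; // assumes gif.worker.js is served alongside the bundle

type CapturedFrame = { canvas: HTMLCanvasElement; jawOpen: number };
const frames: CapturedFrame[] = [];

// Capture a small snapshot of the webcam plus the current mouth-openness score.
// Capturing at ~10fps and a small size keeps memory manageable.
function recordFrame(video: HTMLVideoElement, jawOpen: number) {
  const canvas = document.createElement("canvas");
  canvas.width = 320;
  canvas.height = 240;
  canvas.getContext("2d")!.drawImage(video, 0, 0, 320, 240);
  frames.push({ canvas, jawOpen });
}

// Build the gif around the frame where the mouth was most open.
function buildGifAroundPeak(onDone: (blob: Blob) => void) {
  if (frames.length === 0) return;
  const peakIndex = frames.reduce(
    (best, f, i) => (f.jawOpen > frames[best].jawOpen ? i : best),
    0
  );
  const clip = frames.slice(Math.max(0, peakIndex - 10), peakIndex + 10);

  const gif = new GIF({ workers: 2, quality: 10, width: 320, height: 240 });
  for (const f of clip) gif.addFrame(f.canvas, { delay: 100 });
  gif.on("finished", onDone); // onDone receives the finished gif as a Blob
  gif.render();
}
```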

So the high-level answer to “how does this work” is mostly: MediaPipe is an easy way to do face tracking in the browser and otherwise this is just a website on the internet. We live in the future!

That said, there were some challenges gluing all this tech together. The biggest early challenge was getting React to play nice with MediaPipe.

MediaPipe watches your webcam and hands you a stream of information about your face. “Information” means the location of “landmarks” (e.g. the tip of your nose) as well as the value of “blend shapes” (e.g. how open your mouth is). But the point is, MediaPipe sprays lots of data at you (think hundreds of floats, tens of times a second).
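For concreteness, here's roughly what pulling that data out of MediaPipe's FaceLandmarker looks like in the browser (the model URL, landmark index, and loop wiring are illustrative, not lifted from PacCam):

```ts
import { FaceLandmarker, FilesetResolver } from "@mediapipe/tasks-vision";

async function startTracking(video: HTMLVideoElement) {
  const vision = await FilesetResolver.forVisionTasks(
    "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision/wasm"
  );
  const landmarker = await FaceLandmarker.createFromOptions(vision, {
    baseOptions: {
      modelAssetPath:
        "https://storage.googleapis.com/mediapipe-models/face_landmarker/face_landmarker/float16/1/face_landmarker.task",
    },
    runningMode: "VIDEO",
    outputFaceBlendshapes: true, // get named scores like "jawOpen", not just raw points
    numFaces: 1,
  });

  const loop = () => {
    const results = landmarker.detectForVideo(video, performance.now());
    // ~478 landmarks per face, each a normalized { x, y, z } point;
    // index 1 is roughly the tip of the nose.
    const noseTip = results.faceLandmarks[0]?.[1];
    // Blend shapes are named scores in [0, 1], e.g. how open the jaw is.
    const jawOpen = results.faceBlendshapes[0]?.categories.find(
      (c) => c.categoryName === "jawOpen"
    )?.score;
    // ...feed noseTip / jawOpen into the game...
    requestAnimationFrame(loop);
  };
  requestAnimationFrame(loop);
}
```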

So maybe you can see the potential problem here. If you naively combine React and MediaPipe and stuff all of that MediaPipe state into React, every new frame of face data triggers a round of React updates. This gets slow.
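One common way out of this (a sketch of the general pattern, not necessarily what PacCam does) is to keep the raw per-frame data in a ref, which React ignores, and only promote coarse, slowly-changing values into actual state:

```tsx
import { useEffect, useRef, useState } from "react";

// Assumed helper, not from the post: wraps the FaceLandmarker loop above,
// calls `onBlendshapes` with each frame's categories, and returns a cleanup fn.
declare function startFaceTracking(
  onBlendshapes: (categories: { categoryName: string; score: number }[]) => void
): () => void;

function useMouthOpen(threshold = 0.4) {
  // Raw per-frame scores go in a ref: writing to it never triggers a render.
  const latestJawOpen = useRef(0);
  // Only the coarse open/closed boolean lives in React state.
  const [mouthOpen, setMouthOpen] = useState(false);

  useEffect(() => {
    const stop = startFaceTracking((categories) => {
      const jawOpen =
        categories.find((c) => c.categoryName === "jawOpen")?.score ?? 0;
      latestJawOpen.current = jawOpen;
      // React bails out when the new state equals the old one, so this only
      // causes a render when the boolean actually flips.
      setMouthOpen(jawOpen > threshold);
    });
    return stop;
  }, [threshold]);

  return { mouthOpen, latestJawOpen };
}
```

Anything that needs the full-rate data (like the game loop itself) reads `latestJawOpen.current` directly instead of going through React renders at all.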
