Ever pulled your chair up to a colleague’s desk to pair on some code or debug a problem? Multi's meant to be the virtual equivalent of just that.
We’ve built many collaborative features that mimic things you can do at the same desk, like pointing, drawing, or scribbling notes—and most of the time that’s all you need. But sometimes, you just need to grab the mouse and keyboard. That’s why we built remote control. Here’s how.
In a nutshell, remote control should feel like you’re working directly on the computer you’re controlling. Our early version covers:
We need a way to pass remote control input commands to macOS programmatically. Because we want remote control to feel as native as possible, we want to hook into the system as close to the OS as we can. That means creating and sending [CGEvents]. CGEvents are part of the Core Graphics framework and let us create both keyboard and mouse events. Here’s what creating a CGEvent that represents a mouse-down event looks like:
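A minimal sketch of that step, using the `CGEvent` mouse-event initializer from Core Graphics (the click coordinates here are placeholders — in practice they’d come from the remote participant’s input):

```swift
import CoreGraphics

// Hypothetical screen location for the click; in a real session this
// would be decoded from the controlling peer's input stream.
let point = CGPoint(x: 100, y: 200)

// Create a left-mouse-down event at that point. The initializer is
// failable, so this returns an optional CGEvent.
let mouseDown = CGEvent(
    mouseEventSource: nil,
    mouseType: .leftMouseDown,
    mouseCursorPosition: point,
    mouseButton: .left
)
```

A matching `.leftMouseUp` event, created the same way, completes the click.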
Then, we can post that event to the system (note that your app will need to have Accessibility permission granted by the user for this to work):
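Continuing the sketch above, posting the event is one call — `post(tap:)` — aimed at the HID event tap, the injection point closest to hardware input:

```swift
import CoreGraphics

// Recreate the mouse-down event from the previous step
// (coordinates are placeholders).
let mouseDown = CGEvent(
    mouseEventSource: nil,
    mouseType: .leftMouseDown,
    mouseCursorPosition: CGPoint(x: 100, y: 200),
    mouseButton: .left
)

// Post the event at the HID level. Without Accessibility permission
// (System Settings → Privacy & Security → Accessibility), this
// silently does nothing.
mouseDown?.post(tap: .cghidEventTap)
```

You can check for that permission up front with `AXIsProcessTrusted()` and prompt the user to grant it before attempting to post events.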