The newly released desktop versions of our microM8 Apple IIe emulator now include a built-in MCP (Model Context Protocol) server, which allows external AI applications to control the emulator in various ways. A minimal sketch of what a client connection might look like appears below, but we will explain the ‘how’ in more detail after we explain the ‘why’:
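As a quick taste of what this looks like from the outside, here is a minimal sketch of a client connecting to a locally running MCP server using the official TypeScript SDK. The endpoint URL, port, transport, and the commented-out tool name are assumptions for illustration only; microM8’s actual connection details and tool set are covered later.

```typescript
// Minimal sketch: connecting an MCP client to a locally running MCP server.
// NOTE: the URL, port, and SSE transport here are assumptions for illustration;
// consult the microM8 documentation for the real endpoint and tool names.
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

async function main() {
  // Hypothetical endpoint; microM8's real address and port may differ.
  const transport = new SSEClientTransport(new URL("http://localhost:1977/sse"));

  const client = new Client({ name: "example-client", version: "1.0.0" });
  await client.connect(transport);

  // Ask the server what it can do: every MCP server can list its tools.
  const { tools } = await client.listTools();
  console.log("Tools exposed by the emulator:", tools.map((t) => t.name));

  // A tool call would look like this (tool name and arguments are hypothetical):
  // await client.callTool({ name: "type_text", arguments: { text: "CATALOG\r" } });

  await client.close();
}

main().catch(console.error);
```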
The use of Large Language Models (LLMs) has always been contentious, especially their disruptive effect on the creative industries and on education, where they’ve upended the traditional model of examinations and research papers and forced a shift towards more practical evaluations.
These disruptions have prompted a mixture of worry and excitement that LLMs would be coming for software engineers at some point. So far, though, they have struggled to become truly useful at programming, especially novel programming.
This has much to do with their limited ability to actually reason. Even a year ago they were really bad at thinking; over the last year that has changed somewhat, and they are now a little less terrible. The improvement comes down to three main factors: the scale and resolution of the model itself; the algorithm behind producing a result, which now iterates more effectively in an attempt to refine its output and improve accuracy; and a larger ‘context’ that allows for the insertion of large amounts of reference material, consideration of past history, and the ability to search the Internet for information.