Prominent Apple blogger John Gruber kicked off a firestorm on Twitter when he recently suggested that web developers “should not be trying to create a ‘native like app’ in a web browser.” Gruber said this in response to mounting criticism of Apple for not allowing competing browser engines on its iOS platform. By forcing browser vendors like Google, Microsoft, and Mozilla to use the WebKit browser engine, Apple is deliberately restricting the functionality of web apps on iOS. It’s why many features of progressive web apps (PWAs) do not work on an iPhone or iPad.
Despite Gruber’s clear bias toward Apple’s proprietary iOS platform, he does raise an interesting philosophical question: how far should web apps go in emulating the advanced functionality of native apps? To answer that question, we first need to understand how far web apps have already come.
For most of its history, the web has been a platform for applications as well as web pages. Although it started out in 1991 as a document-centric platform, as early as 1993, with the arrival of CGI scripts, the web began evolving into an application platform. Both Netscape and Microsoft turned their web browsers into web app platforms in the mid-to-late ’90s, and by the early 2000s we had Ajax applications, a key driver of the so-called Web 2.0 era.
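To make that last step concrete, here is a minimal sketch of the Ajax pattern that powered those early-2000s apps: the page requests data in the background with XMLHttpRequest and updates itself in place, with no full reload. The /api/messages endpoint and the #messages element are hypothetical stand-ins, not part of any particular app.

```ts
// Classic Ajax: fetch data asynchronously and patch the page in place.
const xhr = new XMLHttpRequest();
xhr.open("GET", "/api/messages"); // hypothetical endpoint
xhr.onload = () => {
  if (xhr.status === 200) {
    // Update one region of the page; the browser never navigates away.
    const el = document.querySelector("#messages"); // hypothetical element
    if (el) {
      el.textContent = xhr.responseText;
    }
  }
};
xhr.send();
```

That ability to refresh part of a page without reloading it is what separated Web 2.0 applications like Gmail and Google Maps from the document-style sites that preceded them.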