I’ve read two bad takes recently on how “faux native” technologies are considered harmful.
The first is alarmingly titled Electron is flash for the desktop:
Also all you web devs: Go learn C or Rust or something. Your program runs on a computer. Until you know how that computer works, you’re doomed. And until then get off my lawn shakes fist.
The second, a little less insufferable, is about an iOS developer’s experience with React Native:
Chances are that your finished RN app is going to feel like a not-quite-native iOS app and a not-quite-native Android app. Some people will have a bigger problem with this than others and most users of the general population won’t notice. But you do risk losing platform specific niceties that users will notice.
Both of these articles make some fair points, even if they’re not cogent overall.
They’re worth reading, and they raise objectively correct points: Electron apps tend to be less machine-efficient, and React Native has some issues that can hamper the overall user experience.
But they gloss over what feels obvious, time and time again:
Developer experience is the strongest bellwether for success. The ability to iterate quickly is incredibly difficult to outweigh.
I’m not saying this is fair, or the best for users in the short term, or cosmically correct or whatever.
But there’s a difference between “What approach produces the best software for users?” and “What is the best approach for producing software for users?”
Rather than grumbling from a developer who has dipped their toes into the waters of “faux native” development, I think the essay written by the folks at Artsy does a great job explaining the myriad reasons why a JS-based approach beats out a native one:
I’m looking at my Applications folder right now, and most of the apps I use the most – Spotify, Slack, Nylas, VS Code – aren’t native apps.
There’s a reason for that, and it’s not that those developers are idiots who don’t care about their users.