People who previously would never have been able to build software, let alone a desktop app, now can - which is arguably both good and bad.
As more and more software is built using web technologies - bringing with it the ridiculous state of modern web development, with its crazy tooling complexity, enormous dependency chains and bloated bundle sizes - the impact is felt everywhere, including on platforms such as desktops and smartphones.
This ain't your daddy's desktop application
Though the use of the term is questionable in this context, the idea of using web technologies to build "native" applications is nothing new - at least as far as the GUI is concerned. Note that these are not native applications in the true sense of the word: a commonly accepted definition is applications programmed against the platform's own APIs (e.g. macOS or Windows), making them first-class citizens.
The Qt toolkit ships with an embedded WebKit engine in the form of QtWebKit, which enables developers leveraging Qt to display web content in a web view. Google Earth, for instance, is made using QtWebKit.
There is also the Chromium Embedded Framework (CEF), which allows applications to embed the Chromium runtime directly. Spotify, among others, is built upon CEF.
Then, during their work on Atom (a text editor written with web technologies, of all things), GitHub produced and released Electron - which brings this entire mess to the desktop. Essentially, Electron consists of a Node.js runtime and an embedded Chromium instance, along with platform integration features such as filesystem access.
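To make concrete just how little separates an Electron app from a web page, here is a minimal sketch of an Electron main process - roughly the canonical "hello world" shape, assuming Electron is installed (`npm install electron`) and an `index.html` sits next to the script. The window it opens is simply a Chromium instance rendering that HTML:

```javascript
// main.js - minimal Electron main process.
// The "app" is a Node.js process; each window is an embedded Chromium instance.
const { app, BrowserWindow } = require('electron');

app.whenReady().then(() => {
  // Every BrowserWindow carries the full Chromium rendering stack with it.
  const win = new BrowserWindow({ width: 800, height: 600 });
  win.loadFile('index.html'); // The entire UI is just HTML/CSS/JS.
});

// Quit when all windows are closed (platform conventions differ on macOS).
app.on('window-all-closed', () => app.quit());
```

Run with `npx electron main.js`. Note that this tiny app still ships the whole browser engine - which is exactly the cost being criticized here.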
Proponents of Electron point out that it allows web developers to leverage their familiarity with web technology, possibly share code with their web apps, and produce cross-platform applications. Admittedly, this does make (cross-platform) application development less prohibitive with regards to resources (i.e. time, cost, expertise). But this lowering of the barrier to entry will in some cases lead to lower-quality software.
As you might imagine, Electron is known to have a significant impact on battery life and power usage - it embeds an entire web browser, for goodness' sake!
Web-based UIs do allow cross-platform interface design, but you end up with a free-form UI that doesn't look or feel native on any platform - even though some proponents claim that beautiful cross-platform UIs are one of Electron's upsides.
It also integrates poorly with the underlying platform: because applications use a web view for their UI rather than native widgets, they lose accessibility features. What's particularly worrying is that whenever I've had the opportunity to ask developers on projects that use web-view-based GUIs what their applications' accessibility is like, none of them have known. Users with accessibility needs are effectively left out.
What we're left with is essentially less accessible, less efficient, more power-hungry software with worse platform integration - just because some people and organizations can't be bothered to use the appropriate tools for the job.
Pointing this out might be met with accusations of elitism - but gatekeepers and/or barriers to entry might actually have useful functions in some cases.
I've often seen people defend using the usual web stack to implement a UI by saying it allows them to be more efficient, but it's not unreasonable to suspect they claim this simply because the web stack is what they already know.
I'm all for enabling people to make software - but if it comes at the expense of my (or the planet's) resources, I'd rather have quality than quantity.
This is just ridiculous
It's as if some developers are OK with application efficiency decreasing faster than hardware efficiency is increasing.
Just the other day, I experienced Safari reloading my Facebook tab as I was trying to read through greetings I had received on my birthday, stating that the tab was using too many resources. It's bad enough when this happens on a web page, but I cringe at the thought of it infecting the desktop as well…
I think the HN user Veen is onto something when they say:
What you call "trade-offs" appears to me to be developers externalizing their costs onto users. There are costs to developing desktop applications and developers don't want to pay them, so they make users pay for them in wasted hardware dollars, bandwidth, RAM, battery life, and poor integration.
There's also a slew of other fun stuff, such as:
- The often required restarts
- Large download sizes; often large updates (because few developers ship delta updates, for some reason)
- Slow startup
- General slowness/unresponsiveness
Electron enables lazy people to make garbage
In my experience, Electron apps tend to be bloated, sluggish and energy-inefficient. This means that developers are making a choice on behalf of their users regarding disk space, user experience and battery life. The extent to which users notice any of this depends, of course, on a combination of their machine (e.g. old/new hardware, laptop/desktop) and the application being run.
Also, since such apps don't use the platform's native widgets, accessibility for users with special needs suffers - not to mention the HCI implications of redefining and replacing standard interface elements.
It's one of the slowest, least memory efficient, and most inelegant GUI application platforms out there - bundling an entire web browser just to provide portable GUI functionality.
Electron lets developers build desktop software in a weird, roundabout way in order to pander to people and organizations who can't be bothered to learn anything other than web technology, or who avoid more efficient tech out of laziness or budget and time constraints. It's easy to use, but it's not a good solution.
This state of "modern" desktop application development is, frankly, embarrassing. It reveals incompetence, whether in applying programming knowledge or in reasoning.
In my opinion, all this reeks of poor design, and is arguably (in some cases) downright user-hostile and unethical.
And to portray software produced in this way as "native" is, frankly, ridiculous.
That's not to say there aren't legitimate use cases for this type of technology, though: The main thing Electron does well is lower the barrier to entry and maybe increase development speed. This makes it suited for quick prototypes, internal applications and educational settings.
The markdown editor I use to write my blog posts, for instance, renders the content as an inline WYSIWYG preview. This is one use case where web technologies are uniquely suited to the task.
But since the technology is so resource-intensive, developers should ask themselves:
- Does this application really need to be created in Electron?
Additionally, we should consider the global energy impact and carbon footprint of these inefficient applications…