I was taking a Game Engines class at my uni last semester and we had to add scripting to our C++ engines using Duktape (or its C++ binding, Dukglue). It was surprisingly easy to use Duktape once you figured out the initial boilerplate, and it was especially exciting and fun to write a demo game for the engine in two languages, one of which didn't require recompilation!
Just keep in mind that performance is abysmal (roughly 1% of what you'd get in a browser for the same CPU time), which might be fine for your use case, but many libraries assume a reasonably fast runtime. Another downside is that you're stuck with ES5.
For anyone who needs a performant "JavaScript for embedded" runtime that also fully supports ECMAScript 2018, Moddable's XS is a great choice.[1]
Moddable is the only embedded engine provider in the Ecma TC39 JavaScript language committee, so they tend to be really aggressive in supporting new JavaScript features.
Yes, it's well worth understanding the licensing before trying it.
Their commercial license is 25¢/unit (no minimum), and their FOSS license is LGPL for the runtime, GPL for the development tools, and Creative Commons for their example code.
For anyone interested in XS specifically or embedded software licensing in general, they do a good job of explaining their rationale here: https://www.moddable.com/license
My experience (having embedded both spidermonkey and duktape) is that if embedding is not a primary use case for the engine, it can get messy and complicated to update (lots of API breakage, etc.). That's why I switched from spidermonkey to duktape for my app. That said, it was probably five or six years ago, so perhaps things have changed since. I guess the takeaway is just that: if the engine isn't maintained for embedding, don't use it as such.
If you're embedding, you have C/C++ right there if you need real calculation work done.
Duktape supports some ES6 features (26% for version 2.4, according to kangax's compat table). I'd note that Duktape also supports proper tail calls, which v8 and spidermonkey refuse to support (JSC in Safari does, though). Support for typed arrays is nice, especially when interacting with C.
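Typed arrays are handy on the C boundary because the engine can hand the underlying buffer to native code as a plain pointer plus length. A minimal JS-side sketch (the checksum function is purely illustrative, written in ES5-era style plus typed arrays so it runs on Duktape too):

```javascript
// A byte-wise checksum over a typed array, the kind of binary data an
// embedded engine like Duktape can expose to C without copying.
function checksum(bytes) {
  var sum = 0;
  for (var i = 0; i < bytes.length; i++) {
    sum = (sum + bytes[i]) & 0xff; // stay within C uint8_t range
  }
  return sum;
}

var buf = new ArrayBuffer(4);
var view = new Uint8Array(buf);
view[0] = 1; view[1] = 2; view[2] = 3; view[3] = 250;
console.log(checksum(view)); // (1 + 2 + 3 + 250) & 0xff === 0
```

On the C side you'd fetch the same bytes from the value stack instead of iterating in script, which is where most of the speedup over pure-JS loops comes from.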
If you have room for a larger binary, QuickJS and XS are both faster and both have support for newer specs.
> If you're embedding, you have C/C++ right there if you need real calculation work done.
Sure, but in my use case this would have completely negated the desired benefits (code sharing between web + native) I was aiming for in the first place.
YMMV; oftentimes a 100x performance degradation is perfectly acceptable, sometimes it's not. Just wanted to let folks know.
v8 used to support them behind a flag. The Chakra and spidermonkey teams said implementing them was hard. The v8 team put forward a plan to replace automatic PTC with syntactic PTC. That proposal is essentially dead, and v8 has removed its PTC support completely.
A big complaint was "no stack traces". This doesn't matter (IMO) because shadow stacks have been proven to work well in Safari and in several Scheme implementations (you seldom need more than the first few and last few frames, which a rolling window gives you). We've had way worse than that for decades now, since you lose your whole stack every time the event loop moves on to the next event. Safari has about 30% of total JS marketshare including mobile, and they've been running PTC on websites for a couple of years without complaints, so the whole "we'll break the web" concern seems overblown.
My understanding of the politics is that both the Safari and spidermonkey/v8 members on the consortium refuse to budge so there isn't consensus to move forward. This still seems a non-issue to me though. If you view PTC as an optional enhancement, then you could still add a non-optional syntactic PTC variant.
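For anyone unfamiliar with what PTC actually buys you, here's an illustrative tail-recursive loop. Per ES2015, PTC only applies to strict-mode code; an engine with PTC (JSC, Duktape) reuses the stack frame on the recursive call, so depth is unbounded, while V8 and SpiderMonkey will overflow the stack for large enough `n`:

```javascript
"use strict"; // PTC is only specified for strict mode code

function sumTo(n, acc) {
  if (n === 0) return acc;
  // Tail position: nothing left to do in this frame after the call,
  // so a PTC engine can replace the current frame instead of pushing one.
  return sumTo(n - 1, acc + n);
}

// A small depth works in every engine; try sumTo(1e6, 0) to see
// PTC vs. non-PTC engines diverge (RangeError on V8/SpiderMonkey).
console.log(sumTo(1000, 0)); // 500500
```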
In all somewhat recent V8-based engines, such as Chrome or node.js, you get "zero-cost" full asynchronous stack traces these days. That's independent of the async stack traces the dev tools have provided for a long time, and it's "always on" for free.
The condition: you must use async/await. Even if you return a promise from a function, for that function to be included in the stack trace you must write "return await thePromise;". Because of the await, the context of the function calling an asynchronous function remains available and can be (and now is) used to create the stack trace.
The Firefox JS engine does not have that feature (yet?) last time I checked.
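A small sketch of the "return await" rule described above (function names are made up for illustration). With a plain `return inner()`, the `middleReturn` frame can be missing from V8's async stack trace; with `return await inner()`, the caller's context stays alive until the promise settles, so it can appear in the trace:

```javascript
async function inner() {
  throw new Error("boom");
}

async function middleReturn() {
  return inner(); // this frame may be absent from the async stack trace
}

async function middleAwait() {
  return await inner(); // the await keeps this frame recoverable for the trace
}

// Both reject with the same error; inspect err.stack in a recent V8
// to see the difference in which frames are listed.
middleAwait().catch(function (err) {
  console.log(err.message); // "boom"
});
```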
TypeScript compiles perfectly to Duktape and the code is quite performant. We've used Duktape in production systems without any hiccups for 2+ years, first with Babel and now with TS. It serves us well as an intermediate migration platform for non-evented code, as part of a rewrite that can run alongside legacy code. Once the code is all in TS, we will adapt it to Node's evented style.
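For reference, a minimal tsconfig sketch for targeting an ES5 engine like Duktape (these are standard TypeScript compiler options; the exact setup above isn't published, so treat this as one plausible configuration):

```jsonc
{
  "compilerOptions": {
    "target": "ES5",   // Duktape is essentially an ES5 engine
    "lib": ["ES5"],    // don't assume newer runtime built-ins exist
    "module": "none",  // emit plain scripts rather than module loaders
    "strict": true,
    "outDir": "dist"
  }
}
```

Anything the compiler can't downlevel (e.g. `Promise`, `Map`) still needs a polyfill or has to be avoided, since `target` only rewrites syntax, not library surface.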
YMMV, we tried to use Duktape to share some code between native and web and abandoned it in favor of V8 because the performance just didn't work out. It really depends on what you are doing inside.
I tried it on embedded. Wasn't happening for me. Eventually I moved to mJS, a similar JS engine for C. Eventually I dumped that and moved to Lua.
Part of the issue is that I needed a scripting engine, but I didn't NEED JS. It was also a pain that the script source was ASCII text that had to be base64-encoded, and none of this was as efficient as compiled code.
I wasn’t going to let users run their own code, so that’s probably a big reason people use this.
What issues with Duktape led to you using Lua? I've used both Duktape and Lua for years for non-embedded (desktop), and they seem very similar in requirements, so I would have guessed both would be a similar experience in microcontrollers.
One difference seems to be that in Duktape you can only generate bytecode for one function at a time (duk_dump_function) while in Lua you can just load your whole script and dump it as bytecode (lua_dump). Sure, you can wrap your code in a self executing function in JS to get around this. But Lua looks to be somewhat more convenient for this use case.
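The self-executing-function workaround looks like this on the script side: the whole program becomes a single function expression, which is the one-function unit that `duk_dump_function` can serialize (on the C side you'd compile it as a function and then dump it). Runnable sketch with made-up contents:

```javascript
// Wrap the entire "script" in an IIFE so it is one function value.
// Whatever the script needs to expose is returned from the wrapper.
var program = (function () {
  var state = { counter: 0 }; // private module state, captured by closure

  return {
    tick: function () { return ++state.counter; }
  };
})();

console.log(program.tick()); // 1
console.log(program.tick()); // 2
```

As a bonus, the wrapper keeps the script's locals out of the global object, which is good hygiene in an embedded engine anyway.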
Yes, you’re right. Also, super-annoyingly I think there is another entirely separate JS interpreter called mJS. Someone didn’t search before giving their code a name.
Moddable's XS and QuickJS are probably better at this point, with current spec compatibility and whatnot. Still a lot of respect for Duktape though; it is not that bad to embed.
We switched from Duktape to QuickJS for https://vcvrack.com/Prototype, where performance on x86_64 is a huge concern. Both are incredibly easy to embed compared to most other scripting language interpreters, though QuickJS required us to fork the project and hack 3-4 features/fixes into its codebase; after that, it was a drop-in replacement. In our use case, QuickJS was ~2x faster than Duktape for math-heavy code. We still keep the Duktape implementation up to date but disabled, as a reference for new script engines, since its API is the easiest to read.
600kb or 1.2mb is way larger than 330kb and makes them much less suitable for the smaller embedded applications Duktape seems to target (it can get way smaller than 330kb).
I'm surprised about v8, though. I assumed they'd have a separate binary available so you could run jitless without having to carry all the binary bloat.
I suspect these two things are related. ES2015 more than doubled the size of the spec and each spec since then has continued adding loads of things that take precious kb to implement.
That said, it seems like there are serious code savings to be had with some things like destructuring, template strings, and arrow functions. Generators are probably complex to add, but also don't transpile well (that is, debugging the resulting code is a horrible experience).