When I decided to create a game engine where the game could be entirely scripted in a scripting language, I was choosing between JavaScript (QuickJS), Python (Boost.Python), and Lua (Sol2).
The ease of embedding Lua, even with a C++ wrapper, is incredible. With little effort, I now have something I consider “ready”. Not to mention, it’s a very lightweight VM.
https://github.com/willtobyte/carimbo
A nice thing about seeing an engine or application support Lua for scripting is that it implies Fennel can be used (and transpiled to Lua). Or at least that is highly likely unless something unusually weird is going on.
https://fennel-lang.org/
Boost.Python is not the best scripting tool, to be honest. So that might affect your judgement as well.
UncleEntity 18 days ago [-]
I've yet to find even a decent Python binding generator, TBH.
Generally, I use pybindgen to get the basic module and then hack on that by hand. The main problem is most C(++) libraries aren't designed to interoperate with managed memory languages so most of the work is figuring that out. Don't get me wrong, I've tried to work within the binding library (with pybindgen at least) but the amount of work for anything even slightly complicated isn't really worth it.
For a project with a major Python API (like Blender) you're better off crafting your own Python class generator (again, like Blender does, and even that has some major issues around object lifetime management). Best would be to design the underlying library/application with Python integration in mind, but that's not always possible if you want to include other libraries.
I can say I did mess around with using lua as an embedded scripting language within an application years ago and it wasn't too difficult from what I remember. It was only ever a proof-of-concept and didn't go too far so I never ran into the inevitable edge case as one always does with these sorts of things.
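The "figuring out memory ownership" work described above can be sketched even without a binding generator. Here is a minimal hand-rolled binding via ctypes, using libc's strdup/free; the `dup` helper name is made up for illustration, and the point is the bookkeeping that generators struggle to automate:

```python
# Hand-rolled binding: C allocates memory that Python's GC knows nothing
# about, so the wrapper must copy the data out and free the C side itself.
import ctypes
import ctypes.util

libc = ctypes.CDLL(ctypes.util.find_library("c"))
libc.strdup.restype = ctypes.c_void_p      # raw pointer: we own it now
libc.strdup.argtypes = [ctypes.c_char_p]
libc.free.argtypes = [ctypes.c_void_p]

def dup(s: bytes) -> bytes:
    ptr = libc.strdup(s)                   # allocated by C, invisible to the GC
    try:
        return ctypes.string_at(ptr)       # copy into a Python-owned bytes
    finally:
        libc.free(ptr)                     # manual lifetime management

print(dup(b"hello"))
```

Every wrapped function needs a decision like this about who frees what, which is exactly the part that doesn't show up in a C header.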
miguel_martin 18 days ago [-]
See pybind11 or nanobind
triknomeister 18 days ago [-]
That's very true, actually. Boost.Python is not good, but the alternative then is actually doing everything ourselves, which seems worse in the short term.
conaclos 18 days ago [-]
Is Sol2 a Lua VM or just a wrapper to the standard Lua VM?
delduca 18 days ago [-]
It is a wrapper; in the coming months it will also be a Luau wrapper.
rgoulter 18 days ago [-]
> [bolded] Lua isn’t just a high-level language. It’s an embedded dev strategy.
I find it difficult to take any writing seriously when it uses phrases like this.
stinos 18 days ago [-]
The whole article gives me a 'I know LUA and have been using it for years, I also tried MicroPython for a couple of hours, so now I'm ready to draw conclusions' vibe. With some 'Python vs C' on top of it. Not everything written about MicroPython is factually incorrect, but some of the things are so over the top that it becomes ridiculous. Just one example:
MicroPython can be equally readable, but in practice, many projects end up with blurred layers between system code and scripting. That creates a maintenance burden as projects grow.
Yeah, right. Even if this is the case (I find it hard to believe the author has really seen 'many' professional MicroPython projects), where's the proof the language used was the deciding factor in that, and not the project management, for instance? Or simply the dev's architecting abilities?
epcoa 18 days ago [-]
This is just an ad in a trade rag masquerading as an article or something. It's an ad for that Xedge Lua framework.
thomasm6m6 18 days ago [-]
Much of the article resembles chatgptese... though I suppose for adslop it doesn't matter whether it's written by a human or an llm
AlecSchueler 18 days ago [-]
"It isn't just X, it's Y" is textbook ChatGPT.
zelphirkalt 17 days ago [-]
ChatGPT must have learned that from somewhere...
AlecSchueler 17 days ago [-]
You're quite right to call that out! Of course real people also use that same formatting sometimes; but when ChatGPT does it? It's just chef's kiss.
capyba 17 days ago [-]
as other commenters noted, this sounds like chatgpt and it gave me indigestion when reading
10000truths 17 days ago [-]
It's important to distinguish between "embedded" as in platform, and "embeddable" as in "integrate into existing application". MicroPython is used in embedded platforms, yes, but it's not an embeddable runtime like Lua. MicroPython is designed to replace the traditional C runtime with a Python one. The expected use case is to have a minimal C shim that initializes MicroPython, then define the rest of your business logic as a MicroPython script. There's no equivalent of lua_State that allows you to run multiple MicroPython interpreters concurrently in your program. There's no sandboxing functionality for running untrusted user scripts containing "while True: continue" or "oom = 'a'*(10**40)". MicroPython doesn't solve the "I want rapid iteration with scripts in my game engine" problem, it solves the "I want to read the temperature sensor on my IoT board using Python" problem.
stinos 17 days ago [-]
it's not an embeddable runtime like Lua
While it's true you can't have multiple MicroPython interpreters running concurrently (or at least not easily; it's not that the design makes this impossible, it's just that all in all MicroPython is fairly young and development focus has been put elsewhere), it is possible to embed MicroPython. Not completely out of the box, needs some glue code etc. See for example https://github.com/micropython/micropython/tree/master/ports....
Lua ranks higher than MicroPython in this list
https://www.farginfirmware.com/home/lua-vs-micropython
Github account "SkipKaczinksi" thinks Lua is generally faster
https://hackaday.com/2020/11/14/micropython-on-microcontroll...
Hackaday commenter "Michael Polia" suggests Lua, smaller, faster
https://www.cnx-software.com/2021/11/28/toit-open-source-lan...
Toit scripting language claims to be 30x faster than MicroPython
Toit founder was "responsible for initial development of V8"
jacknews 18 days ago [-]
Lua is just a much simpler language at heart.
Python does have the 'there should be one, preferably only one, way to do it' mantra, but to me it utterly fails at that, and is in fact a bit of a 'kitchen sink' or 'armchair' language.
That is its strength in some ways: it's easy and approachable, and has more libraries than perhaps any other language, so you can usually get something working fairly quickly.
But it's not so suited to sparse environments. You can't easily turn a plump armchair with automatic footrests and telescoping side-tables into a plywood Eames.
qznc 18 days ago [-]
Python is easy. Lua is simple.
The problem of "easy" is that it implies hidden complexity for its magic. The problem of "simple" is that it requires more work from its users.
3036e4 18 days ago [-]
Python also seems to have more issues with stability, in the sense that things randomly stop working when you upgrade from version 3.x to 3.x+1. Lua might not be perfect either, but at least it seems common that a platform supports a range of Lua versions instead of forcing an immediate upgrade.
marky1991 17 days ago [-]
Could you give some examples?
3036e4 16 days ago [-]
No, but we have a steady stream of tickets at work for scripts that must be fixed to work with a new minor python version.
I thought this was common knowledge. Just look at the non-empty list of removed things in every minor release changelog. If you have enough code to maintain, something is going to affect you directly or indirectly.
https://docs.python.org/pl/dev/whatsnew/3.15.html
Lua is a great language, including for embedded. And I am sure that their Lua-based product is good also. But this article was not very convincing as to why it "beats MicroPython". The criticisms are vague or not correct.
Extending MicroPython in C is quite easy, and one can implement third-party modules in the same way all the official modules are made. There is a build variable to set for including that in your custom firmware build - nothing particularly tricky about it.
The article goes on to lament that popular libraries from Python world are not available, including numpy. But actually there is a very good reimplementation of numpy (and parts of scipy), called ulab.
KaiserPro 18 days ago [-]
With respect that sounds like marketing fluff.
You use MicroPython when you have lots of horsepower and need something fairly robust on the network.
You use C/C++ if you need precise control over power, memory, or CPU, even though network stuff is much harder to do quickly and securely that way. (There might be better embedded TLS support now.)
Lua is frankly just sparkling C. Sure, if someone has created a bunch of libraries for you, then great; if not, you've now got to support the Lua toolchain and your own microcontroller's toolchain, and port whatever control lib the manufacturer provides yourself.
Or, as this is a marketing page, pay https://realtimelogic.com/products/xedge/ to do it for you.
> You use MicroPython when you have lots of horsepower
It runs on a 2350.
blackguardx 18 days ago [-]
Cortex-M33s are decently powerful in the non-Linux embedded world. Micropython isn't competing with Linux, it is competing with bare-metal code.
rasz 17 days ago [-]
A 2350 with PSRAM is more powerful than a top-of-the-line 1997 PC.
KaiserPro 18 days ago [-]
the 2350 is huge compared to an STM8, or atmega.
Palomides 18 days ago [-]
is anyone "serious" using micropython or lua for embedded work?
zevv 18 days ago [-]
I have been developing Lua-heavy embedded products as a freelancer for about 20 years now, including VoIP devices, home automation controllers, industrial routers, digital video recorders, and more. These systems typically consist of a Linux kernel, some libc implementation, the Lua interpreter, and a few third-party support libs to help build the app. The Lua apps range from 30k to 100k lines of code, depending on the application. Some of these devices can be considered 'small' in 2025 terms: 8MB of flash, 64MB of RAM. Lua is doing great here.
All of these products are still alive today, actively supported and making my customers good money.
Some things come very naturally to Lua: Lua <=> C interfacing is a breeze, and while some modern languages are still struggling to figure out how to do proper async, Lua has been able to do this for decades. The language itself is minimal and simple but surprisingly powerful - a few smart constructs like coroutines, closures and metatables allow for a lot of different paradigms.
For new projects at this scale, I would still choose Lua + C/C++ as my stack. Over the last few years I have been visiting other ecosystems to see what I'm missing out on (Elixir, Rust, Nim), and while I learned to love all of those, I found none of them as powerful, low-friction and flexible as Lua.
conaclos 18 days ago [-]
I am currently working on an embedded system with 264 KB of RAM and 4 MB of flash. Do you think Lua could be used in such limited settings? I am also considering the Berry scripting language [0].
[0] https://berry-lang.github.io/
Assuming your flash allows XIP (execute in place) so all that memory is available for your lua interpreter data, you should at least be able to run some code, but don't expect to run any heavy full applications on that. I don't know Berry but it sounds like a better fit for the scale of your device.
But sure, why not give it a try: Lua is usually easy to port to whatever platform, so just spin it up and see how it works for you!
conaclos 18 days ago [-]
It is a RP2040. We plan to eventually upgrade to RP2350B.
Viper Bytecode emitter.
I haven't worked on a system that limited (not even OpenWRT routers) since a dev board in college.
The experience I had there might be your best bet for something productive. That board came with a 'limited C-like compiler' (took a mostly complete subset of C syntax and transcribed it to ASM).
You'll probably be doing a lot of things like executing in place from ROM, and strictly managing stack and scratch pad use.
The 64MB of RAM and 8MB (I assume that's 64Mbit) of ROM allow for highly liberating things like compressed executable code copied to faster RAM, modify in place code, and enough spare RAM otherwise to use scripting languages and large buffers for work as desired.
rleigh 17 days ago [-]
It's more than generous. You can run it with much less resource utilisation than this. It only needs a few tens of kilobytes of flash (and you can cut it right back if you drop bits you don't need in the library code). 32 KiB is in the ballpark of what you need. As for RAM, the amount you need depends upon what your application requires, but it can be as little as 4-8 KiB, with needs growing as you add more library code and application logic and data.
If you compare this with what MicroPython uses, its requirements are well over an order of magnitude larger.
matt_trentini 17 days ago [-]
Those are generous resources for MicroPython. And it'll be faster and less quirky to develop in than either Berry or Lua.
mrheosuper 16 days ago [-]
the article is comparing micropython and lua, so i assume this is about MCU, not embedded linux.
Most of the time you have less than 1MB of ram on MCU
matt_trentini 18 days ago [-]
Yes, we use MicroPython for medical device development up to class B.
xchip 4 days ago [-]
Could you give a link to such devices?
matt_trentini 2 days ago [-]
I can't reveal specific implementation details but I work for Planet Innovation:
Some of our products use MicroPython though we also use a whole host of other technologies. Some of our devices are proof-of-concept (often designed to progress a theory) but we also deliver up to Class B solutions.
5d41402abc4b 17 days ago [-]
How do you handle memory fragmentation?
matt_trentini 17 days ago [-]
Carefully, at least for devices with higher classifications. Using pre/early allocation helps but, more importantly, we monitor memory use over time in realistic scenarios. We've built tooling, like a memory-profiler [1] that allows us to observe memory consumption and quantify performance over time.
However, it turns out that MicroPython has a simple and efficient GC - and once you eliminate code that gratuitously fragments memory it behaves quite predictably. We've tested devices running realistic scenarios for months without failure.
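The pre/early allocation mentioned above is mostly the "allocate once, reuse forever" idiom: take one buffer at startup and fill it in place instead of creating a fresh object per read. A sketch with a hypothetical `read_packet` helper; the same code runs under CPython and, modulo small differences, MicroPython:

```python
# Pre-allocation idiom: one long-lived buffer, zero-copy views into it.
# Per-read bytes objects are what gratuitously fragment a small heap.
buf = bytearray(64)            # allocated once, e.g. at boot
view = memoryview(buf)         # zero-copy slicing over the buffer

def read_packet(source, n):
    """Fill the first n bytes of the shared buffer from `source`."""
    view[:n] = source[:n]      # writes in place; no new buffer is created
    return view[:n]            # a view, still backed by `buf`

pkt = read_packet(b"hello, sensor", 5)
print(bytes(pkt))              # copy out only when you actually need one
```

On a device, `source[:n]` would typically be replaced by something like `uart.readinto(view[:n])`, which keeps even the read itself allocation-free.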
The embedded world is really vast. If it's something safety critical, regulations won't allow it. But the regulations say nothing about all the test rigs you'll be building. IoT is another domain where people do whatever they find convenient.
jononor 17 days ago [-]
Just as one example, MicroPython is deployed in several satellites (cubesats). There are some conference talks and podcasts about it.
pmarreck 18 days ago [-]
there are thousands of products that use Lua underneath or in some capacity. I investigated LuaJIT these past few months and I think it is underrated.
joezydeco 18 days ago [-]
Define "serious"?
Every so often I have a need for a small cheap device interoperating with a larger system that I'm developing. Like something that sits on MODBUS and does a simple task when signalled. I've taken the RP2040 and Pico board and spun it into a gizmo that can do whatever I want with Micropython, and it's an order of magnitude cheaper and faster than trying to spin it up in STMCube.
ycombinatrix 17 days ago [-]
"Serious embedded devs" are probably using a compiled language.
Dork1234 17 days ago [-]
Well, it is compiled into bytecode.
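That compile step is easy to see from the host side. CPython, like MicroPython, compiles source to bytecode before the VM executes it, and the `dis` module makes it visible (MicroPython has its own opcode set, and mpy-cross can even precompile .py files to .mpy bytecode ahead of time):

```python
# Show the bytecode CPython compiles a function into before running it.
import dis

def add(a, b):
    return a + b

dis.dis(add)  # prints the compiled opcodes, ending in RETURN_VALUE
```

So "interpreted" here means "compiled to bytecode and run on a VM", not "re-parsed on every execution".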
skybrian 18 days ago [-]
As a hobbyist I just use Arduino (via platformio). I don't think I need an interpreter of any sort for microcontrollers because recompiling and uploading the flash on hobbyist boards is quick and easy.
But I'd like to try some other compiled language someday because I'm not a big fan of C++. Any recommendations for something that works well with a Raspberry Pi Pico?
Rebelgecko 18 days ago [-]
I'm not an expert, but from what I've seen Rust is one of the big alternatives. It's new enough to be trendy and fixed a lot of C++ issues, while being mature enough that the tooling is OK.
I'm also intrigued by Zig. I haven't used it for anything yet but the language looks fun and I believe platform.io supports it
Raspberry Pis are beefy enough that you could also get away with less systems-y languages. I like Kotlin. By default Kotlin needs a JVM but I think it's usable if you build native executables. However if you want to fiddle with GPIOs you might have to do it by manually setting things on/off in the filesystem (edit, sorry, just read you're using a PICO. Not sure how well Kotlin is supported)
20after4 18 days ago [-]
The Pi Pico is a microcontroller, not a full Linux-capable SBC. The confusion isn't helped by the fact that Raspberry Pi has a bunch of variants and almost all of them are SBCs.
The problem with MicroPython is that they don't have unit tests for the features they implement, so they keep breaking things over and over, or fix something but break something else.
matt_trentini 13 days ago [-]
Unit test coverage is exceptionally high for the interpreter, and always has been:
Testing the port-specific code has been a much more challenging problem.
Which is why in the past year or two there's been a lot of energy put into HIL testing. There are now a few dozen boards that are automatically tested with an increasingly rich suite of hardware tests. Both the number of boards and tests are increasing rapidly.
It's not perfect but it's getting pretty darn good. If you want to help out please do reach out.
xchip 9 days ago [-]
Then why do they have 1300 issues open? Good luck to you, but it looks like a land mine to me. There are 420 pull requests not merged, so that is a lot of disappointed developers.
They need to test the whole ecosystem, that is the usual interaction with typical peripherals (I2C, I2S, SPI, Serial, interrupts...) and for every bug fixed there should be a unit test to make sure future changes don't cause regressions.
I had this conversation with them but they take every opportunity to justify/ask for more funding.
They are also prioritizing adding support for new chips but they quickly get eaten alive by chaos, regressions and platform fragmentation.
matt_trentini 6 days ago [-]
> Then why do they have 1300 issues open? There are 420 pull requests not merged...
Because it's a large, widely-used project with relatively few core developers. Many other projects have similar numbers of issues, that's not unique to MicroPython! For example, I admire Zephyr's organisation, yet they have double the issues and 5x the PRs.
> They need to test the whole ecosystem...and for every bug fixed there should be a unit test...
Sure, that's a great ideal to strive for. That was - and is - the case for the language and interpreter. It's becoming more common for port-specific code as the tools to allow automated hardware testing are implemented. But it doesn't happen overnight and it's a large surface area.
> I had this conversation with them but they take every opportunity to justify/ask for more funding.
Ah, I presume you're referring to this discussion?
No-one asked for funding. Some noted that such testing infrastructure doesn't come without effort or for free.
BTW I'm not sure if that was you, but that post had a pretty unhelpful, disrespectful tone. If that was you, I suggest you check your tone and get in and help create some tests. We'd appreciate the help!
> They are also prioritizing adding support for new chips...
That's often paid work. Recent paid work has helped fund HIL testing. Further, a careful expansion of the number of ports is necessary to avoid becoming irrelevant.
> ...get eaten alive by chaos, regressions and platform fragmentation
I use MicroPython commercially, daily...and I don't share your experience with chaos or regressions. Sure, sometimes issues occur, particularly if you're on the bleeding edge - but they're pretty well controlled and rapidly fixed. YMMV.
As for platform fragmentation, I struggle to understand your concerns; it has been improving for a long time now as developers have focused on ensuring compatibility between ports. Again, yes, there are some differences - but given how radically different the micros can behave, the consistency between ports is pretty remarkable.
xchip 4 days ago [-]
>testing infrastructure doesn't come without effort or for free.
Unit tests are more effort up front, but we implement them because in the long run they are less effort and we catch issues before they happen.
Up to you.
theoutfield 18 days ago [-]
Another scripting language that I’ve been using in embedded systems with a little more memory is AngelScript. It’s underrated in this space. It’s very easy to extend and has the advantage of being a subset of C/C++.
jokoon 18 days ago [-]
I dislike how the syntax of lua feels, maybe the same for its semantics
gdscript is so awesome
analog31 18 days ago [-]
Is there a way to try out embedded Lua within the Arduino dev environment? Yeah I know, friends don't let friends, but I'm still curious.
7thaccount 18 days ago [-]
Actually I haven't heard this. Does Arduino have a bunch of controversy now?
analog31 17 days ago [-]
It evokes the "real engineers don't use it" reaction, similar to any tool that's perceived as being of primary interest to hobbyists and students.
Some concerns are valid: Arduino doesn't have as much flexibility for digging really deep into things like pin and memory assignments and what happens when a microcontroller starts up. Also, the quality and documentation of Arduino support can vary from one MCU family to another.
There's a concern about the quality of libraries and code.
It doesn't support hardware debugging.
Granted, the embedded community has good reasons for being conservative, especially for critical applications.
smohare 18 days ago [-]
What’s the dev experience actually like for serious Lua? I’ve only used it for some basic neovim configuration. The dynamism and lack of type hinting that were the norm in Python when I started having to review professional code, after transitioning from pure mathematics, were a major cognitive blocker for me. With a math paper I could typically skim the first few pages to map the author’s particular symbology (if non-standard) and know exactly what was being expressed. I could never do that with untyped code.
unleaded 18 days ago [-]
Lua can be quite an elegant language once you get to know it well, but you can still use it like most other programming languages; it's not that weird. There are quirks like arrays starting at 1 (and all arrays being hash tables), but they don't take that long to get used to. The real strength is in the ecosystem and implementation itself: it's designed to be easily embedded into applications and there's not really much else like it. Some developers want to incorporate a scripting language into their project and get turned off by Lua's quirks and choose something else, but that usually ends up causing more problems than it solves.
On typing, there's only a few main ones you need to worry about—strings, functions, tables and numbers. I don't think it does weird things like JS where it converts between them without asking. Luau adds some type hinting if it's a big point of concern but I haven't really looked into it much.
SV_BubbleTime 18 days ago [-]
> There are quirks like arrays starting at 1
I know it’s probably an overreaction, but this was a complete non-starter for me.
Lyngbakr 18 days ago [-]
While I understand the aversion, I can't help but think that people miss out on some really cool experiences when they balk at stuff like 1-based arrays or parens in Lisp or whatever. Sure, those quirks may remain deal breakers after you've given the language a thorough try, but you may also gain super interesting new perspectives in the process.
docandrew 18 days ago [-]
Not having to put length-1 everywhere is a good thing, actually.
const_cast 17 days ago [-]
Probably just use a .last method or something. Reads better too.
pmarreck 18 days ago [-]
There are languages that transpile to Lua that get you things like typing…
> What’s the dev experience actually like for serious Lua
The dev experience for lua is f-ing awful.
The language is small, but not “simple”; it’s stuck in 1985.
The tooling is nearly non-existent and the stuff that does exist is a joke.
The few libraries that exist are awful and barely maintained; the maintained libraries are maintained by neckbeards with a god-complex.
The community is “reimplementing everything by hand is the right way to do it AND ALSO you’re an idiot doing it wrong” toxic. There are a million good reasons why it only has a foothold in nginx and Roblox territory.
It’s not a joke to say that it’s f-ing terrible all the way around.
hajile 17 days ago [-]
Lua was designed in 1993. This was back when MS-DOS was still cutting edge and the 66MHz Pentium was cutting-edge consumer technology.
Even "modern" languages often don't have features like first-class functions, closures, proper tail calls, etc that Lua has had for a very long time now. LuaJIT also trades blows with JS almost certainly making it the fastest or second-fastest dynamic language around.
There's a lot to like about the language (aside from array indexes starting at 1), but I think you are right about the ecosystem and probably right about most of community.
guenthert 17 days ago [-]
MS-DOS was never cutting edge, but I came here because I wonder about the performance claims.
makes it look fairly slow, competing with Ruby rather than JS.
hajile 17 days ago [-]
He removed LuaJIT from the benchmark, opting to use the much slower Lua (Lua is used for small embedded stuff and LuaJIT is used for high-performance stuff, with a popular recent example being the game Balatro). Here's the Vercel version which uses LuaJIT. I have no idea if the benchmarks were deoptimized by him (as he did to several languages he thought "were too fast").
LuaJIT used to have a performance comparison page, but it was removed a couple years ago apparently. It showed that LuaJIT is an average of around 16x faster than standard Lua (geomean). Very impressive considering it was written by just one guy for years (Mike Pall), and even more impressive when you consider that the JIT is only around 0.5 MB (compare that with V8, which comes in at somewhere around 28 MB IIRC).
LuaJIT supports most of the features of Lua versions after 5.1 with the major missing feature being 64-bit integers, but like modern JS JITs, it actually uses 31/32-bit ints internally most of the time. Even in Lua 5.4 code, you are using implicit rather than explicit ints 99% of the time.
I haven't run the code to see, but I'm willing to bet that you can copy all the current benchmark code into LuaJIT and it'll run just fine.
> You have no idea.
I know with certainty that deoptimizations were applied to at least some scripts. Here's three examples for Common Lisp, StandardML, and Haskell over some time.
It's not a question of if this happens -- only if it affects Lua (I've never checked).
mardifoufs 16 days ago [-]
Wait, that's news to me. Last time I checked, maybe 2 years ago, luajit supported some Lua >5.2 features. But, just a handful of those. Is it truly supporting most of the newer features nowadays?
igouy 16 days ago [-]
> extensions.html
On that page, what words do you think support your claim "was removed before Lua 5.2 was released".
> I know with certainty
I made those code changes. I wrote de-optimized as a joke.
docandrew 18 days ago [-]
nginx and Roblox and redis and nmap and neovim and cryengine … the list goes on
There are a LOT of tools with embedded Lua scripting capabilities.
giraffe_lady 18 days ago [-]
8 years of full time professional lua development experience here and unfortunately I agree with all of this. I use fennel when I can; it doesn't improve any of the library or tooling problems but it doesn't make them worse either and addresses several of the problems with the language semantics itself.
Rendered at 07:25:20 GMT+0000 (UTC) with Wasmer Edge.
The ease of embedding Lua, even with a C++ wrapper, is incredible. With little effort, I now have something I consider “ready”.
Not to mention, it’s a very lightweight VM.
https://github.com/willtobyte/carimbo
https://fennel-lang.org/
Generally, I use pybindgen to get the basic module and then hack on that by hand. The main problem is most C(++) libraries aren't designed to interoperate with managed memory languages so most of the work is figuring that out. Don't get me wrong, I've tried to work within the binding library (with pybindgen at least) but the amount of work for anything even slightly complicated isn't really worth it.
For a project with a major python API (like blender) you're better off crafting your own python class generator (again, like blended does, and even that has some major issues around object lifetime management). Best would be to design the underlying library/application with python integration in mind but that's not always possible if you want to include other libraries.
I can say I did mess around with using lua as an embedded scripting language within an application years ago and it wasn't too difficult from what I remember. It was only ever a proof-of-concept and didn't go too far so I never ran into the inevitable edge case as one always does with these sorts of things.
I find it difficult to take any writing seriously when it uses phrases like this.
MicroPython can be equally readable, but in practice, many projects end up with blurred layers between system code and scripting. That creates a maintenance burden as projects grow.
Yeah, right. Even if this is the case (I find it hard to belive the author has really seen 'many' sort of professional MicroPython projects), where's the proof the language used was the deciding factor in that. And not the project management for instance. Or simply the dev's architecturing abilities.
While it's true you can't have multiple MicroPython interpreters running concurrently (or at least not easily; it's not that the design makes this impossible, it's just that all in all MicroPython is fairly young and development focus has been put elsewhere), it is possible to embed MicroPython. Not completely out of the box, needs some glue code etc. See for example https://github.com/micropython/micropython/tree/master/ports....
Lua ranks higher than MicroPython in this list
https://www.farginfirmware.com/home/lua-vs-micropython
Github account "SkipKaczinksi" thinks Lua is generally faster
https://hackaday.com/2020/11/14/micropython-on-microcontroll...
Hackaday commenter "Michael Polia" suggests Lua, smaller, faster
https://www.cnx-software.com/2021/11/28/toit-open-source-lan...
Toit scripting language claims to be 30x faster than MicroPython
Toit founder was "responsible for initial development of V8"
Python does have the 'there should be one, preferably only one, way to do it' mantra, but to me it utterly fails at that, and is in fact a bit of a 'kitchen sink' or 'armchair' language.
That is it's strength in some ways, it's easy and approachable, and has more libraries than perhaps any other language, so you can usually get something working fairly quickly.
But it's not so suited to sparse environments. You can't easily turn a plump armchair with automatic footrests and telescoping side-tables into a plywood Eames.
The problem of "easy" is that it implies hidden complexity for its magic. The problem of "simple" is that it requires more work from its users.
I thought this was common knowledge. Just look at the non-empty list of removed things in every minor release changelog. If you have enough code to maintain something is going to affect you directly or indirectly.
https://docs.python.org/pl/dev/whatsnew/3.15.html
You use micrpython when you have lots of horsepower and need something fairly robust on the network.
You use C/C++ if you need precise control over power, memory or CPU. Even though if you're doing network stuff its much harder to do quickly and securely. (THere might be better embedded TLS support now)
Lua is frankly just sparkling C. Sure if someone has create a bunch of libraries for you, then great, if not, you've now go to support lua toolchain, and your own microcontrollers toolchain, and port what ever control lib the manufacturer provides yourself.
Or, as this is a marketing page, pay https://realtimelogic.com/products/xedge/ to do it for you.
It runs on a 2350.
All of these products are still alive today, actively supported and making my customers good money.
Some things come very naturally to Lua: Lua <=> C interfacing is a breeze, and while some modern languages are still struggling to figure out how to do proper async, Lua has been able to do this for decades. The language itself is minimal and simple but surprisingly powerful - a few smart constructs like coroutines, closures and metatables allow for a lot of different paradigms.
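To make the coroutine point concrete, here is a minimal sketch of cooperative multitasking in plain Lua (the task names and step counts are made up; there's no real I/O, just the yield/resume pattern a scheduler builds on):

```lua
-- Create a coroutine that does some "work" in steps, yielding
-- control back to the scheduler between steps.
local function task(name, steps)
  return coroutine.create(function()
    for i = 1, steps do
      print(name, "step", i)
      coroutine.yield()
    end
  end)
end

-- Round-robin scheduler: resume each live coroutine in turn,
-- dropping it once it has finished.
local tasks = { task("a", 2), task("b", 3) }
while #tasks > 0 do
  for i = #tasks, 1, -1 do
    if coroutine.status(tasks[i]) == "dead" then
      table.remove(tasks, i)
    else
      coroutine.resume(tasks[i])
    end
  end
end
```

A real async runtime does essentially this, except it resumes a coroutine when the I/O it is waiting on completes rather than in strict rotation.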
For new projects at this scale, I would still choose Lua + C/C++ as my stack. Over the last few years I have been visiting other ecosystems to see what I'm missing out on (Elixir, Rust, Nim), and while I learned to love all of those, I found none of them as powerful, low-friction and flexible as Lua.
[0] https://berry-lang.github.io/
Assuming your flash allows XIP (execute in place), so that all that memory is available for your Lua interpreter's data, you should at least be able to run some code, but don't expect to run any heavy full applications on that. I don't know Berry but it sounds like a better fit for the scale of your device.
But sure, why not give it a try: Lua is usually easy to port to whatever platform, so just spin it up and see how it works for you!
Viper Bytecode emitter.
The experience I had there might be your best bet for something productive. That board came with a 'limited C-like compiler' (it took a mostly complete subset of C syntax and translated it to assembly).
You'll probably be doing a lot of things like executing in place from ROM, and strictly managing stack and scratch pad use.
The 64MB of RAM and 8MB (I assume that's 64Mbit) of ROM allow for highly liberating things like compressed executable code copied to faster RAM, modify-in-place code, and enough spare RAM otherwise to use scripting languages and large work buffers as desired.
If you compare this with what MicroPython uses, its requirements are well over an order of magnitude larger.
Most of the time you have less than 1MB of RAM on an MCU.
https://planetinnovation.com/
Some of our products use MicroPython though we also use a whole host of other technologies. Some of our devices are proof-of-concept (often designed to progress a theory) but we also deliver up to Class B solutions.
However, it turns out that MicroPython has a simple and efficient GC - and once you eliminate code that gratuitously fragments memory it behaves quite predictably. We've tested devices running realistic scenarios for months without failure.
[1] https://github.com/pi-mst/micropython-memory-profiler
Every so often I have a need for a small cheap device interoperating with a larger system that I'm developing. Like something that sits on MODBUS and does a simple task when signalled. I've taken the RP2040 and Pico board and spun it into a gizmo that can do whatever I want with Micropython, and it's an order of magnitude cheaper and faster than trying to spin it up in STMCube.
But I'd like to try some other compiled language someday because I'm not a big fan of C++. Any recommendations for something that works well with a Raspberry Pi Pico?
I'm also intrigued by Zig. I haven't used it for anything yet but the language looks fun and I believe platform.io supports it
Raspberry Pis are beefy enough that you could also get away with less systems-y languages. I like Kotlin. By default Kotlin needs a JVM, but I think it's usable if you build native executables. However, if you want to fiddle with GPIOs you might have to do it by manually setting things on/off in the filesystem (edit: sorry, just read you're using a Pico. Not sure how well Kotlin is supported)
https://micropython.org/resources/code-coverage/
('py' folder: 99.2% line, 88.7% branch coverage)
Testing the port-specific code has been a much more challenging problem.
Which is why in the past year or two there's been a lot of energy put in to HIL testing. There are now a few dozen boards that are automatically tested with an increasingly rich suite of hardware tests. Both the number of boards and tests are increasing rapidly.
It's not perfect but it's getting pretty darn good. If you want to help out please do reach out.
They need to test the whole ecosystem, that is the usual interaction with typical peripherals (I2C, I2S, SPI, Serial, interrupts...) and for every bug fixed there should be a unit test to make sure future changes don't cause regressions.
I had this conversation with them but they take every opportunity to justify/ask for more funding.
They are also prioritizing adding support for new chips but they quickly get eaten alive by chaos, regressions and platform fragmentation.
Because it's a large, widely-used project with relatively few core developers. Many other projects have similar numbers of issues, that's not unique to MicroPython! For example, I admire Zephyr's organisation, yet they have double the issues and 5x the PRs.
> They need to test the whole ecosystem...and for every bug fixed there should be a unit test...
Sure, that's a great ideal to strive for. That was - and is - the case for the language and interpreter. It's becoming more common for port-specific code as the tools to allow automated hardware testing are implemented. But it doesn't happen overnight and it's a large surface area.
> I had this conversation with them but they take every opportunity to justify/ask for more funding.
Ah, I presume you're referring to this discussion?
https://github.com/orgs/micropython/discussions/13436
No-one asked for funding. Some noted that such testing infrastructure doesn't come without effort or for free.
BTW I'm not sure if that was you, but that post had a pretty unhelpful, disrespectful tone. If that was you, I suggest you check your tone and get in and help create some tests. We'd appreciate the help!
> They are also prioritizing adding support for new chips...
That's often paid work. Recent paid work has helped fund HIL testing. Further, a careful expansion of the number of ports is necessary to avoid becoming irrelevant.
> ...get eaten alive by chaos, regressions and platform fragmentation
I use MicroPython commercially, daily...and I don't share your experience with chaos or regressions. Sure, sometimes issues occur, particularly if you're on the bleeding edge - but they're pretty well controlled and rapidly fixed. YMMV.
As for platform fragmentation, I struggle to understand your concerns; it has been improving for a long time now as developers have focused on ensuring compatibility between ports. Again, yes, there are some differences - but given how radically different the micros can behave, the consistency between ports is pretty remarkable.
We don't implement unit tests because it is more effort; we implement them because in the long run it is less effort, and we catch issues before they happen.
Up to you.
gdscript is so awesome
Some concerns are valid: Arduino doesn't have as much flexibility for digging really deep into things like pin and memory assignments and what happens when a microcontroller starts up. Also, the quality and documentation of Arduino support can vary from one MCU family to another.
There's a concern about the quality of libraries and code.
It doesn't support hardware debugging.
Granted, the embedded community has good reasons for being conservative, especially for critical applications.
On typing, there are only a few main ones you need to worry about: strings, functions, tables and numbers. Unlike JS, it won't silently convert between them in comparisons ("1" == 1 is false), though arithmetic will coerce numeric strings. Luau adds some type hinting if that's a big point of concern, but I haven't really looked into it much.
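For what it's worth, a quick sketch of how those types behave in plain Lua (any 5.x version or LuaJIT):

```lua
-- The handful of core value types:
assert(type(42) == "number")
assert(type("hi") == "string")
assert(type({}) == "table")
assert(type(print) == "function")
assert(type(nil) == "nil")
assert(type(true) == "boolean")

-- Unlike JS, == never converts between types:
assert(("1" == 1) == false)  -- false in Lua; true in JS

-- Arithmetic does coerce numeric strings, though:
assert("2" + 3 == 5)
```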
I know it’s probably an overreaction, but this was a complete non-starter for me.
https://typescripttolua.github.io/
I’m personally a fan of YueScript which is basically an evolution of MoonScript (but it’s not typed).
https://yuescript.org/
LuaJIT has ridiculously easy C interop.
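For example, with LuaJIT's ffi module (LuaJIT only; plain Lua doesn't ship it), calling into libc needs no binding code at all:

```lua
local ffi = require("ffi")

-- Declare the C function's signature once...
ffi.cdef[[
int atoi(const char *nptr);
]]

-- ...then call it directly through the C namespace.
print(ffi.C.atoi("42"))  --> 42
```

The same mechanism works for arbitrary shared libraries via ffi.load, which is a big part of why LuaJIT is so popular for glue code.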
The dev experience for lua is f-ing awful.
The language is small, but not “simple”; it’s stuck in 1985.
The tooling is nearly non-existent and the stuff that does exist is a joke.
The few libraries that exist are awful and barely maintained; the maintained libraries are maintained by neckbeards with a god-complex.
The community is “reimplementing everything by hand is the right way to do it AND ALSO you’re an idiot doing it wrong” toxic. There are a million good reasons why it only has a foothold in nginx and Roblox territory.
It’s not a joke to say that it’s f-ing terrible all the way around.
Even "modern" languages often don't have features like first-class functions, closures, proper tail calls, etc that Lua has had for a very long time now. LuaJIT also trades blows with JS almost certainly making it the fastest or second-fastest dynamic language around.
There's a lot to like about the language (aside from array indexes starting at 1), but I think you are right about the ecosystem and probably right about most of community.
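To make the first-class-functions point concrete, a small sketch in plain Lua (works in 5.1+ and LuaJIT):

```lua
-- Closures: the inner function captures n and keeps it alive
-- across calls.
local function counter()
  local n = 0
  return function()
    n = n + 1
    return n
  end
end

local c = counter()
assert(c() == 1 and c() == 2)

-- Proper tail calls: the recursive call reuses the stack frame,
-- so a million levels of "recursion" won't overflow.
local function loop(i)
  if i == 0 then return "done" end
  return loop(i - 1)
end
assert(loop(1000000) == "done")
```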
https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
makes it look fairly slow, competing with Ruby rather than JS.
https://programming-language-benchmarks.vercel.app/lua-vs-ja...
LuaJIT used to have a performance comparison page, but it was apparently removed a couple of years ago. It showed LuaJIT averaging around 16x faster than standard Lua (geomean). Very impressive considering it was written by just one guy for years (Mike Pall), and even more impressive when you consider that the JIT is only around 0.5 MB (compare that with V8, which comes in at somewhere around 28 MB IIRC).
https://web.archive.org/web/20230605114058/http://luajit.org...
You have no idea.
> He removed LuaJIT
LuaJIT supports Lua 5.1
The benchmarks game shows Lua 5.4.7
From what I understand, LuaJIT was removed before Lua 5.2 was released, so that wasn't the reason.
https://luajit.org/extensions.html
LuaJIT supports most of the features of Lua versions after 5.1 with the major missing feature being 64-bit integers, but like modern JS JITs, it actually uses 31/32-bit ints internally most of the time. Even in Lua 5.4 code, you are using implicit rather than explicit ints 99% of the time.
I haven't run the code to see, but I'm willing to bet that you can copy all the current benchmark code into LuaJIT and it'll run just fine.
> You have no idea.
I know with certainty that deoptimizations were applied to at least some scripts. Here are three examples, for Common Lisp, Standard ML, and Haskell, over some time.
https://zerf.gitlab.io/ComputerLanguageBenchmarksGame2018Arc...
https://github.com/lemire/ComputerLanguageBenchmark/blob/fbe...
https://hackage.haskell.org/package/ajhc-0.8.0.4/src/example...
Here's a C example from Mike Pall (presumably the same guy who created LuaJIT) that also got the deopt treatment by Isaac Gouy.
https://github.com/lemire/ComputerLanguageBenchmark/blob/fbe...
It's not a question of whether this happens -- only whether it affects Lua (I've never checked).
On that page, what words do you think support your claim "was removed before Lua 5.2 was released".
> I know with certainty
I made those code changes. I wrote "de-optimized" as a joke.
There are a LOT of tools with embedded Lua scripting capabilities.