Discussion forum for David Beazley

So I've been pulled into this... "programming" thing. Guess there's like a community of some kind?


Stumbled across the Beaz Odyssey. I’m having a lot of fun playing with this stuff. This little “cross section” of computer science. It’s hard to pinpoint what “this stuff” is as a collective, but I’m damn interested!

So how do I get involved in this shenaniganry?

For reference, I’ve got a reasonably solid math background, with limited computer science experience. (Certainly not “nothing”, but I’m not sure how you guys would define “nothing” :sweat_smile:)

Regardless of the above, I’d love to invite any discussion anyone feels like having.


Hmmm. The Beaz Odyssey: that’s probably a bit hard to pin down exactly. I’d describe most of this as a kind of independent academia. I used to be a university professor and still lean toward pursuing experimental projects. Python training courses are what pays the bills for it.

I’m not sure there is any kind of formal community per se, but if you’re looking to get involved, some of the new software projects such as Curio and SLY are where I’d look. Plus, I think there’s some interesting computer science and software engineering to be found therein.


Thanks for the answer. I may - may - have felt a thrum of “fanatical pride” reading it. (You know, that “Holy crap, the big boss just talked to me” kinda vibe? Whatever you want to call it.)
Hope that brings Christmas cheer to your heart.

Do you have any interest in the architectural elements of the problems that Curio and SLY ostensibly tackle?

You’ve made me wonder: to what degree can asynchronous functionality be lifted onto the chip itself, where multitasking is genuinely attainable? (I’m sure there’s some terrible price to be paid, but I couldn’t tell you where or why. Feels like a hardware problem best solved by hardware solutions. Then again, my ramen noodles caught on fire like an hour ago, which is apparently a thing that can happen.)
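For context on what’s being “lifted”: the async style that libraries like Curio build on is, at bottom, cooperative scheduling done in software. Here’s a minimal sketch of that mechanism using plain generators and a round-robin scheduler (the names `countdown` and `run` are made up for illustration, not any library’s API):

```python
from collections import deque

def countdown(label, n):
    # A tiny "task": each yield hands control back to the scheduler,
    # the same way an `await` suspends a coroutine.
    while n > 0:
        yield (label, n)
        n -= 1

def run(tasks):
    # Round-robin scheduler: resume each task in turn until all finish.
    ready = deque(tasks)
    trace = []
    while ready:
        task = ready.popleft()
        try:
            trace.append(next(task))
        except StopIteration:
            continue  # task finished; don't requeue it
        ready.append(task)
    return trace

print(run([countdown('a', 2), countdown('b', 2)]))
# The two tasks interleave: [('a', 2), ('b', 2), ('a', 1), ('b', 1)]
```

Nothing here runs in parallel; the “multitasking” is just a loop deciding who goes next, which is exactly the part you’re asking whether silicon could take over.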

In a similar vein, what stops us from building high-level parsers and lexers out of silicon? It would be nice to have components that supported algebras directly over instruction sets, for example. Maybe an array of these could cast various masks over the real instruction set. Quick and efficient emulation of arbitrary virtual machines being “the dream” there, I suppose.
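To make the hardware question concrete: a lexer of the kind SLY generates is essentially a finite state machine over characters, which is precisely the sort of thing that can, in principle, be baked into silicon. A minimal software sketch of that machine, expressed as one master regular expression (the `TOKEN_SPEC` table here is invented for illustration, not SLY’s actual API):

```python
import re

# Each (name, pattern) pair is one token class; the master regex is the
# union of them, i.e. one combined finite state machine.
TOKEN_SPEC = [
    ('NUMBER', r'\d+'),
    ('PLUS',   r'\+'),
    ('TIMES',  r'\*'),
    ('SKIP',   r'\s+'),   # whitespace: matched but discarded
]
MASTER = re.compile('|'.join(f'(?P<{name}>{pat})'
                             for name, pat in TOKEN_SPEC))

def tokenize(text):
    # Walk the input, emitting (token_name, lexeme) pairs.
    for m in MASTER.finditer(text):
        if m.lastgroup != 'SKIP':
            yield (m.lastgroup, m.group())

print(list(tokenize('3 + 4 * 5')))
# [('NUMBER', '3'), ('PLUS', '+'), ('NUMBER', '4'), ('TIMES', '*'), ('NUMBER', '5')]
```

Since the whole thing reduces to a state-transition table, a hardware version is at least conceivable; the catch is that the table is fixed at build time, while software regenerates it freely.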