Generating visual art every day in January
There’s a small but dedicated community of crazy artists who gather every January to generate stunning visual art, for no apparent reason other than Genuary being a clever pun on January. I joined them this year to get better at PyScript and WebGL. Here’s how it went.
First off, this sort of pace, where you’re releasing something every day, is not remotely sustainable. I admire people who keep it up for long periods of time. For me, even 31 days was too much; it’s hard to keep all the balls in the air when you juggle at such speed. In 2023 I managed to complete every single day of Jamuary, the musical equivalent. I learned a lot then as well, but I won’t be repeating that. It’s a breakneck pace.
And indeed, this Genuary wasn’t entirely by the book. Work and life in general interfered, so some prompts were done on other days, some prompts were combined, and in the end I’m left with four prompts I didn’t touch at all. Don’t rat me out to the #genuarypolice 😉. The most important bit for me is that I was working on this every day in January, and gained tons of experience by doing that.
The prompts were very well chosen. You’d think they lock you into a particular thing, but that’s not really the case; you’d be surprised how differently artists responded to the same prompt. For me in particular, they opened up really useful ways of slicing and dicing WebGL into pieces I could digest daily. I learned a ton and achieved some results I had dreamed about for years!
What I have now is:
- a template for 3D PyScript creations that allows me to switch between MicroPython and Pyodide as needed;
- support for synchronizing animation with music, including beat and frequency synchronization if you provide a prepared file;
- interaction with Monome USB controllers like button grids and endless encoders;
- Python-based post-processing `EffectComposer` pass implementations for Three.js;
- a backup script that lets me bring all my PyScript.com files locally;
- a fast uvicorn-based local server to iterate on things without Internet access;
- an interactive Python terminal for moving things around on screen; you can resize it and close it with ``CTRL-` `` (backtick).
I can go on, but I hope you can see what all of this enables. I always wanted to be able to make visualizations for my music, and now I feel like I ended up with much more than that.
My most popular creations
Based on social media popularity, these were my top five creations for the month. Note that they come with music.
If you don’t see nice `iframe` embeds below, it’s probably your ad blocker. You can disable it on this website; there are no ads here.
Creations by others I really liked
It’s impossible to list everything, so let’s go with five that made an impression on me. I could easily list another twenty; in fact I initially did, but the article became pretty unreadable.
The state of PyScript
I love this platform. It enabled me to do visual art in ways I always thought were impossible with Python. So let’s talk a bit about what it does and does not allow you to do.
No free lunch
The most important thing you need to understand is that PyScript is not a way to avoid writing JavaScript in the browser, or to never look at JavaScript again. When working with PyScript, you are thinking in the browser, so your Python code interfaces with JavaScript very often. That might change in the future when/if PyScript grows a set of first-class libraries for things like 3D and audio. As it stands, you won’t get very far without existing JavaScript libraries.
For my WebGL adventures I chose Three.js, as it’s the most mature and full-featured library for 3D graphics I know of. This meant that I was reading JavaScript example code and JavaScript API documentation every day. That wasn’t a problem at all; you can get very far without being a JavaScript expert. But you do need to understand how things work in the browser, and you will slowly become a Pyodide/MicroPython expert.
Pyodide is a leaky abstraction
Pyodide is the full-featured Python interpreter provided by PyScript. It allows you to use real CPython 3.12 in the browser, along with a subset of PyPI packages.¹ It takes a long while to start up, but then it works pretty well. It is an interpreter on top of an interpreter, so it’s naturally slower. That means that when you write your animation-frame computation or worker processing code in Python, it can often be a challenge to achieve realtime performance. But it’s doable.
But this wasn’t my biggest challenge, as it turns out. The biggest issue was that when you’re interacting with JavaScript from Python, you are calling into a foreign language. It is quite literally an FFI, like “cffi” for C or “pyo3” for Rust. Pyodide put a lot of effort into making this FFI transparent, and I’m not sure what I think about it. You see, initially it feels like magic. You’re instantiating JavaScript objects from Python and calling JavaScript functions, passing Python dictionaries, strings, and ints. Everything works; it’s amazing. But soon enough you discover that it’s not actually “everything”. Sometimes you get a cryptic error; other times the JavaScript side just silently does something unexpected and your program doesn’t behave like the example in the JavaScript docs.
I managed to solve most of the issues I found along the way, but I must warn you that PyScript is at the moment a rather uneven experience for beginners. When it works, it feels incredibly empowering. But there are plenty of occasions where your code errors out for some reason and the error message is either non-existent or misleading. If you don’t know Python well and don’t know JavaScript at least somewhat, you’ll be frustrated. PyScript does document some gotchas, if you manage to find that page and know that “FFI” is the term to look for. I found this document pretty late in my journey with PyScript; it didn’t show up in any Web searches for the errors I was seeing. I’m pretty sure I won’t be the only one with a similar experience.
Let me give you some examples. The most visible pain point with WebGL is that there are plenty of things you schedule to happen “in the background” by setting callbacks. If you just pass a Python function as such a callback, the code will execute but then crash. If you look into the JavaScript console, you’ll discover a message about a borrowed proxy being destroyed at the end of a function call. Better yet, it will tell you to call `pyodide.setDebug(true)` to learn more, but that doesn’t work in your Python files. You’re actually supposed to call `pyodide_js.setDebug(True)` from PyScript code, but that won’t really tell you much. You need to learn from the docs yourself that the Python garbage collector is unaware of JavaScript holding onto Python objects. You need to create proxy objects manually; otherwise the callback will get destroyed prematurely. The most elegant way is to wrap callbacks that are supposed to be called only once in `@create_once_callable`, and ones that will be called many times in `@create_proxy`. You can still call the wrapped functions from Python.
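A minimal sketch of that pattern, assuming Pyodide in the browser (so this is illustration rather than something you can run locally):

```python
# Pyodide-only sketch: keeping callbacks alive across the FFI boundary.
import js
from pyodide.ffi import create_once_callable, create_proxy

@create_proxy  # called on every frame: needs a long-lived proxy
def on_frame(timestamp):
    # ... update and render the scene here ...
    js.requestAnimationFrame(on_frame)

@create_once_callable  # fired exactly once: the proxy cleans itself up
def start(_event):
    js.requestAnimationFrame(on_frame)

js.window.addEventListener("load", start)
```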
Another example is passing a dictionary as an argument to a JavaScript function. Whether that works is really hit or miss. You see, when you pass a Python dictionary, it behaves like a JS Map. The problem is that in JavaScript it’s most often not Map instances but regular Objects that are used as dictionaries. Some JS code therefore deals with the data you’re passing as if it were a JS Object, and that fails when you pass a dict: the proxy conversion makes it a Map, but you want an Object. There is an incantation to convince Pyodide to make the argument an Object, but when you’re faced with a strange error, you need some experience to know what to do. Things get even more complex with nested data structures, where garbage-collection lifecycles come into play again.
Some JS APIs allow for a single object to be passed as the sole argument to a function, just so that you can use “named arguments”. For this, you can actually use keyword arguments in Pyodide and it works pretty well. However, if your JS function accepts some regular positional arguments and an object after, you need to be careful.
The proxies usually work well enough, but things get wonky when you, say, pass a list and the code on the JavaScript side checks `Array.isArray(the_arg_you_passed)`. The proxy will not be treated as an array, so JavaScript runs an unexpected path here. Same with `instanceof` checks. The way to solve this is to explicitly wrap your list in a `js.Array` on the Python side. But you don’t want to do that every time, since that’s going to be slower.
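A sketch of both explicit conversions, assuming Three.js is reachable as a `THREE` global on the JavaScript side (browser-only, illustrative):

```python
# Pyodide-only sketch of explicit FFI conversions.
import js
from pyodide.ffi import to_js

# A dict crosses the FFI as a Map by default; force a plain JS Object:
params = {"color": 0xFF0000, "wireframe": True}
material = js.THREE.MeshBasicMaterial.new(
    to_js(params, dict_converter=js.Object.fromEntries)
)

# A list crosses as a proxy that fails Array.isArray(); wrap it explicitly:
positions = js.Array.new(0.0, 1.0, 2.0)
```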
There is more to how type translation works in Pyodide, but I don’t want to spend too much time on it now. The point is that sooner or later those gotchas will bite you. Once you know what to do, it’s not a big deal, and you move on. But the first time you encounter these issues can be pretty confusing, especially since debugging across the language boundary is hard.
Debugging
Speaking of debugging! There’s the JavaScript console in Chrome DevTools; you can find Sources (look for `latest/` and then `cdn.jsdelivr.net/` to find Three.js) and set breakpoints. When a JavaScript function receives an argument from Python, you’ll get a `Proxy()` object, and you can do a little digging in Scope with the locals to see what’s going wrong. Going through the call stack is pretty tricky, since you’ll find that many frames are `pyodide.asm.wasm` frames with names like `$func3417`. But yeah, you can at least figure out whether the code you think should be called is actually being called.
Debugging pure Python is somewhat easier, because you can mark your script tag with `worker terminal`, in which case you can put a `breakpoint()` in your codebase and you’ll get a pseudoterminal pdb that you can play with. It’s somewhat limited (it won’t show you the line you’re on), but you can at least look around and execute Python code at the breakpoint. With `worker terminal`, print output also lands in the terminal, which is easier to see than the JavaScript console. On the other hand, not every error is displayed in the terminal; some still go to the JavaScript console. So now you have two places to look for issues.
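For reference, this is the script tag configuration in question; the `src` path here is a placeholder:

```html
<script type="py" src="./main.py" worker terminal></script>
```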
To be clear, most of debugging is you doing something wrong and trying to figure out what that was. The confusion arises due to the FFI nature of talking to JavaScript with its tricky type translation behavior and object garbage collection life cycles, as well as the necessity to look at two separate debuggers.
Workers
Using workers is pretty cool, because your main thread can keep serving a responsive UI and your browser won’t warn you that your “web page froze”. However, moving all your code to a worker is probably not what you want, because your performance will suffer further. First off, any calls that draw anything on screen now go through a magical transparent message queue to be actually executed on the main thread anyway. If you execute too many of those calls too quickly, things slow down on both ends. The worker can communicate with the main thread through `sync` functions; however, those cause the arguments to cross the JavaScript boundary again, so you’re back in proxy and type-translation land.
What about MicroPython?
I’ve been encouraged by the PyScript team to try MicroPython. Historically, I couldn’t get it to work well enough for my needs, so I avoided it for the first half of the month. At some point, however, the Pyodide issues I highlighted above made me want to try again. It took me an entire day to get my project template code running, and I can’t really tell whether all the problems I faced were on the MicroPython/PyScript side or were outright Chrome issues. So I’ll avoid venting here.
All you need to know is that MicroPython is faster than Pyodide when interacting with JavaScript, and its FFI defaults are arguably better: they allow more things to work actually transparently. The price is that it’s not “real Python”. I mean, it’s Python-like enough, but I was constantly tripping over stuff like: there’s no `breakpoint()` or `pdb` (but you can use `code.interact()`); there’s no `datetime.now()`; there’s no `dataclasses`; there’s no `typing`; there’s no `pprint`; and so on… MicroPython reports its `sys.version` as `3.4.0`, yet there’s support for f-strings, assignment expressions, and `async`/`await`.
But you’ll notice things like `asyncio` missing `Future` objects, or that `deque` requires you to specify arguments on creation that are optional in CPython. And maybe most importantly, errors come with truly confusing messages. We got spoiled by recent CPython versions being really helpful in that regard; MicroPython is minimalistic. When interacting with JavaScript, things get even more interesting, because the error is usually about a missing JavaScript method or attribute on a Python type, but there’s no usable stack trace, since the call originates from the innards of MicroPython’s wasm code. You don’t even know which object the error is about. So iterative development works best, so that you at least have a suspicion of where the wrong call might live. Translating a larger project from CPython or Pyodide to MicroPython sounds like a challenge.
I said that there’s no debugger in MicroPython, and that’s true. But at least `code.interact()` works without running in a worker. This is much appreciated, since instantiating JavaScript objects from the main thread is much faster, which is something I do quite a lot when interacting with Three.js.
So which is better? I really can’t say. It seems to me like they both suck in different ways, but they also have their unique strengths. In theory I like MicroPython better, because it downloads and starts up much faster, its runtime performance is usually better, and I can type things into a Python terminal without switching to a worker. Then again, there were cases where Pyodide was much faster, like when data structures grow in size or there’s a ton of math operations. And frankly, I’m so used to CPython that I’m constantly finding smaller and larger differences in how things work. The knowledge transfer isn’t that smooth, and there’s no micropip, which is a big difference: I used numpy a few times. But when I don’t need that stuff, MicroPython is my first choice these days. I used a dataclasses port that works on MicroPython with some caveats, and I also wrote a tiny compatibility layer that lets me switch my scripts between Pyodide and MicroPython to see which one works better for a particular case.
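A compatibility layer like that can be sketched in a handful of lines; the helper names here are illustrative, not my actual module:

```python
# A tiny Pyodide/MicroPython compatibility shim (illustrative sketch).
import sys

MICROPYTHON = sys.implementation.name == "micropython"

try:
    from dataclasses import dataclass, field
except ImportError:
    # MicroPython: no stdlib dataclasses; degrade to a no-op decorator
    # (or plug in a MicroPython dataclasses port here).
    def dataclass(cls):
        return cls

    def field(default=None, **_kwargs):
        return default

try:
    from datetime import datetime

    def now_ms() -> int:
        """Milliseconds since the epoch on CPython/Pyodide."""
        return int(datetime.now().timestamp() * 1000)
except ImportError:
    import time

    def now_ms() -> int:
        """MicroPython: ticks_ms() is monotonic, not wall-clock time."""
        return time.ticks_ms()
```

Code then imports `dataclass` and `now_ms` from the shim instead of the stdlib, and the same file runs under both interpreters.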
What’s the point then?
So there are issues with both Pyodide and MicroPython, and you have no choice but to slowly become pretty good at JavaScript and the browser APIs anyway. So you gotta ask: wouldn’t it be better to just use JavaScript for this?
My answer is that you should do what you want to do. In my case, I wanted to leverage my experience with Python and I was able to do that tremendously. I literally reused my numpy-based Conway’s Game of Life code I wrote a few years back, pasted it in, and could generate 3D cubes that animated the state changes without having to re-engineer that piece of code. In many other cases the ability to just import standard libraries I already knew was very useful.
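That Game of Life core is the textbook vectorized approach; a sketch of the kind of numpy code that ports over unchanged (not my exact code):

```python
import numpy as np

def life_step(grid: np.ndarray) -> np.ndarray:
    """One Conway's Game of Life generation on a toroidal (wrapping) grid.

    `grid` is a 2D boolean array; returns the next generation."""
    # Sum the 8 neighbors by rolling the grid in every direction.
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1)
        for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # Birth: dead cell with exactly 3 neighbors.
    # Survival: live cell with 2 or 3 neighbors.
    return (neighbors == 3) | (grid & (neighbors == 2))
```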
And PyScript packages it all very nicely. I was able to spend 31 days focused on 3D graphics, with only occasional detours due to weird JS vs Python behavior. But you know what? At the end of it I can link you to Python applications you can run in your browser without installing anything. And they’re running Python. And that’s amazing.
Things I learned with every project
This is more of a log for myself, so feel free to skip this. There’s no narrative here, just a list of things I didn’t know before completing that particular project.
The headers are clickable so you can see the result in video form.
#1: Vertical or horizontal lines only.
I learned about using workers to offload slow computation while retaining the ability to show progress on the main thread. I also learned how to properly place an orthographic camera for rendering “flat” things “from the top”.
#2: Layers upon layers upon layers.
I learned how 3D object transparency works. For instance: by default, objects are not rendered double-sided, so if you enable transparency, the result looks weird because the back side isn’t visible through the front. You need to enable double-sided rendering separately. I also learned how to place an orthographic camera for rendering “isometric” things from the side.
#3: Exactly 42 lines of code.
I learned to stop worrying and love abusing assignment expressions for extra lines of code saved. I learned that drawing lines in 3D is hard and how antialiasing works and does not work in WebGL. I learned about the post-processing effect composer in Three.js and its cool Unreal Bloom filter.
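In that spirit, a tiny made-up example of the kind of line-saving the prompt encourages:

```python
def first_long_word(words, n=5):
    """Return the first (word, length) pair with length > n, or None.

    The walrus operator binds `length` inside the filter condition,
    saving the separate line a plain `length = len(w)` would need."""
    return next(((w, length) for w in words if (length := len(w)) > n), None)
```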
#4: Black on black.
This was a lot in one day. I learned how to convert an SVG into a 3D model with Blender and export it so it can be loaded in WebGL. I learned about using textures for bump mapping. I learned that spotlights in Three.js are part of the shadow casting/receiving functionality and how to place them to actually see an effect. I learned how to use visual aids (simply called “helpers” in Three.js) to see why you’re doing something wrong.
#5: Isometric Art (No vanishing points).
I learned that consistent animation speed when your animation framerate is variable is somewhat tricky to accomplish. I learned that transparency + shadows looks rather weird in WebGL. I learned how to use easing functions to make gradual changes look more natural.
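Both lessons can be sketched together: ease over wall-clock time rather than frame counts, so a variable framerate doesn’t change the animation speed (my actual helpers differ; this is the general shape):

```python
def ease_in_out_cubic(t: float) -> float:
    """Classic cubic easing curve: slow start, fast middle, slow end.

    `t` is normalized progress in [0, 1]."""
    if t < 0.5:
        return 4 * t * t * t
    return 1 - ((-2 * t + 2) ** 3) / 2

def animate(start: float, end: float, elapsed: float, duration: float) -> float:
    """Interpolate a property from elapsed wall-clock seconds, not frames."""
    t = min(elapsed / duration, 1.0)
    return start + (end - start) * ease_in_out_cubic(t)
```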
#6: Make a landscape using only primitive shapes.
I learned that I was using an outdated version of WebGL all along. I learned how to use Line2 (a fake line built from triangles that allows you to specify thickness) and how to build Line2 wireframes automatically from existing meshes. I had a decent review of high school geometry. I learned how Three.js efficiently stores attributes on the graphics card’s memory and how that makes them a bit annoying to mutate from Python. I learned how to animate things more smoothly in the presence of variable framerates.
#7: Use software that is not intended to create art or images.
This was one of the weirder days. The program I wanted to write generates WAV files from images, such that when you load the WAV file in an audio editor, you see the image again on a spectrogram. The natural thing to do would have been to write a command-line application, but I was stubborn and did it in the browser, so you “upload” an image and, after it computes, you “download” the WAV file. I learned a lot about main-thread vs. worker argument passing (I failed at passing pickles, I failed at passing tuples of objects, etc.). I learned how JavaScript uses typed arrays and memory-efficient buffers for file operations, and how those are not directly accessible from WebAssembly, so you have to copy the data to read it from Python (but not necessarily to send it back to JavaScript). I learned some proper numpy vector operations; I’m sure I could have used even fewer for loops, but I’m pretty happy with the state of the code right now. I learned about data URLs for client-only “downloads”. And I learned about `import pyodide_js; pyodide_js.setDebug(True)`.
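The core idea fits in a few lines of numpy; this is a naive additive-synthesis sketch of the approach, not the code I shipped:

```python
import numpy as np

def image_to_audio(image: np.ndarray, sample_rate=44100, col_duration=0.05,
                   f_lo=200.0, f_hi=8000.0) -> np.ndarray:
    """Turn each image column into a slice of audio whose spectrum matches
    the column's pixel intensities, so a spectrogram shows the picture.

    `image` is a 2D float array in [0, 1]; row 0 (top of the picture) maps
    to the highest frequency so the spectrogram isn't upside down."""
    n_rows, n_cols = image.shape
    freqs = np.linspace(f_hi, f_lo, n_rows)          # top row -> high pitch
    t = np.arange(int(sample_rate * col_duration)) / sample_rate
    # One sine per row, shape (rows, samples_per_column).
    sines = np.sin(2 * np.pi * freqs[:, None] * t[None, :])
    # Scale each sine by its pixel and sum per column, then chain columns.
    audio = np.concatenate([(col[:, None] * sines).sum(axis=0)
                            for col in image.T])
    peak = np.abs(audio).max()
    return audio / peak if peak > 0 else audio
```

Writing the result out as a 16-bit WAV is then just scaling to int16 and handing the bytes to the browser as a data URL.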
#8: + #9: Draw one million of something. + The textile design patterns of public transport seating.
I learned more about the effect-composer post-processing pipeline. I tried to use the TexturePass, but it was invisible to the HalftonePass, so I learned that scenes can have arbitrary backgrounds as well. I learned how to make Pyodide pass a dictionary as a JavaScript object to a JavaScript function: it’s `to_js(your_dict, dict_converter=js.Object.fromEntries)`. I learned that an M1 Max still delivers 5 FPS with a million spheres in the scene, and that this case is CPU-bound, at least for my naive code. I learned that the halftone effect looks horrible on thumbnails, so you need to exaggerate it in the full-screen image to compensate.
#10: You can only use TAU in your code, no other number allowed.
I learned how to draw lines with Line2. Those are pretty great because they can have non-contiguous line geometries, arbitrary width, vertex-specific colors, and can be dashed.
#11: + #12: Impossible day. + Subdivision.
This was a project that I wanted to do for years. It’s what made me interested in 3D in the first place. While the result is quite humble, what it means is huge since it opens up amazing adjacent possibles.
The prompt for Saturday was “try to do something that feels impossible for you to do”. The one for Sunday was “subdivision”. I always wanted to do frequency-aware 3D visualizations based on realtime audio. The frequency domain is a form of subdivision of the sound into frequency content, so I decided to combine the two prompts and spend two days to finally achieve this goal. I did.
I learned many unrelated things. I learned about RectAreaLight, got pretty excited, then learned it doesn’t support shadows, then discovered that they might be simulated with percentage-closer soft shadows, but that was super out of scope for this day. I learned that Apple’s Compressor is better at encoding AAC than Ocenaudio: not only does it take the user’s chosen bitrate more faithfully, it also performs the lowpass filtering at 20.4 kHz vs. Ocenaudio’s 19 kHz. I learned that 16-channel BlackHole is recognized by QuickTime Player as a surround audio interface and therefore records “multi-channel audio” that is later mixed down to monophonic by the audio codec in DaVinci Resolve; you need to use BlackHole 2ch to avoid the issue. I also learned that it can now be installed with Homebrew, making it much easier to keep up to date. And unrelatedly, I figured out how to kill the stats mini-window and the PyScript ad at the bottom of the HTML page with one keystroke, so I have a clear view of the animation during recording.
#13: Triangles and nothing else.
I learned that if you don’t run at 120FPS then realtime frequency detection might not be perfect for high-precision clocking. Haha, obvious in hindsight, I know. I learned how to work around it with wide clock signals and cooldown at detection time.
#14: Pure black and white. No gray.
I learned how to make a tiny JavaScript wrapper over an existing JavaScript class from Three.js to avoid destructive type translations.
#15: Design a rug.
I learned that design in code is super painful. Visual editors are a must if anything non-trivial is to be assembled. The rug I designed looks mostly okay thanks to a LUT that changes the colors into a pleasant combination of blue and gold. Otherwise it would be pretty embarrassing how much time it took to create. Oh yeah, I learned to apply LUTs and FXAA to Three.js scenes. This was also the first Genuary entry where MicroPython worked fine for me. I also learned how to use planes to clip meshes.
#16: Generative palette.
This was the day I switched my project template to MicroPython, which included a user-visible collapsible terminal at the top of the window. This simplifies communicating with users, and it simplifies coding new projects too, as there is an interactive interpreter where I can change values of things on screen and see the difference right away. I learned that the MicroPython terminal works without a worker, which is great news, since running in the main thread is much more efficient in terms of creating JavaScript objects.
When I finally got to the generative palette, I used a few LUTs and spotlights over a rotating box with a kaleidoscopic shader. This synchronized with music looks quite nice.
#17: + #19: What happens if pi=4? + Op Art.
At this point I fell behind the daily releases. It’s a big commitment. In the end this was released only on Jan 21.
I learned that simple triggering techniques based on modulo division end up firing multiple times for the same intended “event” if the callbacks are fast enough. I wasted so much time on this; the effect was that the rotation of the pie pieces wasn’t 90 degrees but more.
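The fix can be sketched as a tiny cooldown wrapper (illustrative, not the exact project code):

```python
class CooldownTrigger:
    """Fire at most once per `cooldown` time units, even if the triggering
    condition stays true across many fast consecutive callbacks."""

    def __init__(self, cooldown: float):
        self.cooldown = cooldown
        self.last_fired = None

    def check(self, now: float, condition: bool) -> bool:
        if not condition:
            return False
        if self.last_fired is not None and now - self.last_fired < self.cooldown:
            return False  # still cooling down: swallow the duplicate firing
        self.last_fired = now
        return True

# With fast callbacks, `now % 1.0 < 0.1` holds on several consecutive
# frames around every beat; the cooldown collapses them into one event.
trigger = CooldownTrigger(cooldown=0.5)
events = [t for t in (0.00, 0.02, 0.04, 1.01, 1.03, 2.00)
          if trigger.check(t, t % 1.0 < 0.1)]
```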
#18: + #22: What does wind look like? + Gradients only.
This taught me about Perlin noise. My template for beat synchronization got progressively better with now enough time to tweak fun details. My favorite moment was when I needed to switch the complete project from MicroPython to Pyodide due to computational performance and it just worked with no changes necessary.
#21: + #25: Create a collision detection system (no libraries allowed). + One line that may or may not intersect itself.
I learned that `Float32Array` containers are not resizable unless backed by a resizable `ArrayBuffer`, in which case they are incompatible with `BufferAttribute`s (backed by `WebGL2RenderingContext.bufferData()`). In plain English, WebGL wants you to recreate segmented objects from scratch on each geometry change. Seems wasteful, but it is what it is.
#26: + #27: Symmetry + Make something interesting with no randomness or noise or trig.
I learned that browser APIs don’t allow opening datagram endpoints (i.e., communicating over UDP). Therefore, I learned to use WebSockets to bridge between the UDP endpoint and the browser.
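A bridge in that spirit can be sketched with asyncio plus the third-party `websockets` package; the port numbers and names here are illustrative, not the exact project code:

```python
# UDP datagrams in, WebSocket messages out (sketch).
import asyncio
import websockets

CLIENTS = set()

class UDPRelay(asyncio.DatagramProtocol):
    """Receives local datagrams and fans them out to all browser clients."""
    def datagram_received(self, data, addr):
        for ws in CLIENTS:
            asyncio.ensure_future(ws.send(data))

async def handler(ws):
    CLIENTS.add(ws)
    try:
        await ws.wait_closed()
    finally:
        CLIENTS.discard(ws)

async def main():
    loop = asyncio.get_running_loop()
    await loop.create_datagram_endpoint(
        UDPRelay, local_addr=("127.0.0.1", 9000))
    async with websockets.serve(handler, "127.0.0.1", 8765):
        await asyncio.Future()  # serve until cancelled

# asyncio.run(main())  # the browser then connects to ws://127.0.0.1:8765
```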
#28: + #30: Infinite Scroll. + Abstract map.
I learned how to use the marching squares algorithm to draw contours. I learned that without controls set up, Three.js by default scales content horizontally automatically. I wasted quite a bit of time trying to do it manually, which kept overshooting, and I couldn’t figure out why. I also learned how to crossfade and loop scrolling in convincing ways.
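The heart of marching squares fits in a few lines; this sketch only classifies each 2x2 cell into one of the 16 cases (the case index then picks which contour segments cross the cell) and skips the interpolation of actual contour coordinates:

```python
def marching_squares_cells(field, threshold):
    """Classify each 2x2 cell of a scalar field into a marching-squares
    case index (0-15), one bit per corner."""
    cases = []
    for y in range(len(field) - 1):
        row = []
        for x in range(len(field[0]) - 1):
            # Corner bits, clockwise from top-left.
            idx = (
                (field[y][x] >= threshold) << 3
                | (field[y][x + 1] >= threshold) << 2
                | (field[y + 1][x + 1] >= threshold) << 1
                | (field[y + 1][x] >= threshold)
            )
            row.append(idx)
        cases.append(row)
    return cases
```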
#31: Pixel sorting.
I learned how to make Python-based post-processing passes for Three.js and how to package shaders so that there’s no JavaScript involved. I learned how to pass information between animation frames in a way that doesn’t affect other passes in the effect composer. I also learned how to translate ShaderToy shaders into Three.js shaders. Finally, I learned how LoadingManager helps with taming asynchronous asset loading in Three.js and how to integrate it with asyncio.
Prompts I didn’t get to
- #20: Generative Architecture. For this one I hoped to do some isometric SimCity 2000/MDMA skyline.
- #23: Inspired by brutalism. For this one I was specifically waiting for the movie “The Brutalist” to come out. Now to actually go and see it.
- #24: Geometric art (use only one geometric shape). In 2023 I implemented an algorithm that approximates an input image using colored triangles. It’s neither very efficient nor gives results as good as Michael Fogleman’s primitive. I hoped to revisit this and move it to PyScript.
- #29: Grid-based graphic design. Here I wanted to package 3 Three.js scenes in encapsulated classes, to display all of them at the same time with clipping.
Things I haven’t explored (yet!)
During Genuary 2025 I explicitly focused on using Three.js with PyScript. Even within that project space, I didn’t touch:
- WebGPU. This is the new kid on the block allowing general computation and more flexible rendering, even within Three.js. I stuck to WebGL for the month.
- Shaders. Save for the last project, I haven’t touched GLSL at all, as it’s an entirely different language with different constraints. Three.js is also popularizing its own renderer-independent shading language (unsurprisingly called TSL), which would allow me to code shaders in Python. Looks promising but it’s also a big problem space to learn. It’s almost like I could do Genuary 2026 specifically with shaders only.
- UI. Since WebGL outputs to a canvas element, it’s entirely possible to make UIs in HTML and ignore UI elements inside the Three.js scene. Then again, I’m pretty sure UIs that can be texturized, shaded and undergo EffectComposer passes would look more seamless. In any case, save for a short detour to allow the Python terminal to be hidden with split.js, I haven’t done anything in that regard.
- Building “WASM extensions” for Pyodide. I noticed Pyodide ships with cffi and I wonder if that means there’s a way to compile a Cython extension to WASM and run it in the browser. That would be fantastic for compute-heavy Genuary prompts.
- Synthesis with Web Audio. It’s a thing I’m genuinely interested in but it didn’t really fit the theme of Genuary.
¹ In theory other pure Python libraries from PyPI should work, too, but my experience there was rather miss than hit.