Psychedelic Graphics 0: Introduction
72 comments
January 23, 2025 · dtristram
DonHopkins
Who were the other people in Raster Masters, and what crazy stories from Grateful Dead concerts can you tell? ;)
Every time I've plugged a modern projector into a laptop at a presentation it's so stressful, like rolling the dice whether the screen will ever come up. What kind of projector, calibration, and preparation did it take to project live hi-res SGI video onto the screen above the band?
actionfromafar
I would guess either https://en.wikipedia.org/wiki/Eidophor or https://en.wikipedia.org/wiki/Talaria_projector https://www.ebay.com/itm/225229726689
https://web.archive.org/web/20250124121654/https://www.ebay....
DonHopkins
Wow glad I asked, thanks for the links -- TIL about wobbulators! Definitely on my xmas shopping list for my retro electronics freak friends.
https://en.wikipedia.org/wiki/Talaria_projector
>RGB color separation and processing is obtained using vertical wobbulation of the electron beam on the oil film to modulate the green channel and sawtooth modulation is added to the horizontal sweep to separate and modulate Red and Blue channels. The optical system used in the Talaria line is a Schlieren optic like an Eidophor, but the color extraction is much more complex.
https://en.wikipedia.org/wiki/Wobbulator
>A wobbulator is an electronic device primarily used for the alignment of receiver or transmitter intermediate frequency strips. It is usually used in conjunction with an oscilloscope, to enable a visual representation of a receiver's passband to be seen, hence simplifying alignment; it was used to tune early consumer AM radios. The term "wobbulator" is a portmanteau of wobble and oscillator. A "wobbulator" (without capitalization) is a generic term for the swept-output RF oscillator described above, a frequency-modulated oscillator, also called a "sweep generator" by most professional electronics engineers and technicians.[1] A wobbulator was used in some old microwave signal generators to create what amounted to frequency modulation. It physically altered the size of the klystron cavity, therefore changing the frequency.
Samwell & Hutton Ruggedized CT501 Wobbulator (1968) NSN: 6625-99-620-2403
https://www.ebay.com/itm/267012403603?_skw=wobbulator&itmmet...
relaxing
Do you have links to this work we could see?
DonHopkins
ElectroPaint and other Raster Master performances were featured in the Infrared Roses video:
https://en.wikipedia.org/wiki/Infrared_Roses
Infrared Roses is a live compilation album by the Grateful Dead. It is a conglomeration of their famous improvisational segments "Drums" and "Space". The ElectroPaint stuff begins around 11:00, but the Raster Masters did all kinds of different stuff in parallel and mixed it all together in real time. I remember them describing some "recursive texture map" feedback too, which only ran on high-end SGI workstations.
https://www.youtube.com/watch?v=gkhr23asO-M
Wired: Raster Masters: Enough with virtual reality -- virtual hallucinations?
https://www.wired.com/1994/06/raster-masters/
Electropaint on SGI Indy: A capture of the great screensaver electropaint on an SGI Indy. There is no sound (originally I had Mahavishnu Orchestra's "Miles Beyond" but youtube flagged me for it), but feel free to blast your own music while watching:
https://www.youtube.com/watch?v=StA81MNuqB8
6 minutes of ElectroPaint:
https://www.youtube.com/watch?v=ObdtoLuSaWM
SGI IRIX ElectroPaint Screen Saver:
https://www.youtube.com/watch?v=gbWpsrNYfaQ
Panel Library and ElectroPaint source code:
http://66.111.2.18/pub/The_Unix_Archive/Unix_Usenet/comp.sys...
Some of David's more recent stuff:
https://www.facebook.com/groups/106720642819222/posts/222074...
>In the spirit of J-Walt's intro message, I'm David Tristram, somewhat of a pioneer in the use of real-time graphics for live performance. Author of Electropaint and Electroslate live performance instruments, and founding member of Raster Masters. Toured with Grateful Dead, developed performance system for Graham Nash and Herbie Hancock.
>I'm just playing with things these days, most recently making music with a small modular system and experimenting with very simple looping visuals in an investigation into the perception of visual rhythms. Here is my most recent test.
louky
I think I saw some of your work at a few Dead shows in ~1992
satyarthms
If anyone wants to play around with psychedelic graphics without going too low-level, [hydra](https://hydra.ojack.xyz/) is a cool javascript based livecoding environment with a gentle learning curve.
jerjerjer
Is there anything which supports music input? I liked Winamp era visualizers, but the art seems to be dead today.
joenot443
I've been working on a free open-source macOS app for just that - https://nottawa.app - hoping to release in the next couple of months!
The UI has been greatly improved since I recorded the original demo on the site; the real thing is MUCH better now. Same base idea - chain together shaders, videos, or webcams, and then drive their parameters via an audio signal, BPM, oscillator, MIDI board, or manual sliders.
The beta link on the site isn't really worth trying yet - if you're interested in getting on the TestFlight just shoot me a message at joe@nottawa.app. Would love some HN feedback :)
jerjerjer
> I've been working on a free open-source
May I ask where the sources are? Looks great - any plans for a Windows or Linux (Docker) version?
satyarthms
Hydra actually works well with music input! It grabs audio from the mic and `a.show()` will show you the frequency bins. Then any numerical parameter can be modulated by the intensity of a bin, for example:
`noise().thresh(()=>a.fft[0]*2).out()`
jerjerjer
> It grabs audio from the mic
Is it possible to grab from the default audio output device instead of the mic? Probably not, as it's browser-based. I suppose the mic can be faked at the OS level somehow.
progmetaldev
I used to spend so much time messing around with MilkDrop in Winamp. You could grab existing visualizations and see what they were doing, and make your own edits. Thanks for the nostalgia hit!
jerjerjer
MilkDrop 3 latest release was in 2023!
https://github.com/milkdrop2077/MilkDrop3/releases/tag/MilkD...
Thank you, that's exactly what I'm looking for.
jcelerier
You can do that easily with https://ossia.io :)
leptons
There are a lot of examples of using javascript for "psychedelic graphics" on dwitter.net
dtristram
Regarding the OP doc and UV coordinates: a major area of investigation for us back in the day was finding interesting ways to displace the UV texture coordinates for each corner of the rectangular mesh. We used per-vertex colors; these days one would use a fragment (pixel) shader like those on ShaderToy.
A very interesting process displaces the texture coordinates by advecting them along a flow field. Use any 2D vector field and apply the displacement to each coordinate iteratively. Even inaccurate explicit methods give good results.
After the coordinates have been distorted far enough, the image becomes unrecognizable. A simple hack is to apply a "restore" force to the coordinates so they spring back to their original positions, like flattening a piece of mirror foil.
Just now I am using feedback along with these displacement effects. Very small displacements applied iteratively result in motion that looks quite a bit like fluid flow.
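A minimal ShaderToy-style sketch of that advect-and-restore idea (the swirling flow field, step sizes, and spring constant below are illustrative assumptions, and the frame-to-frame iteration is collapsed into a per-pixel loop; it is not the original Raster Masters code):
// Illustrative 2D flow field -- any vector field works; this one is just two crossed sine waves.
vec2 flow(vec2 p, float t) {
    return vec2(sin(p.y * 4.0 + t), cos(p.x * 4.0 - t));
}

void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = fragCoord / iResolution.xy;
    vec2 home = uv;                              // undistorted "home" coordinate

    // Advect the texture coordinate along the flow field with crude explicit Euler steps,
    // while a small "restore" force springs it back toward home so the image stays recognizable.
    for (int i = 0; i < 20; i++) {
        uv += 0.004 * flow(uv, iTime);
        uv += 0.02 * (home - uv);
    }

    fragColor = texture(iChannel0, uv);          // sample the source image at the displaced coordinate
}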
DonHopkins
That was how Jeremy Huxtable's PostScript "melt" worked (he invented the original NeWS "Big Brother" Eyes that inspired XEyes): choose a random rectangle, blit it with a random offset, lather, rinse, repeat. It shows how, by repeating a very digital square, sharp, angular effect with a little randomness (dithering), you get a nice smooth organic effect. This worked fine in black and white too, of course -- it's just PostScript:
https://www.donhopkins.com/home/archive/news-tape/fun/melt/m...
%!
%
% Date: Tue, 26 Jul 88 21:25:03 EDT
% To: NeWS-makers@brillig.umd.edu
% Subject: NeWS meltdown
% From: eagle!icdoc!Ist!jh@ucbvax.Berkeley.EDU (Jeremy Huxtable)
%
% I thought it was time one of these appeared as well....
% NeWS screen meltdown
%
% Jeremy Huxtable
%
% Mon Jul 25 17:36:06 BST 1988
% The procedure "melt" implements the ever-popular screen meltdown feature.
/melt {
3 dict begin
/c framebuffer newcanvas def
framebuffer setcanvas clippath c reshapecanvas
clippath pathbbox /height exch def /width exch def pop pop
c /Transparent true put
c /Mapped true put
c setcanvas
1 1 1000 {
pop
random 800 mul
random 600 mul
random width 3 index sub mul
random height 2 index sub mul
4 2 roll
rectpath
0
random -5 mul
copyarea
pause
} for
framebuffer setcanvas
c /Mapped false put
/c null def
end
} def
melt
Here's Jeremy's original "Big Brother" eye.ps, which was the quintessential demo of round NeWS eyeball windows: https://www.donhopkins.com/home/archive/news-tape/fun/eye/ey...
interroboink
Are these animated, or somesuch?
I tried naïvely using `ps2pdf` (Ghostscript), but got errors on both of them. I guess they're meant to be consumed by some other sort of system?
DonHopkins
Oh sorry, I didn't explain: they're interactive PostScript scripts for the NeWS window system, so they don't actually print -- they animate on the screen! The "pause" yields the lightweight PostScript thread and lets the rest of the window system's tasks run. NeWS had an object-oriented programming system that was used to implement the user interface toolkit, window management, interactive front ends, and even entire applications written in object-oriented PostScript. NeWS is long obsolete, but you can run it in a Sun emulator!
https://en.wikipedia.org/wiki/NeWS
For example, here's a heavily commented demo application called PizzaTool:
https://donhopkins.medium.com/the-story-of-sun-microsystems-...
Source code:
https://www.donhopkins.com/home/archive/NeWS/pizzatool.txt
It uses an iterated feedback pixel-warping technique kind of like melt.ps, but instead of melting the screen by blitting random rectangles vertically, it spins the pizza, which melts the cheese and toppings -- note how the rotation is randomized to "dither" it and smooth out the artifacts you'd get by always rotating by exactly the same amount:
% Spin the pizza around a bit.
%
/Spin { % - => -
gsave
/size self send % w h
2 div exch 2 div exch % w/2 h/2
2 copy translate
SpinAngle random add rotate
neg exch neg exch translate %
self imagecanvas
grestore
} def
It animates rotating a bitmap around its center again and again, as fast as you "spin" it with the mouse, plus a little jitter, so the jaggies of the rotation (not anti-aliased, 8-bit pixels, nearest-neighbor sampling) give it a "cooked" effect! It measures the size of the pizza canvas, translates to the center, rotates around the middle, translates back to the corner of the image, then blits it with rotation and clipping into the round pizza window.
DonHopkins
Aaaah, remember the simple, directly manipulative pleasures of Kai's Power Goo:
LGR: Kai's Power Goo – Classic 90s Funware for PC!
AndrewStephens
I love how easy it is to write shaders that operate on images in HTML. My skills in this area are mediocre but I love seeing how far people can take it. Even providing a simple approximation of a depth map can really make the results interesting.
Some years ago I did a similar project to smoothly crossfade (with "interesting effects") between images using some of the same techniques. My writeup (and a demo):
https://sheep.horse/2017/9/crossfading_photos_with_webgl_-_b...
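To give a flavor of the depth-map idea, here is a minimal ShaderToy-style sketch (the channel assignments, easing width, and fade timing are illustrative assumptions, not the code from the writeup): a rough depth value staggers when each pixel crosses over, so the fade washes through the scene instead of happening everywhere at once.
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = fragCoord / iResolution.xy;

    vec4 imgA   = texture(iChannel0, uv);        // first photo
    vec4 imgB   = texture(iChannel1, uv);        // second photo
    float depth = texture(iChannel2, uv).r;      // rough depth approximation, 0 = near, 1 = far

    float t = 0.5 + 0.5 * sin(iTime * 0.5);      // fade position, sweeping 0..1 over time

    // Each pixel flips when the fade position passes its depth, softened by smoothstep,
    // so the crossfade appears to sweep from near to far instead of fading uniformly.
    float m = smoothstep(depth - 0.15, depth + 0.15, t);

    fragColor = mix(imgA, imgB, m);
}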
coffeecantcode
I'll be honest, I'm far more interested in the rolling hills article that accompanies this one.
Specifically, about halfway through the process, applying:
uv.x = uv.x + sin(time + uv.x * 30.0) * 0.02;
uv.y = uv.y + sin(time + uv.y * 30.0) * 0.02;
to the static image. Having had a range of psychedelic experiences in my life, this appears to be the closest visual match to the real thing, at least at low, non-heroic doses. Maybe slow the waves down and lessen the range of motion a bit.
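For reference, a ShaderToy-style version of that snippet with the waves slowed down and the range of motion reduced (the exact speed and amplitude factors are just guesses):
void mainImage(out vec4 fragColor, in vec2 fragCoord) {
    vec2 uv = fragCoord / iResolution.xy;
    float t = iTime * 0.3;                       // slower waves

    // Same sine displacement as the article's snippet, with a smaller amplitude.
    uv.x += sin(t + uv.x * 30.0) * 0.008;
    uv.y += sin(t + uv.y * 30.0) * 0.008;

    fragColor = texture(iChannel0, uv);          // the static image being gently "breathed"
}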
Note: I am far more interested in replicating the visual hallucinations induced by psychedelic compounds than by making cool visuals for concerts and shows, utmost respect for both sets of artists though.
There is an artist (and I'm sure many more) who does a fantastic job with psychedelic visuals using fully modern editing stacks; unfortunately their account name entirely escapes me. I'll comment below if I find it.
The comparison that I would make with this portion of the Rolling Hills article would be the mushroom tea scene from Midsommar, specifically with the tree bark. The effect of objects “breathing” and flowing is such a unique visual and I love to see artists accomplishing it in different ways.
progmetaldev
It's probably not who you were talking about, but this account on YouTube does a good job of representing the visual experience, while also talking about other effects. The videos looking at nature, the way the visuals start to form geometric patterns, and that "breathing" effect are powerful. The author covers various substances, and how the effects can range from minor (slight "breathing" or pulsing of surfaces) to full geometric "worlds" (such as from DMT - although I've never dipped into that substance).
coffeecantcode
That is not who I had in mind, but after looking through their account I'm going to binge their videos; very cool stuff. I've always found studying the minute differences between these substances to be a genuinely interesting topic. It's covered a lot in Mike Jay's Psychonauts.
cancerhacker
Early 90s, Todd Rundgren released a Mac app called Flowfazer [1] - it didn't simulate your experience, but was helpful as a distraction to move you along. Some people used it to provide guidance for their own creations.[2]
[1] https://grokware.com/ [2] https://m.youtube.com/watch?v=3Z4X4FmIhIw
It was a time of screensavers and palette animation.
brotchie
If this is your kind of thing and you ever get a chance to see the musical artist Tipper alongside Fractaled Visions driving the visuals, you’re in for a treat.
Most spot on visual depictions of psychedelic artifacts I’ve witnessed.
Saw them together last year and it's the no. 1 artistic experience of my life. The richness and complexity of Fractaled Visions' visuals are almost unbelievable.
Even knowing a lot about shader programming, for some of the effects I was like "wtf, how did he do that?"
Here's the set; it doesn't fully capture the experience, but it gives a feel. Seeing this in 4K at 60fps was next level.
yieldcrv
ooo I was there
trollied
This needs a link to shadertoy https://www.shadertoy.com
Falimonda
Wow! Thanks for sharing that!
cess11
Reminds me of an old Flash classic in this area, Flashback.swf. Here's a video render of it: https://m.youtube.com/watch?v=KaSqrx93rS0
progmetaldev
This video (back in the Flash days) is how I discovered the electronic group Shpongle. Their remix of Divine Moments of Truth is used in this animation; I believe it's the "Russian Bootleg" version. I had been into electronic music before this, but this genre really blew my mind when I heard it.
tylertyler
I've been writing WebGL shaders at work this week, noodling with the details to make things look like physical camera effects. Occasionally I'll get something wrong and see results that look similar to the stuff in this article, and I have to say it's just so much more fun than the standard image effects.
Sure, there might be limited use cases for it visually, but playing with the models we've built up around how computer graphics work is a great way to learn about each of these systems. Not just graphics, but fundamental math in programming, how GPUs work and their connection to memory and CPUs, how our eyes work, how to handle animation/time, and so on.
mwfogleman
Here's a music video the OP and I made with these techniques: https://www.youtube.com/watch?v=5GOciie5Pjk
alanbernstein
This might have been written just for me, I love the premise.
I am truly fascinated by people who attempt to reproduce the actual physiological vision effects of psychedelic drugs.
Psychoactive drugs can be probes into the inner workings of our minds - in some scientific sense - and exploring the vision effects seems likely to suggest interesting things about how our visual system works.
Mostly, I am just impressed when anyone is able to capture the visual experience in graphical effects, with any level of realism.
caseyohara
> Mostly, I am just impressed when anyone is able to capture the visual experience in graphical effects, with any level of realism.
I have to say that the cliche of super-bright, super-saturated, geometric or melty shapes like in the article is not a great reproduction of the typical visual effects of psychedelics. Apart from very high doses, the visual effects are much more subtle.
The /r/replications subreddit has GIFs and short videos with a much higher degree of realism https://www.reddit.com/r/replications/top/?t=year
helboi4
This is 100% not what psychedelics look like. It's generally just mildly more saturated colours and the feeling that everything is possibly breathing or swaying in a more natural way. I dunno what happens if you take insane amounts tbf. I always thought that psychedelic art was a bit more about the sort of thing that is super appealing to look at while tripping.
GuB-42
Maybe the most "scientifically accurate" replications of psychedelics are these "DeepDream" images.
They were originally made to debug neural networks for image recognition. The idea is to run the neural network in reverse while amplifying certain aspects, to get an idea of what it "sees". So if you are trying to recognize dogs, running the network in reverse will increase the "dogginess" of the image, revealing an image full of dog features. Depending on the layer you work on, you may get some very recognizable dog faces, or something more abstract.
The result is very psychedelic. It may not be the most faithful representation of an acid trip, but it is close. The interesting part is that it wasn't intended to simulate an acid trip. The neural network is loosely modeled after human vision, and messing with the artificial neurons has an effect similar to how some drugs mess with our natural neurons.
Hi, David Tristram here, founding member of Raster Masters, a 1990s computer graphics performance ensemble. As @hopkins has mentioned, we used high-end Silicon Graphics workstations to create synthetic imagery to accompany live music, including, notably, the Grateful Dead, Herbie Hancock, and Graham Nash.
After many iterations I'm currently working mainly in 2D video processing environments, Resolume Avenue and TouchDesigner. The links here are inspiring, thanks for posting.