I fed 24 years of my blog posts to a Markov model

vunderba

I did something similar many years ago. I fed about half a million words (two decades of mostly fantasy and science fiction writing) into a Markov model that could generate text using a “gram slider” ranging from 2-grams to 5-grams.

I used it as a kind of “dream well” whenever I wanted to draw some muse from the same deep spring. It felt like a spiritual successor to what I used to do as a kid: flipping to a random page in an old 1950s Funk & Wagnalls dictionary and using whatever I found there as a writing seed.
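
For readers curious what such a generator looks like, here is a minimal sketch of an order-n Markov text generator in Perl (hypothetical code in the spirit of the comment, not vunderba's actual tool; the first argument plays the role of the "gram slider"):

      #!/usr/bin/perl
      use strict;
      use warnings;

      my $n = shift // 3;                # the "gram slider": n-gram order
      my @words;
      push @words, split ' ' while <>;   # read the corpus from stdin/files

      # Transition table: each (n-1)-word window maps to the words seen after it.
      my %next;
      for my $i (0 .. $#words - $n + 1) {
          my $key = join ' ', @words[$i .. $i + $n - 2];
          push @{ $next{$key} }, $words[$i + $n - 1];
      }

      # Walk the chain from a random window, emitting up to 100 words.
      my @keys  = keys %next;
      my @state = split ' ', $keys[rand @keys];
      print "@state";
      for (1 .. 100) {
          my $choices = $next{ join ' ', @state } or last;
          my $word = $choices->[ rand @$choices ];
          print " $word";
          shift @state;
          push @state, $word;
      }
      print "\n";

Raising n makes the output more coherent but closer to verbatim quotation of the corpus; lowering it makes it stranger, which is rather the point of a "dream well".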

bitwize

Terry Davis, pbuh, did something very similar!

anthk

MegaHAL/Hailo (cpanm -n hailo for Perl users) can still be fun too.

Usage:

      hailo -t corpus.txt -b brain.brn
Where "corpus.txt" should be a file with one sentence per line. Easy to do under sed/awk/perl.

      hailo -b brain.brn
This spawns the chatbot with your trained brain.

By default Hailo uses its simpler engine. If you want something more "realistic", pick the advanced one described in 'perldoc hailo' and select it with the -e flag.
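
A rough sketch of that sentence-per-line preprocessing as a Perl one-liner (naive about abbreviations like "Dr.", so treat it as a starting point rather than a real sentence splitter):

      perl -0777 -pe 's/\s+/ /g; s/(?<=[.!?]) /\n/g' input.txt > corpus.txt

It slurps the whole file, collapses all whitespace runs to single spaces, then breaks the text after sentence-ending punctuation.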

lacunary

I recall a Markov chain bot on IRC in the mid-2000s. I didn't see anything better until GPT came along!

nurettin

Yes, I made one using bitlbee back in the 2000s, good times!

pavel_lishin

I made one for Hipchat at a company. I can't remember if it could emulate specific users, or just channels, but both were definitely on my roadmap at the time.

lloydatkinson

I'm hoping someone can find it so I can bookmark it, but I once read a story about a company that let multiple Markov chain bots loose in a Slack channel. A few days later production went down because one of them ran a Slack command that deployed or destroyed their infrastructure.

swyx

Now I wonder how this would compare to feeding the same corpus into a GPT-style transformer with a parameter count of a similar order of magnitude.

0_____0

I thought for a moment your comment was the output of a Markov chain trained on HN

bitwize

No mention of Rust or gut bacteria. Definitely not.

atum47

I usually have these hypothetical technical discussions with ChatGPT. I can share if you like, but I asked him exactly this: aren't LLMs just huge Markov chains?! And now I see your project... Funny

pavel_lishin

> I can share if you like

Respectfully, absolutely nobody wants to read a copy-and-paste of a chat session with ChatGPT.

empiko

LLMs are indeed Markov chains, with the context window as the state. The breakthrough is that we are able to efficiently compute well-performing transition probabilities for an enormous number of states using ML.
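
One standard way to make that claim precise: a model that conditions on at most its last k tokens defines a Markov chain whose state is the k-token window,

      P(x_{t+1} \mid x_1, \ldots, x_t) = P(x_{t+1} \mid x_{t-k+1}, \ldots, x_t)

The catch is the state space: with vocabulary V there are on the order of |V|^k states, so no lookup table could ever enumerate them; the transformer replaces the table with one shared parametric function.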

famouswaffles

LLMs are not Markov chains unless you contort the meaning of a Markov model's state so much that you could even include the human brain.

sophrosyne42

Well, LLMs aren't human brains, unless you contort the definition of matrix algebra so much that you could even include them.

cwyers

Yeah, there are only two differences between using Markov chains to predict words and LLMs:

* LLMs don't use Markov chains
* LLMs don't predict words

roarcher

...are you under the impression that you have an exclusive relationship with "him"? Everyone else has access to ChatGPT too.