Go allocation probe
26 comments
· July 22, 2025 · fsmv
typical182
> The biggest problem is any string you pass as an argument to the fmt functions is moved onto the heap
FWIW, that's not quite correct. For example, a string literal passed as a fmt argument won't be moved to the heap.
The upcoming Go 1.25 release has some related improvements that help strings in more cases. See for example https://go.dev/cl/649079.
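A minimal sketch of one way to see the difference yourself with testing.AllocsPerRun; the sink helper is made up for illustration, and the exact counts vary by Go version:
package main

import (
	"fmt"
	"testing"
)

//go:noinline
func sink(v interface{}) {} // forces the argument through an interface conversion

func main() {
	// Constant string: the interface value can point at static, read-only data.
	lit := testing.AllocsPerRun(100, func() { sink("hello") })
	// String built at runtime: boxing it into interface{} typically allocates.
	dyn := testing.AllocsPerRun(100, func() {
		s := string([]byte{'h', 'i'})
		sink(s)
	})
	fmt.Println("literal:", lit, "runtime string:", dyn) // expect roughly 0 vs. >0
}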
fsmv
Yeah I just saw in the bug they're finally making progress on fixing this, exciting! I edited in the link if you didn't see.
coxley
> because interface{} is always counted as escaped from the stack
Not quite - if the function accepting interface{} can be inlined (and other heuristics are groovy), then it won't escape.
Trivial example but it applies to real-world programs:
> cat main.go
package main

import "github.com/google/uuid"

func main() {
	_ = foo(uuid.NewString())
}

func foo(s any) string {
	switch s := s.(type) {
	case string:
		_ = "foo:" + s
	}
	return ""
}
# Build with escape analysis
> go build -gcflags="-m=2" main.go
# command-line-arguments
./main.go:9:6: can inline foo with cost 13 as: func(any) string { switch statement; return "" }
./main.go:5:6: can inline main with cost 77 as: func() { _ = foo(uuid.NewString()) }
./main.go:6:9: inlining call to foo
./main.go:6:24: uuid.NewString() does not escape
./main.go:6:9: "foo:" + s does not escape
./main.go:9:10: s does not escape
./main.go:12:14: "foo:" + s does not escape
felixge
Hacking into the Go runtime with eBPF is definitely fun.
But for a longer-term solution, in terms of both reliability and overhead, it might be worth raising this as a feature request for the Go runtime itself. Type information could be provided via pprof labels on the allocation profiles.
aktau
Not sure there is quorum yet on what a solution looks like for adding labels to non-point-in-time[^1] profiles like the heap profile without leaking: https://go.dev/issue/23458.
[^1]: As opposed to profiles that collect data only when activated, like the CPU profile. The heap profile is active from the beginning if `MemProfileRate` is set.
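For reference, this is roughly how pprof labels are attached today; in current Go releases they show up on CPU (and goroutine) profile samples, not on the heap profile, which is what the feature request would have to address. A sketch only; the label key/value and doWork are placeholders:
package main

import (
	"context"
	"runtime/pprof"
)

func doWork() {
	// ... allocation-heavy work whose samples we would like to label ...
}

func main() {
	ctx := context.Background()
	// Samples taken while the callback runs carry these labels
	// (currently CPU/goroutine profiles only, not allocation profiles).
	pprof.Do(ctx, pprof.Labels("component", "ingest"), func(ctx context.Context) {
		doWork()
	})
}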
jasonthorsness
Interesting... usually you can guess at what is being allocated from the function doing the allocation, but in this case the author was interested in types that are allocated from a ton of locations (spoiler alert: it was strings). Nice use of bpftrace to hack out the information required.
ajd555
I'm guessing you're using unlurker to post this comment? By any chance, is the comment AI generated? (The use of the term "the author" is a hint.) Apologies if not.
jrockway
How is "the author" a hint that it's AI generated? It's a common speech pattern. There are probably thousands of comments on HN articles from before AI that use the term. I'm not even sure what you'd use instead.
jasonthorsness
It is not AI generated and unlurker just finds articles with active conversations, it doesn't do any kind of comment automation. Upon reading it again, my comment in question maybe didn't add all that much; I just found the article interesting, so maybe I get where you are coming from.
As I mentioned in another reply, the weirdest thing about this comment chain is I saw this article on the front page, not unlurker (there wasn't much conversation yet when I posted so it would not have shown on the unlurker view I use).
Is "the author" a phrase AI prefers? Maybe I'll need to retire that along with "delve" and the em-dash and "you're absolutely right".
ajd555
Ha - thanks for answering this and clearing it up! Like I said, apologies if I was wrong. I think my AI slop paranoia had kicked in!
No, honestly, the comment was more than legitimate; I guess the "the author" part made me think of LLMs summarizing research papers.
Well, at least I've discovered unlurker out of this :)
giancarlostoro
He is the author of unlurker
https://github.com/jasonthorsness/unlurker
I had no idea what this was... Is this an ongoing problem on HN or something?
ajd555
yeah, that's how I found it - I had no idea what it was until I clicked on his profile. I personally haven't noticed it, but this comment really stood out, and I really hope this doesn't become a thing
osigurdson
>> func (thing *Thing) String() *string { if thing == nil { return nil } str := ... return &str }
It seems like the "..." of str = ... is the interesting part.
jamii
The ... is the useful part. We actually want that string, so we can't avoid allocating it.
But the &str at the end is an additional heap allocation and causes an additional pointer hop when using the string. The only reason the function returns a pointer to a string in the first place is so that the nil check at the beginning can return nil. The calling code always checks if the result is nil and then immediately dereferences the string pointer. A better interface would be to panic if the argument is nil, or if that's too scary then:
func (thing *Thing) String() (string, bool) {
	if thing == nil {
		return "", false
	}
	str := ...
	return str, true
}
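A hypothetical caller, to show the difference from the pointer-returning version (thing and the surrounding code are assumed):
if s, ok := thing.String(); ok {
	fmt.Println(s) // the string is used directly; no *string to allocate or dereference
}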
90s_dev
I forgot to ask, that day that the Go team did an AMA here: did AI have any influence or sway or advice etc in choosing Go over other solutions?
fsmv
It is very difficult to avoid putting strings on the heap in Go. I used the built-in escape analysis tools and made sure I only use a constant amount of memory in the loop in my short https://github.com/fsmv/in.go progress bar program.
The biggest problem is any string you pass as an argument to the fmt functions is moved onto the heap because interface{} is always counted as escaped from the stack (https://github.com/golang/go/issues/8618).
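A minimal sketch of checking that with the compiler's escape analysis; the file name, the os.Args input, and the exact diagnostics are assumptions and vary by Go version:
> cat probe.go
package main

import (
	"fmt"
	"os"
)

func greet(name string) {
	// name is boxed into the variadic ...interface{} parameter, so escape
	// analysis typically reports it escaping to the heap (go.dev/issue/8618).
	fmt.Println("hello", name)
}

func main() {
	greet(os.Args[0]) // a runtime value, so the constant-string optimization doesn't apply
}

> go build -gcflags=-m probe.go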