Hacker News
Ask HN: Why are we concerned about terminal rendering speeds?
3 points by drivingmenuts on June 13, 2024 | 15 comments
The terminal renders text and, AFAIK, it can render faster than anyone (except maybe the fastest speed reader) can read. So why does terminal software need the GPU to render text?


I'm probably going back to iTerm/Warp because the minimal and ultra fast ones don't seem to implement a "Find string in output" command.


There's a lot of raster data in today's HiDPI world. For example, if I have iTerm2 on the left side of my screen and this browser on the right side, the iTerm2 window alone has an area of more than 7 million pixels.

Is HiDPI (aka Retina) even worth it in general? Yes, because all text, in a terminal emulator or otherwise, looks absolutely deliciously good when each individual character is allotted 4x the pixel budget of the before times.
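As a rough sketch of that pixel math (assuming a 5120x2880 5K display, since the actual screen isn't stated):

```python
# Rough pixel math for a HiDPI terminal window (assumed 5K display, 5120x2880).
display_w, display_h = 5120, 2880

# A window occupying the left half of the screen:
window_pixels = (display_w // 2) * display_h
print(window_pixels)  # 7372800 -- "more than 7 million pixels"

# At 2x (Retina) scaling, each glyph cell gets 2*2 = 4x the pixels
# it would have on a 1x display -- the "4x pixel budget" above.
scale = 2
print(scale * scale)  # 4
```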


I spend most of my time working in the terminal. Low latency keeps me more grounded in what I'm doing, which makes it easier to focus. And it just feels more satisfying.


Does it? Which one? Most modern VTs probably need a GL/SDL/DX interface to render, but not the GPU specifically.

This is a pretty loaded question, could use unpacking a little.


Well, iTerm2 definitely has that option (at least it does now - not sure about past versions), and every time I see an announcement for some sort of terminal software, I seem to recall them touting their ability to use the GPU to render.

I understand why the iTerm2 makers thought AI would be a good addition (not using that, either), but I didn't know why GPU rendering was a thing. I've turned off GPU rendering - originally because I wasn't sure why it was needed, so I never turned it on. Now that I've seen some explanation, I'm still not turning it on - I don't need it.


Not OP, but there are many others which use the GPU specifically, such as Alacritty, Kitty, Warp, and Ghostty, to name a few off the top of my head. It seems like pretty much every new terminal emulator coming out these days focuses on GPU rendering.


Interesting - I remember (u?)xterm being screen-tearing fast back in the Athlon days. I wonder what changed since then, because blitting glyphs onto a surface at 60-144 fps wasn't a big deal even then.

Modern terminals feel like dot matrix printers compared to that, but I always assumed it's because they use GtkTextView or something like that for rendering. Requiring a GPU, rather than just some form of direct-blit API, seems insane to me.
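The direct-blit approach being described can be sketched as a toy in Python (the 8x8 bitmap "font" here is a stand-in, not a real rasterizer): each glyph is rasterized at most once into a cache, and drawing a character is just a pixel copy into the frame surface.

```python
# Toy sketch of glyph blitting: rasterize each glyph once into a cache,
# then copy its pixels into the frame surface -- the direct-blit approach
# old xterm-style renderers used. Assumed 8x8 cells; the rasterizer is fake.
GLYPH_W, GLYPH_H = 8, 8

def rasterize(ch):
    # Stand-in rasterizer: a real renderer would pull a bitmap from a font.
    return [[(ord(ch) >> (x % 8)) & 1 for x in range(GLYPH_W)]
            for _ in range(GLYPH_H)]

cache = {}  # glyph cache: each distinct character rasterized at most once

def blit(surface, col, row, ch):
    bitmap = cache.setdefault(ch, rasterize(ch))
    for y in range(GLYPH_H):
        for x in range(GLYPH_W):
            surface[row * GLYPH_H + y][col * GLYPH_W + x] = bitmap[y][x]

# An 80x25 character grid as a plain pixel buffer:
frame = [[0] * (80 * GLYPH_W) for _ in range(25 * GLYPH_H)]
for i, ch in enumerate("ls -la"):
    blit(frame, i, 0, ch)
```

After the loop, repeated characters ("l" appears twice) hit the cache instead of being rasterized again, which is why this was cheap even on old hardware.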



Back in the DOS era, a "terminal" would let you render to the video buffer directly, which allowed an ecosystem of text GUI apps to flourish. Right now this is largely impossible, because modern terminals are really emulators and you have to rely on strange things such as ncurses etc. A better terminal with some nice features, such as access to a virtual video buffer, would allow for exciting new possibilities.
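For readers who never saw it, the DOS text-mode buffer can be modeled in a few lines (a Python illustration, not the actual DOS API): video memory at segment 0xB800 was an 80x25 grid where each cell is two bytes, a character and a color attribute, and apps drew by poking bytes directly.

```python
# Illustrative model (not the DOS API) of the classic 80x25 text-mode
# buffer: each cell is two bytes -- a character and a color attribute.
COLS, ROWS = 80, 25
buf = bytearray(COLS * ROWS * 2)  # stands in for memory at segment 0xB800

def put_char(x, y, ch, attr=0x07):  # 0x07 = light grey on black
    off = (y * COLS + x) * 2
    buf[off] = ord(ch)
    buf[off + 1] = attr

for i, ch in enumerate("Hello"):
    put_char(i, 0, ch)

# Reading the buffer back: every other byte of row 0 is a character.
row0 = bytes(buf[0:10:2]).decode("ascii")
print(row0)  # Hello
```

Writing a cell was a single memory store, which is why text GUIs could repaint the whole screen every frame without breaking a sweat.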


If you’re running a long process that spits out a lot of text, it can make a difference. For me, though, I don’t really care that much about throughput; I care about latency. I want to see my keystrokes as fast as possible, and hardware acceleration doesn’t necessarily help that much there.


Because when you have a 240hz monitor and you're writing a TUI application, it makes print debugging via standard language print methods even more interesting.

GPU-accelerated terminals are just one part of the stack that makes such lunacy possible.


Sometimes, in the Windows command line... I'll type something like

  D:\source>dir *.jpg /s
to find out how many gigabytes of photos I've taken in the past 3 decades... I really don't care about anything other than the total, and maybe file count.

     Total Files Listed:
           343340 File(s) 539,323,481,512 bytes
               0 Dir(s)  1,662,307,921,920 bytes free
  
  D:\Source>
The faster the terminal renders, the quicker I get the totals. Tip: if you minimize the window, it usually runs much quicker.


So Windows has to actually render out the filenames and sizes somewhere to get the total amount of space used by the files?

That seems a bit ... non-optimal.


No, it doesn't have to do that.

  Get-ChildItem *.jpg -Recurse | Measure-Object -Property Length -Sum
The output includes the count and the sum total of the file sizes (in bytes); the -Recurse flag matches dir's /s. That's PowerShell, which has been around for a while now - I've never bothered to learn it, though, and always have to look things up. Get-ChildItem can be replaced with its aliases ls or dir.
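The same recursive count-and-sum can be done cross-platform; a minimal Python sketch (the `total_jpg_bytes` helper name and the "." path are just examples):

```python
# Rough cross-platform equivalent of the PowerShell one-liner above:
# recurse like `dir /s`, counting *.jpg files and summing their sizes.
from pathlib import Path

def total_jpg_bytes(root):
    files = list(Path(root).rglob("*.jpg"))
    return len(files), sum(f.stat().st_size for f in files)

count, total = total_jpg_bytes(".")
print(f"{count} File(s) {total:,} bytes")
```

Like Measure-Object, this never renders a line per file, so the cost is pure filesystem traversal rather than terminal throughput.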


I feel like that's a quirk of the PowerShell and Windows Terminal apps. Using cmd.exe, it's instant.



