necovek's comments | Hacker News

> You can save thousands of people, but murdering someone still should mean a life sentence.

Not if you murder someone to save a thousand people ;)

(though you might still get one as you need to prove that there was no other way to save them)


Not really, governments have been encouraging family formation because it brings up new taxpayers since, well, forever.

Which does not make the single-living idea weird or dumb, ofc, but it does make it additionally expensive on top of the natural cost (just like multi-floor houses are cheaper per unit of floor area built).


Isn't this obvious?

Yes, per adult, multi-generation family homes are even more cost-effective than couples' homes (even accounting for smaller pensions compared to salaries), and both are more cost-effective than living single.

Apart from rising prices, my experience (not in Canada, though) is that living spaces are growing too, as we are not satisfied to live in the same cramped 20m2 studio as singles were 30 or 50 years ago.


> we are not satisfied to live in the same cramped 20m2 studio as singles were 30 or 50 years ago.

I conjecture that this is, at least partially, caused by modern people being more isolated; even when they do socialize, there are fewer "third spaces" to get together with friends, so someone ends up having to host the Super Bowl watch party in their apartment, for example.


> living spaces are growing too

Median home sizes have gone from 1400 sqft in the 70s to 2400 sqft in recent years.

https://www.bankrate.com/real-estate/average-home-size/

Part of it is the economics of construction. Part of it is the growing threshold for “bare minimum”. In-unit laundry was optional in the 70s, and I’ve heard people wanting a “laundry room”. The pandemic has pushed the need for an office. Larger kitchens and more storage space are also a big difference in newer units vs older ones.


In the '70s, in-unit laundry in a rental apartment was almost unheard of except perhaps at the very top end. An on-premises shared laundry room was normal, but having to go to a laundromat was not uncommon either.

I did not have laundry facilities of my own until I bought a house.


It’s $1.75/load where I live now. Small washers and dryers

>Median home sizes have gone from 1400 sqft in the 70s to 2400 sqft in recent years.

Because you literally need more square footage to amortize all the regulatory required and industry checkbox required bullshit over. Ain't no different than General Motors saying "no more small cars from us in the US".


30 years ago I did not need to rent a 20m² studio. As a young college student I rented a spacious 750 sq. ft. 1-bedroom, furnished apartment that was more than affordable on my paycheck from driving a forklift at the pipe yard.

Currently it's impossible to rent a 70 m² furnished apartment anywhere in the developed world on a warehouse, agriculture, or hospitality job. Maybe if you worked in Amsterdam and lived in Cambodia.

Indeed. We need a lot more small apartments for individuals. It goes against conventional wisdom that we need more "family-sized homes" but in reality every jurisdiction just needs a ton of 1-bed units.

Toronto overbuilt tiny condos and now prices are down 15% from 2023 peak. Other types of house prices are roughly stable.

Hard to draw too many conclusions since "down 15%" is still "way too fricking expensive", but...


"overbuilt" seems editorial. Drive the price to zero. Nobody says that we overproduced potatoes just because they are all ten kilos for a dollar.

They certainly do say that about potatoes in producer contexts.

True, but our response (in America) is just to pay off the producers to keep food market prices low.

Furnished is doing a lot of work here; local warehouse jobs start at enough to afford a 420 sq ft studio; the 1-bedroom would be 750 sq ft and barely affordable.

Furnished, at the time, cost almost nothing. It wasn't even furnished by the lessor, it was a separate local furniture company, their monthly was very low and they delivered when you moved in and hauled it off when you moved out, included in the fee.

30-50 years ago, a cramped 20m2 studio as a single was a luxury; the standard was to have roommates if you didn't have a partner.

>we are not satisfied

Some of us might be satisfied, but zoning and development approvals seem to have a hatred of small apartments. The ones that get proposed meet fierce opposition from locals who are afraid of having too many neighbors who aren’t rich people.


Perhaps try a 5k/27" at 150%, or look for visual acuity correction :)

FWIW, I could see jagged edges on 4k at 24" without subpixel rendering; 27" is worse. Yes, even 4k at 32" is passable with MacOS, but Linux looks better (to the point that 4k at 43" has comparable or slightly better text quality than 4k at 32" on a Mac).

I am trying to get a 55" 8k TV to work well with my setup, which might be a bit too big (but same width as the newly announced 6k 52" monitor by Dell), but it's the first next option after prohibitively expensive 32" options.


Agreed. I tried a 24" 4k screen as soon as they came out (it required two DP cables to run at 60Hz at the time), and with subpixel rendering turned off, I could see jagged edges on fonts from a normal sitting position (I am shortsighted, but at -3.25 I always need correction anyway, which brings my eyesight to better than 20/20). At 27" or 32", DPI is even worse.

And MacOS has removed support for subpixel rendering because "retina", though I only use it when forced (work).


It's not just that: the bandwidth needed to drive anything above 4k or 5k is already over the limits of HDMI 2.0 (and 2.1 without all the extensions). DisplayPort is a bit better, with 1.4 already having enough bandwidth for 8k at 30Hz or 4k at 120Hz, or 8k at 60Hz with DSC.

When considering a single-cable solution like Thunderbolt or USB-C with DP altmode, if you are not going with TB5, you will either use all bandwidth for video with only USB2.0 HID interfaces, or halve the video bandwidth to keep 2 signal lanes for USB 3.x.

(I am currently trying to figure out how to run my X1 Carbon gen 13 with my 8k TV from Linux without an eGPU, so I am deep in the trenches of color spaces, EDID tables and such, as I have only got it to put out 6k to the TV :/)
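The bandwidth figures above are easy to sanity-check: uncompressed link bandwidth is just pixels × refresh rate × bits per pixel. A minimal Python sketch, assuming 8-bit RGB (24 bpp) and ignoring blanking overhead (which adds roughly 10-20% on top in real signal timings):

```python
def video_bandwidth_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Raw (uncompressed) active-pixel bandwidth in Gbit/s.

    Assumes 8-bit RGB (24 bpp); real links also carry blanking
    intervals, so actual requirements run a bit higher.
    """
    return width * height * refresh_hz * bits_per_pixel / 1e9

# DP 1.4 (HBR3) carries ~25.9 Gbit/s of payload; HDMI 2.0 ~14.4.
for name, w, h, hz in [("4k@120", 3840, 2160, 120),
                       ("5k@60", 5120, 2880, 60),
                       ("8k@30", 7680, 4320, 30),
                       ("8k@60", 7680, 4320, 60)]:
    print(f"{name}: {video_bandwidth_gbps(w, h, hz):.1f} Gbit/s")
```

8k at 60Hz lands near 47.8 Gbit/s raw, which is why it needs DSC on DP 1.4 and only fits uncompressed on TB5-class links.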


I believe menus have been available "via API" since an accessibility (a11y) push in GNOME before the 2.0 release (the atk library and friends).

What was impossible was to stop apps from showing the usual menu bar inside the window.

Obviously, with something so core to the system, plenty of devils in the details.


Since MacOS removed subpixel rendering a few years ago, regular-resolution displays have terrible-looking text in comparison to Windows or Linux.

Gnome on Linux has worked great for over a decade with a single high-resolution screen, but there are certainly apps that render too small (Steam was one of the problems).

Different scaling factors on several monitors are not perfect though, but I generally dislike how Mac handles that too, as I mostly use a big screen when docked (32"-43"-55"), or the laptop screen when not, and it rearranges my windows with every switch.


I recently mentioned in another comment that Fedora 43 on my Ideapad is the first “just works” experience I’ve had with my multi monitor setup(s) on anything other than Windows 11 (including MacOS where I needed to pay for Better Display to reach the bar of “tolerable”).

Zero fiddling necessary other than picking my ideal scaling percentage on each display for perfect, crisp text with everything sanely sized across all my monitors/TVs.

I gave up on Linux Mint for that exact reason. I wasted so much time trying to fine tune fonts and stuff to emulate real fractional scaling. Whenever I thought I finally found a usable compromise some random app would look terrible on one of the monitors and I’d be back at square one.

Experimental Wayland on Linux Mint just wasn’t usable unfortunately and tbh wasn’t a big fan of Cinnamon in general (I just really hated dealing with snaps on Ubuntu). I did tweak Gnome to add minimize buttons/bottom dock again and with that it’s probably my favorite desktop across any version of Linux/MacOS/Windows I’ve ever used!

I kept reading endorsements of Fedora's level of polish/stability on HN but was kinda nervous having used Debian distros my entire life and I’m really happy I finally took the plunge. Wish I tried it years ago!


> I kept reading endorsements of Fedora's level of polish/stability on HN but was kinda nervous having used Debian distros my entire life and I’m really happy I finally took the plunge. Wish I tried it years ago!

This. I don't know why, but people forget about Fedora when considering distros. They would rather fight Arch than try Fedora. So did I. Maybe it's Red Hat. Wish I switched earlier, too. (Although I heard this level of polish wasn't always the case.)

I love Fedora so much. Everything just works, but that's not that special compared to Ubuntu. What is special is the fucking sanity throughout the whole system. Debian based distros always have some legacy shit going on. No bloat, no snap, nothing breaking convention and their upgrade model sits in the sweet spot between Ubuntu's 4 year LTS cycle and Arch's rolling release. Pacman can rot in hell, apt is okay, but oh boy, do I love dnf.

Tho, Fedora has some minor quirks, which still make it hard to recommend for total beginners without personal instructions/guidance IMO. Like the need for RPMFusion repos and the bad handling/documentation of that. Not a problem if you know what a package manager, PKI and a terminal are, but too much otherwise.


I dual booted Fedora back when it was still called Fedora Core from version 6 until 11-ish. I had it installed on a laptop and had a lot of driver issues with it and eventually didn't bother with dual booting when I moved to a new laptop.

I'm now looking to get off Windows permanently before security updates stop for Win 10 as I have no intention of upgrading to Win 11 since Linux gaming is now a lot more viable and was the only remaining thing holding me back from switching earlier. I've been considering either Bazzite (a Fedora derivative with a focus on gaming) or Mint but after reading your comment I may give vanilla Fedora a try too.

So far I've tried out the Bazzite Live ISO but it wouldn't detect my wireless Xbox controller though that may be a quirk of the Live ISO. I'm going to try a full install on a flash drive next and see if that fixes things.


Give it a try! Although, I do all my gaming on a Playstation. In Fedora, the Steam and NVIDIA Fusion repos come preinstalled and can be enabled during installation, or later in Gnome's 'Software' or the package manager, but I can't speak to that. The open-source AMD drivers are in the main repo, no action needed. ROCm too, but that can be messy and is work-in-progress on AMD's side. Can't vouch for the controller, but people claim they work. I guess that's the live image. I heard games with kernel-level anti-cheat engines categorically don't work on Linux, but this may change at some point. In that case, or if you want "console mode", a specific gaming distro may be worth considering; otherwise I would stick to vanilla. Good luck! Hope I didn't promise too much ;)

So I cleared out one of my SSDs and installed Fedora yesterday.

I still had the issue of no gamepad detection. I had to install xone which took some trial and error. Firstly, I didn't have dkms installed and secondly, soon after installing Fedora the kernel was updated in the background and on reboot my display resolution was fixed to 1024x768 or something for some reason (that's gonna be another issue I'll have to look into). I rebooted and went back to the previous version and then dkms complained the kernel-headers were missing. However, the kernel-headers were installed for the latest kernel but not the older version I had rebooted to. I'm not used to Fedora or dnf (I run Proxmox+Debian in my homelab) so after a quick search to figure out how to install a specific version of a package (it's not as simple as <package>@<version> but rather <package>-<version>.fc$FEDORA_VERSION.$ARCHITECTURE) I got kernel-devel installed and was able to finally run the xone install script successfully and have my gamepad detected.

The most frustrating thing is that the xone install script doesn't fail despite errors from dkms so after the first install (where I almost gave up because I thought something was wrong with my setup) I had to run the uninstall script each time there was a problem and then run it again. The xone docs also mention running a secondary script which doesn't actually exist until the first script runs successfully so that added a lot of confusion.


Lol. Well, that does sound terrible!

My understanding is you only need xone for the special wireless adapter, right? Have you tried a cable or plain Bluetooth before? Also, Steam seems to bundle its own drivers for it, so the controller may just work within Steam games regardless.

I feel a bit bad, but honestly gaming on Linux is not my thing. From a quick glance, messing with the kernel like that may cause problems with secure boot and maybe that's causing your issues. Maybe you need to sign your modules or disable secure boot.

Have you tried the Copr repo? https://copr.fedorainfracloud.org/coprs/jackgreiner/xone-git...

And of course Bazzite seems to have addressed this out-of-the-box... :D

Quite frankly, if you want to do anything but gaming on that machine, at least for me, manually installing kernel modules from GitHub would be a deal breaker, since that seems rather unstable and prone to cause nasty problems down the line.


Canonical releases an Ubuntu LTS release every two years: active is 24.04, next is coming in a few months as 26.04.

LTS support runs for 5 years (there is extended support for 10 years available), so you can skip an LTS if you don't need the latest base software.


You are right, I got that mixed up. To be fair, I somehow also thought of yearly releases for Fedora, which isn't the case. It's every six months, so the relation remains identical, just off by a factor of 2 :D

Steam has DPI scaling issues on Windows as well, especially on multimonitor setups.

As someone who has gone through these phases already: we are endlessly optimizing around a non-problem. How about focusing on local development instead?

Whatever the problem is with it (picking up system Python? drop /usr/bin from the path... picking up system libraries? take full control of PYTHONPATH and sys.path) can be resolved more trivially than monkeypatching your running Python environment, and would let you get back to that extremely fast feedback loop we used to have in SW dev 20+ years ago.
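Taking control of sys.path can be as small as this. A minimal sketch with hypothetical project paths (in practice you would reach for a virtualenv or PYTHONPATH rather than mutating sys.path at runtime):

```python
import sys


def isolate_interpreter(project_paths):
    """Drop system package directories from sys.path and put the
    project's own directories first, so imports resolve predictably.

    "site-packages"/"dist-packages" filtering is a rough heuristic;
    the stdlib lives elsewhere and is untouched.
    """
    keep = [p for p in sys.path
            if "site-packages" not in p and "dist-packages" not in p]
    sys.path[:] = list(project_paths) + keep


# Hypothetical layout: project code in ./src, vendored deps in ./vendor.
isolate_interpreter(["./src", "./vendor"])
print(sys.path[:2])
```

After this, `import foo` finds the project's own `foo` before anything the system interpreter happens to ship.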

Yes, your software needs to not be tightly coupled to your infra: but that's just good, old clean architecture, right?

I sometimes wonder if I am a minority, but I think the modern "let's stick it in a container" bunch is just more vocal. And like all the other historical abstraction layer cakes (thin-clients, VMs, only-runs-deployed-to-the-cloud...), sooner or later, we still realise only local-first can get us that (almost) zero latency for development and debugging with no additional effort if we do all the right things anyway.


Code is deployed in containers. Improved capabilities to debug those systems will be really helpful to me (and I'm sure many others). Recreating real-world state locally takes effort and is error-prone.

As I said, you can either have better, cleaner architecture where the fact it's deployed to containers does not matter; or, you could decide not to deploy to containers if you are going to be breaking container boundaries in production systems anyway!

It's a pretty big cognitive dissonance to me: we move away from direct access to improve our security posture, and then put in more effort to break the barriers we just put up. Next, we'll put in more effort to harden these backdoors, break them again to be more productive...

In most orgs that follow relevant secure development lifecycle standards and legislation, engineers usually won't be able to do that for legal reasons anyway (due to GDPR or the CRA in the EU; comparable laws in California or New York), so really, big orgs will require you not to debug the live production system but to log relevant information and act upon it later.

Anyway, we are all bound to re-learn these lessons, so good luck to anyone entering the scene today :)


As mentioned in a sibling comment, weight might need to be accounted for too: thicker paper is more absorbent, but not linearly so.

So really, how absorbent the paper is should be the gold standard; let's ask manufacturers to put that on the packaging?


Manufacturers already indicate the thickness of their paper with ply count: <https://blog.whogivesacrap.org/home/difference-ply-toilet-pa...>.

Although that doesn’t speak to the actual quality of the individual layers of paper. I’m not sure if weight is useful especially when manufacturers are already putting their thumb on the scale in other ways with the ‘2 Jumbo-Mega-Rolls are the equivalent of 8 Super rolls’ scheme that I initially referred to.

If all weight can tell me is that 2-Jumbo-Mega Rolls weigh the same as 8 Super rolls am I any better informed?

This is why I’m pretty content with using the price in cents per square foot as a baseline. In general it’s a useful metric when shopping elsewhere at the grocery store too.
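That baseline is trivial to compute at the shelf. A sketch with made-up prices and roll areas (all numbers purely illustrative, chosen only to show the comparison):

```python
def unit_price_cents(price_dollars, sq_ft):
    """Price in cents per square foot of paper."""
    return 100 * price_dollars / sq_ft


# Hypothetical packs: the "bigger" rolls aren't automatically cheaper.
jumbo = unit_price_cents(7.99, 2 * 180.0)   # 2 "Jumbo-Mega" rolls
regular = unit_price_cents(6.49, 8 * 40.0)  # 8 "Super" rolls
print(f"jumbo: {jumbo:.2f} c/sq ft, regular: {regular:.2f} c/sq ft")
```

With these made-up numbers the 8-roll pack actually wins, which is exactly what the equivalence marketing obscures.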

