

I think it’s interesting that the phrase “ARM-free” roadmap is being used. I had no idea there had been so much market penetration of RISC-V already


I’m really grateful for the introduction to deceptive patterns here.
I was not aware of it, and I think it’s important to have language that can describe specifically how tech companies are trying to coerce people.


There’s a 1920 x 1200 non-touch display option, which will surely get you better battery life than OLED. But what’s most interesting about it is the 1-120 Hz variable refresh rate, which Dell says is a first for this model. That extremely low refresh rate should help save power when static images or text is on the screen.
Ah yeah, I should have read the rest of the article. I didn’t know about that feature though, that’s cool


1 Hz display option: like an e-Ink display?
(it says 120Hz in the article)


If there’s one way to make people care about cybersecurity…


I’m working towards something like that. I’m hoping to ultimately drop the smartphone altogether, and I’ve set my current phone’s end of life (2027ish?) as the goal.
I think the other thing that’s necessary to keep the same sense of connectedness is a device to receive notifications, and I have an open source smartwatch I want to program for that. I’ve been working on a notification server too (kind of like Gotify), but at the moment it’s a work in progress


By layers I mean image layers when manipulating an image in an image editor. So I guess what you’re saying is an image would be flattened before being passed to a compression algorithm?
You’re more qualified than me, I’ve only watched How It’s Actually Made


I wonder if, hypothetically, AI could do the same with a box over text, even if it was 100% opaque. For example, if the data from the layer containing the text were part of the image data passed to an image compression algorithm, and that data was somehow reflected in the output
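If the editor flattens layers with the standard “over” compositing rule before handing pixels to the compressor, a fully opaque box mathematically erases whatever is underneath, so nothing of the text layer even reaches the compression step. A minimal sketch of that rule for a single colour channel (hypothetical pixel values):

```python
# Sketch of Porter-Duff "over" compositing for one colour channel (0-255).
# With alpha = 1.0 the foreground fully replaces the background, so a
# 100% opaque box leaves no trace of the hidden pixel in the flattened image.
def composite_over(fg, bg, alpha):
    """Blend a foreground pixel over a background pixel."""
    return round(fg * alpha + bg * (1.0 - alpha))

hidden_text_pixel = 200  # hypothetical value from the covered text layer
opaque_box_pixel = 0     # black box drawn on top

print(composite_over(opaque_box_pixel, hidden_text_pixel, alpha=1.0))  # 0
print(composite_over(opaque_box_pixel, hidden_text_pixel, alpha=0.5))  # 100
```

On this model, leakage would only be possible if the box were even slightly transparent, or if the file format kept the layers (e.g. a layered PSD) rather than a flattened bitmap.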


Misread as Pelletburo, now sad there’s no pet feeder called that


I think they had a RISC-V CPU as an experimental option for a while, but I couldn’t see it on their site recently.
Not sure what happened with that
EDIT: my mistake, it was an emulated RISC-V CPU, running on an FPGA (source)


I might never get around to flipping whatever kill switch they claim to be working on, so I’m turning off as much as I can now


For the record, a quick web search for how to disable AI in Firefox gave me this list of items to set to false in about:config:
browser.ml.enable
browser.ml.chat.enabled
browser.ml.chat.sidebar
browser.ml.chat.shortcuts
browser.ml.chat.page.footerBadge
browser.ml.chat.page.menuBadge
browser.ml.linkPreview.enabled
browser.tabs.groups.smart.enabled
extensions.ml.enabled
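If you want those settings to survive profile resets, the same prefs can be pinned from a user.js file in the Firefox profile folder, which re-applies them on every startup (a sketch assuming the pref names above; they may change between Firefox versions):

```js
// user.js in the Firefox profile directory — forces these prefs off at startup
user_pref("browser.ml.enable", false);
user_pref("browser.ml.chat.enabled", false);
user_pref("browser.ml.chat.sidebar", false);
user_pref("browser.ml.chat.shortcuts", false);
user_pref("browser.ml.chat.page.footerBadge", false);
user_pref("browser.ml.chat.page.menuBadge", false);
user_pref("browser.ml.linkPreview.enabled", false);
user_pref("browser.tabs.groups.smart.enabled", false);
user_pref("extensions.ml.enabled", false);
```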


I feel like “both” is also an option


The brain cells presumably have a life span… if this technology ever gets used in consumer devices, I’d like to know how people will try and squeeze extra life out of a failing component.
Take it out and warm it in their hands like an alkaline battery?
Give it a shake?
Sing to it?
Some kind of stimulant drug?


unsupported hardware, firmware bugs


No-one suspected Bruce Wayne’s “free WiFi for Gotham City” initiative


It doesn’t even try?
People are right, this AI thing is overblown


Maybe wearing a different tinfoil hat every day would mess up a person’s “fingerprint”


expired SSL cert