Hacker News | dpoloncsak's comments

Should we count the first one, when they bombed an AWS data center?

Does destroying a data center count as a DDoS?


A non-D DoS attack.

Maybe we just retrofit the D to mean Destructive

Destructive Denial of Service


I think it's better to compare data breaches to data breaches, like when Adobe got breached. Or Oracle. Or Rockstar.

Nothing happened in the grand scheme of things. Even after Oracle lied and pulled some shady tactics to downplay what happened.

A few years ago CrowdStrike took down corporate computers across the globe, and everyone still uses Falcon. There is simply no accountability anymore.


There has to be a very niche market here: people who want ChromeOS on their device but lack the technical know-how to install it themselves, or who don't have a second device that can flash an ISO.

I guess for $3 it's not really a cash-grab or anything. Kinda nice to see vendor-supported live USBs honestly


I'd say the majority of people don't know how to install an OS on a device, and having the ability to run ChromeOS on what is likely e-waste is a good thing.

I applaud the efforts of people/groups like MrChromeBox who figure out how to flash Linux onto Chromebooks. There are great designs like the Samsung Galaxy Book in red with an AMOLED display and thin metal body; unfortunately it only has 8GB of RAM.

It kind of makes sense for it to be a partnership with Back Market, which also sells used hardware.

That way, the ChromeOS USB key can be an add-on to the purchase of some old laptop that can barely run Windows anymore.


Pretty sure it's mandatory in some EU countries now, iirc

For my homelab, I set up a Raspberry Pi running PiHole. PiHole includes the ability to set local DNS records if you use it as your DNS resolver.

Then, I use Tailscale to connect everything together. Tailscale lets you use a custom DNS, which gets pointed to the PiHole. My phone blocks ads even when I'm away from the house, and I can even hit any services or projects without exposing them to the general internet.

Then I set up an NGINX reverse proxy, but honestly that might not be necessary.
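For the curious, the reverse proxy part of a setup like this is roughly one NGINX server block; all hostnames and ports below are made up for illustration (the local DNS name would be a record you add in PiHole, pointed at the box's Tailscale IP):

```nginx
# Sketch: route a friendly local hostname to a service on this box.
# "jellyfin.home.lan" and port 8096 are hypothetical examples.
server {
    listen 80;
    server_name jellyfin.home.lan;

    location / {
        proxy_pass http://127.0.0.1:8096;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

The proxy mostly buys you nice hostnames instead of ip:port, which is why it's optional.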


Went through school in the early 2000s in US. We were taught cursive (script), but I don't think I've used it since school.

Seems odd, in hindsight, to teach that hand-written prose uses a different set of symbols than typed text.


How fast can you write in cursive vs non-cursive? I am much slower in non-cursive when writing.

The only issue is that my cursive is pretty lousy looking.


I'd guess roughly the same, but that's comparing print in chicken-scratch handwriting (which is my norm) against under-practiced cursive. I'd suspect that after using cursive a bit I would speed up, similar to using home-row typing vs hunt-and-peck or whatever they call it.

My phone would transcribe even quicker than that, though, which would probably be my go-to instead of handwriting.


I find it hard to speak into my phone while I am in a live meeting and trying to summarize my instant thoughts for paper or my Remarkable :)

Similar boat...I'd also like to swap off Plex but a few of my less techie friends use it and I'm worried about the compatibility/ease of setup of Jellyfin on their devices.

Thought about running both in parallel but that seems like a waste. Think I just need a migration day eventually


I've never run Plex, or Jellyfin for that matter. There's a video share on my NAS.

I point Infuse on Apple TV 4Ks at it. It works, and cleanly.

Downsides: you have to pay for Infuse Pro to play some formats and deal with some audio codecs. It's IIRC $17 for a year, though, so pretty reasonable for continued development. Your non-technical friends and family can't do the initial setup themselves (the share goes over Tailscale; they can all use the same limited account on your plan), but anyone I'm going to let do this can ship me their Apple TV 4K and let me set it up for them.


It's just weird that it's this complicated. We should get a static IP from our ISP. We should use standard open-source streaming mechanisms. It should use basic video codecs.

Lately I've been working towards just using a webserver to host video files. Sure, it's not adaptive, but for goodness sakes it's simple.
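As a sketch of how little that takes, here's a static-file NGINX config that does it; the path and port are hypothetical, and there's no transcoding or adaptive bitrate, just files over HTTP with the browser's built-in player doing the rest:

```nginx
# Sketch: expose a media directory as plain static files.
# /srv/media and port 8080 are made-up examples.
server {
    listen 8080;
    root /srv/media;
    autoindex on;   # a bare directory listing doubles as the "UI"
}
```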


Well, my ISP doesn't support IPv6 for home use, at all. My IPv4 is essentially static - I can't recall the last time it changed - but while Tailscale is a single point of failure for my home network security, it's also one that I can expect to be updated faster than practically any other package.

VLC will play anything I throw at it, but it's not going to go and fetch all the metadata for me and present it in a nice way to the non-technically-oriented users around.


Yeah, I'm in the same boat. I just have an old laptop hooked up to the TV, which can access a shared folder on my main computer that has all the media. I control it with a wireless mouse and get an actually fast UI in a web browser, instead of the usability nightmare that is a smart TV UI. This is all Windows, though. I guess it's possible to have Linux access a Windows shared folder; I've been meaning to look into it for a while.

> I guess it's possible to have Linux access a Windows shared folder

It is, and while it's not hard, this was really my first experience running Linux in a long while, and boy do I now understand why people did not like systemd when it came out. It's not bad, per se, but it's not just "stick a line in /etc/fstab". However, even Copilot can put together a couple of scripts for you.
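The fstab route does still work on systemd distros if you add the automount options; this is a sketch with hypothetical hostname, share, and mountpoint (needs cifs-utils installed, and a credentials file so the password isn't in fstab):

```
# Sketch /etc/fstab entry for mounting a Windows (SMB) share on Linux.
# //desktop/media, /mnt/media, and /etc/smb-creds are made-up examples.
//desktop/media  /mnt/media  cifs  credentials=/etc/smb-creds,uid=1000,iocharset=utf8,x-systemd.automount,_netdev  0  0
```

The x-systemd.automount option is what makes systemd mount it lazily on first access instead of failing at boot when the network isn't up yet.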


Jellyfin has been fine "around the house" but I don't know about remote access as I've not needed it.

Jellyfin + Infuse + Apple TV is basically bulletproof; Swiftfin as a client has been working fine too.


Yeah, nothing is as turnkey as Plex; hard to give that up if you're running a little streaming service for friends and family haha

Any $10/mo VPN solves this, and probably advertises it as a selling point.

Of course, then you're spending $10 to save $10....

I have the whole *arr stack set up with Plex, running in the US just fine, but that's for sure not for everyone and took a few headaches to get up and running.


>Of course, then you're spending $10 to save $10....

Most VPN subscriptions are around $5, whereas Netflix with ads costs $8, and $18 without ads. Even at $18, it's still not 4K, whereas you can easily pirate 4K versions with your VPN subscription.


Netflix won't even sell me 4K content at any price, because I don't use any of their approved spyware operating systems. But the Torrent Store will.

Appreciate the reality check. Mullvad has been a bill I don't think about twice when it comes around, and I cancelled streaming services years ago.

To your point though, as I'm running my plex server on an old ~midrange laptop, 4K is pretty rough for me to stream as well. I'm sure better hardware fixes this, but that's higher cost. YMMV based on what hardware you have on hand to repurpose


>To your point though, as I'm running my plex server on an old ~midrange laptop, 4K is pretty rough for me to stream as well.

Unless you're doing reencodes processing power shouldn't matter. You can serve 4K video on a 2010s router if you wanted to. If you're doing reencodes, why bother? Download an encode that's appropriate for how you're watching it. 4K for the big screen and 1080p for mobile. Skip reencoding altogether.


Huh...maybe I'm just doing something wrong then. I'll re-examine tonight, thanks for the tip!

Plex is as turnkey as it gets and manually adding content isn't that bad tbh.

>Apple is sitting out the AI race

Then why does my M4 run models at tok/s rates that similarly priced GPUs cannot match?


From TFA:

  For Private Cloud Compute specifically, the system is described as underpowered and perhaps more trouble than it’s worth. Updating the software is apparently trickier and takes time, and more fundamentally the chips (believed to comprise right now of modified M2 Ultra processors) are not powerful enough to run the latest frontier models like Gemini, which the new Siri will be based on.

> M2 Ultra processors ... are not powerful enough to run the latest frontier models

The local AI community would strongly disagree with that assessment. They may not be able to run them with low latency for interactive use, and that is most likely the real blocker, but they have strong compute-per-watt compared to Nvidia GPUs.


You cropped the part of the quote that is relevant:

> like Gemini, which the new Siri will be based on.

The local AI community isn't evaluating the internal Gemini models. Apple's Private Compute hardware is specifically competing against Google's TPU hardware, which is a foregone conclusion if you've seen the inference economics. The money and electricity wasted on Mac inference at that scale isn't even attractive to Apple.


iPhones can run Uber app but nobody would claim Apple is in the ride sharing business.

No, but they are in the "Device that runs apps" business right? Just like they're looking to corner the "Device that runs models locally" business by focusing on onboard inference.

Gains in model performance aren't exactly cheap, and once one frontier model figures something out, the rest seem to copy it quickly. Let them figure out what works and what doesn't, then put the "Apple" touch on it, all while putting your devices in everyone's hands. That's been their business model for years.


Did they not just see crazy sales on Mac Minis the second users figured out it meant they could give an AI access to blue-bubble text messages?
