Hacker News | bnchrch's comments

1. Proprietary Data (YouTube, Docs, Gmail, cloud logs, Waymo, website analytics, ads, search; the list is huge)

2. Commercial Datacenters (they're ahead, at least)

3. Chip production (Google is manufacturing proprietary chips)

4. Consumer OS (Chrome, Android)

5. Consumer Hardware (Pixel)

Basically, Google has access to data that OpenAI never will, can push costs lower than OpenAI can, and is already a leader in all the places where OpenAI will need massive capex to catch up.


You can't train LLMs on proprietary data, at least not if you want to make that LLM as accessible as Gemini. Otherwise random people can ask it for your home address.

So it matters less than one would think. Also, ChatGPT can already do 'internet search' as a tool, so it effectively has access to, say, Google Maps' POI database of SMBs.

ChatGPT also collects a lot of proprietary data of its own. People use it as a Google replacement.


>You can't train LLMs on proprietary data, at least not if you want to make that LLM as accessible as Gemini. Otherwise random people can ask it for your home address.

If this is your only criterion, I think you misunderstand what proprietary data is and the ways companies can mitigate the issue at the inference stage.


That's a large reason, for sure!

I'd layer in a few more:

* Largely stable and unchanged language throughout its whole existence

* Authorship is largely by senior engineers, so the code you train on is high quality

* Relatively low number of abstractions in comparison to other languages, meaning there are fewer ways to do one thing

* A functional programming style pushes down hidden state, which lowers the complexity of understanding how a slice of a system works, and the likelihood that you introduce a bug


This is an understated and often unsaid take, but it's 100% right.

It's also why you'll hear many engineers wax poetic about functional programming.

Most start by thinking about: what does the system do for me?

Then move on to: what actions does the system perform? Where should files and folders live? Where should services live?

And stop there.

But the real key is to think about how data moves through your system.

Then you learn to simplify.

No classes, no hidden state, just simple functions with an input and an output, and nothing else.

Long story short: think about your system in terms of data, and keep your code simple enough to support that kind of thinking.
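A minimal sketch of that style in TypeScript (all the names here, like `price` and `process`, are invented for illustration): data flows through small pure functions, each taking an input and returning an output, with no classes and no hidden state.

```typescript
// Hypothetical order pipeline: data in, data out, nothing hidden.
type Order = { id: string; amounts: number[] };
type PricedOrder = Order & { total: number };

// Each step is a pure function: the same input always yields the same output.
const price = (o: Order): PricedOrder => ({
  ...o,
  total: o.amounts.reduce((sum, a) => sum + a, 0),
});

const applyDiscount = (o: PricedOrder, rate: number): PricedOrder => ({
  ...o,
  total: o.total * (1 - rate),
});

// The whole "system" is just composition: Order -> PricedOrder.
const process = (o: Order): PricedOrder => applyDiscount(price(o), 0.1);
```

Because every step is a plain function of its arguments, you can understand or test any slice of the pipeline in isolation, which is exactly the complexity reduction described above.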


It's a real issue in North America.

SMS, and as a result iMessage, is the dominant text-based chat.

iPhones have become the default smartphone, and the iPhone is a status symbol compared to Android.

Mac vs Windows is similar on the laptop front.

Which means if you're an Android user in a relatively average social group:

* You will get left out of group messages

* You will be starting on a back foot in the dating scene

On top of that, you won't be able to answer messages from friends on your laptop, because again, SMS is dominant, not WhatsApp.

Now don't shoot the messenger here. I don't like it either, but this is the social/technical reality in NA at the moment.

(sigh: receiving downvotes)


> iPhones have become the default smartphone, and the iPhone is a status symbol compared to Android.

It does not function as a status symbol in the West. It's not a big deal to get one if you really want to and live in a developed country. People in Asian countries making 1/8th of what their American counterparts make can afford iPhones. Someone making minimum wage in Germany can buy one with about 3-4 months of saved disposable income. In the States they'll throw one after you on credit without looking at you twice. It's only a status symbol if you want to set yourself apart from someone living in Zimbabwe... oh wait, they also have lots of iPhone users. From whom exactly? Afghans?

Honestly, if the bar for status symbols is that low, you should sooner consider exercise and good dietary habits. These days, in many Western countries, that will do many orders of magnitude more for how people perceive you and your dating life. Certainly more than which flavour of annoying chiming piece of shit you bought.



I might have a bridge to sell you.

What says a lot is that you had to dredge up Reddit posts as much as seven years old, on which the replies still overwhelmingly call the idea silly. This smells like an attempt to manufacture consent, but it'd be pretty low-effort even for that.

As a rule, if something sounds stupid to you, it will probably sound just as silly to most people you should give a damn about. Certainly don't let some posts that look like the lowest-effort FUD imaginable tell you what other people think.


I may have something to teach you about indicators, averages, and population samples/biases.

We're not debating majority opinion here, just that people exist who have that bias/perception, and what it leads to.

People exist who judge and exclude based on whether you have an Android.

I'm sure the reverse exists too.

I'm also sure the former is more common than the latter.

But I have no idea how large that population is.

Just like I'm not in that population.


> I may have something to teach you about indicators, averages, and population samples/biases.

You didn't sample. You filtered. You used a search engine to zero in on a couple dozen people in a population of 350 million, then suggested that the mere fact that at least some exist means the opinion is held by enough people to matter, when in fact it probably isn't, even going by the references you selected yourself.

That you throw around some statistics lingo after all that is hysterically funny to me.

Scientific rigor never was the bar to convincing me, but since you brought it up yourself, be my guest.

> People exist who judge and exclude based on whether you have an Android.

We're not debating whether such people exist; we're interested in what the experience of someone using an Android phone is likely to be. Remember that the original claim was "Which means if you're an Android user in a relatively average social group: [the following will happen]"

This conversation is very much about average/majority opinion and has been from the beginning. I might let you weaken that to "an Android user is likely to have at least occasional bad experiences in some social groups", if you're willing to at least provide evidence to support that much.

After all, what could be your purpose in bringing something up that has no relevance to almost anyone? You'd just be wasting both of our time.


> * You will be starting on a back foot in the dating scene

Perhaps you should be focusing on losing weight instead of blaming the color of your text messages, lmao.


Fit, happy, married and have the cognitive ability to not conflate the message with the messenger.


Sounds like a good way to filter out assholes. Anyone who cares what phone you use in this way is someone you don't want in your life.


Maybe, but let me pose a mental model that a lot of NA iPhone users have.

For a long time, if you were on iOS and added an Android user to your group chat, all threading broke. It was no longer a group chat, just a bunch of out-of-band messages.

So iOS users naturally started leaving the Android user out of the chat. They would text their five friends on iOS in one group to make plans, then text their Android friend separately once plans were made.

I believe this is mostly fixed in the latest iOS, but that habit is still very much there in iOS users today.

Anecdotally I did just experience a group chat of 4 iOS users this year that was very active, then died when one person switched to Android.


React added some poor abstractions over the last decade. Looking at you, hooks and effects.

But it's far from the worst.

It was the first framework to put together JSX, a functional way of defining components, and simplified state. This was a monumental improvement, and as a result it earned mass adoption.
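The core of that "functional way of defining components" can be sketched in plain TypeScript without React itself (this is a hypothetical illustration of the idea, not React's actual API, which returns virtual DOM elements rather than strings): a component is just a pure function from props to markup.

```typescript
// Hypothetical sketch: a UI component as a pure function from props to markup,
// with no classes and no hidden state.
type GreetingProps = { name: string; unread: number };

const Greeting = ({ name, unread }: GreetingProps): string =>
  `<div>Hello, ${name}! You have ${unread} unread message${unread === 1 ? "" : "s"}.</div>`;

// "Re-rendering" is just calling the function again with new props.
const html = Greeting({ name: "Ada", unread: 2 });
```

Because output depends only on props, the view is trivially predictable and testable, which is a big part of why the model caught on.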

As a result, it's the framework with a community moat that isn't going to crumble until someone else breaks ground the way React did.

Sure, some alternatives could be considered "better", but they're all better only through incremental improvements to frontend engineering.

None of which is substantial enough to unseat the king.


Copyright is about as dead as any party you happen to walk into.


Oh, it is very much alive if you're a taxpayer and not a corporation.


Well said. This sums up my own feeling. I joined this craft and love this craft for the simple ability to build beautiful and useful things.

This new world makes me more effective at it.

And this new world doesn’t prevent me from crafting elegant architectures either.


Wait 5 years and your skills are gone.


I don't think 5 years is necessary. I think after two years of this agentic orchestration, if you rarely touch code yourself, your skills will degrade to the point that you can't write anything non-trivial without assistance.


Depends how long you've done it, and how much the landscape has changed since then. I can still hop back into SQL and it all comes back to me though I haven't done it regularly at all for nearly 10 years.

In the web front-end world I'd be pretty much a newbie. I don't know any of the modern frameworks, everything I've used is legacy and obsolete today. I'd ramp up quicker than a new junior because I understand all the concepts of HTTP and how the web works, but I don't know any of the modern tooling.


How much do you think Linus Torvalds has coded over the last decade? Why is he still able to do his job?


His job is reviewing.



It won't be 5 years, it'll be less than a year. When you don't exercise, your muscles atrophy. It's the same for any other skill. I used to speak conversational French in college; fast forward 10 years, and my understanding is no different from a casual "Emily in Paris" fan's.

It'll be the same here; the only question is whether those who do exercise the skill will be able to command better salaries. I think this is possible, but not in the current political climate.


In 5 years coding skills will matter as much as being able to operate an elevator. (sadly)


What infrastructure has gone through the last 15 years would like a word.

Half the people I work with can't build imperative jQuery interfaces. So what, I guess; I can't code assembly.


A programming language is still an additional language, with all the benefits of being multilingual.

AI will kill that.


QQQ is up 20% over the last year.

GOOG is up 70% over the last year.

"Pummelled" seems extremely sensational...


Some 7-15% down in a trading day is a lot for an established corporation. I consider Salesforce dropping 7% without some obvious trigger to be at least somewhat newsworthy, and from the first sentences in the article I get the impression that The Economist is sitting on more examples like that.

A lot of people are tense about the AI venture ouroboros and what it might mean for future software, especially people with money and little to no experience actually deploying software.

Edit: At the time I saw some memes claiming that roughly 1.5 trillion dollars in market value had evaporated, which if true is not a small sum.


GOOG is now also an AI company, so it's not exactly a fair comp; it doesn't fit neatly into the "software" bucket.

MSFT is only up 3% over the last year


Google makes very little money selling enterprise software


You didn't read the article. The first graph plots Workday, Salesforce, SAP, and ServiceNow. Google isn't mentioned.


Maybe they should've said ERP stocks are getting "pummelled".


It basically does

> The value of listed American enterprise-software companies is down by 10% over the past year.


Swift has all the things I want in a language:

- Strong Typing

- Great Performance

- Actor Model Concurrency [0]

- Modern Ergonomics

- Corporate Backing

- Performance

- Functional Style

- LLMs perform well with it [1]

- Usable across iOS, Android, the browser, and the server [2][3]

The only thing it's missing is adoption outside the iOS space.

I'm not sure it will be able to make that leap, but the ingredients are there.

If it does I'd be happy to make it my primary language.

----------------

[0] https://www.hackingwithswift.com/quick-start/concurrency/wha...

[1] https://github.com/Tencent-Hunyuan/AutoCodeBenchmark/blob/ma...

[2] https://www.swift.org/blog/nightly-swift-sdk-for-android/

[3] https://vapor.codes/


> - Modern Ergonomics

What does this even mean? Modern Swift looks like a haphazard mishmash of conflicting features where every problem is solved by "just one more keyword bro". In 2024 it had 217 keywords: https://x.com/jacobtechtavern/status/1841251621004538183 and that was reduced slightly, to 203, in 2025: https://x.com/jacobtechtavern/status/1962242782405267617

According to Lattner, they never even had time to design anything due to time pressure from Apple [1]. So Swift ended up with a type system that the compiler can't even check and that is impossible to fix. The compiler routinely just gives up and complains on even the most trivial code.

[1] https://youtu.be/ovYbgbrQ-v8?si=tAko6n88PmpWrzvO&t=1400

--- start quote ---

Swift has turned into a gigantic super complicated bag of special cases, special syntax, special stuff...

We had a ton of users, it had a ton of internal technical debt... the whole team was behind, and instead of fixing the core, what the team did is they started adding all these special cases.

--- end quote ---


It’s not necessary to use all or even most of those features, though, and so a nice balance of expressiveness and functionality is possible. I’ll take it over dying on weird hills in language design in pursuit of ideological purity or mountains of ceremonial code and unavoidably ugly syntax.


> mountains of ceremonial code and unavoidably ugly syntax.

you... just described Swift, really :)

Also, all those features exist even if you don't use them. Which makes the language complex and cumbersome, and makes its compiler slow, complex, and brittle. A language shouldn't be a collection of one-off edge cases, and this has nothing to do with ideological purity.


I dunno, with a handful of exceptions I'm still mostly writing Swift the same way I did 5+ years ago. Unless you're using SwiftUI, new features haven't changed a whole lot in real world use.

Whatever the case, I don't enjoy writing languages more obsessed with theory or design purity (like Kotlin) as much.


I agree the language itself has gotten more complex, but for day-to-day productivity in terms of actually using it to write code, I don't think it makes a difference.

I've found writing Swift code very pleasant, but I've been doing it for ten years, so that helps I suppose. The biggest productivity impact for day-to-day use for me in the last few years has been the new concurrency model.


You listed “corporate backing” as a good thing and “no adoption outside Apple ecosystem” as a pain point. Why would it get adopted outside Apple ecosystem if Apple decides what happens to it?


Xcode isn't modern ergonomics, and neither is slow compilation (which also counts against "Great Performance", and its double, "Performance").


Xcode is also definitely absolutely not required for using Swift.


I agree, but it's already been 15 years since it was released so I'm losing hope..


Related HN Discussion on Genie:

https://news.ycombinator.com/item?id=46812933

