1. Proprietary Data (YouTube, Docs, Gmail, cloud logs, Waymo, website analytics, ads, Search; the list is huge)
2. Commercial Datacenters (they're ahead, at least)
3. Chip production (Google is manufacturing proprietary chips)
4. Consumer OS (Chrome, Android)
5. Consumer Hardware (Pixel)
Basically, Google has access to data that OpenAI will never have access to, can push costs below what OpenAI can, and is already a leader in all the places where OpenAI would need massive capex to catch up.
You can't train LLMs on proprietary data, at least not if you want to make that LLM as accessible as Gemini. Otherwise random people can ask it your home address.
So it matters less than one would think. Also, ChatGPT can already do "internet search" as a tool, so it already has access to, say, Google Maps' POI database of SMBs.
And ChatGPT also gets a lot of proprietary data of its own as well. People use it as a Google replacement.
>You can't train LLMs on proprietary data, at least not if you want to make that LLM as accessible as Gemini. Otherwise random people can ask it your home address.
If this is your only criterion, I think you have a misunderstanding of what proprietary data is and of the ways companies can mitigate the situation at the inference stage.
* Largely stable and unchanged language throughout its whole existence
* Authorship is largely senior engineers, so the code you train on is high quality
* Relatively few abstractions in comparison to other languages, meaning there are fewer ways to do one thing
* Functional programming style pushes down hidden state, which lowers the complexity of understanding how a slice of a system works, and the likelihood that you introduce a bug
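The "pushed-down state" point can be made concrete with a toy sketch. The parent doesn't name the language, so this is TypeScript purely for illustration, and `Counter`/`record` are made-up names:

```typescript
// Hidden state: the result depends on when (and on which instance) you call it.
class Counter {
  private hits = 0;
  record(): number {
    this.hits += 1; // mutation hidden inside the object
    return this.hits;
  }
}

// Pushed-down state: the caller owns the state, so every call is predictable
// from its arguments alone and trivially testable.
const record = (hits: number): number => hits + 1;

const after = record(record(record(0))); // always 3, regardless of call site
```

The pure version has no "when was this last called?" question to answer, which is the complexity reduction the list item is pointing at.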
> iPhones have become the default smartphone, and is a status symbol compared to Android.
It does not function as a status symbol in the West. It's not a big deal to get one if you really want to and live in a developed country. People in Asian countries making 1/8th of what their American counterparts make can afford iPhones. Someone making minimum wage in Germany can buy one with about 3-4 months' worth of saved disposable income. In the States they'll throw one at you on credit without looking at you twice. It's only a status symbol if you want to set yourself apart from someone living in Zimbabwe... oh wait, they also have lots of iPhone users. From whom, exactly? Afghans?
Honestly, if the bar for status symbols is that low, you should sooner consider exercise and good dietary habits. These days, in many Western countries, that will do orders of magnitude more for how people perceive you and for your dating life. Certainly more than what flavour of annoying chiming piece of shit you bought.
What says a lot is that you had to dredge up Reddit posts up to seven years old, on which the replies still overwhelmingly call the idea silly. This smells like an attempt to manufacture consent, but it'd be pretty low effort even for that.
As a rule, if something sounds stupid to you, it will probably be just as silly to most people you should give a damn about. Certainly don't let some posts that look like the lowest-effort FUD imaginable tell you what other people think.
> I may have something to teach you about indicators, averages, and population samples/biases.
You didn't sample. You filtered. You used a search engine to zero in on a couple dozen people out of a population of 350 million, then suggested that the mere fact that at least some exist means it's an opinion held by enough people to matter, when in fact it probably isn't, even going by the references you selected yourself.
That you throw around some statistics lingo after all that is hysterically funny to me.
Scientific rigor never was the bar to convincing me, but since you brought it up yourself, be my guest.
> People exist that judge and exclude based on if you have an Android.
We're not debating whether such people exist; we're interested in what the experience of someone using an Android phone is likely to be. Remember that the original claim was "Which means if your an Android user in a relatively average social group: [the following will happen]"
This conversation is very much about average/majority opinion and has been from the beginning. I might let you weaken that to "an Android user is likely to have at least occasional bad experiences in some social groups", if you're willing to at least provide evidence to support that much.
After all, what could be your purpose in bringing something up that has no relevance to almost anyone? You'd just be wasting both of our time.
Maybe, but let me pose a mental model that a lot of NA iPhone users have.
For a long time, if you were on iOS and added an Android user to your group chat, all threading broke. It was no longer a group chat, just a bunch of out-of-band messages.
So iOS users naturally started leaving the Android user out of the chat. They would text their five friends on iOS in one group to make plans, then text their Android friend separately once plans were made.
I believe this is mostly fixed in the latest iOS, but that habit is still very much there among iOS users today.
Anecdotally, just this year I watched a very active group chat of four iOS users die when one person switched to Android.
React added some poor abstractions over the last decade. Looking at you, hooks and effects.
But it's far from the worst.
It was the first framework to put together JSX, a functional way of defining components, and simplified state management. This was a monumental improvement, and as a result it earned mass adoption.
It's now the framework with a community moat that is not going to crumble until someone else breaks ground the way it did.
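The model being praised, markup as the return value of a plain function of state, can be sketched without any React dependency (the names and the `toggle` helper here are illustrative, not React's actual API):

```typescript
type Todo = { text: string; done: boolean };

// A "component": state in, markup out. No hidden instance state to track.
const TodoList = (todos: Todo[]): string =>
  `<ul>${todos.map(t => `<li>${t.done ? "✓ " : ""}${t.text}</li>`).join("")}</ul>`;

// A state update produces a new state; the view is simply re-rendered from it,
// instead of mutating the DOM by hand.
const toggle = (todos: Todo[], i: number): Todo[] =>
  todos.map((t, j) => (j === i ? { ...t, done: !t.done } : t));

const v1 = TodoList([{ text: "ship it", done: false }]);
const v2 = TodoList(toggle([{ text: "ship it", done: false }], 0));
```

Real React adds JSX syntax, diffing, and hooks on top, but "re-run a pure function of state" is the core idea that earned the adoption described above.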
Sure, some of these could be considered "better", but they're all better only through incremental improvements to frontend engineering.
None of them is substantial enough to unseat the king.
I don't think 5 years is necessary. I think that after two years of this agentic orchestration, if you rarely touch code yourself, your skills will degrade to the point where you won't be able to write anything non-trivial without assistance.
Depends how long you've done it, and how much the landscape has changed since then. I can still hop back into SQL and it all comes back to me though I haven't done it regularly at all for nearly 10 years.
In the web front-end world I'd be pretty much a newbie. I don't know any of the modern frameworks, everything I've used is legacy and obsolete today. I'd ramp up quicker than a new junior because I understand all the concepts of HTTP and how the web works, but I don't know any of the modern tooling.
It won't be 5 years; it'll be less than a year. When you don't exercise, your muscles atrophy, and it's the same for any other skill. I used to speak conversational French in college; fast forward 10 years, and my understanding is no different from that of a casual "Emily in Paris" fan.
It'll be the same here; the only question is whether those who do exercise the skill will be able to command better salaries. I think this is possible, but not in the current political climate.
Some 7-15% down in a trading day is a lot for an established corporation. I consider Salesforce dropping 7% without some obvious trigger to be at least somewhat newsworthy, and from the first sentences in the article I get the impression that The Economist is sitting on more examples like that.
A lot of people are tense about the AI venture ouroboros and what it might mean for future software, especially people with money and little to no experience actually deploying software.
Edit: At the time I saw some memes claiming that roughly 1.5 trillion dollars in market value had evaporated, which if true is not a small sum.
According to Lattner, they never even had the time to design anything due to time pressure from Apple [1]. So Swift ended up with a type system that the compiler can't tractably check and that is now impossible to fix, and the compiler routinely just gives up and complains about even the most trivial code.
Swift has turned into a gigantic super complicated bag of special cases, special syntax, special stuff...
We had a ton of users, it had a ton of internal technical debt... the whole team was behind, and instead of fixing the core, what the team did is they started adding all these special cases.
It’s not necessary to use all or even most of those features, though, and so a nice balance of expressiveness and functionality is possible. I’ll take it over dying on weird hills in language design in pursuit of ideological purity or mountains of ceremonial code and unavoidably ugly syntax.
> mountains of ceremonial code and unavoidably ugly syntax.
you... just described Swift, really :)
Also, all those features exist even if you don't use them, which makes the language complex and cumbersome, and makes its compiler slow, complex, and brittle. A language shouldn't be a collection of one-off edge cases, and this has nothing to do with ideological purity.
I dunno, with a handful of exceptions I'm still mostly writing Swift the same way I did 5+ years ago. Unless you're using SwiftUI, new features haven't changed a whole lot in real world use.
Whatever the case, I don't enjoy writing languages more obsessed with theory or design purity (like Kotlin) as much.
I agree the language itself has gotten more complex, but for day-to-day productivity in terms of actually using it to write code, I don't think it makes a difference.
I've found writing Swift code very pleasant, but I've been doing it for ten years, so that helps I suppose. The biggest productivity impact for day-to-day use for me in the last few years has been the new concurrency model.
You listed “corporate backing” as a good thing and “no adoption outside the Apple ecosystem” as a pain point. Why would it get adopted outside the Apple ecosystem if Apple alone decides what happens to it?