
I’m guessing what they mean is that the valuation is so inflated at this point that the high dollar amount more reflects the likelihood of acquisition or IPO in the near term rather than some sort of substantive demonstration of confidence in the company and its founders.


> Only 5.52% of DAUs use more than one Space regularly.

Wow! That's the main feature for me.


These statements about DAUs are the most Silicon-Valley-brained statements I've heard this month... As if features only have value when a plurality of people use them. "We need engagement!"


yeah and:

> By contrast, core features in Dia, like chatting with tabs and personalization features, are used by 40% and 37% of DAUs respectively

well of course. these are likely some of the few major features of Dia, which is still in private beta.


No question that Ive is a legend, but I do think the fall of Humane (also ex-Apple) and the challenges at Meta, Apple, and Google in terms of VR/AR adoption (Meta Ray Ban, Apple Vision, Google Glasses and the new thing) are instructive here. The $6.5B almost feels like the largest ever acquihire.


> No question that Ive is a legend

Not sure why he deserves to be a legend, to be honest, but yes, he is a legend.

He did a good job, but those small and minimalistic designs were only possible because of the efforts of entire teams of engineers, of which the public never heard anything.


And many of those designs were made at Ive's behest, against the wishes of entire teams of engineers. I feel like we have his "courage" to blame for the Butterfly keyboard, terrible Mac thermals and the lack of ports on "Pro" computers.


Don't forget the Apple Mouse with the lightning port under the mouse so you can't use it while it charges. It's still the only Apple product with a design that makes me physically cringe.

I also find it awkward and uncomfortable to use, but that might just be me.


I love the Apple mouse. Been using it for 10ish years. But yeah, it's an absolute ball ache when it loses battery and you really need a mouse


When I had that Apple mouse, I kept around a separate wired mouse to use when it inevitably ran out of charge while I was in the middle of work. I won't just stop what I am doing just because Apple wants me to not use a mouse with the cord plugged in.


So the power button on the bottom of Mac Mini is fine?


I'm not familiar with the Mac Mini. I didn't know about that.


I was just going to say…the mouse has to go on its back to be charged lmao.


Ive did good designs when Jobs kept him in check. Once Jobs was gone he messed up a whole generation of MacBooks. Things got much better after his departure.


I think after Jobs, Ive did all sorts of things just to justify his presence at Apple. Hence trying to make an already well-designed product even more "well-designed", but to his terms. And that's when it started turning to shit.


For almost the two decades before, he made great designs. I have always felt it went downhill after Jobs was no longer there to provide a counterforce to Ive's design tendencies. It's like taking one of John or Paul away.


The thermals were all Intel’s fault. My 2019 MacBook from work is an oven. I can’t tell whether my smaller m4 max is turned on by touching it.


Blaming Intel is a poor excuse. Apple could have done some actual design and built a laptop around the hardware they had. But they didn't want to. Instead, they ignored the reality, stuck to the flawed design, and shipped mediocre laptops several years in a row.


I agree, but I think they didn't care. For some years, some Apple execs believed that the iPad was going to replace the Mac. After that they knew that the Apple Silicon Mac was nearby, so they probably didn't want to make an investment in a 'legacy' platform. Did suck for all the people who bought one.


Perhaps, but pretty much every high performance Intel laptop between 2017 and 2023 is exactly the same unless it's in a heavy, enormous and unpleasantly loud gaming chassis. Supposedly the Core Ultra Series 2 are an improvement but I haven't tried one yet.

For a while, you could get the thermals a bit more tolerable by undervolting them, but then the plundervolt mitigations blocked that.

(Typing this comment from a Lenovo X1 Extreme sitting on a cooling pad, sat next to an X1 Carbon that we can't use because it throttles too much. :)
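On the monitoring side, package power is still easy to watch even with undervolting locked out: it's just the delta of the RAPL energy counter over time. A minimal sketch (the readings below are made-up numbers, not from my machine):

```python
def rapl_watts(energy_uj_t0, energy_uj_t1, dt_seconds):
    """Package power from two RAPL energy-counter readings (microjoules).
    On Linux the counter lives at
    /sys/class/powercap/intel-rapl:0/energy_uj (root-readable)."""
    return (energy_uj_t1 - energy_uj_t0) / dt_seconds / 1_000_000

# Hypothetical readings one second apart: 45 J consumed -> 45 W package power.
print(rapl_watts(1_000_000_000, 1_045_000_000, 1.0))  # 45.0
```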


Apple is generally really good at trying to keep their machines silent. When they originally transitioned to Intel, their Core 2 Duo laptops were both cheaper and quieter than the competition. As a Linux user, that's one feature from Apple I'd like most manufacturers to copy.

Regarding your X1, tweaking Linux kernel parameters and downvolting a bit can work wonders in terms of reaching a pleasant heat-to-performance ratio. Obviously, Lenovo should have taken care of this. However, they release so many different machines that it's hard for them to pay attention to details.


It’s a company laptop that runs Windows, and the newer BIOSes now block undervolting because of the plundervolt mitigations.

I replaced the thermal paste with some of that PTM stuff which helped a bit, but not enough. I also found that for some reason it tends to BDPROCHOT-throttle when powered through the official Thunderbolt 4 Workstation Dock, even though it’s meant to be 230W and provides power separately to the USB port - but using the standalone AC adapter when docked fixes that.

Ultimately, until there are some decent X86-64 laptops released, the choice is between slow, thin and quiet vs less slow, but big, heavy and noisy. AMD is a bit better than Intel but still weak on mobile and nowhere near as good as the current Apple offerings.

On another note, why are PC manufacturers still putting fan intakes on the bottom? Maybe it's theoretically more efficient, but tell that to my users who always do things like resting their laptop on a book then wondering why their Zoom screen sharing goes jittery.


Intel has never been good at thermals.


I've never had an Intel laptop work well in the efficiency and thermal department, Apple or not. I used to blame Apple too, but seeing the difference, it's hard to argue who the main culprit was. Can't design around a bad foundation.


Pentium M's were magical when they came out.


Apple had the design ready for an Intel chip that didn't arrive. Rather than revisiting their design, they opted to just chuck the chip into a design that couldn't accommodate its thermal characteristics.


I spent way too much time figuring out that around 53W is the maximum the last Intel MBP can sustain over longer periods before the VRM (which converts power for the CPU) poops out and throttles you.
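Roughly, what seems to matter is the rolling average of package power rather than the instantaneous draw: short bursts above the budget are fine, but once the windowed average climbs past what the VRM can sustain, you get throttled. A toy sketch (window size and readings are hypothetical; only the ~53 W figure comes from my measurements):

```python
from collections import deque

def rolling_watts(samples, window=30):
    """Rolling average of package power over the last `window` readings."""
    buf, out = deque(maxlen=window), []
    for w in samples:
        buf.append(w)
        out.append(sum(buf) / len(buf))
    return out

# Hypothetical 1 Hz readings: a 90 W burst is fine for a few seconds,
# but the sustained average has to fall back under ~53 W to avoid throttling.
readings = [90] * 10 + [45] * 50
averages = rolling_watts(readings)
```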


Your 2019 MacBook also uses a different chassis, designed by Jony Ive. Apple knew it throttled the chips they used but shipped it anyway, presumably because Ive liked his thinness even when it resulted in a bendgate.

You'll note that Macbooks don't quite look the same after Ive left and his influence went away.


I don’t actually know. Last MBP I had was circa 2017 or so.

How are they different ?


Beefier. Bulkier. A quick Google search says the Intel 16" was 4.3 lbs whereas the M4 16" is 4.7 lbs. Not a big difference, you say, but 1) it is going in the opposite direction, where the newer product is bulkier, and 2) imagine the years of thinness that would have been forced under a different regime.


I wonder how much of that is battery.


I had the butterfly keyboard for 5 years yet I didn't have a single problem with it. And I'm a long time mechanical keyboards user. What is all the hate about?


Many people (more than the average rate for the prior generations) _did_ have problems. Perhaps more importantly, the only way to address those problems when they arose was to replace not only the keyboard itself but the entire top case of the machine due to the way the parts were integrated. This process costed many hundreds of dollars when the machines were out of warranty, and the company eventually acquiesced to social pressure and lawsuits by creating an extended warranty program.

That's not to say your situation is unique...there are probably many machines out there that have not had problems, including one owned by my wife. But there are also an unusually high number of machines that did.


> This process costed many hundreds of dollars

"Cost"

I'm a native English speaker and nobody told me this (and I didn't manage to pick it up) until I was nearly 40. "Cost"'s past tense is also "cost."

There's another, newer, largely fatuous, verbed "cost" that means "to calculate the cost of something." That's the one that gets used in the past tense ("the projects have all been costed.")

"I've costed a keyboard replacement for my computer, and the total is more than the computer cost in the first place."


That's luck on your side. I too own a butterfly keyboard, trouble free. But there were 50 other macs in the office I worked in that regularly had issues. They were unreliable as hell, and beyond the reliability issue, many people did not like the shorter travel distance (I didn't mind this at all myself).


I had the first generation on a MacBook 12" and had no issues at all. Then I got the second generation on a MacBook Pro (I think this was still without the dust seals) and it was one big misery. A small speck of dust would make a key feel bad or get stuck. I was so happy when I could finally get rid of the stupid device. Never had issues with Apple Scissor switches thereafter.


I was like you ... until one day.


I felt the same way when I used it. But recently I booted up an old laptop with the butterfly keys and I was like "ewwww" as soon as I started typing on them. They worked. But what we have now is more comfortable.


I'd get a particularly large molecule lodged under a key and then I couldn't press that key consistently anymore until I managed to flush it out. It was OK when it worked, but it didn't work enough.


Just look it up. It was a thing for years, to the point that Apple was basically forced to revert it.


I didn’t like the feel and mine failed after a year.


Other than reliability issues (which I never ran into), the butterfly keyboard is the best laptop keyboard I've ever used.


I absolutely despise comments like these, and you only see them on HN.

It's like saying great architects aren't great, it's the construction workers who should get the credit.


> It's like saying great architects aren't great

No, you made that up. It's like saying great architects are not the sole cause of the things they make.

> it's the construction workers who should get the credit

Don't you think the construction workers should get some of the credit?


You're most likely to get credit by being unique and irreplaceable. In other words, if the work would not have happened without you. If someone else could have been easily hired to do the work you contributed, and if in that case the work would have been largely indistinguishable from the work you did, then you're essentially fungible.

IMO you still deserve credit. And in fact you still get credit. But that credit comes in the form of monetary reward and (hopefully) recognition from your team and peers, rather than in the form of fame.

All of which… seems sensible to me? Hard to imagine it working otherwise. Interestingly, the movie industry has normalized "end credits" which play after a movie ends and list literally everyone involved, which is quite cool. But the effect is still the same: the people up top get 99.99% of the credit.

(Ofc the "system" is imperfect, and fame/credit can be gamed by good marketers. But it's also not a "system" that any one party invented, it's just sort of an organic economy of attention at work.)


I am not sure what you're trying to say here. I agree that the existing situation is the most likely one. So what? I am simply saying that even though it is the "obvious thing", it is unfair and unkind. Those two things are compatible, in fact they are the usual arrangement of things!

> Hard to imagine it working otherwise.

No it isn't! It's very easy to imagine crediting people in a different ratio than we happen to do now. You are seeing what it looks like - people mythologise their heroes, and then other people come in and say "they didn't do it all, you know". People are literally doing it, in front of you, in this thread. How can it be hard to imagine?


When I say "I can't imagine" or "it's hard to imagine" I don't mean that literally. Obviously in reality I can imagine and it's easy to imagine, as evidenced by my example of movie credits.

What I'm saying is that it's not realistic. Humans are wired to remember and share highly specific things, especially names. It's been like this since the dawn of time -- the Iliad is about Achilles, not all the nameless soldiers. So this seems to be the natural order of things, rather than something designed, or something easy to change. And it makes sense, because it's practical -- our memories are limited. You can put everyone's names in the credits, but that doesn't mean they'll be remembered and shared.


Yeah, let's also give credit to the building materials and mother nature. Let's give credit to the pedestrians who walked by the construction site every day and decided not to commit arson.

Brilliant logic. And no, the original comment wasn't saying "give the engineers some credit", it was saying the engineers deserve the credit instead of Ive.

Which is idiotic and common of smug, self-important programmers.


found the project manager


Ive wasn't great. Apple has only improved since he left. There, does it help if I say it more directly?


It's great that you know better than Steve Jobs. You have impressive self-esteem.


Comments like yours that completely dismiss any questioning of established "legends" seem more despicable to me. Can't we have an open discussion and a range of opinions?


In between great architects and construction workers there are structural engineers who have to work out how to turn the pretty designs into actual, workable plans. Those are the guys who should get most of the credit.


I agree somewhat, you can feel the tension on HN with respect to labor vs capital. Which is funny because the entire premise of YC is to infuse capital and get a huge leverage over bootstrappers.


It's a pretty common turn of phrase on "lefty" (Western, English, very online, progressive) parts of the internet. I've always found it silly because it takes some pretty interesting, nuanced problems (how do you give credit to the folks who executed Ive's vision, many of whom probably boldly innovated to create what they did? How do you realistically situate Ive's flaws given his aura?) and wrings the nuance out by polarizing the readers (you're either with labor or you're with capital, pick your side of the picket line!)

But then these days lefty and righty parts of the Western English-language internet are all polarized and beating on common enemies is part of their conversational language. I think for a while HN was small enough that it resisted this polarization but at its current size there's no escaping it.


HN isn't special here. There's conflict between people whose job is to make something look pretty and people whose job is to make it work in every industry.


Repeat after me: design isn't about making things pretty.


For Ive, it often was. Ever-thinner MBPs? Why, if not for appearance, given the weight didn't change. No ports on "Pro" computers? Why, if not that they bothered his aesthetic sensibilities. Charging your mouse disables its use because the port is on the bottom? Why, if not to hide the port for looks? He spent most of his time at Apple trying to make things pretty.

Your comment may be true for "design" in the abstract, but as someone who spent plenty of time studying design and architecture, let me assure you: many of the people I studied with who are now industry veterans never cared about much more than aesthetics, even in architecture, where engineering and building science are major factors. Again, sure, theoretically true for "design", but hardly true for Ive.


Just because you don't know the reasons doesn't mean there weren't any.


Certain people who call themselves designers could do with learning this. Bring back brutalism, I say!


Alternatively: Form follows function. Or: Good design takes into account the medium.

Many ways of saying it, or something very similar. If only these words translated into something beneficial in the minds of castle-in-the-sky designers.


> I absolutely despise comments like these, and you only see them on HN.

Unfortunately the progressives have been pushing the downplaying of powerful people quite hard for a long time under the guise of equality, so it’s more widespread than just HN. Even more unfortunately, equality is also one of the main ideas of communism. It’s how the government can get rid of dissenters and thus move power to itself. That’s why Marc Andreessen in the Lex Fridman podcast talked about how the government told them that they could give up their startup because it was already decided which companies would be allowed to operate. That’s not capitalism. And Marc knows it that’s why he felt he had to speak up.


"The takeover of io will provide OpenAI with about 55 hardware engineers, software developers and manufacturing experts"

6.5B / 55 = $118 million per engineer

not a cheap acquihire


Who now expect to be paid.


To quote from the article regarding Humane and the Rabbit r1 personal assistant device: “Those were very poor products,” said Ive, 58. “There has been an absence of new ways of thinking expressed in products.”


To quote myself: "Jony Ive made incredibly poor products his last years at Apple" - So his opinion of what constitutes a "poor product" is suspect (R1 and Humane were bad products but just because you can tell what is a bad product doesn't mean you can make a good one).


Hindsight is 20/20.

If they were so obviously bad at the time, how did they get to market?


That's a good question, VC pressure and hope for first mover advantage?

The Humane launch video features two founders who look like they were forced to participate by their hostage takers.


I don't know anything about Humane, but the Rabbit was a terrible product right from the start. It was viewed overwhelmingly negatively as soon as it was unveiled.


> If they were so obviously bad at the time, how did they get to market?

I'm not sure what "they" is here (Humane, Rabbit, or late-Ive-era Apple designs).

In all cases though there were plenty of people sounding the alarm. Both Humane and Rabbit were made fun of (wasn't it in Humane's demo that the AI was completely wrong guessing the number of almonds, or the calories?). As for Apple products it was a common refrain that they were being made thin at the cost of ports/cooling/etc. How did Apple keep doubling down on the butterfly keyboard _years_ after it was well known it was a bad design?

Also, "The markets can stay irrational longer than you can stay solvent." (re: how did they get to market). You can do anything if you set enough money on fire, no matter how many people are telling you it's a bad idea.


It's widely understood that Jony was given carte blanche after Steve died, to keep him at Apple. Hence the gold Apple Watch.


Because there is no magical barrier that prevents all bad products from being sold?


lots of things “get to market” - pretty easy to get something to market


They were not thin enough.


This is the same dude who brought us the butterfly keyboard, so I'm anticipating a form-beating-function failure (if they actually ship something).


It's also the same dude who brought us beloved products in Apple's lineup. It's almost a meme at this point to say that Jony Ive's genius needs a containing force like Steve Jobs. Perhaps Sam Altman can fill that role.


Jobs, for all his faults, understood where aesthetic, functionality and user experience intersected extremely well.

He got stuff wrong too, don't get me wrong, but I have yet to see another CEO (heck, any business person of note) with the same deep understanding of how those things intersect.


People say this while outright ignoring all the outright failures Jobs had because he DIDN'T have that understanding.

The Lisa, the Newton, NeXT computers, trying to dump Pixar pretty much right before they made it big right as the tech was finally catching up to their ideas.

The reality is Jobs got to roll the dice a bunch of times, and if you get to roll the dice a lot, you will have some wins. Looking only at the wins is not useful.


I don't have the time or space to write up a proper rebuttal, but suffice it to say, after reading an incredible amount about not only Jobs, but Apple, NeXT, the Newton, Pixar, and early home computing in general, the man performed well above his peers with regard to where aesthetic, functionality and user experience intersected. Note, I am not talking about how he ran the businesses otherwise.

He wasn't always right, as I said already, but he was far better than most at this. More importantly, he was far better than most at getting others to shave their vision down to the simplest of ideas.

If you look at the competitors to Apple or NeXT during their respective eras, they were not very thoughtful in their deliberations.

It doesn't mean every idea he had was successful either, but I'm speaking specifically to the fact that he intersected the three points extremely well. At a certain point, someone is good enough at something that it's more than luck.


Altman can't fill that role for himself, I don't think he could do it for Ive...


Agreed. It's hard to think of a new product category for smart devices ... unless maybe Smart HATS! OK folks so remember where you heard it first - ultra stylish head gear with flip-down visor screen anyone?


Even as a joke that's still just a goofier version of smart glasses. There really is nothing new under the sun.


True, I guess they will innovate around devices and peripherals that enable or facilitate AI-powered activities, hope to pioneer a new category of activity, and look damn cool at the same time.


Contact lenses with a HUD could be cool. But that’s not really something you need someone like Ive for.


Humane was a very impressive product in terms of hardware and design, but had poor execution and software (partially because they don't own a smartphone OS like Android/iOS).

If similar hardware was:

- released by apple or google and deeply integrated with android/iOS

- embedded inside apple watch / pixel watch

- embedded inside a slim airpods case that could be worn as a pendant

- apple had siri as good as gemini and very good local STT to reduce latency

- MCP clients got more adoption by being integrated into smartphone AI assistants

then it could have been a hit. They lost because they shipped too early, without big pockets for the long game, without owning Android/iOS, and charged a big price + subscription for these gadgets.

I think Google is currently best positioned to pull off a seamless experience with Pixel devices (smartphone, watch, buds, Gemini)


But Meta is thriving with Meta Ray Bans; they have sold over 2M as of a few months ago. (Yes, I know that number seems small compared to other devices, but for a new form factor, that seems like a great early success)


It takes a special kind of person to think, yeah, I'll wear a live camera and microphone connected to Facebook...


Where is the AR/VR part in the Ray Bans?

It’s cameras, speakers, microphones but no display.


I’m pretty sure the latest models have AR.

I saw a presentation within the last year showcasing AR emoji-like things. Not that emojis are a killer feature, but the tech is there.


The glasses with a HUD are a different product line; the latest Ray Bans are just camera, mic and audio, but I still count them as AR because the AI voice in your ear can see what you see. I tried them for a few months and returned them: the camera wasn't good enough to enjoy the hands-free snapshots I was looking forward to, and I just didn't have a use for a Q&A bot attached to my ear.

For what they are I’ll give them props for a nicely designed product, the charging case is clever and works well. I liked them for music with the Apple Watch, pretty slick combination. Maybe if I could stomach giving a llama bot access to email and calendar etc etc to have a real personal assistant it would be an attractive offering in a world that accepts being watched 24/7 by AI/billionaire overlords


> Maybe if I could stomach giving a llama bot access to email and calendar etc etc to have a real personal assistant it would be an attractive offering in a world that accepts being watched 24/7 by AI/billionaire overlords

I share this general point of view but take it further: I really want something in this direction (a quality AI assistant that can access my communications and continuously see and hear what I do) but it MUST be local and fully controlled by me. I feel like Meta is getting closest to offering what I'm looking for but I would never in a million years trust them with any of my data.

My wife has the first-gen raybans and they're great for taking photos and video clips of e.g. our kids' sporting events and concerts, where what it's replacing is a phone held up above the crowd getting in the way of the moment. But even with that I feel icky uploading those things to Meta's servers.


Meta's fine.

Or at least on par with all the other ones.


AR can surely be audio only. Or are you suggesting a blind person getting navigation instructions via cameras and speakers isn't augmenting their reality? If so, I violently disagree.



Meta Ray Bans and Google's Project Aura are products that I absolutely want, but absolutely don't want to buy from either of those companies, or any company as invasive as they are.

It's long past time for enhanced privacy regulation in the North American market, because these products are going to be wildly invasive as people depend on them to mediate their experience with the world. I don't know what the right answer is, and I am very much aware that building products like these without a focus on monetizing user interaction and advertising would likely mean they are priced out for lower income users, but I hope someone smarter than me can figure it out :S


> Meta Ray Bans and Googles Project Aura are products that I absolutely want, but absolutely don't want to buy them from either of those companies, or any company as invasive as they are.

This! So much this! If a product from these companies could make my life 1,000,000x better I would still be in “thanks but no thanks crowd”


Aren't the Auras from Xreal?


Yeah, it’s a collaboration.


That's how I read the article and came to find your comments. AI bought a human.


AWS Aurora Postgres Serverless v2 has that capability, though it takes multiple seconds.


AWS Aurora is way too expensive and their "serverless" offerings are overly complicated and not worth it IMHO.

I used Serverless v1 and then they doubled the prices for v2 while removing features so I moved to PlanetScale. They were great but as I grew and wanted multiple smaller DBs they didn't really have a good way to do that and I moved to Neon. Now, with this news, I guess I'll be looking for an alternative.


[Neon employee] p99 for Neon compute start is 500ms
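For anyone wondering what a number like that means mechanically: nearest-rank p99 is just "sort the samples and index". A quick sketch (the timing values below are made up for illustration, not Neon data):

```python
import math

def percentile(samples, pct):
    """Nearest-rank percentile: smallest sample >= pct% of all samples."""
    ordered = sorted(samples)
    rank = max(0, math.ceil(pct / 100 * len(ordered)) - 1)
    return ordered[rank]

# Hypothetical cold-start timings in ms: the p99 reports the slow tail,
# even though the median is far lower.
timings = [120] * 95 + [480, 490, 495, 500, 510]
print(percentile(timings, 50))   # 120
print(percentile(timings, 99))   # 500
```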


Yikes. No real-time ML with that.


If your project database is suspending for lack of requests I doubt a 500ms wake up delay is an issue.


> AWS Aurora Postgres Serverless v2 has that capability

Was just about to react to someone being wrong on the internet and say that this is not true. Instead, TIL that this is, in fact, the case. Since 2024Q4.

Thanks for invalidating my stale cache.


Most people working outside the network layer are not familiar with the basics of IPv6 and how it interops with v4 systems. In fact, I would bet that many AWS admins are not familiar with dualstack VPC configurations, for example. This product name communicates clearly to those users what the value prop is.
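As one example of the interop details most app developers never touch: dual-stack sockets typically represent v4 peers as IPv4-mapped IPv6 addresses (`::ffff:a.b.c.d`). Python's stdlib shows the mapping (a minimal sketch, not AWS-specific):

```python
import ipaddress

# Every IPv4 address has a standard IPv4-mapped IPv6 form: ::ffff:a.b.c.d.
v4 = ipaddress.IPv4Address("203.0.113.7")
mapped = ipaddress.IPv6Address("::ffff:" + str(v4))

# The original v4 address is recoverable from the mapped form.
assert mapped.ipv4_mapped == v4
```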


> In fact, Cursor’s code completion isn’t much better than GitHub Copilot’s. They both use the same underlying models

Not sure that this is true. Cursor's agent mode is different from its code completion, and the code completion is a legitimately novel model, I believe.


They definitely do train their own models, the founders have described this in several interviews.

I was surprised to learn this, but they made some interesting choices (like using sparse mixture-of-experts models for their tab completion model, to get high throughput/low latency).

Originally I think they used frontier models for their chat feature, but I believe they've recently replaced that with something custom for their agent feature.
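For anyone unfamiliar with how a sparse MoE buys that throughput: a gate scores every expert per token, but only the top-k actually execute. A toy router (pure Python, all numbers made up; not Cursor's actual architecture):

```python
import math

def top_k_route(gate_logits, k=2):
    """Pick the k best-scoring experts and softmax their gate scores."""
    topk = sorted(range(len(gate_logits)), key=gate_logits.__getitem__)[-k:]
    m = max(gate_logits[i] for i in topk)
    exps = [math.exp(gate_logits[i] - m) for i in topk]
    total = sum(exps)
    return topk, [e / total for e in exps]

# 8 hypothetical experts, but only 2 run per token -- that's the
# throughput/latency win over a dense model of the same total size.
experts, weights = top_k_route([0.1, 2.3, -0.5, 1.7, 0.0, -1.2, 0.9, 0.4], k=2)
```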


You can have Copilot use Claude now but I'm not sure it's the default. I found the latest Gemini Pro 2.5 to be much better than Claude Sonnet anyway...but yes, these are all more or less the same at this point.

One thing people don't realize is there's also randomness in the answers. Also, some of the editors allow you to tweak temperature and others do not. This is why I found Roo Code extension to be better.
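To make the temperature knob concrete: it divides the model's logits before the softmax, so a low temperature concentrates probability on the top token (near-deterministic answers) while a high one spreads it out. A minimal sketch (toy logits, not any particular model's):

```python
import math

def softmax_t(logits, temperature=1.0):
    """Softmax over logits scaled by temperature."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
print(softmax_t(logits, 1.0))  # fairly spread out across the tokens
print(softmax_t(logits, 0.1))  # nearly all mass on the top token
```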


According to this article, Cursor's tab completion is SOTA, and was built by a genius who also created some other important foundational AI libs and worked briefly at OpenAI

https://www.coplay.dev/blog/a-brief-history-of-cursor-s-tab-...


There will probably be AI operating systems of some kind in the near future. I don't really know what that means, but whenever we get there I would really like there to be case law that forces monopolist OS developers to allow free trade when it comes to the application layer.


Agree on Astro. It's great for documentation/blog sites but quite unopinionated so can easily be used for varied site structures. Unlike certain other frameworks, there is no magic in Astro. It generates very straightforward HTML.


There is actually a fair bit of magic in Astro, like its proprietary use of its own frontmatter-like section, magic MDX parsing (but only in some parts of certain files), the way it handles componentization and imports, etc. Especially if you want to use it with islands for React/Vue/Svelte, or integrate web components. "Collections" are also magic, especially when used with the separate Gatsby-like documentation engine (Starlight: https://astro.build/themes/details/starlight/).

It works great for simpler sites and blogs. But for more complex setups, the magic often gets in the way of doing simple things in a way that more mature frameworks like Next figured out through trial and error and have documented, either officially or in user reports. Astro doesn't quite have the reach yet and the docs are minimal, so when you stumble into a situation, it's really hard to find help.

The lack of proper IDE debugging also makes it really really hard to step through errors on the server side.

Don't get me wrong, I'm very happy that Astro is an option, especially for simpler sites where Next etc. are overkill. But it's still a framework with a fair bit of magic, and for any moderately complex site, you'll likely eventually hit a use case where the magic gets in the way and there's no clear way to solve it.


You are correct about compile time magic. Runtime magic is minimal (the code runs when and where you think it does). And yes, Starlight is a foot gun IMO. It takes a framework that is designed to be easily learned and makes it even easier to use, but at the expense of the learning.



I’m not the OP but DSQL has such limited Postgres compatibility that it is very unlikely to be compatible.

