• 0 Posts
  • 198 Comments
Joined 9 months ago
Cake day: September 27th, 2023

  • AI, used in small, local models as an assistive tool, is actually somewhat helpful. AI is how Google Translate got so good a decade or so ago, for instance, and how assistive image recognition has become good enough that visually impaired people can potentially access the web just as proficiently as sighted people. LLM-assisted spell check, grammar check, and autocomplete show a lot of promise. LLM-assisted code completion already works decently well for common programming languages. There are potentially other halfway decent uses as well.

    Basically, if you let computers do what they’re good at (objective, non-creative, repetitive, large-dataset tasks that don’t require reasoning or evaluation), they can make humans better at what they’re good at (creativity, pattern-matching, ideation, reasoning). AI can help with that, even though it can’t get humans out of the loop.

    But none of those things put dollar signs in VCs’ eyes. None of those use cases get executives thinking, “hey, maybe we can fire people and save on the biggest single recurring expense any corporation puts on its books.” None of them make worried chip manufacturers breathe a sigh of relief that they can keep making the line go up after Moore’s Law finally kicks the bucket. None of those things make headlines in late-stage capitalism. Elon Musk can’t use any of them as a smokescreen to distract from his mismanagement of the (formerly) most consequential social media brand in history. None of that gives former crypto bros that same flutter of superiority.

    So the hype gets pumped up to insane levels, which makes the valuations inflate, which makes them suck up more data heedless of intellectual property, which makes them build more power-hungry data centers, which means they have to generate more hype (based on capabilities the technology emphatically does not have and probably never will) to justify all of it.

    Like with crypto. Blockchain showed some promise in extremely niche, low-trust environments; but that wasn’t sexy, or something that anyone could sell.

    Once the AI bubble finally breaks, we might actually get some useful tools out of it. Maybe. But you can’t sell that.

  • Google wants that to work. That’s why the “knowledge panels” kept popping up at the top of search results, with links to Wikipedia. They only want to answer the easy questions: definitions, math problems, things they can give you the Wikipedia answer for, Yelp reviews, “Thai Food Near Me,” etc. They don’t want to answer the hard questions, presumably because it’s harder to sell ads against more niche questions and topics. And “harder” means you have to get humans involved. Which is why they’re complaining now that users are asking questions that are “too hard for our poor widdle generative AI to handle :-(” — they don’t want us to ask hard questions.

  • The problem is, the internet has adapted to the Google of a year ago, which means that setting Google search back to 2009 would just give every “SEO hacker” a field day pushing spam to the top of results, with none of the controls built since then to stop them.

    Google built a search engine optimized for the early internet. Bad actors adapted to siphon money out of Google traffic. Google adapted to stop them. Bad actors adapted again. So began a cat-and-mouse game that ended with the pre-AI Google search we all know and hate today. Through its success, Google destroyed the internet that was, and all that’s left is whatever this is. No matter what happens next, Google search is toast.

  • OK, but what benefit would they gain by forcing people into AI search? That’s not rhetorical; I’m legitimately asking. Are you saying this is just about controlling the experience? Because they already controlled it, and all this is doing is weakening that control. It’s certainly not easier or more cost-effective. They’ll get LLM training data from either interface. The other things they shut down cost them development, maintenance, or even just server space; but even if they managed 100% adoption of AI search, they’d still need to maintain their old platform as a data source for the AI and for the search results shown below it. So what financial incentive do they have to push people toward a more expensive, less-liked endpoint for that data?