• 0 Posts
  • 28 Comments
Joined 3 years ago
Cake day: June 9th, 2023


  • Reddit went absolutely overboard every year with dumb April Fools posts, to the point where they crowded out every real topic and made the site unusable for the day, even back in the early days when it was otherwise very usable.

    I personally never saw that level of bullshit overload anywhere else.

    One of the things I like about Lemmy is that it avoids (somewhat) the everybody-piles-on-the-same-joke plague that swept through Reddit. I’m hoping we as a community can keep avoiding it (somewhat).

    Sorry for all you beans and Stör enjoyers. This place is big enough for the both of us but sometimes it does get uncomfortable.





  • I don’t know where you’re at but I got an eyeful in New York City a couple years ago.

    People walking to work in the morning in Manhattan — women sheer and braless and obviously on their way to an office job. Multiple such women. 🤯 This is not how people dress where I come from, but apparently it’s a thing now, at least in The City.

    Also men wearing shirt and tie and leather dress shoes with no socks and a kind of garment I can best describe as “office shorts.”

    I learned on that trip that I am not a fashion person.

    Edit: also this user violet08 appears to talk mostly about sexy shit so 🤷



  • Boiling it must kill the elastic on the white cotton underwear in fewer washes

    And the elastic on the fitted sheets.

    And… sometimes I like to wear underwear with blue penguins on it…

    Granted it’s hygienic but the rest of the world appears to find regular soap and warm water to be sufficiently hygienic without boiling.

    I’m not saying Germans are wrong; I’m only saying Germans are exceedingly more German than other people are.







  • A token is the word for the base unit of text that an LLM works with. It’s always been that way. The LLM does not work directly with characters; characters are grouped into chunks, often smaller than a word, and this stream of tokens is what the LLM processes. This is also why LLMs have such trouble with spelling questions like “how many Rs in raspberry?” — they never see the individual letters in the first place, so they can’t count them.

    No, LLMs do not all tokenize the same way. Different tokenizers are (or at least were once) one of the major ways models differed from each other. A simple tokenizer might split words into roughly one token per syllable, but I think they’ve gotten much more complicated than that now.

    My understanding is very basic and out-of-date.
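    To make the “raspberry” point concrete, here’s a toy sketch (not any real model’s tokenizer — the vocabulary and the greedy longest-match rule are made up for illustration) of how subword tokenization hides individual letters from the model:

    ```python
    # Toy subword tokenizer: greedy longest-match against a tiny
    # hypothetical vocabulary. Real tokenizers (e.g. BPE) are far more
    # sophisticated, but the effect is the same: the model receives
    # token IDs, not letters.
    VOCAB = {"rasp", "berry", "ras", "r", "a", "s", "p", "b", "e", "y"}

    def tokenize(word: str) -> list[str]:
        """Split a word into the longest vocabulary pieces, left to right."""
        tokens = []
        i = 0
        while i < len(word):
            # Try the longest remaining piece first.
            for j in range(len(word), i, -1):
                if word[i:j] in VOCAB:
                    tokens.append(word[i:j])
                    i = j
                    break
            else:
                # Unknown character: fall back to a single-character token.
                tokens.append(word[i])
                i += 1
        return tokens

    print(tokenize("raspberry"))  # ['rasp', 'berry']
    ```

    The word arrives as two opaque chunks, so “how many Rs?” isn’t a question the model can answer by looking at its input.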





  • it is the only thing giving them an advantage over the USA

    That’s really not true at all anymore. China is an absolute manufacturing powerhouse. Almost all of the industry that used to be the USA’s strength in the ‘50s is China’s strength now.

    They haven’t been the cheapest labor for a while now, and they don’t need to be.

    Don’t get me wrong, the USA has other, newer strengths now — tech and design, among others. But they do appear to be throwing them away and ceding to others — especially China — as hard and fast as they can.

    On the other hand, a humanoid shape for robots seems like an extreme waste of technical complexity and cost, so in my opinion this particular article mostly shows that China is also beating the USA at chasing faddish and dumb tech fashion.