• 0 Posts
  • 87 Comments
Joined 4 months ago
Cake day: March 3rd, 2024


  • I thought I had Soylent Green figured out. I’d known the punchline long before I actually sat down and watched the movie. It’s very much a 70s-level film, so it was both good and bad as a movie itself; it reminded me a lot of Colossus: The Forbin Project in its feel. The forecasting of an environmentally desolate future also felt prophetic. Then a bit after seeing it I ran across someone reviewing it, and they pointed out that the biggest shock isn’t the end… the end is bad, but it’s only a higher level of what has been the real horror throughout the whole movie: complacency. And that’s why we’re in Soylent Green today, it’s all around us. And like all these other classics, we haven’t learned anything.

  • I’ve come to the conclusion that all these breach notices and the free stuff they offer for X months are a huge scam to get you to sign up for something. Either that, or every company has woefully underpaid or incompetent IT people. I’m waiting for the next news story to break about another company that somehow got passwords or identity info hacked because it was stored in plain text… something I learned not to do back in the 90s with basic HTML and PHP.

    In short: I don’t believe them. They’re all using the same form letters; it’s a scheme they’re all in on.
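The point about plain-text passwords is decades-old, well-established practice: store only a salted, slow hash and compare in constant time. A minimal sketch of the idea using Python’s standard-library `hashlib.scrypt` (the function names and parameters here are illustrative, not from any particular breached company’s stack):

```python
import hashlib
import hmac
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Derive a salted scrypt hash; only (salt, digest) ever touches the database."""
    salt = os.urandom(16)  # fresh random salt per password
    digest = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash with the stored salt and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, n=2**14, r=8, p=1)
    return hmac.compare_digest(candidate, digest)

salt, digest = hash_password("hunter2")
print(verify_password("hunter2", salt, digest))  # True
print(verify_password("wrong-guess", salt, digest))  # False
```

If a database storing `(salt, digest)` pairs leaks, attackers still have to brute-force each password through a deliberately expensive function, which is exactly the safety margin plain-text storage throws away.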



  • Samsung? We got one used and it’s great at the drying part, but when I went to research how to change the “melody” (and I’m being generous here) to a chime or something else… nope. You get that, or silence. So we use silence and listen for the sound of it stopping. That song is terrible, and for there to be no options at all in a modern appliance… why did this seem like a good idea on the drawing board?

  • Understanding the variety of speech over a drive-thru speaker can be difficult even for a human with experience in the job. I can’t see the current level of voice recognition matching that, especially if it’s using LLMs to process whatever it managed to detect. If I’m placing a food order, I don’t need an LLM hallucination trying to fill in the blanks of what it didn’t convert correctly to tokens or wasn’t trained on.