• 0 Posts
  • 102 Comments
Joined 2 months ago
Cake day: September 6th, 2024



  • Full self-driving should only be implemented when the system is good enough to completely take over all driving functions, and it should only be available in vehicles without steering wheels. The Tesla solution of offering “self driving” while relying on the copout of requiring constant user attention and feedback is ridiculous. Only when a system is truly capable of driving 100% autonomously, at a level statistically far better than a human, should any kind of self-driving be allowed on the road. Systems like Tesla’s FSD officially require you to always be ready to intervene at a moment’s notice. They know their system isn’t ready for independent use yet, so they require that manual oversight. But of course this encourages disengaged driving; no one actually watches the road attentively enough to intervene at a moment’s notice. Tesla’s FSD imitates true self-driving, but it pawns off the liability to drivers by requiring them to pay attention at all times. This should be illegal. Beyond mere lane-assistance technology, no self-driving tech should be allowed except in vehicles without steering wheels. If your AI can’t truly perform better than a human, it’s better for humans to be the only ones actively driving the vehicle.

    This also solves the civil liability problem. Tesla’s current system has a dubious liability structure designed to pawn liability off to the driver. But if there isn’t even a steering wheel in the car, then the liability must fall entirely on the vehicle manufacturer. They are after all 100% responsible for the algorithm that controls the vehicle, and you should ultimately have legal liability for the algorithms you create. Is your company not confident enough in its self-driving tech to assume full legal liability for the actions of your vehicles? No? Then your tech isn’t good enough yet. There can be a process for car companies to subcontract out the payment of legal claims against the company. They can hire State Farm or whoever to handle insurance claims against them. But ultimately, legal liability will fall on the company.

    This also avoids criminal liability. If you only allow full self-driving in vehicles without steering wheels, there is zero doubt about who is in control of the car. There isn’t a driver anymore, only passengers. Even if you’re sitting in what would normally be the driver’s seat, it doesn’t matter: legally, you are just a passenger. You can be as tired, distracted, drunk, or high as you like, and you still can’t incur criminal liability for driving the vehicle. There is such a clear bright line - there is literally no steering wheel - that it is absolutely undeniable that you have zero control over the vehicle.

    This actually would work under the same theory of existing drunk-driving law. People can get ticketed for drunk driving for sleeping in their cars. Even if the cops never see you driving, you can get charged for drunk driving if they find you in a position where you could drunk drive. So if you have your keys on you while sleeping drunk in a parked car, you can get charged with DD. But not having a steering wheel at all would be the equivalent of not having the keys to a vehicle - you are literally incapable of operating it. And if you are not capable of operating it, you cannot be criminally liable for any crime relating to its operation.



  • I think we should indict Sam Altman on two sets of charges:

    1. A set of securities fraud charges.

    2. 8 billion counts of criminal reckless endangerment.

    He’s out on podcasts constantly saying that OpenAI is near superintelligent AGI, that there’s a good chance they won’t be able to control it, and that human survival is at risk. How is gambling with human extinction not a massive act of planetary-scale criminal reckless endangerment?

    So either he is putting the entire planet at risk, or he is lying through his teeth about how far along OpenAI is. If he’s telling the truth, he’s endangering us all. If he’s lying, then he’s committing securities fraud in an attempt to defraud shareholders. Either way, he should be in prison. I say we indict him for both simultaneously and let the courts sort it out.






  • Additionally, you can point to the indiscriminate attacks on Oct 7th against nonmilitary targets as evidence of their failure to distinguish between the Israeli people and the Israeli military.

    The crucial fact that is always left out of this - by the Israeli military’s own admission, using their own numbers, the IDF and Hamas have the same collateral damage ratio. Hamas killed 2 civilians for every military member they killed on October 7th. This is the same civilian to military kill ratio that the IDF claims in their own numbers. Hamas is literally just as effective at avoiding civilian casualties as the IDF is.







  • “What is he trying to hide‽” I dunno, man. Maybe he recognizes that there’s a bunch of unhinged weirdos who are hellbent on stalking “Satoshi,” and he doesn’t want to be harassed?

    Forget being harassed. Honestly, being kidnapped is a serious concern. Whoever Satoshi is, whether a person or a group, it’s estimated they own something like a million bitcoins.

    Kidnapping is normally a pretty poor choice of crime for a criminal gang to undertake. It had its heyday back in the early 20th century. But as the FBI really got going, and we got better at tracking down people across state lines and internationally, kidnapping became much more difficult to pull off. Kidnapping someone - physically abducting them - is the easy part. But actually sending their family a ransom letter and collecting the money in a way that can’t be traced back to you? That’s a whole different matter. Actually getting the ransom money and somehow getting it into a form you can spend, all without getting caught? That’s nearly impossible in this day and age.

    But someone with a million Bitcoins? It’s entirely possible that everything needed to access those funds is entirely within that one person’s skull. Either the private keys themselves, or some way to access or generate them.

    Someone holding that many bitcoins is actually at incredible risk of kidnapping by an organized crime outfit. We’re talking about $65 billion USD worth of assets that can be obtained by kidnapping just one person and torturing them until they give up their private keys. Once you have the keys, the coins can be transferred to another account and washed through numerous transactions until they’re untraceable. And the poor bastard who gets kidnapped for this never leaves their captors alive.
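    For scale, the $65 billion figure is just a back-of-envelope multiplication; a quick sketch (the million-coin holding and the roughly $65,000-per-coin price are both assumptions, not known facts):

    ```python
    # Rough valuation of the rumored Satoshi stash.
    # Both inputs are estimates: wallet size and BTC price are assumed.
    coins = 1_000_000        # estimated coins mined by Satoshi in Bitcoin's early days
    price_usd = 65_000       # assumed price per BTC matching the $65B figure above
    total = coins * price_usd
    print(f"${total:,}")     # → $65,000,000,000
    ```

    At any plausible recent price, the stash is worth tens of billions of dollars, which is the point: it is a single-target prize with no bank, insurer, or clawback mechanism behind it.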

    And even if they keep their keys in their home instead of in their head? Now they’re at risk of break-in, or being held hostage during a nighttime break-in.

    Hell, even just being suspected of being Satoshi would be incredibly dangerous. That’s an even more horrifying scenario. Imagine an organized crime outfit thinks you’re Satoshi, they’re incorrect, and they abduct you and torture you, demanding you give them something you are simply incapable of providing…




  • Wouldn’t just keeping your phone in a metal box prevent it from communicating with anything? Keep your phone in a metal box and only take it out when you need it, and only in a location that isn’t sensitive. Or hell, just make a little sleeve out of aluminum foil. Literally just wrapping your phone fully in aluminum foil should prevent it from connecting to anything. A tinfoil hat won’t serve as an effective Faraday cage for your brain, but a complete foil wrap around a phone should do the job. Better yet, since it’s a phone, the foil sleeve is easy to test: build it, put your phone inside, and try texting and calling it. If it’s fully surrounded by conductive material, the phone should be completely incapable of sending or receiving signals.