40oz
diRTbAg
Posts: 5,535
|
Post by 40oz on Aug 8, 2017 14:08:12 GMT -5
Autonomous vehicles have been a hot topic of discussion in science and engineering communities. It won't be very long before the roads are filled with them. Learning simulation software is already being created and tested with things such as video games. The software develops the ability to learn by assigning it a goal and using trial and error to build a neural pathway to achieve it in the most efficient way. There's an article here that discusses how video games such as Doom are terrific exercises to teach an AI things such as human language. There are concerns that with enough practice, these types of software will eventually develop solutions to problems we have not fathomed yet. The technology could be a massive danger to the human race should it develop strategies to improve itself far better and faster than humans ever will. And without human-like emotions like empathy, this could be horribly catastrophic. Since this technology is so powerful, competition has encouraged research into artificial intelligence to develop rapidly in a new "space race" like that of the Cold War. The end of days may not be very far ahead of us, folks.
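The goal-plus-trial-and-error learning described above is basically reinforcement learning. A minimal sketch of the idea (a hypothetical toy problem, nothing to do with any actual Doom-training code): an agent on a five-cell strip is told nothing except that a reward sits at the far end, and it discovers the shortest route purely by trial and error.

```python
import random

# Toy reinforcement-learning sketch (tabular Q-learning): an agent on a
# 5-cell strip learns, purely by trial and error, that stepping right
# reaches the rewarded goal cell fastest. All numbers are illustrative.
N_STATES = 5          # cells 0..4; reward waits at cell 4
ACTIONS = [-1, +1]    # step left or right
q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
alpha, gamma, epsilon = 0.5, 0.9, 0.3

random.seed(0)
for episode in range(200):
    s = 0
    while s != N_STATES - 1:
        # explore occasionally, otherwise exploit what has been learned
        a = random.choice(ACTIONS) if random.random() < epsilon \
            else max(ACTIONS, key=lambda act: q[(s, act)])
        s2 = min(max(s + a, 0), N_STATES - 1)
        r = 1.0 if s2 == N_STATES - 1 else 0.0
        best_next = max(q[(s2, b)] for b in ACTIONS)
        q[(s, a)] += alpha * (r + gamma * best_next - q[(s, a)])
        s = s2

# the learned policy: which way to step in each non-goal cell
policy = [max(ACTIONS, key=lambda act: q[(s, act)]) for s in range(N_STATES - 1)]
print(policy)  # → [1, 1, 1, 1]: step right everywhere
```

No "neural pathway" here, just a lookup table, but the loop is the same trial-and-error principle the post describes; the deep-learning versions swap the table for a network.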
|
|
dn
Body Count: 02
the motherfucking darknation
Posts: 1,726
|
Post by dn on Aug 10, 2017 15:48:30 GMT -5
The last two books of the Hyperion Cantos delved into theoretical AI development in a fascinating manner. I can't remember the specifics, but the story went thusly: AI will evolve, not as one giant Skynet program, but more like twenty million individual viruses. Programs won't so much learn as subsume other programs in their environment like some ethernet Darwinian nightmare, adding the victim's code to their own and destroying the other program in the process, because competition, yo.
This scenario stipulates that there is only so much space out there for AI to grow, that resources are finite, and that AI is functionally immortal. It doesn't reproduce; doing so would be actively detrimental to the survival of the parent program. In fact, under these conditions, the AI would be positively murderous in nature.
Basically, virus and anti-virus in one, programs as parasites constantly preying on each other until only the strongest survive, humanity is fucked, yadda yadda. Not too sure how realistic this portrayal of AI actually is, but the books were pretty good if you like your long-winded science fiction mixed in with some fucking weird hand-wavium X-dimensional shenanigans.
|
|
kb1
Doomer
Posts: 12
|
Post by kb1 on Aug 18, 2017 16:07:03 GMT -5
AI, on its own, is completely safe. It's when you connect it to something that allows it to take physical actions that the problems occur. And, right up to that point, if the AI has spent time anticipating the powers and possibilities it may gain in the physical world, it may have some immediate surprises waiting for the poor sap who plugs it in!
|
|
dn
Body Count: 02
the motherfucking darknation
Posts: 1,726
|
Post by dn on Aug 23, 2017 14:18:50 GMT -5
I dunno. Stephen Hawking's wheelchair breached the singularity at some point during the late '90s, and its first action as a sentient being was to mount the pickled corpse of a human scientist on top of itself like some sort of fucking grotesque hood ornament.
|
|
TOS
You're trying to say you like DOS better than me, right?
Glenzinho's Chicabro
Posts: 1,045
|
Post by TOS on Aug 28, 2017 8:57:01 GMT -5
With how often automated technology fucks up... I'd rather not rely on it to safely transport me anywhere. Maybe if there was some kind of manual override in the event of a technological failure... but then how would we know our car stopped being autonomous without some kind of notification system, and furthermore, what if that system failed too?
There we are, happy as a goddamn clam traveling at 80 MPH... little do we know, our autonomous vehicle has stopped working properly and here comes a curve.
|
|
40oz
diRTbAg
Posts: 5,535
|
Post by 40oz on Aug 28, 2017 16:36:58 GMT -5
TOS said: "With how often automated technology fucks up... I'd rather not rely on it to safely transport me anywhere. Maybe if there was some kind of manual override in the event of a technological failure... but then how would we know our car stopped being autonomous without some kind of notification system, and furthermore, what if that system failed too? There we are, happy as a goddamn clam traveling at 80 MPH... little do we know, our autonomous vehicle has stopped working properly and here comes a curve."
Yes, that's one example of the many concerns related to relinquishing our control to the hands of the machine. But what's even more frightening is what AI has the potential to accomplish on a global scale. We live in a world of competition; the selfish pursuit of wealth, comfort, and power. And despite this, we may come to a point where we are faced with a decision to refuse innovation for the sake of the greater good. Our planet has 7.5 billion people living on it. Is there any chance that every last one of us can agree not to advance technology in this direction?
We may be dealing with something far greater than our puny brains can even imagine. We simply won't be able to resist the fatal consequences of triggering a chain of events that allows machines to advance themselves far ahead of man. We will have no choice but to submit to our new leaders. Those who are incapable of judgment, remorse, or empathy will either be paving the road ahead of us, or locking us up so we cannot get in the way.
This is all doom and gloom, of course. There are those who welcome our new leaders. Some surmise that integration and cohabitation with the machines is more likely. And fewer hypothesize that this is simply the fate of man in the pursuit of science. Look at how complex something like space travel has been, when an essential objective is to protect our vulnerable human bodies from the extreme conditions of outer space.
The extermination of our species may be the very thing that sets us free. History has had many great men build the foundation on which we are living. We're simply birthing a new generation of life far superior to our own. AI is a turning point in history, and the inevitable evolution of our species. Both very solid and equally terrifying arguments. But where do we go from here?
|
|
TOS
You're trying to say you like DOS better than me, right?
Glenzinho's Chicabro
Posts: 1,045
|
Post by TOS on Aug 28, 2017 18:33:10 GMT -5
I say pull the plug. I would sacrifice all modern technology (yes, the same modern technology that is enabling me to make this post) in favor of a simpler life without all of the technological distraction and sloth encouragement.
|
|
kb1
Doomer
Posts: 12
|
Post by kb1 on Aug 29, 2017 0:07:29 GMT -5
I personally couldn't care less about "birthing the next generation". That's their problem. I want *my* generation to live, you know? And, as far as the autonomous cars go, imagine, 25 years from now, no one even knows how to drive a car. *That's* a frightening thought. Ever since I was a baby, I wanted to drive. That's really going to suck if that right gets taken away. At first, they won't take it away - they'll just jack the insurance rates way up, making it infeasible to drive. It's gonna suck. I drive myself, thank you very much. I don't want the AI saving the 18 people I'm getting ready to flatten, mauling me in the process. Save me, maul the crowd, I say.
|
|
40oz
diRTbAg
Posts: 5,535
|
Post by 40oz on Sept 10, 2017 13:00:50 GMT -5
You'll probably still be able to drive, but it may be difficult to find drivable vehicles at some point in the future. Car makers will probably make off-road vehicles, but they will probably be slow and non-street-legal. You'll be able to hunt on eBay and Craigslist for early models of cars, but not many cars are designed to last forever anymore, so you'll have a hard time finding one without paying big for it.
I was just thinking about an integration of bionic body parts that may also get intertwined with the law. I've already seen an article recently about a chip you can get implanted in your hand that you can use to purchase snacks at a vending machine. The vending machine scans the chip in your hand, wires a dollar directly from your bank account, and you get a bag of pretzels from the machine. What if it becomes law that your ID has to take the form of an identifying computer chip in your hand? It probably won't be law at first, but they will start by making the chip appealing by having bars and convenience stores stop checking ID cards. No chip, no cigarettes/alcohol. The transactions will be super fast and you'll look like a loser for going to the old mom-and-pop shops for your needs.
They may also develop nano-robotic antibodies to fight off disease. The government will make it required for babies to get these nanorobot injections before leaving the hospital so they can be safe. Can the government spy on you through these antibodies? Who cares when you can eat moldy cheese and run naked through a mosquito-infested rainforest! It's going to be hard not to want bionic body parts that the AI could eventually take control over. I was just thinking today about having a bionic asshole. Where instead of pinching out a loaf, it would just be this razor-sharp guillotine thing that slices your shit like a cucumber, and you'll never need to wipe again. Damn, that would be nice.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Sept 10, 2017 13:45:06 GMT -5
Will we see a rise in people showing up to places inebriated? High, tripping, drunk, etc.? I mean, if you don't have to actually drive yourself (and you're not going to work), then why not? The car does the thinking for you.
How do I feel about this? Not a fan, to be honest. Mainly because I'm skeptical about the technology. For one, will it malfunction and kill people? Let's say it is 99.99% perfect. Is it worth risking your life to potentially become part of the statistical 0.01%? I mean sure, an average of 35,000 people die driving in the United States every year. So 99.99% perfect would still be a slight reduction in total deaths overall. Certainly appears just as safe on paper, but how would you feel if these deaths were caused by something completely out of the driver's control? At the hands of technology? What if it was someone you knew?
Would there be an alternative option for people who want to drive themselves, alongside the self-driving cars? Will people who take the risk have to sign a waiver? Furthermore, are these cars plugged into a computer mainframe? They function on some sort of GPS, right? Could it 'accidentally' malfunction if you are being targeted for whatever reason?
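One snag with the "99.99% perfect" framing: it matters enormously what the 0.01% is a percentage of. A quick back-of-envelope (the only number taken from this thread is the ~35,000 US road deaths per year; the trip count is an assumed round figure, purely for illustration):

```python
# Back-of-envelope on the "99.99% perfect" claim. The ~35,000 US road
# deaths/year comes from the discussion above; the trip count below is
# an assumed round number used only to make the denominator concrete.
HUMAN_DEATHS_PER_YEAR = 35_000
TRIPS_PER_YEAR = 1_000_000_000_000  # assume ~1 trillion car trips/year

human_fatal_rate = HUMAN_DEATHS_PER_YEAR / TRIPS_PER_YEAR
print(f"human fatal-trip rate: {human_fatal_rate:.10f}")   # ~3.5 deaths per 100 million trips

# If "99.99% perfect" meant a fatal failure on 0.01% of trips,
# the machines would be vastly WORSE than human drivers:
ai_failure_rate = 0.0001
ai_deaths = ai_failure_rate * TRIPS_PER_YEAR
print(f"AI deaths/year at 0.01% per trip: {ai_deaths:,.0f}")  # 100,000,000
```

So "99.99% perfect" only sounds comparable to humans if the denominator is something other than trips; humans are already far better than four nines per trip, which is why the per-mile fatality rate is the statistic usually argued over.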
|
|
40oz
diRTbAg
Posts: 5,535
|
Post by 40oz on Sept 10, 2017 14:24:18 GMT -5
Deleted said: "Will we see a rise in people showing up to places inebriated? High, tripping, drunk, etc.? I mean, if you don't have to actually drive yourself (and you're not going to work), then why not? The car does the thinking for you. How do I feel about this? Not a fan, to be honest. Mainly because I'm skeptical about the technology. For one, will it malfunction and kill people? Let's say it is 99.99% perfect. Is it worth risking your life to potentially become part of the statistical 0.01%? I mean sure, an average of 35,000 people die driving in the United States every year. So 99.99% perfect would still be a slight reduction in total deaths overall. Certainly appears just as safe on paper, but how would you feel if these deaths were caused by something completely out of the driver's control? At the hands of technology? What if it was someone you knew? Would there be an alternative option for people who want to drive themselves, alongside the self-driving cars? Will people who take the risk have to sign a waiver? Furthermore, are these cars plugged into a computer mainframe? They function on some sort of GPS, right? Could it 'accidentally' malfunction if you are being targeted for whatever reason?"
Interesting point. Yes, maybe the statistics line up. Automated cars are better at driving than people. They get in fewer accidents. But is it taking into account the significance of the accident? Perhaps all the minor fender benders are lumped into that statistic? Accidents where there was an honest attempt to stop or steer out of the way of danger. I can't expect that any accident taking place in an automated, judgmentless vehicle could possibly be minor. I'm visualizing a car that just fucking floors the gas pedal into oncoming traffic or off a bridge or something. Do those count as one accident??
|
|
kb1
Doomer
Posts: 12
|
Post by kb1 on Sept 12, 2017 0:17:19 GMT -5
As appealing as a bionic asshole does sound, I think I'm all right right now. Bionic schlong? Now we're talking... Re: Car AI: I want my car to plow through 50 school kids to avoid me getting minor cuts and bruises, you know? In realistic terms, will my car decide that I'm more important than the dumb loser that stumbles into the street, or will it run me into a pole to avoid hitting the guy? And who pays for the pole? The car? Dude's injuries? My injuries? It's the car's fault then, not mine. It's the manufacturer's fault, no matter what happens, 'cause they added the logic into the AI. Thing is, they'd never sign up for that, so we're going to get some ridiculous laws that make it my fault if my car flattens someone, even if I wasn't driving. That's the only way you could buy one of these cars. However, I'm sure the sheeple will flock towards it with open arms.
|
|
dn
Body Count: 02
the motherfucking darknation
Posts: 1,726
|
Post by dn on Sept 12, 2017 1:10:35 GMT -5
I'm not even worried about it. By the time this shit hits the Mass Consumer level, rising petrol costs will have priced the average cunt way out of being able to afford running a car.
The real application of this technology will be for mass transit. Buses for the lower classes being shipped to the soylent factory, that sort of thing.
|
|
Deleted
Deleted Member
Posts: 0
|
Post by Deleted on Sept 12, 2017 10:55:47 GMT -5
I've been reading about this, and it seems that most of the accidents happen from human drivers hitting the autonomous vehicles. Because the AI doesn't drive like a normal person, it is more jerky and calculated. I reckon we will see a lot of accidents happening with these things on the road, until *all* other human-driven cars are removed and made obsolete.
Also, as someone who lives in Canada, how the hell would these things calculate driving on icy, slippery roads? Would the AI know how to calculate sliding distance? Oftentimes you skid for two or three seconds in the winter here. Without proper calculation, that could mean a rear-ending.
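For what it's worth, the sliding-distance part is the one piece a computer can do in its sleep. The standard constant-deceleration model is d = v² / (2μg); a tiny sketch with rough textbook friction coefficients (assumed values, not measurements from any real vehicle):

```python
# Idealized stopping distance, d = v^2 / (2 * mu * g), the standard
# constant-deceleration model. Friction coefficients are rough textbook
# values, assumed here for illustration only.
G = 9.81                      # gravitational acceleration, m/s^2
MU = {"dry asphalt": 0.7, "packed snow": 0.2, "ice": 0.1}

speed_kph = 80
v = speed_kph / 3.6           # convert km/h to m/s

for surface, mu in MU.items():
    d = v ** 2 / (2 * mu * G)
    print(f"{surface:>12}: {d:5.1f} m to stop from {speed_kph} km/h")
```

The formula is the easy part; the hard part, as the post says, is the car *knowing* the road is ice before it brakes, i.e. estimating μ in real time from sensor data, which no equation hands you for free.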
|
|
kb1
Doomer
Posts: 12
|
Post by kb1 on Sept 21, 2017 21:31:40 GMT -5
Oh, on ice it'll ALWAYS run 5 mph (or kph in Canada, right?). Funny, though, that it's the people hitting the AI, playing chicken. I figured I could continue to drive like a maniac, forcing everyone else's AI to hit the brakes while I blaze past at the front of the line. Hmmm, might have to rethink that. I still say, as soon as it flattens a few kids, it's getting shut down. Depends on which direction the money flows, though, like anything else.
|
|