HEXUS Forums :: 21 Comments

Login with Forum Account

Don't have an account? Register today!
Posted by Nelviticus - Thu 31 May 2018 12:14
What a crock of nonsense (to put it politely).

What if you trained a dog to activate the gun on a spoken command? Would we then be worrying about ‘the rise of the dogs’?
Posted by Ttaskmaster - Thu 31 May 2018 12:18
Nelviticus
Would we then be worrying about ‘the rise of the dogs’?
Nope - Most dogs can be halted in their tracks and turned to your control simply by presenting them with a Dentastix… or in extreme cases, a 4" strip of biltong!
Posted by Phage - Thu 31 May 2018 12:26
Our dogs love those rawhide treats. They don't seem to last long.
Posted by TeePee - Thu 31 May 2018 12:27
Using Google to literally destroy Apple.

Raises no questions for me… This is just a voice-activated solenoid. Sure, if it were a real gun rather than an airgun it would be illegal, but that's not the point. It's the Google vs. Apple that is the point of the ‘Art’.
Posted by Ttaskmaster - Thu 31 May 2018 12:43
TeePee
Sure, if it were a real gun rather than an airgun it would be illegal, but that's not the point.
Neither is it the point that an actual firearm would probably half-destroy the actual ‘art installation’, too… but still. :)
Posted by Dashers - Thu 31 May 2018 13:03
Suggesting that AI is shooting the gun is tenuous at best. AI might be used in the voice recognition, but there is plenty of non-AI-based voice recognition stuff out there. It's also not discovering and learning about the weapon and how to use it; it's merely running through the steps to activate the solenoid.

I know this is art and supposed to be representative, but I feel it does a rubbish job of it and I'm not sure why it's newsworthy.
Posted by Corky34 - Thu 31 May 2018 13:07
Nelviticus
What if you trained a dog to activate the gun on a spoken command? Would we then be worrying about ‘the rise of the dogs’?

No, instead we'd probably be questioning the ethics of training a dog to activate a gun on a spoken command.

It's not so much a question about the rise of the machines, or dogs for that matter, but a question of ethics. In either case, who's responsible: the machine (or dog) that activated the gun and hypothetically killed or injured someone, or the person who made it possible for the machine/dog to carry out a murder?

It's just another way of framing the trolley problem.
Posted by Saracen - Thu 31 May 2018 15:13
I'm struggling to see where “AI” comes in.

I mean, years ago (10-15-ish) I was using Dragon voice software to write articles. It was pretty good at analysing my spoken voice, and turning it into text. Which it then put on-screen.

But it wouldn't have taken much by way of digital-to-analog conversion to have something physical happen in response to a computer command, like saying “light on” and having an electronic switch thrown.

So …. connect the two. You then have voice command resulting in real-world event.

After all, we've had robot-aided manufacturing and assembly for years, and this just adds a voice-control element to a simple motor.
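That whole chain, recognised speech in, matched phrase, switch out, is a few lines of glue code with no ‘AI’ anywhere in it. A minimal sketch (names are made up, and the physical switch is mocked rather than a real relay or GPIO pin):

```python
# Sketch of "voice command -> real-world event": we assume some speech
# recogniser has already handed us plain text; the Switch class is a
# stand-in for whatever drives the actual hardware (relay, solenoid...).

class Switch:
    def __init__(self):
        self.on = False

    def throw(self):
        # In real life this would toggle a pin; here we just flip a flag.
        self.on = not self.on

def dispatch(text, commands):
    """Look the recognised phrase up in a command table and run its action."""
    action = commands.get(text.strip().lower())
    if action is None:
        return False  # unrecognised phrase: do nothing
    action()
    return True

light = Switch()
commands = {"light on": light.throw}

dispatch("Light on", commands)
print(light.on)  # the 'switch' has now been thrown
```

Nothing here learns or decides anything; it's a lookup table, which is rather the point.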

That's not my interpretation of AI, at all.


Had the voice said “kill that apple” and the AI had determined what needed to be done, ordered the parts, built a control mechanism, selected the best weapon and killed the lone apple in a basket of mixed fruit and veg, I'd be impressed …. assuming of course that it had worked out what to do by itself and not explicitly been programmed with “If told ‘kill apple’ then follow this sequence of actions”.




That said, I did laugh (silently, for about 0.25 seconds) at the choice of an Apple to be shot by Google.

A better joke would have been the other way round. If a SkyNet robot ever decides the best action is to shoot Google, I want to shake its …. erm ….. robotic external manipulation device.

I might even give it a kiss.
Posted by Ttaskmaster - Thu 31 May 2018 15:52
Corky34
would it be the person who made it possible for the machine/dog to carry out a murder.
This obviously, for without human intervention and training, neither machine nor dog would have known how to handle the firearm… and both would have just been responding with whatever basic and limited understanding they had of the scenario. Neither can fully comprehend the complexities of human society and interaction, which is why the human is responsible for training them well and keeping them under control.

The day a computer can fully understand all this and reliably make sound, safe judgement calls… I'm installing it in a black Pontiac Trans-Am Firebird with a red Larson scanner in the nose, and going out to fight crime in a leather jacket and bouffy haircut!!!!
Posted by CAT-THE-FIFTH - Thu 31 May 2018 17:01
Nelviticus
What a crock of nonsense (to put it politely).

What if you trained a dog to activate the gun on a spoken command? Would we then be worrying about ‘the rise of the dogs’?

Good thing Moose are untrainable!! :p
Posted by scaryjim - Thu 31 May 2018 19:16
Dashers
Suggesting that AI is shooting the gun is tenuous at best. ….

Saracen
I'm struggling to see where “AI” comes in. …

Well, this is art, so its link to actual science was always likely to be tenuous.

AFAICT the assertion is that the smart speaker system could, via ML, ‘learn’ under what circumstances you normally fire the gun, and then ‘decide’ to fire the gun without a direct command should those circumstances arise.

I have no interest in smart speakers so I have no idea if that degree of ML occurs when you make a request, although I could easily see, for example, Amazon identifying patterns in your product orders for consumables and pre-emptively ordering you stuff when it thinks you're going to run out. I can see where the artist is coming from, even if it's a bit of a stretch.
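That ‘learn your pattern, then act unprompted’ idea doesn't even need anything fancy; a toy version is just averaging the gaps between past orders. A sketch (entirely hypothetical, nothing to do with how Amazon actually does it):

```python
# Toy "pre-emptive reorder" idea: if past orders of a consumable are
# roughly evenly spaced, guess the next order date from the average gap.
from datetime import date, timedelta

def predict_next_order(order_dates):
    """Given past order dates in ascending order, estimate the next one."""
    if len(order_dates) < 2:
        return None  # not enough history to see any pattern
    gaps = [(b - a).days for a, b in zip(order_dates, order_dates[1:])]
    avg_gap = sum(gaps) // len(gaps)
    return order_dates[-1] + timedelta(days=avg_gap)

# Coffee ordered about once a month:
coffee_orders = [date(2018, 3, 1), date(2018, 4, 1), date(2018, 5, 2)]
print(predict_next_order(coffee_orders))  # about a month after the last order
```

The leap the artwork makes is from “predict and order coffee” to “predict and pull a trigger”, which is a policy question, not a technical one.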
Posted by Saracen - Fri 01 Jun 2018 01:04
scaryjim
Well, this is art, …..
I suspect there's a whole and complex thread to be had just in refuting or supporting that four-word assertion. ;)

Is it art just because the ‘artist’ claims it is? Well, personally, it's about as much ‘art’ as a certain famous …. or infamous …. unmade bed.

So if I collect a large pile of highly fresh, and ‘fragrant’ cow dung, is it art? Maybe if I call it “Reflections of society's future”, call myself an artist, and claim it's a social commentary, it'd win a Turner prize.

To some it'd be insightful. To others, it's a load of s…. erm, is that the doorbell? Gotta go.

:D
Posted by Dashers - Fri 01 Jun 2018 09:12
Making statements about AI and whatnot without being functional seems like a perfectly good pursuit for artists commenting on technology and society. This, though, I feel is bad art. Getting the message across would be better done without a functional Google box: have it hard-wired, to illustrate AI decision-making firing weapons. This is just a poor attempt to associate AI with something dangerous, because outrage.
Posted by Ttaskmaster - Fri 01 Jun 2018 13:44
Saracen
Is it art just because the ‘artist’ claims it is?
Nope.

A piece of art should be “expressing the author's imaginative or technical skill, intended to be appreciated for their beauty or emotional power”.

While this is a statement piece and on some levels is quite clever (IMO), it's still a statement, not actual artwork… even if you stretch it to be all about his ‘imaginative skill’.
Posted by Pob255 - Fri 01 Jun 2018 14:39
Dashers
This is just a poor attempt to associate AI with something dangerous, because outrage.
This.
Sorry, but the whole thing is not insightful; it's just trying and failing to be controversial.
It's mediocre art at best. However, it is controversial enough, especially to those without a basic grasp of what current AI systems are, to get news media coverage.
Which elevates it above mediocre.
It's a subject art should look at and challenge; shame this boring, flat piece is what's turned up :(

While I'm not sure Tom Scott started out to make an art installation/performance piece, this is still an amazing tech/art piece:

https://www.youtube.com/watch?v=vXwb_7DdZeQ
Posted by Saracen - Fri 01 Jun 2018 16:13
Pob255


https://www.youtube.com/watch?v=vXwb_7DdZeQ
How did you get a video of the HMRC query line?
Posted by persimmon - Fri 01 Jun 2018 18:47
Suppose you install a new skill on your Alexa vroomba, e.g. “clean up when we are not at home”.
You buy a new pet; said pet gets “cleaned up” and thrown in the trash. You come home and cannot understand where it is, as the whole thing (the Alexa vroomba) has become automatic and forgettable.
Who is responsible for the pet genocide? Or substitute “frail older person” for pet …
Posted by Corky34 - Sat 02 Jun 2018 07:37
The ethics surrounding machines sure is tricky. My bet is our politicians will treat it in a similar manner as they have the internet: they'll be late to the party, pretend they know what they're talking about, and then over-regulate it to within an inch of its life, making it all but useless.
Posted by lodore - Sat 02 Jun 2018 23:00
Just like any technology, it can be used for malicious purposes. A company sells a WiFi-enabled sniper rifle that uses an iPad app to control it, which has been available for quite a few years. Quite interesting from a technology point of view, as it can be used without any sniper training, but also worrying in the wrong hands.

Almost anything could be WiFi-enabled, and then you could use voice commands to control it, so I'm not sure why people are shocked by this.
Posted by TeePee - Sat 02 Jun 2018 23:36
Blaming inanimate objects for the actions of their owners isn't logical.

On the other hand, this is a gun thread…
Posted by devBunny - Wed 04 Jul 2018 21:23
Lol. It's also not logical to confuse guns with the policies to allow the use of, the supply of, and the actual use of gun *capabilities* but you'll see that a lot in gun threads too. ;-)

The policy making, supply and use all involve people. Four fingers are required when things go bad. One to have pulled the trigger and three to point the blame at the classes of people involved. If you want to exonerate the gun, go for it, using that remaining digit. “Not your fault, gun” - *big thumbs up*, *cheesy grin* - “but” - *grin turns more serious* - “we may still benefit from you being better controlled at any or all points in the chain, up to and including being banned”.