9 Replies

 @ISIDEWITH Discuss this answer... 1mo

No

 @9MG74RS from Ontario agreed… 2 days

If you just happened to fit a vague description of a target, how would you feel? Would you trust a drone not to take you out just in case?

 @ISIDEWITH Discuss this answer... 1mo

Yes

 @9MG74RS from Ontario disagreed… 2 days

AI doesn't care about people. It doesn't truly understand. This is a slippery slope. Moreover, it's still only as good as its human-made training model.

 @9MC4BQL from Alberta answered… 7 days

Depends on how good it's gotten. I'd have to see some damn good examples of it being better than human hands.

 @9LW6J33 from Ontario answered… 3wks

Any weapon with the potential to kill or injure should not be 100% AI autonomous.

 @9LT2W3W from Saskatchewan answered… 3wks

 @9LHXK8G Conservative from Ontario answered… 1mo

Not at this time, and not until unbiased third parties review the technology further and there is more scientific consensus.

 @9LGCYKF from Ontario answered… 1mo

We will eventually just be creating insane fighting robots that we would need nuclear weapons to destroy, just to be safe. This will definitely escalate wars and be way too unsafe.
