
29 Replies

@B46V7MR (Libertarian) from Alberta answered…1wk

On multiple occasions, when exposed to large enough data sets, large language AI models have developed data-driven prejudices against groups who have higher propensities for criminality, low time preference behaviour, subversion, communism, violence, etc.

Invariably, the superior processing ability of AI identifies patterns that many progressive and similarly morally relativistic people would call "racist" or "prejudiced", which would present a political obstacle to this becoming a reality.

It's entirely possible that our courts could produce better outcomes simply…

@9TLDMJL from Ontario answered…7mos

No, AI does not understand emotions and consequences. It would not be able to properly assess the weight of these decisions on the community.

@9MKYHZ7 from Alberta answered…11mos

Yes, but only for minor infractions that would otherwise tie up the court's time needed for more major cases.

@B46GZPV from Alberta answered…1wk

No, it should not make decisions. However, AI should be used on a case-by-case basis to streamline research comparing cases.

@B3286XS from Ontario answered…2mos

If a human makes a mistake, the human has to take the responsibility. If an AI makes a mistake, who will take the responsibility?

@B2VJ8VZ (Independent) from Alberta answered…2mos

It should not be used to make decisions about people, but it should be used to help facilitate those decisions.

@B2SWR5Q (Conservative) from Ontario answered…2mos

No. AI works by prediction models, and justice should be applied only to past actions, not predicted future ones.

@B29KW77 from Alberta answered…3mos

AI should be looked at as a possible voice alongside the judge and jury: not to make decisions, but as a voice in the choir used to help make informed decisions.

@9ZNZZY8 from Ontario answered…4mos

Yes, but it should not be the main source of judicial decisions. Rather, it should be used to ensure consistent and uniform results based on case law and common-law precedent.

@9ZNXMHS from Ontario answered…4mos

Yes, assuming massive human oversight and the AI having to explain each part of its thought process and how it reached a conclusion.

@9VDFZ6F (Conservative) from Ontario answered…6mos

No, a human with feelings and emotions should be the one to make decisions that drastically affect lives.

@9VB5W85 (Liberal) from Ontario answered…6mos

No, AI is based on human intellect and is therefore programmed to be biased by its human programmer.

@9SLJBJV from British Columbia answered…7mos

Yes… it would take the human error/emotion factor out of sentencing and provide more consistency. Criminal justice currently favours the criminal's rights; sentencing needs to consider and benefit the victims. It needs to be more than just a “legal system”.

@9RRJ6D7 from Ontario answered…8mos

Yes, but only if the technology is proven to have extremely high accuracy (99%+)

@9RPPNG7 from Ontario answered…8mos

Yes, but it ought to be used to supplement the findings of the courts and juries.

@9RC3SBW from Quebec answered…8mos

Not now, since AI has been proven to invent case law; perhaps in the future, but only as an adjunct.

@9P8NRFM (New Democratic) from Alberta answered…10mos

Artificial intelligence in its current state is not true AI, as it cannot reason or make decisions in an independent or unbiased manner. Current AI models rely heavily on information produced by human scholars who are experts in their qualified fields. Therefore, AI is not an appropriate overseer within the legal/criminal field.

@9NHHZDS from Alberta answered…10mos

AI can be used as a tool to assist in making decisions; however, it should not be the sole decision maker.
