I enjoy that so much of today's AI discourse circles around a still-fresh anxiety: that certain forms of labour thought to be safe from automation as little as five years ago (coding, visual art, formal and creative writing, for example) might be among the least safe.

#AI #ChatGPT #Automation

Lots of reasons to enjoy this AI discourse. The class-inflected revaluation of these kinds of work is dizzying.

Perhaps being able to perform a cloudy repetition of existing ideas in essay or report form was never that valuable?

Perhaps paint by numbers screenwriting is not that "creative"?

Isn't it ironic if the convergence of "StackOverflow lyfe" with "intellectual property" ends up turning most coding into supervising an automated plagiarism engine?

#AI #ChatGPT #Automation

As people point out all the time, #AI technology based on #DeepLearning networks is *not* sentient, nor does it possess general intelligence. That's why its successes raise such bleakly hilarious questions.

Another example: if push-button high school essay writing ruins prior assessment protocols, doesn't that suggest the limits of language, together with the rote expectations of formal education … have meant these protocols were never assessing whatever is salient (or "intelligent")?

@attentive
Will the tests of the future rate us against the median of machine intelligence?

Reverse Turing tests; differentiating humans from computers 

@moliver Two modes of a "reverse Turing test" are:

1. Computer judge, human or computer interlocutor. The computer attempts to classify the interlocutor correctly.

2. Human judge, human or computer interlocutor. The human seeks to be (mis)recognised as a computer.

Could computer-administered Turing tests along the lines of (1) adversarially winnow what's distinctive about human thought?

#AI #Automation #TuringTest

Reverse Turing tests; differentiating humans from computers 

@attentive
I think the first version might be the most telling.
