this post was submitted on 26 Apr 2024
26 points (100.0% liked)

Science


Welcome to Hexbear's science community!


preprint version, because Sci-Hub doesn't have it yet: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10120732/

Abstract

Transformer models such as GPT generate human-like language and are predictive of human brain responses to language. Here, using functional-MRI-measured brain responses to 1,000 diverse sentences, we first show that a GPT-based encoding model can predict the magnitude of the brain response associated with each sentence. We then use the model to identify new sentences that are predicted to drive or suppress responses in the human language network. We show that these model-selected novel sentences indeed strongly drive and suppress the activity of human language areas in new individuals. A systematic analysis of the model-selected sentences reveals that surprisal and well-formedness of linguistic input are key determinants of response strength in the language network. These results establish the ability of neural network models to not only mimic human language but also non-invasively control neural activity in higher-level cortical areas, such as the language network.
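The core loop described in the abstract — score many candidate sentences with a model, then pick the ones predicted to maximally drive or suppress the response — can be caricatured in a few lines. This is only a sketch: the actual study fits a GPT-based encoding model to fMRI responses, whereas here a toy unigram surprisal model stands in for the scorer, since the abstract names surprisal as one key determinant of response strength. The corpus and candidate sentences are made up for illustration.

```python
import math
from collections import Counter

# Toy stand-in for the paper's GPT-based encoding model: score candidate
# sentences by mean per-word surprisal under a smoothed unigram model,
# then rank them. (The real study predicts fMRI response magnitude with
# a GPT-based encoder; this only illustrates the search-over-sentences loop.)

def fit_unigram(corpus_tokens):
    counts = Counter(corpus_tokens)
    total = sum(counts.values())
    vocab = len(counts)
    # add-one smoothing so unseen words get a finite (large) surprisal
    return lambda w: -math.log2((counts[w] + 1) / (total + vocab))

def mean_surprisal(sentence, surprisal):
    words = sentence.lower().split()
    return sum(surprisal(w) for w in words) / len(words)

corpus = "the cat sat on the mat the dog sat on the rug".split()
surprisal = fit_unigram(corpus)

candidates = [
    "the cat sat on the mat",               # well-formed, expected
    "domain wikileaks gone access is not",  # ill-formed, unexpected
]
ranked = sorted(candidates, key=lambda s: mean_surprisal(s, surprisal))
print("suppress-like:", ranked[0])   # lowest surprisal
print("drive-like:   ", ranked[-1])  # highest surprisal
```

Under this toy scorer the well-formed sentence ranks lowest, matching the abstract's finding that surprisal and well-formedness track response strength.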

[–] YearOfTheCommieDesktop@hexbear.net 7 points 6 months ago (3 children)

honestly my first thought was basically AI-generated speech jamming, but your idea might be closer to reality

[–] flan@hexbear.net 4 points 6 months ago (1 children)

yeah the way they worded it is really strange

[–] qprimed@lemmy.ml 2 points 6 months ago

sounds like their plan is working.

[–] DyingOfDeBordom@hexbear.net 3 points 6 months ago

I mean that is how I read it, and idk how you could read it any other way??

but also non-invasively control neural activity in higher-level cortical areas, such as the language network.

they basically state the intention right there???

[–] dat_math@hexbear.net 1 point 6 months ago* (last edited 6 months ago)

it definitely produces some difficult sentences: "Domain wikileaks gone; access is NOT..."
"Both mentally and physically, you're attracted."

so I suppose you could hook this up to a highly directional beamforming speaker and confuse someone even more than plain delayed playback of their own speech would: play their speech back at a delay, but slightly altered so as to maximally surprise them
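The "playing their own speech back at a delay" part is classic delayed auditory feedback, which at its core is just a fixed delay line over the audio samples. A minimal sketch, assuming 16 kHz mono audio and a roughly 200 ms delay (both figures chosen for illustration, not taken from the paper):

```python
from collections import deque

def delay_line(samples, delay_samples):
    """Return the input delayed by delay_samples, zero-padded at the start.

    This is the core of delayed-auditory-feedback speech jamming:
    the talker hears their own voice a fixed interval late.
    """
    buf = deque([0.0] * delay_samples, maxlen=delay_samples)
    out = []
    for s in samples:
        out.append(buf[0])  # oldest buffered sample leaves the delay line
        buf.append(s)       # newest input sample enters it
    return out

SAMPLE_RATE = 16_000             # assumed mono sample rate
DELAY = int(0.2 * SAMPLE_RATE)   # ~200 ms, a famously disruptive delay

signal = [1.0] * 5 + [0.0] * 5
print(delay_line(signal, 2))     # first two outputs are the zero padding
```

A real jammer would do this on a live microphone stream in fixed-size blocks; the altered-to-maximize-surprise variant suggested above would additionally have to resynthesize the delayed audio, which is well beyond a simple delay buffer.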