Apple study shows LLMs can tell what you’re doing from audio data


Apple researchers have published a study examining how LLMs can analyze audio and motion data to build a clearer picture of a user's activities. Here are the details.

They’re good at it, but not in a creepy way

A new paper titled “Using LLMs for Late Multimodal Sensor Fusion for Activity Recognition” offers insight into how Apple may be considering incorporating LLM analysis alongside traditional sensor data to gain a more precise understanding of user…
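The paper's title points at a "late fusion" design: each sensor stream is classified on its own first, and only the resulting text-level predictions, not raw audio, are handed to an LLM to reconcile. As a minimal sketch of that idea (all classifier labels, scores, and function names below are illustrative assumptions, not Apple's actual pipeline), the fusion step can be as simple as formatting per-modality predictions into a prompt:

```python
# Hypothetical late-fusion sketch: an audio model and a motion model each
# produce (label, confidence) predictions; the fusion step only formats
# these text-level outputs into an LLM prompt. Labels and scores below are
# made up for illustration.

def build_fusion_prompt(audio_preds, motion_preds, activities):
    """Format per-modality top predictions into a single text prompt."""
    audio_lines = "\n".join(f"- {label}: {p:.2f}" for label, p in audio_preds)
    motion_lines = "\n".join(f"- {label}: {p:.2f}" for label, p in motion_preds)
    choices = ", ".join(activities)
    return (
        "An audio classifier and a motion classifier analyzed the same "
        "time window.\n"
        f"Audio model top predictions:\n{audio_lines}\n"
        f"Motion model top predictions:\n{motion_lines}\n"
        f"Which single activity best explains both? Choose from: {choices}."
    )

prompt = build_fusion_prompt(
    audio_preds=[("water running", 0.62), ("speech", 0.21)],
    motion_preds=[("repetitive arm motion", 0.55), ("walking", 0.18)],
    activities=["washing dishes", "jogging", "typing"],
)
print(prompt)
```

The appeal of fusing late, at the text level, is privacy and simplicity: the LLM never sees raw microphone or accelerometer data, only short symbolic summaries of what each specialized model detected.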

