Sony has a new benchmark for ethical AI
Sony AI released a dataset that tests the fairness and bias of AI models. It’s called the Fair Human-Centric Image Benchmark (FHIBE, pronounced like “Phoebe”). The company describes it as the “first publicly available, globally diverse, consent-based human image dataset for evaluating bias across a wide variety of computer vision tasks.” In other words, it tests the degree to which today’s AI models treat people fairly. Spoiler: Sony didn’t find a single dataset from any company that fully met its…
